Using Framework Analysis in nursing research: a worked example.
Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica
2013-11-01
To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis, including issues of clarity and transparency, can be addressed. Critics of qualitative data analysis sometimes cite a lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, options for both theme-based and case-based analysis, and readily retrievable data. This paper explains the process in further detail, illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but are used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through the essential steps of undertaking Framework Analysis, and its benefits and limitations are discussed. Nurses increasingly use qualitative research methods and need an analysis approach that offers the transparency and rigour that Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of the data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Multi-Disciplinary Analysis and Optimization Frameworks
NASA Technical Reports Server (NTRS)
Naiman, Cynthia Gutierrez
2009-01-01
Since July 2008, the Multidisciplinary Analysis & Optimization Working Group (MDAO WG) of the Systems Analysis Design & Optimization (SAD&O) discipline in the Fundamental Aeronautics Program's Subsonic Fixed Wing (SFW) project has completed one major milestone, "Define Architecture & Interfaces for Next Generation Open Source MDAO Framework" (9/30/08), and is completing the Generation 1 Framework validation milestone, due December 2008. The presentation covers details of progress on developing the Open MDAO framework, modeling and testing of the Generation 1 Framework, progress toward establishing partnerships with external parties, and discussion of additional potential collaborations.
Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework
NASA Astrophysics Data System (ADS)
Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.
2017-12-01
The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and a growing scientific literature in this area have created opportunities to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air and water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.
ERIC Educational Resources Information Center
Newton, Paul E.
2016-01-01
This paper argues that the dominant framework for conceptualizing validation evidence and analysis--the "five sources" framework from the 1999 "Standards"--is seriously limited. Its limitation raises a significant barrier to understanding the nature of comprehensive validation, and this presents a significant threat to…
Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.
Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini
2016-01-01
This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy using three fabrication methods (n = 10 each): one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and the Tukey test (α = .05). The one-piece cast frameworks presented significantly higher vertical misfit values than those found for the cemented and laser-welded frameworks (P < .001 and P < .003, respectively). Laser welding and cementing the framework on prepared abutments are effective techniques for improving the adaptation of three-unit implant-supported prostheses, and the two techniques presented similar fit.
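The statistical procedure named in the abstract above (one-way ANOVA across three fabrication groups) can be sketched in plain Python. The misfit values below are invented for illustration only, not the study's data, and a real analysis would use a statistics package (e.g., SciPy) for both the p-value and the Tukey post-hoc comparisons:

```python
# Hypothetical vertical misfit measurements (µm) for three fabrication
# techniques; values are illustrative placeholders, not the study's data.
groups = {
    "one_piece_casting": [95.1, 88.4, 102.3, 91.0, 97.8],
    "cemented_on_abutments": [41.2, 38.7, 45.9, 40.1, 43.3],
    "laser_welding": [39.8, 44.1, 37.5, 42.6, 40.9],
}

def one_way_anova(samples):
    """Return (F, df_between, df_within) for a one-way fixed-effects ANOVA."""
    all_vals = [x for g in samples for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # Between-group sum of squares: size-weighted squared deviations
    # of each group mean from the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in samples)
    # Within-group sum of squares: deviations from each group's own mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in samples)
    df_between = len(samples) - 1
    df_within = len(all_vals) - len(samples)
    F = (ss_between / df_between) / (ss_within / df_within)
    return F, df_between, df_within

F, df1, df2 = one_way_anova(list(groups.values()))
print(f"F({df1}, {df2}) = {F:.1f}")  # a large F suggests the group means differ
```

The F statistic is then compared against the F distribution with (df_between, df_within) degrees of freedom at α = .05; Tukey's test additionally requires studentized-range critical values, which is why a statistics library is the practical choice there.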
A Role for Language Analysis in Mathematics Textbook Analysis
ERIC Educational Resources Information Center
O'Keeffe, Lisa; O'Donoghue, John
2015-01-01
In current textbook analysis research, there is a strong focus on the content, structure and expectation presented by the textbook as elements for analysis. This research moves beyond such foci and proposes a framework for textbook language analysis which is intended to be integrated into an overall framework for mathematics textbook analysis. The…
A framework for biodynamic feedthrough analysis--part I: theoretical foundations.
Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bülthoff, Heinrich H
2014-09-01
Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2012-01-01
A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.
ERIC Educational Resources Information Center
Pashby, Karen
2015-01-01
This paper presents a critical framework applied to findings from a critical discourse analysis of curriculum and lesson plans in Alberta to examine the assumption that Canada is an ideal place for global citizenship education. The analysis draws on a framework that presents a critique of modernity to recognize a conflation within calls for new…
A hierarchical-multiobjective framework for risk management
NASA Technical Reports Server (NTRS)
Haimes, Yacov Y.; Li, Duan
1991-01-01
A broad hierarchical-multiobjective framework is established and used to address the management of risk methodologically. The framework unites the hierarchical character of decision making, the multiple decision makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components consist essentially of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.
Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Rallabhandi, Sriram K.
2010-01-01
A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.
Negotiation Process Analysis: A Research and Training Tool.
ERIC Educational Resources Information Center
Williams, Timothy
This paper proposes the use of interaction process analysis to study negotiation behaviors. Following a review of current literature in the field, the paper presents a theoretical framework for the analysis of both labor/management and social negotiation processes. Central to the framework described are two systems of activities that together…
Salmon, P; Williamson, A; Lenné, M; Mitsopoulos-Rubens, E; Rudin-Brown, C M
2010-08-01
Safety-compromising accidents occur regularly in the led outdoor activity domain. Formal accident analysis is an accepted means of understanding such events and improving safety. Despite this, there remains no universally accepted framework for collecting and analysing accident data in the led outdoor activity domain. This article presents an application of Rasmussen's risk management framework to the analysis of the Lyme Bay sea canoeing incident. This involved the development of an Accimap, the outputs of which were used to evaluate seven predictions made by the framework. The Accimap output was also compared to an analysis using an existing model from the led outdoor activity domain. In conclusion, the Accimap output was found to be more comprehensive and supported all seven of the risk management framework's predictions, suggesting that it shows promise as a theoretically underpinned approach for analysing, and learning from, accidents in the led outdoor activity domain. STATEMENT OF RELEVANCE: Accidents represent a significant problem within the led outdoor activity domain. This article presents an evaluation of a risk management framework that can be used to understand such accidents and to inform the development of accident countermeasures and mitigation strategies for the led outdoor activity domain.
ERIC Educational Resources Information Center
Psillos, D.; Tselfes, Vassilis; Kariotoglou, Petros
2004-01-01
In the present paper we propose a theoretical framework for an epistemological modelling of teaching-learning (didactical) activities, which draws on recent studies of scientific practice. We present and analyse the framework, which includes three categories: namely, Cosmos-Evidence-Ideas (CEI). We also apply this framework in order to model a…
Multidimensional Functional Behaviour Assessment within a Problem Analysis Framework.
ERIC Educational Resources Information Center
Ryba, Ken; Annan, Jean
This paper presents a new approach to contextualized problem analysis developed for use with multimodal Functional Behaviour Assessment (FBA) at Massey University in Auckland, New Zealand. The aim of problem analysis is to simplify complex problems that are difficult to understand. It accomplishes this by providing a high order framework that can…
CLARA: CLAS12 Reconstruction and Analysis Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo
2016-11-01
In this paper we present the service-oriented architecture (SOA) based CLAS12 event Reconstruction and Analysis (CLARA) framework. CLARA's design focuses on two main traits: real-time data stream processing and service-oriented architecture in a flow-based programming (FBP) paradigm. The data-driven, data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework offers solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
A framework for analysis of sentinel events in medical student education.
Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A
2013-11-01
Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.
A Decision Support Framework For Science-Based, Multi-Stakeholder Deliberation: A Coral Reef Example
We present a decision support framework for science-based assessment and multi-stakeholder deliberation. The framework consists of two parts: a DPSIR (Drivers-Pressures-States-Impacts-Responses) analysis to identify the important causal relationships among anthropogenic environ...
Hankivsky, Olena; Grace, Daniel; Hunting, Gemma; Giesbrecht, Melissa; Fridkin, Alycia; Rudrum, Sarah; Ferlatte, Olivier; Clark, Natalie
2014-12-10
In the field of health, numerous frameworks have emerged that advance understandings of the differential impacts of health policies to produce inclusive and socially just health outcomes. In this paper, we present the development of an important contribution to these efforts - an Intersectionality-Based Policy Analysis (IBPA) Framework. Developed over the course of two years in consultation with key stakeholders and drawing on best and promising practices of other equity-informed approaches, this participatory and iterative IBPA Framework provides guidance and direction for researchers, civil society, public health professionals and policy actors seeking to address the challenges of health inequities across diverse populations. Importantly, we present the application of the IBPA Framework in seven priority health-related policy case studies. The analysis of each case study is focused on explaining how IBPA: 1) provides an innovative structure for critical policy analysis; 2) captures the different dimensions of policy contexts including history, politics, everyday lived experiences, diverse knowledges and intersecting social locations; and 3) generates transformative insights, knowledge, policy solutions and actions that cannot be gleaned from other equity-focused policy frameworks. The aim of this paper is to inspire a range of policy actors to recognize the potential of IBPA to foreground the complex contexts of health and social problems, and ultimately to transform how policy analysis is undertaken.
ERIC Educational Resources Information Center
Games, Ivan Alex
2008-01-01
This article discusses a framework for the analysis and assessment of twenty-first-century language and literacy practices in game and design-based contexts. It presents the framework in the context of game design within "Gamestar Mechanic", an innovative game-based learning environment where children learn the Discourse of game design. It…
Toward a public analysis database for LHC new physics searches using MadAnalysis 5
NASA Astrophysics Data System (ADS)
Dumont, B.; Fuks, B.; Kraml, S.; Bein, S.; Chalons, G.; Conte, E.; Kulkarni, S.; Sengupta, D.; Wymant, C.
2015-02-01
We present the implementation, in the MadAnalysis 5 framework, of several ATLAS and CMS searches for supersymmetry in data recorded during the first run of the LHC. We provide extensive details on the validation of our implementations and propose to create a public analysis database within this framework.
Increased flexibility for modeling telemetry and nest-survival data using the multistate framework
Devineau, Olivier; Kendall, William L.; Doherty, Paul F.; Shenk, Tanya M.; White, Gary C.; Lukacs, Paul M.; Burnham, Kenneth P.
2014-01-01
Although telemetry is one of the most common tools used in the study of wildlife, advances in the analysis of telemetry data have lagged compared to progress in the development of telemetry devices. We demonstrate how standard known-fate telemetry and related nest-survival data analysis models are special cases of the more general multistate framework. We present a short theoretical development, and 2 case examples regarding the American black duck and the mallard. We also present a more complex lynx data analysis. Although not necessary in all situations, the multistate framework provides additional flexibility to analyze telemetry data, which may help analysts and biologists better deal with the vagaries of real-world data collection.
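As a minimal illustration of the known-fate idea that the abstract above describes as a special case of the multistate framework, each day an animal is tracked can be treated as a Bernoulli survive/die trial, which gives a closed-form (Mayfield-style) maximum-likelihood estimate of daily survival. The records below are hypothetical, and real analyses generalize this to staggered entry, covariates, and movement among states:

```python
# Toy known-fate telemetry records: (days_tracked, died) per animal.
# Convention: an animal tracked d days that survives contributes s**d to the
# likelihood; one that dies on its last tracked day contributes s**(d-1)*(1-s).
records = [(30, False), (12, True), (30, False), (7, True), (30, False), (21, False)]

def daily_survival_mle(records):
    """Closed-form MLE of a constant daily survival probability s.

    With E total exposure days and D deaths, the likelihood is
    L(s) = s**(E - D) * (1 - s)**D, maximized at s = (E - D) / E.
    """
    exposure = sum(days for days, _ in records)     # total animal-days at risk
    deaths = sum(1 for _, died in records if died)  # failures observed
    return (exposure - deaths) / exposure

s = daily_survival_mle(records)
print(f"daily survival ≈ {s:.4f}, 30-day survival ≈ {s**30:.3f}")
```

The multistate generalization replaces the single alive/dead dichotomy with transitions among several states (e.g., alive on-site, alive off-site, dead, undetected), which is what gives the framework its extra flexibility for real-world telemetry data.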
User-Centric Approach for Benchmark RDF Data Generator in Big Data Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purohit, Sumit; Paulson, Patrick R.; Rodriguez, Luke R.
This research focuses on a user-centric approach to building such tools and proposes a flexible, extensible, and easy-to-use framework to support performance analysis of Big Data systems. Finally, case studies from two different domains are presented to validate the framework.
A framework for selecting indicators of bioenergy sustainability
Dale, Virginia H.; Efroymson, Rebecca Ann; Kline, Keith L.; ...
2015-05-11
A framework for selecting and evaluating indicators of bioenergy sustainability is presented. This framework is designed to facilitate decision-making about which indicators are useful for assessing sustainability of bioenergy systems and supporting their deployment. Efforts to develop sustainability indicators in the United States and Europe are reviewed. The first steps of the framework for indicator selection are defining the sustainability goals and other goals for a bioenergy project or program, gaining an understanding of the context, and identifying the values of stakeholders. From the goals, context, and stakeholders, the objectives for analysis and criteria for indicator selection can be developed. The user of the framework identifies and ranks indicators, applies them in an assessment, and then evaluates their effectiveness, while identifying gaps that prevent goals from being met, assessing lessons learned, and moving toward best practices. The framework approach emphasizes that the selection of appropriate criteria and indicators is driven by the specific purpose of an analysis. Realistic goals and measures of bioenergy sustainability can be developed systematically with the help of the framework presented here.
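The indicator identification-and-ranking step described in the abstract above can be sketched as a simple weighted scoring exercise. All indicator names, criteria, and weights below are hypothetical placeholders for illustration, not values taken from the framework itself:

```python
# Hypothetical selection criteria and weights (must sum to 1 here for
# interpretability); both are invented for illustration.
criteria_weights = {
    "relevance_to_goals": 0.4,
    "data_availability": 0.3,
    "stakeholder_value": 0.3,
}

# Hypothetical candidate indicators scored 1-5 against each criterion.
indicator_scores = {
    "soil_organic_carbon": {"relevance_to_goals": 5, "data_availability": 3, "stakeholder_value": 4},
    "water_quality_N":     {"relevance_to_goals": 4, "data_availability": 4, "stakeholder_value": 5},
    "jobs_created":        {"relevance_to_goals": 3, "data_availability": 5, "stakeholder_value": 5},
}

def rank_indicators(scores, weights):
    """Rank indicators by their weighted criterion score (higher is better)."""
    totals = {name: sum(weights[c] * s for c, s in crit.items())
              for name, crit in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_indicators(indicator_scores, criteria_weights)
for name, total in ranking:
    print(f"{name}: {total:.2f}")
```

In practice the framework's point is that the weights and criteria come from the goals, context, and stakeholder values identified earlier, so the numeric ranking is only as defensible as that preceding elicitation step.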
A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS
This paper discusses a framework for fine-scale CFD modeling that may be developed to complement the present Community Multi-scale Air Quality (CMAQ) modeling system which itself is a computational fluid dynamics model. A goal of this presentation is to stimulate discussions on w...
ERIC Educational Resources Information Center
Raveh, Ira; Koichu, Boris; Peled, Irit; Zaslavsky, Orit
2016-01-01
In this article we present an integrative framework of knowledge for teaching the standard algorithms of the four basic arithmetic operations. The framework is based on a mathematical analysis of the algorithms, a connectionist perspective on teaching mathematics and an analogy with previous frameworks of knowledge for teaching arithmetic…
Representations of the World in Language Textbooks
ERIC Educational Resources Information Center
Risager, Karen
2018-01-01
This book presents a new and comprehensive framework for the analysis of representations of culture, society and the world in textbooks for foreign and second language learning. The framework is transferable to other kinds of learning materials and to other subjects. The framework distinguishes between five approaches: national studies,…
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2006-01-01
A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
ERIC Educational Resources Information Center
Nitko, Anthony J.; Hsu, Tse-chi
Item analysis procedures appropriate for domain-referenced classroom testing are described. A conceptual framework within which item statistics can be considered is presented, along with statistics identified as promising in light of this framework. The sampling fluctuations of the more promising item statistics for sample sizes comparable to the typical classroom…
ERIC Educational Resources Information Center
Tang, Kok-Sing; Delgado, Cesar; Moje, Elizabeth Birr
2014-01-01
This paper presents an integrative framework for analyzing science meaning-making with representations. It integrates the research on multiple representations and multimodal representations by identifying and leveraging the differences in their units of analysis in two dimensions: timescale and compositional grain size. Timescale considers the…
VisRseq: R-based visual framework for analysis of sequencing data.
Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M
2015-01-01
Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for the analysis of sequencing datasets that provides a computationally rich and accessible environment for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto-generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for the analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.
FAST: a framework for simulation and analysis of large-scale protein-silicon biosensor circuits.
Gu, Ming; Chakrabartty, Shantanu
2013-08-01
This paper presents a computer aided design (CAD) framework for verification and reliability analysis of protein-silicon hybrid circuits used in biosensors. It is envisioned that similar to integrated circuit (IC) CAD design tools, the proposed framework will be useful for system level optimization of biosensors and for discovery of new sensing modalities without resorting to laborious fabrication and experimental procedures. The framework referred to as FAST analyzes protein-based circuits by solving inverse problems involving stochastic functional elements that admit non-linear relationships between different circuit variables. In this regard, FAST uses a factor-graph netlist as a user interface and solving the inverse problem entails passing messages/signals between the internal nodes of the netlist. Stochastic analysis techniques like density evolution are used to understand the dynamics of the circuit and estimate the reliability of the solution. As an example, we present a complete design flow using FAST for synthesis, analysis and verification of our previously reported conductometric immunoassay that uses antibody-based circuits to implement forward error-correction (FEC).
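The message-passing inference that FAST performs on a factor-graph netlist can be sketched with a toy two-variable example. The factors and values below are invented for illustration; the actual framework handles stochastic protein elements with non-linear relationships.

```python
# Toy sum-product message passing on a minimal "netlist": a prior factor
# on X1 and a pairwise factor linking X1 to X2. Factor values are
# illustrative placeholders, not data from FAST.

def normalize(d):
    z = sum(d.values())
    return {k: v / z for k, v in d.items()}

prior_x1 = {0: 0.6, 1: 0.4}
pair = {(0, 0): 0.9, (0, 1): 0.1,
        (1, 0): 0.2, (1, 1): 0.8}

# Message from the pairwise factor to X2: marginalize X1 out
msg_to_x2 = {x2: sum(prior_x1[x1] * pair[(x1, x2)] for x1 in (0, 1))
             for x2 in (0, 1)}

marginal_x2 = normalize(msg_to_x2)
print(marginal_x2)  # {0: 0.62, 1: 0.38} up to float rounding
```

Solving an inverse problem in this style amounts to passing such messages between the internal nodes of the netlist until the variable marginals converge.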
Bondy, Andy; Tincani, Matt; Frost, Lori
2004-01-01
This paper presents Skinner's (1957) analysis of verbal behavior as a framework for understanding language acquisition in children with autism. We describe Skinner's analysis of pure and impure verbal operants and illustrate how this analysis may be applied to the design of communication training programs. The picture exchange communication system (PECS) is a training program influenced by Skinner's framework. We describe the training sequence associated with PECS and illustrate how this sequence may establish multiply controlled verbal behavior in children with autism. We conclude with an examination of how Skinner's framework may apply to other communication modalities and training strategies.
NASA Astrophysics Data System (ADS)
Hadjimichael, A.; Corominas, L.; Comas, J.
2017-12-01
With sustainable development as their overarching goal, urban wastewater system (UWS) managers need to take into account multiple social, economic, technical and environmental facets related to their decisions. In this complex decision-making environment, uncertainty can be formidable: it is present both in how the system is interpreted stochastically and in its natural, ever-shifting behavior. This inherent uncertainty suggests that wiser decisions would be made under an adaptive and iterative decision-making regime. No decision-support framework presented in the literature effectively addresses all these needs. The objective of this work is to describe such a conceptual framework to evaluate and compare alternative solutions for various UWS challenges within an adaptive management structure. Socio-economic aspects such as externalities are taken into account, along with other traditional criteria as necessary. Robustness, reliability and resilience analyses test the performance of the system against present and future variability. A valuation uncertainty analysis incorporates uncertain valuation assumptions into the decision-making process. The framework is demonstrated with an application to a case study presenting a typical problem often faced by managers: poor river water quality, increasing population, and more stringent water quality legislation. The application of the framework made use of: i) a cost-benefit analysis including monetized environmental benefits and damages; ii) a robustness analysis of system performance against future conditions; iii) reliability and resilience analyses of the system given contextual variability; and iv) a valuation uncertainty analysis of model parameters. The results suggest that the installation of bigger volumes would give rise to increased benefits despite larger capital costs, as well as increased robustness and resilience. Population numbers appear to affect the estimated benefits most, followed by electricity prices and climate change projections. The presented framework is expected to be a valuable tool for the next generation of UWS decision-making, and the application demonstrates a novel and valuable integration of metrics and methods for UWS analysis.
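The robustness, reliability and resilience analyses mentioned above can be illustrated with common textbook definitions; the formulas, limit and scenario series below are invented for illustration and are not the paper's actual metrics or data.

```python
# Toy performance metrics for a system time series against a standard:
# reliability = fraction of time steps meeting the limit,
# resilience  = probability of recovering one step after a failure,
# robustness  = worst-case reliability across candidate future scenarios.

def reliability(series, limit):
    return sum(1 for v in series if v <= limit) / len(series)

def resilience(series, limit):
    failures = recoveries = 0
    for prev, cur in zip(series, series[1:]):
        if prev > limit:
            failures += 1
            if cur <= limit:
                recoveries += 1
    return recoveries / failures if failures else 1.0

scenarios = {
    "baseline":    [3, 4, 6, 4, 3, 5, 7, 4],
    "population+": [4, 6, 7, 5, 6, 8, 6, 5],   # hypothetical growth future
}
limit = 5  # hypothetical water-quality standard

rel = {name: reliability(s, limit) for name, s in scenarios.items()}
robustness = min(rel.values())  # worst case over futures
print(rel, robustness)
```

Under this toy definition, robustness rewards alternatives that stay acceptable even in the least favourable future scenario.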
Evidence-Based Leadership Development: The 4L Framework
ERIC Educational Resources Information Center
Scott, Shelleyann; Webber, Charles F.
2008-01-01
Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…
A Framework for Teaching Practice-Based Research with a Focus on Service Users
ERIC Educational Resources Information Center
Austin, Michael J.; Isokuortti, Nanne
2016-01-01
The integration of research and practice in social work education and agency practice is both complex and challenging. The analysis presented here builds upon the classic social work generalist framework (engagement, assessment, service planning and implementation, service evaluation, and termination) by developing a three-part framework to…
Entity-Centric Abstraction and Modeling Framework for Transportation Architectures
NASA Technical Reports Server (NTRS)
Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.
2007-01-01
A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts from a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.
ERIC Educational Resources Information Center
Pringle, James; Huisman, Jeroen
2011-01-01
In analyses of higher education systems, many models and frameworks are based on governance, steering, or coordination models. Although much can be gained by such analyses, we argue that the language used in the present-day policy documents (knowledge economy, competitive position, etc.) calls for an analysis of higher education as an industry. In…
Plamondon, Katrina M; Bottorff, Joan L; Cole, Donald C
2015-11-01
Deliberative dialogue (DD) is a knowledge translation strategy that can serve to generate rich data and bridge health research with action. An intriguing alternative to other modes of generating data, the purposeful and evidence-informed conversations characteristic of DD generate data inclusive of collective interpretations. These data are thus dialogic, presenting complex challenges for qualitative analysis. In this article, we discuss the nature of data generated through DD, orienting ourselves toward a theoretically grounded approach to analysis. We offer an integrated framework for analysis, balancing analytical strategies of categorizing and connecting with the use of empathetic and suspicious interpretive lenses. In this framework, data generation and analysis occur in concert, alongside engaging participants and synthesizing evidence. An example of application is provided, demonstrating nuances of the framework. We conclude with reflections on the strengths and limitations of the framework, suggesting how it may be relevant in other qualitative health approaches. © The Author(s) 2015.
Multimedia-modeling integration development environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelton, Mitchell A.; Hoopes, Bonnie L.
2002-09-02
There are many framework systems available; however, the purpose of the framework presented here is to capitalize on the successes of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) and the Multi-media Multi-pathway Multi-receptor Risk Assessment (3MRA) methodology as applied to the Hazardous Waste Identification Rule (HWIR), while focusing on the development of software tools that simplify the module developer's effort of integrating a module into the framework.
White Dialectics: A New Framework for Theory, Research, and Practice with White Students
ERIC Educational Resources Information Center
Todd, Nathan R.; Abrams, Elizabeth M.
2011-01-01
This article presents White dialectics, or the tensions that White students experience as dominant group members in the United States, as a new framework to understand and intervene with White students and counselor trainees. Developed from and supported by our qualitative analysis, the authors present the six dialectics of (a) Whiteness and self,…
An investigation of social media data during a product recall scandal
NASA Astrophysics Data System (ADS)
Tse, Ying Kei; Loh, Hanlin; Ding, Juling; Zhang, Minhao
2018-07-01
As social media has become an important part of modern daily life, users often share product opinions online and these tend to spike when large companies undergo crises. This paper investigates customer online responses to a large company crisis by uncovering hidden insights in social media comments and presents a framework for handling social media data and crisis management. Analysis of textual Facebook data from users responding to the 2013 horsemeat scandal is presented. In this study, we used a novel comprehensive data analysis framework alongside a text-mining framework to objectively classify and understand customer perceptions during this horsemeat scandal. This framework provides an effective approach for investigating customer perception during a company crisis and measures the effectiveness of crisis management practices which the company has adopted. Our analyses show that social media can provide important insights into customer behaviour during crisis communications.
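The classification step in such a text-mining framework can be sketched with a minimal lexicon-based pass over comments. The lexicon, labels and comments below are invented; the paper's framework is far richer (preprocessing, model-based classification, crisis-management metrics).

```python
# Toy comment classification with a hand-built keyword lexicon,
# tallying the resulting sentiment distribution.
from collections import Counter

LEXICON = {
    "boycott": "negative", "refund": "negative", "disgusting": "negative",
    "transparent": "positive", "apology": "positive",
}

def classify(comment):
    for word in comment.lower().split():
        if word in LEXICON:
            return LEXICON[word]
    return "neutral"

comments = [
    "I demand a refund now",
    "At least the apology was quick",
    "Just here for the memes",
]
dist = Counter(classify(c) for c in comments)
print(dist)
```

Tracking how such a distribution shifts over the days of a crisis is one simple way to measure the effect of a company's crisis-management messaging.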
Multiscale hidden Markov models for photon-limited imaging
NASA Astrophysics Data System (ADS)
Nowak, Robert D.
1999-06-01
Photon-limited image analysis is often hindered by low signal-to-noise ratios. A novel Bayesian multiscale modeling and analysis method is developed in this paper to assist in these challenging situations. In addition to providing a very natural and useful framework for modeling and processing images, Bayesian multiscale analysis is often much less computationally demanding than classical Markov random field models. This paper focuses on a probabilistic graph model called the multiscale hidden Markov model (MHMM), which captures the key inter-scale dependencies present in natural image intensities. The MHMM framework presented here is specifically designed for photon-limited imaging applications involving Poisson statistics, and applications to image intensity analysis are examined.
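The multiscale representation underlying such models can be sketched in a few lines: Poisson counts are aggregated pairwise across scales, and at each split the child count given its parent is Binomial, which is where an MHMM places its hidden states. This is a pure-Python toy with invented counts, not the paper's implementation.

```python
# Build the pyramid of pairwise-aggregated photon counts, finest first.
def multiscale(counts):
    levels = [counts]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([prev[i] + prev[i + 1] for i in range(0, len(prev), 2)])
    return levels

pyramid = multiscale([3, 1, 0, 4, 2, 2, 5, 1])
print(pyramid)  # coarsest level holds the total photon count
```

A key property exploited by such analyses is that the total count at each coarse node is preserved exactly, so noise modeling can proceed scale by scale.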
Software for Data Analysis with Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Roy, H. Scott
1994-01-01
Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
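The exact Bayes factor computation mentioned above can be made concrete with the simplest conjugate example. The coin-flip model and numbers below are illustrative and not from the paper.

```python
# Exact Bayes factor for a coin: H0 "theta = 0.5" versus
# H1 "theta ~ Uniform(0, 1)". Under H1 the marginal likelihood of
# k heads in n flips integrates in closed form to 1/(n+1).
from math import comb

def bayes_factor(k, n):
    m0 = comb(n, k) * 0.5 ** n   # P(data | H0)
    m1 = 1 / (n + 1)             # P(data | H1), exact marginal
    return m0 / m1

bf = bayes_factor(7, 10)
print(bf)  # 1.2890625: mild evidence for the fair coin
```

Closed-form marginals like this are exactly what a graphical-model specification lets a code generator emit automatically for conjugate model fragments.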
Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.
Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong
2016-05-01
This paper aims to conduct fMRI-based causality analysis of brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not place any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure for calculating the DI for two finite time series; the two major steps involved are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both simulated data and experimental fMRI data, and compare the results with those of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC in quantifying the overall causal relationship.
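The probability-estimation step of such an analysis can be sketched with a plug-in estimate of a single lagged mutual-information term, I(X_t ; Y_{t+1}). The full DI sums causally conditioned terms over the whole horizon; this toy skips bin-size selection by using a fixed binary alphabet, and the series are invented.

```python
# Plug-in (histogram) estimate of mutual information between the current
# sample of one binarized series and the next sample of another.
from collections import Counter
from math import log2

def mutual_info(pairs):
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

x = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
y = [1, 0, 0, 1, 1, 0, 1, 0, 1, 1]   # here y copies x with one step of delay
pairs = list(zip(x[:-1], y[1:]))
print(mutual_info(pairs))  # close to 1 bit: strong directed dependence
```

A nonlinear or even deterministic copy like this is picked up by the information-theoretic estimate regardless of whether a linear predictor could capture it, which is the advantage over GC highlighted in the abstract.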
Exploring the Application of a Conceptual Framework in a Social MALL App
ERIC Educational Resources Information Center
Read, Timothy; Bárcena, Elena; Kukulska-Hulme, Agnes
2016-01-01
This article presents a prototype social Mobile Assisted Language Learning (henceforth, MALL) app based on Kukulska-Hulme's (2012) conceptual framework. This research allows the exploration of time, place and activity type as key factors in the design of MALL apps, and is the first step toward a systematic analysis of such a framework in this type…
Revised Community of Inquiry Framework: Examining Learning Presence in a Blended Mode of Delivery
ERIC Educational Resources Information Center
Pool, Jessica; Reitsma, Gerda; van den Berg, Dirk
2017-01-01
This paper presents a study grounded in the Community of Inquiry (CoI) framework using qualitative content analysis and focus group interviews in an effort to identify aspects of learning presence in a blended learning course. Research has suggested that the CoI framework may need additional emphasis based on the roles of strategic learners in…
ERIC Educational Resources Information Center
Prentice, Diana B.; Carlin, John
Arguing that state and local political issue campaigns warrant increased attention from communication scholars, this paper presents a rationale for analysis of issue campaigns, develops a framework for organizing and analyzing such campaigns, and applies the framework to an analysis of the 1986 campaign for the sale of liquor "by the…
Towards a Cloud Based Smart Traffic Management Framework
NASA Astrophysics Data System (ADS)
Rahimi, M. M.; Hakimpour, F.
2017-09-01
Traffic big data has brought many opportunities for traffic management applications. However, challenges such as the heterogeneity, storage, management, processing and analysis of traffic big data may hinder its efficient, real-time application. These challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework addresses the technical challenges of efficient, real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we used real OpenStreetMap (OSM) trajectories and road networks in a distributed environment. Our evaluation results indicate that the speed of data import into this framework exceeds 8000 records per second when the dataset size approaches 5 million records. We also evaluated the performance of data retrieval in our proposed framework: the retrieval speed exceeds 15000 records per second at the same dataset size. We further evaluated the scalability and performance of our proposed framework through parallelisation of a critical pre-analysis task in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.
A novel framework for virtual prototyping of rehabilitation exoskeletons.
Agarwal, Priyanshu; Kuo, Pei-Hsin; Neptune, Richard R; Deshpande, Ashish D
2013-06-01
Human-worn rehabilitation exoskeletons have the potential to make therapeutic exercises increasingly accessible to disabled individuals while reducing the cost and labor involved in rehabilitation therapy. In this work, we propose a novel human-model-in-the-loop framework for virtual prototyping (design, control and experimentation) of rehabilitation exoskeletons by merging computational musculoskeletal analysis with simulation-based design techniques. The framework allows the design and control algorithm of an exoskeleton to be iteratively optimized through simulation. We introduce biomechanical, morphological, and controller measures to quantify the performance of the device for the optimization study. Furthermore, the framework allows one to carry out virtual experiments to test specific "what-if" scenarios and quantify device performance and recovery progress. To illustrate the application of the framework, we present a case study in which the design and analysis of an index-finger exoskeleton is carried out using the proposed framework.
Neurocognitive mechanisms of perception-action coordination: a review and theoretical integration.
Ridderinkhof, K Richard
2014-10-01
The present analysis aims at a theoretical integration of, and a systems-neuroscience perspective on, a variety of historical and contemporary views on perception-action coordination (PAC). We set out to determine the common principles or lawful linkages between sensory and motor systems that explain how perception is action-oriented and how action is perceptually guided. To this end, we analyze the key ingredients of such an integrated framework, examine the architecture of dual-system conjectures of PAC, and undertake a historical analysis of the key characteristics, mechanisms, and phenomena of PACs. This analysis reveals that dual-systems views are in need of fundamental re-thinking, and their elements are amalgamated with current views on action-oriented predictive processing into a novel integrative theoretical framework (IMPPACT: Impetus, Motivation, and Prediction in Perception-Action Coordination theory). From this framework and its neurocognitive architecture we derive a number of non-trivial predictions regarding conative, motive-driven PAC. We end by presenting a brief outlook on how IMPPACT might offer novel insights into certain pathologies and into action expertise. Copyright © 2014 Elsevier Ltd. All rights reserved.
Op den Akker, Harm; Cabrita, Miriam; Op den Akker, Rieks; Jones, Valerie M; Hermens, Hermie J
2015-06-01
This paper presents a comprehensive and practical framework for automatic generation of real-time tailored messages in behavior change applications. Basic aspects of motivational messages are time, intention, content and presentation. Tailoring of messages to the individual user may involve all aspects of communication. A linear modular system is presented for generating such messages. It is explained how properties of user and context are taken into account in each of the modules of the system and how they affect the linguistic presentation of the generated messages. The model of motivational messages presented is based on an analysis of existing literature as well as the analysis of a corpus of motivational messages used in previous studies. The model extends existing 'ontology-based' approaches to message generation for real-time coaching systems found in the literature. Practical examples are given of how simple tailoring rules can be implemented throughout the various stages of the framework. Such examples can guide further research by clarifying what it means to use e.g. user targeting to tailor a message. As primary example we look at the issue of promoting daily physical activity. Future work lies in applying the present model and framework, defining efficient ways of evaluating individual tailoring components, and improving effectiveness through the creation of accurate and complete user and context models. Copyright © 2015 Elsevier Inc. All rights reserved.
Fast and Efficient Feature Engineering for Multi-Cohort Analysis of EHR Data.
Ozery-Flato, Michal; Yanover, Chen; Gottlieb, Assaf; Weissbrod, Omer; Parush Shear-Yashuv, Naama; Goldschmidt, Yaara
2017-01-01
We present a framework for feature engineering, tailored for longitudinal structured data, such as electronic health records (EHRs). To fast-track feature engineering and extraction, the framework combines general-use plug-in extractors, a multi-cohort management mechanism, and modular memoization. Using this framework, we rapidly extracted thousands of features from diverse and large healthcare data sources in multiple projects.
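The plug-in-extractor and memoization ideas can be sketched briefly: extractors register themselves by name, and a memoization layer avoids recomputing a feature for a cohort it has already seen. Extractor names, cohorts and values below are invented for illustration.

```python
# Plug-in feature extractors with modular memoization.
from functools import lru_cache

EXTRACTORS = {}

def extractor(name):
    def register(fn):
        EXTRACTORS[name] = fn
        return fn
    return register

@extractor("mean_lab_value")
@lru_cache(maxsize=None)          # memoize per cohort
def mean_lab_value(cohort):
    values = COHORTS[cohort]
    return sum(values) / len(values)

COHORTS = {"diabetes": (5.1, 6.3, 7.0), "control": (4.8, 5.0)}

features = {c: EXTRACTORS["mean_lab_value"](c) for c in COHORTS}
print(features)
```

Because extraction goes through the registry, new extractors can be dropped in without touching the cohort-management code, and repeated requests for the same cohort hit the cache instead of the data source.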
Jesunathadas, Mark; Poston, Brach; Santello, Marco; Ye, Jieping; Panchanathan, Sethuraman
2014-01-01
Many studies have attempted to monitor fatigue from electromyogram (EMG) signals. However, fatigue affects EMG in a subject-specific manner. We present here a subject-independent framework for monitoring the changes in EMG features that accompany muscle fatigue based on principal component analysis and factor analysis. The proposed framework is based on several time- and frequency-domain features, unlike most of the existing work, which is based on two to three features. Results show that latent factors obtained from factor analysis on these features provide a robust and unified framework. This framework learns a model from EMG signals of multiple subjects, that form a reference group, and monitors the changes in EMG features during a sustained submaximal contraction on a test subject on a scale from zero to one. The framework was tested on EMG signals collected from 12 muscles of eight healthy subjects. The distribution of factor scores of the test subject, when mapped onto the framework was similar for both the subject-specific and subject-independent cases. PMID:22498666
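The zero-to-one fatigue scale can be illustrated with a toy mapping: a reference group defines the range of a feature, and a test subject's values are scored relative to it. The feature values below are invented; the paper's framework uses PCA and factor analysis over many time- and frequency-domain EMG features rather than a single feature.

```python
# Map a test subject's EMG median frequency onto a 0-to-1 fatigue scale
# defined by a reference group's observed range.
reference_median_freq = [82, 78, 75, 70, 66, 61]  # reference decline (Hz)
lo, hi = min(reference_median_freq), max(reference_median_freq)

def fatigue_score(value):
    """0 = fresh (reference max), 1 = fatigued (reference min), clipped."""
    score = (hi - value) / (hi - lo)
    return min(1.0, max(0.0, score))

test_subject = [80, 72, 63]
print([round(fatigue_score(v), 2) for v in test_subject])
```

The subject-independent point is that the scale comes from the reference group, so a new subject can be monitored without first collecting their own fatigue calibration data.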
Edwards, Jeffrey R; Lambert, Lisa Schurer
2007-03-01
Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
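The core quantity in such a combined analysis can be shown numerically. Suppose the X→M path is moderated by W (slope a1 + a3·W) and the M→Y path has slope b; the indirect effect of X on Y at a given W is then (a1 + a3·W)·b. The coefficients below are invented, not estimated from data.

```python
# Conditional indirect effect in a moderated path model.
a1, a3 = 0.50, 0.20   # X -> M slope and X*W interaction coefficient
b = 0.40              # M -> Y slope

def indirect_effect(w):
    return (a1 + a3 * w) * b

low, high = indirect_effect(-1.0), indirect_effect(+1.0)
print(low, high)  # the indirect effect strengthens as W increases
```

Probing the indirect effect at low and high values of the moderator, as here, is exactly the kind of interpretation the integrated framework makes explicit, instead of reporting a single averaged mediation coefficient.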
Material and morphology parameter sensitivity analysis in particulate composite materials
NASA Astrophysics Data System (ADS)
Zhang, Xiaoyu; Oskay, Caglar
2017-12-01
This manuscript presents a novel parameter sensitivity analysis framework for damage and failure modeling of particulate composite materials subjected to dynamic loading. The proposed framework employs global sensitivity analysis to study the variance in the failure response as a function of model parameters. In view of the computational complexity of performing thousands of detailed microstructural simulations to characterize sensitivities, Gaussian process (GP) surrogate modeling is incorporated into the framework. In order to capture the discontinuity in response surfaces, the GP models are integrated with a support vector machine classification algorithm that identifies the discontinuities within response surfaces. The proposed framework is employed to quantify variability and sensitivities in the failure response of polymer bonded particulate energetic materials under dynamic loads to material properties and morphological parameters that define the material microstructure. Particular emphasis is placed on the identification of sensitivity to interfaces between the polymer binder and the energetic particles. The proposed framework has been demonstrated to identify the most consequential material and morphological parameters under vibrational and impact loads.
Computational structural mechanics methods research using an evolving framework
NASA Technical Reports Server (NTRS)
Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.
1990-01-01
Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.
NASA Astrophysics Data System (ADS)
Kenkre, V. M.; Scott, J. E.; Pease, E. A.; Hurd, A. J.
1998-05-01
A theoretical framework for the analysis of the stress distribution in granular materials is presented. It makes use of a transformation of the vertical spatial coordinate into a formal time variable and the subsequent study of a generally non-Markoffian, i.e., memory-possessing (nonlocal) propagation equation. Previous treatments are obtained as particular cases corresponding to, respectively, wavelike and diffusive limits of the general evolution. Calculations are presented for stress propagation in bounded and unbounded media. They can be used to obtain desired features such as a prescribed stress distribution within the compact.
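The diffusive limit of the formalism can be sketched numerically: the vertical coordinate plays the role of time, and a surface stress profile spreads as it propagates downward. The explicit finite-difference step and parameters below are illustrative only, not the paper's memory-possessing propagation equation.

```python
# Diffusive stress propagation: each depth step spreads the profile,
# here with a periodic boundary so total load is conserved.
def step(profile, alpha=0.25):
    n = len(profile)
    return [profile[i]
            + alpha * (profile[(i - 1) % n] - 2 * profile[i] + profile[(i + 1) % n])
            for i in range(n)]

stress = [0, 0, 0, 1.0, 0, 0, 0]   # point load at the surface
for _ in range(3):                  # propagate three depth steps
    stress = step(stress)
print(stress)
```

A wavelike limit would instead translate the profile at fixed speed; the general memory-possessing equation interpolates between these two behaviors.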
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals
NASA Astrophysics Data System (ADS)
Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.
2011-08-01
GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.
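The modular, multi-level analysis flow can be sketched as a chain of signal-processing modules that consume a waveform and accumulate condensed parameters. Module names, the waveform and the parameter dictionary below are invented for illustration and are not GELATIO's actual C++ interfaces.

```python
# A user-configurable pipeline of analysis modules for a detector pulse.
def baseline_subtract(wf, params):
    n = params.get("baseline_samples", 4)
    base = sum(wf[:n]) / n
    return [s - base for s in wf]

def amplitude(wf, params):
    params["energy"] = max(wf)   # condensed analysis parameter
    return wf

PIPELINE = [baseline_subtract, amplitude]

waveform = [10, 10, 10, 10, 11, 14, 30, 52, 50, 47]
params = {"baseline_samples": 4}
for module in PIPELINE:
    waveform = module(waveform, params)
print(params["energy"])
```

Reordering or swapping entries in the pipeline list is the analogue of editing an initialization file: the raw data flows through the same multi-level structure while the condensed parameters change with the chosen modules.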
A streamlined Python framework for AT-TPC data analysis
NASA Astrophysics Data System (ADS)
Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.
2017-09-01
User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.
JSEM: A Framework for Identifying and Evaluating Indicators.
ERIC Educational Resources Information Center
Hyman, Jeffrey B.; Leibowitz, Scott G.
2001-01-01
Presents an approach to identifying and evaluating combinations of indicators when the mathematical relationships between the indicators and an endpoint may not be quantified, a limitation common to many ecological assessments. Uses the framework of Structural Equation Modeling (SEM), which combines path analysis with a measurement model, to…
NASA Astrophysics Data System (ADS)
Ruiz, Rafael O.; Meruane, Viviana
2017-06-01
The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool for the robust design and prediction of PEH performance.
Multilevel analysis of sports video sequences
NASA Astrophysics Data System (ADS)
Han, Jungong; Farin, Dirk; de With, Peter H. N.
2006-01-01
We propose a fully automatic and flexible framework for analysis and summarization of tennis broadcast video sequences, using visual features and specific game-context knowledge. Our framework can analyze a tennis video sequence at three levels, which provides a broad range of different analysis results. The proposed framework includes novel pixel-level and object-level tennis video processing algorithms, such as a moving-player detection taking both the color and the court (playing-field) information into account, and a player-position tracking algorithm based on a 3-D camera model. Additionally, we employ scene-level models for detecting events, like service, base-line rally and net-approach, based on a number of real-world visual features. The system can summarize three forms of information: (1) all court-view playing frames in a game, (2) the moving trajectory and real speed of each player, as well as the relative position between the player and the court, (3) the semantic event segments in a game. The proposed framework is flexible in choosing the level of analysis that is desired. It is effective because the framework makes use of several visual cues obtained from the real-world domain to model important events like service, thereby increasing the accuracy of the scene-level analysis. The paper presents attractive experimental results highlighting the system efficiency and analysis capabilities.
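A pixel-level cue of the kind such frameworks build on can be illustrated with a dominant-color test: a frame is treated as a court view when most of its pixels match the playing-field color. This is a simplified stand-in for the paper's pixel-level algorithms; the colors, threshold and tolerance below are invented.

```python
import numpy as np

def court_view_ratio(frame, court_color, tol=30):
    """Fraction of pixels within `tol` (per channel) of the court colour."""
    close = np.all(np.abs(frame.astype(int) - court_color) <= tol, axis=-1)
    return close.mean()

# Synthetic 40x40 "frame": top three quarters court-coloured, bottom quarter dark
court = np.array([180, 120, 60])                 # hypothetical clay-court RGB
frame = np.tile(court, (40, 40, 1)).astype(np.uint8)
frame[30:, :, :] = [40, 40, 40]                  # non-court region (e.g. crowd)

ratio = court_view_ratio(frame, court)
is_court_view = ratio > 0.6                      # assumed decision threshold
```

In a real system this color cue would be combined with court-line geometry and the 3-D camera model before any object-level tracking runs.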
NASA Astrophysics Data System (ADS)
Wollman, Adam J. M.; Miller, Helen; Foster, Simon; Leake, Mark C.
2016-10-01
Staphylococcus aureus is an important pathogen that gives rise to antimicrobial-resistant strains such as methicillin-resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We use a new combination of several existing analytical tools of image analysis to detect cellular and subcellular morphological features relevant to cell division from millisecond-time-scale sampled images of live pathogens, at single-molecule detection precision. We demonstrate this approach using a fluorescent reporter, GFP fused to the protein EzrA, which localises to a mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials that target the cell division machinery, but may also have more general application in detecting morphologically complex structures of fluorescently labelled proteins present in clusters of other cell types.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Rasmussen, Martin
2016-06-01
This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals:
• Integration with a high fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients
• Consideration of a PRA context
• Incorporation of a solid psychological basis for operator performance
• Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response
This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.
NASA Astrophysics Data System (ADS)
Alseddiqi, M.; Mishra, R.; Pislaru, C.
2012-05-01
The paper presents the results from a quality framework to measure the effectiveness of a new engineering course entitled 'school-based learning (SBL) to work-based learning (WBL) transition module' in the Technical and Vocational Education (TVE) system in Bahrain. The framework is an extended version of existing information quality frameworks with respect to pedagogical and technological contexts. It incorporates specific pedagogical and technological dimensions as per modern industry requirements in Bahrain. A questionnaire on users' views of the effectiveness of the new transition module was distributed to various stakeholders, including TVE teachers and students. The aim was to obtain critical information for diagnosing, monitoring and evaluating different views and perceptions about the effectiveness of the new module. The analysis categorised the quality dimensions by their relative importance, using the principal component analysis available in SPSS. The analysis clearly identified the most important quality dimensions integrated in the new module for SBL-to-WBL transition. It was also apparent that the new module contains workplace proficiencies, prepares TVE students for work placement, provides effective teaching and learning methodologies, integrates innovative technology in the process of learning, meets modern industrial needs, and presents a cooperative learning environment for TVE students. From the principal component analysis findings, the percentage of relative importance of each factor and its quality dimensions was calculated; this percentage comparison identified the most important factor as well as the most important quality dimensions. The re-arranged quality dimensions, with an extended number of factors, refined the extended information quality framework into a revised quality framework.
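The principal-component step can be sketched outside SPSS as well. The example below runs PCA on synthetic questionnaire responses and reports each component's percentage of relative importance (variance explained); the respondent count, dimension count and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Likert-style responses: 120 respondents x 6 quality dimensions
latent = rng.normal(size=(120, 1))
X = np.clip(np.round(3 + latent + 0.8 * rng.normal(size=(120, 6))), 1, 5)

# Principal component analysis on the correlation matrix (as SPSS reports it)
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = (Z.T @ Z) / len(Z)
eigvals = np.linalg.eigh(corr)[0][::-1]   # eigenvalues, largest first

# Percentage of relative importance (variance explained) per component
pct_importance = 100 * eigvals / eigvals.sum()
```

Ranking the components by pct_importance is the percentage comparison the abstract describes for identifying the most important factors.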
Semantics-enabled service discovery framework in the SIMDAT pharma grid.
Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert
2008-03-01
We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
Das, Ranjana; Ytre-Arne, Brita
2017-01-01
We write this article presenting frameworks and findings from an international network on audience research, as we stand 75 years from Herta Herzog’s classic investigation of radio listeners, published in Lazarsfeld and Stanton’s 1944 war edition of Radio Research. The article aims to contribute to and advance a rich strand of self-reflexive stock-taking and sorting of future research priorities within the transforming field of audience analysis, by drawing on the collective efforts of CEDAR – Consortium on Emerging Directions in Audience Research – a 14-country network (2015–2018) funded by the Arts and Humanities Research Council, United Kingdom, which conducted a foresight analysis exercise on developing current trends and future scenarios for audiences and audience research in the year 2030. First, we wish to present the blueprint of what we did and how we did it – by discussing the questions, contexts and frameworks for our project. We hope this is useful for anyone considering a foresight analysis task, an approach we present as an innovative and rigorous tool for assessing and understanding the future of a field. Second, we present findings from our analysis of pivotal transformations in the field and the future scenarios we constructed for audiences, as media technologies rapidly change with the arrival of the Internet of Things and changes on many levels occur in audience practices. These findings not only make sense of a transformative decade that we have just lived through but also present possibilities for the future, outlining areas for individual and collective intellectual commitment. PMID:29276327
A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON
King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix
2008-01-01
As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, interaction with virtual environments to analysis and visualizations. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language, but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
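The component-based replacement of the main integration loop can be sketched as follows. This is a toy Python analogue, not NEURON code: a trivial integrator stands in for the simulator, and the spike-exchange and monitoring components (and all their names and numbers) are hypothetical.

```python
class Component:
    """Pluggable piece that runs inside the framework's main loop."""
    def step(self, t, state):
        raise NotImplementedError

class SpikeExchange(Component):
    """Stand-in for a replaceable spike-exchange strategy."""
    def __init__(self):
        self.delivered = 0
    def step(self, t, state):
        self.delivered += len(state.pop("pending_spikes", []))
        state["pending_spikes"] = []

class Monitor(Component):
    """Records a trace of the simulated variable at every step."""
    def __init__(self):
        self.trace = []
    def step(self, t, state):
        self.trace.append((t, state["v"]))

def run(components, dt=0.1, n_steps=10):
    """Replacement main loop: advance the toy model, then let each component run."""
    state = {"v": -65.0, "pending_spikes": []}
    for i in range(n_steps):
        state["v"] += dt * 1.0          # toy update in place of NEURON's solver
        if state["v"] > -64.45:         # toy spike condition
            state["pending_spikes"] = ["cell0"]
        for c in components:
            c.step(i * dt, state)
    return state

exchange = SpikeExchange()
monitor = Monitor()
final = run([exchange, monitor])
```

Because the loop only knows the Component interface, the spike-exchange strategy can be swapped and extra monitors plugged in without touching the integrator, which is the architectural point the abstract makes.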
Framework for Analysis of Mitigation in Courts
2005-01-01
Presents a framework for a pragmatic analysis of mitigation in courts. The study focuses on discursive acts and aspects of discursive acts the purpose…use it is associated with coping (Lazarus, 1999) with (inevitable) negative events or experiences. It is also linked to studies of pragmatics…The concept of 'hedges' or 'metalinguistic operators' such as 'more or less', 'like', 'sort of' etc. in turn originated in studies on the fuzzy-set
Enterprise application architecture development based on DoDAF and TOGAF
NASA Astrophysics Data System (ADS)
Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng
2017-05-01
For the purpose of supporting the design and analysis of enterprise application architecture, here, we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on architecture content framework (ACF), DoDAF metamodel (DM2) and Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationship among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.
Disgust: Evolved Function and Structure
ERIC Educational Resources Information Center
Tybur, Joshua M.; Lieberman, Debra; Kurzban, Robert; DeScioli, Peter
2013-01-01
Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and that of information processing. Although there is…
Networked Learning for Agricultural Extension: A Framework for Analysis and Two Cases
ERIC Educational Resources Information Center
Kelly, Nick; Bennett, John McLean; Starasts, Ann
2017-01-01
Purpose: This paper presents economic and pedagogical motivations for adopting information and communications technology (ICT)- mediated learning networks in agricultural education and extension. It proposes a framework for networked learning in agricultural extension and contributes a theoretical and case-based rationale for adopting the…
Structure and Strength in Causal Induction
ERIC Educational Resources Information Center
Griffiths, Thomas L.; Tenenbaum, Joshua B.
2005-01-01
We present a framework for the rational analysis of elemental causal induction--learning about the existence of a relationship between a single cause and effect--based upon causal graphical models. This framework makes precise the distinction between causal structure and causal strength: the difference between asking whether a causal relationship…
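Causal strength measures of the kind this framework distinguishes from structural inference can be computed directly from a contingency table. The sketch below implements Delta-P and Cheng's causal power; the Bayesian "causal support" measure for structure requires model comparison and is omitted here. The trial counts are invented.

```python
def delta_p(n_ce, n_c, n_e_noc, n_noc):
    """Delta-P: P(effect | cause) - P(effect | no cause)."""
    return n_ce / n_c - n_e_noc / n_noc

def causal_power(n_ce, n_c, n_e_noc, n_noc):
    """Cheng's causal power: Delta-P rescaled by the room left for the cause to act."""
    p_background = n_e_noc / n_noc
    return delta_p(n_ce, n_c, n_e_noc, n_noc) / (1.0 - p_background)

# Contingency data: effect in 6 of 8 cause-present trials, 2 of 8 cause-absent
dp = delta_p(6, 8, 2, 8)
power = causal_power(6, 8, 2, 8)
```

Both quantities answer the strength question (how strong is the relationship?), whereas the structure question (is there a relationship at all?) calls for comparing causal graphs.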
A Dual-Identity Framework for Understanding Lesbian Experience
ERIC Educational Resources Information Center
Fingerhut, Adam W.; Peplau, Letitia Anne; Ghavami, Negin
2005-01-01
The diverse life experiences of contemporary lesbians are shaped by women's differing ties to two social worlds, the majority heterosexual society and the minority subculture of the lesbian or sexual-minority world. This article presents a detailed conceptual analysis of a dual-identity framework that emphasizes lesbians' simultaneous affiliations…
Global/local methods research using the CSM testbed
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. Hayden, Jr.; Thompson, Danniella M.
1990-01-01
Research activities in global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
A Three-Dimensional Analysis of Black Leadership.
ERIC Educational Resources Information Center
McDaniel, Clyde O., Jr.; Balgopal, Pallassana R.
This book presents an analysis of black leadership from three perspectives: theoretical, historical, and empirical. After deducing the situational-interactional approach as a useful framework, the authors analyze black leadership from 1841 to the present. This period is divided into six time periods, and black leadership and the strategies used by…
Analysis of poetic literature using B. F. Skinner's theoretical framework from verbal behavior
Luke, Nicole M.
2003-01-01
This paper examines Skinner's work on verbal behavior in the context of literature as a particular class of written verbal behavior. It looks at contemporary literary theory and analysis and the contributions that Skinner's theoretical framework can make. Two diverse examples of poetic literature are chosen and analyzed following Skinner's framework, examining the dynamic interplay between the writer and reader that takes place within the bounds of the work presented. It is concluded that Skinner's hypotheses about verbal behavior and the functional approach to understanding it have much to offer literary theorists in their efforts to understand literary works and should be more carefully examined.
Rachid, G; El Fadel, M
2013-08-15
This paper presents a SWOT analysis of SEA systems in the Middle East North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA system through open-ended surveys. The results show a heterogeneous status, with generally delayed progress characterized by varied levels of weaknesses embedded in the legal and administrative frameworks and poor integration with the decision-making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. Copyright © 2013 Elsevier Ltd. All rights reserved.
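The MADM performance-analysis step can be illustrated with simple additive weighting, one common multi-attribute aggregation. The countries, criteria, weights and scores below are invented placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical scores (0-4) of three countries on four illustrative
# evaluation criteria: legal basis, institutional capacity,
# procedural clarity, integration with decision making
weights = np.array([0.3, 0.2, 0.2, 0.3])   # assumed criteria weights
scores = np.array([
    [3, 2, 2, 1],   # country A
    [1, 1, 2, 1],   # country B
    [4, 3, 3, 2],   # country C
])

overall = scores @ weights               # simple additive weighting
ranking = np.argsort(overall)[::-1]      # indices of countries, best first
```

A weighted sum is the simplest MADM aggregation; methods such as TOPSIS or AHP would change how weights and scores are derived, not the basic matrix-times-weights structure shown here.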
Health systems strengthening: a common classification and framework for investment analysis
Shakarishvili, George; Lansang, Mary Ann; Mitta, Vinod; Bornemisza, Olga; Blakley, Matthew; Kley, Nicole; Burgess, Craig; Atun, Rifat
2011-01-01
Significant scale-up of donors’ investments in health systems strengthening (HSS), and the increased application of harmonization mechanisms for jointly channelling donor resources in countries, necessitate the development of a common framework for tracking donors’ HSS expenditures. Such a framework would make it possible to comparatively analyse donors’ contributions to strengthening specific aspects of countries’ health systems in multi-donor-supported HSS environments. Four pre-requisite factors are required for developing such a framework: (i) harmonization of conceptual and operational understanding of what constitutes HSS; (ii) development of a common set of criteria to define health expenditures as contributors to HSS; (iii) development of a common HSS classification system; and (iv) harmonization of HSS programmatic and financial data to allow for inter-agency comparative analyses. Building on the analysis of these aspects, the paper proposes a framework for tracking donors’ investments in HSS, as a departure point for further discussions aimed at developing a commonly agreed approach. Comparative analysis of financial allocations by the Global Fund to Fight AIDS, Tuberculosis and Malaria and the GAVI Alliance for HSS, as an illustrative example of applying the proposed framework in practice, is also presented. PMID:20952397
Augmenting breath regulation using a mobile driven virtual reality therapy framework.
Abushakra, Ahmad; Faezipour, Miad
2014-05-01
This paper presents a conceptual framework of a virtual reality therapy to assist individuals, especially lung cancer patients or those with breathing disorders, in regulating their breath through real-time analysis of respiration movements using a smartphone. Virtual reality technology is an attractive means for medical simulations and treatment, particularly for patients with cancer. The theories, methodologies and approaches, and the real-world dynamic contents for all components of this virtual reality therapy (VRT) framework on the smartphone will be discussed. The architecture and technical aspects of the offshore platform of the virtual environment will also be presented.
Breaking the Code: Engaging Practitioners in Critical Analysis of Adult Educational Literature.
ERIC Educational Resources Information Center
Brookfield, Stephen
1993-01-01
Presents a framework for critical reflection on adult education literature in four sets of questions: methodological, experiential, communicative, and political. Addresses reasons for student resistance to critical analysis. (SK)
An evolutionary approach to the group analysis of global geophysical data
NASA Technical Reports Server (NTRS)
Vette, J. I.
1979-01-01
The coordinated data analysis that developed within the International Magnetospheric Study is presented. A tracing of its development, along with various activities taking place within this framework, is reported.
Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms
Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon
2011-01-01
Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532
Master of Puppets: Cooperative Multitasking for In Situ Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-01-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
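The cooperative-multitasking idea can be mimicked in a few lines of Python using generators, which behave like coroutines. This is only an analogue of the control transfer Henson implements with coroutines and position-independent executables in compiled codes, not its API; the simulation and analysis bodies are toys.

```python
def simulation(n_steps):
    """Toy simulation: yields each time step instead of writing it to disk."""
    state = 0.0
    for step in range(n_steps):
        state += 1.0          # stand-in for advancing the model
        yield step, state     # suspend here; control passes to the consumer

def analysis(stream):
    """In situ analysis: consumes time steps while they are still in memory."""
    totals = []
    for step, state in stream:
        totals.append(state)  # control returns to the simulation afterwards
    return totals

results = analysis(simulation(5))   # the two codes interleave cooperatively
```

Each yield suspends the simulation and resumes the analysis on the data still in memory, exactly the alternation that avoids writing every time step to storage.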
English Language Teachers' Burnout within the Cultural Dimensions Framework
ERIC Educational Resources Information Center
Saboori, Fahime; Pishghadam, Reza
2016-01-01
The aim of the present study was to explore burnout among Iranian English as a Foreign Language (EFL) teachers within Hofstede's cultural framework. To this end, first, multiple correspondence analysis was run, the results of which revealed a significant relationship between the cultural dimensions and the burnout components. Next, multiple…
Mediation Analysis in a Latent Growth Curve Modeling Framework
ERIC Educational Resources Information Center
von Soest, Tilmann; Hagtvet, Knut A.
2011-01-01
This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…
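The product-of-coefficients logic at the core of such mediation models can be sketched on cross-sectional data. The example below is a simplification, not a latent growth curve model: it estimates the indirect effect a*b from two regressions on synthetic data with known, invented path values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Synthetic data with known paths: X -> M (a = 0.5), M -> Y (b = 0.7),
# plus a direct effect c' = 0.2 of X on Y
x = rng.normal(size=n)
m = 0.5 * x + 0.2 * rng.normal(size=n)
y = 0.7 * m + 0.2 * x + 0.1 * rng.normal(size=n)

def ols(X, y):
    """Least-squares coefficients, intercept column prepended."""
    X = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(x, m)[1]                          # path X -> M
b = ols(np.column_stack([m, x]), y)[1]    # path M -> Y, controlling for X
indirect = a * b                          # mediated (indirect) effect
```

In the latent growth curve setting the article discusses, the same a and b paths connect growth factors (intercepts and slopes) rather than observed variables, but the indirect effect is still their product.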
Design-Based Research: Case of a Teaching Sequence on Mechanics
ERIC Educational Resources Information Center
Tiberghien, Andree; Vince, Jacques; Gaidioz, Pierre
2009-01-01
Design-based research, and particularly its theoretical status, is a subject of debate in the science education community. In the first part of this paper, a theoretical framework drawn up to develop design-based research will be presented. This framework is mainly based on epistemological analysis of physics modelling, learning and teaching…
Complexity Framework for Sustainability: An Analysis of Five Papers
ERIC Educational Resources Information Center
Putnik, Goran D.
2009-01-01
Purpose: The purpose of this paper is to present an examination of the concepts and mechanisms of complexity and learning usability and applicability for management in turbulent environments as well as their examination through the Chaordic system thinking (CST) lenses and framework. Contributing to awareness of how different mechanisms could be…
Big data analysis framework for healthcare and social sectors in Korea.
Song, Tae-Min; Ryu, Seewon
2015-01-01
We reviewed applications of big data analysis in the healthcare and social service sectors in developed countries, and subsequently devised a framework for such analysis in Korea. We reviewed the status of implementing big data analysis of healthcare and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Developed countries (e.g., the United States, the UK, Singapore and Australia, as well as the OECD and EU) are emphasizing the potential of big data and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on the ICT-based policy of the current government and the strategic goals of the Ministry of Health and Welfare. We suggest separate frameworks of big data analysis for the healthcare and welfare service sectors and assign them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. There are some concerns with the utilization of big data in the healthcare and social welfare sectors; research on these issues must be conducted so that sophisticated and practical solutions can be reached.
Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky
2013-10-01
There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix, comprising the five domains and 39 constructs of the Framework, was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.
Nonlinear analysis of structures. [within framework of finite element method
NASA Technical Reports Server (NTRS)
Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.
1974-01-01
The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is on nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.
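The iterative solution strategy typical of such nonlinear analyses can be illustrated on a single degree of freedom. The sketch below applies Newton-Raphson with a tangent stiffness to a hardening spring; the model and all numbers are illustrative only, not taken from the paper.

```python
def solve_nonlinear_spring(load, k=100.0, alpha=50.0, tol=1e-10, max_iter=50):
    """Newton-Raphson solution of the hardening spring k*u + alpha*u**3 = load.

    A one-degree-of-freedom stand-in for the incremental-iterative
    procedures used in nonlinear finite-element analysis.
    """
    u = 0.0
    for _ in range(max_iter):
        residual = load - (k * u + alpha * u ** 3)
        if abs(residual) < tol:
            break
        k_tangent = k + 3.0 * alpha * u ** 2   # tangent stiffness
        u += residual / k_tangent
    return u

u = solve_nonlinear_spring(load=150.0)   # exact solution is u = 1
```

In a finite-element code the scalar tangent stiffness becomes the assembled tangent stiffness matrix and the update is a linear solve, but the residual-driven iteration is the same.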
2010-12-14
Robotics Research Institute Auditorium, Riverbend Campus, Fort Worth, TX. ...for HP and HSI. Format: Participants were presented with information on frameworks and the benefits they can have in support of the various roles... benefit from understanding human performance. She concluded with the objectives of the workshop: 1. Evaluate strategic frameworks for representing
Trends in Distance Education: A Content Analysis of Master's Theses
ERIC Educational Resources Information Center
Durak, Gürhan; Çankaya, Serkan; Yunkul, Eyup; Urfa, Mehmet; Toprakliklioglu, Kivanç; Arda, Yagmur; Inam, Nazmiye
2017-01-01
The present study aimed at presenting the results of content analysis on Master's Theses carried out in the field of distance education at higher education level in Turkey between 1986 and 2015. A total of 285 Master's Theses were examined to determine the key words, academic disciplines, research areas, theoretical frameworks, research designs…
Pneumothorax detection in chest radiographs using local and global texture signatures
NASA Astrophysics Data System (ADS)
Geva, Ofer; Zimmerman-Moreno, Gali; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit
2015-03-01
A novel framework for automatic detection of pneumothorax abnormality in chest radiographs is presented. The suggested method is based on a texture analysis approach combined with supervised learning techniques. The proposed framework consists of two main steps: first, a texture analysis process is performed for detection of local abnormalities. Labeled image patches are extracted in the texture analysis procedure, following which local analysis values are incorporated into a novel global image representation. The global representation is used for training and detection of the abnormality at the image level. The presented global representation is designed based on the distinctive shape of the lung, taking into account the characteristics of typical pneumothorax abnormalities. A supervised learning process was performed on both the local and global data, leading to a trained detection system. The system was tested on a dataset of 108 upright chest radiographs. Several state-of-the-art texture feature sets (Local Binary Patterns, Maximum Response filters) were evaluated. The optimal configuration yielded a sensitivity of 81% with a specificity of 87%. The results of the evaluation are promising, establishing the current framework as a basis for additional improvements and extensions.
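The local texture signatures mentioned above can be illustrated with a basic Local Binary Pattern (LBP) computation; the following sketch, using only NumPy and not the authors' actual feature pipeline, derives the per-patch LBP histograms that a supervised classifier could be trained on:

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbour LBP codes for the interior pixels of a grayscale image."""
    c = img[1:-1, 1:-1]
    codes = np.zeros_like(c, dtype=np.uint8)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(shifts):
        # neighbour plane shifted by (dy, dx); set the bit where neighbour >= centre
        n = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((n >= c).astype(np.uint8) << bit)
    return codes

def patch_histogram(codes):
    """Normalised 256-bin LBP histogram -- the local texture signature of a patch."""
    h = np.bincount(codes.ravel(), minlength=256).astype(float)
    return h / h.sum()
```

Such histograms, computed per labeled patch, form the feature vectors for the local supervised-learning step; the global representation then aggregates the local outputs over the lung shape.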
Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.
Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon
2013-04-15
The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria, often with conflicting objectives. However, the existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA analysis involves preparing the inputs to the PROMETHEE methods, namely identifying the alternatives, defining the criteria, defining the criteria weights using the analytic hierarchy process (AHP), defining the probability distributions of the criteria weights, and conducting Monte Carlo simulation (MCS); running the PROMETHEE methods using these inputs; and conducting a sensitivity analysis. A case study was presented to demonstrate the improved framework at a mine site. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on the selection. Copyright © 2013 Elsevier Ltd. All rights reserved.
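A minimal sketch of the PROMETHEE II net-flow calculation with AHP-derived weights looks as follows; it assumes the "usual" preference function, and the pairwise comparison matrix and scores are invented for illustration:

```python
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector criteria weights from an AHP pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def promethee_ii(scores, weights, maximize):
    """PROMETHEE II net outranking flows with the 'usual' preference function.
    scores: (alternatives x criteria); a higher flow means a preferred alternative."""
    n, m = scores.shape
    s = np.where(maximize, scores, -scores)        # unify criterion direction
    phi = np.zeros(n)
    for j in range(m):
        d = s[:, j][:, None] - s[:, j][None, :]    # pairwise differences
        pref = (d > 0).astype(float)               # 1 if strictly better
        phi += weights[j] * (pref.sum(axis=1) - pref.sum(axis=0)) / (n - 1)
    return phi

weights = ahp_weights(np.array([[1.0, 2.0], [0.5, 1.0]]))  # consistent 2x2 example
phi = promethee_ii(np.array([[3.0, 3.0], [1.0, 2.0], [2.0, 1.0]]),
                   weights, np.array([True, True]))
```

The probabilistic variant of the framework would replace the fixed `weights` with draws from a distribution over criteria weights inside a Monte Carlo loop.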
Wang, Mingming; Sweetapple, Chris; Fu, Guangtao; Farmani, Raziyeh; Butler, David
2017-10-01
This paper presents a new framework for decision making in sustainable drainage system (SuDS) scheme design. It integrates resilience, hydraulic performance, pollution control, rainwater usage, energy analysis, greenhouse gas (GHG) emissions and costs, and has 12 indicators. The multi-criteria analysis methods of entropy weight and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) were selected to support SuDS scheme selection. The effectiveness of the framework is demonstrated with a SuDS case in China. Indicators used include flood volume, flood duration, a hydraulic performance indicator, cost and resilience. Resilience is an important design consideration, and it supports scheme selection in the case study. The proposed framework will help a decision maker to choose an appropriate design scheme for implementation without subjectivity. Copyright © 2017 Elsevier Ltd. All rights reserved.
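The entropy-weight and TOPSIS steps named above follow the usual textbook formulation and can be sketched compactly; the indicator matrix used here is invented, not the case-study data:

```python
import numpy as np

def entropy_weights(X):
    """Objective criteria weights from the Shannon entropy of the decision matrix."""
    P = X / X.sum(axis=0)                       # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        e = -np.nansum(P * np.log(P), axis=0) / np.log(n)
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()

def topsis(X, w, benefit):
    """TOPSIS closeness coefficients; higher means closer to the ideal scheme."""
    R = X / np.linalg.norm(X, axis=0)           # vector normalisation
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - worst, axis=1)
    return d_neg / (d_pos + d_neg)
```

With the framework's 12 indicators, `X` would hold one row per candidate SuDS scheme and one column per indicator, with `benefit` marking which indicators are to be maximised (e.g., resilience) versus minimised (e.g., cost, flood volume).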
Structure-Specific Statistical Mapping of White Matter Tracts
Yushkevich, Paul A.; Zhang, Hui; Simon, Tony; Gee, James C.
2008-01-01
We present a new model-based framework for the statistical analysis of diffusion imaging data associated with specific white matter tracts. The framework takes advantage of the fact that several of the major white matter tracts are thin sheet-like structures that can be effectively modeled by medial representations. The approach involves segmenting major tracts and fitting them with deformable geometric medial models. The medial representation makes it possible to average and combine tensor-based features along directions locally perpendicular to the tracts, thus reducing data dimensionality and accounting for errors in normalization. The framework enables the analysis of individual white matter structures, and provides a range of possibilities for computing statistics and visualizing differences between cohorts. The framework is demonstrated in a study of white matter differences in pediatric chromosome 22q11.2 deletion syndrome. PMID:18407524
A framework for the design and development of physical employment tests and standards.
Payne, W; Harvey, J
2010-07-01
Because operational tasks in the uniformed services (military, police, fire and emergency services) are physically demanding and incur the risk of injury, employment policy in these services is usually competency based and predicated on objective physical employment standards (PESs) based on physical employment tests (PETs). In this paper, a comprehensive framework for the design of PETs and PESs is presented. Three broad approaches to physical employment testing are described and compared: generic predictive testing; task-related predictive testing; task simulation testing. Techniques for the selection of a set of tests with good coverage of job requirements, including job task analysis, physical demands analysis and correlation analysis, are discussed. Regarding individual PETs, theoretical considerations including measurability, discriminating power, reliability and validity, and practical considerations, including development of protocols, resource requirements, administrative issues and safety, are considered. With regard to the setting of PESs, criterion referencing and norm referencing are discussed. STATEMENT OF RELEVANCE: This paper presents an integrated and coherent framework for the development of PESs and hence provides a much needed theoretically based but practically oriented guide for organisations seeking to establish valid and defensible PESs.
NASA Astrophysics Data System (ADS)
Hughes, Allen A.
1994-12-01
Public safety can be enhanced through the development of a comprehensive medical device risk management system. This can be accomplished through case studies using a framework that incorporates cost-benefit analysis in the evaluation of risk management attributes. This paper presents a framework for evaluating the risk management system for regulatory Class III medical devices. The framework consists of the following sixteen attributes of a comprehensive medical device risk management system: fault/failure analysis, premarket testing/clinical trials, post-approval studies, manufacturer sponsored hospital studies, product labeling, establishment inspections, problem reporting program, mandatory hospital reporting, medical literature surveillance, device/patient registries, device performance monitoring, returned product analysis, autopsy program, emergency treatment funds/interim compensation, product liability, and alternative compensation mechanisms. Review of performance histories for several medical devices can reveal the value of information for many attributes, and also the inter-dependencies of the attributes in generating risk information flow. Such an information flow network is presented as a starting point for enhancing medical device risk management by focusing on attributes with high net benefit values and potential to spur information dissemination.
Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S
2013-01-01
Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
ERIC Educational Resources Information Center
Fenollar, Pedro; Roman, Sergio; Cuestas, Pedro J.
2007-01-01
Background: The prediction and explanation of academic performance and the investigation of the factors relating to the academic success and persistence of students are topics of utmost importance in higher education. Aims: The main aim of the present study is to develop and test a conceptual framework in a university context, where the effects of…
ERIC Educational Resources Information Center
Murray, Victor; Jick, Todd D.
This paper presents a conceptual framework for analyzing the impact of funding cutbacks on human services organizations (HSOs). HSOs include publicly-funded educational, health, welfare, and cultural organizations. The framework identifies five categories of variables which influence an organization's reaction to cutbacks. Category one, "objective…
The chronic care model versus disease management programs: a transaction cost analysis approach.
Leeman, Jennifer; Mark, Barbara
2006-01-01
The present article applies transaction cost analysis as a framework for better understanding health plans' decisions to improve chronic illness management by using disease management programs versus redesigning care within physician practices.
Near-miss incident management in the chemical process industry.
Phimister, James R; Oktem, Ulku; Kleindorfer, Paul R; Kunreuther, Howard
2003-06-01
This article provides a systematic framework for the analysis and improvement of near-miss programs in the chemical process industries. Near-miss programs improve corporate environmental, health, and safety (EHS) performance through the identification and management of near misses. Based on more than 100 interviews at 20 chemical and pharmaceutical facilities, a seven-stage framework has been developed and is presented herein. The framework enables sites to analyze their own near-miss programs, identify weak management links, and implement systemwide improvements.
Mitchell, Brett G; Gardner, Anne
2014-03-01
To present a discussion on theoretical frameworks in infection prevention and control. Infection prevention and control programmes have been in place for several years in response to the incidence of healthcare-associated infections and their associated morbidity and mortality. Theoretical frameworks play an important role in formalizing the understanding of infection prevention activities. Discussion paper. A literature search using electronic databases was conducted for published articles in English addressing theoretical frameworks in infection prevention and control between 1980 and 2012. Nineteen papers that included a reference to frameworks were identified in the review. A narrative analysis of these papers was completed. Two models were identified and neither included the role of surveillance. To reduce the risk of acquiring a healthcare-associated infection, a multifaceted approach to infection prevention is required. One key component in this approach is surveillance. The review identified two infection prevention and control frameworks, yet these are rarely applied in infection prevention and control programmes. Only one framework considered the multifaceted approach required for infection prevention. It did not, however, incorporate the role of surveillance. We present a framework that incorporates the role of surveillance into a biopsychosocial approach to infection prevention and control. Infection prevention and control programmes and associated research are led primarily by nurses. There is a need for an explicit infection prevention and control framework incorporating the important role that surveillance has in infection prevention activities. This study presents one framework for further critique and discussion. © 2013 John Wiley & Sons Ltd.
An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.
Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong
2016-01-01
With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.
An ICT Adoption Framework for Education: A Case Study in Public Secondary School of Indonesia
NASA Astrophysics Data System (ADS)
Nurjanah, S.; Santoso, H. B.; Hasibuan, Z. A.
2017-01-01
This paper presents preliminary research findings on an ICT adoption framework for education. Although many studies have been conducted on ICT adoption frameworks in education in various countries, they lack analysis of the degree to which each component contributes to the success of the framework. In this paper a set of components linked to ICT adoption in education is identified based on the literature and exploratory analysis. The components are Infrastructure, Application, User Skills, Utilization, Finance, and Policy. The components are used as a basis to develop a questionnaire to capture the current state of ICT adoption in schools. The questionnaire data are processed using Structural Equation Modeling (SEM). The results show that each component contributes differently to the ICT adoption framework. Finance has the strongest effect on Infrastructure readiness, whilst User Skills has the strongest effect on Utilization. The study concludes that the development of an ICT adoption framework should consider the contribution weights of the components, which can be used to guide the implementation of ICT adoption in education.
Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih
2015-01-01
The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156
Analytical method of waste allocation in waste management systems: Concept, method and case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergeron, Francis C., E-mail: francis.b.c@videotron.ca
Waste is no longer simply a rejected item to be disposed of but increasingly a secondary resource to exploit, which influences how waste is allocated among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the "analytical method of the waste allocation process" (AMWAP), based on the concept of the "waste allocation process", defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP, which serves to develop the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.
Big Data Analysis Framework for Healthcare and Social Sectors in Korea
Song, Tae-Min
2015-01-01
Objectives We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. Methods We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Results Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. Conclusions There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached. PMID:25705552
Language-Based Curriculum Analysis: A Collaborative Assessment and Intervention Process.
ERIC Educational Resources Information Center
Prelock, Patricia A.
1997-01-01
Presents a systematic process for completing a language-based curriculum analysis to address curriculum expectations that may challenge students with communication impairments. Analysis of vocabulary and the demands for comprehension, oral, and written expression within specific content areas provides a framework for collaboration between teachers…
Invitation to Consumer Behavior Analysis
ERIC Educational Resources Information Center
Foxall, Gordon R.
2010-01-01
This article presents an introduction to consumer behavior analysis by describing the Behavioral Perspective Model of consumer choice and showing how research has, first, confirmed this framework and, second, opened up behavior analysis and behavioral economics to the study of consumer behavior in natural settings. It concludes with a discussion…
The Effect of Framework Design on Stress Distribution in Implant-Supported FPDs: A 3-D FEM Study
Eraslan, Oguz; Inan, Ozgur; Secilmis, Asli
2010-01-01
Objectives: The biomechanical behavior of the superstructure plays an important role in the functional longevity of dental implants. However, information about the influence of framework design on stresses transmitted to the implants and supporting tissues is limited. The purpose of this study was to evaluate the effects of framework designs on stress distribution at the supporting bone and supporting implants. Methods: In this study, the three-dimensional (3D) finite element stress analysis method was used. Three types of 3D mathematical models simulating three different framework designs for implant-supported 3-unit posterior fixed partial dentures were prepared with supporting structures. Convex (1), concave (2), and conventional (3) pontic framework designs were simulated. A 300-N static vertical occlusal load was applied on the node at the center of occlusal surface of the pontic to calculate the stress distributions. As a second condition, frameworks were directly loaded to evaluate the effect of the framework design clearly. The Solidworks/Cosmosworks structural analysis programs were used for finite element modeling/analysis. Results: The analysis of the von Mises stress values revealed that maximum stress concentrations were located at the loading areas for all models. The pontic side marginal edges of restorations and the necks of implants were other stress concentration regions. There was no clear difference among models when the restorations were loaded at occlusal surfaces. When the veneering porcelain was removed, and load was applied directly to the framework, there was a clear increase in stress concentration with a concave design on supporting implants and bone structure. Conclusions: The present study showed that the use of a concave design in the pontic frameworks of fixed partial dentures increases the von Mises stress levels on implant abutments and supporting bone structure. 
However, the veneering porcelain element reduces the effect of the framework and compensates for design weaknesses. PMID:20922156
Static Analysis of Mobile Programs
2017-02-01
information flow analysis has the potential to significantly aid human auditors, but it is handicapped by high false positive rates. Instead, auditors ...presents these specifications to a human auditor for validation. We have implemented this framework for a taint analysis of Android apps that relies on...of queries to a human auditor. 6.4 Inferring Library Information Flow Specifications Using Dynamic Analysis In [15], we present a technique to mine
ERIC Educational Resources Information Center
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
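The normal-distribution power computation described can be sketched in a few lines; `NormalDist` plays the role of Excel's NORM.S.DIST/NORM.S.INV worksheet functions, and the example parameters are illustrative:

```python
from math import sqrt
from statistics import NormalDist

def z_test_power(mu0, mu1, sigma, n, alpha=0.05):
    """Power of a two-sided one-sample z-test, computed entirely from the
    standard normal distribution (the textbook setting described above)."""
    z = NormalDist()
    crit = z.inv_cdf(1.0 - alpha / 2.0)           # critical value (NORM.S.INV)
    shift = (mu1 - mu0) / (sigma / sqrt(n))       # standardised effect size
    # probability of rejecting H0 when the true mean is mu1 (NORM.S.DIST)
    return (1.0 - z.cdf(crit - shift)) + z.cdf(-crit - shift)
```

For example, `z_test_power(0.0, 0.5, 1.0, 25)` gives the power to detect a half-standard-deviation shift with 25 observations; when the true mean equals the hypothesised mean, the function returns the significance level itself.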
Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.
2011-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecast of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present studies in the area of climate and environmental change require special software support. A dedicated software framework for rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of 3 basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed.
Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA-Interim Reanalysis, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already used in scientific research; in particular, it was recently applied to the analysis of Siberian climate change and its impact in the region. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis, thus providing specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.
Comparative and Predictive Multimedia Assessments Using Monte Carlo Uncertainty Analyses
NASA Astrophysics Data System (ADS)
Whelan, G.
2002-05-01
Multiple-pathway frameworks (sometimes referred to as multimedia models) provide a platform for combining medium-specific environmental models and databases so that they can be utilized in a more holistic assessment of contaminant fate and transport in the environment. These frameworks provide a relatively seamless transfer of information from one model to the next and from databases to models. Within these frameworks, multiple models are linked, resulting in models that consume information from upstream models and produce information to be consumed by downstream models. The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) is an example, which allows users to link their models to other models and databases. FRAMES is an icon-driven, site-layout platform that is an open-architecture, object-oriented system that interacts with environmental databases; helps the user construct a Conceptual Site Model that is real-world based; allows the user to choose the most appropriate models to solve simulation requirements; solves the standard risk paradigm of release, transport and fate, and exposure/risk assessment for people and ecology; and presents graphical packages for analyzing results. FRAMES is specifically designed to allow users to link their own models into a system which contains models developed by others. This paper will present the use of FRAMES to evaluate potential human health exposures using real site data and realistic assumptions from sources, through the vadose and saturated zones, to exposure and risk assessment at three real-world sites, using the Multimedia Environmental Pollutant Assessment System (MEPAS), a multimedia model contained within FRAMES. These real-world examples use predictive and comparative approaches coupled with a Monte Carlo analysis.
In a predictive analysis, models are calibrated to monitored site data prior to the assessment; in a comparative analysis, models are not calibrated but are based solely on literature values or judgement, and are typically used to compare alternatives. In many cases, a combination is employed where the model is calibrated to a portion of the data (e.g., to determine hydrodynamics), then used to compare alternatives. Three subsurface-based multimedia examples are presented, increasing in complexity. The first presents the application of a predictive, deterministic assessment; the second presents a predictive and comparative, Monte Carlo analysis; and the third presents a comparative, multi-dimensional Monte Carlo analysis. Endpoints are typically presented in terms of concentration, hazard, risk, and dose, and because the vadose zone model typically represents a connection between a source and the aquifer, it does not generally represent the final medium in a multimedia risk assessment.
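The predictive versus comparative distinction described in this abstract can be sketched with a toy Monte Carlo comparison of two remediation alternatives. The one-dimensional transport model, parameter ranges, and alternative names below are illustrative assumptions, not MEPAS or FRAMES code:

```python
import random

random.seed(1)

def concentration(leach_rate, velocity, distance, t):
    """Toy 1-D advective transport: contaminant arrives downgradient once
    the plume front (velocity * t) has traveled the given distance."""
    return leach_rate * t if velocity * t >= distance else 0.0

def monte_carlo_risk(leach_rates, velocity, distance, t, n=10_000):
    """Propagate uncertainty in the (uncalibrated) leach rate through the
    toy model; return the mean and 95th-percentile concentration."""
    lo, hi = leach_rates
    samples = [concentration(random.uniform(lo, hi), velocity, distance, t)
               for _ in range(n)]
    samples.sort()
    return sum(samples) / n, samples[int(0.95 * n)]

# Comparative mode: two alternatives run with the same literature-based
# parameter ranges and judged relative to each other, not against data.
base = monte_carlo_risk(leach_rates=(0.5, 1.5), velocity=2.0, distance=10.0, t=20.0)
capped = monte_carlo_risk(leach_rates=(0.1, 0.3), velocity=2.0, distance=10.0, t=20.0)
print(f"baseline mean/p95: {base}")
print(f"capped   mean/p95: {capped}")
```

In a predictive run, the leach-rate range would first be narrowed by calibration against monitored concentrations before the same propagation step is applied.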
NASA Astrophysics Data System (ADS)
Manstetten, Paul; Filipovic, Lado; Hössinger, Andreas; Weinbub, Josef; Selberherr, Siegfried
2017-02-01
We present a computationally efficient framework to compute the neutral flux in high aspect ratio structures during three-dimensional plasma etching simulations. The framework is based on a one-dimensional radiosity approach and is applicable to simulations of convex rotationally symmetric holes and convex symmetric trenches with a constant cross-section. The framework is intended to replace the full three-dimensional simulation step required to calculate the neutral flux during plasma etching simulations. Especially for high aspect ratio structures, the computational effort, required to perform the full three-dimensional simulation of the neutral flux at the desired spatial resolution, conflicts with practical simulation time constraints. Our results are in agreement with those obtained by three-dimensional Monte Carlo based ray tracing simulations for various aspect ratios and convex geometries. With this framework we present a comprehensive analysis of the influence of the geometrical properties of high aspect ratio structures as well as of the particle sticking probability on the neutral particle flux.
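The one-dimensional radiosity idea can be illustrated with a toy flux balance along the sidewall of a hole. The exponential direct-flux profile and wall-to-wall transfer kernel below are assumed stand-ins for the exact cylinder view factors used by the authors, so this is a qualitative sketch only:

```python
import numpy as np

def neutral_flux_1d(aspect_ratio, sticking, n=200):
    """Toy 1-D radiosity balance for the neutral flux along the sidewall
    of a rotationally symmetric hole of unit radius and depth aspect_ratio.
    The direct flux and the transfer kernel are assumed exponentials,
    not the exact cylinder view factors."""
    z = np.linspace(0.0, aspect_ratio, n)       # depth along the sidewall
    direct = np.exp(-z)                         # assumed direct flux from the opening
    K = np.exp(-np.abs(z[:, None] - z[None, :]))
    K /= K.sum(axis=1, keepdims=True)           # row-normalize: conservative redistribution
    # Radiosity balance phi = direct + (1 - s) * K @ phi: each wall segment
    # re-emits the non-sticking fraction of the flux it receives.
    phi = np.linalg.solve(np.eye(n) - (1.0 - sticking) * K, direct)
    return z, phi

for ar in (5.0, 20.0):
    z, phi = neutral_flux_1d(ar, sticking=0.2)
    print(f"aspect ratio {ar:4.0f}: flux at the bottom = {phi[-1]:.3e}")
```

Because the row-normalized kernel makes the re-emission operator a contraction for any sticking probability above zero, the linear solve always has a unique positive solution, mirroring the convergence of the radiosity iteration.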
Clark, Phillip G; Cott, Cheryl; Drinka, Theresa J K
2007-12-01
Interprofessional teamwork is an essential and expanding form of health care practice. While moral issues arising in teamwork relative to the patient have been explored, the analysis of ethical issues regarding the function of the team itself is limited. This paper develops a conceptual framework for organizing and analyzing the different types of ethical issues in interprofessional teamwork. This framework is a matrix that maps the elements of principles, structures, and processes against individual, team, and organizational levels. A case study is presented that illustrates different dimensions of these topics, based on the application of this framework. Finally, a set of conclusions and recommendations is presented to summarize the integration of theory and practice in interprofessional ethics, including: (i) importance of a framework, (ii) interprofessional ethics discourse, and (iii) interprofessional ethics as an emerging field. The goal of this paper is to begin a dialogue and discussion on the ethical issues confronting interprofessional teams and to lay the foundation for an expanding discourse on interprofessional ethics.
A Decision Support Framework for Science-Based, Multi-Stakeholder Deliberation: A Coral Reef Example
NASA Astrophysics Data System (ADS)
Rehr, Amanda P.; Small, Mitchell J.; Bradley, Patricia; Fisher, William S.; Vega, Ann; Black, Kelly; Stockton, Tom
2012-12-01
We present a decision support framework for science-based assessment and multi-stakeholder deliberation. The framework consists of two parts: a DPSIR (Drivers-Pressures-States-Impacts-Responses) analysis to identify the important causal relationships among anthropogenic environmental stressors, processes, and outcomes; and a Decision Landscape analysis to depict the legal, social, and institutional dimensions of environmental decisions. The Decision Landscape incorporates interactions among government agencies, regulated businesses, non-government organizations, and other stakeholders. It also identifies where scientific information regarding environmental processes is collected and transmitted to improve knowledge about elements of the DPSIR and to improve the scientific basis for decisions. Our application of the decision support framework to coral reef protection and restoration in the Florida Keys, focusing on anthropogenic stressors such as wastewater, proved to be successful and offered several insights. Using information from a management plan, it was possible to capture the current state of the science with a DPSIR analysis, as well as important decision options, decision makers, and applicable laws with the Decision Landscape analysis. A structured elicitation of values and beliefs conducted at a coral reef management workshop held in Key West, Florida provided a diversity of opinion and also indicated a prioritization of several environmental stressors affecting coral reef health. The integrated DPSIR/Decision Landscape framework for the Florida Keys, developed from the elicited opinion and the DPSIR analysis, can be used to inform management decisions, to reveal the role that further scientific information and research might play to populate the framework, and to facilitate better-informed agreement among participants.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
A one health framework for estimating the economic costs of zoonotic diseases on society.
Narrod, Clare; Zinsstag, Jakob; Tiongco, Marites
2012-06-01
This article presents an integrated epidemiological and economic framework for assessing zoonoses using a "one health" concept. The framework allows for an understanding of the cross-sector economic impact of zoonoses using modified risk analysis and detailing a range of analytical tools. The goal of the framework is to link the analysis outputs of animal and human disease transmission models, economic impact models and evaluation of risk management options to gain improved understanding of factors affecting the adoption of risk management strategies so that investment planning includes the most promising interventions (or sets of interventions) in an integrated fashion. A more complete understanding of the costs of the disease and the costs and benefits of control measures would promote broader implementation of the most efficient and effective control measures, contributing to improved animal and human health, better livelihood outcomes for the poor and macroeconomic growth.
Ethical hot spots of combined individual and group therapy: applying four ethical systems.
Brabender, Virginia M; Fallon, April
2009-01-01
Combined therapy presents ethical quandaries that occur in individual psychotherapy and group psychotherapy, and dilemmas specifically associated with their integration. This paper examines two types of ethical frameworks (a classical principle-based framework and a set of context-based frameworks) for addressing the ethical hot spots of combined therapy: self-referral, transfer of information, and termination. The principle-based approach enables the practitioner to see what core values may be served or violated by different courses of action in combined therapy dilemmas. Yet, the therapist is more likely to do justice to the complexity and richness of the combined therapy situation by supplementing a principle analysis with three additional ethical frameworks. These approaches are: virtue ethics, feminist ethics, and casuistry. An analysis of three vignettes illustrates how these contrasting ethical models expand not only the range of features to which the therapist attends but also the array of solutions the therapist generates.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.
Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-08-06
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
This paper presents preliminary results from our ongoing work on the development of “FREIDA in Ports”: an interactive information resource and modeling framework for port communities, that may be used to enhance resilience to climate change and enable sustainable deve...
Joanna Endter-Wada; Dale J. Blahna
2011-01-01
This article presents the "Linkages to Public Land" (LPL) Framework, a general but comprehensive data-gathering and analysis approach aimed at informing citizen and agency decision making about the social environment of public land. This social assessment and planning approach identifies and categorizes various types of linkages that people have to public...
ERIC Educational Resources Information Center
Mathur, Sarup R.; Corley, Kathleen M.
2014-01-01
This article argues for the need to discuss the topic of ethics in the classroom and presents five frameworks of ethics that have been applied to education. A case analysis used in workshops with educators in the field of Special Education is described, and the benefits of sharing narratives are discussed. The authors offer suggestions, grounded…
ERIC Educational Resources Information Center
Jovanovic, Aleksandar; Jankovic, Anita; Jovanovic, Snezana Markovic; Peric, Vladan; Vitosevic, Biljana; Pavlovic, Milos
2015-01-01
The paper describes the delivery of the courses in the framework of the project implementation and presents the effect the change in the methodology had on student performance as measured by final grade. Methodology: University of Pristina piloted blended courses in 2013 under the framework of the Tempus BLATT project. The blended learning…
ATR evaluation through the synthesis of multiple performance measures
NASA Astrophysics Data System (ADS)
Bassham, Christopher B.; Klimack, William K.; Bauer, Kenneth W., Jr.
2002-07-01
This research demonstrates the application of decision analysis (DA) techniques to decisions made within Automatic Target Recognition (ATR) technology development. This work was undertaken to improve the means by which ATR technologies are evaluated. The first step in this research was to create a flexible decision analysis framework that could be applied to several decisions across different ATR programs evaluated by the Comprehensive ATR Scientific Evaluation (COMPASE) Center of the Air Force Research Laboratory (AFRL). For the purposes of this research, a single COMPASE Center representative provided the value, utility, and preference functions for the DA framework. The DA framework employs performance measures collected during ATR classification system (CS) testing to calculate value and utility scores. The authors gathered data from the Moving and Stationary Target Acquisition and Recognition (MSTAR) program to demonstrate how the decision framework could be used to evaluate three different ATR CSs. A decision-maker may use the resultant scores to gain insight into any of the decisions that occur throughout the lifecycle of ATR technologies. Additionally, a means of evaluating ATR CS self-assessment ability is presented; this represents a new criterion that emerged from this study, for which no existing evaluation metric is known.
ERIC Educational Resources Information Center
Delrio, Claudio; Ami, Zvi Ben; de Groot, Reuma; Drachmann, Raul; Ilomaki, Liisa
2008-01-01
The aim of this report is, first of all, to present the KP-Lab approach toward stakeholders in the wider framework of European policies. Secondly, the KP-Lab definition of stakeholders and the strategy to address different stakeholders' needs, concerns, and expectations are presented in the following paragraphs. The second chapter presents concrete…
Inverse problems in heterogeneous and fractured media using peridynamics
Turner, Daniel Z.; van Bloemen Waanders, Bart G.; Parks, Michael L.
2015-12-10
The following work presents an adjoint-based methodology for solving inverse problems in heterogeneous and fractured media using state-based peridynamics. We show that the inner product involving the peridynamic operators is self-adjoint. The proposed method is illustrated for several numerical examples with constant and spatially varying material parameters as well as in the context of fractures. We also present a framework for obtaining material parameters by integrating digital image correlation (DIC) with inverse analysis. This framework is demonstrated by evaluating the bulk and shear moduli for a sample of nuclear graphite using digital photographs taken during the experiment. The resulting measured values correspond well with other results reported in the literature. Lastly, we show that this framework can be used to determine the load state given observed measurements of a crack opening. Furthermore, this type of analysis has many applications in characterizing subsurface stress-state conditions given fracture patterns in cores of geologic material.
An Empirical Study on Needs Analysis of College Business English Course
ERIC Educational Resources Information Center
Wu, Yan
2012-01-01
Under the theoretical framework of needs analysis, this paper is aimed to give insights into the college business English learners' needs (including target situation needs, learning situation needs and present situation needs). The analysis of the research data has provided teachers insights into business English teaching related issues.
Decreasing the Risk of Adopting New Interactive Instructional Delivery Technologies.
ERIC Educational Resources Information Center
Dennis, Verl E.
1993-01-01
Discusses new interactive training technologies; considers risks of adopting a new technology; and presents the conceptual framework of technology life cycle analysis that provides timing information for the adoption of a new technology that should be used in addition to cost-benefit analysis and technical analysis. (LRW)
Rose, Adam; Avetisyan, Misak; Chatterjee, Samrat
2014-08-01
This article presents a framework for economic consequence analysis of terrorism countermeasures. It specifies major categories of direct and indirect costs, benefits, spillover effects, and transfer payments that must be estimated in a comprehensive assessment. It develops a spreadsheet tool for data collection, storage, and refinement, as well as estimation of the various components of the necessary economic accounts. It also illustrates the usefulness of the framework in the first assessment of the tradeoffs between enhanced security and changes in commercial activity in an urban area, with explicit attention to the role of spillover effects. The article also contributes a practical user interface to the model for emergency managers. © 2014 Society for Risk Analysis.
Uncertainty Analysis of Consequence Management (CM) Data Products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole
The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.
NASA Astrophysics Data System (ADS)
Hawkins, Donovan Lee
In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.
BeamDyn: A High-Fidelity Wind Turbine Blade Solver in the FAST Modular Framework: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Q.; Sprague, M.; Jonkman, J.
2015-01-01
BeamDyn, a Legendre-spectral-finite-element implementation of geometrically exact beam theory (GEBT), was developed to meet the design challenges associated with highly flexible composite wind turbine blades. In this paper, the governing equations of GEBT are reformulated into a nonlinear state-space form to support its coupling within the modular framework of the FAST wind turbine computer-aided engineering (CAE) tool. Different time integration schemes (implicit and explicit) were implemented and examined for wind turbine analysis. Numerical examples are presented to demonstrate the capability of this new beam solver. An example analysis of a realistic wind turbine blade, the CX-100, is also presented as validation.
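The trade-off between the implicit and explicit time integration schemes examined in the paper can be illustrated on a generic stiff linear test problem. This is a minimal sketch under the usual model equation x' = λx, not BeamDyn's GEBT equations:

```python
def explicit_euler(f, x0, dt, steps):
    """March x' = f(x) forward with explicit Euler steps."""
    x = x0
    for _ in range(steps):
        x = x + dt * f(x)
    return x

def implicit_euler_linear(lam, x0, dt, steps):
    """Implicit Euler for the linear test problem x' = lam * x:
    x_{n+1} = x_n + dt * lam * x_{n+1}  =>  x_{n+1} = x_n / (1 - dt * lam)."""
    x = x0
    for _ in range(steps):
        x = x / (1.0 - dt * lam)
    return x

lam = -1000.0            # a fast-decaying stiff mode, as in a very flexible blade
f = lambda x: lam * x
dt, steps = 0.01, 100    # dt far above the explicit stability limit 2 / |lam|
xe = explicit_euler(f, 1.0, dt, steps)
xi = implicit_euler_linear(lam, 1.0, dt, steps)
print(f"explicit Euler: |x| = {abs(xe):.3e} (unstable at this step size)")
print(f"implicit Euler: |x| = {abs(xi):.3e} (stable)")
```

The explicit scheme diverges once dt exceeds its stability limit, while the implicit scheme damps the stiff mode at any step size, which is why implicit integration is attractive for stiff structural models despite its per-step cost.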
Hassan, Sally; Huang, Hsini; Warren, Kim; Mahdavi, Behzad; Smith, David; Jong, Simcha; Farid, Suzanne S
2016-04-01
Some allogeneic cell therapies requiring a high dose of cells for large indication groups demand a change in cell expansion technology, from planar units to microcarriers in single-use bioreactors for the market phase. The aim was to model the optimal timing for making this change. A development lifecycle cash flow framework was created to examine the implications of process changes to microcarrier cultures at different stages of a cell therapy's lifecycle. Under the assumptions used in the framework, the analysis predicted that making this switch earlier in development is optimal from a total expected out-of-pocket cost perspective. From a risk-adjusted net present value view, switching at Phase I is economically competitive, but a post-approval switch can offer the highest risk-adjusted net present value, as the cost of switching is offset by initial market penetration with planar technologies. The framework can facilitate early decision-making during process development.
NASA Astrophysics Data System (ADS)
Hyater-Adams, Simone; Fracchiolla, Claudia; Finkelstein, Noah; Hinko, Kathleen
2018-06-01
Studies on physics identity are appearing more frequently and often responding to increased awareness of the underrepresentation of students of color in physics. In our broader research, we focus our efforts on understanding how racial identity and physics identity are negotiated throughout the experiences of Black physicists. In this paper, we present a Critical Physics Identity framework that can be used to examine racialized physics identity and demonstrate the utility of this framework by analyzing interviews with four physicists. Our framework draws from prior constructs of physics identity and racialized identity and provides operational definitions of six interacting dimensions. In this paper, we present the operationalized constructs, demonstrate how we use these constructs to code narrative data, as well as outline three methods of analysis that may be applied to study systems and structures and their influences on the experiences of Black students.
A Formal Framework for the Analysis of Algorithms That Recover From Loss of Separation
NASA Technical Reports Server (NTRS)
Butler, RIcky W.; Munoz, Cesar A.
2008-01-01
We present a mathematical framework for the specification and verification of state-based conflict resolution algorithms that recover from loss of separation. In particular, we propose rigorous definitions of horizontal and vertical maneuver correctness that yield horizontal and vertical separation, respectively, in a bounded amount of time. We also provide sufficient conditions for independent correctness, i.e., separation under the assumption that only one aircraft maneuvers, and for implicitly coordinated correctness, i.e., separation under the assumption that both aircraft maneuver. An important benefit of this approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).
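The notion of regaining horizontal separation within a bounded time can be sketched as a relative-motion calculation for straight-line trajectories. This is a toy check under assumed constant-velocity maneuvers, not the PVS-verified criteria of the paper:

```python
import math

def time_to_regain_separation(rel_pos, rel_vel, D, T):
    """Return the first time t in [0, T] at which the horizontal distance
    between two straight-line trajectories reaches D, or None if separation
    is not regained within T. rel_pos and rel_vel are the 2-D relative
    position and velocity of one aircraft with respect to the other."""
    px, py = rel_pos
    vx, vy = rel_vel
    c = px * px + py * py - D * D
    if c >= 0:
        return 0.0                     # already separated
    a = vx * vx + vy * vy
    if a == 0:
        return None                    # no relative motion: never separates
    # Solve |p + t v|^2 = D^2: a t^2 + b t + c = 0. Since c < 0, there is
    # exactly one positive root, where separation is regained.
    b = 2 * (px * vx + py * vy)
    t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
    return t if t <= T else None

# 3 NM short of the required 5 NM, diverging at 0.1 NM/s along track:
t = time_to_regain_separation(rel_pos=(2.0, 0.0), rel_vel=(0.1, 0.0), D=5.0, T=60.0)
print(f"separation regained at t = {t:.1f} s")  # regained at t = 30.0 s
```

A maneuver would be "correct" in the sense of the abstract if every state it can produce yields a finite recovery time bounded by T, independently of which aircraft executes it.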
Multidisciplinary Environments: A History of Engineering Framework Development
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Gillian, Ronnie E.
2006-01-01
This paper traces the history of engineering frameworks and their use by Multidisciplinary Design Optimization (MDO) practitioners. The approach is to reference papers that have been presented at one of the ten previous Multidisciplinary Analysis and Optimization (MA&O) conferences. By limiting the search to MA&O papers, the authors can (1) identify the key ideas that led to general purpose MDO frameworks and (2) uncover roadblocks that delayed the development of these ideas. The authors make no attempt to assign credit for revolutionary ideas or to assign blame for missed opportunities. Rather, the goal is to trace the various threads of computer architecture and software framework research and to observe how these threads contributed to the commercial framework products available today.
Structural Analysis in a Conceptual Design Framework
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.
2012-01-01
Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.
Götschi, Thomas; de Nazelle, Audrey; Brand, Christian; Gerike, Regine
2017-09-01
This paper reviews the use of conceptual frameworks in research on active travel, such as walking and cycling. Generic framework features and a wide range of contents are identified and synthesized into a comprehensive framework of active travel behavior, as part of the Physical Activity through Sustainable Transport Approaches project (PASTA). PASTA is a European multinational, interdisciplinary research project on active travel and health. Along with an exponential growth in active travel research, a growing number of conceptual frameworks has been published since the early 2000s. Earlier frameworks are simpler and emphasize the distinction of environmental vs. individual factors, while more recently several studies have integrated travel behavior theories more thoroughly. Based on the reviewed frameworks and various behavioral theories, we propose the comprehensive PASTA conceptual framework of active travel behavior. We discuss how it can guide future research, such as data collection, data analysis, and modeling of active travel behavior, and present some examples from the PASTA project.
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 1. Theory
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Computer simulation models are continually growing in complexity with increasingly more factors to be identified. Sensitivity Analysis (SA) provides an essential means for understanding the role and importance of these factors in producing model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to "variogram analysis," that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. Synthetic functions that resemble actual model response surfaces are used to illustrate the concepts, and show VARS to be as much as two orders of magnitude more computationally efficient than the state-of-the-art Sobol approach. In a companion paper, we propose a practical implementation strategy, and demonstrate the effectiveness, efficiency, and reliability (robustness) of the VARS framework on real-data case studies.
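The variogram analogy at the heart of VARS can be illustrated with a toy directional variogram estimator. The response surface and sampling scheme below are illustrative assumptions, not the VARS implementation:

```python
import random

random.seed(0)

def directional_variogram(f, dim, factor, h, n=5000, lo=0.0, hi=1.0):
    """Estimate gamma_i(h) = 0.5 * E[(f(x + h * e_i) - f(x))^2] by sampling
    base points x uniformly, keeping the perturbed factor inside [lo, hi]."""
    total = 0.0
    for _ in range(n):
        x = [random.uniform(lo, hi) for _ in range(dim)]
        x[factor] = random.uniform(lo, hi - h)   # leave room for the +h step
        y = list(x)
        y[factor] += h
        total += (f(y) - f(x)) ** 2
    return 0.5 * total / n

# Toy response surface in which factor 0 matters much more than factor 1.
f = lambda x: 10.0 * x[0] + 1.0 * x[1]
g0 = directional_variogram(f, dim=2, factor=0, h=0.1)
g1 = directional_variogram(f, dim=2, factor=1, h=0.1)
print(g0, g1)  # gamma_0 >> gamma_1: factor 0 is far more influential
```

Evaluating gamma_i over a range of perturbation scales h, rather than at a single scale, is what gives the variogram view its multi-scale characterization of sensitivity; derivative- and variance-based indices correspond to particular limits of that curve.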
Transport induced by mean-eddy interaction: II. Analysis of transport processes
NASA Astrophysics Data System (ADS)
Ide, Kayo; Wiggins, Stephen
2015-03-01
We present a framework for the analysis of transport processes resulting from the mean-eddy interaction in a flow. The framework is based on the Transport Induced by the Mean-Eddy Interaction (TIME) method presented in a companion paper (Ide and Wiggins, 2014) [1]. The TIME method estimates the (Lagrangian) transport across stationary (Eulerian) boundaries defined by chosen streamlines of the mean flow. Our framework proceeds after first carrying out a sequence of preparatory steps that link the flow dynamics to the transport processes. This includes the construction of the so-called "instantaneous flux" as the Hovmöller diagram. Transport processes are studied by linking the signals of the instantaneous flux field to the dynamical variability of the flow. This linkage also reveals how the variability of the flow contributes to the transport. The spatio-temporal analysis of the flux diagram can be used to assess the efficiency of the variability in transport processes. We apply the method to the double-gyre ocean circulation model in the situation where the Rossby-wave mode dominates the dynamic variability. The spatio-temporal analysis shows that the inter-gyre transport is controlled by the circulating eddy vortices in the fast eastward jet region, whereas the basin-scale Rossby waves have very little impact.
Tsiknakis, Manolis; Kouroubali, Angelina
2009-01-01
The paper presents an application of the "Fit between Individuals, Task and Technology" (FITT) framework to analyze the socio-organizational-technical factors that influence IT adoption in the healthcare domain. The FITT framework was employed as the theoretical instrument for a retrospective analysis of a 15-year effort in implementing IT systems and eHealth services in the context of a Regional Health Information Network in Crete. Quantitative and qualitative research methods, interviews and participant observations were employed to gather data from a case study that involved the entire region of Crete. The detailed analysis of the case study based on the FITT framework showed common features, but also differences in IT adoption across the various health organizations. The emerging picture is a complex nexus of factors contributing to IT adoption, and multi-level interventional strategies to promote IT use. The work presented in this paper shows the applicability of the FITT framework in explaining the complexity of aspects observed in the implementation of healthcare information systems. The reported experiences reveal that fit management can be viewed as a system with a feedback loop that is never really stable, but ever changing based on external factors or deliberate interventions. Management of fit, therefore, becomes a constant and complex task for the whole life cycle of IT systems.
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Using the 'protective environment' framework to analyse children's protection needs in Darfur.
Ager, Alastair; Boothby, Neil; Bremer, Megan
2009-10-01
A major humanitarian concern during the continuing crisis in Darfur, Sudan, has been the protection of children, although there has been little in the way of comprehensive analysis to guide intervention. Founded on a situational analysis conducted between October 2005 and March 2006, this paper documents the significant threats to children's well-being directly linked to the political conflict. It demonstrates the role of non-conflict factors in exacerbating these dangers and in promoting additional protection violations, and it uses the 'protective environment' framework (UNICEF Sudan, 2006a) to identify systematic features of the current environment that put children at risk. This framework is shown to provide a coherent basis for assessment and planning, prompting broad, multidisciplinary analysis, concentrating on preventive and protective action, and fostering a systemic approach (rather than placing an undue focus on the discrete needs of 'vulnerable groups'). Constraints on its present utility in emergency settings are also noted.
XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework
NASA Astrophysics Data System (ADS)
Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò
2017-08-01
We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems but also for developing and testing end-to-end analysis chains.
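The modulation factor mentioned in this abstract enters the standard polarimetric modulation curve, f(φ) ∝ 1 + μp·cos 2(φ − φ₀). The sketch below simulates photo-electron azimuthal angles from that curve and recovers the polarization degree with an unbinned Stokes-style estimator. This is generic textbook polarimetry, not XIMPOL's actual API; all names and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_azimuths(n, mu, p, phi0=0.0):
    """Draw photo-electron azimuths from the modulation curve
    f(phi) ~ 1 + mu * p * cos(2 * (phi - phi0)) by rejection sampling."""
    out = []
    fmax = 1.0 + mu * p
    while len(out) < n:
        phi = rng.uniform(0.0, np.pi, n)
        keep = rng.uniform(0.0, fmax, n) < 1.0 + mu * p * np.cos(2 * (phi - phi0))
        out.extend(phi[keep])
    return np.array(out[:n])

def estimate_polarization(phi, mu):
    """Unbinned Stokes-style estimate: since E[cos 2*phi] = mu*p/2,
    p_hat = sqrt(q^2 + u^2) / mu with q = 2<cos 2*phi>, u = 2<sin 2*phi>."""
    q = 2.0 * np.mean(np.cos(2 * phi))
    u = 2.0 * np.mean(np.sin(2 * phi))
    return np.hypot(q, u) / mu

phi = sample_azimuths(200_000, mu=0.3, p=0.5)
p_hat = estimate_polarization(phi, mu=0.3)   # close to the true p = 0.5
```

A real pipeline would fold in the energy-dependent effective area and energy dispersion; here only the modulation response is modelled.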
Steps toward improving ethical evaluation in health technology assessment: a proposed framework.
Assasi, Nazila; Tarride, Jean-Eric; O'Reilly, Daria; Schwartz, Lisa
2016-06-06
While evaluation of ethical aspects in health technology assessment (HTA) has gained much attention during the past years, the integration of ethics in HTA practice still presents many challenges. In response to the increasing demand for expansion of health technology assessment (HTA) methodology to include ethical issues more systematically, this article reports on a multi-stage study that aimed at construction of a framework for improving the integration of ethics in HTA. The framework was developed through the following phases: 1) a systematic review and content analysis of guidance documents for ethics in HTA; 2) identification of factors influencing the integration of ethical considerations in HTA; 3) preparation of an action-oriented framework based on the key elements of the existing guidance documents and identified barriers to and facilitators of their implementation; and 4) expert consultation and revision of the framework. The proposed framework consists of three main components: an algorithmic flowchart, which exhibits the different steps of an ethical inquiry throughout the HTA process, including: defining the objectives and scope of the evaluation, stakeholder analysis, assessing organizational capacity, framing ethical evaluation questions, ethical analysis, deliberation, and knowledge translation; a stepwise guide, which focuses on the task objectives and potential questions that are required to be addressed at each step; and a list of some commonly recommended or used tools to help facilitate the evaluation process. The proposed framework can be used to support and promote good practice in integration of ethics into HTA. However, further validation of the framework through case studies and expert consultation is required to establish its utility for HTA practice.
ERIC Educational Resources Information Center
Bondy, A.; Tincani, M.; Frost, L.
2004-01-01
This paper presents Skinner's (1957) analysis of verbal behavior as a framework for understanding language acquisition in children with autism. We describe Skinner's analysis of pure and impure verbal operants and illustrate how this analysis may be applied to the design of communication training programs. The picture exchange communication system…
A Simulation-As-A-Service Framework Facilitating WebGIS-Based Installation Planning
NASA Astrophysics Data System (ADS)
Zheng, Z.; Chang, Z. Y.; Fei, Y. F.
2017-09-01
Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for the proper spatial deployment and functional configuration of facilities, making them a cohesive and supportive system that meets users' operational needs. Based on a requirement analysis, we propose a framework that combines GIS and agent-based simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server, exposing basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand operation, shared understanding, and boosted performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.
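The three service functions the abstract names — scenario configuration, simulation control, and data retrieval — can be sketched as a minimal server-side wrapper around an agent model. Everything below is a conceptual sketch with invented names and a placeholder movement rule; the real system exposes these calls over HTTP to a WebGIS client and delegates spatial inference to geoprocessing services.

```python
class SimulationService:
    """Toy 'simulation as a service' wrapper: one method per service call."""

    def __init__(self):
        self.agents, self.tick = [], 0

    def configure_scenario(self, agent_positions):
        """Scenario configuration: place agents (e.g. facility units)."""
        self.agents = [list(p) for p in agent_positions]
        self.tick = 0

    def step(self, n=1):
        """Simulation control: advance the model n ticks. A real agent's
        process logic would call geoprocessing services here."""
        for _ in range(n):
            for a in self.agents:
                a[0] += 1.0          # placeholder movement rule
            self.tick += 1

    def retrieve(self):
        """Simulation data retrieval for the WebGIS front end."""
        return {"tick": self.tick, "agents": [tuple(a) for a in self.agents]}

svc = SimulationService()
svc.configure_scenario([(0.0, 0.0), (5.0, 5.0)])
svc.step(3)
state = svc.retrieve()
```

Splitting the interface along these three calls is what lets the simulation run on-demand while the client stays a thin map view.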
Using Decision Analysis to Improve Malaria Control Policy Making
Kramer, Randall; Dickinson, Katherine L.; Anderson, Richard M.; Fowler, Vance G.; Miranda, Marie Lynn; Mutero, Clifford M.; Saterson, Kathryn A.; Wiener, Jonathan B.
2013-01-01
Malaria and other vector-borne diseases represent a significant and growing burden in many tropical countries. Successfully addressing these threats will require policies that expand access to and use of existing control methods, such as insecticide-treated bed nets and artemisinin combination therapies for malaria, while weighing the costs and benefits of alternative approaches over time. This paper argues that decision analysis provides a valuable framework for formulating such policies and combating the emergence and re-emergence of malaria and other diseases. We outline five challenges that policy makers and practitioners face in the struggle against malaria, and demonstrate how decision analysis can help to address and overcome these challenges. A prototype decision analysis framework for malaria control in Tanzania is presented, highlighting the key components that a decision support tool should include. Developing and applying such a framework can promote stronger and more effective linkages between research and policy, ultimately helping to reduce the burden of malaria and other vector-borne diseases. PMID:19356821
Data Analysis with Graphical Models: Software Tools
NASA Technical Reports Server (NTRS)
Buntine, Wray L.
1994-01-01
Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models introduced by Spiegelhalter and Gilks, called plates, which graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, it also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
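The "exact Bayes factors" mentioned for the exponential family have closed forms when the prior is conjugate. As a small self-contained illustration (not taken from the paper), the sketch below computes the exact Bayes factor for Bernoulli data, comparing a point null θ = θ₀ against a Beta(a, b) prior on θ; the binomial coefficient cancels between the two marginal likelihoods.

```python
from math import lgamma, exp, log

def log_beta(a, b):
    """log of the Beta function B(a, b) = Gamma(a)Gamma(b)/Gamma(a+b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def bayes_factor_binomial(k, n, a=1.0, b=1.0, theta0=0.5):
    """Exact Bayes factor for H1: theta ~ Beta(a, b) vs H0: theta = theta0,
    given k successes in n Bernoulli trials (conjugate beta-binomial case)."""
    log_m1 = log_beta(a + k, b + n - k) - log_beta(a, b)    # marginal under H1
    log_m0 = k * log(theta0) + (n - k) * log(1.0 - theta0)  # likelihood under H0
    return exp(log_m1 - log_m0)

# 7 successes in 10 trials, uniform prior under H1:
bf = bayes_factor_binomial(k=7, n=10)   # = (1/1320) / (1/1024), slightly < 1
```

Working in log space via `lgamma` keeps the computation stable for large n, where the Beta function under- or overflows.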
Duncan, Susan M; Thorne, Sally; Van Neste-Kenny, Jocelyne; Tate, Betty
2012-05-01
Academic nursing leaders play a crucial role in the policy context for nursing education. Effectiveness in this role requires that they work together in presenting nursing education issues from a position of strength, informed by a critical analysis of policy pertaining to the delivery of quality nursing education and scholarship. We describe a collective process of dialog and critical analysis whereby nurse leaders in one Canadian province addressed pressing policy issues facing governments, nursing programs, faculty, and students. Consensus among academic nurse leaders, formalized through the development of a policy action framework, has enabled us to take a stand, at times highly contested, in the politicized arena of the nursing shortage. We present the components of a policy action framework for nursing education and share examples of how we have used a critical approach to analyze and frame policy issues in nursing education for inclusion on policy agendas. Our belief that this work has influenced provincial and national thinking about policy in nursing education underpins our conclusion that political presence and shared strategy among academic nursing leaders is undeniably critical in the global context of nursing today. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.
2015-01-01
This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.
A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.
Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew
2016-01-01
While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described only in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
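The link the abstract draws between wNCB and NNT can be made concrete: for a binary outcome, NNT is the reciprocal of the absolute risk difference, and a weighted net clinical benefit combines the risk differences of several binary outcomes with clinical-importance weights. The sketch below is a deliberately simplified version of that idea with invented counts and weights; it is not the paper's exact wNCB formulation and not the natalizumab data.

```python
def risk_difference(events_trt, n_trt, events_ctl, n_ctl):
    """Absolute risk difference (treatment minus control) for a binary outcome."""
    return events_trt / n_trt - events_ctl / n_ctl

def weighted_ncb(outcomes):
    """Simplified weighted net clinical benefit over harmful binary outcomes.

    outcomes: list of (weight, risk_difference). For harmful events a
    negative risk difference favours treatment, so the negated weighted
    sum is positive when treatment is favoured overall."""
    return -sum(w * rd for w, rd in outcomes)

# Hypothetical two-outcome trial: relapse (benefit) vs serious adverse event (risk).
rd_relapse = risk_difference(events_trt=110, n_trt=600, events_ctl=200, n_ctl=600)
rd_sae = risk_difference(events_trt=9, n_trt=600, events_ctl=3, n_ctl=600)
wncb = weighted_ncb([(1.0, rd_relapse), (2.0, rd_sae)])   # > 0: net benefit
nnt_relapse = 1.0 / abs(rd_relapse)                       # patients per relapse averted
```

With these invented numbers the treatment averts relapses (risk difference −0.15, NNT ≈ 6.7) at the cost of a small excess of serious adverse events (+0.01), and the weight of 2 on the adverse event encodes its greater clinical importance.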
Pörzse, Gábor
2009-08-09
Research and development (R&D) has played a leading role in the European Community's history since the very beginning of European integration, and its importance has grown in recent years following the launch of the Lisbon strategy. Framework programs have always played a considerable part in Community research; the aim of their introduction was to fine-tune national R&D activities and to divide research tasks effectively between the Community and the member states. The Community has, from the very outset, acknowledged the importance of the life sciences; it is no coincidence that the life sciences became the second biggest priority in the last two framework programs. This study provides a historical, analytical and evaluative review of Community R&D policy and activity from its starting point to the present day. It examines in detail how the changes in the structure, conditions, regulations and priorities of the framework programs have followed the evolution of social and economic needs. The paper puts special emphasis on the development of life science research, showing how it has met the challenges of the age and how it has been built into the framework programs. A further aim of the present study is to examine how successfully Hungarian researchers have joined Community research, especially the framework programs in the field of life sciences. To answer these questions, it was essential to survey, process and analyze the data available in national and European public and closed databases. In contrast to previous documents, this analysis does not concentrate on the political and scientific background; it outlines the role Community research has played in sustainable social and economic development and competitiveness, how it has supported common policies and how it has deepened the processes of integration.
In addition, the present paper offers a complete review of the given field, from its foundation up to the present day, elaborating the newest initiatives and ideas for the future. This work is also novel in its treatment of the professional field itself, the life sciences within the framework programs, and in its processing and evaluation of data on Hungarian participation in the 5th and 6th framework programs in the field of life sciences.
ERIC Educational Resources Information Center
Erduran, Sibel
Eight physical science textbooks were analyzed for coverage on acids, bases, and neutralization. At the level of the text, clarity and coherence of statements were investigated. The conceptual framework for this topic was represented in a concept map which was used as a coding tool for tracing concepts and links present in textbooks. Cognitive…
NASA Astrophysics Data System (ADS)
El-Gafy, Mohamed Anwar
Transportation projects have an impact on the environment. The general environmental pollution and damage caused by roads is closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) depend on geo-spatial information, there are no rules per se for how to conduct an environmental assessment; moreover, the particular objective of each assessment is dictated case by case, based on what information and analyses are required. The conventional EIA study is a time-consuming process because it involves a large number of dependent and independent variables, each with different consequences, that must be taken into account. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the EIA for transportation projects, based on the integration of remote sensing technology, geographic information systems, and spatial modeling. By combining the merits of the map overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around a road and the road's impact on the environment. This framework is expected to: (1) improve the quality of the decision-making process; (2) apply both to urban and inter-urban projects, regardless of transport mode; and (3) present the data and the appropriate analyses to support decision-makers and allow them to present these data at public hearings in a simple manner. Case studies of transportation projects in the State of Florida were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities.
This cohesive and integrated system will facilitate rational decisions through cost effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
NASA Astrophysics Data System (ADS)
Poya, Roman; Gil, Antonio J.; Ortigosa, Rogelio
2017-07-01
The paper presents aspects of implementation of a new high performance tensor contraction framework for the numerical analysis of coupled and multi-physics problems on streaming architectures. In addition to explicit SIMD instructions and smart expression templates, the framework introduces domain specific constructs for the tensor cross product and its associated algebra recently rediscovered by Bonet et al. (2015, 2016) in the context of solid mechanics. The two key ingredients of the presented expression template engine are, first, the capability to mathematically transform complex chains of operations into simpler equivalent expressions, potentially avoiding routes with higher computational complexity, and, second, the capability to perform a compile-time depth-first or breadth-first search to find the optimal contraction indices of a large tensor network in order to minimise the number of floating point operations. For optimisations of tensor contraction such as loop transformation, loop fusion and data locality optimisations, the framework relies heavily on compile time technologies rather than source-to-source translation or JIT techniques. Every aspect of the framework is examined through relevant performance benchmarks, including the impact of data parallelism on the performance of isomorphic and nonisomorphic tensor products, the FLOP and memory I/O optimality in the evaluation of tensor networks, the compilation cost and memory footprint of the framework and the performance of tensor cross product kernels. The framework is then applied to finite element analysis of coupled electro-mechanical problems to assess the speed-ups achieved in kernel-based numerical integration of complex electroelastic energy functionals. In this context, domain-aware expression templates combined with SIMD instructions are shown to provide a significant speed-up over the classical low-level style programming techniques.
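The compile-time contraction-order search described here has a well-known runtime analogue in NumPy: `einsum_path` searches for the cheapest pairwise contraction sequence of a tensor network before any arithmetic is done. The sketch below (not the paper's C++ framework) shows the same principle on a three-matrix chain, where the contraction order determines the FLOP count.

```python
import numpy as np

# Three-tensor network: the pairwise contraction order changes the FLOP
# count, which is exactly what a compile-time search optimises.
A = np.random.rand(8, 30)
B = np.random.rand(30, 40)
C = np.random.rand(40, 6)

# Search for the cheapest contraction sequence up front...
path, info = np.einsum_path('ij,jk,kl->il', A, B, C, optimize='optimal')

# ...then evaluate the network along that precomputed path.
result = np.einsum('ij,jk,kl->il', A, B, C, optimize=path)

# Same answer as naive chained matrix products, but with an explicit cost model.
assert np.allclose(result, A @ B @ C)
```

Printing `info` shows the estimated FLOP counts of the naive and optimized orders; for long chains or high-order networks the gap grows combinatorially, which is why the paper's framework performs this search at compile time.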
Putting Public Health Ethics into Practice: A Systematic Framework
Marckmann, Georg; Schmidt, Harald; Sofaer, Neema; Strech, Daniel
2015-01-01
It is widely acknowledged that public health practice raises ethical issues that require a different approach than traditional biomedical ethics. Several frameworks for public health ethics (PHE) have been proposed; however, none of them provides a practice-oriented combination of the two necessary components: (1) a set of normative criteria based on an explicit ethical justification and (2) a structured methodological approach for applying the resulting normative criteria to concrete public health (PH) issues. Building on prior work in the field and integrating valuable elements of other approaches to PHE, we present a systematic ethical framework that shall guide professionals in planning, conducting, and evaluating PH interventions. Based on a coherentist model of ethical justification, the proposed framework contains (1) an explicit normative foundation with five substantive criteria and seven procedural conditions to guarantee a fair decision process, and (2) a six-step methodological approach for applying the criteria and conditions to the practice of PH and health policy. The framework explicitly ties together ethical analysis and empirical evidence, thus striving for evidence-based PHE. It can provide normative guidance to those who analyze the ethical implications of PH practice including academic ethicists, health policy makers, health technology assessment bodies, and PH professionals. It will enable those who implement a PH intervention and those affected by it (i.e., the target population) to critically assess whether and how the required ethical considerations have been taken into account. Thereby, the framework can contribute to assuring the quality of ethical analysis in PH. Whether the presented framework will be able to achieve its goals has to be determined by evaluating its practical application. PMID:25705615
Analysis of Public Policies for Sexuality Education in Germany and The Netherlands
ERIC Educational Resources Information Center
Aronowitz, Teri; Fawcett, Jacqueline
2015-01-01
The purpose of this article is to present an analysis of the philosophical, historical, sociological, political, and economic perspectives reflected in the public policies about lifespan sexuality education of Germany and The Netherlands. A new conceptual framework for analysis and evaluation of sexuality education policies that integrates the…
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Andrijcic, Eva; Horowitz, Barry
2006-08-01
The article is based on the premise that, from a macro-economic viewpoint, cyber attacks with long-lasting effects are the most economically significant, and as a result require more attention than attacks with short-lasting effects, which have historically been better represented in the literature. In particular, the article deals with evaluation of cyber security risks related to one type of attack with long-lasting effects, namely, theft of intellectual property (IP) by foreign perpetrators. An International Consequence Analysis Framework is presented to determine (1) the potential macro-economic consequences of cyber attacks that result in stolen IP from companies in the United States, and (2) the likely sources of such attacks. The framework presented focuses on IP theft that enables foreign companies to make economic gains that would have otherwise benefited the U.S. economy. Initial results are presented.
Christin, Zachary; Bagstad, Kenneth J.; Verdone, Michael
2016-01-01
Restoring degraded forests and agricultural lands has become a global conservation priority. A growing number of tools can quantify ecosystem service tradeoffs associated with forest restoration. This evolving “tools landscape” presents a dilemma: more tools are available, but selecting appropriate tools has become more challenging. We present a Restoration Ecosystem Service Tool Selector (RESTS) framework that describes key characteristics of 13 ecosystem service assessment tools. Analysts enter information about their decision context, services to be analyzed, and desired outputs. Tools are filtered and presented based on five evaluative criteria: scalability, cost, time requirements, handling of uncertainty, and applicability to benefit-cost analysis. RESTS uses a spreadsheet interface but a web-based interface is planned. Given the rapid evolution of ecosystem services science, RESTS provides an adaptable framework to guide forest restoration decision makers toward tools that can help quantify ecosystem services in support of restoration.
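The filtering step the abstract describes — entering a decision context and receiving the subset of tools that satisfy it — reduces to matching tool attributes against user criteria. The sketch below is a hypothetical miniature of that mechanism; the tool names, attributes, and ratings are invented, not RESTS data.

```python
# Hypothetical subset of the RESTS idea: filter candidate tools by
# user-specified evaluative criteria (all entries invented for illustration).
TOOLS = [
    {"name": "ToolA", "cost": "free", "scalability": "site",   "bca_ready": True},
    {"name": "ToolB", "cost": "paid", "scalability": "global", "bca_ready": True},
    {"name": "ToolC", "cost": "free", "scalability": "global", "bca_ready": False},
]

def select_tools(tools, **criteria):
    """Keep tools whose attributes match every requested criterion."""
    return [t["name"] for t in tools
            if all(t.get(k) == v for k, v in criteria.items())]

# An analyst who needs a free tool applicable to benefit-cost analysis:
matches = select_tools(TOOLS, cost="free", bca_ready=True)
```

A spreadsheet implements the same logic with filter formulas; the point is that the selector is a transparent conjunction of criteria rather than an opaque ranking.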
A Profile-Based Framework for Factorial Similarity and the Congruence Coefficient.
Hartley, Anselma G; Furr, R Michael
2017-01-01
We present a novel profile-based framework for understanding factorial similarity in the context of exploratory factor analysis in general, and for understanding the congruence coefficient (a commonly used index of factor similarity) specifically. First, we introduce the profile-based framework, articulating factorial similarity in terms of 3 intuitive components: general saturation similarity, differential saturation similarity, and configural similarity. We then articulate the congruence coefficient in terms of these components, along with 2 additional profile-based components, and we explain how these components resolve ambiguities that can be, and are, found when using the congruence coefficient. Finally, we present secondary analyses revealing that profile-based components of factorial similarity are indeed linked to experts' actual evaluations of factorial similarity. Overall, the profile-based approach offers new insights into the ways in which researchers can examine factor similarity and holds the potential to enhance researchers' ability to understand the congruence coefficient.
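The congruence coefficient itself has a simple closed form (Tucker's φ), and its sensitivity to general saturation — unlike a mean-centred correlation — is easy to demonstrate numerically. The loading values below are invented for illustration.

```python
import numpy as np

def congruence(x, y):
    """Tucker's congruence coefficient between two factor loading vectors:
    phi = sum(x*y) / sqrt(sum(x^2) * sum(y^2)).  Unlike a Pearson
    correlation it is not mean-centred, so it reflects general saturation
    as well as the pattern of loadings."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.sum(x * y) / np.sqrt(np.sum(x**2) * np.sum(y**2))

a = np.array([0.7, 0.6, 0.5, 0.4])
phi_same = congruence(a, a)          # identical profiles: phi = 1
phi_shift = congruence(a, a + 0.3)   # uniform shift: phi < 1, yet the
                                     # Pearson correlation is still exactly 1
```

The shifted pair is the kind of ambiguity the profile-based components are meant to resolve: the two profiles have identical configural and differential structure but different general saturation, and φ blends those facts into a single number.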
Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics
NASA Astrophysics Data System (ADS)
Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph
2011-11-01
Hydrocephalus is among the most common birth defects and at present can be neither prevented nor cured. Afflicted individuals face serious issues, which are currently too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical Phase Contrast MRI data. Specifically, integral control volume analysis utilizes a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted, hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and analyzed using the methods presented here. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the physiological and diagnostic research and development implications and opportunities will be presented.
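The integral control volume balance underlying this kind of analysis is F = d/dt ∫ρu dV + ∮ρu(u·n) dA: the net force on the fluid equals the rate of change of momentum inside the volume plus the net momentum flux through its surface. The sketch below applies that balance to a synthetic pulsatile waveform in a uniform tube (where the two flux terms cancel, leaving the unsteady term); the waveform, dimensions, and function name are illustrative, not the patient data.

```python
import numpy as np

def control_volume_force(u, area, length, rho, dt):
    """Net axial force on fluid in a uniform cylindrical control volume from
    a time series of cross-section-averaged velocity u(t):
    F = d/dt (rho * u * A * L) + (flux out - flux in); for a uniform tube
    the inflow and outflow momentum fluxes cancel, leaving only dM/dt."""
    momentum = rho * u * area * length       # integral of rho*u over the CV
    return np.gradient(momentum, dt)         # finite-difference d/dt

# Hypothetical cardiac-like velocity waveform in an aqueduct-sized tube.
dt = 0.01
t = np.arange(0.0, 1.0, dt)                  # one 1 s cycle
u = 0.02 * np.sin(2 * np.pi * t)             # m/s, peak 2 cm/s
F = control_volume_force(u, area=3e-6, length=0.01, rho=1000.0, dt=dt)
```

The peak force here is of order ρAL·(du/dt)ₘₐₓ ≈ 4 μN, illustrating how small the pressure forces extracted from such velocity data are and why a direct, integral formulation helps.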
Khadam, Ibrahim; Kaluarachchi, Jagath J
2003-07-01
Decision analysis in subsurface contamination management is generally carried out from a traditional engineering economic viewpoint. However, new advances in human health risk assessment, namely probabilistic risk assessment, and the growing awareness of the importance of soft data in the decision-making process, require decision analysis methodologies that are capable of accommodating non-technical and politically biased qualitative information. In this work, we discuss the major limitations of the currently practiced decision analysis framework, which revolves around the definition of risk and the cost of risk, and its poor ability to communicate risk-related information. A demonstration using a numerical example was conducted to provide insight into these limitations of the current decision analysis framework. The results from this simple ground water contamination and remediation scenario were identical to those obtained from studies carried out on existing Superfund sites, which suggests serious flaws in the current risk management framework. In order to provide a perspective on how these limitations may be avoided in future formulations of the management framework, more mature and well-accepted approaches to decision analysis in dam safety and the utility industry, where public health and public investment are of great concern, are presented and their applicability in subsurface remediation management is discussed. Finally, in light of the success of the application of risk-based decision analysis in dam safety and the utility industry, potential options for decision analysis in subsurface contamination management are discussed.
A framework for characterizing eHealth literacy demands and barriers.
Chan, Connie V; Kaufman, David R
2011-11-17
Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. 
Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.
A Framework for Assessment of Aviation Safety Technology Portfolios
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.
2014-01-01
The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.
Imperial College near infrared spectroscopy neuroimaging analysis framework.
Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong
2018-01-01
This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
NASA Astrophysics Data System (ADS)
Chalabi, Zaid; Milojevic, Ai; Doherty, Ruth M.; Stevenson, David S.; MacKenzie, Ian A.; Milner, James; Vieno, Massimo; Williams, Martin; Wilkinson, Paul
2017-10-01
A decision support system for evaluating UK air quality policies is presented. It combines the output from a chemistry transport model, a health impact model and other impact models within a multi-criteria decision analysis (MCDA) framework. As a proof-of-concept, the MCDA framework is used to evaluate and compare idealized emission reduction policies in four sectors (combustion in energy and transformation industries, non-industrial combustion plants, road transport and agriculture) and across six outcomes or criteria (mortality, health inequality, greenhouse gas emissions, biodiversity, crop yield and air quality legal compliance). To illustrate a realistic use of the MCDA framework, the relative importance of the criteria was elicited from a number of stakeholders acting as proxy policy makers. In the prototype decision problem, we show that reducing emissions from industrial combustion (followed very closely by road transport and agriculture) is more advantageous than equivalent reductions from the other sectors when all the criteria are taken into account. Extensions of the MCDA framework to support policy makers in practice are discussed.
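A common MCDA aggregation of the kind described is a weighted sum: each policy gets a normalised score per criterion, and elicited weights combine them into one ranking value. The sketch below illustrates this with made-up weights and scores; the paper's elicited stakeholder weights are not reproduced here.

```python
# Weighted-sum MCDA over emission-reduction policies.
# Criterion weights and sector scores are hypothetical illustrations.

criteria_weights = {"mortality": 0.3, "ghg": 0.25, "biodiversity": 0.15,
                    "crop_yield": 0.1, "inequality": 0.1, "compliance": 0.1}

# Normalised performance scores (0 = worst, 1 = best) for each sector.
scores = {
    "industrial_combustion": {"mortality": 0.9, "ghg": 0.8, "biodiversity": 0.7,
                              "crop_yield": 0.6, "inequality": 0.5, "compliance": 0.8},
    "road_transport":        {"mortality": 0.8, "ghg": 0.7, "biodiversity": 0.6,
                              "crop_yield": 0.7, "inequality": 0.7, "compliance": 0.7},
}

def mcda_score(sector):
    """Aggregate a sector's criterion scores with the elicited weights."""
    return sum(criteria_weights[c] * scores[sector][c] for c in criteria_weights)

ranking = sorted(scores, key=mcda_score, reverse=True)
```

With these illustrative numbers the industrial-combustion policy ranks first, mirroring the qualitative conclusion of the prototype decision problem.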
Knowledge Discovery from Vibration Measurements
Li, Jian; Wang, Daoyao
2014-01-01
The framework, as well as the particular algorithms, of the pattern recognition process is widely adopted in structural health monitoring (SHM). However, as part of the overall process of knowledge discovery from databases (KDD), the results of pattern recognition are only changes and patterns of changes in data features. In this paper, based on the similarity between KDD and SHM and considering the particularities of SHM problems, a four-step framework of SHM is proposed which extends the final goal of SHM from detecting damage to extracting knowledge to facilitate decision making. The purposes and appropriate methods of each step of this framework are discussed. To demonstrate the proposed SHM framework, a specific SHM method composed of second-order structural parameter identification, statistical control chart analysis, and system reliability analysis is then presented. To examine the performance of this SHM method, real sensor data measured from a lab-size steel bridge model structure are used. The developed four-step framework of SHM has the potential to clarify the process of SHM and facilitate the further development of SHM techniques. PMID:24574933
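The statistical control chart step named in the method can be sketched with a standard Shewhart chart: baseline samples of an identified structural parameter set control limits at the mean plus or minus three standard deviations, and later samples falling outside are flagged. The numbers below are synthetic, not the paper's bridge data.

```python
# Shewhart control chart over identified structural parameters:
# baseline samples fix the limits; monitoring samples outside
# mean +/- 3*sigma are flagged as potential damage indicators.
from statistics import mean, stdev

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]   # healthy state
mu, sigma = mean(baseline), stdev(baseline)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma                  # control limits

monitoring = [10.0, 10.1, 9.7, 11.2]   # last value simulates a change
alarms = [x for x in monitoring if not (lcl <= x <= ucl)]
```

An alarm here is only a detected change; in the four-step framework it would still need the reliability-analysis step to become decision-ready knowledge.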
A computational framework for prime implicants identification in noncoherent dynamic systems.
Di Maio, Francesco; Baronchelli, Samuele; Zio, Enrico
2015-01-01
Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. For this, the system dynamics is here described by a time-dependent model that includes the dependencies with the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are i) accounting for discrete stochastic transition events and ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple-valued logic (MVL) to consider stochastic transitions at discretized times. Then, PIs are originally identified by a differential evolution (DE) algorithm that looks for the optimal MVL solution of a covering problem formulated for MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases. © 2014 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pullum, Laura L; Symons, Christopher T
2011-01-01
Machine learning is used in many applications, from machine vision to speech recognition to decision support systems, and is used to test applications. However, though much has been done to evaluate the performance of machine learning algorithms, little has been done to verify the algorithms or examine their failure modes. Moreover, complex learning frameworks often require stepping beyond black-box evaluation to distinguish between errors based on natural limits on learning and errors that arise from mistakes in implementation. We present a conceptual architecture, failure model and taxonomy, and failure modes and effects analysis (FMEA) of a semi-supervised, multi-modal learning system, and provide specific examples from its use in a radiological analysis assistant system. The goal of the research described in this paper is to provide a foundation from which dependability analysis of systems using semi-supervised, multi-modal learning can be conducted. The methods presented provide a first step towards that overall goal.
Morrison, P; Burnard, P
1989-04-01
The theoretical framework known as Six Category Intervention Analysis is described. This framework has been used in the teaching of interpersonal skills in various settings but there appears to be little or no empirical work to test out the theory. In the present study, an instrument was devised for assessing student nurses' perceptions of their interpersonal skills based on the category analysis. The findings of the study are presented and a quantitative comparison is made with the results of an earlier study of trained nurses' perceptions. Marked similarities were noted between the two sets of findings. The key trend to emerge was that both groups of nurses tended to perceive themselves as being more authoritative and less facilitative in their interpersonal relationships, in terms of the category analysis. This trend and others are discussed and suggestions made for future directions in research and training in the field of interpersonal skills in nursing. Implications for the theory of six category intervention analysis are also discussed.
Microgravity isolation system design: A modern control analysis framework
NASA Technical Reports Server (NTRS)
Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.
1994-01-01
Many acceleration-sensitive, microgravity science experiments will require active vibration isolation from the manned orbiters on which they will be mounted. The isolation problem, especially in the case of a tethered payload, is a complex three-dimensional one that is best suited to modern-control design methods. These methods, although more powerful than their classical counterparts, can nonetheless go only so far in meeting the design requirements for practical systems. Once a tentative controller design is available, it must still be evaluated to determine whether or not it is fully acceptable, and to compare it with other possible design candidates. Realistically, such evaluation will be an inherent part of a necessary iterative design process. In this paper, an approach is presented for applying complex mu-analysis methods to a closed-loop vibration isolation system (experiment plus controller). An analysis framework is presented for evaluating nominal stability, nominal performance, robust stability, and robust performance of active microgravity isolation systems, with emphasis on the effective use of mu-analysis methods.
NASA Technical Reports Server (NTRS)
Braun, R. D.; Kroo, I. M.
1995-01-01
Collaborative optimization is a design architecture applicable in any multidisciplinary analysis environment but specifically intended for large-scale distributed analysis applications. In this approach, a complex problem is hierarchically decomposed along disciplinary boundaries into a number of subproblems which are brought into multidisciplinary agreement by a system-level coordination process. When applied to problems in a multidisciplinary design environment, this scheme has several advantages over traditional solution strategies. These advantageous features include a reduction in the amount of information transferred between disciplines, the removal of large iteration loops, the use of different subspace optimizers among the various analysis groups, an analysis framework that is easily parallelized and can operate on heterogeneous equipment, and a structural framework that is well suited to conventional disciplinary organizations. In this article, the collaborative architecture is developed and its mathematical foundation is presented. An example application is also presented which highlights the potential of this method for use in large-scale design applications.
Determination Of Slope Instability Using Spatially Integrated Mapping Framework
NASA Astrophysics Data System (ADS)
Baharuddin, I. N. Z.; Omar, R. C.; Roslan, R.; Khalid, N. H. N.; Hanifah, M. I. M.
2016-11-01
The determination and identification of slope instability often rely on data obtained from in-situ soil investigation work, which involves the logistics of machinery and manpower; these aspects may increase costs, especially for remote locations. Therefore, a method that can identify possible slope instability without frequent ground walkabout surveys is needed. This paper presents a method for predicting slope instability using a spatially integrated mapping framework applicable to remote areas such as tropical forest and natural hilly terrain. Spatial data such as geology, topography, land use maps, slope angle and elevation were used in regional analysis during a desktop study. Through this framework, the occurrence of slope instability was identified and validated using a confirmatory site-specific analysis.
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
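The search-over-a-parameter-space view described above can be sketched as a small exhaustive search subject to a quality constraint. The parameter names and the cost model below are hypothetical illustrations, not the framework's actual interface.

```python
# Exhaustive search over a toy multi-dimensional workflow parameter
# space, trading per-component output quality against runtime.
# Parameter names and the cost model are hypothetical.
import itertools

chunk_sizes = [64, 128, 256]        # grouping of workflow components
quality_levels = [0.5, 0.75, 1.0]   # per-component output quality

def runtime(chunk, quality):
    """Toy cost model: larger chunks amortize overhead, quality costs time."""
    return 1000 / chunk + 50 * quality

def acceptable(quality):
    """Quality constraint on the final workflow output."""
    return quality >= 0.75

best = min((p for p in itertools.product(chunk_sizes, quality_levels)
            if acceptable(p[1])),
           key=lambda p: runtime(*p))
```

Real workflow spaces are too large for exhaustive enumeration, which is why the paper's framework supports optimization along multiple dimensions rather than brute force; the sketch only fixes the problem shape.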
NASA Astrophysics Data System (ADS)
Hess, M. R.; Petrovic, V.; Kuester, F.
2017-08-01
Digital documentation of cultural heritage structures is increasingly more common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of threedimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
Development of a theoretical framework for analyzing cerebrospinal fluid dynamics
Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy
2009-01-01
Background To date hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure volume models and electric circuit analogs introduced pressure into volume conservation; but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding of the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Methods Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
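The control volume analysis the paper builds on rests on integral conservation laws. For mass, the standard statement for a fixed control volume reads as follows (symbols: ρ fluid density, CV the control volume, CS its bounding surface, u the velocity field, n the outward unit normal); a cranial compartment is one example of such a volume.

```latex
% Integral mass conservation for a fixed control volume CV
% bounded by the control surface CS:
\frac{d}{dt}\int_{CV} \rho \, dV
  \;+\; \oint_{CS} \rho \,(\mathbf{u}\cdot\mathbf{n})\, dA \;=\; 0
```

Clinical flow and volume measurements supply the surface and storage terms, which is how the approach directly incorporates the diverse measurements the abstract mentions.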
Dafalla, Tarig Dafalla Mohamed; Kushniruk, Andre W; Borycki, Elizabeth M
2015-01-01
A pragmatic evaluation framework for evaluating the usability and usefulness of an e-learning intervention for a patient clinical information scheduling system is presented in this paper. The framework was conceptualized based on two different but related concepts (usability and usefulness) and the selection of appropriate and valid methods of data collection and analysis, which included: (1) Low-Cost Rapid Usability Engineering (LCRUE), (2) Cognitive Task Analysis (CTA), (3) Heuristic Evaluation (HE) criteria for web-based learning, and (4) the Software Usability Measurement Inventory (SUMI). The results of the analysis showed some areas related to General Interface Usability (GIU), instructional design and content where usability was problematic, some of which might account for the poorly rated aspects of usability when subjectively measured. This paper shows that using a pragmatic framework can be a useful way not only to measure usability and usefulness but also to provide practical, objective evidence for learning and continuous quality improvement of e-learning systems. The findings should be of interest to educators, developers, designers, researchers, and usability practitioners involved in the development of e-learning systems in healthcare. This framework could be an appropriate method for assessing the usability, usefulness and safety of health information systems, both in the laboratory and in the clinical context.
Sacks, G; Swinburn, B; Lawrence, M
2009-01-01
A comprehensive policy approach is needed to control the growing obesity epidemic. This paper proposes the Obesity Policy Action (OPA) framework, modified from the World Health Organization framework for the implementation of the Global Strategy on Diet, Physical Activity and Health, to provide specific guidance for governments to systematically identify areas for obesity policy action. The proposed framework incorporates three different public health approaches to addressing obesity: (i) 'upstream' policies influence either the broad social and economic conditions of society (e.g. taxation, education, social security) or the food and physical activity environments to make healthy eating and physical activity choices easier; (ii) 'midstream' policies are aimed at directly influencing population behaviours; and (iii) 'downstream' policies support health services and clinical interventions. A set of grids for analysing potential policies to support obesity prevention and management is presented. The general pattern that emerges from populating the analysis grids as they relate to the Australian context is that all sectors and levels of government, non-governmental organizations and private businesses have multiple opportunities to contribute to reducing obesity. The proposed framework and analysis grids provide a comprehensive approach to mapping the policy environment related to obesity, and a tool for identifying policy gaps, barriers and opportunities.
BAMSI: a multi-cloud service for scalable distributed filtering of massive genome data.
Ausmees, Kristiina; John, Aji; Toor, Salman Z; Hellander, Andreas; Nettelblad, Carl
2018-06-26
The advent of next-generation sequencing (NGS) has made whole-genome sequencing of cohorts of individuals a reality. Primary datasets of raw or aligned reads of this sort can get very large. For scientific questions where curated called variants are not sufficient, the sheer size of the datasets makes analysis prohibitively expensive. In order to make re-analysis of such data feasible without the need to have access to a large-scale computing facility, we have developed a highly scalable, storage-agnostic framework, an associated API and an easy-to-use web user interface to execute custom filters on large genomic datasets. We present BAMSI, a Software-as-a-Service (SaaS) solution for filtering of the 1000 Genomes phase 3 set of aligned reads, with the possibility of extension and customization to other sets of files. Unique to our solution is the capability of simultaneously utilizing many different mirrors of the data to increase the speed of the analysis. In particular, if the data is available in private or public clouds - an increasingly common scenario for both academic and commercial cloud providers - our framework allows for seamless deployment of filtering workers close to data. We show results indicating that such a setup improves the horizontal scalability of the system, and present a possible use case of the framework by performing an analysis of structural variation in the 1000 Genomes data set. BAMSI constitutes a framework for efficient filtering of large genomic data sets that is flexible in the use of compute as well as storage resources. The data resulting from the filter is assumed to be greatly reduced in size, and can easily be downloaded or routed into e.g. a Hadoop cluster for subsequent interactive analysis using Hive, Spark or similar tools.
In this respect, our framework also suggests a general model for making very large datasets of high scientific value more accessible by offering the possibility for organizations to share the cost of hosting data on hot storage, without compromising the scalability of downstream analysis.
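The per-read custom filters a BAMSI-style worker executes can be sketched in miniature. Real workers operate on BAM records; here reads are modelled as plain (chromosome, position, mapping quality) tuples, and the function name and field layout are assumptions for illustration only.

```python
# Minimal sketch of a custom read filter of the kind a BAMSI-style
# worker runs over aligned reads. Reads are modelled as
# (chrom, pos, mapq) tuples rather than real BAM records.

def region_filter(reads, chrom, start, end, min_mapq=30):
    """Keep reads inside [start, end) on chrom that meet a MAPQ cutoff."""
    return [r for r in reads
            if r[0] == chrom and start <= r[1] < end and r[2] >= min_mapq]

reads = [("chr1", 100, 60),   # in region, good quality -> kept
         ("chr1", 150, 10),   # in region, low quality  -> dropped
         ("chr2", 120, 60)]   # wrong chromosome        -> dropped
kept = region_filter(reads, "chr1", 0, 200)
```

Because each read is judged independently, such a filter parallelizes trivially across mirrors and workers, which is the property the framework exploits.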
Callon, Wynne; Beach, Mary Catherine; Links, Anne R; Wasserman, Carly; Boss, Emily F
2018-03-11
We aimed to develop a comprehensive, descriptive framework to measure shared decision making (SDM) in clinical encounters. We combined a top-down (theoretical) approach with a bottom-up approach based on audio-recorded dialogue to identify all communication processes related to decision making. We coded 55 pediatric otolaryngology visits using the framework and report interrater reliability. We identified 14 clinician behaviors and 5 patient behaviors that have not been previously described, and developed a new SDM framework that is descriptive (what does happen) rather than normative (what should happen). Through the bottom-up approach we identified three broad domains not present in other SDM frameworks: socioemotional support, understandability of clinician dialogue, and recommendation-giving. We also specify the ways in which decision-making roles are assumed implicitly rather than discussed explicitly. Interrater reliability was >75% for 92% of the coded behaviors. This SDM framework allows for a more expansive understanding and analysis of how decision making takes place in clinical encounters, including new domains and behaviors not present in existing measures. We hope that this new framework will bring attention to a broader conception of SDM and allow researchers to further explore the new domains and behaviors identified. Copyright © 2018. Published by Elsevier B.V.
Preface Sections in English and Arabic Linguistics Books: A Rhetorico-Cultural Analysis
ERIC Educational Resources Information Center
Al-Zubaidi, Nassier A. G.; Jasim, Tahani Awad
2016-01-01
The present paper is a genre analysis of linguistics books prefaces in English and Arabic. Following Swales' (1990) genre framework, this study is a small scale-based generic analysis of 80 preface texts, equally divided into 40 texts from English and Arabic. The corpus analysis revealed that to perform its communicative function, the genre of the…
A Study on the Security Levels of Spread-Spectrum Embedding Schemes in the WOA Framework.
Wang, Yuan-Gen; Zhu, Guopu; Kwong, Sam; Shi, Yun-Qing
2017-08-23
Security analysis is a very important issue for digital watermarking. Several years ago, according to Kerckhoffs' principle, the famous four security levels, namely insecurity, key security, subspace security, and stego-security, were defined for spread-spectrum (SS) embedding schemes in the framework of watermarked-only attack. However, up to now there has been little application of the definition of these security levels to the theoretical analysis of the security of SS embedding schemes, due to the difficulty of the theoretical analysis. In this paper, based on the security definition, we present a theoretical analysis to evaluate the security levels of five typical SS embedding schemes, which are the classical SS, the improved SS (ISS), the circular extension of ISS, the nonrobust and robust natural watermarking, respectively. The theoretical analysis of these typical SS schemes are successfully performed by taking advantage of the convolution of probability distributions to derive the probabilistic models of watermarked signals. Moreover, simulations are conducted to illustrate and validate our theoretical analysis. We believe that the theoretical and practical analysis presented in this paper can bridge the gap between the definition of the four security levels and its application to the theoretical analysis of SS embedding schemes.
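The classical SS scheme analyzed first in the paper adds a key-dependent pseudorandom carrier, scaled by the message bit, to the host signal; detection correlates the watermarked signal with the carrier. A self-contained sketch with illustrative parameters (the host is modelled as Gaussian noise; values are not taken from the paper's analysis):

```python
# Classical spread-spectrum (SS) embedding and correlation detection.
# Host signal, strength alpha, and length n are illustrative.
import random

random.seed(0)
n, alpha = 10_000, 1.0
host = [random.gauss(0, 1) for _ in range(n)]              # cover signal
carrier = [random.choice((-1.0, 1.0)) for _ in range(n)]   # secret key
bit = +1                                                   # message bit

# Embedding: y = x + alpha * bit * u
watermarked = [x + alpha * bit * u for x, u in zip(host, carrier)]

# Detection: normalised correlation with the carrier recovers the bit.
corr = sum(y * u for y, u in zip(watermarked, carrier)) / n
decoded = +1 if corr > 0 else -1
```

The security analysis in the paper concerns exactly the distribution of such watermarked signals: an observer without the key must not be able to estimate the carrier from many watermarked observations.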
Chivukula, V; Mousel, J; Lu, J; Vigmostad, S
2014-12-01
The current research presents a novel method in which blood particulates, biconcave red blood cells (RBCs) and spherical cells, are modeled using isogeometric analysis, specifically Non-Uniform Rational B-Splines (NURBS) in 3-D. The use of NURBS ensures that even with a coarse representation, the geometry of the blood particulates maintains an accurate description when subjected to large deformations. The fundamental advantage of this method is the coupling of the geometrical description and the stress analysis of the cell membrane into a single, unified framework. Details on the modeling approach, implementation of boundary conditions and the membrane mechanics analysis using isogeometric modeling are presented, along with validation cases for spherical and biconcave cells. Using NURBS-based isogeometric analysis, the behavior of individual cells in fluid flow is presented and analyzed in different flow regimes using as few as 176 elements for a spherical cell and 220 elements for a biconcave RBC. This work provides a framework for modeling a large number of 3-D deformable biological cells, each with its own geometric description and membrane properties. To the best knowledge of the authors, this is the first application of NURBS-based isogeometric analysis to model and simulate blood particulates in flow in 3D. Copyright © 2014 John Wiley & Sons, Ltd.
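The NURBS machinery behind such a representation starts from B-spline basis functions (Cox-de Boor recursion) made rational with per-control-point weights. The sketch below evaluates these basis functions; the knot vector and weights are illustrative, not the paper's cell geometry.

```python
# Cox-de Boor evaluation of B-spline basis functions and their rational
# (NURBS) form. Knot vector and weights below are illustrative.

def bspline_basis(i, p, u, knots):
    """i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    if knots[i + p + 1] > knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_basis(p, u, knots, weights):
    """Rational basis: weighted B-splines normalised to sum to one."""
    vals = [w * bspline_basis(i, p, u, knots) for i, w in enumerate(weights)]
    total = sum(vals)
    return [v / total for v in vals]

knots = [0, 0, 0, 1, 2, 3, 3, 3]      # open knot vector, degree 2
weights = [1.0, 0.8, 1.2, 0.8, 1.0]   # five control-point weights
R = nurbs_basis(2, 1.5, knots, weights)
```

The same basis functions that define the membrane geometry also serve as the analysis shape functions, which is the geometry/analysis coupling the abstract highlights.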
Efroymson, R A; Suter, G W
2001-04-01
An ecological risk assessment framework for aircraft overflights has been developed, with special emphasis on military applications. This article presents the analysis of effects and risk characterization phases; the problem formulation and exposure analysis phases are presented in a companion article. The framework addresses the effects of sound, visual stressors, and collision on the abundance and production of wildlife populations. Profiles of effects, including thresholds, are highlighted for two groups of endpoint species: ungulates (hoofed mammals) and pinnipeds (seals, sea lions, walruses). Several factors complicate the analysis of effects for aircraft overflights. Studies of the effects of aircraft overflights previously have not been associated with a quantitative assessment framework; therefore no consistent relations between exposure and population-level response have been developed. Information on behavioral effects of overflights by military aircraft (or component stressors) on most wildlife species is sparse. Moreover, models that relate behavioral changes to abundance or reproduction, and those that relate behavioral or hearing effects thresholds from one population to another are generally not available. The aggregation of sound frequencies, durations, and the view of the aircraft into the single exposure metric of slant distance is not always the best predictor of effects, but effects associated with more specific exposure metrics (e.g., narrow sound spectra) may not be easily determined or added. The weight of evidence and uncertainty analyses of the risk characterization for overflights are also discussed in this article.
DECISION-COMPONENTS OF NICE'S TECHNOLOGY APPRAISALS ASSESSMENT FRAMEWORK.
de Folter, Joost; Trusheim, Mark; Jonsson, Pall; Garner, Sarah
2018-01-01
Value assessment frameworks have gained prominence recently in the context of U.S. healthcare. Such frameworks set out a series of factors that are considered in funding decisions. The UK's National Institute of Health and Care Excellence (NICE) is an established health technology assessment (HTA) agency. We present a novel application of text analysis that characterizes NICE's Technology Appraisals in the context of the newer assessment frameworks and present the results in a visual way. A total of 243 documents of NICE's medicines guidance from 2007 to 2016 were analyzed. Text analysis was used to identify a hierarchical set of decision factors considered in the assessments. The frequency of decision factors stated in the documents was determined, along with their association with terms related to uncertainty. The results were incorporated into visual representations of hierarchical factors. We identified 125 decision factors and hierarchically grouped these into eight domains: Clinical Effectiveness, Cost Effectiveness, Condition, Current Practice, Clinical Need, New Treatment, Studies, and Other Factors. Textual analysis showed that all domains appeared consistently in the guidance documents. Many factors were commonly associated with terms relating to uncertainty. A series of visual representations was created. This study reveals the complexity and consistency of NICE's decision-making processes and demonstrates that cost effectiveness is not the only decision criterion. The study highlights the importance of processes and methodology that can take both quantitative and qualitative information into account. Visualizations can help effectively communicate this complex information during the decision-making process and subsequently to stakeholders.
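The core counting step of such a text analysis, how often decision-factor terms appear and how often they co-occur with uncertainty language, can be sketched directly. The documents and term lists below are invented stand-ins, not NICE guidance text or the paper's 125-factor hierarchy.

```python
# Counting decision-factor mentions and their co-occurrence with
# uncertainty terms across guidance documents.
# Term lists and documents are invented illustrations.
from collections import Counter

factor_terms = {"cost effectiveness", "clinical effectiveness", "clinical need"}
uncertainty_terms = {"uncertain", "uncertainty", "unclear"}

documents = [
    "the cost effectiveness estimate is uncertain",
    "clinical effectiveness was demonstrated in two trials",
    "there is unmet clinical need and uncertainty in the model",
]

factor_counts = Counter()
uncertain_docs = 0
for doc in documents:
    for term in factor_terms:
        if term in doc:
            factor_counts[term] += 1
    if any(t in doc for t in uncertainty_terms):
        uncertain_docs += 1
```

Aggregating such counts over hundreds of documents is what yields the frequency and uncertainty-association figures that feed the paper's visual representations.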
FRANOPP: Framework for analysis and optimization problems user's guide
NASA Technical Reports Server (NTRS)
Riley, K. M.
1981-01-01
Framework for analysis and optimization problems (FRANOPP) is a software aid for the study and solution of design (optimization) problems which provides the driving program and plotting capability for a user generated programming system. In addition to FRANOPP, the programming system also contains the optimization code CONMIN, and two user supplied codes, one for analysis and one for output. With FRANOPP the user is provided with five options for studying a design problem. Three of the options utilize the plot capability and present an in-depth study of the design problem. The study can be focused on a history of the optimization process or on the interaction of variables within the design problem.
Identifying influential factors of business process performance using dependency analysis
NASA Astrophysics Data System (ADS)
Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank
2011-02-01
We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper insight into why certain KPI targets are not met.
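The dependency-tree idea can be illustrated with a toy root-node selection in pure Python: for each candidate metric, find the single threshold split that best separates KPI violations, then pick the metric with the cleanest split. All records and metric names are invented, and the paper's framework uses full data-mining tree construction rather than this one-level simplification:

```python
# Hypothetical process records: lower-level metrics plus whether the KPI was met.
records = [
    {"service_time": 2.0, "queue_len": 1, "kpi_met": True},
    {"service_time": 2.5, "queue_len": 7, "kpi_met": True},
    {"service_time": 6.0, "queue_len": 2, "kpi_met": False},
    {"service_time": 7.5, "queue_len": 8, "kpi_met": False},
]

def best_split(records, metric):
    """Find the threshold on one metric that best separates KPI violations,
    returning (misclassified_count, threshold)."""
    values = sorted({r[metric] for r in records})
    best = (len(records) + 1, float("inf"))
    for i in range(len(values) - 1):
        thr = (values[i] + values[i + 1]) / 2
        # predict "KPI met" when the metric is below the threshold
        errors = sum((r[metric] < thr) != r["kpi_met"] for r in records)
        best = min(best, (errors, thr))
    return best

def most_influential(records, metrics):
    """Rank metrics by how cleanly a single split explains the KPI (the tree root)."""
    return min(metrics, key=lambda m: best_split(records, m)[0])

root = most_influential(records, ["service_time", "queue_len"])
```

Here `service_time` is chosen as the root because one threshold on it separates met from missed KPIs perfectly, which is exactly the "most influential factor" signal a dependency tree surfaces first.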
Heterogeneous data fusion for brain tumor classification.
Metsis, Vangelis; Huang, Heng; Andronesi, Ovidiu C; Makedon, Fillia; Tzika, Aria
2012-10-01
Current research in biomedical informatics involves analysis of multiple heterogeneous data sets. This includes patient demographics, clinical and pathology data, treatment history, patient outcomes as well as gene expression, DNA sequences and other information sources such as gene ontology. Analysis of these data sets could lead to better disease diagnosis, prognosis, treatment and drug discovery. In this report, we present a novel machine learning framework for brain tumor classification based on heterogeneous data fusion of metabolic and molecular datasets, including state-of-the-art high-resolution magic angle spinning (HRMAS) proton (1H) magnetic resonance spectroscopy and gene transcriptome profiling, obtained from intact brain tumor biopsies. Our experimental results show that our novel framework outperforms analyses based on any individual dataset.
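A hedged sketch of feature-level fusion: per-sample vectors from two modalities are concatenated and classified with a simple nearest-centroid rule. The sample values, feature names, and class labels below are invented stand-ins for the spectroscopy and transcriptome data; the authors' actual machine learning pipeline is not reproduced here:

```python
# Two modalities per sample (invented numbers): spectral features and one
# gene-expression feature, fused by concatenation into a single vector.
spectro = {"s1": [1.0, 0.2], "s2": [0.9, 0.3], "s3": [0.1, 0.9], "s4": [0.2, 1.0]}
genes   = {"s1": [0.8], "s2": [0.7], "s3": [0.1], "s4": [0.2]}
labels  = {"s1": "low_grade", "s2": "low_grade", "s3": "high_grade", "s4": "high_grade"}

def fuse(sample):
    return spectro[sample] + genes[sample]  # simple feature-level fusion

def centroid(vectors):
    return [sum(col) / len(col) for col in zip(*vectors)]

# One centroid per class, computed on the fused vectors.
centroids = {
    cls: centroid([fuse(s) for s in labels if labels[s] == cls])
    for cls in set(labels.values())
}

def classify(vector):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda cls: dist(vector, centroids[cls]))

pred = classify([0.95, 0.25, 0.75])  # fused features of an unseen sample
```

The point of fusion is that the classifier sees one joint representation, so complementary evidence from both modalities contributes to the decision.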
Person-centered nursing practice with older people in Ireland.
Landers, Margaret G; McCarthy, Geraldine M
2007-01-01
This column presents an analysis of McCormack's conceptual framework for person-centered practice with older people as a theoretical basis for the delivery of care of older adults in an Irish context. The evaluative process is guided by the framework proposed by Fawcett (2000) for the analysis and evaluation of conceptual models of nursing. The historical evolution, philosophical claims, and an overview of the content of the model are addressed. The following criteria are then applied: logical congruence, the generation of the theory, the credibility of the model, and the contribution of the model to the discipline of nursing.
Analysis model for personal eHealth solutions and services.
Mykkänen, Juha; Tuomainen, Mika; Luukkonen, Irmeli; Itälä, Timo
2010-01-01
In this paper, we present a framework for analysing and assessing various features of personal wellbeing information management services and solutions such as personal health records and citizen-oriented eHealth services. The model is based on general functional and interoperability standards for personal health management applications and generic frameworks for different aspects of analysis. It has been developed and used in the MyWellbeing project in Finland to provide a baseline for the research, development and comparison of many different personal wellbeing and health management solutions and to support the development of a unified "Coper" concept for citizen empowerment.
Principles of qualitative analysis in the chromatographic context.
Valcárcel, M; Cárdenas, S; Simonet, B M; Carrillo-Carrión, C
2007-07-27
This article presents the state of the art of qualitative analysis in the framework of chromatographic analysis. After establishing the differences between the two main classes of qualitative analysis (analyte identification and sample classification/qualification), the particularities of instrumental qualitative analysis are commented on. Qualitative chromatographic analysis for sample classification/qualification through the so-called chromatographic fingerprint (for complex samples) or the volatiles profile (through the direct coupling of headspace-mass spectrometry using the chromatograph as interface) is discussed. Next, a more technical exposition of the qualitative chromatographic information is presented, supported by a variety of representative examples.
Auditory Scene Analysis: An Attention Perspective
ERIC Educational Resources Information Center
Sussman, Elyse S.
2017-01-01
Purpose: This review article provides a new perspective on the role of attention in auditory scene analysis. Method: A framework for understanding how attention interacts with stimulus-driven processes to facilitate task goals is presented. Previously reported data obtained through behavioral and electrophysiological measures in adults with normal…
Individual Differences, Intelligence, and Behavior Analysis
ERIC Educational Resources Information Center
Williams, Ben; Myerson, Joel; Hale, Sandra
2008-01-01
Despite its avowed goal of understanding individual behavior, the field of behavior analysis has largely ignored the determinants of consistent differences in level of performance among individuals. The present article discusses major findings in the study of individual differences in intelligence from the conceptual framework of a functional…
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, the methods share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically-motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, as well as provide a foundation for developing more-advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness, a decrease in acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
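The frame-theoretic view can be illustrated in a few lines: with an orthonormal (tight, self-dual) pattern set, the scene is recovered by synthesis with the same frame, x = Σᵢ yᵢ·patternᵢ, where yᵢ = ⟨patternᵢ, x⟩ is the single scalar detector reading under pattern i. The 4-pixel scene and normalized Hadamard patterns below are invented for illustration:

```python
# Rows of a normalized 4x4 Hadamard matrix: an orthonormal pattern basis.
patterns = [
    [0.5,  0.5,  0.5,  0.5],
    [0.5, -0.5,  0.5, -0.5],
    [0.5,  0.5, -0.5, -0.5],
    [0.5, -0.5, -0.5,  0.5],
]

scene = [3.0, 1.0, 4.0, 1.0]  # unknown 4-pixel image (invented values)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Acquisition: one scalar detector measurement per structured pattern.
measurements = [dot(p, scene) for p in patterns]

# Reconstruction: synthesis with the same (self-dual) frame.
recovered = [
    sum(y * p[k] for y, p in zip(measurements, patterns))
    for k in range(len(scene))
]
```

For non-orthonormal or redundant pattern sets, the same recipe holds with the canonical dual frame in the synthesis step, which is where the general frame-theoretic analysis earns its keep.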
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...
2014-06-30
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
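The coupling strategy can be sketched as a fixed-point (Picard) iteration in which two mono-physics solvers exchange fields through a shared interface until the coupled state stops changing. Both toy solvers and all coefficients below are invented stand-ins and do not represent the SHARP codes:

```python
def neutronics(temperature):
    """Toy power solve with negative temperature feedback (invented model)."""
    return 100.0 / (1.0 + 0.01 * temperature)

def thermal_hydraulics(power):
    """Toy temperature solve driven by the power field (invented model)."""
    return 300.0 + 0.5 * power

def couple(tol=1e-8, max_iters=100):
    """Operator-split coupling: alternate the two physics solves until the
    exchanged temperature field converges."""
    temperature = 300.0
    for iteration in range(1, max_iters + 1):
        power = neutronics(temperature)              # physics 1
        new_temperature = thermal_hydraulics(power)  # physics 2
        if abs(new_temperature - temperature) < tol:
            return power, new_temperature, iteration
        temperature = new_temperature
    raise RuntimeError("coupling did not converge")

power, temperature, iters = couple()
```

Tighter coupling schemes (e.g. Newton-based) accelerate or stabilize this loop, but the data-exchange structure through a common backplane is the same.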
Design and applications of a multimodality image data warehouse framework.
Wong, Stephen T C; Hoo, Kent Soo; Knowlton, Robert C; Laxer, Kenneth D; Cao, Xinhau; Hawkins, Randall A; Dillon, William P; Arenson, Ronald L
2002-01-01
A comprehensive data warehouse framework is needed, which encompasses imaging and non-imaging information in supporting disease management and research. The authors propose such a framework, describe general design principles and system architecture, and illustrate a multimodality neuroimaging data warehouse system implemented for clinical epilepsy research. The data warehouse system is built on top of a picture archiving and communication system (PACS) environment and applies an iterative object-oriented analysis and design (OOAD) approach and recognized data interface and design standards. The implementation is based on a Java CORBA (Common Object Request Broker Architecture) and Web-based architecture that separates the graphical user interface presentation, data warehouse business services, data staging area, and backend source systems into distinct software layers. To illustrate the practicality of the data warehouse system, the authors describe two distinct biomedical applications--namely, clinical diagnostic workup of multimodality neuroimaging cases and research data analysis and decision threshold on seizure foci lateralization. The image data warehouse framework can be modified and generalized for new application domains.
Working of Ideology in the TV Commercials of Cold Drinks in Pakistani Media
ERIC Educational Resources Information Center
Ahmad, Madiha; Ahmad, Sofia; Ijaz, Nida; Batool, Sumera; Abid, Maratab
2015-01-01
The article aims at the analysis of the TV commercials of three carbonated cold drinks from Pakistani media. The analysis will be carried out using the three dimensional framework presented by Fairclough. Through the analysis, the ideological framing of the commercials will be brought to light. To achieve this purpose different techniques used by…
ELM Meets Urban Big Data Analysis: Case Studies
Chen, Huajun; Chen, Jiaoyan
2016-01-01
In recent years, the rapid progress of urban computing has generated big-data issues that create both opportunities and challenges. The heterogeneity and sheer volume of data, and the large gap between the physical and virtual worlds, make it difficult to solve practical problems in urban computing quickly. In this paper, we propose a general application framework of ELM for urban computing. We present several real case studies of the framework, such as smog-related health hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203
Doukas, Charalampos; Goudas, Theodosis; Fischer, Simon; Mierswa, Ingo; Chatziioannou, Aristotle; Maglogiannis, Ilias
2010-01-01
This paper presents an open image-mining framework that provides access to tools and methods for the characterization of medical images. Several image processing and feature extraction operators have been implemented and exposed through Web Services. Rapid-Miner, an open source data mining system has been utilized for applying classification operators and creating the essential processing workflows. The proposed framework has been applied for the detection of salient objects in Obstructive Nephropathy microscopy images. Initial classification results are quite promising demonstrating the feasibility of automated characterization of kidney biopsy images.
African Primary Care Research: Qualitative data analysis and writing results
Govender, Indiran; Ogunbanjo, Gboyega A.; Mash, Bob
2014-01-01
This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given. PMID:26245437
African Primary Care Research: qualitative data analysis and writing results.
Mabuza, Langalibalele H; Govender, Indiran; Ogunbanjo, Gboyega A; Mash, Bob
2014-06-05
This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given.
Non-lambertian reflectance modeling and shape recovery of faces using tensor splines.
Kumar, Ritwik; Barmpoutis, Angelos; Banerjee, Arunava; Vemuri, Baba C
2011-03-01
Modeling illumination effects and pose variations of a face is of fundamental importance in the field of facial image analysis. Most of the conventional techniques that simultaneously address both of these problems work with the Lambertian assumption and thus fall short of accurately capturing the complex intensity variation that the facial images exhibit or recovering their 3D shape in the presence of specularities and cast shadows. In this paper, we present a novel Tensor-Spline-based framework for facial image analysis. We show that, using this framework, the facial apparent BRDF field can be accurately estimated while seamlessly accounting for cast shadows and specularities. Further, using local neighborhood information, the same framework can be exploited to recover the 3D shape of the face (to handle pose variation). We quantitatively validate the accuracy of the Tensor Spline model using a more general model based on the mixture of single-lobed spherical functions. We demonstrate the effectiveness of our technique by presenting extensive experimental results for face relighting, 3D shape recovery, and face recognition using the Extended Yale B and CMU PIE benchmark data sets.
Eastwood, John G; Kemp, Lynn A; Jalaludin, Bin B
2016-01-01
We have recently described a protocol for a study that aims to build a theory of neighbourhood context and postnatal depression. That protocol proposed a critical realist Explanatory Theory Building Method comprising (1) an emergent phase, (2) a construction phase, and (3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design was described. The protocol also described in detail the Theory Construction Phase, which is presented here. The Theory Construction Phase will include: (1) defining stratified levels; (2) analytic resolution; (3) abductive reasoning; (4) comparative analysis (triangulation); (5) retroduction; (6) postulate and proposition development; (7) comparison and assessment of theories; and (8) conceptual frameworks and model development. The stratified levels of analysis in this study were predominantly social and psychological. The abductive analysis used the theoretical frames of: Stress Process; Social Isolation; Social Exclusion; Social Services; Social Capital, Acculturation Theory and Global-economic level mechanisms. Realist propositions are presented for each analysis of triangulated data. Inference to best explanation is used to assess and compare theories. A conceptual framework of maternal depression, stress and context is presented that includes examples of mechanisms at psychological, social, cultural and global-economic levels. Stress was identified as a necessary mechanism that has the tendency to cause several outcomes including depression, anxiety, and health harming behaviours. The conceptual framework subsequently included conditional mechanisms identified through the retroduction including the stressors of isolation and expectations and buffers of social support and trust. The meta-theory of critical realism is used here to generate and construct social epidemiological theory using stratified ontology and both abductive and retroductive analysis.
The findings will be applied to the development of a middle range theory and subsequent programme theory for local perinatal child and family interventions.
Wiegmann, D A; Shappell, S A
2001-11-01
The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.
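A minimal illustration of classification with a tiered error taxonomy: coded causal factors are tallied by tier, which is the kind of aggregation that surfaces safety issues needing intervention. The tier names follow the four levels of Reason's latent/active-failure model as used by HFACS; the factor codes and accident records below are invented:

```python
from collections import Counter

# Mapping from example causal-factor codes to HFACS tiers (codes are invented).
TIERS = {
    "skill_based_error": "Unsafe Acts",
    "decision_error": "Unsafe Acts",
    "adverse_mental_state": "Preconditions for Unsafe Acts",
    "inadequate_supervision": "Unsafe Supervision",
    "organizational_process": "Organizational Influences",
}

# Each record lists the coded causal factors of one (invented) accident.
accidents = [
    ["skill_based_error", "adverse_mental_state"],
    ["decision_error", "inadequate_supervision"],
    ["skill_based_error"],
]

# Aggregate: how often does each tier of the system appear as a cause?
tier_counts = Counter(TIERS[f] for acc in accidents for f in acc)
```

Aggregated counts like these are what let analysts compare the relative contribution of aircrew-level versus supervisory and organizational failures across a fleet of accident reports.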
A novel water quality data analysis framework based on time-series data mining.
Deng, Weihui; Wang, Guoyin
2017-07-01
The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
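The granulation step can be sketched as follows: split the series into fixed windows, summarize each window by (mean, standard deviation) in the spirit of a two-dimensional normal-cloud granule (the full cloud model also carries entropy/hyper-entropy parameters, omitted here), then flag the granule least similar to the rest as anomalous. The DO-like values are invented:

```python
import statistics

# Invented weekly DO-like readings; the third window contains a quality event.
do_series = [8.1, 8.0, 8.2, 8.1, 7.9, 8.0, 4.1, 4.3, 8.0, 8.1, 8.2, 8.0]
window = 3

# Granulate: one (mean, stdev) summary per fixed window.
granules = [
    (statistics.mean(do_series[i:i + window]),
     statistics.pstdev(do_series[i:i + window]))
    for i in range(0, len(do_series), window)
]

def granule_distance(g1, g2):
    """Similarity proxy: Euclidean distance between granule summaries."""
    return ((g1[0] - g2[0]) ** 2 + (g1[1] - g2[1]) ** 2) ** 0.5

def most_anomalous(granules):
    """Index of the granule with the largest total distance to all others."""
    return max(
        range(len(granules)),
        key=lambda i: sum(granule_distance(granules[i], g) for g in granules),
    )

anomaly_index = most_anomalous(granules)
```

The same granule-level similarity matrix supports the similarity-search and pattern-discovery tasks: both reduce to queries over pairwise granule distances instead of raw point-by-point comparisons.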
Sizo, Anton; Noble, Bram F; Bell, Scott
2016-03-01
This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change was assessed using a suite of indicators that were subsequently rolled up into a single, multi-dimensional index that is easy to understand and communicate, to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.
Dotson, G Scott; Hudson, Naomi L; Maier, Andrew
2015-01-01
Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management.
A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...
"Scaffolding" through Talk in Groupwork Learning
ERIC Educational Resources Information Center
Panselinas, Giorgos; Komis, Vassilis
2009-01-01
In the present study, we develop and deploy a conceptual framework of "scaffolding" in groupwork learning, through the analysis of the pursuit of a learning goal over time. The analysis follows individuals' different experiences of an interaction as well as collective experiences, considering individual attainment as a result of a bi-directional…
Affordance Analysis--Matching Learning Tasks with Learning Technologies
ERIC Educational Resources Information Center
Bower, Matt
2008-01-01
This article presents a design methodology for matching learning tasks with learning technologies. First a working definition of "affordances" is provided based on the need to describe the action potentials of the technologies (utility). Categories of affordances are then proposed to provide a framework for analysis. Following this, a…
A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.
Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao
2018-05-23
The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.
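The "dynamically select suitable algorithms" idea can be sketched as a policy registry keyed by task objective: tasks are grouped by objective and each group is ordered by its own scheduling policy. Every task, policy, and field name below is invented for illustration and is not taken from the paper's framework:

```python
# Invented IoT computer tasks with different scheduling objectives.
tasks = [
    {"id": "t1", "objective": "deadline", "deadline": 5, "length": 3},
    {"id": "t2", "objective": "deadline", "deadline": 2, "length": 1},
    {"id": "t3", "objective": "throughput", "length": 2},
]

def earliest_deadline_first(group):
    return sorted(group, key=lambda t: t["deadline"])

def shortest_job_first(group):
    return sorted(group, key=lambda t: t["length"])

# Registry: the framework picks a policy suited to each task's objective.
POLICIES = {"deadline": earliest_deadline_first, "throughput": shortest_job_first}

def schedule(tasks):
    """Group tasks by objective, then let each group's policy order it."""
    order = []
    for objective, policy in POLICIES.items():
        group = [t for t in tasks if t["objective"] == objective]
        order.extend(policy(group))
    return [t["id"] for t in order]

plan = schedule(tasks)
```

The registry is the extensibility point: supporting a new class of IoT service means registering one more objective-to-policy mapping, without touching the dispatch loop.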
NASA Astrophysics Data System (ADS)
Razavi, S.; Gupta, H. V.
2015-12-01
Earth and environmental systems models (EESMs) are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. Complexity and dimensionality are manifested by introducing many different factors in EESMs (i.e., model parameters, forcings, boundary conditions, etc.) to be identified. Sensitivity Analysis (SA) provides an essential means for characterizing the role and importance of such factors in producing the model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to 'variogram analysis', that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are limiting cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
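The variogram analogy at the core of VARS can be illustrated directly from its definition, γᵢ(h) = 0.5·E[(f(x + h·eᵢ) − f(x))²]: a factor along whose axis the model response varies strongly has a larger directional variogram. The toy model and crude sampling below are invented and are not the STAR-VARS algorithm:

```python
def model(x1, x2):
    return 5.0 * x1 + 0.5 * x2  # invented model; x1 is the more influential factor

def directional_variogram(f, axis, h=0.1, n=20):
    """Estimate gamma(h) along one factor's axis from n point pairs."""
    total = 0.0
    for k in range(n):
        base = [k / n, (n - k) / n]  # crude sample of the 2-factor space
        step = list(base)
        step[axis] += h              # perturb only the chosen factor
        total += (f(*step) - f(*base)) ** 2
    return 0.5 * total / n

gamma_x1 = directional_variogram(model, axis=0)
gamma_x2 = directional_variogram(model, axis=1)
```

Evaluating γ across a range of h values, rather than at one perturbation size as derivative-based methods effectively do, is what gives the variogram view its multi-scale characterization of sensitivity.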
A framework for telehealth program evaluation.
Nepal, Surya; Li, Jane; Jang-Jaccard, Julian; Alem, Leila
2014-04-01
Evaluating telehealth programs is a challenging task, yet it is the most sensible first step when embarking on a telehealth study. How can we frame and report on telehealth studies? Which health services elements should be selected based on the application's needs? What are the appropriate terms to use to refer to such elements? Various frameworks have been proposed in the literature to answer these questions, and each framework is defined by a set of properties covering different aspects of telehealth systems. The most common properties include application, technology, and functionality. With the proliferation of telehealth, it is important not only to understand these properties but also to define new properties that account for a wider range of contexts of use and evaluation outcomes. This article presents a comprehensive framework for the design, delivery, implementation, and evaluation of telehealth services. We first survey existing frameworks proposed in the literature and then present our proposed comprehensive multidimensional framework for telehealth. The six key dimensions of the proposed framework are health domains, health services, delivery technologies, communication infrastructure, environment setting, and socioeconomic analysis. We define a set of example properties for each dimension. We then demonstrate how we have used the framework to evaluate telehealth programs in rural and remote Australia. A few major international studies have also been mapped to demonstrate the feasibility of the framework. The key characteristics of the framework are that it is (a) loosely coupled and hence easy to use, (b) able to describe a wide range of telehealth programs, and (c) extensible to future developments and needs.
The Coronal Analysis of SHocks and Waves (CASHeW) framework
NASA Astrophysics Data System (ADS)
Kozarev, Kamen A.; Davey, Alisdair; Kendrick, Alexander; Hammer, Michael; Keith, Celeste
2017-11-01
Coronal bright fronts (CBF) are large-scale wavelike disturbances in the solar corona, related to solar eruptions. They are observed (mostly in extreme ultraviolet (EUV) light) as transient bright fronts of finite width, propagating away from the eruption source location. Recent studies of individual solar eruptive events have used EUV observations of CBFs and metric radio type II burst observations to show the intimate connection between waves in the low corona and coronal mass ejection (CME)-driven shocks. EUV imaging with the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory has proven particularly useful for detecting large-scale short-lived CBFs, which, combined with radio and in situ observations, offers great promise for early CME-driven shock characterization capability. This characterization can further be automated and related to models of particle acceleration to produce estimates of particle fluxes in the corona and in the near-Earth environment early in events. We present a framework for the Coronal Analysis of SHocks and Waves (CASHeW). It combines analysis of NASA Heliophysics System Observatory data products and relevant data-driven models into an automated system for the characterization of off-limb coronal waves and shocks and the evaluation of their capability to accelerate solar energetic particles (SEPs). The system utilizes EUV observations and models written in the Interactive Data Language (IDL). In addition, it leverages analysis tools from the SolarSoft package of libraries, as well as third-party libraries. We have tested the CASHeW framework on a representative list of coronal bright front events. Here we present its features, as well as initial results. With this framework, we hope to contribute to the overall understanding of coronal shock waves and their importance for energetic particle acceleration, as well as to improved forecasting of SEP event fluxes.
NASA Astrophysics Data System (ADS)
Tolba, Khaled Ibrahim; Morgenthal, Guido
2018-01-01
This paper presents an analysis of the scalability and efficiency of a simulation framework based on the vortex particle method. The code is applied to the numerical aerodynamic analysis of line-like structures. The numerical code runs on multicore CPU and GPU architectures using the OpenCL framework. The focus of this paper is the analysis of the parallel efficiency and scalability of the method applied to an engineering test case, specifically the aeroelastic response of a long-span bridge girder at the construction stage. The target is to assess the optimal configuration and the required computer architecture, such that it becomes feasible to efficiently utilise the method within the computational resources available to a regular engineering office. The simulations and the scalability analysis are performed on a regular gaming-type computer.
ERIC Educational Resources Information Center
Sauve, Lucie; Brunelle, Renee; Berryman, Tom
2005-01-01
This article presents and discusses some results of the authors' analysis of international and national institutional documents related to environmental education from the 1970s to the present day. The aim of the study is to present a critical characterization of how environmental education is conceptualized and introduced through the ongoing…
Some applications of categorical data analysis to epidemiological studies.
Grizzle, J E; Koch, G G
1979-01-01
Several examples of categorized data from epidemiological studies are analyzed to illustrate that more informative analyses than tests of independence can be performed by fitting models. All of the analyses fit into a unified conceptual framework and can be carried out by weighted least squares. The methods presented show how to calculate point estimates of parameters, asymptotic variances, and asymptotically valid chi-squared tests. The examples presented are analysis of relative risks estimated from several 2 × 2 tables, analysis of selected features of life tables, construction of synthetic life tables from cross-sectional studies, and analysis of dose-response curves. PMID:540590
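The first example above, relative risk from a 2 × 2 table, can be sketched in a few lines (a standard textbook computation, not code from the paper; the counts below are invented): the point estimate is the ratio of event proportions, and the delta method gives the asymptotic variance of its logarithm.

```python
import math

def relative_risk(a, b, c, d):
    """Relative risk and asymptotic 95% CI from a 2x2 table:
    exposed:   a events, b non-events
    unexposed: c events, d non-events
    """
    rr = (a / (a + b)) / (c / (c + d))
    # delta-method variance of log(RR)
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# hypothetical cohort: 30/100 events among exposed, 15/100 among unexposed
rr, lo, hi = relative_risk(30, 70, 15, 85)
```

Pooling several such tables under a common model, and testing it, is what the weighted least squares framework of the paper generalizes.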
A high-level 3D visualization API for Java and ImageJ.
Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin
2010-05-21
Current imaging methods such as Magnetic Resonance Imaging (MRI), confocal microscopy, electron microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. Reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details allows software development efforts to concentrate on the algorithm implementation. Our framework enables biomedical image software to be developed with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.
The role of language in learning physics
NASA Astrophysics Data System (ADS)
Brookes, David T.
Many studies in PER suggest that language poses a serious difficulty for students learning physics. These difficulties are mostly attributed to misunderstanding of specialized terminology. This terminology often assigns new meanings to everyday terms used to describe physical models and phenomena. In this dissertation I present a novel approach to analyzing the role of language in learning physics. This approach is based on the analysis of the historical development of physics ideas, the language of modern physicists, and students' difficulties in the areas of quantum mechanics, classical mechanics, and thermodynamics. These data are analyzed using linguistic tools borrowed from cognitive linguistics and systemic functional grammar. Specifically, I combine the idea of conceptual metaphor and grammar to build a theoretical framework that accounts for: (1) the role and function that language serves for physicists when they speak and reason about physical ideas and phenomena, (2) specific features of students' reasoning and difficulties that may be related to or derived from language that students read or hear. The theoretical framework is developed using the methodology of a grounded theoretical approach. The theoretical framework allows us to make predictions about the relationship between student discourse and their conceptual and problem-solving difficulties. Tests of the theoretical framework are presented in the context of "heat" in thermodynamics and "force" in dynamics. In each case the language that students use to reason about the concepts of "heat" and "force" is analyzed using the theoretical framework. The results of this analysis show that language is very important in students' learning.
In particular, students are (1) using features of physicists' conceptual metaphors to reason about physical phenomena, often overextending and misapplying these features, (2) drawing cues from the grammar of physicists' speech and writing to categorize physics concepts; this categorization of physics concepts plays a key role in students' ability to solve physics problems. In summary, I present a theoretical framework that provides a possible explanation of the role that language plays in learning physics. The framework also attempts to account for how and why physicists' language influences students in the way that it does.
Assimilation Ideology: Critically Examining Underlying Messages in Multicultural Literature
ERIC Educational Resources Information Center
Yoon, Bogum; Simpson, Anne; Haag, Claudia
2010-01-01
Using the framework of multicultural education, this article presents an analysis of multicultural picture books that depict the features of assimilation ideology. The findings suggest that assimilationist ideas are presented through the main characters' identities in the resolution of the story and through the portrayal of a glorified dominant…
Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment
NASA Technical Reports Server (NTRS)
Lee, Meemong; Bowman, Kevin
2014-01-01
Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission designed to provide surface reflectance at the high spectral, spatial, and temporal resolutions from a geostationary orbit necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to global and regional NOx emissions and improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from the GEO-CAPE OSSE framework for seven observation scenarios and three instrument systems.
Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework
NASA Astrophysics Data System (ADS)
Gannon, C.
2017-12-01
As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
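The "customized climate risk metric" the abstract describes is, at its core, a small per-gridcell kernel mapped over a chunked dataset. A minimal sketch of such a kernel (invented for illustration; the actual Four Twenty Seven metrics, and the XArray/Dask plumbing that would apply it lazily across netCDF chunks, are not shown) is an annual count of days exceeding a temperature threshold:

```python
def hot_days(daily_tmax_c, threshold_c=35.0):
    """Annual count of days whose maximum temperature exceeds a
    threshold -- the kind of per-gridcell indicator one would map
    across a Dask-backed xarray dataset of daily maxima."""
    return sum(1 for t in daily_tmax_c if t > threshold_c)

# one synthetic year of daily maxima (deg C) for a single grid cell
series = [30.0] * 300 + [36.5] * 40 + [34.9] * 25
n = hot_days(series)
```

In a distributed setting the same function would be applied per chunk (e.g. via `xarray.apply_ufunc` over a `time`-chunked array), so the full dataset never has to fit in one worker's memory.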
NASA Astrophysics Data System (ADS)
Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir
2017-06-01
We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves describing the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
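To make the copula-fitting task concrete, here is a self-contained toy sketch (not MvCAT itself, which covers many families and uses MCMC; the single Clayton family, sample size, and grid-search maximum likelihood below are simplifications chosen for illustration): simulate dependent uniforms from a Clayton copula, then recover its parameter by maximizing the log-likelihood over a grid.

```python
import math, random

def clayton_logpdf(u, v, theta):
    # Clayton copula density: (1+t)(uv)^(-1-t) (u^-t + v^-t - 1)^(-2-1/t)
    return (math.log(1 + theta)
            - (1 + theta) * (math.log(u) + math.log(v))
            - (2 + 1 / theta) * math.log(u ** -theta + v ** -theta - 1))

def clayton_sample(theta, n, seed=1):
    # conditional-distribution method: invert C(v|u) at a uniform draw w
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u, w = 1 - rng.random(), 1 - rng.random()  # draws in (0, 1]
        v = ((w ** (-theta / (1 + theta)) - 1) * u ** -theta + 1) ** (-1 / theta)
        out.append((u, v))
    return out

data = clayton_sample(theta=2.0, n=2000)
# crude maximum-likelihood fit over a parameter grid
grid = [0.1 * k for k in range(1, 60)]
theta_hat = max(grid, key=lambda t: sum(clayton_logpdf(u, v, t) for u, v in data))
```

The abstract's point about local optimizers getting trapped, and about quantifying fitting uncertainty, is exactly what this naive grid/point estimate ignores; a Bayesian posterior over theta (as MvCAT computes) would report the spread around this estimate rather than a single value.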
Overcoming complexities: Damage detection using dictionary learning framework
NASA Astrophysics Data System (ADS)
Alguri, K. Supreet; Melville, Joseph; Deemer, Chris; Harley, Joel B.
2018-04-01
For in situ damage detection, guided wave structural health monitoring systems have been widely researched due to their ability to evaluate large areas and to detect many types of damage. These systems often evaluate structural health by recording initial baseline measurements from a pristine (i.e., undamaged) test structure and then comparing later measurements with that baseline. Yet, it is not always feasible to have a pristine baseline. As an alternative, substituting the baseline with data from a surrogate (nearly identical and pristine) structure is a logical option. While effective in some circumstances, surrogate data is often still a poor substitute for pristine baseline measurements due to minor differences between the structures. To overcome this challenge, we present a dictionary learning framework to adapt surrogate baseline data to better represent an undamaged test structure. We compare the performance of our framework with two other surrogate-based damage detection strategies: (1) using raw surrogate data for comparison and (2) using sparse wavenumber analysis, a precursor to our framework for improving the surrogate data. We apply our framework to guided wave data from two 108 mm by 108 mm aluminum plates. With 20 measurements, we show that our dictionary learning framework achieves a 98% accuracy, raw surrogate data achieves a 92% accuracy, and sparse wavenumber analysis achieves a 57% accuracy.
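The surrogate-dictionary idea can be sketched with a toy example (this is not the authors' dictionary learning method; the sinusoidal "atoms" and synthetic signals below are invented): represent a new measurement as a scaled dictionary atom, and use the projection residual as a damage indicator. A pristine structure is well explained by surrogate atoms; damage adds components the dictionary cannot reconstruct.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def residual_after_projection(signal, dictionary):
    """Project a measurement onto its best-matching dictionary atom
    (least-squares scale) and return the relative residual energy."""
    best = float("inf")
    for atom in dictionary:
        scale = dot(signal, atom) / dot(atom, atom)
        res = [s - scale * a for s, a in zip(signal, atom)]
        best = min(best, math.sqrt(dot(res, res)) / math.sqrt(dot(signal, signal)))
    return best

# synthetic guided-wave "measurements": surrogate atoms are sinusoids;
# damage adds an extra scattered component at a new frequency
t = [i / 100.0 for i in range(200)]
atoms = [[math.sin(2 * math.pi * f * x) for x in t] for f in (3.0, 5.0)]
pristine = [1.3 * a for a in atoms[0]]
damaged = [a + 0.8 * math.sin(2 * math.pi * 9.0 * x) for a, x in zip(atoms[0], t)]
r_ok = residual_after_projection(pristine, atoms)
r_bad = residual_after_projection(damaged, atoms)
```

The framework in the paper goes further by *adapting* the dictionary (learning atoms from surrogate data that better span the test structure's response) rather than using fixed surrogate measurements directly.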
Integrating Data Clustering and Visualization for the Analysis of 3D Gene Expression Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Data Analysis and Visualization; International Research Training Group ``Visualization of Large and Unstructured Data Sets,'' University of Kaiserslautern, Germany; Computational Research Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720, USA
2008-05-12
The recent development of methods for extracting precise measurements of spatial gene expression patterns from three-dimensional (3D) image data opens the way for new analyses of the complex gene regulatory networks controlling animal development. We present an integrated visualization and analysis framework that supports user-guided data clustering to aid exploration of these new complex datasets. The interplay of data visualization and clustering-based data classification leads to improved visualization and enables a more detailed analysis than previously possible. We discuss (i) integration of data clustering and visualization into one framework; (ii) application of data clustering to 3D gene expression data; (iii) evaluation of the number of clusters k in the context of 3D gene expression clustering; and (iv) improvement of overall analysis quality via dedicated post-processing of clustering results based on visualization. We discuss the use of this framework to objectively define spatial pattern boundaries and temporal profiles of genes and to analyze how mRNA patterns are controlled by their regulatory transcription factors.
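The clustering step of such a framework can be illustrated with a minimal stand-in (the paper's actual method, data, and choice of k are not reproduced; the scalar expression values below are synthetic): plain Lloyd's k-means separating cells into "off" and "on" expression populations.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's k-means on scalar values: assign each point to the
    nearest center, recompute centers as cluster means, repeat."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: (p - centers[i]) ** 2)].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# synthetic per-cell expression levels: an "off" and an "on" population
data = [0.1, 0.12, 0.08, 0.11, 0.09, 0.9, 0.95, 1.02, 0.88, 0.93]
centers = kmeans(data, 2)
```

Item (iii) above, choosing k, is the step this sketch glosses over; the paper's contribution is using the visualization itself to evaluate and post-process such clusterings.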
Raut, Savita V; Yadav, Dinkar M
2018-03-28
This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual-information-based voxel selection framework. Previously, fMRI signal analysis has been conducted using the empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former methodology loses the frequency component, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology, in which the frequency component is preserved by decomposing the raw fMRI signal using the geometric mean rather than the arithmetic mean, and voxels are selected from the EMCD signal using GMCD components rather than the raw fMRI signal. The proposed methodology is adopted for predicting the neural response. Experiments are conducted on openly available fMRI data from six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effect of the number of selected voxels and of the selection constraints is analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.
Developing an evaluation framework for clinical redesign programs: lessons learnt.
Samaranayake, Premaratne; Dadich, Ann; Fitzgerald, Anneke; Zeitz, Kathryn
2016-09-19
Purpose: The purpose of this paper is to present lessons learnt through the development of an evaluation framework for a clinical redesign programme - the aim of which was to improve the patient journey through improved discharge practices within an Australian public hospital.
Design/methodology/approach: The development of the evaluation framework involved three stages - namely, the analysis of secondary data relating to the discharge planning pathway; the analysis of primary data including field-notes and interview transcripts on hospital processes; and the triangulation of these data sets to devise the framework. The evaluation framework ensured that resource use, process management, patient satisfaction, and staff well-being and productivity were each connected with measures, targets, and the aim of the clinical redesign programme.
Findings: The application of business process management and a balanced scorecard enabled a different way of framing the evaluation, ensuring measurable outcomes were connected to inputs and outputs. Lessons learnt include: first, the importance of mixed-methods research to devise the framework and evaluate the redesigned processes; second, the need for appropriate tools and resources to adequately capture change across the different domains of the redesign programme; and third, the value of developing and applying an evaluative framework progressively.
Research limitations/implications: The evaluation framework is limited by its retrospective application to a clinical process redesign programme.
Originality/value: This research supports benchmarking with national and international best practices in healthcare redesign processes. Additionally, it provides a theoretical contribution on evaluating health services improvement and redesign initiatives.
Multi-flexible-body analysis for application to wind turbine control design
NASA Astrophysics Data System (ADS)
Lee, Donghoon
The objective of the present research is to build a theoretical and computational framework for the aeroelastic analysis of flexible rotating systems, with special application to wind turbine control design. The methodology is based on the integration of Kane's approach for the analysis of the multi-rigid-body subsystem with a mixed finite element method for the analysis of the flexible-body subsystem. The combined analysis is then strongly coupled with an aerodynamic model based on Blade Element Momentum theory with an inflow model. The unified framework is represented, symbolically, as a set of nonlinear ordinary differential equations with time-variant, periodic coefficients that describe the aeroelastic behavior of the whole system. The framework can be applied directly to control design owing to its symbolic form. Solution procedures for the equations are presented for nonlinear simulation, periodic steady-state solution, and Floquet stability analysis of the system linearized about the steady-state solution. Finally, the linear periodic system equations can be obtained with both system and control matrices as explicit functions of time, which is directly applicable to control design. The structural model is validated by comparing its results with those from existing software, some of it commercial. The stability of the system linearized about a periodic steady-state solution differs from that obtained about a constant steady-state solution, which has been the conventional approach in the field of wind turbine dynamics. Parametric studies are performed on a wind turbine model with various pitch angles, precone angles, and rotor speeds. Combined with composite materials, their effects on wind turbine aeroelastic stability are investigated.
Finally it is suggested that the aeroelastic stability analysis and control design for the whole system is crucial for the design of wind turbines, and the present research breaks new ground in the ability to treat the issue.
2014-01-01
Background: Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations using modelling, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms.
Results: The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input-output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation averaging 15% of the mean values across the resulting parameter sets.
Conclusions: Our results indicate that the presented approach is effective for comparing model alternatives and reducing models to the minimum complexity replicating measured data. We therefore believe that this approach has significant potential for reparameterising existing frameworks, for identification of redundant model components of large biophysical models and to increase their predictive capacity. PMID:24886522
NASA Astrophysics Data System (ADS)
Knoop, Tom H.; Derikx, Loes C.; Verdonschot, Nico; Slump, Cornelis H.
2015-03-01
In the progressive stages of cancer, metastatic lesions often develop in the femur. The accompanying pain and risk of fracture dramatically affect the quality of life of the patient. Radiotherapy is often administered as palliative treatment to relieve pain and restore the bone around the lesion. It is thought to affect the bone mineralization of the treated region, but the quantitative relation between radiation dose and femur remineralization remains unclear. A new framework for the longitudinal analysis of CT scans of patients receiving radiotherapy is presented to investigate this relationship. The implemented framework is capable of automatic calibration of Hounsfield Units to calcium-equivalent values and the estimation of a prediction interval per scan. Other features of the framework are temporal registration of femurs using elastix, transformation of arbitrary Regions Of Interest (ROI), and extraction of metrics for analysis. Built in Matlab, the modular approach aids easy adaptation to the pertinent questions in the explorative phase of the research. For validation purposes, an in-vitro model consisting of a human cadaver femur with a milled hole in the intertrochanteric region was used, representing a femur with a metastatic lesion. The hole was incrementally stacked with plates of PMMA bone cement of variable radiopaqueness. Using a Kolmogorov-Smirnov (KS) test, changes in density distribution due to an increase of the calcium concentration could be discriminated. In a 21 cm3 ROI, changes in 8% of the volume from 888 ± 57 mg·ml-1 to 1000 ± 80 mg·ml-1 could be statistically proven using the proposed framework. In conclusion, the newly developed framework proved to be a useful and flexible tool for the analysis of longitudinal CT data.
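The two-sample KS statistic used above to discriminate density distributions is straightforward to sketch (a hand-rolled illustration, not the framework's Matlab code; the Gaussian density samples below are synthetic, loosely echoing the reported mg·ml-1 values): it is the maximum gap between the two empirical cumulative distribution functions.

```python
import bisect, random

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: sup |F_a(x) - F_b(x)|
    over the pooled sample, where F is the empirical CDF."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for x in a + b:
        fa = bisect.bisect_right(a, x) / len(a)
        fb = bisect.bisect_right(b, x) / len(b)
        d = max(d, abs(fa - fb))
    return d

rng = random.Random(0)
baseline = [rng.gauss(888.0, 57.0) for _ in range(500)]   # pre-treatment densities
followup = [rng.gauss(1000.0, 80.0) for _ in range(500)]  # remineralised region
d_shift = ks_statistic(baseline, followup)
d_null = ks_statistic(baseline, [rng.gauss(888.0, 57.0) for _ in range(500)])
```

A large statistic relative to its null distribution (here, d_shift versus d_null) indicates a genuine change in the voxel density distribution between scans.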
Effect of Additional Incentives for Aviation Biofuels: Results from the Biomass Scenario Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vimmerstedt, Laura J; Newes, Emily K
2017-12-05
The National Renewable Energy Laboratory supported the Department of Energy, Bioenergy Technologies Office, with analysis of alternative jet fuels in collaboration with the U.S. Department of Transportation, Federal Aviation Administration. Airlines for America requested additional exploratory scenarios and analysis within the same analytic framework, the Biomass Scenario Model. The results were presented at a public working meeting of the California Air Resources Board on including alternative jet fuel in the Low Carbon Fuel Standard on March 17, 2017 (https://www.arb.ca.gov/fuels/lcfs/lcfs_meetings/lcfs_meetings.htm). This presentation clarifies and annotates the slides from the public working meeting and provides a link to the full data set. NREL does not advocate for or against the policies analyzed in this study.
Do researchers have an obligation to actively look for genetic incidental findings?
Gliwa, Catherine; Berkman, Benjamin E
2013-01-01
The rapid growth of next-generation genetic sequencing has prompted debate about the responsibilities of researchers toward genetic incidental findings. Assuming there is a duty to disclose significant incidental findings, might there be an obligation for researchers to actively look for these findings? We present an ethical framework for analyzing whether there is a positive duty to look for genetic incidental findings. Using the ancillary care framework as a guide, we identify three main criteria that must be present to give rise to an obligation to look: high benefit to participants, lack of alternative access for participants, and reasonable burden on researchers. Our analysis indicates that there is no obligation to look for incidental findings today, but during the ongoing translation of genomic analysis from research to clinical care, this obligation may arise.
A multiple perspective modeling and simulation approach for renewable energy policy evaluation
NASA Astrophysics Data System (ADS)
Alyamani, Talal M.
Environmental concerns and reliance on fossil fuel sources, including coal, oil, and natural gas, are the two most pressing energy issues currently faced by the United States (U.S.). Incorporating renewable energy sources, although currently an uneconomical option for electricity generation compared with conventional sources that burn fossil fuels, promises a viable solution to both of these issues. Accordingly, several energy policies have been suggested to reduce the financial burden of adopting renewable energy technologies and to make such technologies competitive with conventional sources throughout the U.S. This study presents a modeling and analysis approach for the comprehensive evaluation of renewable energy policies with respect to their benefits to the various stakeholders involved (customers, utilities, and governmental and environmental agencies), whereby the impacts, advantages, and disadvantages of such policies can be assessed and quantified at the state level. In this work, a novel simulation framework is presented to help policymakers promptly assess and evaluate policies from the different perspectives of their stakeholders. The proposed framework is composed of four modules: 1) a database that collates economic, operational, and environmental data; 2) policy elucidation, which encodes the policy for the simulation model; 3) a preliminary analysis, which makes predictions for consumption, supply, and prices; and 4) a simulation model. After the validity of the proposed framework is demonstrated, a series of planned Florida and Texas renewable energy policies is implemented in the presented framework as case studies. Two solar programs and one energy efficiency program are selected for the Florida case study. A utility rebate program and a federal tax credit program are selected for the Texas case study. 
The results obtained from the simulation and conclusions drawn on the assessment of current energy policies are presented with respect to the conflicting objectives of different stakeholders.
An analysis of the concept of competence in individuals and social systems.
Adler, P T
1982-01-01
This paper has attempted to present a unified conceptual model of positive mental health or competence from the perspective of individuals and from the perspective of social systems of varying degrees of complexity, such as families, organizations, and entire communities. It has provided a taxonomy of the elements of competence which allows the application of a common framework to the analysis of competence and to the planning and evaluation of competence building interventions at any level of social organization. Community Mental Health Centers can apply the model which has been presented in a number of different ways. At whatever level(s) the CMHCs' efforts are directed, the competence model presents a framework for analysis, intervention, and evaluation which enriches and expands upon more typical disorder-based formulations. By providing a framework which encompasses all levels of social organization, the model provides the conceptual tools for going beyond the individual and microsystem levels which have often constituted the boundaries of CMHC concern, and allows the CMHC to approach the organizational and community levels which must be encompassed by a competently comprehensive center. Application of the concept of competence to social organizations and to communities allows the CMHC to analyze and intervene at these levels. Finally, the concept of organizational competence separated into its various elements provides the CMHC with a tool for analyzing and evaluating its own environment and the competence of various aspects of its own functioning within that environment.
ERIC Educational Resources Information Center
Leech, Nancy L.; Onwuegbuzie, Anthony J.
2008-01-01
Qualitative researchers in school psychology have a multitude of analyses available for data. The purpose of this article is to present several of the most common methods for analyzing qualitative data. Specifically, the authors describe the following 18 qualitative analysis techniques: method of constant comparison analysis, keywords-in-context,…
Data Curation and Visualization for MuSIASEM Analysis of the Nexus
NASA Astrophysics Data System (ADS)
Renner, Ansel
2017-04-01
A novel software-based approach to relational analysis applying recent theoretical advancements of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) accounting framework is presented. This research explores and explains underutilized ways in which software can assist complex system analysis across the stages of data collection, exploration, analysis, and dissemination, in a transparent and collaborative manner. This work is being conducted as part of, and in support of, the four-year European Commission H2020 project Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (MAGIC). In MAGIC, theoretical advancements to MuSIASEM propose a powerful new approach to spatial-temporal WEFC relational analysis in accordance with a structural-functional scaling mechanism appropriate for biophysically relevant complex system analyses. The software is designed primarily in JavaScript using the Angular 2 model-view-controller framework and the Data-Driven Documents (D3) library. These design choices clarify and modularize data flow, simplify research practitioners' work, allow for and assist stakeholder involvement, and advance collaboration at all stages. Data requirements and scalable, robust yet lightweight data structuring will first be explained. Next, algorithms to process these data will be explored. Lastly, data interfaces and data visualization approaches will be presented and described.
OpenElectrophy: An Electrophysiological Data- and Analysis-Sharing Framework
Garcia, Samuel; Fourcaud-Trocmé, Nicolas
2008-01-01
Progress in experimental tools and design is allowing the acquisition of increasingly large datasets. Storage, manipulation and efficient analyses of such large amounts of data is now a primary issue. We present OpenElectrophy, an electrophysiological data- and analysis-sharing framework developed to fill this niche. It stores all experiment data and meta-data in a single central MySQL database, and provides a graphic user interface to visualize and explore the data, and a library of functions for user analysis scripting in Python. It implements multiple spike-sorting methods, and oscillation detection based on the ridge extraction methods due to Roux et al. (2007). OpenElectrophy is open source and is freely available for download at http://neuralensemble.org/trac/OpenElectrophy. PMID:19521545
Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M
2018-03-05
The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides a novel means for the systematic integrative analysis of heterogeneous data types in the development of complex botanicals, such as polyphenols, for eventual clinical and translational applications.
Argumentation in Science Education: A Model-based Framework
NASA Astrophysics Data System (ADS)
Böttcher, Florian; Meisert, Anke
2011-02-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model if necessary in relation to alternative models. Secondly, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second more complex argumentative sequence will also be analysed according to the invented analytical scheme to give a broader impression of its potential in practical use.
Predicting SPE Fluxes: Coupled Simulations and Analysis Tools
NASA Astrophysics Data System (ADS)
Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.
2017-12-01
Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.
NASA Astrophysics Data System (ADS)
Li, Wei; Chen, Ting; Zhang, Wenjun; Shi, Yunyu; Li, Jun
2012-04-01
In recent years, music video data has been increasing at an astonishing speed. Shot segmentation and keyframe extraction are fundamental steps in organizing, indexing, and retrieving video content. In this paper, a unified framework is proposed to detect shot boundaries and extract the keyframe of each shot. A music video is first segmented into shots using an illumination-invariant chromaticity histogram in independent component (IC) analysis feature space. We then present a new metric, image complexity, computed from the ICs, to extract the keyframe of a shot. Experimental results show that the framework is effective and performs well.
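The shot-segmentation step above can be sketched as a histogram-difference detector. Note the paper's actual method uses illumination-invariant chromaticity histograms in an independent-component feature space; as a stated simplification, plain per-frame intensity histograms stand in for that representation here, and the threshold value is made up for illustration.

```python
# Minimal sketch of histogram-difference shot-boundary detection.
# Frames are flat lists of pixel intensities; a large L1 distance
# between consecutive frame histograms marks a candidate cut.

def histogram(frame, bins=4, max_val=256):
    """Coarse, normalised intensity histogram of one frame."""
    h = [0] * bins
    step = max_val // bins
    for v in frame:
        h[min(v // step, bins - 1)] += 1
    n = len(frame)
    return [c / n for c in h]

def shot_boundaries(frames, threshold=0.5):
    """Indices i where the histogram L1-distance between frame i-1
    and frame i exceeds the threshold (candidate shot cuts)."""
    cuts = []
    prev = histogram(frames[0])
    for i in range(1, len(frames)):
        cur = histogram(frames[i])
        dist = sum(abs(a - b) for a, b in zip(prev, cur))
        if dist > threshold:
            cuts.append(i)
        prev = cur
    return cuts

# Two dark frames followed by two bright frames: one cut at index 2.
dark = [10] * 100
bright = [200] * 100
print(shot_boundaries([dark, dark, bright, bright]))  # -> [2]
```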
Presotto, Anna Gabriella Camacho; Bhering, Cláudia Lopes Brilhante; Mesquita, Marcelo Ferraz; Barão, Valentim Adelino Ricardo
2017-03-01
Several studies have shown the superiority of computer-assisted design and computer-assisted manufacturing (CAD-CAM) technology compared with conventional casting. However, an advanced technology exists for casting procedures (the overcasting technique), which may serve as an acceptable and affordable alternative to CAD-CAM technology for fabricating 3-unit implant-supported fixed dental prostheses (FDPs). The purpose of this in vitro study was to evaluate, using quantitative photoelastic analysis, the effect of the prosthetic framework fabrication method (CAD-CAM and overcasting) on the marginal fit and stress transmitted to implants. The correlation between marginal fit and stress was also investigated. Three-unit implant-supported FDP frameworks were made using the CAD-CAM (n=10) and overcasting (n=10) methods. The frameworks were waxed to simulate a mandibular first premolar (PM region) to first molar (M region) FDP using overcast mini-abutment cylinders. The wax patterns were overcast (overcast experimental group) or scanned to obtain the frameworks (CAD-CAM control group). All frameworks were fabricated from cobalt-chromium (CoCr) alloy. The marginal fit was analyzed according to the single-screw test protocol, obtaining an average value for each region (M and PM) and each framework. The frameworks were tightened for the photoelastic model with standardized 10-Ncm torque. Stress was measured by quantitative photoelastic analysis. The results were submitted to the Student t test, 2-way ANOVA, and Pearson correlation test (α=.05). The framework fabrication method (FM) and evaluation site (ES; M and PM regions) did not affect the marginal fit values (P=.559 for FM and P=.065 for ES) and stress (P=.685 for FM and P=.468 for ES) in the implant-supported system. Positive correlations between marginal fit and stress were observed (CAD-CAM: r=0.922; P<.001; overcast: r=0.908; P<.001). 
CAD-CAM and overcasting methods present similar marginal fit and stress values for 3-unit FDP frameworks. The decreased marginal fit of frameworks induces greater stress in the implant-supported system. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Poudel, R; Jumpponen, A; Schlatter, D C; Paulitz, T C; Gardener, B B McSpadden; Kinkel, L L; Garrett, K A
2016-10-01
Network models of soil and plant microbiomes provide new opportunities for enhancing disease management, but also challenges for interpretation. We present a framework for interpreting microbiome networks, illustrating how observed network structures can be used to generate testable hypotheses about candidate microbes affecting plant health. The framework includes four types of network analyses. "General network analysis" identifies candidate taxa for maintaining an existing microbial community. "Host-focused analysis" includes a node representing a plant response such as yield, identifying taxa with direct or indirect associations with that node. "Pathogen-focused analysis" identifies taxa with direct or indirect associations with taxa known a priori as pathogens. "Disease-focused analysis" identifies taxa associated with disease. Positive direct or indirect associations with desirable outcomes, or negative associations with undesirable outcomes, indicate candidate taxa. Network analysis provides characterization not only of taxa with direct associations with important outcomes such as disease suppression, biofertilization, or expression of plant host resistance, but also taxa with indirect associations via their association with other key taxa. We illustrate the interpretation of network structure with analyses of microbiomes in the oak phyllosphere, and in wheat rhizosphere and bulk soil associated with the presence or absence of infection by Rhizoctonia solani.
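The "host-focused analysis" described above can be sketched as a graph traversal: include a node for a plant response and find taxa with direct (distance 1) or indirect (distance 2 or more) associations with it. The network edges and taxon names below are hypothetical, purely for illustration.

```python
# Sketch of host-focused microbiome network analysis: breadth-first
# search from a plant-response node ("yield") classifies taxa as
# directly or indirectly associated with that outcome.
from collections import deque

# Hypothetical association network; real edges would come from
# co-occurrence or correlation analysis of microbiome data.
edges = [("TaxonA", "yield"), ("TaxonB", "TaxonA"),
         ("TaxonC", "TaxonB"), ("TaxonD", "TaxonE")]

def neighbours(node):
    out = set()
    for u, v in edges:
        if u == node:
            out.add(v)
        if v == node:
            out.add(u)
    return out

def associated(target="yield"):
    """BFS from the response node; distance 1 = direct association,
    distance >= 2 = indirect association via other taxa."""
    dist = {target: 0}
    q = deque([target])
    while q:
        n = q.popleft()
        for m in neighbours(n):
            if m not in dist:
                dist[m] = dist[n] + 1
                q.append(m)
    direct = {n for n, d in dist.items() if d == 1}
    indirect = {n for n, d in dist.items() if d >= 2}
    return direct, indirect

direct, indirect = associated()
print(direct)    # -> {'TaxonA'}
print(indirect)  # -> {'TaxonB', 'TaxonC'}
```

Taxa in a disconnected component (here TaxonD and TaxonE) have no association with the response node and drop out of the candidate list.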
General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.
Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng
2017-05-02
As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.
Policy Analysis of the English Graduation Benchmark in Taiwan
ERIC Educational Resources Information Center
Shih, Chih-Min
2012-01-01
To nudge students to study English and to improve their English proficiency, many universities in Taiwan have imposed an English graduation benchmark on their students. This article reviews this policy, using the theoretic framework for education policy analysis proposed by Haddad and Demsky (1995). The author presents relevant research findings,…
A Systemic Cause Analysis Model for Human Performance Technicians
ERIC Educational Resources Information Center
Sostrin, Jesse
2011-01-01
This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…
An Institutional Theory Analysis of Charter Schools: Addressing Institutional Challenges to Scale
ERIC Educational Resources Information Center
Huerta, Luis A.; Zuckerman, Andrew
2009-01-01
This article presents a conceptual framework derived from institutional theory in sociology that offers two competing policy contexts in which charter schools operate--a bureaucratic frame versus a decentralized frame. An analysis of evolving charter school types based on three underlying theories of action is considered. As charter school leaders…
Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis
ERIC Educational Resources Information Center
Young, Cristobal; Holsteen, Katherine
2017-01-01
Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
Cost Efficiency in the University: A Departmental Evaluation Model
ERIC Educational Resources Information Center
Gimenez, Victor M.; Martinez, Jose Luis
2006-01-01
This article presents a model for the analysis of cost efficiency within the framework of data envelopment analysis models. It calculates the cost excess, separating a unit of production from its optimal or frontier levels, and, at the same time, breaks these excesses down into three explanatory factors: (a) technical inefficiency, which depends…
AFIT/AFOSR Workshop on the Role of Wavelets in Signal Processing Applications
1992-08-28
Stein and G. Weiss, "Fourier analysis on Euclidean spaces," Princeton University Press, 1971. [V] G. Vitali, Sulla condizione di chiusura di un sistema ... present the more general framework into which wavelets fit, suggesting companion approaches to time-scale analysis for self-similar and 1/f-type processes
Man-made objects cuing in satellite imagery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skurikhin, Alexei N
2009-01-01
We present a multi-scale framework for cuing man-made structures in satellite image regions. The approach is based on a hierarchical image segmentation followed by structural analysis. A hierarchical segmentation produces an image pyramid that contains a stack of irregular image partitions, represented as polygonized pixel patches, at successively reduced levels of detail (LODs). We start from the over-segmented image represented by polygons attributed with spectral and texture information. The image is represented as a proximity graph with vertices corresponding to the polygons and edges reflecting polygon relations. This is followed by iterative graph contraction based on Boruvka's Minimum Spanning Tree (MST) construction algorithm. The graph contractions merge the patches based on their pairwise spectral and texture differences. Concurrently with the construction of the irregular image pyramid, structural analysis is done on the agglomerated patches. Man-made object cuing is based on the analysis of shape properties of the constructed patches and their spatial relations. The presented framework can be used as a pre-scanning tool for wide-area monitoring to quickly guide further analysis to regions of interest.
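The Boruvka-style graph contraction named above can be sketched as follows: each component of the region-adjacency graph repeatedly merges along its cheapest (most similar) outgoing edge, with a cap so that very dissimilar patches stay separate. The edge weights below are made up; real weights would combine spectral and texture differences between patches.

```python
# Illustrative sketch of Boruvka-style region merging over a
# region-adjacency graph with n patch vertices.

def boruvka_merge(n, edges, max_weight):
    """edges: list of (weight, u, v). Each component picks its cheapest
    outgoing edge and merges along it, provided the weight does not
    exceed max_weight. Returns the sorted list of component roots."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    merged = True
    while merged:
        merged = False
        cheapest = {}
        for w, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv or w > max_weight:
                continue
            for r in (ru, rv):
                if r not in cheapest or w < cheapest[r][0]:
                    cheapest[r] = (w, ru, rv)
        for w, ru, rv in cheapest.values():
            if find(ru) != find(rv):
                parent[find(ru)] = find(rv)
                merged = True
    return sorted({find(i) for i in range(n)})

# Patches 0-1-2 are mutually similar; patch 3 differs strongly.
edges = [(0.1, 0, 1), (0.2, 1, 2), (0.9, 2, 3)]
print(len(boruvka_merge(4, edges, max_weight=0.5)))  # -> 2 regions
```

Raising `max_weight` to 1.0 lets the dissimilar patch join as well, contracting the graph to a single region.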
Framework for SEM contour analysis
NASA Astrophysics Data System (ADS)
Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.
2017-03-01
SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new noise filter which is used to reduce noise on SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements on all kinds of patterns.
Combining Cryptography with EEG Biometrics
Damaševičius, Robertas; Maskeliūnas, Rytis; Kazanavičius, Egidijus; Woźniak, Marcin
2018-01-01
Cryptographic frameworks depend on key sharing for ensuring security of data. While the keys in cryptographic frameworks must be correctly reproducible and not unequivocally connected to the identity of a user, in biometric frameworks this is different. Joining cryptography techniques with biometrics can solve these issues. We present a biometric authentication method based on the discrete logarithm problem and Bose-Chaudhuri-Hocquenghem (BCH) codes, perform its security analysis, and demonstrate its security characteristics. We evaluate a biometric cryptosystem using our own dataset of electroencephalography (EEG) data collected from 42 subjects. The experimental results show that the described biometric user authentication system is effective, achieving an Equal Error Rate (EER) of 0.024.
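The Equal Error Rate reported above is the standard operating point where the false accept rate (impostors accepted) equals the false reject rate (genuine users rejected). A minimal sketch of how it is estimated from score distributions, with made-up score lists for illustration:

```python
# Sketch of Equal Error Rate (EER) estimation: sweep the decision
# threshold and find where FAR and FRR cross.

def far_frr(genuine, impostor, threshold):
    """Error rates at a given threshold: accept when score >= threshold."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

def equal_error_rate(genuine, impostor):
    """Approximate EER: the threshold where |FAR - FRR| is smallest;
    return the mean of the two rates at that point."""
    best = None
    for t in sorted(set(genuine + impostor)):
        far, frr = far_frr(genuine, impostor, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

genuine = [0.9, 0.8, 0.75, 0.7, 0.4]   # matching-user scores
impostor = [0.6, 0.3, 0.2, 0.1, 0.05]  # non-matching scores
print(equal_error_rate(genuine, impostor))  # -> 0.2
```

A lower EER means better separation between genuine and impostor score distributions; 0.024 as reported above indicates the two distributions barely overlap.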
NASA Astrophysics Data System (ADS)
Tisa, Paul C.
Every year the DoD spends billions of dollars satisfying its large petroleum demand. This spending is highly sensitive to uncontrollable and poorly understood market forces. Additionally, while some stakeholders may not prioritize its monetary cost and risk, energy is fundamentally coupled to other critical factors. Energy, operational capability, and logistics are heavily intertwined and dependent on uncertain security-environment and technology futures. These components and their relationships are less well understood. Without better characterization, future capabilities may be significantly limited by present-day acquisition decisions. One attempt to demonstrate these costs and risks to decision makers has been a metric known as the Fully Burdened Cost of Energy (FBCE), defined as the commodity price of fuel plus many of these hidden costs. The metric encouraged a valuable conversation and is still required by law. However, most FBCE development stopped before the lessons from that conversation were incorporated. The current implementation is easy to employ but creates little value. Properly characterizing the costs and risks of energy and putting them in a useful tradespace requires a new framework. This research aims to highlight energy's complex role in many aspects of military operations, the critical need to incorporate it in decisions, and a novel framework for doing so. It is organized in five parts. The first describes the motivation behind FBCE, the limits of its current implementation, and the outline of a new framework that aids decisions. The second, third, and fourth parts, respectively, present a historical analysis of the connections between military capabilities and energy, analyze the recent evolution of this conversation within the DoD, and incorporate the historical analysis into a revised framework. 
The final part quantifies the potential impacts of deeply uncertain futures and technological development and introduces an expanded framework that brings capability, energy, and their uncertainty into the same tradespace. The work presented is intended to inform better policies and investment decisions for military acquisitions. The discussion highlights areas within the DoD's understanding of energy that could improve or whose development has faltered. The new metric discussed allows the DoD to better manage and plan for long-term energy-related costs and risk.
NASA Technical Reports Server (NTRS)
Naiman, Cynthia Gutierrez
2010-01-01
Advancing and exploring the science of Multidisciplinary Analysis & Optimization (MDAO) capabilities are high-level goals in the Fundamental Aeronautics Program s Subsonic Fixed Wing (SFW) project. The OpenMDAO team has made significant progress toward completing the Alpha OpenMDAO deliverable due in September 2010. Included in the presentation are: details of progress on developing the OpenMDAO framework, example usage of OpenMDAO, technology transfer plans, near term plans, progress toward establishing partnerships with external parties, and discussion of additional potential collaborations.
Exploring Evolving Media Discourse Through Event Cueing.
Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross
2016-01-01
Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests to see if the level of framing is different before or after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.
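The intervention test described above asks whether the level of framing differs before and after a given date. A full intervention model would fit an ARIMA-type series with a step regressor; as a stated simplification, the sketch below compares the two periods with a Welch t statistic, with hypothetical daily framing scores.

```python
# Simplified event-cueing test: is mean framing different before vs.
# after a candidate event date? Uses Welch's unequal-variance t statistic.
import math

def welch_t(before, after):
    """Welch's t statistic for a two-sample mean comparison."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    se = math.sqrt(var(before) / len(before) + var(after) / len(after))
    return (mean(after) - mean(before)) / se

# Hypothetical daily framing levels around a candidate event date.
before = [0.30, 0.32, 0.28, 0.31, 0.29]
after = [0.50, 0.52, 0.48, 0.51, 0.49]
t = welch_t(before, after)
print(t > 4)  # a large statistic cues the analyst to inspect the date
```

In the framework above, such a statistically significant shift is what triggers the visual exploration of related datasets for explanatory events.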
The added value of thorough economic evaluation of telemedicine networks.
Le Goff-Pronost, Myriam; Sicotte, Claude
2010-02-01
This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
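Two of the calculations named above, net present value and a break-even threshold, can be sketched directly. The cash flows and discount rate below are hypothetical, standing in for a network's setup outlay and annual savings.

```python
# Sketch of the NPV and break-even calculations used in the proposed
# economic evaluation framework.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the initial (year-0) outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def break_even_year(rate, cash_flows):
    """First year the cumulative discounted cash flow is >= 0, else None."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf / (1 + rate) ** t
        if total >= 0:
            return t
    return None

# Hypothetical: 100k setup cost, 20k/year savings, 5% discount rate.
flows = [-100_000, 20_000, 20_000, 20_000, 20_000]
print(npv(0.05, flows))              # negative, as in the study's 4 years
print(break_even_year(0.05, flows))  # -> None: threshold not yet reached
```

Extending the horizon or adding savings streams (more specialties, more hospitals) raises the discounted inflows, which is exactly the kind of alternative the framework's additional steps surface.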
Hydrologic Connectivity: a Framework to Understand Threshold Behaviour in Semi-Arid Landscapes.
NASA Astrophysics Data System (ADS)
Saco, Patricia; Rodriguez, Jose; Keesstra, Saskia; Moreno-de las Heras, Mariano; Sandi, Steven; Baartman, Jantiene; Cerdà, Artemi
2017-04-01
Anthropogenic activities and climate change are imposing unprecedented pressure on arid and semi-arid ecosystems, where shortage of water can trigger shifts in landscape structure and function leading to degradation and desertification. Hydrological connectivity is a useful framework for understanding water redistribution and the scaling issues associated with runoff and sediment production, since human and/or natural disturbances alter surface water availability and pathways, increasing or decreasing connectivity. In this presentation, we illustrate the use of the connectivity framework for several examples of dryland systems that are analysed at a variety of spatial and temporal scales. In doing so, we draw particular attention to the analysis of the co-evolution of system structure and function, and how they drive the threshold behaviour leading to desertification and degradation. We first analyse the case of semi-arid rangelands, where feedback between decline in vegetation density and landscape erosion reinforces degradation processes driven by changes in connectivity, until a threshold is crossed above which the return to a functional system is unlikely. We then focus on semi-arid wetlands, where decreases in water volume promote dryland vegetation encroachment that changes drainage conditions and connectivity, potentially reinforcing the redistribution of flow paths to other wetland areas. The examples presented highlight the need to incorporate a co-evolutionary framework in the analysis of changing connectivity patterns and the emergence of thresholds in arid and semi-arid systems.
2011-01-01
Background: Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework, along with an empirical evaluation of the literature. Results: We present multivariate methods that use summary-based data, as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that can result from the traditional approach of comparing each haplotype against all remaining ones, while they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can easily be extended to allow for uncertainty by weighting the haplotypes by their probability. Conclusions: An empirical evaluation of the published literature, and a comparison against meta-analyses that use single nucleotide polymorphisms, suggests that meta-analyses of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used, and that in approximately half of the cases the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata, and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440
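The abstract's multivariate framework builds on standard inverse-variance pooling of effect estimates. As a simplified, univariate illustration of that building block (not the paper's full multinomial/Poisson machinery, and with invented study data), one haplotype's log odds ratio can be pooled across studies like this:

```python
import math

# Simplified sketch: fixed-effect inverse-variance pooling of one
# haplotype's log odds ratio across studies. The per-study estimates
# and standard errors below are hypothetical.

def pool_fixed(log_ors, ses):
    """Return the pooled estimate and its standard error."""
    w = [1 / s**2 for s in ses]                       # inverse-variance weights
    est = sum(wi * b for wi, b in zip(w, log_ors)) / sum(w)
    se = math.sqrt(1 / sum(w))
    return est, se

log_ors = [0.30, 0.12, 0.25]   # per-study log odds ratios (invented)
ses     = [0.10, 0.15, 0.12]   # their standard errors (invented)
est, se = pool_fixed(log_ors, ses)
z = est / se                   # Wald statistic for this haplotype
print(round(est, 3), round(z, 2))
```

The multivariate methods in the paper generalize this step by pooling all haplotype effects jointly, which is what avoids the inflated type I error of one-vs-rest comparisons.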
NASA Astrophysics Data System (ADS)
Suárez Araujo, Carmen Paz; Barahona da Fonseca, Isabel; Barahona da Fonseca, José; Simões da Fonseca, J.
2004-08-01
A theoretical approach aimed at identifying the information processing that may be responsible for the emotional dimensions of subjective experience is studied as an initial step in the construction of a neural net model of the affective dimensions of psychological experience. In this paper it is suggested that a form of oriented recombination of attributes may be present not only in perceptive processing but also in cognitive processing. We present an analysis of the most important theories of emotion, show their neural organization, and propose the neural computation approach as an appropriate framework for generating knowledge about the neural basis of emotional experience. Finally, we present a scheme corresponding to a framework for designing a computational neural multi-system for emotion (CONEMSE).
Stakeholder management for conservation projects: a case study of Ream National Park, Cambodia.
De Lopez, T T
2001-07-01
The paper gives an account of the development and implementation of a stakeholder management framework at Ream National Park, Cambodia. Firstly, the concept of stakeholder is reviewed in management and in conservation literatures. Secondly, the context in which the stakeholder framework was implemented is described. Thirdly, a five-step methodological framework is suggested: (1) stakeholder analysis, (2) stakeholder mapping, (3) development of generic strategies and workplan, (4) presentation of the workplan to stakeholders, and (5) implementation of the workplan. This framework classifies stakeholders according to their level of influence on the project and their potential for the conservation of natural resources. In a situation characterized by conflicting claims on natural resources, park authorities were able to successfully develop specific strategies for the management of stakeholders. The conclusion discusses the implications of the Ream experience and the generalization of the framework to other protected areas.
Critical asset and portfolio risk analysis: an all-hazards framework.
Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark
2007-08-01
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
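The notional product the CAPRA abstract refers to can be written out directly. The sketch below is a minimal illustration of that traditional formula, summed over hazards for one asset; all probabilities and consequence values are hypothetical, and CAPRA itself refines the meaning of each parameter beyond this simple product.

```python
# Minimal sketch of the notional all-hazards risk product that CAPRA
# refines: risk = threat likelihood x vulnerability x consequence,
# summed over hazard scenarios for an asset. Numbers are invented.

def asset_risk(scenarios):
    """scenarios: list of (threat_prob, vulnerability, consequence)."""
    return sum(t * v * c for t, v, c in scenarios)

asset = [
    (0.02, 0.6, 5_000_000),   # human-caused hazard scenario
    (0.10, 0.3, 2_000_000),   # natural hazard scenario
]
print(asset_risk(asset))      # expected annual loss, in currency units
```

Portfolio-level analysis then aggregates such per-asset figures and adds interdependency terms, which is where the article's first-order consequence model comes in.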
Calibration and analysis of genome-based models for microbial ecology.
Louca, Stilianos; Doebeli, Michael
2015-10-16
Microbial ecosystem modeling is complicated by the large number of unknown parameters and the lack of appropriate calibration tools. Here we present a novel computational framework for modeling microbial ecosystems, which combines genome-based model construction with statistical analysis and calibration to experimental data. Using this framework, we examined the dynamics of a community of Escherichia coli strains that emerged in laboratory evolution experiments, during which an ancestral strain diversified into two coexisting ecotypes. We constructed a microbial community model comprising the ancestral and the evolved strains, which we calibrated using separate monoculture experiments. Simulations reproduced the successional dynamics in the evolution experiments, and pathway activation patterns observed in microarray transcript profiles. Our approach yielded detailed insights into the metabolic processes that drove bacterial diversification, involving acetate cross-feeding and competition for organic carbon and oxygen. Our framework provides a missing link towards a data-driven mechanistic microbial ecology.
DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.
Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien
2017-09-01
Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
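The key-value-pair paradigm the abstract mentions can be illustrated in miniature: map each post to entity-pair keys, then reduce by key to weight candidate storyline links. This is only a hypothetical single-process sketch of the idea (invented posts, no ConceptSearch, no distribution), not DISCRN's actual pipeline.

```python
# Hypothetical sketch of key-value-pair storyline linking:
# "map" each post to (entity-pair, 1) records, then "reduce" by key.
# Posts and entities below are invented.

from collections import Counter
from itertools import combinations

posts = [
    ["acme corp", "j. doe", "city hall"],
    ["j. doe", "city hall"],
    ["acme corp", "city hall"],
]

# map: emit one key per co-mentioned entity pair (sorted for a canonical key)
pairs = [tuple(sorted(p)) for post in posts for p in combinations(post, 2)]

# reduce: sum counts per key; heavier edges suggest storyline links
links = Counter(pairs)
print(links.most_common(3))
```

In a distributed setting the map and reduce steps run in parallel over shards of the post stream, which is what makes the entity-pair computation scale.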
A formal framework of scenario creation and analysis of extreme hydrological events
NASA Astrophysics Data System (ADS)
Lohmann, D.
2007-12-01
We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability. These are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation while also having correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
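The two risk measures named in this abstract can be estimated directly from a simulated event-loss table. The sketch below uses a tiny invented table (four events over five simulated years) rather than the 100,000-year scenario set, and takes the occurrence exceedance probability as the probability that the largest single event in a year exceeds a threshold.

```python
# Sketch of two catastrophe-risk measures from a toy event-loss table:
# average annual loss (AAL) and occurrence exceedance probability (OEP).
# The (simulation year, loss) events below are invented.

from collections import defaultdict

events = [(1, 2.0), (1, 5.0), (2, 1.0), (4, 9.0)]
n_years = 5   # simulated years 1..5; years 3 and 5 had no events

# AAL: total simulated loss averaged over all simulated years
aal = sum(loss for _, loss in events) / n_years

# Largest single-event loss per year (0.0 for event-free years)
year_max = defaultdict(float)
for year, loss in events:
    year_max[year] = max(year_max[year], loss)

def oep(x):
    """P(largest event loss in a year > x), over the simulated years."""
    return sum(1 for y in range(1, n_years + 1) if year_max[y] > x) / n_years

print(aal, oep(4.0))
```

With 100,000 simulated years, the same computation yields smooth exceedance curves from which return-period losses and insurance pricing are read off.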
NASA Technical Reports Server (NTRS)
Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl
2017-01-01
Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further exacerbated when attempting to integrate new fiber-reinforced composite materials, due to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry because material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework which uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.
A theoretical framework for negotiating the path of emergency management multi-agency coordination.
Curnin, Steven; Owen, Christine; Paton, Douglas; Brooks, Benjamin
2015-03-01
Multi-agency coordination represents a significant challenge in emergency management. The need for liaison officers working in strategic level emergency operations centres to play organizational boundary spanning roles within multi-agency coordination arrangements that are enacted in complex and dynamic emergency response scenarios creates significant research and practical challenges. The aim of the paper is to address a gap in the literature regarding the concept of multi-agency coordination from a human-environment interaction perspective. We present a theoretical framework for facilitating multi-agency coordination in emergency management that is grounded in human factors and ergonomics using the methodology of core-task analysis. As a result we believe the framework will enable liaison officers to cope more efficiently within the work domain. In addition, we provide suggestions for extending the theory of core-task analysis to an alternate high reliability environment. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
PetIGA: A framework for high-performance isogeometric analysis
Dalcin, Lisandro; Collier, Nathaniel; Vignal, Philippe; ...
2016-05-25
We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. Lastly, we show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large-scale simulations.
An ovine in vivo framework for tracheobronchial stent analysis.
McGrath, Donnacha J; Thiebes, Anja Lena; Cornelissen, Christian G; O'Shea, Mary B; O'Brien, Barry; Jockenhoevel, Stefan; Bruzzi, Mark; McHugh, Peter E
2017-10-01
Tracheobronchial stents are most commonly used to restore patency to airways stenosed by tumour growth. Currently all tracheobronchial stents are associated with complications such as stent migration, granulation tissue formation, mucous plugging and stent strut fracture. The present work develops a computational framework to evaluate tracheobronchial stent designs in vivo. Pressurised computed tomography is used to create a biomechanical lung model which takes into account the in vivo stress state, global lung deformation and local loading from pressure variation. Stent interaction with the airway is then evaluated for a number of loading conditions including normal breathing, coughing and ventilation. Results of the analysis indicate that three of the major complications associated with tracheobronchial stents can potentially be analysed with this framework, which can be readily applied to the human case. Airway deformation caused by lung motion is shown to have a significant effect on stent mechanical performance, including implications for stent migration, granulation formation and stent fracture.
Deep Borehole Disposal Safety Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeze, Geoffrey A.; Stein, Emily; Price, Laura L.
This report presents a preliminary safety analysis for the deep borehole disposal (DBD) concept, using a safety case framework. A safety case is an integrated collection of qualitative and quantitative arguments, evidence, and analyses that substantiate the safety, and the level of confidence in the safety, of a geologic repository. This safety case framework for DBD follows the outline of the elements of a safety case, and identifies the types of information that will be required to satisfy these elements. At this very preliminary phase of development, the DBD safety case focuses on the generic feasibility of the DBD concept. It is based on potential system designs, waste forms, engineering, and geologic conditions; however, no specific site or regulatory framework exists. It will progress to a site-specific safety case as the DBD concept advances into a site-specific phase, progressing through consent-based site selection and site investigation and characterization.
Decision support models for solid waste management: Review and game-theoretic approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr; Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence; Aravossis, Konstantinos
Highlights: ► The mainly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are mainly developed within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.
Marks, Katherine R; Clark, Claire D
2018-05-12
In an article published in International Journal of the Addictions in 1989, Nick Piazza and his coauthors described "telescoping," an accelerated progression through "landmark symptoms" of alcoholism, among a sample of recovering women. The aim of this critical analysis is to apply a feminist philosophy of science to examine the origins of the framework of telescoping research and its implications for contemporary scientific inquiry. A feminist philosophy of science framework is outlined and applied to key source publications of telescoping literature drawn from international and United States-based peer-reviewed journals published beginning in 1952. A feminist philosophy of science framework identifies gender bias in telescoping research in three ways. First, gender bias was present in the early conventions that laid the groundwork for telescoping research. Second, a "masculine" framework was present in the methodology guiding telescoping research. Third, gender bias was present in the interpretation of results as evidenced by biased comparative language. Telescoping research contributed to early evidence of critical sex and gender differences helping to usher in women's substance abuse research more broadly. However, it also utilized a "masculine" framework that perpetuated gender bias and limited generative, novel research that can arise from women-focused research and practice. A feminist philosophy of science identifies gender bias in telescoping research and provides an alternative, more productive approach for substance abuse researchers and clinicians.
NASA Technical Reports Server (NTRS)
Killough, Brian D., Jr.
2008-01-01
The CEOS Systems Engineering Office will present a 2007 status report on the CEOS constellation process, a new systems engineering framework, and analysis results from the GEO Societal Benefit Area (SBA) assessment and the OST constellation requirements assessment.
Does Private Tutoring Work? The Effectiveness of Private Tutoring: A Nonparametric Bounds Analysis
ERIC Educational Resources Information Center
Hof, Stefanie
2014-01-01
Private tutoring has become popular throughout the world. However, evidence for the effect of private tutoring on students' academic outcome is inconclusive; therefore, this paper presents an alternative framework: a nonparametric bounds method. The present examination uses, for the first time, a large representative data-set in a European setting…
Accounting for Taste: Learning by Doing in the College Classroom
ERIC Educational Resources Information Center
Bradshaw, Kathlyn E.; Harvey, Robert W.
2017-01-01
This article presents Edelson and Reiser's (2006) strategies as a framework for analyzing an instance of authentic practice in a managerial accounting course. Specifically, this article presents an analysis of a managerial accounting project design created to provide learning-by-doing via authentic practice. Students need more than to learn about…
NASA Astrophysics Data System (ADS)
Paul, Avijit Kumar
2018-04-01
One new open-framework two-dimensional layer, [Cd(NH3CH2COO)(SO4)], I, has been synthesized using an amino acid as the templating agent. Single-crystal structural analysis shows that the compound crystallizes in a monoclinic cell with the non-centrosymmetric space group P21, a = 4.9513(1) Å, b = 7.9763(2) Å, c = 8.0967(2) Å, β = 105.917(1)° and V = 307.504(12) Å3. The compound has connectivity between the Cd centers and the sulfate units, forming a two-dimensional layer structure. The sulfate unit is coordinated to the metal center in an η3, μ4 mode and possesses one coordination-free oxygen atom. The zwitterionic form of the glycine molecule is present in the structure, bridging two metal centers through its carboxylate oxygens in a μ2 mode. Topological analysis reveals that the two-dimensional network forms a novel 4- and 6-connected binodal net of (32,42,52)(34,44,54,63) topology. Although one end of the glycine molecule is free from coordination, the structure is highly stable up to 350 °C. Strong N-H⋯O hydrogen-bonding interactions play an important role in the stabilization and formation of the three-dimensional supramolecular structure. The cyanosilylation of imines using the present compound as a heterogeneous catalyst indicates good catalytic behavior. The present study illustrates the usefulness of amino acids for structure building in the less-studied sulfate-based framework materials, as well as for the design of new heterogeneous catalysts for broad application. The compound has also been characterized through elemental analysis, PXRD, IR, SEM and TG-DT studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slunge, Daniel, E-mail: daniel.slunge@economics.gu.se; Tran, Trang Thi Huyen, E-mail: trang2k@yahoo.com
Building on new institutional theory, this paper develops an analytical framework for analyzing constraints to the institutionalization of strategic environmental assessment (SEA) at four different institutional levels. The framework is tested in an empirical analysis of the environmental assessment system in Vietnam, which is a frontrunner among developing countries regarding the introduction and use of SEA. Building on interviews with Vietnamese and international experts, as well as an extensive literature review, we identify institutional constraints which challenge the effective use of SEA in Vietnam. We conclude that commonly identified constraints, such as inadequate training, technical guidelines, baseline data and financial resources, are strongly linked to constraints at higher institutional levels, such as incentives to not share information between ministries and severe restrictions on access to information and public participation. Without a thorough understanding of these institutional constraints, there is a risk that attempts to improve the use of SEA are misdirected. Thus, a careful institutional analysis should guide efforts to introduce and improve the use of SEA in Vietnam and other developing countries. The analytical framework for analyzing constraints to institutionalization of SEA presented in this paper represents a systematic effort in this direction. - Highlights: • A framework for analyzing constraints to institutionalizing SEA is developed • Empirical analysis of the strategic environmental assessment system in Vietnam • Constraints in the action arena linked to deeper institutional constraints • Institutional analysis needed prior to introducing SEA in developing countries.
A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis
Rahman, M. M.; Antani, S. K.; Thoma, G. R.
2011-01-01
We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
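The local analysis step described above ranks expansion concepts by their co-occurrence with the query concept in feedback images. The following toy sketch illustrates that idea with invented "bag of concepts" data; it omits the metrical neighborhood constraints and the global similarity thesaurus that the full framework adds.

```python
# Toy sketch of local-analysis query expansion: score candidate
# concepts by co-occurrence with the query concept in top-ranked
# (feedback) images, each encoded as a bag of concepts. Data invented.

from collections import Counter
from itertools import combinations

feedback_images = [
    {"sky", "water", "sand"},
    {"sky", "water", "tree"},
    {"sky", "tree", "grass"},
]

# Count concept pair co-occurrences across the feedback set
cooc = Counter()
for img in feedback_images:
    for a, b in combinations(sorted(img), 2):
        cooc[(a, b)] += 1

def expand(query_concept, k=2):
    """Return the k concepts that co-occur most with the query concept."""
    scores = Counter()
    for (a, b), n in cooc.items():
        if a == query_concept:
            scores[b] += n
        elif b == query_concept:
            scores[a] += n
    return [c for c, _ in scores.most_common(k)]

print(expand("sky"))
```

The expanded concept weights are then merged back into the query vector before re-ranking, which is what lifts precision and recall in the reported experiments.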
2014-01-01
Background: Recent innovations in sequencing technologies have provided researchers with the ability to rapidly characterize the microbial content of an environmental or clinical sample with unprecedented resolution. These approaches are producing a wealth of information that is providing novel insights into the microbial ecology of the environment and human health. However, these sequencing-based approaches produce large and complex datasets that require efficient and sensitive computational analysis workflows. Many recent tools for analyzing metagenomic sequencing data have emerged; however, these approaches often suffer from issues of specificity and efficiency, and typically do not include a complete metagenomic analysis framework. Results: We present PathoScope 2.0, a complete bioinformatics framework for rapidly and accurately quantifying the proportions of reads from individual microbial strains present in metagenomic sequencing data from environmental or clinical samples. The pipeline performs all necessary computational analysis steps, including reference genome library extraction and indexing, read quality control and alignment, strain identification, and summarization and annotation of results. We rigorously evaluated PathoScope 2.0 using simulated data and data from the 2011 outbreak of Shiga-toxigenic Escherichia coli O104:H4. Conclusions: The results show that PathoScope 2.0 is a complete, highly sensitive, and efficient approach for metagenomic analysis that outperforms alternative approaches in scope, speed, and accuracy. The PathoScope 2.0 pipeline software is freely available for download at: http://sourceforge.net/projects/pathoscope/. PMID:25225611
Overdenture retaining bar stress distribution: a finite-element analysis.
Caetano, Conrado Reinoldes; Mesquita, Marcelo Ferraz; Consani, Rafael Leonardo Xediek; Correr-Sobrinho, Lourenço; Dos Santos, Mateus Bertolini Fernandes
2015-05-01
To evaluate the stress distribution in the peri-implant bone tissue and prosthetic components of bar-clip retaining systems for overdentures with different implant inclinations, vertical misfits and framework materials. Three-dimensional models of a jaw and an overdenture retained by two implants and a bar-clip attachment were modeled using specific software (SolidWorks 2010). The studied variables were: latero-lateral inclination of one implant (-10°, -5°, 0°, +5°, +10°); vertical misfit on the other implant (50, 100, 200 µm); and framework material (Au type IV, Ag-Pd, Ti cp, Co-Cr). The solid models were imported into mechanical simulation software (ANSYS Workbench 11). All nodes on the bone's external surface were constrained, and a displacement was applied to simulate the settling of the framework on the ill-fitted component. Von Mises stress was evaluated for the prosthetic components and maximum principal stress for the bone tissue. The +10° inclination presented the worst biomechanical behavior, promoting the highest stress values in the bar framework and peri-implant bone tissue. The -5° group presented the lowest stress values in the prosthetic components, and the lowest stress value in the peri-implant bone tissue was observed at -10°. Increased vertical misfit increased the stress values in all evaluated structures. Stiffer framework materials caused a considerable stress increase in the framework itself, the prosthetic screw of the fitted component and the peri-implant bone tissue. The inclination of one implant associated with vertical misfit had a relevant effect on the stress distribution in bar-clip retained overdentures. Different framework materials promoted increased levels of stress in all the evaluated structures.
Vigi4Med Scraper: A Framework for Web Forum Structured Data Extraction and Semantic Representation
Audeh, Bissan; Beigbeder, Michel; Zimmermann, Antoine; Jaillon, Philippe; Bousquet, Cédric
2017-01-01
The extraction of information from social media is an essential yet complicated step for data analysis in multiple domains. In this paper, we present Vigi4Med Scraper, a generic open-source framework for extracting structured data from web forums. Our framework is highly configurable; using a configuration file, the user can freely choose the data to extract from any web forum. The extracted data are anonymized and represented in a semantic structure using Resource Description Framework (RDF) graphs. This representation enables efficient manipulation by data analysis algorithms and allows the collected data to be directly linked to any existing semantic resource. To avoid server overload, an integrated proxy with caching functionality imposes a minimal delay between sequential requests. Vigi4Med Scraper represents the first step of Vigi4Med, a project to detect adverse drug reactions (ADRs) from social networks, funded by the French drug safety agency Agence Nationale de Sécurité du Médicament (ANSM). Vigi4Med Scraper has successfully extracted more than 200 gigabytes of data from the web forums of over 20 different websites. PMID:28122056
Family Environment and Childhood Obesity: A New Framework with Structural Equation Modeling
Huang, Hui; Wan Mohamed Radzi, Che Wan Jasimah bt; Salarzadeh Jenatabadi, Hashem
2017-01-01
The main purpose of the current article is to introduce a framework of the complexity of childhood obesity based on the family environment. A conceptual model that quantifies the relationships and interactions among parental socioeconomic status, family food security level, child’s food intake and certain aspects of parental feeding behaviour is presented using the structural equation modeling (SEM) concept. Structural models are analysed in terms of the direct and indirect connections among latent and measurement variables that lead to the child weight indicator. To illustrate the accuracy, fit, reliability and validity of the introduced framework, real data collected from 630 families from Urumqi (Xinjiang, China) were considered. The framework includes two categories of data comprising the normal body mass index (BMI) range and obesity data. The comparison analysis between two models provides some evidence that in obesity modeling, obesity data must be extracted from the dataset and analysis must be done separately from the normal BMI range. This study may be helpful for researchers interested in childhood obesity modeling based on family environment. PMID:28208833
Constrained maximum consistency multi-path mitigation
NASA Astrophysics Data System (ADS)
Smith, George B.
2003-10-01
Blind deconvolution algorithms can be useful as pre-processors for signal classification algorithms in shallow water. These algorithms remove the distortion of the signal caused by multipath propagation when no knowledge of the environment is available. A framework has been presented in which filters produce signal estimates from each data channel that are as consistent with each other as possible in a least-squares sense [Smith, J. Acoust. Soc. Am. 107 (2000)]. This framework provides a solution to the blind deconvolution problem. One implementation of this framework yields the cross-relation on which EVAM [Gurelli and Nikias, IEEE Trans. Signal Process. 43 (1995)] and Rietsch [Rietsch, Geophysics 62(6) (1997)] processing are based. In this presentation, partially blind implementations that have good noise stability properties are compared using Classification Operating Characteristics (CLOC) analysis. [Work supported by ONR under Program Element 62747N and NRL, Stennis Space Center, MS.]
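The cross-relation mentioned above can be sketched numerically: for two channels x1 = s*h1 and x2 = s*h2, convolution commutes, so x1*h2 - x2*h1 = 0, and the stacked filter vector [h2; h1] is a null vector of a convolution-matrix system. The toy signals and filter length below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
L, N = 4, 200                       # filter length and source length (toy values)
s = rng.normal(size=N)              # unknown source
h1 = rng.normal(size=L)             # unknown channel impulse responses
h2 = rng.normal(size=L)
x1 = np.convolve(s, h1)             # observed channel outputs
x2 = np.convolve(s, h2)

def conv_matrix(x, L):
    # Toeplitz matrix so that conv_matrix(x, L) @ h == np.convolve(x, h).
    C = np.zeros((len(x) + L - 1, L))
    for j in range(L):
        C[j:j + len(x), j] = x
    return C

# Cross-relation: conv(x1, h2) - conv(x2, h1) = 0, so [h2; h1] spans the null space.
A = np.hstack([conv_matrix(x1, L), -conv_matrix(x2, L)])
_, _, Vt = np.linalg.svd(A)
v = Vt[-1]                          # right singular vector of smallest singular value
h2_est, h1_est = v[:L], v[L:]

# Filters are recovered only up to a common scale; compare by normalized correlation.
corr = abs(h1_est @ h1) / (np.linalg.norm(h1_est) * np.linalg.norm(h1))
print(round(float(corr), 3))
```

In this noiseless toy the smallest singular vector recovers the channels almost exactly; with noise, the same least-squares structure degrades gracefully, which is the stability question the CLOC comparison addresses.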
A conceptual framework for the domain of evidence-based design.
Ulrich, Roger S; Berry, Leonard L; Quan, Xiaobo; Parish, Janet Turner
2010-01-01
The physical facilities in which healthcare services are performed play an important role in the healing process. Evidence-based design in healthcare is a developing field of study that holds great promise for benefiting key stakeholders: patients, families, physicians, and nurses, as well as other healthcare staff and organizations. In this paper, the authors present and discuss a conceptual framework intended to capture the current domain of evidence-based design in healthcare. In this framework, the built environment is represented by nine design variable categories: audio environment, visual environment, safety enhancement, wayfinding system, sustainability, patient room, family support spaces, staff support spaces, and physician support spaces. Furthermore, a series of matrices is presented that indicates knowledge gaps concerning the relationship between specific healthcare facility design variable categories and participant and organizational outcomes. From this analysis, the authors identify fertile research opportunities from the perspectives of key stakeholders.
An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Zhou, Ning
With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability is being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
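The parallel contingency screening step can be sketched in miniature: each single-line outage is an independent task, so outages can be checked concurrently. The five-bus system and the connectivity-only severity test below are invented stand-ins for a real power-flow-based screen:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical 5-bus grid; a contingency is the outage of one line.
nodes = {"A", "B", "C", "D", "E"}
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("B", "D"), ("D", "E")]

def connected(edge_list):
    # Simple severity proxy: does the grid stay connected after the outage?
    adj = {n: set() for n in nodes}
    for u, v in edge_list:
        adj[u].add(v)
        adj[v].add(u)
    seen, stack = set(), ["A"]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adj[n] - seen)
    return seen == nodes

def contingency(k):
    # Screen outage of line k on the reduced network.
    return edges[k], connected(edges[:k] + edges[k + 1:])

# Each contingency is independent, so the screen parallelizes trivially.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(contingency, range(len(edges))))
critical = [line for line, ok in results.items() if not ok]
print(critical)
```

The radial tap D-E is the only critical line here. A production screen would replace the connectivity check with an AC or DC power-flow solve per contingency, which is exactly the compute-heavy inner loop that HPC parallelism targets.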
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a framework for fully automated safety verification of Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
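The verification idea, compile a chart to a transition relation and then decide a safety property, can be shown at toy scale with exhaustive reachability (the authors' toolbox uses a logic-based engine instead; the controller states and events below are hypothetical):

```python
from collections import deque

# A chart compiled to a flat transition relation: (state, event) -> next state.
transitions = {
    ("idle", "start"): "heating",
    ("heating", "target_reached"): "holding",
    ("heating", "overheat"): "shutdown",
    ("holding", "stop"): "idle",
    ("shutdown", "reset"): "idle",
}

def reachable(init):
    # Breadth-first exploration of every state reachable from init.
    seen, frontier = {init}, deque([init])
    while frontier:
        state = frontier.popleft()
        for (src, _event), dst in transitions.items():
            if src == state and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

states = reachable("idle")
# Safety property: the hazardous state "melting" is never reachable.
print(sorted(states), "melting" not in states)
```

Explicit enumeration like this only works for small flat machines; hierarchical charts and data variables blow up the state space, which is why the paper hands the compiled model to a symbolic verification engine instead.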
A Novel Framework for the Comparative Analysis of Biological Networks
Pache, Roland A.; Aloy, Patrick
2012-01-01
Genome sequencing projects provide nearly complete lists of the individual components present in an organism, but reveal little about how they work together. Follow-up initiatives have deciphered thousands of dynamic and context-dependent interrelationships between gene products that need to be analyzed with novel bioinformatics approaches able to capture their complex emerging properties. Here, we present a novel framework for the alignment and comparative analysis of biological networks of arbitrary topology. Our strategy includes the prediction of likely conserved interactions, based on evolutionary distances, to counter the high number of missing interactions in the current interactome networks, and a fast assessment of the statistical significance of individual alignment solutions, which vastly increases its performance with respect to existing tools. Finally, we illustrate the biological significance of the results through the identification of novel complex components and potential cases of cross-talk between pathways and alternative signaling routes. PMID:22363585
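The two ingredients of the comparison, scoring an alignment by conserved interactions and assessing its statistical significance, can be sketched with a permutation test. The two six-node networks and the conserved-edge score below are invented for illustration and are much simpler than the authors' method:

```python
import random

# Two toy interaction networks (undirected edges) and a candidate alignment.
net_a = {("a", "b"), ("b", "c"), ("c", "d"), ("d", "e"), ("e", "f"), ("f", "a")}
net_b = {("1", "2"), ("2", "3"), ("3", "4"), ("4", "5"), ("5", "6"), ("6", "1")}
mapping = dict(zip("abcdef", "123456"))

def norm(u, v):                      # canonical form of an undirected edge
    return tuple(sorted((u, v)))

edges_b = {norm(*e) for e in net_b}

def score(m):
    # Number of net_a interactions conserved in net_b under mapping m.
    return sum(norm(m[u], m[v]) in edges_b for u, v in net_a)

real = score(mapping)
random.seed(0)
targets, null = list("123456"), []
for _ in range(1000):                # null model: random node mappings
    random.shuffle(targets)
    null.append(score(dict(zip("abcdef", targets))))
p_value = sum(s >= real for s in null) / len(null)
print(real, p_value)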
Dynamic decision making for dam-break emergency management - Part 1: Theoretical framework
NASA Astrophysics Data System (ADS)
Peng, M.; Zhang, L. M.
2013-02-01
An evacuation decision for dam breaks is a very serious issue. A late decision may lead to loss of lives and properties, but a very early evacuation will incur unnecessary expenses. This paper presents a risk-based framework of dynamic decision making for dam-break emergency management (DYDEM). The dam-break emergency management in both time scale and space scale is introduced first to define the dynamic decision problem. The probability of dam failure is taken as a stochastic process and estimated using a time-series analysis method. The flood consequences are taken as functions of warning time and evaluated with a human risk analysis model (HURAM) based on Bayesian networks. A decision criterion is suggested to decide whether to evacuate the population at risk (PAR) or to delay the decision. The optimum time for evacuating the PAR is obtained by minimizing the expected total loss, which integrates the time-related probabilities and flood consequences. When a delayed decision is chosen, the decision making can be updated with available new information. A specific dam-break case study is presented in a companion paper to illustrate the application of this framework to complex dam-breaching problems.
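The core trade-off, evacuating early lengthens warning time but also lengthens costly displacement if the dam holds, can be illustrated by minimizing a toy expected-total-loss curve. All numbers and functional forms below are invented; they do not reproduce HURAM or the paper's time-series model:

```python
import numpy as np

p_fail = 0.3                      # assumed probability the dam breaches at hour 24
t = np.linspace(0, 24, 481)       # candidate evacuation decision times (hours)
warning = 24 - t                  # warning time gained by deciding at t

loss_life = 1000 * np.exp(-0.2 * warning)   # expected casualties given failure
loss_evac = 2.0 * warning                   # displacement cost if no failure

# Expected total loss, weighting each consequence by its probability.
expected_total = p_fail * loss_life + (1 - p_fail) * loss_evac
best_t = float(t[np.argmin(expected_total)])
print(round(best_t, 2))
```

With these toy curves the optimum falls near hour 5: early enough to keep casualties low, late enough to avoid most of the displacement cost. The paper's framework additionally updates the failure probability as new information arrives, shifting this optimum dynamically.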
Evolution of a multilevel framework for health program evaluation.
Masso, Malcolm; Quinsey, Karen; Fildes, Dave
2017-07-01
A well-conceived evaluation framework increases understanding of a program's goals and objectives, facilitates the identification of outcomes and can be used as a planning tool during program development. Herein we describe the origins and development of an evaluation framework that recognises that implementation is influenced by the setting in which it takes place, the individuals involved and the processes by which implementation is accomplished. The framework includes an evaluation hierarchy that focuses on outcomes for consumers, providers and the care delivery system, and is structured according to six domains: program delivery, impact, sustainability, capacity building, generalisability and dissemination. These components of the evaluation framework fit into a matrix structure, and cells within the matrix are supported by relevant evaluation tools. The development of the framework has been influenced by feedback from various stakeholders, existing knowledge of the evaluators and the literature on health promotion and implementation science. Over the years, the framework has matured and is generic enough to be useful in a wide variety of circumstances, yet specific enough to focus data collection, data analysis and the presentation of findings.
Mendes, Stella de N C; Edwards Rezende, Carlos E; Moretti Neto, Rafael T; Capello Sousa, Edson A; Henrique Rubo, José
2013-04-01
Passive fit has been considered an important requirement for the longevity of implant-supported prostheses. Among the different steps of prostheses construction, casting is a feature that can influence the precision of fit and consequently the uniformity of possible deformation among abutments upon the framework connection. This study aimed at evaluating the deformation of abutments after the connection of frameworks either cast in one piece or after soldering. A master model was used to simulate a human mandible with 5 implants. Ten frameworks were fabricated on cast models and divided into 2 groups. Strain gauges were attached to the mesial and distal sides of the abutments to capture their deformation after the framework's screw retentions were tightened to the abutments. The mean values of deformation were submitted to a 3-way analysis of variance that revealed significant differences between procedures and the abutment side. The results showed that none of the frameworks presented a complete passive fit. The soldering procedure led to a better although uneven distribution of compression strains on the abutments.
A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter
NASA Astrophysics Data System (ADS)
Asniar; Aditya, B. R.
2017-01-01
Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. Sentiment analysis can be used to see opinion on an issue and identify a response to something. Millions of digital records remain unused, even though they could provide useful information, especially for government. Sentiment analysis in government is used to monitor the work programs of the government, such as those of the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so that the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis. It presents a framework for sentiment analysis of Indonesian-language tweets on Twitter for the work programs of the Government of Bandung City. The results of this paper can be a reference for decision making in local government.
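A minimal version of the supervised pipeline can be sketched with scikit-learn: vectorize tweets, train a linear SVM, and classify new text. The tiny English training set below is an invented stand-in for the Indonesian data, and the authors' preprocessing and tuning are not reproduced:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy labeled tweets about city work programs (hypothetical examples).
tweets = [
    "great program, the city park is clean now",
    "thank you for fixing the roads quickly",
    "love the new public transport service",
    "the flood response was slow and disappointing",
    "terrible traffic, the program failed",
    "garbage is still not collected, very bad",
]
labels = ["positive", "positive", "positive",
          "negative", "negative", "negative"]

# TF-IDF features feeding a linear Support Vector Machine classifier.
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(tweets, labels)
print(model.predict(["the park program is great"])[0])
```

In practice the training set would be thousands of manually labeled Indonesian tweets, with stemming and stop-word handling for Indonesian added to the vectorization step.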
Van Neste, Christophe; Vandewoestyne, Mado; Van Criekinge, Wim; Deforce, Dieter; Van Nieuwerburgh, Filip
2014-03-01
Forensic scientists are currently investigating how to transition from capillary electrophoresis (CE) to massive parallel sequencing (MPS) for analysis of forensic DNA profiles. MPS offers several advantages over CE such as virtually unlimited multiplexing of loci, combining both short tandem repeat (STR) and single nucleotide polymorphism (SNP) loci, small amplicons without constraints of size separation, more discrimination power, deep mixture resolution and sample multiplexing. We present our bioinformatic framework My-Forensic-Loci-queries (MyFLq) for analysis of MPS forensic data. For allele calling, the framework uses a MySQL reference allele database with automatically determined regions of interest (ROIs), found by a generic maximal flanking algorithm, which makes it possible to use any STR or SNP forensic locus. Python scripts were designed to automatically make allele calls starting from raw MPS data. We also present a method to assess the usefulness and overall performance of a forensic locus with respect to MPS, as well as methods to estimate whether an unknown allele, whose sequence is not present in the MySQL database, is in fact a new allele or a sequencing error. The MyFLq framework was applied to an Illumina MiSeq dataset of a forensic Illumina amplicon library, generated from multilocus STR polymerase chain reaction (PCR) on both single-contributor samples and multiple-person DNA mixtures. Although the multilocus PCR was not yet optimized for MPS in terms of amplicon length or locus selection, most loci gave excellent results. The results show a high signal-to-noise ratio, correct allele calls, and a low limit of detection for minor DNA contributors in mixed DNA samples. Technically, forensic MPS affords great promise for routine implementation in forensic genomics. The method is also applicable to adjacent disciplines such as molecular autopsy in legal medicine and in mitochondrial DNA research. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
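The "maximal flanking" idea behind automatic ROI detection can be sketched simply: the flanks shared by every reference allele of a locus are invariant, so trimming them away leaves the variable region as the ROI. The two reference alleles below are a hypothetical SNP-style locus, not MyFLq's actual code or data:

```python
def common_prefix(seqs):
    # Longest prefix shared by all sequences.
    prefix = []
    for chars in zip(*seqs):
        if len(set(chars)) != 1:
            break
        prefix.append(chars[0])
    return "".join(prefix)

# Hypothetical reference alleles for one locus (flank + variable core + flank).
alleles = ["ACGTCATGGTCA", "ACGTCGCGGTCA"]
left = common_prefix(alleles)                       # shared left flank
right = common_prefix([a[::-1] for a in alleles])[::-1]  # shared right flank

def call_roi(read):
    # Trim the invariant flanks from a read spanning the locus; the rest is the ROI.
    return read[len(left):len(read) - len(right)]

print(left, right, [call_roi(a) for a in alleles])
```

Here the flanks ACGTC and GGTCA are trimmed, leaving ROIs AT and GC that distinguish the two alleles. Real STR loci need extra care because repeat units can partially match the flanks, which is why the full algorithm is more involved than this sketch.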
Walker, Daniel M; Hefner, Jennifer L; Sova, Lindsey N; Hilligoss, Brian; Song, Paula H; McAlearney, Ann Scheck
Accountable care organizations (ACOs) are emerging across the healthcare marketplace and now include Medicare, Medicaid, and private sector payers covering more than 24 million lives. However, little is known about the process of organizational change required to achieve cost savings and quality improvements from the ACO model. This study applies the complex innovation implementation framework to understand the challenges and facilitators associated with the ACO implementation process. We conducted four case studies of private sector ACOs, selected to achieve variation in terms of geography and organizational maturity. Across sites, we used semistructured interviews with 68 key informants to elicit information regarding ACO implementation. Our analysis found challenges and facilitators across all domains in the conceptual framework. Notably, our findings deviated from the framework in two ways. First, findings from the financial resource availability domain revealed both financial and nonfinancial (i.e., labor) resources that contributed to implementation effectiveness. Second, a new domain, patient engagement, emerged as an important factor in implementation effectiveness. We present these deviations in an adapted framework. As the ACO model proliferates, these findings can support implementation efforts, and they highlight the importance of focusing on patients throughout the process. Importantly, this study extends the complex innovation implementation framework to incorporate consumers into the implementation framework, making it more patient centered and aiding future efforts.
Linking stressors and ecological responses
Gentile, J.H.; Solomon, K.R.; Butcher, J.B.; Harrass, M.; Landis, W.G.; Power, M.; Rattner, B.A.; Warren-Hicks, W.J.; Wenger, R.; Foran, Jeffery A.; Ferenc, Susan A.
1999-01-01
To characterize risk, it is necessary to quantify the linkages and interactions between chemical, physical and biological stressors and endpoints in the conceptual framework for ecological risk assessment (ERA). This can present challenges in a multiple stressor analysis, and it will not always be possible to develop a quantitative stressor-response profile. This review commences with a conceptual representation of the problem of developing a linkage analysis for multiple stressors and responses. The remainder of the review surveys a variety of mathematical and statistical methods (e.g., ranking methods, matrix models, multivariate dose-response for mixtures, indices, visualization, simulation modeling and decision-oriented methods) for accomplishing the linkage analysis for multiple stressors. Describing the relationships between multiple stressors and ecological effects is a critical component of 'effects assessment' in the ecological risk assessment framework.
A mechanics framework for a progressive failure methodology for laminated composites
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Allen, David H.; Lo, David C.
1989-01-01
A laminate strength and life prediction methodology has been postulated for laminated composites which accounts for the progressive development of microstructural damage to structural failure. A damage dependent constitutive model predicts the stress redistribution in an average sense that accompanies damage development in laminates. Each mode of microstructural damage is represented by a second-order tensor valued internal state variable which is a strain like quantity. The mechanics framework together with the global-local strategy for predicting laminate strength and life is presented in the paper. The kinematic effects of damage are represented by effective engineering moduli in the global analysis and the results of the global analysis provide the boundary conditions for the local ply level stress analysis. Damage evolution laws are based on experimental results.
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located on the border of the United States and Canada.
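The Bayesian quantification of hydrologic-parameter uncertainty can be shown in miniature with a grid-based posterior for a single parameter. The runoff model, prior and data below are invented for illustration and are far simpler than the paper's St. Croix system model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy rainfall-runoff model: flow = c * rain + noise, with unknown coefficient c.
true_c = 0.6
rain = rng.uniform(10, 50, size=30)
flow = true_c * rain + rng.normal(scale=2.0, size=30)

# Grid Bayes: flat prior on c in (0, 1), Gaussian likelihood with sigma = 2.
grid = np.linspace(0.01, 0.99, 99)
log_like = np.array(
    [-0.5 * np.sum((flow - c * rain) ** 2) / 2.0**2 for c in grid]
)
post = np.exp(log_like - log_like.max())   # flat prior: posterior ∝ likelihood
post /= post.sum()

mean_c = float(np.sum(grid * post))
print(round(mean_c, 2))
```

The posterior concentrates tightly around the true coefficient; in the full framework each model component contributes such a posterior, and the posteriors are propagated jointly through the integrated water system model.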
KirungaTashobya, Christine; Ssengooba, Freddie; Nabyonga-Orem, Juliet; Bataringaya, Juliet; Macq, Jean; Marchal, Bruno; Musila, Timothy; Criel, Bart
2018-05-10
In 2003 the Uganda Ministry of Health (MoH) introduced the District League Table (DLT) to track district performance. This review of the DLT is intended to add to the evidence base on Health Systems Performance Assessment (HSPA) globally, with emphasis on Low and Middle Income Countries (LMICs), and provide recommendations for adjustments to the current Ugandan reality. A normative HSPA framework was used to inform the development of a Key Informant Interview (KII) tool. Thirty Key Informants were interviewed, purposively selected from the Ugandan health system on the basis of having developed or used the DLT. KII data and information from published and grey literature on the Uganda health system were analyzed using deductive analysis. Stakeholder involvement in the development of the DLT was limited, including MoH officials and development partners, and a few district technical managers. Uganda policy documents articulate a conceptually broad health system whereas the DLT focuses on a healthcare system. The complexity and dynamism of the Uganda health system was insufficiently acknowledged by the HSPA framework. Though DLT objectives and indicators were articulated, there was no conceptual reference model and lack of clarity on the constitutive dimensions. The DLT mechanisms for change were not explicit. The DLT compared markedly different districts and did not identify factors behind observed performance. Uganda lacks a designated institutional unit for the analysis and presentation of HSPA data, and there are challenges in data quality and range. The critique of the DLT using a normative model supported the development of recommendations for Uganda district HSPA and provides lessons for other LMICs. A similar approach can be used by researchers and policy makers elsewhere for the review and development of other frameworks.
Adjustments in Uganda district HSPA should consider: wider stakeholder involvement with more district managers including political, administrative and technical; better anchoring within the national health system framework; integration of the notion of complexity in the design of the framework; and emphasis on facilitating district decision-making and learning. There is need to improve data quality and range and additional approaches for data analysis and presentation.
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools.
This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
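The Map and Reduce tasks described above can be illustrated with the canonical word-count example, simulated in a single Python process (a real deployment would distribute the map, shuffle and reduce phases across a Hadoop cluster over HDFS):

```python
from collections import defaultdict
from itertools import chain

def map_task(line):
    # Map: emit an intermediate (key, 1) pair for every word.
    return [(word, 1) for word in line.lower().split()]

def reduce_task(key, values):
    # Reduce: aggregate all values emitted for one key.
    return key, sum(values)

lines = ["big data in the clinic", "big data tools", "the clinic data"]

# Shuffle phase: group intermediate pairs by key before reduction.
groups = defaultdict(list)
for key, value in chain.from_iterable(map_task(line) for line in lines):
    groups[key].append(value)

counts = dict(reduce_task(k, v) for k, v in groups.items())
print(counts["data"], counts["big"])
```

Because each map call sees one record and each reduce call sees one key, both phases parallelize and restart cleanly, which is the source of the fault tolerance and throughput the abstract describes.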
Nanosurveyor: a framework for real-time data processing
Daurer, Benedikt J.; Krishnan, Hari; Perciano, Talita; ...
2017-01-31
Background: The ever-improving brightness of accelerator-based sources is enabling novel observations and discoveries with faster frame rates, larger fields of view, higher resolution, and higher dimensionality. Results: Here we present an integrated software/algorithmic framework designed to capitalize on high-throughput experiments through efficient kernels and load-balanced workflows that are scalable in design. We describe the streamlined processing pipeline of ptychography data analysis. Conclusions: The pipeline provides throughput, compression, and resolution as well as rapid feedback to the microscope operators.
Multi-threaded Event Processing with DANA
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Lawrence; Elliott Wolin
2007-05-14
The C++ data analysis framework DANA has been written to support the next generation of Nuclear Physics experiments at Jefferson Lab commensurate with the anticipated 12GeV upgrade. The DANA framework was designed to allow multi-threaded event processing with a minimal impact on developers of reconstruction software. This document describes how DANA implements multi-threaded event processing and compares it to simply running multiple instances of a program. Also presented are relative reconstruction rates for Pentium4, Xeon, and Opteron based machines.
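DANA itself is a C++ framework and its internals are not reproduced here, but the one-process/many-worker-threads idea (as opposed to launching multiple independent program instances) can be sketched in Python with a thread pool over an event list; the event structure and "reconstruction" below are invented:

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct(event):
    # Stand-in for event reconstruction: reduce an event's hits to one number.
    return event["id"], sum(h * h for h in event["hits"])

# A batch of toy events to process.
events = [{"id": i, "hits": [i, i + 1, i + 2]} for i in range(100)]

# One process, four worker threads sharing the event stream.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(reconstruct, events))

print(results[0], results[99])
```

One caveat of the analogy: CPython's global interpreter lock limits CPU-bound speedup from threads, whereas DANA's native C++ threads scale with cores; the shared-memory advantage over running multiple program instances (one copy of geometry and calibration data instead of N) carries over either way.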
NASA Astrophysics Data System (ADS)
Ahmadibasir, Mohammad
In this study an interpretive learning framework that aims to measure learning on the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights on the problem of classroom-level learning. The framework is developed by construction of connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanation power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible for students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison was improved and by the end of the year more accurate causal inferences were observable in classroom communication. 
At the end of the study, the researcher illustrates that the application of the proposed framework resulted in an improved version of the framework. The improved version of the proposed framework is more connected to the topic of science learning, and is able to measure the change of discourse in higher resolution.
Planning and Evaluation of New Academic Library Services by Means of Web-Based Conjoint Analysis
ERIC Educational Resources Information Center
Decker, Reinhold; Hermelbracht, Antonia
2006-01-01
New product development is an omnipresent challenge to modern libraries in the information age. Therefore, we present the design and selected results of a comprehensive research project aiming at the systematic and user-oriented planning of academic library services by means of conjoint analysis. The applicability of the analytical framework used…
Ideologies of English in a Chinese High School EFL Textbook: A Critical Discourse Analysis
ERIC Educational Resources Information Center
Xiong, Tao; Qian, Yamin
2012-01-01
In this article we examine ideologies of English in present-day China with a special focus on textbook discourse. The research framework is informed by critical theories on language and education. Critical discourse analysis is applied as a methodological approach characterized by a socially committed attitude in the explanation and interpretation…
Towards a Framework of Socio-Linguistic Analysis of Science Textbooks: The Greek Case
ERIC Educational Resources Information Center
Dimopoulos, Kostas; Koulaidis, Vasilis; Sklaveniti, Spyridoula
2005-01-01
This study aims at presenting a grid for analysing the way the language employed in Greek school science textbooks tends to project pedagogic messages. These messages are analysed for the different school science subjects (i.e., Physics, Chemistry, Biology) and educational levels (i.e., primary and lower secondary level). The analysis is made…
ERIC Educational Resources Information Center
Gabriel, Rachael
2017-01-01
Drawing upon discursive psychology as a theoretical and methodological framework, the author analyzes a set of five postobservation debrief conversations between novice teachers and their mentors. The author presents analysis and findings by highlighting how the interpretative repertoires of the rubric and protocol documents may be used to shape…
ERIC Educational Resources Information Center
Dante, Angelo; Fabris, Stefano; Palese, Alvisa
2013-01-01
Empirical studies and conceptual frameworks presented in the extant literature offer a static imagining of academic failure. Time-to-event analysis, which captures the dynamism of individual factors as they determine failure and thus allows timely strategies to be properly tailored, requires longitudinal studies, which are still lacking within the field. The…
Analysis of the Image of Scientists Portrayed in the Lebanese National Science Textbooks
ERIC Educational Resources Information Center
Yacoubian, Hagop A.; Al-Khatib, Layan; Mardirossian, Taline
2017-01-01
This article presents an analysis of how scientists are portrayed in the Lebanese national science textbooks. The purpose of this study was twofold. First, to develop a comprehensive analytical framework that can serve as a tool to analyze the image of scientists portrayed in educational resources. Second, to analyze the image of scientists…
Work-Centered Approach to Insurgency Campaign Analysis
2007-06-01
a constructivist or sensemaking philosophy by defining data, information, situation awareness, and situation understanding in the following manner... present paper explores a new approach to understanding transnational insurgency movements – an approach based on a fundamental analysis of the knowledge ...country or region. By focusing at the fundamental level of knowledge creation, the resulting framework allows an understanding of insurgency
A Content Analysis of Dissertations in the Field of Educational Technology: The Case of Turkey
ERIC Educational Resources Information Center
Durak, Gurhan; Cankaya, Serkan; Yunkul, Eyup; Misirli, Zeynel Abidin
2018-01-01
The present study aimed at conducting content analysis on dissertations carried out so far in the field of Educational Technology in Turkey. A total of 137 dissertations were examined to determine the key words, academic discipline, research areas, theoretical frameworks, research designs and models, statistical analyses, data collection tools,…
Beckwith, Sue; Dickinson, Angela; Kendall, Sally
2008-12-01
This paper draws on the work of Paley and Duncan et al in order to extend and engender debate regarding the use of Concept Analysis frameworks. Despite the apparent plethora of Concept Analysis frameworks used in nursing studies, we found that over half of those used were derived from the work of one author. This paper explores the suitability and use of these frameworks and is set at a time when the number of published concept analysis papers is increasing. For the purpose of this study, thirteen commonly used frameworks, identified from the nursing journals 1993 to 2005, were explored to reveal their origins, ontological and philosophical stance, and any common elements. The frameworks were critiqued and links made between their antecedents. It was noted whether the articles contained discussion of any possible tensions between the ontological perspective of the framework used, the process of analysis, praxis and possible nursing theory developments. It was found that the thirteen identified frameworks are mainly based on hermeneutic propositions regarding understandings and are interpretive procedures founded on self-reflective modes of discovery. Six frameworks rely on or include the use of casuistry. Seven of the frameworks identified are predicated on, or adapt, the work of Wilson, a school master writing for his pupils. Wilson's framework has a simplistic eleven-step, binary and reductionist structure. Other frameworks identified include Morse et al's framework, which this article suggests employs a contestable theory of concept maturity. Based on the findings revealed through our exploration of the use of concept analysis frameworks in the nursing literature, concerns were raised regarding unjustified adaptations and alterations and the uncritical use of the frameworks. There is little evidence that these frameworks provide the necessary depth, rigour or replicability to enable the developments in nursing theory which they are intended to underpin.
Reduced-Order Aerothermoelastic Analysis of Hypersonic Vehicle Structures
NASA Astrophysics Data System (ADS)
Falkiewicz, Nathan J.
Design and simulation of hypersonic vehicles require consideration of a variety of disciplines due to the highly coupled nature of the flight regime. In order to capture all of the potential effects on vehicle dynamics, one must consider the aerodynamics, aerodynamic heating, heat transfer, and structural dynamics as well as the interactions between these disciplines. The problem is further complicated by the large computational expense involved in capturing all of these effects and their interactions in a full-order sense. While high-fidelity modeling techniques exist for each of these disciplines, the use of such techniques is computationally infeasible in a vehicle design and control system simulation setting for such a highly coupled problem. Early in the design stage, many iterations of analyses may need to be carried out as the vehicle design matures, thus requiring quick analysis turnaround time. Additionally, the number of states used in the analyses must be small enough to allow for efficient control simulation and design. As a result, alternatives to full-order models must be considered. This dissertation presents a fully coupled, reduced-order aerothermoelastic framework for the modeling and analysis of hypersonic vehicle structures. The reduced-order transient thermal solution is a modal solution based on the proper orthogonal decomposition. The reduced-order structural dynamic model is based on projection of the equations of motion onto a Ritz modal subspace that is identified a priori. The reduced-order models are assembled into a time-domain aerothermoelastic simulation framework which uses a partitioned time-marching scheme to account for the disparate time scales of the associated physics. The aerothermoelastic modeling framework is outlined, and the formulations associated with the unsteady aerodynamic, aerodynamic heating, transient thermal, and structural dynamic analyses are described.
Results demonstrate the accuracy of the reduced-order transient thermal and structural dynamic models under variation in boundary conditions and flight conditions. The framework is applied to representative hypersonic vehicle control surface structures and a variety of studies are conducted to assess the impact of aerothermoelastic effects on hypersonic vehicle dynamics. The results presented in this dissertation demonstrate the ability of the proposed framework to perform efficient aerothermoelastic analysis.
An integrated framework for the geographic surveillance of chronic disease
2009-01-01
Background Geographic public health surveillance is concerned with describing and disseminating geographic information about disease and other measures of health to policy makers and the public. While methodological developments in the geographical analysis of disease are numerous, few have been integrated into a framework that also considers the effects of case ascertainment bias on the effectiveness of chronic disease surveillance. Results We present a framework for the geographic surveillance of chronic disease that integrates methodological developments in spatial statistical analysis and case ascertainment. The framework uses a hierarchical approach to organize and model health information derived from an administrative health data system, and importantly, supports the detection and analysis of case ascertainment bias in geographic data. We test the framework on asthma data from Alberta, Canada. We observe high prevalence in south-western Alberta, particularly among Aboriginal females. We also observe that persons likely mistaken for asthmatics tend to be distributed in a pattern similar to asthmatics, suggesting that there may be an underlying social vulnerability to a variety of respiratory illnesses, or the presence of a diagnostic practice style effect. Finally, we note that clustering of asthmatics tends to occur at small geographic scales, while clustering of persons mistaken for asthmatics tends to occur at larger geographic scales. Conclusion Routine and ongoing geographic surveillance of chronic diseases is critical to developing an understanding of the underlying epidemiology and to informing policy makers and the public about the health of the population. PMID:19948046
NASA Astrophysics Data System (ADS)
LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.
2016-12-01
Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
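The unsupervised, clustering-based detection described above can be illustrated with a minimal sketch. This is not the authors' algorithm: it assumes a naive k-means with deterministic initialization and a user-chosen quantile cutoff `q`, and flags as anomalous any point whose distance to its nearest centroid exceeds that cutoff.

```python
from math import dist

def flag_anomalies(points, k=2, iters=25, q=0.95):
    """Cluster points with k-means, then flag points far from any centroid.

    Unsupervised: no labels or expected normal pattern are needed. A point
    whose distance to its nearest centroid exceeds the q-quantile of all
    such distances is reported as an outlier candidate.
    """
    centroids = list(points[:k])  # naive deterministic init; k-means++ is better
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist(p, centroids[i]))].append(p)
        # recompute each centroid as the mean of its group (keep old if empty)
        centroids = [
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    scores = [min(dist(p, c) for c in centroids) for p in points]
    cut = sorted(scores)[int(q * (len(scores) - 1))]
    return [p for p, s in zip(points, scores) if s > cut]
```

Grouping individual outliers into spatio-temporally coherent events and ranking them by "interestingness", as the framework does, would layer on top of a scoring step like this one.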
Hamm, Julian; Money, Arthur G; Atwal, Anita; Paraskevopoulos, Ioannis
2016-02-01
In recent years, an ever-increasing range of technology-based applications has been developed with the goal of assisting in the delivery of more effective and efficient fall prevention interventions. Whilst there have been a number of studies that have surveyed technologies for a particular sub-domain of fall prevention, there is no existing research which surveys the full spectrum of falls prevention interventions and characterises the range of technologies that have augmented this landscape. This study presents a conceptual framework and survey of the state of the art of technology-based fall prevention systems which is derived from a systematic template analysis of studies presented in contemporary research literature. The framework proposes four broad categories of fall prevention intervention system: Pre-fall prevention; Post-fall prevention; Fall injury prevention; Cross-fall prevention. Other categories include: Application type, Technology deployment platform, Information sources, Deployment environment, User interface type, and Collaborative function. After presenting the conceptual framework, a detailed survey of the state of the art is presented as a function of the proposed framework. A number of research challenges emerge as a result of surveying the research literature, which include a need for: new systems that focus on overcoming extrinsic falls risk factors; systems that support the environmental risk assessment process; systems that enable patients and practitioners to develop more collaborative relationships and engage in shared decision making during falls risk assessment and prevention activities. In response to these challenges, recommendations and future research directions are proposed to overcome each respective challenge. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Modeling Criminal Activity in Urban Landscapes
NASA Astrophysics Data System (ADS)
Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona
Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.
The Service Environment for Enhanced Knowledge and Research (SEEKR) Framework
NASA Astrophysics Data System (ADS)
King, T. A.; Walker, R. J.; Weigel, R. S.; Narock, T. W.; McGuire, R. E.; Candey, R. M.
2011-12-01
The Service Environment for Enhanced Knowledge and Research (SEEKR) Framework is a configurable service oriented framework to enable the discovery, access and analysis of data shared in a community. The SEEKR framework integrates many existing independent services through the use of web technologies and standard metadata. Services are hosted on systems by using an application server and are callable by using REpresentational State Transfer (REST) protocols. Messages and metadata are transferred with eXtensible Markup Language (XML) encoding which conforms to a published XML schema. Space Physics Archive Search and Extract (SPASE) metadata is central to utilizing the services. Resources (data, documents, software, etc.) are described with SPASE and the associated Resource Identifier is used to access and exchange resources. The configurable options for each service can be set by using a web interface. Services are packaged as web application resource (WAR) files for direct deployment on application servers such as Tomcat or Jetty. We discuss the composition of the SEEKR framework, how new services can be integrated and the steps necessary to deploy the framework. The SEEKR Framework emerged from NASA's Virtual Magnetospheric Observatory (VMO) and other systems and we present an overview of these systems from a SEEKR Framework perspective.
INDOOR AIR ASSESSMENT - A REVIEW OF INDOOR AIR QUALITY RISK CHARACTERIZATION
Risk assessment methodologies provide a mechanism for incorporating scientific evidence and judgments into the risk management decision process. A risk characterization framework has been developed to provide a systematic approach for analysis and presentation of risk characterizati...
An Extensible Model and Analysis Framework
2010-11-01
Eclipse or Netbeans Rich Client Platform (RCP). We call this the Triquetrum Project. Configuration files support narrower variability than Triquetrum/RCP... Triquetrum/RCP supports assembling in arbitrary ways. (12/08 presentation) 2. Prototyped OSGi component architecture for use with Netbeans and
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Weixuan; Lian, Jianming; Engel, Dave
2017-07-27
This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.
Silicon ribbon technology assessment 1978-1986 - A computer-assisted analysis using PECAN
NASA Technical Reports Server (NTRS)
Kran, A.
1978-01-01
The paper presents a 1978-1986 economic outlook for silicon ribbon technology based on the capillary action shaping technique. The outlook is presented within the framework of two sets of scenarios, which develop strategy for approaching the 1986 national energy capacity cost objective of $0.50/WE peak. The PECAN (Photovoltaic Energy Conversion Analysis) simulation technique is used to develop a 1986 sheet material price ($50/sq m) which apparently can be attained without further scientific breakthrough.
Analyzing Crime and Crime Control: A Resource Guide. Economics-Political Science Series.
ERIC Educational Resources Information Center
Butterfield, Ruth I.; And Others
This document, the fourth in a series of resource guides emphasizing economic-political analysis of contemporary public policies and issues, focuses on crime control. Designed as a three-week unit for secondary school students, the guide is presented in three sections. The introduction presents an economic and a political science framework for…
The presentation builds on the work presented last year at the 14th CMAS meeting and it is applied to the work performed in the context of the AQMEII-HTAP collaboration. The analysis is conducted within the framework of the third phase of AQMEII (Air Quality Model Evaluation Inte...
Preliminary eddy current modelling for the large angle magnetic suspension test fixture
NASA Technical Reports Server (NTRS)
Britcher, Colin
1994-01-01
This report presents some recent developments in the mathematical modeling of the Large Angle Magnetic Suspension Test Fixture (LAMSTF) at NASA Langley Research Center. It is shown that eddy current effects are significant, but may be amenable to analysis, modeling and measurement. A theoretical framework is presented, together with a comparison of computed and experimental data.
Analyzing the Relationships between Culture and Mentoring
ERIC Educational Resources Information Center
Kochan, Frances
2013-01-01
The purpose of this manuscript is to present a theoretical model of culture from which a conceptual framework has been built that can be used to conduct a cultural analysis. The paper presents definitions and components of culture and its role in our expanding global society. It then relates these to research findings related to the relationship…
Ethnicity identification from face images
NASA Astrophysics Data System (ADS)
Lu, Xiaoguang; Jain, Anil K.
2004-08-01
Human facial images provide demographic information, such as ethnicity and gender. Conversely, ethnicity and gender also play an important role in face-related applications. The image-based ethnicity identification problem is addressed in a machine learning framework. A Linear Discriminant Analysis (LDA) based scheme is presented for the two-class (Asian vs. non-Asian) ethnicity classification task. Multiscale analysis is applied to the input facial images. An ensemble framework, which integrates the LDA analysis of the input face images at different scales, is proposed to further improve the classification performance. The product rule is used as the combination strategy in the ensemble. Experimental results based on a face database containing 263 subjects (2,630 face images, with equal balance between the two classes) are promising, indicating that LDA and the proposed ensemble framework have sufficient discriminative power for the ethnicity classification problem. The normalized ethnicity classification scores can be helpful in facial identity recognition. Useful as a "soft" biometric, face matching scores can be updated based on the output of the ethnicity classification module. In other words, the ethnicity classifier does not have to be perfect to be useful in practice.
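The product rule used as the ensemble's combination strategy can be sketched generically. This is an illustration, not the paper's code: it assumes each per-scale classifier outputs a posterior probability per class, multiplies them across scales, and renormalises; the class labels below are hypothetical.

```python
def product_rule(posteriors_per_scale):
    """Fuse per-scale classifier posteriors with the product rule.

    posteriors_per_scale: list of dicts, one per image scale, mapping
    class label -> P(class | image at that scale).
    Returns the fused posterior, renormalised over the classes.
    """
    classes = posteriors_per_scale[0].keys()
    fused = {c: 1.0 for c in classes}
    for posterior in posteriors_per_scale:
        for c in classes:
            fused[c] *= posterior[c]  # independence assumption across scales
    total = sum(fused.values())
    return {c: v / total for c, v in fused.items()}
```

For example, two scales voting 0.6 and 0.7 for the same class fuse to 0.42/(0.42 + 0.12) ≈ 0.78, sharper than either individual vote.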
BEATBOX v1.0: Background Error Analysis Testbed with Box Models
NASA Astrophysics Data System (ADS)
Knote, Christoph; Barré, Jérôme; Eckl, Max
2018-02-01
The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.
An Open Data Platform in the framework of the EGI-LifeWatch Competence Center
NASA Astrophysics Data System (ADS)
Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Yaiza Rodríguez Marrero, Ana
2016-04-01
The working pilot of an Open Data Platform supporting the full data cycle in research is presented. It aims to preserve knowledge explicitly, starting with the description of the Case Studies, and integrating data and software management and preservation on an equal basis. The uninterrupted support in the chain starts at the data acquisition level and extends to reuse and publication in an open framework, providing integrity and provenance controls. The LifeWatch Open Science Framework is a pilot web portal, developed in collaboration with different commercial companies, that aims to enrich and integrate different data lifecycle-related tools in order to address the management of the different steps: data planning, gathering, storing, curation, preservation, sharing, discovering, etc. To achieve this goal, the platform includes the following features:
- Data Management Planning: a tool to set up a structure for the data, including what data will be generated and how it will be exploited, re-used, curated, preserved, etc. It has a semantic approach: it includes references to ontologies in order to express what data will be gathered.
- Close to instrumentation: the portal includes a distributed storage system that can be used both for storing data from instruments and output data from analysis. All of that data can be shared.
- Analysis: resources from the EGI Federated Cloud are accessible within the portal, so that users can exploit computing resources to perform analyses and other processes, including workflows.
- Preservation: data can be preserved in different systems, and DOIs can be minted not only for datasets but also for software, DMPs, etc.
The presentation will show the different components of the framework as well as how it can be extrapolated to other communities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Chris, E-mail: cyuan@uwm.edu; Wang, Endong; Zhai, Qiang
Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion of the fundamental issues of temporal homogeneity in conventional LCA and to propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first, based on the key elements of a scientific mechanism for temporal discounting. Then generic procedures for performing temporal discounting in LCA are derived and proposed based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail based on the technical methods and procedures needed to perform temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. Highlights: • A critical review of the temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework
McGraw, Caroline; Drennan, Vari M
2015-02-01
To evaluate the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers. The objective was to identify the extent to which these frameworks take account of the setting where the ulcer originated as being the person's home rather than a hospital setting. Pressure ulcers involving full-thickness skin loss are increasingly being regarded as indicators of nursing patient safety failure, requiring investigation using root cause analysis frameworks. Evidence suggests that root cause analysis frameworks developed in hospital settings ignore the unique dimensions of risk in home healthcare settings. A systematic literature review and documentary analysis of frameworks used to investigate community-acquired grade three and four pressure ulcers by home nursing services in England. No published papers were identified for inclusion in the review. Fifteen patient safety investigative frameworks were collected and analysed. Twelve of the retrieved frameworks were intended for the investigation of community-acquired pressure ulcers; seven of which took account of the setting where the ulcer originated as being the patient's home. This study provides evidence to suggest that many of the root cause analysis frameworks used to investigate community-acquired pressure ulcers in England are unsuitable for this purpose. This study provides researchers and practitioners with evidence of the need to develop appropriate home nursing root cause analysis frameworks to investigate community-acquired pressure ulcers. © 2014 John Wiley & Sons Ltd.
NGSANE: a lightweight production informatics framework for high-throughput data analysis.
Buske, Fabian A; French, Hugh J; Smith, Martin A; Clark, Susan J; Bauer, Denis C
2014-05-15
The initial steps in the analysis of next-generation sequencing data can be automated by way of software 'pipelines'. However, individual components depreciate rapidly because of the evolving technology and analysis methods, often rendering entire versions of production informatics pipelines obsolete. Constructing pipelines from Linux bash commands enables the use of hot-swappable modular components, as opposed to the more rigid program-call wrapping by higher-level languages, as implemented in comparable published pipelining systems. Here we present Next Generation Sequencing ANalysis for Enterprises (NGSANE), a Linux-based, high-performance-computing-enabled framework that minimizes overhead for set up and processing of new projects, yet maintains full flexibility of custom scripting when processing raw sequence data. NGSANE is implemented in bash and publicly available under the BSD (3-Clause) licence via GitHub at https://github.com/BauerLab/ngsane. Denis.Bauer@csiro.au Supplementary data are available at Bioinformatics online.
A Framework for Daylighting Optimization in Whole Buildings with OpenStudio
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, and provide dynamic daylight metrics as a basis for decision. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reportage.
Coggins, L.G.; Pine, William E.; Walters, C.J.; Martell, S.J.D.
2006-01-01
We present a new model to estimate capture probabilities, survival, abundance, and recruitment using traditional Jolly-Seber capture-recapture methods within a standard fisheries virtual population analysis framework. This approach compares the numbers of marked and unmarked fish at age captured in each year of sampling with predictions based on estimated vulnerabilities and abundance in a likelihood function. Recruitment to the earliest age at which fish can be tagged is estimated by using a virtual population analysis method to back-calculate the expected numbers of unmarked fish at risk of capture. By using information from both marked and unmarked animals in a standard fisheries age structure framework, this approach is well suited to the sparse data situations common in long-term capture-recapture programs with variable sampling effort. ?? Copyright by the American Fisheries Society 2006.
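The back-calculation of unmarked numbers at age borrows the core of virtual population analysis, which can be sketched with Pope's cohort approximation. This is a textbook illustration under stated assumptions (constant natural mortality M, a known terminal abundance, catch taken at mid-year), not the authors' likelihood-based estimator.

```python
import math

def vpa_cohort(catch_at_age, M, N_terminal):
    """Back-calculate cohort abundance with Pope's approximation:
    N_a = N_(a+1) * exp(M) + C_a * exp(M / 2).

    catch_at_age: catches C_a for ages 0..A (oldest age last)
    M: instantaneous natural mortality rate, assumed constant
    N_terminal: assumed abundance at the oldest age A
    Returns numbers-at-age N_0..N_A.
    """
    N = [N_terminal]
    for C in reversed(catch_at_age[:-1]):
        # survivors at the next age, restored for natural deaths,
        # plus the catch removed at mid-year
        N.append(N[-1] * math.exp(M) + C * math.exp(M / 2))
    return list(reversed(N))
```

In the paper's setting, the expected numbers of unmarked fish produced by such a back-calculation feed the likelihood alongside the marked-fish capture-recapture data.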
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. Sampling is representative at the national and state levels. Simple and composite indicators (an index of satisfaction and a rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlation between indicators, and association with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process for complying with regulations and identifying strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for improvement.
Analysis of airframe/engine interactions in integrated flight and propulsion control
NASA Technical Reports Server (NTRS)
Schierman, John D.; Schmidt, David K.
1991-01-01
An analysis framework for the assessment of dynamic cross-coupling between airframe and engine systems from the perspective of integrated flight/propulsion control is presented. This analysis involves determining the significance of the interactions with respect to deterioration in stability robustness and performance, as well as critical frequency ranges where problems may occur due to these interactions. The analysis illustrated here investigates both the airframe's effects on the engine control loops and the engine's effects on the airframe control loops in two case studies. The second case study involves a multi-input/multi-output analysis of the airframe. Sensitivity studies are performed on critical interactions to examine the degradations in the system's stability robustness and performance. Magnitudes of the interactions required to cause instabilities, as well as the frequencies at which the instabilities occur, are recorded. Finally, the analysis framework is expanded to include control laws which contain cross-feeds between the airframe and engine systems.
A novel bi-level meta-analysis approach: applied to biological pathway analysis.
Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin
2016-02-01
The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
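The classical P-value combination methods the authors compare against, Fisher's and Stouffer's, can be sketched in a few lines of standard-library Python. This is an illustrative sketch of those baselines only, not the paper's bi-level additive method:

```python
from math import exp, factorial, log, sqrt
from statistics import NormalDist

def stouffer(pvalues):
    """Stouffer's Z-method for combining independent one-sided p-values."""
    nd = NormalDist()
    z = sum(nd.inv_cdf(1 - p) for p in pvalues) / sqrt(len(pvalues))
    return 1 - nd.cdf(z)

def fisher(pvalues):
    """Fisher's method: -2*sum(log p) follows a chi-square with 2k dof."""
    k = len(pvalues)
    stat = -2 * sum(log(p) for p in pvalues)
    # closed-form chi-square survival function for even dof 2k
    return exp(-stat / 2) * sum((stat / 2) ** i / factorial(i) for i in range(k))

# A single outlying small p-value pulls Fisher's combined value toward
# significance more strongly than Stouffer's, illustrating the outlier
# sensitivity the abstract mentions.
print(fisher([0.001, 0.6, 0.6]), stouffer([0.001, 0.6, 0.6]))
```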
Tracking and Motion Analysis of Crack Propagations in Crystals for Molecular Dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsap, L V; Duchaineau, M; Goldgof, D B
2001-05-14
This paper presents a quantitative analysis for a discovery in molecular dynamics. Recent simulations have shown that velocities of crack propagations in crystals under certain conditions can become supersonic, which is contrary to classical physics. In this research, we present a framework for tracking and motion analysis of crack propagations in crystals. It includes line segment extraction based on Canny edge maps, feature selection based on physical properties, and subsequent tracking of primary and secondary wavefronts. This tracking is completely automated; it runs in real time on three 834-image sequences using forty 250 MHz processors. Results supporting physical observations are presented in terms of both feature tracking and velocity analysis.
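The edge-extraction stage needs an imaging library, but the downstream velocity-analysis step is plain least squares over tracked wavefront positions. A minimal sketch with hypothetical tracker outputs (the function names and inputs are illustrative, not from the paper):

```python
def wavefront_velocity(positions, dt):
    """Least-squares slope of tracked wavefront position versus time.

    positions: wavefront location in each frame (hypothetical tracker output),
    dt: time step between consecutive frames.
    """
    n = len(positions)
    times = [i * dt for i in range(n)]
    t_mean = sum(times) / n
    x_mean = sum(positions) / n
    num = sum((t - t_mean) * (x - x_mean) for t, x in zip(times, positions))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

def is_supersonic(velocity, sound_speed):
    """Flag the regime the simulations found contrary to classical physics."""
    return abs(velocity) > sound_speed
```

Fitting a single slope smooths frame-to-frame tracking noise, which is why a regression is preferable to differencing adjacent frames.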
A Framework for the Design of Effective Graphics for Scientific Visualization
NASA Technical Reports Server (NTRS)
Miceli, Kristina D.
1992-01-01
This proposal presents a visualization framework, based on a data model, that supports the production of effective graphics for scientific visualization. Visual representations are effective only if they augment comprehension of the increasing amounts of data being generated by modern computer simulations. These representations are created by taking into account the goals and capabilities of the scientist, the type of data to be displayed, and software and hardware considerations. This framework is embodied in an assistant-based visualization system to guide the scientist in the visualization process. This will improve the quality of the visualizations and decrease the time the scientist is required to spend in generating the visualizations. I intend to prove that such a framework will create a more productive environment for the analysis and interpretation of large, complex data sets.
Clustered-dot halftoning with direct binary search.
Goyal, Puneet; Gupta, Madhur; Staelin, Carl; Fischer, Mani; Shacham, Omri; Allebach, Jan P
2013-02-01
In this paper, we present a new algorithm for aperiodic clustered-dot halftoning based on direct binary search (DBS). The DBS optimization framework has been modified for designing clustered-dot texture, by using filters with different sizes in the initialization and update steps of the algorithm. Following an intuitive explanation of how the clustered-dot texture results from this modified framework, we derive a closed-form cost metric which, when minimized, equivalently generates stochastic clustered-dot texture. An analysis of the cost metric and its influence on the texture quality is presented, which is followed by a modification to the cost metric to reduce computational cost and to make it more suitable for screen design.
Gutman, Boris; Leonardo, Cassandra; Jahanshad, Neda; Hibar, Derrek; Eschenburg, Kristian; Nir, Talia; Villalon, Julio; Thompson, Paul
2014-01-01
We present a framework for registering cortical surfaces based on tractography-informed structural connectivity. We define connectivity as a continuous kernel on the product space of the cortex, and develop a method for estimating this kernel from tractography fiber models. Next, we formulate the kernel registration problem, and present a means to non-linearly register two brains’ continuous connectivity profiles. We apply theoretical results from operator theory to develop an algorithm for decomposing the connectome into its shared and individual components. Lastly, we extend two discrete connectivity measures to the continuous case, and apply our framework to 98 Alzheimer’s patients and controls. Our measures show significant differences between the two groups. PMID:25320795
Microgravity isolation system design: A modern control synthesis framework
NASA Technical Reports Server (NTRS)
Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.
1994-01-01
Manned orbiters will require active vibration isolation for acceleration-sensitive microgravity science experiments. Since umbilicals are highly desirable or even indispensable for many experiments, and since their presence greatly affects the complexity of the isolation problem, they should be considered in control synthesis. In this paper a general framework is presented for applying extended H2 synthesis methods to the three-dimensional microgravity isolation problem. The methodology integrates control and state frequency weighting and input and output disturbance accommodation techniques into the basic H2 synthesis approach. The various system models needed for design and analysis are also presented. The paper concludes with a discussion of a general design philosophy for the microgravity vibration isolation problem.
Dreiling, Katharina; Montano, Diego; Poinstingl, Herbert; Müller, Tjark; Schiekirka-Schwake, Sarah; Anders, Sven; von Steinbüchel, Nicole; Raupach, Tobias
2017-08-01
Evaluation is an integral part of curriculum development in medical education. Given the peculiarities of bedside teaching, specific evaluation tools for this instructional format are needed. Development of these tools should be informed by appropriate frameworks. The purpose of this study was to develop a specific evaluation tool for bedside teaching based on the Stanford Faculty Development Program's clinical teaching framework. Based on a literature review yielding 47 evaluation items, an 18-item questionnaire was compiled and subsequently completed by undergraduate medical students at two German universities. Reliability and validity were assessed in an exploratory full information item factor analysis (study one) and a confirmatory factor analysis as well as a measurement invariance analysis (study two). The exploratory analysis involving 824 students revealed a three-factor structure. Reliability estimates of the subscales were satisfactory (α = 0.71-0.84). The model yielded satisfactory fit indices in the confirmatory factor analysis involving 1043 students. The new questionnaire is short and yet based on a widely-used framework for clinical teaching. The analyses presented here indicate good reliability and validity of the instrument. Future research needs to investigate whether feedback generated from this tool helps to improve teaching quality and student learning outcome.
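The subscale reliability estimates (α = 0.71-0.84) refer to Cronbach's alpha, which is straightforward to compute. A minimal standard-library sketch, not the authors' analysis code; the item scores below are made up:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for one subscale.

    items: one list of scores per questionnaire item, with respondents
    in the same order in every list.
    """
    k = len(items)
    item_variance_sum = sum(pvariance(scores) for scores in items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - item_variance_sum / pvariance(totals))
```

Alpha approaches 1.0 as items covary strongly relative to their individual variances, which is why it is read as internal-consistency reliability.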
A Business Case Framework for Planning Clinical Nurse Specialist-Led Interventions.
Bartlett Ellis, Rebecca J; Embree, Jennifer L; Ellis, Kurt G
2015-01-01
The purpose of this article is to describe a business case framework that can guide clinical nurse specialists (CNS) in clinical intervention development. Increased emphasis on cost-effective interventions in healthcare requires skills in analyzing the need to make the business case, especially for resource-intensive interventions. This framework assists the CNS to anticipate resource use and then consider if the intervention makes good business sense. We describe a business case framework that can assist the CNS to fully explore the problem and determine if developing an intervention is a good investment. We describe several analyses that facilitate making the business case to include the following: problem identification and alignment with strategic priorities, needs assessment, stakeholder analysis, market analysis, intervention implementation planning, financial analysis, and outcome evaluation. The findings from these analyses can be used to develop a formal proposal to present to hospital leaders in a position to make decisions. By aligning intervention planning with organizational priorities and engaging patients in the process, interventions will be more likely to be implemented in practice and produce robust outcomes. The business case framework can be used to justify to organization decision makers the need to invest resources in new interventions that will make a difference for quality outcomes as well as the financial bottom line. This framework can be used to plan interventions that align with organizational strategic priorities, plan for associated costs and benefits, and outcome evaluation. Clinical nurse specialists are well positioned to lead clinical intervention projects that will improve the quality of patient care and be cost-effective. To do so requires skill development in making the business case.
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
NASA Astrophysics Data System (ADS)
Liu, Shuai; Chen, Ge; Yao, Shifeng; Tian, Fenglin; Liu, Wei
2017-07-01
This paper presents a novel integrated marine visualization framework which focuses on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanography research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. In addition, a global multi-resolution virtual terrain environment is needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the processed data, several GPU-based visualization methods are explored to interactively demonstrate marine data. GPU-tessellated global terrain rendering using ETOPO1 data is realized, and video memory usage is controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on the framework we designed, an integrated visualization system is realized, and its effectiveness and efficiency are demonstrated. This system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.
Giordano, Bruno L.; Kayser, Christoph; Rousselet, Guillaume A.; Gross, Joachim; Schyns, Philippe G.
2016-01-01
We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541-1573, 2017. © 2016 Wiley Periodicals, Inc. PMID:27860095
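The core trick, rank-transforming each variable, mapping the ranks through the Gaussian inverse CDF, and then exploiting the closed-form entropy of Gaussian variables, can be sketched for two one-dimensional variables. This simplified illustration is not the authors' released toolbox:

```python
from math import log
from statistics import NormalDist

def gaussian_copula_mi(x, y):
    """Gaussian-copula estimate of mutual information (in bits).

    Each variable is rank-transformed to (0, 1) and Gaussianized via the
    inverse normal CDF; MI is then read off the Gaussian formula
    -0.5 * log2(1 - r^2) using the correlation r of the transformed data.
    """
    nd = NormalDist()

    def gaussianize(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        ranks = [0.0] * len(v)
        for r, i in enumerate(order):
            ranks[i] = (r + 1) / (len(v) + 1)   # empirical CDF in (0, 1)
        return [nd.inv_cdf(u) for u in ranks]

    gx, gy = gaussianize(x), gaussianize(y)
    n = len(gx)
    mx, my = sum(gx) / n, sum(gy) / n
    r = (sum((a - mx) * (b - my) for a, b in zip(gx, gy)) /
         (sum((a - mx) ** 2 for a in gx) *
          sum((b - my) ** 2 for b in gy)) ** 0.5)
    return -0.5 * log(1 - r * r, 2)
```

Because only ranks enter the estimate, it is invariant to monotonic transformations of either variable, one reason the resulting effect sizes sit on a common meaningful scale.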
Graph-based urban scene analysis using symbolic data
NASA Astrophysics Data System (ADS)
Moissinac, Henri; Maitre, Henri; Bloch, Isabelle
1995-07-01
A framework is presented for the interpretation of an urban landscape based on the analysis of aerial pictures. This method has been designed to use a priori knowledge provided by a geographic map in order to improve the image analysis stage. A coherent final interpretation of the studied area is proposed. It relies on a graph-based data structure to model the urban landscape, and on global uncertainty management to evaluate the final confidence we can have in the results presented. This structure and uncertainty management tend to reflect the hierarchy of the available data and the interpretation levels.
Zhang, Chengwei; Li, Xiaohong; Li, Shuxin; Feng, Zhiyong
2017-09-20
The biological environment is uncertain, and its dynamics are similar to those of a multiagent environment; thus research results from the multiagent systems area can provide valuable insights into the understanding of biology and are of great significance for its study. Learning in a multiagent environment is highly dynamic since the environment is no longer stationary and each agent's behavior changes adaptively in response to other coexisting learners, and vice versa. The dynamics become more unpredictable when we move from fixed-agent interaction environments to a multiagent social learning framework. Analytical understanding of the underlying dynamics is important and challenging. In this work, we present a social learning framework with homogeneous learners (e.g., Policy Hill Climbing (PHC) learners), and model the behavior of players in the social learning framework as a hybrid dynamical system. By analyzing the dynamical system, we obtain some conditions for convergence or non-convergence. We experimentally verify the predictive power of our model using a number of representative games. Experimental results confirm the theoretical analysis. Under the multiagent social learning framework, we modeled the behavior of agents in a biological environment and theoretically analyzed the dynamics of the model. We present some sufficient conditions for convergence or non-convergence and prove them theoretically. The model can be used to predict the convergence of the system.
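A PHC learner couples a Q-learning value update with a bounded step of probability mass toward the currently greedy action. A single-state sketch under assumed parameter names (alpha, delta, gamma are the usual learning, hill-climbing and discount rates; this is a generic PHC step, not the paper's hybrid dynamical-system model):

```python
def phc_update(policy, q_values, action, reward, next_max_q,
               alpha=0.1, delta=0.05, gamma=0.9):
    """One Policy Hill-Climbing step for a single state.

    policy: dict action -> probability, q_values: dict action -> Q.
    Both dicts are updated in place and returned.
    """
    # Q-learning update for the action actually taken
    q_values[action] += alpha * (reward + gamma * next_max_q - q_values[action])
    # move probability mass toward the greedy action by step size delta
    greedy = max(q_values, key=q_values.get)
    for a in policy:
        if a == greedy:
            policy[a] = min(1.0, policy[a] + delta)
        else:
            policy[a] = max(0.0, policy[a] - delta / (len(policy) - 1))
    # clipping can break normalization, so renormalize
    total = sum(policy.values())
    for a in policy:
        policy[a] /= total
    return policy, q_values
```

Iterating this update for several co-adapting learners is what makes the joint dynamics non-stationary: each agent's greedy action shifts as the others' policies move.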
Koutkias, Vassilis; Kilintzis, Vassilis; Stalidis, George; Lazou, Katerina; Niès, Julie; Durand-Texte, Ludovic; McNair, Peter; Beuscart, Régis; Maglaveras, Nicos
2012-06-01
The primary aim of this work was the development of a uniform, contextualized and sustainable knowledge-based framework to support adverse drug event (ADE) prevention via Clinical Decision Support Systems (CDSSs). In this regard, the employed methodology involved first the systematic analysis and formalization of the knowledge sources elaborated in the scope of this work, through which an application-specific knowledge model has been defined. The entire framework architecture has been then specified and implemented by adopting Computer Interpretable Guidelines (CIGs) as the knowledge engineering formalism for its construction. The framework integrates diverse and dynamic knowledge sources in the form of rule-based ADE signals, all under a uniform Knowledge Base (KB) structure, according to the defined knowledge model. Equally important, it employs the means to contextualize the encapsulated knowledge, in order to provide appropriate support considering the specific local environment (hospital, medical department, language, etc.), as well as the mechanisms for knowledge querying, inference, sharing, and management. In this paper, we present thoroughly the establishment of the proposed knowledge framework by presenting the employed methodology and the results obtained as regards implementation, performance and validation aspects that highlight its applicability and virtue in medication safety. Copyright © 2012 Elsevier Inc. All rights reserved.
Creative Survival in Educational Bureaucracies.
ERIC Educational Resources Information Center
Brubaker, Dale L.; Nelson, Roland H., Jr.
To survive creatively in, and to change, educational organizations, the decision-maker needs to understand how these organizations presently function. Educational organizations are discussed as sociopolitical systems and a conceptual framework is proposed for analysis, planning, implementation, and evaluation. The five functions that…
An integrated hybrid spatial-compartmental modeling approach is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass ...
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
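The rejection-sampling reinterpretation at the heart of BUS can be sketched directly: draw from the prior, accept each draw with probability L(θ)/L_max, and estimate the rare-event probability from the accepted posterior samples. The toy one-dimensional prior, likelihood and limit-state function below are assumptions for illustration; the paper's efficient variants (FORM, importance sampling, Subset Simulation) replace this brute-force loop:

```python
import random
from math import exp

def bus_rejection(prior_sample, likelihood, l_max, limit_state,
                  n=100_000, seed=1):
    """Bayesian updating by rejection sampling (BUS-style sketch).

    Accept a prior draw theta with probability likelihood(theta) / l_max,
    then estimate the posterior probability of the rare event
    limit_state(theta) < 0 from the accepted samples.
    """
    rng = random.Random(seed)
    accepted = []
    for _ in range(n):
        theta = prior_sample(rng)
        if rng.random() * l_max < likelihood(theta):
            accepted.append(theta)
    failures = sum(1 for t in accepted if limit_state(t) < 0)
    return failures / len(accepted), len(accepted)

# Toy setting: N(0, 1) prior and one noisy observation y = 1 with unit
# noise, so the exact posterior is N(0.5, 0.5); event: theta > 1.5.
p, m = bus_rejection(lambda rng: rng.gauss(0.0, 1.0),
                     lambda t: exp(-0.5 * (1.0 - t) ** 2),
                     1.0,
                     lambda t: 1.5 - t)
print(p, m)
```

The exact posterior probability here is about 0.079; the inefficiency of plain rejection for genuinely rare events (most draws rejected, few failures observed) is exactly what motivates coupling BUS with structural-reliability methods.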
Systematic text condensation: a strategy for qualitative analysis.
Malterud, Kirsti
2012-12-01
To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.
ERIC Educational Resources Information Center
Vanfretti, Luigi; Farrokhabadi, Mostafa
2015-01-01
This article presents the implementation of the constructive alignment theory (CAT) in a power system analysis course through a consensus-based course design process. The consensus-based design process involves both the instructor and graduate-level students and it aims to develop the CAT framework in a holistic manner with the goal of including…
ERIC Educational Resources Information Center
Yusop, Farrah Dina
2013-01-01
This paper presents curriculum and design analyses of an Emmy-award-winning children's educational television series, Cyberchase. Using Posner's (2004) four-process curriculum analysis framework, this paper addresses each of the components and relates them to the design principles undertaken by the Cyberchase production team. Media and document…
ERIC Educational Resources Information Center
Czapla, Malgorzata; Berlinska, Agnieszka
2011-01-01
The aim of this article is to present an analysis of formal educational documents in the context of the sustainable development notion. This goal was realised by an analysis of the National Curriculum Framework documents from 2002 in comparison with the newest document from 2008. In addition, seven teaching programmes were analysed. On the grounds…
ERIC Educational Resources Information Center
Anderhag, Per; Wickman, Per-Olof; Hamza, Karim Mikael
2015-01-01
In this article we respond to the discussion by Alexandra Schindel Dimick regarding how the taste analysis presented in our feature article can be expanded within a Bourdieuan framework. Here we acknowledge the significance of field theory to introduce wider reflexivity on the kind of taste that is constituted in the science classroom, while we at…
Re-Engineering Values into the Youth Education System: A Needs Analysis Study in Brunei Darussalam
ERIC Educational Resources Information Center
Zakaria, Gamal Abdul Nasir; Tajudeen, Ahmad Labeeb; Nawi, Aliff; Mahalle, Salwa
2014-01-01
This study aimed to present a practical framework for designing a values teaching program in the youth education system. The choice of content, the nature of the students with respect to learning and their perception about the selected content for teaching values were studied. The study follows a Needs analysis design which drew upon document…
ERIC Educational Resources Information Center
Sandoval, Ivonne; Possani, Edgar
2016-01-01
The purpose of this paper is to present an analysis of the difficulties faced by students when working with different representations of vectors, planes and their intersections in R[superscript 3]. Duval's theoretical framework on semiotic representations is used to design a set of evaluating activities, and later to analyze student work. The…
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei (OA)
2014-01-01
A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This presentation describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this presentation is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture.
Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S
2016-12-01
We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlining commonalities in both data structures so that a common multi-channel data fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly taking into consideration advances in multi-resolution analysis and model based fractional order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided and the importance of preserving textural information is highlighted. Feature extraction and classification methods taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions are presented. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward the direction of developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Exploring the Use of Cost-Benefit Analysis to Compare Pharmaceutical Treatments for Menorrhagia.
Sanghera, Sabina; Frew, Emma; Gupta, Janesh Kumar; Kai, Joe; Roberts, Tracy Elizabeth
2015-09-01
The extra-welfarist theoretical framework tends to focus on health-related quality of life, whilst the welfarist framework captures a wider notion of well-being. EQ-5D and SF-6D are commonly used to value outcomes in chronic conditions with episodic symptoms, such as heavy menstrual bleeding (clinically termed menorrhagia). Because of their narrow health focus and the condition's episodic nature, these measures may be unsuitable. A viable alternative measure is willingness to pay (WTP) from the welfarist framework. We explore the use of WTP in a preliminary cost-benefit analysis comparing pharmaceutical treatments for menorrhagia. A cost-benefit analysis was carried out based on an outcome of WTP. The analysis is based in the UK primary care setting over a 24-month time period, with a partial societal perspective. Ninety-nine women completed a WTP exercise from the ex-ante (pre-treatment/condition) perspective. Maximum average WTP values were elicited for two pharmaceutical treatments, the levonorgestrel-releasing intrauterine system (LNG-IUS) and oral treatment. Cost data were offset against WTP and the net present value derived for each treatment. Qualitative information explaining the WTP values was also collected. Oral treatment was indicated to be the most cost-beneficial intervention, costing £107 less than LNG-IUS and generating £7 more benefit. The mean incremental net present value for oral treatment compared with LNG-IUS was £113. The use of the WTP approach was acceptable, as very few protests and non-responses were observed. The preliminary cost-benefit analysis results recommend oral treatment as the first-line treatment for menorrhagia. The WTP approach is a feasible alternative to the conventional EQ-5D/SF-6D approaches and offers advantages by capturing benefits beyond health, which is particularly relevant in menorrhagia.
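The decision rule in a WTP-based cost-benefit analysis is simple arithmetic: net present value is benefit (WTP) minus cost, compared incrementally between options. The figures below are hypothetical placeholders, not the study's inputs:

```python
def incremental_npv(benefit_a, cost_a, benefit_b, cost_b):
    """Incremental net present value of option A over option B:
    (B_A - C_A) - (B_B - C_B). Positive values favor option A."""
    return (benefit_a - cost_a) - (benefit_b - cost_b)

# Hypothetical WTP/cost figures in GBP (illustration only): option A
# yields 7 more benefit and costs 107 less than option B.
delta = incremental_npv(benefit_a=500, cost_a=200, benefit_b=493, cost_b=307)
```

An incremental rather than per-option comparison is what makes the result directly interpretable as "how much better off does choosing A leave us than choosing B".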
Influence of different tightening forces before laser welding to the implant/framework fit.
da Silveira-Júnior, Clebio Domingues; Neves, Flávio Domingues; Fernandes-Neto, Alfredo Júlio; Prado, Célio Jesus; Simamoto-Júnior, Paulo César
2009-06-01
The aim of the present study was to evaluate the influence of abutment screw tightening force before laser welding procedures on the vertical fit of metal frameworks over four implants. To construct the frameworks, prefabricated titanium abutments and cylindrical titanium bars were joined by laser welding to compose three groups: a manual torque group (GMT), GT10 and GT20. Before welding, manual torque simulating routine laboratory procedure was applied to GMT. In GT10 and GT20, the abutment screws received 10 and 20 Ncm torque, respectively. After welding, the implant/framework interfaces were assessed by optical comparator microscope using two methods. First, the single screw test (SST) was used, in which the interfaces of the screwed and non-screwed abutments were assessed, considering only the abutments at the framework extremities. Second, the interfaces of all the abutments were evaluated when they were screwed. In the SST, intergroup analysis (Kruskal-Wallis) showed no significant difference among the three conditions of tightening force; that is, the different tightening forces before welding did not guarantee smaller distortions. Intragroup analysis (Wilcoxon) showed that for all groups, the interfaces of the non-screwed abutments were statistically greater than the interfaces of the screwed abutments, evidencing distortions in all the frameworks. ANOVA was applied for the comparison of interfaces when all the abutments were screwed and showed no significant difference among the groups. Under the conditions of this study, pre-welding tightening of abutment screws did not influence the vertical fit of implant-supported metal frameworks.
Mirzoev, Tolib N; Green, Andrew; Van Kalliecharan, Ricky
2015-01-01
An adequate capacity of ministries of health (MOH) to develop and implement policies is essential. However, no frameworks were found for assessing MOH capacity to conduct health policy processes within developing countries. This paper presents a conceptual framework for assessing MOH capacity to conduct policy processes, based on a study from Tajikistan, a former Soviet republic where independence highlighted capacity challenges. Data collection for this qualitative study included in-depth interviews, document reviews and observations of policy events. The framework approach was used for analysis. The conceptual framework was informed by existing literature, guided the data collection and analysis, and was subsequently refined following insights from the study. The Tajik MOH capacity, while gradually improving, remains weak. There is poor recognition of wider contextual influences, ineffective leadership and governance as reflected in centralised decision-making, limited use of evidence, inadequate participation of policy actors and ineffective use of resources to conduct policy processes. However, the question is whether this reflects a lack of MOH ability, a constraining environment, or both. The conceptual framework identifies five determinants of robust policy processes, each with specific capacity needs: policy context, MOH leadership and governance, involvement of policy actors, the role of evidence and effective resource use for policy processes. Three underlying considerations are important for applying the capacity to policy processes: the need for clear focus, recognition of capacity levels and elements, and both ability and an enabling environment. The proposed framework can be used in assessing and strengthening the capacity of different policy actors. Copyright © 2013 John Wiley & Sons, Ltd.
Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup
2010-10-01
We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis techniques. Initially, specific rates of cell growth, glucose/amino acid consumption and mAb/metabolite production were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand the physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine; (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine; and (iii) lysine, valine and isoleucine. Further analysis using partial least squares (PLS) regression identified key amino acids that were positively or negatively correlated with cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
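The PCA step described above projects consumption-rate profiles onto their leading principal component so that amino acids with similar trends cluster together. A minimal sketch of that idea follows, using power iteration on the sample covariance matrix; the data values are illustrative, not the study's measurements.

```python
# Minimal PCA sketch: leading principal component via power iteration.
# Rows are time points; columns are specific consumption rates of three
# (hypothetical) amino acids. Values are illustrative only.
import math

def center(X):
    n = len(X)
    means = [sum(row[j] for row in X) / n for j in range(len(X[0]))]
    return [[x - m for x, m in zip(row, means)] for row in X]

def covariance(X):
    Xc = center(X)
    n, p = len(Xc), len(Xc[0])
    return [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / (n - 1)
             for b in range(p)] for a in range(p)]

def leading_eigenvector(C, iters=200):
    v = [1.0] * len(C)
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(len(C))) for i in range(len(C))]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

profiles = [
    [1.0, 2.1, 0.5],
    [1.2, 2.0, 0.7],
    [1.4, 2.3, 0.9],
    [1.6, 2.2, 1.1],
]
pc1 = leading_eigenvector(covariance(profiles))
# Scores: projection of each centered observation onto the first component.
scores = [sum(x * w for x, w in zip(row, pc1)) for row in center(profiles)]
print(scores)
```

In practice a library (e.g. scikit-learn) would be used and all components retained; this sketch only shows the mechanics behind the clustering of consumption profiles.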
Geomorphic analysis of large alluvial rivers
NASA Astrophysics Data System (ADS)
Thorne, Colin R.
2002-05-01
Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.
Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data
Kümmel, Anne; Panke, Sven; Heinemann, Matthias
2006-01-01
As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
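The core second-law check that NET analysis applies can be illustrated directly: a reaction can carry flux in the assumed direction only if its transformed Gibbs energy, evaluated at the measured metabolite concentrations, is negative. The standard Gibbs energy and concentrations below are hypothetical, chosen only to show the mechanics.

```python
# Hedged sketch of the thermodynamic feasibility check underlying NET analysis.
# dG = dG0' + R*T*ln(Q), with Q the reaction quotient from measured
# concentrations. The reaction, dG0', and concentrations are illustrative.
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)
T = 310.15    # temperature, K

def reaction_gibbs(dg0_prime, substrates, products):
    """Transformed reaction Gibbs energy (kJ/mol); concentrations in M."""
    q = math.prod(products) / math.prod(substrates)
    return dg0_prime + R * T * math.log(q)

# Hypothetical reaction A -> B at measured concentrations.
dg = reaction_gibbs(dg0_prime=-5.0, substrates=[1e-3], products=[1e-4])
feasible = dg < 0
print(dg, feasible)
```

NET analysis applies this constraint network-wide: a measured metabolome is consistent only if concentrations (and inferred ones) admit negative Gibbs energies along all operating flux directions, and large displacements from equilibrium flag candidate regulatory sites.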
NASA Astrophysics Data System (ADS)
Kalman, Calvin S.; Aulls, Mark W.
This study examines a course in which students use two writing activities and collaborative group activities to examine the conceptual structure of the calculus-based introductory physics course. Students are presented with two alternative frameworks: pre-Galilean physics and Newtonian physics. The idea of the course design is that students would at first view the frameworks almost in a theatrical sense, as a drama involving a conflict of actors (Aristotle, Galileo, Newton and others) occurring a long time ago. Passing through a series of interventions as participants, the students become aware that the frameworks relate concepts from different parts of the course and learn to evaluate the two alternative frameworks. They develop a scientific mindset, changing their outlook on the course material from the viewpoint that it consists of a tool kit of assorted practices, classified according to problem type, to the viewpoint that it comprises a connected structure of concepts.
A security framework for nationwide health information exchange based on telehealth strategy.
Zaidan, B B; Haiqi, Ahmed; Zaidan, A A; Abdulnabi, Mohamed; Kiah, M L Mat; Muzamel, Hussaen
2015-05-01
This study focuses on the situation of health information exchange (HIE) in the context of a nationwide network. It aims to create a security framework that can be implemented to ensure the safe transmission of health information across the boundaries of care providers in Malaysia and other countries. First, a critique of the major elements of nationwide health information networks is presented from the perspective of security, along with such topics as the importance of HIE, issues, and main approaches. Second, a systematic evaluation is conducted on the security solutions that can be utilized in the proposed nationwide network. Finally, a secure framework for health information transmission is proposed within a central cloud-based model, which is compatible with the Malaysian telehealth strategy. The outcome of this analysis indicates that a complete security framework for a global structure of HIE is yet to be defined and implemented. Our proposed framework represents such an endeavor and suggests specific techniques to achieve this goal.
Development of the Modes of Collaboration framework
NASA Astrophysics Data System (ADS)
Pawlak, Alanna; Irving, Paul W.; Caballero, Marcos D.
2018-01-01
Group work is becoming increasingly common in introductory physics classrooms. Understanding how students engage in these group learning environments is important for designing and facilitating productive learning opportunities for students. We conducted a study in which we collected video of groups of students working on conceptual electricity and magnetism problems in an introductory physics course. In this setting, students needed to negotiate a common understanding and coordinate group decisions in order to complete the activity successfully. We observed students interacting in several distinct ways while solving these problems. Analysis of these observations focused on identifying the different ways students interacted and articulating what defines and distinguishes them, resulting in the development of the modes of collaboration framework. The modes of collaboration framework defines student interactions along three dimensions: social, discursive, and disciplinary content. This multidimensional approach offers a unique lens through which to consider group work and provides a flexibility that could allow the framework to be adapted for a variety of contexts. We present the framework and several examples of its application here.
Analysis of Implicit Uncertain Systems. Part 1: Theoretical Framework
1994-12-07
Paganini, Fernando; Doyle, John
...model and a number of constraints relevant to the analysis problem under consideration. In Part I of this paper we propose a theoretical framework which...
Remote visual analysis of large turbulence databases at multiple scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulido, Jesus; Livescu, Daniel; Kanov, Kalin
The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
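The wavelet decomposition at the heart of the compression scheme can be illustrated with a one-level Haar transform: the signal splits into coarse averages and detail coefficients, and small details can be zeroed to compress. This pure-Python sketch is illustrative only and is not the paper's implementation.

```python
# One-level Haar wavelet transform: averages + details, with perfect
# reconstruction when no coefficients are dropped. Illustrative sketch only.
def haar_step(signal):
    s = 0.5 ** 0.5
    avg = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    det = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    return avg, det

def inverse_haar_step(avg, det):
    s = 0.5 ** 0.5
    out = []
    for a, d in zip(avg, det):
        out.extend([(a + d) * s, (a - d) * s])
    return out

x = [4.0, 4.0, 8.0, 8.0, 1.0, 1.0, 3.0, 3.0]
avg, det = haar_step(x)
rec = inverse_haar_step(avg, det)
print(det)  # all zeros here: a pair-wise-constant signal compresses perfectly
```

Multi-resolution analysis repeats the step on the averages; thresholding the detail coefficients before transmission is what makes remote, reduced-bandwidth visualization practical.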
Remote visual analysis of large turbulence databases at multiple scales
Pulido, Jesus; Livescu, Daniel; Kanov, Kalin; ...
2018-06-15
Does social marketing provide a framework for changing healthcare practice?
Morris, Zoë Slote; Clarkson, Peter John
2009-07-01
We argue that social marketing can be used as a generic framework for analysing barriers to the take-up of clinical guidelines and for planning interventions that seek to enable this change. We reviewed the literature on the take-up of clinical guidelines, in particular barriers and enablers to change; social marketing principles; and social marketing applied to healthcare. We then applied the social marketing framework to analyse the literature and to consider implications for future guideline policy, to assess its feasibility and accessibility. There is a sizeable extant literature on healthcare practitioners' non-compliance with clinical guidelines. This is an international problem common to a number of settings. The reasons for poor levels of take-up appear to be well understood, but not addressed adequately in practice. Applying a social marketing framework brings new insights to the problem. We show that a social marketing framework provides a useful solution-focused framework for systematically understanding barriers to individual behaviour change and designing interventions accordingly. Whether the social marketing framework provides an effective means of bringing about behaviour change remains an empirical question that has still to be tested in practice. The analysis presented here provides strong motivation to begin such testing.
An overview of infusing service-learning in medical education.
Stewart, Trae; Wubbena, Zane
2014-08-04
To identify and review existing empirical research about service-learning and medical education and then to develop a framework for infusing service-learning in Doctor of Medicine or Doctor of Osteopathic Medicine curricula. We selected literature on service-learning and medical education. Articles were screened with a protocol for inclusion or exclusion at two separate stages. At stage one, articles were screened according to their titles, abstracts, and keywords. The second stage involved a full-text review. Finally, a thematic analysis using focused and selective coding was conducted. Eighteen studies were analyzed spanning the years 1998 to 2012. The results from our analysis informed the development of a four-stage service-learning framework: 1) planning and preparation, 2) action, 3) reflection and demonstration, and 4) assessment and celebration. The presented service-learning framework can be used to develop curricula for the infusion of service-learning in medical school. Service-learning curricula in medical education have the potential to provide myriad benefits to faculty, students, community members, and university-community partnerships.
Toward Improved Fidelity of Thermal Explosion Simulations
NASA Astrophysics Data System (ADS)
Nichols, Albert; Becker, Richard; Burnham, Alan; Howard, W. Michael; Knap, Jarek; Wemhoff, Aaron
2009-06-01
We present results of an improved thermal/chemical/mechanical model of HMX-based explosives such as LX04 and LX10 for thermal cook-off. The original HMX model and analysis scheme were developed by Yoh et al. for use in the ALE3D modeling framework. The improvements were concentrated in four areas. First, we added porosity to the chemical material model framework in ALE3D used to model HMX explosive formulations, to handle the roughly 2% porosity in solid explosives. Second, we improved the HMX reaction network, including the addition of a reactive phase change model based on work by Henson et al. Third, we added early decomposition gas species to the CHEETAH material database to improve equations of state for gaseous intermediates and products. Finally, we improved the implicit mechanics module in ALE3D to more naturally handle the long time scales associated with thermal cook-off. The application of the resulting framework to the analysis of the Scaled Thermal Explosion (STEX) experiments will be discussed.
Vogel, Curtis R; Tyler, Glenn A; Wittich, Donald J
2014-07-01
We introduce a framework for modeling, analysis, and simulation of aero-optics wavefront aberrations that is based on spatial-temporal covariance matrices extracted from wavefront sensor measurements. Within this framework, we present a quasi-homogeneous structure function to analyze nonhomogeneous, mildly anisotropic spatial random processes, and we use this structure function to show that phase aberrations arising in aero-optics are, for an important range of operating parameters, locally Kolmogorov. This strongly suggests that the d^(5/3) power law for adaptive optics (AO) deformable mirror fitting error, where d denotes actuator separation, holds for certain important aero-optics scenarios. This framework also allows us to compute bounds on AO servo lag error and predictive control error. In addition, it provides us with the means to accurately simulate AO systems for the mitigation of aero-effects, and it may provide insight into underlying physical processes associated with turbulent flow. The techniques introduced here are demonstrated using data obtained from the Airborne Aero-Optics Laboratory.
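The d^(5/3) fitting-error power law mentioned above is easy to evaluate numerically. In the sketch below, the proportionality coefficient alpha depends on the deformable mirror's influence functions; the value 0.28 is a commonly quoted figure assumed here for illustration, not a result of the paper.

```python
# Illustrative evaluation of the (d/r0)^(5/3) deformable-mirror fitting-error
# power law under Kolmogorov statistics. alpha = 0.28 is an assumed, commonly
# quoted coefficient; actual values depend on the DM influence functions.
def fitting_error_variance(d, r0, alpha=0.28):
    """Residual phase variance (rad^2) after DM fitting."""
    return alpha * (d / r0) ** (5.0 / 3.0)

# Halving the actuator separation reduces the variance by a factor of 2^(5/3).
v1 = fitting_error_variance(d=0.10, r0=0.05)
v2 = fitting_error_variance(d=0.05, r0=0.05)
print(v1 / v2)
```

The practical reading is that actuator pitch d must shrink in proportion to the local coherence length r0, which is why establishing locally Kolmogorov statistics for aero-optical aberrations matters for AO system sizing.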
Breast Mass Detection in Digital Mammogram Based on Gestalt Psychology
Bu, Qirong; Liu, Feihong; Zhang, Min; Ren, Yu; Lv, Yi
2018-01-01
Inspired by Gestalt psychology, we combine human cognitive characteristics with the knowledge of radiologists in medical image analysis. In this paper, a novel framework is proposed to detect breast masses in digitized mammograms. It can be divided into three modules: sensation integration, semantic integration, and verification. After analyzing the process of a radiologist's mammography screening, a series of visual rules based on the morphological characteristics of breast masses are presented and quantified by mathematical methods. The framework can be seen as an effective trade-off between bottom-up sensation and top-down recognition methods. This is a new exploratory method for the automatic detection of lesions. The experiments are performed on the Mammographic Image Analysis Society (MIAS) and Digital Database for Screening Mammography (DDSM) data sets. The sensitivity reached 92% at 1.94 false positives per image (FPI) on MIAS and 93.84% at 2.21 FPI on DDSM. Our framework has achieved a better performance compared with other algorithms. PMID:29854359
NASA Technical Reports Server (NTRS)
Nakazawa, Shohei
1991-01-01
Formulations and algorithms implemented in the MHOST finite element program are discussed. The code uses a novel concept of the mixed iterative solution technique for the efficient 3-D computations of turbine engine hot section components. The general framework of variational formulation and solution algorithms is discussed, derived from the mixed three-field Hu-Washizu principle. This formulation enables the use of nodal interpolation for coordinates, displacements, strains, and stresses. The algorithmic description of the mixed iterative method includes variations for the quasi-static, transient dynamic and buckling analyses. The global-local analysis procedure referred to as subelement refinement is developed in the framework of the mixed iterative solution, the details of which are presented. The numerically integrated isoparametric elements implemented in the framework are discussed. Methods to filter certain parts of strain and project the element discontinuous quantities to the nodes are developed for a family of linear elements. Integration algorithms are described for linear and nonlinear equations included in the MHOST program.
A Framework for Web Usage Mining in Electronic Government
NASA Astrophysics Data System (ADS)
Zhou, Ping; Le, Zhongjian
Web usage mining has been a major component of management strategy to enhance organizational analysis and decision-making. The literature on Web usage mining that deals with strategies and technologies for effectively employing Web usage mining is quite vast. In recent years, e-government has received much attention from researchers and practitioners. Huge amounts of user access data are produced in e-government Web sites every day. The role of these data in the success of government management cannot be overstated, because they affect government analysis, prediction, strategies, tactical and operational planning, and control. Web usage mining in e-government has an important role to play in setting government objectives, discovering citizen behavior, and determining future courses of action. Yet Web usage mining in e-government has not received adequate attention from researchers or practitioners. We developed a framework to promote a better understanding of the importance of Web usage mining in e-government. Using the current literature, we developed the framework presented herein, in hopes that it will stimulate more interest in this important area.
Sadeghi, Neda; Prastawa, Marcel; Fletcher, P Thomas; Gilmore, John H; Lin, Weili; Gerig, Guido
2012-01-01
A population growth model that represents the growth trajectories of individual subjects is critical to study and understand neurodevelopment. This paper presents a framework for jointly estimating and modeling individual and population growth trajectories, and determining significant regional differences in growth pattern characteristics applied to longitudinal neuroimaging data. We use non-linear mixed effect modeling where temporal change is modeled by the Gompertz function. The Gompertz function uses intuitive parameters related to delay, rate of change, and expected asymptotic value; all descriptive measures which can answer clinical questions related to growth. Our proposed framework combines nonlinear modeling of individual trajectories, population analysis, and testing for regional differences. We apply this framework to the study of early maturation in white matter regions as measured with diffusion tensor imaging (DTI). Regional differences between anatomical regions of interest that are known to mature differently are analyzed and quantified. Experiments with image data from a large ongoing clinical study show that our framework provides descriptive, quantitative information on growth trajectories that can be directly interpreted by clinicians. To our knowledge, this is the first longitudinal analysis of growth functions to explain the trajectory of early brain maturation as it is represented in DTI.
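The Gompertz function used above for growth trajectories is straightforward to write down. The sketch below uses the parameterization f(t) = asymptote * exp(-delay * exp(-rate * t)), matching the descriptive parameters the abstract names (delay, rate of change, asymptotic value); the parameter values are illustrative, not estimates from the study.

```python
# Sketch of the Gompertz growth curve used for modeling maturation
# trajectories. Parameter values are illustrative, not study estimates.
import math

def gompertz(t, asymptote, delay, rate):
    """f(t) = asymptote * exp(-delay * exp(-rate * t))."""
    return asymptote * math.exp(-delay * math.exp(-rate * t))

# The curve rises from near zero toward its asymptote as t grows.
early = gompertz(0.0, asymptote=1.0, delay=5.0, rate=0.8)
late = gompertz(20.0, asymptote=1.0, delay=5.0, rate=0.8)
print(early, late)
```

In the study's setting, the per-subject and population-level parameters of such curves are estimated jointly with nonlinear mixed-effects modeling, and regional differences are tested on the fitted parameters rather than the raw data.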
Bacchi, Ataís; Consani, Rafael Leonardo Xediek; Mesquita, Marcelo Ferraz; Dos Santos, Mateus Bertolini Fernandes
2013-09-01
This study evaluated the influence of framework material and vertical misfit on the stress created in an implant-supported partial prosthesis under load application. The posterior part of a severely resorbed jaw with a fixed partial prosthesis above two osseointegrated titanium implants at the positions of the second premolar and second molar was modeled using SolidWorks 2010 software. Finite element models were obtained by importing the solid model into an ANSYS Workbench 11 simulation. The models were divided into 15 groups according to their prosthetic framework material (type IV gold alloy, silver-palladium alloy, commercially pure titanium, cobalt-chromium alloy or zirconia) and vertical misfit level (10 µm, 50 µm and 100 µm). After settlement of the prosthesis with the closure of the misfit, simultaneous loads of 110 N vertical and 15 N horizontal were applied on the occlusal and lingual faces of each tooth, respectively. The data were evaluated using maximum principal stress (framework, porcelain veneer and bone tissue) and von Mises stress (retention screw) provided by the software. As a result, stiffer frameworks presented higher stress concentrations; however, these frameworks led to lower stresses in the porcelain veneer, the retention screw (for the 10 µm and 50 µm misfits) and the peri-implant bone tissues. The increase in vertical misfit resulted in increased stress values in all of the prosthetic structures and peri-implant bone tissues. The framework material and vertical misfit level had a relevant influence on the stresses in all of the structures evaluated.
ERIC Educational Resources Information Center
Guilamo-Ramos, Vincent; Jaccard, James; Dittus, Patricia; Gonzalez, Bernardo; Bouris, Alida
2008-01-01
A framework for the analysis of adolescent problem behaviors was explicated that draws on five major theories of human behavior. The framework emphasizes intentions to perform behaviors and factors that influence intentions as well as moderate the impact of intentions on behavior. The framework was applied to the analysis of adolescent sexual risk…
DOE Office of Scientific and Technical Information (OSTI.GOV)
López C, Diana C.; Wozny, Günter; Flores-Tlacuahuac, Antonio
2016-03-23
The lack of informative experimental data and the complexity of first-principles battery models make the recovery of kinetic, transport, and thermodynamic parameters complicated. We present a computational framework that combines sensitivity, singular value, and Monte Carlo analysis to explore how different sources of experimental data affect parameter structural ill conditioning and identifiability. Our study is conducted on a modified version of the Doyle-Fuller-Newman model. We demonstrate that the use of voltage discharge curves only enables the identification of a small parameter subset, regardless of the number of experiments considered. Furthermore, we show that the inclusion of a single electrolyte concentration measurement significantly aids identifiability and mitigates ill-conditioning.
Evaluating the transport layer of the ALFA framework for the Intel® Xeon Phi™ Coprocessor
NASA Astrophysics Data System (ADS)
Santogidis, Aram; Hirstius, Andreas; Lalis, Spyros
2015-12-01
The ALFA framework supports the software development of major High Energy Physics experiments. As part of our research effort to optimize the transport layer of ALFA, we focus on profiling its data transfer performance for inter-node communication on the Intel Xeon Phi Coprocessor. In this article we present the collected performance measurements together with the related analysis of the results. The optimization opportunities that were discovered help us formulate future plans for enabling high-performance data transfer for ALFA on the Intel Xeon Phi architecture.
NASA Astrophysics Data System (ADS)
Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis
2018-02-01
This addendum adds to the analysis presented in 'Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities' (Abadie et al 2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between the highly technical risk-management discussion and the public risk-aversion debate. We also propose that the framework could be used for stress-testing resilience.
Towards Stability Analysis of Jump Linear Systems with State-Dependent and Stochastic Switching
NASA Technical Reports Server (NTRS)
Tejada, Arturo; Gonzalez, Oscar R.; Gray, W. Steven
2004-01-01
This paper analyzes the stability of hierarchical jump linear systems where the supervisor is driven by a Markovian stochastic process and by the values of the supervised jump linear system's states. The stability framework for this class of systems is developed over infinite and finite time horizons. The framework is then used to derive sufficient stability conditions for a specific class of hybrid jump linear systems with performance supervision. New sufficient stochastic stability conditions for discrete-time jump linear systems are also presented.
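A jump linear system of the kind analyzed above can be illustrated with a toy scalar simulation: the state is multiplied by a mode-dependent coefficient, and the mode switches according to a Markov chain. This is only a sketch of the system class, not the paper's stability framework; coefficients and transition probabilities are made up.

```python
# Toy discrete-time jump linear system x_{k+1} = a[mode_k] * x_k, with the
# mode driven by a Markov chain. Illustrative values only.
import random

def simulate(a, P, x0, steps, seed=0):
    rng = random.Random(seed)
    mode, x = 0, x0
    for _ in range(steps):
        x = a[mode] * x
        # Sample the next mode from row `mode` of the transition matrix P.
        u, acc = rng.random(), 0.0
        for j, p in enumerate(P[mode]):
            acc += p
            if u <= acc:
                mode = j
                break
    return x

a = [0.5, 0.8]                  # both modes individually stable here
P = [[0.9, 0.1], [0.2, 0.8]]    # Markov switching probabilities
xT = simulate(a, P, x0=1.0, steps=100)
print(abs(xT))                  # decays toward zero in this stable example
```

In general, stability of each mode alone is neither necessary nor sufficient for stochastic stability of the switched system, which is why conditions coupling the dynamics and the switching process, as in the paper, are needed.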
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammars by learning scene semantics. This framework combines learning scene semantics by trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically. Abnormal behaviors that disobey scene semantics or event grammar rules are detected. By this method, an approach to understanding video scenes is achieved. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
A Cyber-ITS Framework for Massive Traffic Data Analysis Using Cyber Infrastructure
Fontaine, Michael D.
2013-01-01
Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which involves integrating heterogeneous traffic data from different kinds of sensors and applying it to ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to the problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), by nature parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized as a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented later based on a traffic state estimation application that uses the fusion of massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results show that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion through parallel computing. PMID:23766690
Discovering System Health Anomalies Using Data Mining Techniques
NASA Technical Reports Server (NTRS)
Srivastava, Ashok N.
2005-01-01
We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or, in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain, where we analyze cockpit data as well as data from the aircraft propulsion, control, and guidance systems. These data include discrete and continuous sensor measurements, which are handled seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.
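The flavor of anomaly flagging in a sensor stream can be conveyed with a much simpler stand-in than the paper's kernel methods and hidden Markov models: a z-score test against the stream's own mean. This sketch is purely illustrative and is not the described framework; the data and threshold are made up.

```python
# Minimal anomaly-flagging sketch for a single sensor stream: flag samples
# whose z-score exceeds a threshold. A stand-in for illustration only; the
# paper's framework uses kernel methods and hidden Markov models instead.
import math

def zscore_anomalies(series, threshold=2.5):
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    std = math.sqrt(var)
    return [i for i, x in enumerate(series)
            if std > 0 and abs(x - mean) / std > threshold]

stream = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 9.0, 1.0, 1.1]
print(zscore_anomalies(stream))  # -> [7]
```

A per-stream test like this ignores cross-sensor correlations and temporal structure, which is precisely the gap that multivariate, model-based approaches such as those in the framework are designed to close.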
A Cyber-ITS framework for massive traffic data analysis using cyber infrastructure.
Xia, Yingjie; Hu, Jia; Fontaine, Michael D
2013-01-01
Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which means to integrate heterogeneous traffic data from different kinds of sensors and apply it for ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to the problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), by nature parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized as a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented later based on a traffic state estimation application that uses the fusion of massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results prove that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion by parallel computing.
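The domain decomposition and parallel processing techniques named above can be sketched in a hedged, toy form: hypothetical (sensor_id, speed) records are partitioned so each sensor's data lands in exactly one partition, partitions are processed concurrently, and per-partition traffic-state estimates are merged. This is an illustration of the general pattern, not Cyber-ITS code.

```python
from concurrent.futures import ThreadPoolExecutor
from collections import defaultdict

def decompose(records, n_parts):
    """Domain decomposition: hash each record's sensor id into a partition,
    so all readings from one sensor stay together."""
    parts = defaultdict(list)
    for sensor_id, speed in records:
        parts[hash(sensor_id) % n_parts].append((sensor_id, speed))
    return parts

def estimate_state(part):
    """Per-partition traffic-state estimate: mean speed per sensor."""
    sums = defaultdict(lambda: [0.0, 0])
    for sensor_id, speed in part:
        s = sums[sensor_id]
        s[0] += speed
        s[1] += 1
    return {sid: total / n for sid, (total, n) in sums.items()}

records = [("s1", 40.0), ("s1", 60.0), ("s2", 30.0), ("s3", 90.0), ("s2", 50.0)]
parts = decompose(records, n_parts=2)
with ThreadPoolExecutor(max_workers=2) as pool:
    results = pool.map(estimate_state, parts.values())
state = {}
for r in results:
    state.update(r)
```

Because the decomposition keys on sensor id, the merge step is a plain dictionary union with no cross-partition conflicts.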
A framework for building hypercubes using MapReduce
NASA Astrophysics Data System (ADS)
Tapiador, D.; O'Mullane, W.; Brown, A. G. A.; Luri, X.; Huedo, E.; Osuna, P.
2014-05-01
The European Space Agency's Gaia mission will create the largest and most precise three dimensional chart of our galaxy (the Milky Way), by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalog will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row or column oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of the deployment of the framework on a public cloud provider, benchmark against other popular solutions available (that are not always the best for such ad-hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis techniques workshops.
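The hypercube generation itself maps naturally onto MapReduce. A minimal pure-Python sketch (independent of Hadoop, with made-up star records) bins each record's dimensions in a map step and sums the counts per cell in a reduce step:

```python
import bisect
from collections import defaultdict

def map_star(star, edges):
    """Map step: bin each dimension of the record against its bin edges
    and emit a (cell, 1) pair; the bin index is the count of edges below."""
    cell = tuple(bisect.bisect_right(edges[d], star[d]) for d in range(len(star)))
    return (cell, 1)

def reduce_counts(pairs):
    """Reduce step: sum the counts emitted for each hypercube cell."""
    cube = defaultdict(int)
    for cell, n in pairs:
        cube[cell] += n
    return dict(cube)

# toy catalogue: (magnitude, parallax) pairs, binned on two axes
stars = [(9.5, 2.0), (11.2, 0.5), (9.9, 1.8), (14.0, 0.2)]
edges = [(10.0, 12.0), (1.0,)]        # bin boundaries per dimension
cube = reduce_counts(map_star(s, edges) for s in stars)
```

In a real MapReduce deployment the map and reduce functions stay this simple; the framework supplies the shuffle, distribution, and storage-model choices the paper benchmarks.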
Jiang, Guoqian; Wang, Chen; Zhu, Qian; Chute, Christopher G
2013-01-01
Knowledge-driven text mining is becoming an important research area for identifying pharmacogenomics target genes. However, few such studies have focused on the pharmacogenomics targets of adverse drug events (ADEs). The objective of the present study is to build a framework of knowledge integration and discovery that aims to support pharmacogenomics target prediction for ADEs. We integrate a semantically annotated literature corpus, Semantic MEDLINE, with a semantically coded ADE knowledge base known as ADEpedia using a semantic-web-based framework. We developed a knowledge discovery approach combining a network analysis of a protein-protein interaction (PPI) network with a gene functional classification approach. We performed a case study of drug-induced long QT syndrome to demonstrate the usefulness of the framework in predicting potential pharmacogenomics targets of ADEs.
Multimodal Speaker Diarization.
Noulas, A; Englebienne, G; Kröse, B J A
2012-01-01
We present a novel probabilistic framework that fuses information coming from the audio and video modality to perform speaker diarization. The proposed framework is a Dynamic Bayesian Network (DBN) that is an extension of a factorial Hidden Markov Model (fHMM) and models the people appearing in an audiovisual recording as multimodal entities that generate observations in the audio stream, the video stream, and the joint audiovisual space. The framework is very robust to different contexts, makes no assumptions about the location of the recording equipment, and does not require labeled training data as it acquires the model parameters using the Expectation Maximization (EM) algorithm. We apply the proposed model to two meeting videos and a news broadcast video, all of which come from publicly available data sets. The results acquired in speaker diarization are in favor of the proposed multimodal framework, which outperforms the single modality analysis results and improves over the state-of-the-art audio-based speaker diarization.
SmartMal: a service-oriented behavioral malware detection framework for mobile devices.
Wang, Chao; Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C K
2014-01-01
This paper presents SmartMal--a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into the malware detection paradigms. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, whose main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to concatenate the results of the detectors. We also propose a cycle-based statistical approach for mobile device anomaly detection, accomplished by analyzing users' regular usage patterns. Empirical results suggest that the proposed framework and the novel anomaly detection algorithm are highly effective in detecting malware on Android devices.
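The cycle-based statistical approach is only named in the abstract; one plausible reading, sketched here with hypothetical usage logs, learns a per-hour-of-day usage profile and flags readings whose z-score against that hour's history is extreme:

```python
import math
from collections import defaultdict

def fit_daily_profile(events):
    """Learn per-hour mean and std of a usage metric from (hour, value) logs."""
    by_hour = defaultdict(list)
    for hour, value in events:
        by_hour[hour].append(value)
    profile = {}
    for hour, vals in by_hour.items():
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        profile[hour] = (mean, math.sqrt(var))
    return profile

def is_anomalous(profile, hour, value, z=3.0):
    """Flag a reading whose z-score against the hour's profile exceeds z."""
    mean, std = profile[hour]
    return abs(value - mean) > z * max(std, 1e-9)

# toy training data: a usage metric (e.g. bytes sent) per hour over five days
history = [(h, 100 + 10 * d) for d in range(5) for h in range(24)]
profile = fit_daily_profile(history)
```

On a device, the client side would ship such per-cycle features to the server; the daily cycle and z-threshold here are illustrative assumptions.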
Overarching framework for data-based modelling
NASA Astrophysics Data System (ADS)
Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco
2014-02-01
One of the main modelling paradigms for complex physical systems is networks. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics, which are immediately applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that approaches followed so far lack.
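As a much simpler stand-in for the framework described above (which targets non-linear, non-stationary processes), the basic idea of inferring network structure from measured signals can be sketched with plain correlation thresholding on stationary surrogate data; nodes whose signals share a common driver become connected:

```python
import numpy as np

def correlation_network(signals, threshold=0.5):
    """Estimate an undirected network: connect nodes whose signals
    correlate above `threshold` in absolute value."""
    c = np.corrcoef(signals)
    adj = np.abs(c) > threshold
    np.fill_diagonal(adj, False)      # no self-loops
    return adj

rng = np.random.default_rng(1)
n = 2000
driver = rng.standard_normal(n)
x = driver + 0.1 * rng.standard_normal(n)   # node 0: coupled to driver
y = driver + 0.1 * rng.standard_normal(n)   # node 1: coupled to driver
z = rng.standard_normal(n)                  # node 2: independent
adj = correlation_network(np.vstack([x, y, z]))
```

The paper's contribution is precisely that this naive estimator fails for non-stationary, non-linear data; the sketch only fixes the input/output shape of the problem.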
Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.
Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael
2018-01-01
The availability of semantically enriched and interoperable clinical information models is crucial for reusing once-collected data across institutions, as aspired to in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling. Here, strong governance is required to define high-quality and reusable models. Our objective was the design of a clinical knowledge governance framework for openEHR modelling in cross-institutional settings like HiGHmed. Methods comprised the analysis of successful practices from international projects, published ideas on archetype governance, our own modelling experiences, and the modelling of BPMN processes. We designed a framework covering archetype variations, roles and responsibilities, IT support and modelling workflows. Our framework has great potential to make openEHR modelling efforts manageable. Because practical experiences are rare, our work will prospectively be well placed to evaluate the benefits of such structured governance approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek
2017-04-24
Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.
Eutrophication of lakes and reservoirs: A framework for making management decisions
Rast, W.; Holland, M.
1988-01-01
The development of management strategies for the protection of environmental quality usually involves consideration of both technical and nontechnical issues. A logical, step-by-step framework for the development of such strategies is provided. Its application to the control of cultural eutrophication of lakes and reservoirs illustrates its potential usefulness. From the perspective of the policymaker, the main consideration is that the eutrophication-related water quality of a lake or reservoir can be managed for given water uses. The approach presented here allows the rational assessment of relevant water-quality parameters and establishment of water-quality goals, consideration of social and other nontechnical issues, the possibility of public involvement in the decision-making process, and a reasonable economic analysis within a management framework.
NASA Technical Reports Server (NTRS)
Franck, Bruno M.
1990-01-01
The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.
NASA Astrophysics Data System (ADS)
de Jong, Floor; van Hillegersberg, Jos; van Eck, Pascal; van der Kolk, Feiko; Jorissen, Rene
The lack of effective IT governance is widely recognized as a key inhibitor to successful global IT outsourcing relationships. In this study we present the development and application of a governance framework to improve outsourcing relationships. The approach used to develop the IT governance framework includes a meta model and a customization process to fit the framework to the target organization. The IT governance framework consists of four elements: (1) organisational structures, (2) joint processes between in- and outsourcer, (3) responsibilities that link roles to processes and (4) a diverse set of control indicators to measure the success of the relationship. The IT governance framework was put into practice in Shell GFIT BAM, a part of Shell that had concluded it lacked management control over at least one of its outsourcing relationships. In a workshop, the governance framework was used to perform a gap analysis between the current and desired governance. Several gaps were identified in the way roles and responsibilities are assigned and joint processes are set up. Moreover, the workshop also showed the usefulness and usability of the IT governance framework in structuring, providing input to and managing stakeholders in discussions around IT governance.
Proposing a Universal Framework for Resilience: Optimizing Risk and Combating Human Vulnerabilities
NASA Astrophysics Data System (ADS)
Sarkar, Arunima
2017-04-01
In recent years we have seen the massive losses inflicted on urban settlements and critical infrastructure by disasters. Disaster risk brings with it vulnerabilities and complexities that can disrupt the functioning of human society. The uncertain losses created by disasters present unforeseeable risks that remain beyond human understanding. It is imperative to note that human urbanization and development are correlated with human vulnerabilities and the challenges posed by disasters. Disaster risks are aggravated by improper planning of cities, weak frameworks for urban governance and regulation, and inequalities amongst citizens. The international agenda on disaster risk reduction addresses the increasing losses from disasters associated with development and urbanization. The United Nations designated the 1990s the International Decade for Natural Disaster Reduction. In relation to this, the "Yokohama Strategy and Plan of Action" was adopted at the first United Nations World Conference on Disaster Reduction. The United Nations Educational, Scientific and Cultural Organization's (UNESCO) Intergovernmental Oceanographic Commission coordinated the World Conference on Disaster Reduction in 2005, where the Hyogo Framework for Action was adopted. The Hyogo Framework for Action: Building the Resilience of Communities to Disaster was adopted by 168 nations after the massive loss caused by the Indian Ocean tsunami of 2004. The Hyogo Framework proposes to focus on the implementation of risk and reliability systems to shield against disasters, and proposes global scientific and community platforms for disaster prevention and mitigation. The importance of early warning systems as an effective tool for reducing human vulnerabilities in disaster management was strongly emphasized.
It is imperative to highlight that a resilience framework is important in order to minimize the cost of disruption to critical infrastructure and to strengthen and optimize decision-making skills and platforms for a better, sustainable society. The resilience framework provides a cross-sector, multi-level analysis to tackle the vulnerabilities of essential utilities like power, water, transport and the various machineries that are essential for human sustainability. The resilience framework focuses on preventing damage and disruption from disasters, mitigating losses to human society, and providing the best response for disaster resilience. The basic pillars for implementing resilience are a proper governance framework and transparency that take into account various cost and risk analyses. A common, universal framework for resilience is therefore the main requirement for mass accessibility. The resilience framework aims at universal adaptability, coherence and validation. A mixed-method analysis has been undertaken in this research paper, focusing on the following issues: • Legal, institutional and community frameworks for integrating the resilience frameworks of the global north and global south. • Spatial and statistical analysis to structure disaster risk and resilience frameworks for disaster management. • Early warning systems and emergency response on a comparative scale, analysing the models of risk and resilience frameworks implemented in the USA, China, Nepal and India to propose an integrated resilience strategy.
NASA Astrophysics Data System (ADS)
Bora, Sanjay; Scherbaum, Frank; Kuehn, Nicolas; Stafford, Peter; Edwards, Benjamin
2016-04-01
The current practice of deriving empirical ground motion prediction equations (GMPEs) involves using ground motions recorded at multiple sites. However, in applications such as site-specific (e.g., critical facility) hazard analysis, ground motions obtained from GMPEs need to be adjusted/corrected to the particular site/site-condition under investigation. This study presents a complete framework for developing a response spectral GMPE within which the issue of adjustment of ground motions is addressed in a manner consistent with the linear system framework. The present approach is a two-step process: the first step consists of deriving two separate empirical models, one for Fourier amplitude spectra (FAS) and the other for a random vibration theory (RVT) optimized duration (Drvt) of ground motion. In the second step the two models are combined within the RVT framework to obtain full response spectral amplitudes. Additionally, the framework involves a stochastic-model-based extrapolation of individual Fourier spectra to extend the usable frequency limit of the empirically derived FAS model. The stochastic model parameters were determined by inverting the Fourier spectral data using an approach similar to the one described in Edwards and Faeh (2013). Comparison of median predicted response spectra from the present approach with those from other regional GMPEs indicates that the present approach can also be used as a stand-alone model. The dataset used for the presented analysis is a subset of the recently compiled database RESORCE-2012, covering Europe, the Middle East and the Mediterranean region.
The Parallel System for Integrating Impact Models and Sectors (pSIMS)
NASA Technical Reports Server (NTRS)
Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian
2014-01-01
We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.
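Item e) above, aggregation of gridded outputs to arbitrary spatial scales, can be sketched with a toy gridded field and a region mask (the function and data are illustrative, not pSIMS code):

```python
import numpy as np

def aggregate_by_region(grid, regions, weights=None):
    """Aggregate a gridded field to arbitrary demarcations: for each
    region id in the mask, take the (optionally weighted) mean of its cells."""
    if weights is None:
        weights = np.ones_like(grid)
    out = {}
    for rid in np.unique(regions):
        mask = regions == rid
        out[int(rid)] = float((grid[mask] * weights[mask]).sum() / weights[mask].sum())
    return out

# toy 2x4 yield grid and a region mask with two administrative units
yield_grid = np.array([[1.0, 2.0, 3.0, 4.0],
                       [5.0, 6.0, 7.0, 8.0]])
regions = np.array([[0, 0, 1, 1],
                    [0, 0, 1, 1]])
agg = aggregate_by_region(yield_grid, regions)
```

In practice the weights would carry cell areas or harvested fractions; the uniform weights here are the simplest assumption.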
On the Use of CAD and Cartesian Methods for Aerodynamic Optimization
NASA Technical Reports Server (NTRS)
Nemec, M.; Aftosmis, M. J.; Pulliam, T. H.
2004-01-01
The objective of this paper is to present the development of an optimization capability for Cart3D, a Cartesian inviscid-flow analysis package. We present the construction of a new optimization framework and we focus on the following issues: 1) a component-based geometry parameterization approach using parametric-CAD models and CAPRI. A novel geometry server is introduced that addresses the issue of parallel efficiency while only sparingly consuming CAD resources; 2) the use of genetic and gradient-based algorithms for three-dimensional aerodynamic design problems. The influence of noise on the optimization methods is studied. Our goal is to create a responsive and automated framework that efficiently identifies design modifications that result in substantial performance improvements. In addition, we examine the architectural issues associated with the deployment of a CAD-based approach in a heterogeneous parallel computing environment that contains both CAD workstations and dedicated compute engines. We demonstrate the effectiveness of the framework for a design problem that features topology changes and complex geometry.
Rethinking pattern formation in reaction-diffusion systems
NASA Astrophysics Data System (ADS)
Halatek, J.; Frey, E.
2018-05-01
The present theoretical framework for the analysis of pattern formation in complex systems is mostly limited to the vicinity of fixed (global) equilibria. Here we present a new theoretical approach to characterize dynamical states arbitrarily far from (global) equilibrium. We show that reaction-diffusion systems that are driven by locally mass-conserving interactions can be understood in terms of local equilibria of diffusively coupled compartments. Diffusive coupling generically induces lateral redistribution of the globally conserved quantities, and the variable local amounts of these quantities determine the local equilibria in each compartment. We find that, even far from global equilibrium, the system is well characterized by its moving local equilibria. We apply this framework to in vitro Min protein pattern formation, a paradigmatic model for biological pattern formation. Within our framework we can predict and explain transitions between chemical turbulence and order arbitrarily far from global equilibrium. Our results reveal conceptually new principles of self-organized pattern formation that may well govern diverse dynamical systems.
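The notion of locally mass-conserving interactions can be made concrete with a toy 1-D two-component system in which the reaction term only exchanges material between the components, so diffusion redistributes the conserved quantity but never changes the total (the kinetics below are hypothetical, not the Min-system model):

```python
import numpy as np

def lap(a, dx):
    """Discrete Laplacian with periodic boundaries."""
    return (np.roll(a, 1) - 2 * a + np.roll(a, -1)) / dx**2

def step(u, v, du, dv, dt, dx):
    """One explicit Euler step; the reaction term f appears with opposite
    signs in u and v, so the total mass of u + v is only redistributed."""
    f = v - u * v**2 / (1.0 + v**2)       # hypothetical exchange kinetics
    return (u + dt * (du * lap(u, dx) + f),
            v + dt * (dv * lap(v, dx) - f))

n = 64
rng = np.random.default_rng(2)
u = 1.0 + 0.1 * rng.standard_normal(n)
v = 1.0 + 0.1 * rng.standard_normal(n)
total0 = (u + v).sum()
for _ in range(200):
    u, v = step(u, v, du=0.01, dv=1.0, dt=0.01, dx=1.0)
total = (u + v).sum()
```

The local amount of u + v in each neighbourhood then sets the local equilibrium that the compartment relaxes towards, which is the key observation of the paper.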
NASA Astrophysics Data System (ADS)
Callear, Samantha K.; Ramirez-Cuesta, Anibal J.; David, William I. F.; Millange, Franck; Walton, Richard I.
2013-12-01
We present new high-resolution inelastic neutron scattering (INS) spectra (measured using the TOSCA and MARI instruments at ISIS) and powder neutron diffraction data (measured on the diffractometer WISH at ISIS) from the interaction of the prototypical metal-organic framework HKUST-1 with various dosages of dihydrogen gas. The INS spectra show direct evidence for the sequential occupation of various distinct sites for dihydrogen in the metal-organic framework, whose population is adjusted during increasing loading of the guest. The superior resolution of TOSCA reveals subtle features in the spectra, not previously reported, including evidence for split signals, while complementary spectra recorded on MARI present full information in energy and momentum transfer. The analysis of the powder neutron patterns using the Rietveld method shows a consistent picture, allowing the crystallographic identification of binding sites for dihydrogen, thus building a comprehensive picture of the interaction of the guest with the nanoporous host.
GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data
NASA Astrophysics Data System (ADS)
Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.
2016-08-01
The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.
ERIC Educational Resources Information Center
Sundaram, Vanita; Sauntson, Helen
2016-01-01
In this paper, we present an analysis of "pleasure" in sex and relationships education (SRE) in England. Drawing together two distinct sources of data and different but complementary analytical frameworks, we argue that pleasure is largely absent within SRE and that this discursive silence serves to produce highly gendered and…
An Interactive Assessment Framework for Visual Engagement: Statistical Analysis of a TEDx Video
ERIC Educational Resources Information Center
Farhan, Muhammad; Aslam, Muhammad
2017-01-01
This study aims to assess the visual engagement of the video lectures. This analysis can be useful for the presenter and student to find out the overall visual attention of the videos. For this purpose, a new algorithm and data collection module are developed. Videos can be transformed into a dataset with the help of data collection module. The…
Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)
NASA Astrophysics Data System (ADS)
Dubinskii, Yu A.; Osipenko, A. S.
2000-02-01
Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth; Kim, Hak
2014-01-01
An informative session on SRAM FPGA basics. We present a framework for fault injection techniques applied to Xilinx Field Programmable Gate Arrays (FPGAs), introduce an overlooked time component showing that fault injection is impractical as a stand-alone characterization tool for most real designs, and demonstrate procedures that benefit from fault injection error analysis.
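The overlooked time component can be illustrated with back-of-the-envelope arithmetic: an exhaustive campaign needs one injection plus one functional test per configuration bit, which quickly reaches weeks of wall-clock time. The bit count and per-run timings below are hypothetical round numbers, not figures from the presentation:

```python
def campaign_hours(config_bits, inject_s, test_s, runs_per_bit=1):
    """Estimated wall-clock hours to exhaustively fault-inject a design:
    one partial-reconfiguration (inject) plus one functional test per bit."""
    return config_bits * runs_per_bit * (inject_s + test_s) / 3600.0

# hypothetical mid-size SRAM FPGA: ~20 million configuration bits,
# 10 ms to flip a bit and 100 ms to run the functional test
hours = campaign_hours(20_000_000, inject_s=0.010, test_s=0.100)
days = hours / 24.0
```

Even with these optimistic per-run times the campaign runs for weeks, which is why fault injection is better used to sample and cross-check error analysis than as a stand-alone characterization tool.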
Analysis of an industry in transition.
Baliga, B R; Johnson, B
1986-12-01
The health care industry is undergoing major structural changes. The significance of these changes for individual competitors moving toward the 1990s is not yet clear. This article assesses the implications of the current changes by applying Porter's industry structure and generic strategy frameworks to the health care industry. Present trends are compared to this analysis to highlight areas where individual hospitals might improve their competitive positioning.
Lessons Learned From Developing A Streaming Data Framework for Scientific Analysis
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Allan, Mark; Curry, Charles
2003-01-01
We describe the development and usage of a streaming data analysis software framework. The framework is used for three different applications: Earth science hyper-spectral imaging analysis, Electromyograph pattern detection, and Electroencephalogram state determination. In each application the framework was used to answer a series of science questions which evolved with each subsequent answer. This evolution is summarized in the form of lessons learned.
Line transect estimation of population size: the exponential case with grouped data
Anderson, D.R.; Burnham, K.P.; Crain, B.R.
1979-01-01
Gates, Marshall, and Olson (1968) investigated the line transect method of estimating grouse population densities in the case where sighting probabilities are exponential. This work is followed by a simulation study in Gates (1969). A general overview of line transect analysis is presented by Burnham and Anderson (1976). These articles all deal with the ungrouped data case. In the present article, an analysis of line transect data is formulated under the Gates framework of exponential sighting probabilities and in the context of grouped data.
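As background for the exponential sighting model discussed above, the standard ungrouped-case line-transect relations (following Gates, Marshall, and Olson 1968) can be sketched, together with the multinomial cell probabilities that a grouped-data analysis works with; this is a summary of well-known results, not this article's derivation:

```latex
% Exponential sighting probability and perpendicular-distance density:
g(x) = e^{-x/\lambda}, \qquad
f(x) = \frac{1}{\lambda}\, e^{-x/\lambda}, \quad x \ge 0,
\qquad\text{so}\qquad
\hat{D} = \frac{n \hat{f}(0)}{2L} = \frac{n}{2L\hat{\lambda}}.

% Grouped data: with cutpoints 0 = c_0 < c_1 < \dots < c_m, the cell
% probabilities become
p_j(\lambda) = e^{-c_{j-1}/\lambda} - e^{-c_j/\lambda}, \quad j = 1,\dots,m,
% and \hat{\lambda} maximises the multinomial likelihood of the observed
% cell counts.
```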
Frequency analysis of uncertain structures using imprecise probability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Modares, Mehdi; Bergerson, Joshua
2015-01-01
Two new methods for finite element based frequency analysis of a structure with uncertainty are developed. An imprecise probability formulation based on enveloping p-boxes is used to quantify the uncertainty present in the mechanical characteristics of the structure. For each element, independent variations are considered. Using the two developed methods, P-box Frequency Analysis (PFA) and Interval Monte-Carlo Frequency Analysis (IMFA), sharp bounds on natural circular frequencies at different probability levels are obtained. These methods establish a framework for handling incomplete information in structural dynamics. Numerical example problems are presented that illustrate the capabilities of the new methods along with discussions on their computational efficiency.
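A minimal sketch of the interval Monte-Carlo idea for a single-degree-of-freedom system; all names and numbers here are hypothetical, and the actual PFA/IMFA methods operate on finite element models with element-wise p-boxes:

```python
import math
import random

def interval_mc_frequency(n_samples=1000, m=2.0,
                          k_mean_lo=900.0, k_mean_hi=1100.0, k_cov=0.05):
    """Interval Monte Carlo sketch: the stiffness mean is only known to lie
    in [k_mean_lo, k_mean_hi] (a crude p-box), so each sampled realization
    yields an *interval* of natural frequencies rather than a point value."""
    random.seed(0)
    freq_lo, freq_hi = [], []
    for _ in range(n_samples):
        z = random.gauss(0.0, 1.0)              # shared standard-normal draw
        k_a = k_mean_lo * (1.0 + k_cov * z)     # lower-envelope realization
        k_b = k_mean_hi * (1.0 + k_cov * z)     # upper-envelope realization
        k_lo, k_hi = sorted((max(k_a, 1e-9), max(k_b, 1e-9)))
        freq_lo.append(math.sqrt(k_lo / m))     # omega = sqrt(k/m)
        freq_hi.append(math.sqrt(k_hi / m))
    freq_lo.sort()
    freq_hi.sort()
    return freq_lo, freq_hi                     # empirical CDF bounds on omega

freq_lo, freq_hi = interval_mc_frequency()
```

The two sorted lists bound the frequency CDF: at each probability level the true natural frequency lies between the corresponding entries.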
This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...
Parallel Index and Query for Large Scale Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Ruebel, Oliver
2011-07-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to processing of a massive 50TB dataset generated by a large scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
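The bin-based bitmap indexing that underlies FastBit-style query acceleration can be sketched as follows. This is a simplified illustration, not the FastQuery API; FastBit additionally compresses its bitmaps (WAH encoding) and handles bins that only partially overlap the query range:

```python
def build_bitmap_index(values, bin_edges):
    """One bitmap (here a set of row ids) per [edge[b], edge[b+1]) bin."""
    bitmaps = [set() for _ in range(len(bin_edges) - 1)]
    for row, v in enumerate(values):
        for b in range(len(bin_edges) - 1):
            if bin_edges[b] <= v < bin_edges[b + 1]:
                bitmaps[b].add(row)
                break
    return bitmaps

def range_query(bitmaps, bin_edges, lo, hi):
    """Answer a range query by OR-ing the bitmaps of bins fully inside
    [lo, hi), instead of scanning the raw data."""
    hits = set()
    for b in range(len(bin_edges) - 1):
        if bin_edges[b] >= lo and bin_edges[b + 1] <= hi:
            hits |= bitmaps[b]
    return hits

vals = [0.5, 1.5, 2.5, 3.5, 1.2]     # e.g. particle energies (hypothetical)
edges = [0, 1, 2, 3, 4]
index = build_bitmap_index(vals, edges)
hits = range_query(index, edges, 1, 3)
```

Because the bitmaps are precomputed once, each query costs only a few set unions, which is what makes interactive exploration of large datasets feasible.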
A framework for longitudinal data analysis via shape regression
NASA Astrophysics Data System (ADS)
Fishbaugh, James; Durrleman, Stanley; Piven, Joseph; Gerig, Guido
2012-02-01
Traditional longitudinal analysis begins by extracting desired clinical measurements, such as volume or head circumference, from discrete imaging data. Typically, the continuous evolution of a scalar measurement is estimated by choosing a 1D regression model, such as kernel regression or fitting a polynomial of fixed degree. This type of analysis not only leads to separate models for each measurement, but there is no clear anatomical or biological interpretation to aid in the selection of the appropriate paradigm. In this paper, we propose a consistent framework for the analysis of longitudinal data by estimating the continuous evolution of shape over time as twice differentiable flows of deformations. In contrast to 1D regression models, one model is chosen to realistically capture the growth of anatomical structures. From the continuous evolution of shape, we can simply extract any clinical measurements of interest. We demonstrate on real anatomical surfaces that volume extracted from a continuous shape evolution is consistent with a 1D regression performed on the discrete measurements. We further show how the visualization of shape progression can aid in the search for significant measurements. Finally, we present an example on a shape complex of the brain (left hemisphere, right hemisphere, cerebellum) that demonstrates a potential clinical application for our framework.
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.
Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel
2018-02-20
Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints
Navet, Nicolas; Havet, Lionel
2018-01-01
Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489
Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.
Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark
2017-12-01
A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
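Steps (3) to (6) of the QWoE scheme can be sketched numerically. The functions, study scores, LoE names and weights below are hypothetical illustrations of the 0-4 scoring idea, not values prescribed by the framework:

```python
def loe_strength(studies):
    """Strength of one line of evidence (LoE): mean of per-study
    quality x relevance products, rescaled back onto the 0-4 scale."""
    return sum(q * r for q, r in studies) / (4.0 * len(studies))

def overall_strength(loes, weights):
    """Step (6): integrate LoE strengths using the weights assigned
    in step (1) (weights sum to 1)."""
    return sum(w * loe_strength(s) for s, w in zip(loes, weights))

# (quality, relevance) scores, each 0-4, for the studies in two
# hypothetical lines of evidence:
transcriptomics = [(4, 3), (3, 3)]
apical_endpoints = [(4, 4), (2, 3)]
score = overall_strength([transcriptomics, apical_endpoints], [0.4, 0.6])
```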
Yang, Meng; Qian, Xin; Zhang, Yuchao; Sheng, Jinbao; Shen, Dengle; Ge, Yi
2011-01-01
Approximately 30,000 dams in China are aging and are considered to be high-level risks. Developing a framework for analyzing spatial multicriteria flood risk is crucial to ranking management scenarios for these dams, especially in densely populated areas. Based on the theories of spatial multicriteria decision analysis, this report generalizes a framework consisting of scenario definition, problem structuring, criteria construction, spatial quantification of criteria, criteria weighting, decision rules, sensitivity analyses, and scenario appraisal. The framework is presented in detail by using a case study to rank dam rehabilitation, decommissioning and existing-condition scenarios. The results show that there was a serious inundation, and that a dam rehabilitation scenario could reduce the multicriteria flood risk by 0.25 in the most affected areas; this indicates a mean risk decrease of less than 23%. Although increased risk (<0.20) was found for some residential and commercial buildings, if the dam were to be decommissioned, the mean risk would not be greater than the current existing risk, indicating that the dam rehabilitation scenario had a higher rank for decreasing the flood risk than the decommissioning scenario, but that dam rehabilitation alone might be of little help in abating flood risk. With adjustments and improvement to the specific methods (according to the circumstances and available data) this framework may be applied to other sites. PMID:21655125
Chiabai, Aline; Quiroga, Sonia; Martinez-Juarez, Pablo; Higgins, Sahran; Taylor, Tim
2018-09-01
This paper addresses the impact that changes in natural ecosystems can have on health and wellbeing, focusing on the potential co-benefits that green spaces could provide when introduced as climate change adaptation measures. Ignoring such benefits could lead to sub-optimal planning and decision-making. A conceptual framework, building on the ecosystem-enriched Driver, Pressure, State, Exposure, Effect, Action model (eDPSEEA), is presented to aid in clarifying the relational structure between green spaces and human health, taking climate change as the key driver. The study has the double intention of (i) summarising the literature with a special emphasis on the ecosystem and health perspectives, as well as the main theories behind these impacts, and (ii) modelling these findings into a framework that allows for multidisciplinary approaches to the underlying relations between human health and green spaces. The paper shows that while the literature based on the ecosystem perspective presents a well-documented association between climate, health and green spaces, the literature using a health-based perspective presents mixed evidence in some cases. The role of contextual factors and the exposure mechanism are rarely addressed. The proposed framework could serve as a multidisciplinary knowledge platform for multi-perspective analysis and discussion among experts and stakeholders, as well as to support the operationalization of quantitative assessment and modelling exercises. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.
2012-01-01
An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
Rushton, A; Rivett, D; Carlesso, L; Flynn, T; Hing, W; Kerry, R
2014-06-01
A consensus clinical reasoning framework for best practice for the examination of the cervical spine region has been developed through an iterative consultative process with experts and manual physical therapy organisations. The framework was approved by the 22 member countries of the International Federation of Orthopaedic Manipulative Physical Therapists (October 2012). The purpose of the framework is to provide guidance to clinicians for the assessment of the cervical region for potential of Cervical Arterial Dysfunction in advance of planned management (inclusive of manual therapy and exercise interventions). The best, most recent scientific evidence is combined with international expert opinion, and is presented with the intention to be informative, but not prescriptive; and therefore as an aid to the clinician's clinical reasoning. Important underlying principles of the framework are that 1] although presentations and adverse events of Cervical Arterial Dysfunction are rare, it is a potentially serious condition and needs to be considered in musculoskeletal assessment; 2] manual therapists cannot rely on the results of one clinical test to draw conclusions as to the presence or risk of Cervical Arterial Dysfunction; and 3] a clinically reasoned understanding of the patient's presentation, including a risk:benefit analysis, following an informed, planned and individualised assessment, is essential for recognition of this condition and for safe manual therapy practice in the cervical region. Clinicians should also be cognisant of jurisdictionally specific requirements and obligations, particularly related to patient informed consent, when intending to use manual therapy in the cervical region. Copyright © 2013 Elsevier Ltd. All rights reserved.
Past, present, and future design of urban drainage systems with focus on Danish experiences.
Arnbjerg-Nielsen, K
2011-01-01
Climate change will influence the water cycle substantially, and extreme precipitation will become more frequent in many regions in the years to come. How should this fact be incorporated into design of urban drainage systems, if at all? And how important is climate change compared to other changes over time? Based on an analysis of the underlying key drivers of changes that are expected to affect urban drainage systems the current problems and their predicted development over time are presented. One key issue is management of risk and uncertainties and therefore a framework for design and analysis of urban structures in light of present and future uncertainties is presented.
A conceptual framework and classification of capability areas for business process maturity
NASA Astrophysics Data System (ADS)
Van Looy, Amy; De Backer, Manu; Poels, Geert
2014-03-01
The article elaborates on business process maturity, which indicates how well an organisation can perform based on its business processes, i.e. on its way of working. This topic is of paramount importance for managers who try to excel in today's competitive world. Hence, business process maturity is an emerging research field. However, no consensus exists on the capability areas (or skills) needed to excel. Moreover, their theoretical foundation and synergies with other fields are frequently neglected. To overcome this gap, our study presents a conceptual framework with six main capability areas and 17 sub areas. It draws on theories regarding the traditional business process lifecycle, which are supplemented by recognised organisation management theories. The comprehensiveness of this framework is validated by mapping 69 business process maturity models (BPMMs) to the identified capability areas, based on content analysis. Nonetheless, as a consensus neither exists among the collected BPMMs, a classification of different maturity types is proposed, based on cluster analysis and discriminant analysis. Consequently, the findings contribute to the grounding of business process literature. Possible future avenues are evaluating existing BPMMs, directing new BPMMs or investigating which combinations of capability areas (i.e. maturity types) contribute more to performance than others.
Politi, Liran; Codish, Shlomi; Sagy, Iftach; Fink, Lior
2014-12-01
Insights about patterns of system use are often gained through the analysis of system log files, which record the actual behavior of users. In a clinical context, however, few attempts have been made to typify system use through log file analysis. The present study offers a framework for identifying, describing, and discerning among patterns of use of a clinical information retrieval system. We use the session attributes of volume, diversity, granularity, duration, and content to define a multidimensional space in which each specific session can be positioned. We also describe an analytical method for identifying the common archetypes of system use in this multidimensional space. We demonstrate the value of the proposed framework with a log file of the use of a health information exchange (HIE) system by physicians in an emergency department (ED) of a large Israeli hospital. The analysis reveals five distinct patterns of system use, which have yet to be described in the relevant literature. The results of this study have the potential to inform the design of HIE systems for efficient and effective use, thus increasing their contribution to the clinical decision-making process. Copyright © 2014 Elsevier Inc. All rights reserved.
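The profiling component described above (session attribute vectors plus clustering) can be sketched with a toy k-means over two of the five session attributes; the data, attribute scaling and cluster count are hypothetical:

```python
import random

def kmeans(points, k, iters=20, seed=1):
    """Minimal Lloyd's k-means over tuples of session attributes."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each session vector to its nearest center
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # recompute centers (keep old center if a cluster empties)
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl
                   else centers[i] for i, cl in enumerate(clusters)]
    return centers, clusters

# Normalised (volume, duration) per session: two obvious archetypes,
# brief low-volume lookups vs. long high-volume reviews (hypothetical).
sessions = [(0.1, 0.2), (0.15, 0.1), (0.2, 0.25),
            (0.9, 0.8), (0.85, 0.9), (0.95, 0.85)]
centers, clusters = kmeans(sessions, k=2)
```

In the study's setting each vector would carry all five attributes (volume, diversity, granularity, duration, content), and the resulting cluster centers are the candidate "archetypes" of system use.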
Factors affecting construction performance: exploratory factor analysis
NASA Astrophysics Data System (ADS)
Soewin, E.; Chinda, T.
2018-04-01
The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors with 57 items affecting construction performance. The findings further reveal ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important to understand a multidimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that management can plan and implement an effective performance development plan to match the mission and vision of the company.
NASA Astrophysics Data System (ADS)
St-Onge, Guillaume; Young, Jean-Gabriel; Laurence, Edward; Murphy, Charles; Dubé, Louis J.
2018-02-01
We present a degree-based theoretical framework to study the susceptible-infected-susceptible (SIS) dynamics on time-varying (rewired) configuration model networks. Using this framework on a given degree distribution, we provide a detailed analysis of the stationary state using the rewiring rate to explore the whole range of the time variation of the structure relative to that of the SIS process. This analysis is suitable for the characterization of the phase transition and leads to three main contributions: (1) We obtain a self-consistent expression for the absorbing-state threshold, able to capture both collective and hub activation. (2) We recover the predictions of a number of existing approaches as limiting cases of our analysis, providing thereby a unifying point of view for the SIS dynamics on random networks. (3) We obtain bounds for the critical exponents of a number of quantities in the stationary state. This allows us to reinterpret the concept of hub-dominated phase transition. Within our framework, it appears as a heterogeneous critical phenomenon: observables for different degree classes have a different scaling with the infection rate. This phenomenon is followed by the successive activation of the degree classes beyond the epidemic threshold.
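For orientation, the classic degree-based (heterogeneous) mean-field description of SIS dynamics on an annealed configuration-model network, with unit recovery rate, reads as follows; this is the standard limiting case recovered by such frameworks, not the authors' rewired-network formulation:

```latex
% rho_k: prevalence among nodes of degree k; beta: infection rate;
% P(k): degree distribution.
\frac{d\rho_k}{dt} = -\rho_k + \beta k \,(1-\rho_k)\,\Theta,
\qquad
\Theta = \frac{1}{\langle k \rangle} \sum_{k'} k' P(k')\, \rho_{k'},

% with epidemic threshold
\beta_c = \frac{\langle k \rangle}{\langle k^2 \rangle}.
```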
Testing the single-state dominance hypothesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Álvarez-Rodríguez, R.; Moreno, O.; Moya de Guerra, E.
2013-12-30
We present a theoretical analysis of the single-state dominance hypothesis for the two-neutrino double-beta decay process. The theoretical framework is a proton-neutron QRPA based on a deformed Hartree-Fock mean field with BCS pairing correlations. We focus on the decays of ¹⁰⁰Mo, ¹¹⁶Cd and ¹²⁸Te. We do not find clear evidence for single-state dominance within the present approach.
NASA Technical Reports Server (NTRS)
Alexander, Tiffaney Miller
2017-01-01
Research results have shown that more than half of aviation, aerospace and aeronautics mishap incidents are attributed to human error. As a part of quality assurance within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as analysis tools to identify contributing factors, their impact on human error events, and to predict the human error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of launch-vehicle-related mishap data. This modifiable framework can be used and followed by other space and similarly complex operations.
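The HEART quantification step can be sketched as follows (the HFACS classification side is qualitative). The formula is the standard published HEART calculation, but the nominal probability and the error-producing-condition (EPC) multipliers and assessed proportions below are illustrative, not values from the NASA study:

```python
def heart_hep(nominal_hep, epcs):
    """HEART human error probability: the generic-task nominal HEP is
    multiplied, for each EPC, by (max_multiplier - 1) * proportion + 1,
    where proportion is the analyst's assessed proportion of affect."""
    hep = nominal_hep
    for multiplier, proportion in epcs:
        hep *= (multiplier - 1.0) * proportion + 1.0
    return min(hep, 1.0)   # probabilities are capped at 1

# Hypothetical task: nominal HEP 0.003, with two EPCs --
# one x11 condition judged 40% applicable, one x8 condition at 25%.
hep = heart_hep(0.003, [(11, 0.4), (8, 0.25)])
```

Here the two EPC factors are 5.0 and 2.75, giving HEP = 0.003 x 5.0 x 2.75 = 0.04125.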
NASA Astrophysics Data System (ADS)
Chęciński, Jakub; Frankowski, Marek
2016-10-01
We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for the analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
Creativity as action: findings from five creative domains
Glaveanu, Vlad; Lubart, Todd; Bonnardel, Nathalie; Botella, Marion; de Biaisi, Pierre-Marc; Desainte-Catherine, Myriam; Georgsdottir, Asta; Guillou, Katell; Kurtag, Gyorgy; Mouchiroud, Christophe; Storme, Martin; Wojtczuk, Alicja; Zenasni, Franck
2013-01-01
The present paper outlines an action theory of creativity and substantiates this approach by investigating creative expression in five different domains. We propose an action framework for the analysis of creative acts built on the assumption that creativity is a relational, inter-subjective phenomenon. This framework, drawing extensively from the work of Dewey (1934) on art as experience, is used to derive a coding frame for the analysis of interview material. The article reports findings from the analysis of 60 interviews with recognized French creators in five creative domains: art, design, science, scriptwriting, and music. Results point to complex models of action and inter-action specific for each domain and also to interesting patterns of similarity and differences between domains. These findings highlight the fact that creative action takes place not “inside” individual creators but “in between” actors and their environment. Implications for the field of educational psychology are discussed. PMID:23596431
NASA Technical Reports Server (NTRS)
Gorski, K. M.; Hivon, Eric; Banday, A. J.; Wandelt, Benjamin D.; Hansen, Frode K.; Reinecke, Martin; Bartelmann, Matthias
2005-01-01
HEALPix, the Hierarchical Equal Area isoLatitude Pixelization, is a versatile structure for the pixelization of data on the sphere. An associated library of computational algorithms and visualization software supports fast scientific applications executable directly on discretized spherical maps generated from very large volumes of astronomical data. Originally developed to address the data processing and analysis needs of the present generation of cosmic microwave background experiments (e.g., BOOMERANG, WMAP), HEALPix can be expanded to meet many of the profound challenges that will arise in confrontation with the observational output of future missions and experiments, including, e.g., Planck, Herschel, SAFIR, and the Beyond Einstein inflation probe. In this paper we consider the requirements and implementation constraints on a framework that simultaneously enables an efficient discretization with associated hierarchical indexation and fast analysis/synthesis of functions defined on the sphere. We demonstrate how these are explicitly satisfied by HEALPix.
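Two basic bookkeeping relations from the published HEALPix scheme, sketched in code: the resolution parameter Nside fixes the pixel count as Npix = 12 Nside², and because the tessellation is equal-area every pixel covers the same solid angle:

```python
import math

def npix(nside):
    """Total pixel count for a given HEALPix resolution parameter Nside."""
    return 12 * nside * nside

def pixel_area_sr(nside):
    """Area of each (equal-area) pixel in steradians: 4*pi / Npix."""
    return 4.0 * math.pi / npix(nside)

base = npix(1)        # the base tessellation has 12 pixels
fine = npix(8)        # each doubling of Nside quadruples the pixel count
```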
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoette, Trisha Marie
Throughout history, as new chemical threats arose, strategies for the defense against chemical attacks have also evolved. As a part of an Early Career Laboratory Directed Research and Development project, a systems analysis of past, present, and future chemical terrorism scenarios was performed to understand how chemical threats and attack strategies change over time. For the analysis, the difficulty in executing a chemical attack was evaluated within a framework of three major scenario elements. First, historical examples of chemical terrorism were examined to determine how the use of chemical threats, versus other weapons, contributed to the successful execution of the attack. Using the same framework, the future of chemical terrorism was assessed with respect to the impact of globalization and new technologies. Finally, the efficacy of current defenses against contemporary chemical terrorism was considered briefly. The results of this analysis justify the need for continued diligence in chemical defense.
Bayesian analysis of rare events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
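The rejection-sampling reinterpretation at the heart of BUS can be sketched with a toy one-dimensional example (standard-normal prior, Gaussian likelihood, all numbers hypothetical). Real BUS replaces the brute-force loop below with FORM, IS or SuS, which is what makes genuinely rare events tractable:

```python
import math
import random

def bus_rejection(n, threshold, obs, obs_std, seed=0):
    """Draw from the prior, accept each draw with probability L(theta)/c
    (c bounds the likelihood), and estimate the rare-event probability
    from the accepted -- i.e. posterior -- samples."""
    rng = random.Random(seed)
    norm = obs_std * math.sqrt(2.0 * math.pi)
    c = 1.0 / norm                                  # max of the Gaussian likelihood
    accepted, failures = 0, 0
    for _ in range(n):
        theta = rng.gauss(0.0, 1.0)                 # prior: standard normal
        like = math.exp(-0.5 * ((obs - theta) / obs_std) ** 2) / norm
        if rng.random() <= like / c:                # rejection step
            accepted += 1
            if theta > threshold:                   # "rare event": theta exceeds threshold
                failures += 1
    return failures / accepted if accepted else float("nan")

# Posterior here is N(0.8, 0.2), so P(theta > 2 | data) is about 0.0037.
p_post = bus_rejection(n=200_000, threshold=2.0, obs=1.0, obs_std=0.5)
```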
How equity is addressed in clinical practice guidelines: a content analysis
Shi, Chunhu; Tian, Jinhui; Wang, Quan; Petkovic, Jennifer; Ren, Dan; Yang, Kehu; Yang, Yang
2014-01-01
Objectives Incorporating equity into guidelines presents methodological challenges. This study aims to qualitatively synthesise the methods for incorporating equity in clinical practice guidelines (CPGs). Setting Content analysis of methodological publications. Eligibility criteria for selecting studies Methodological publications were included if they provided checklists/frameworks on when, how and to what extent equity should be incorporated in CPGs. Data sources We electronically searched MEDLINE, retrieved references, and browsed guideline development organisation websites from inception to January 2013. After study selection by two authors, general characteristics and checklist items/framework components from included studies were extracted. Based on the questions or items from checklists/frameworks (unit of analysis), content analysis was conducted to identify themes, and questions/items were grouped into these themes. Primary outcomes The primary outcomes were methodological themes and processes on how to address equity issues in guideline development. Results 8 studies with 10 publications were included from 3405 citations. In total, a list of 87 questions/items was generated from 17 checklists/frameworks. After content analysis, questions were grouped into eight themes (‘scoping questions’, ‘searching relevant evidence’, ‘appraising evidence and recommendations’, ‘formulating recommendations’, ‘monitoring implementation’, ‘providing a flow chart to include equity in CPGs’, and ‘others: reporting of guidelines and comments from stakeholders’ for CPG developers and ‘assessing the quality of CPGs’ for CPG users). Four included studies covered more than five of these themes. We also summarised the process of guideline development based on the themes mentioned above.
Conclusions For disadvantaged population-specific CPGs, eight important methodological issues identified in this review should be considered when including equity in CPGs under the guidance of a scientific guideline development manual. PMID:25479795
A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection
Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram
2012-01-01
Patient medical records today contain a vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify in a timely manner heavy users of the systems and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection: the ability to identify anomalous utilization cases where the patients incurred levels of utilization that are unexpected given their clinical characteristics which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map a patient’s clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. 
In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among medical encounters to provide more in-depth insights. PMID:23304306
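The two components can be sketched on toy data. All profiles and comorbidity counts below are hypothetical, k-means stands in for the paper's unspecified clustering technique, and a one-variable linear regression stands in for its utilization models; the point is the shape of the pipeline, not the actual method:

```python
import math

# Toy utilization profiles: (outpatient, inpatient, ER) visit counts per patient.
profiles = [(12, 0, 1), (10, 1, 0), (11, 0, 2),   # routine utilization
            (30, 5, 8), (28, 6, 7), (33, 4, 9)]   # heavy utilization

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k=2, iters=20):
    # Deterministic toy initialization (k = 2): first and last profile as centers.
    centers = [points[0], points[-1]]
    groups = []
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist(p, centers[i]))].append(p)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

centers, groups = kmeans(profiles)
# "Hot spot" cluster: the group with the higher average total utilization.
hot = max(groups, key=lambda g: sum(sum(p) for p in g) / max(len(g), 1))

# Contextual anomaly detection: regress total visits on a comorbidity count and
# flag patients whose utilization deviates strongly from the expected level.
comorbidities = [1, 1, 1, 4, 4, 1]        # last patient: low burden, heavy use
totals = [sum(p) for p in profiles]
n = len(totals)
mx, my = sum(comorbidities) / n, sum(totals) / n
b = (sum((x - mx) * (y - my) for x, y in zip(comorbidities, totals))
     / sum((x - mx) ** 2 for x in comorbidities))
a = my - b * mx
residuals = [y - (a + b * x) for x, y in zip(comorbidities, totals)]
sd = math.sqrt(sum(r * r for r in residuals) / n)
anomalies = [i for i, r in enumerate(residuals) if abs(r) > 1.5 * sd]
```

The last patient is heavy overall (hot spotting would catch the last three) but is anomalous *in context*: their utilization far exceeds what the comorbidity regression predicts.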
A Framework for Integrating Oceanographic Data Repositories
NASA Astrophysics Data System (ADS)
Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.
2010-12-01
Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, a large set of quality-controlled ocean profile measurements, and a biogeographic search service. 
S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.
Ince, Robin A A; Giordano, Bruno L; Kayser, Christoph; Rousselet, Guillaume A; Gross, Joachim; Schyns, Philippe G
2017-03-01
We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541-1573, 2017. © 2016 Wiley Periodicals, Inc. 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
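The core estimator — rank-transform each variable to uniforms, map through the inverse normal CDF, and plug the resulting correlation into the closed-form Gaussian entropy — can be sketched for a single pair of continuous variables. This is a minimal sketch of the idea, not the published multivariate toolbox:

```python
import math
import random
from statistics import NormalDist

random.seed(42)

def copula_normalize(x):
    """Rank-transform to uniforms in (0, 1), then map through the inverse normal CDF."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    ranks = [0.0] * n
    for r, i in enumerate(order):
        ranks[i] = (r + 1) / (n + 1)          # empirical CDF, ties-free toy case
    nd = NormalDist()
    return [nd.inv_cdf(u) for u in ranks]

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def gaussian_copula_mi(x, y):
    """Mutual information (bits) under a Gaussian copula: I = -0.5 * log2(1 - r^2)."""
    r = pearson(copula_normalize(x), copula_normalize(y))
    return -0.5 * math.log2(1.0 - r * r)

x = [random.gauss(0, 1) for _ in range(2000)]
y = [xi + random.gauss(0, 1) for xi in x]      # dependent: true MI = 0.5 bit
z = [random.gauss(0, 1) for _ in range(2000)]  # independent of x
mi_xy = gaussian_copula_mi(x, y)
mi_xz = gaussian_copula_mi(x, z)
```

Because only the ranks enter the estimate, it is robust to monotonic transformations of either variable, which is part of what makes the approach attractive for heterogeneous neuroimaging signals.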
StreamExplorer: A Multi-Stage System for Visually Exploring Events in Social Streams.
Wu, Yingcai; Chen, Zhutian; Sun, Guodao; Xie, Xiao; Cao, Nan; Liu, Shixia; Cui, Weiwei
2017-10-18
Analyzing social streams is important for many applications, such as crisis management. However, the considerable diversity, increasing volume, and high dynamics of social streams of large events continue to be significant challenges that must be overcome to ensure effective exploration. We propose a novel framework by which to handle complex social streams on a budget PC. This framework features two components: 1) an online method to detect important time periods (i.e., subevents), and 2) a tailored GPU-assisted Self-Organizing Map (SOM) method, which clusters the tweets of subevents stably and efficiently. Based on the framework, we present StreamExplorer to facilitate the visual analysis, tracking, and comparison of a social stream at three levels. At a macroscopic level, StreamExplorer uses a new glyph-based timeline visualization, which presents a quick multi-faceted overview of the ebb and flow of a social stream. At a mesoscopic level, a map visualization is employed to visually summarize the social stream from either a topical or geographical aspect. At a microscopic level, users can employ interactive lenses to visually examine and explore the social stream from different perspectives. Two case studies and a task-based evaluation are used to demonstrate the effectiveness and usefulness of StreamExplorer.
Multivariate Methods for Meta-Analysis of Genetic Association Studies.
Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G
2018-01-01
Multivariate meta-analysis of genetic association studies and genome-wide association studies has received a remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and we then scrutinize multivariate methodologies. Multivariate models of meta-analysis for a single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg Equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications and methodologies that could be developed in the future are discussed. Links for all available software implementing multivariate meta-analysis methods are also provided.
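As a baseline for the multivariate models the review covers, the univariate fixed-effect (inverse-variance) pooling step can be sketched with hypothetical per-study effects; multivariate meta-analysis generalizes the scalars below to effect vectors and covariance matrices:

```python
import math

# Hypothetical per-study effects: (log odds ratio, standard error) per study.
studies = [(0.30, 0.12), (0.25, 0.20), (0.45, 0.15)]

def fixed_effect_meta(studies):
    """Univariate fixed-effect (inverse-variance) pooling of effect sizes."""
    weights = [1.0 / se ** 2 for _, se in studies]     # precision weights
    pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled

est, se = fixed_effect_meta(studies)
ci = (est - 1.96 * se, est + 1.96 * se)   # 95% confidence interval
```

Precise studies dominate the pooled estimate; the multivariate extension gains further precision by borrowing strength across correlated outcomes (e.g., multiple linked polymorphisms).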
Yadav, Ram Bharos; Srivastava, Subodh; Srivastava, Rajeev
2016-01-01
The proposed framework is obtained by casting the noise removal problem into a variational framework. This framework automatically identifies the various types of noise present in the magnetic resonance image and filters them by choosing an appropriate filter. This filter includes two terms: the first term is a data likelihood term and the second term is a prior function. The first term is obtained by minimizing the negative log likelihood of the corresponding probability density functions: Gaussian or Rayleigh or Rician. Further, due to the ill-posedness of the likelihood term, a prior function is needed. This paper examines three partial differential equation based priors which include a total variation based prior, an anisotropic diffusion based prior, and a complex diffusion (CD) based prior. A regularization parameter is used to balance the trade-off between the data fidelity term and the prior. The finite difference scheme is used for discretization of the proposed method. The performance analysis and comparative study of the proposed method with other standard methods is presented for the BrainWeb dataset at varying noise levels in terms of peak signal-to-noise ratio, mean square error, structure similarity index map, and correlation parameter. From the simulation results, it is observed that the proposed framework with the CD based prior performs better than the other priors under consideration.
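A minimal 1D analogue of this variational setup — a Gaussian data-fidelity term plus a (smoothed) total-variation prior, minimized by gradient descent on a finite-difference discretization — illustrates the trade-off the regularization parameter controls. The step size, smoothing constant, and signal below are illustrative choices, not the paper's 2D MRI pipeline:

```python
import math

def tv_denoise_1d(f, lam=0.5, step=0.05, iters=2000, eps=1e-2):
    """Minimize sum((u-f)^2) + lam * sum(sqrt((u[i+1]-u[i])^2 + eps)) by gradient descent.

    eps smooths |d| near zero so the TV gradient d/sqrt(d^2 + eps) stays bounded.
    """
    u = list(f)
    n = len(u)
    for _ in range(iters):
        grad = [2.0 * (u[i] - f[i]) for i in range(n)]   # data-fidelity gradient
        for i in range(n - 1):                            # smoothed-TV gradient
            d = u[i + 1] - u[i]
            g = lam * d / math.sqrt(d * d + eps)
            grad[i] -= g
            grad[i + 1] += g
        u = [ui - step * gi for ui, gi in zip(u, grad)]
    return u

# Noisy step signal: TV regularization suppresses noise while keeping the edge.
noisy = [0.1, -0.05, 0.08, -0.1, 1.05, 0.92, 1.1, 0.95]
clean = tv_denoise_1d(noisy)
```

Raising `lam` flattens the plateaus further but also starts to shrink the jump itself, which is exactly the fidelity-versus-prior balance the abstract describes.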
A Transparent and Transferable Framework for Tracking Quality Information in Large Datasets
Smith, Derek E.; Metzger, Stefan; Taylor, Jeffrey R.
2014-01-01
The ability to evaluate the validity of data is essential to any investigation, and manual “eyes on” assessments of data quality have dominated in the past. Yet, as the size of collected data continues to increase, so does the effort required to assess their quality. This challenge is of particular concern for networks that automate their data collection, and has resulted in the automation of many quality assurance and quality control analyses. Unfortunately, the interpretation of the resulting data quality flags can become quite challenging with large data sets. We have developed a framework to summarize data quality information and facilitate interpretation by the user. Our framework consists of first compiling data quality information and then presenting it through 2 separate mechanisms; a quality report and a quality summary. The quality report presents the results of specific quality analyses as they relate to individual observations, while the quality summary takes a spatial or temporal aggregate of each quality analysis and provides a summary of the results. Included in the quality summary is a final quality flag, which further condenses data quality information to assess whether a data product is valid or not. This framework has the added flexibility to allow “eyes on” information on data quality to be incorporated for many data types. Furthermore, this framework can aid problem tracking and resolution, should sensor or system malfunctions arise. PMID:25379884
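The report/summary split can be sketched directly: per-observation test outcomes are the quality report, aggregated pass rates form the quality summary, and a final flag condenses them. The observations, test names, and the 90% threshold below are illustrative assumptions, not the framework's actual rules:

```python
# Quality report: each observation carries the outcome of individual QA/QC tests.
observations = [
    {"value": 21.3, "range_test": "pass", "spike_test": "pass"},
    {"value": 21.4, "range_test": "pass", "spike_test": "pass"},
    {"value": 85.0, "range_test": "fail", "spike_test": "fail"},  # bad sensor reading
    {"value": 21.2, "range_test": "pass", "spike_test": "pass"},
]

def quality_summary(obs, tests, threshold=0.9):
    """Aggregate per-test pass rates and condense them into a final quality flag."""
    summary = {}
    for t in tests:
        passed = sum(1 for o in obs if o[t] == "pass")
        summary[t] = passed / len(obs)
    # Final flag: valid only if every test's pass rate meets the threshold.
    summary["final_flag"] = ("valid" if all(summary[t] >= threshold for t in tests)
                             else "suspect")
    return summary

s = quality_summary(observations, ["range_test", "spike_test"])
```

A user who only needs a go/no-go decision reads `final_flag`; one tracking down a sensor malfunction drills back into the per-observation report.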
NASA Astrophysics Data System (ADS)
Lev, S. M.; Gallo, J.
2017-12-01
The international Arctic scientific community has identified the need for a sustained and integrated portfolio of pan-Arctic Earth-observing systems. In 2017, an international effort was undertaken to develop the first ever Value Tree framework for identifying common research and operational objectives that rely on Earth observation data derived from Earth-observing systems, sensors, surveys, networks, models, and databases to deliver societal benefits in the Arctic. A Value Tree Analysis is a common tool used to support decision making processes and is useful for defining concepts, identifying objectives, and creating a hierarchical framework of objectives. A multi-level societal benefit area value tree establishes the connection from societal benefits to the set of observation inputs that contribute to delivering those benefits. A Value Tree that relies on expert domain knowledge from Arctic and non-Arctic nations, international researchers, Indigenous knowledge holders, and other experts to develop a framework to serve as a logical and interdependent decision support tool will be presented. Value tree examples that map the contribution of Earth observations in the Arctic to achieving societal benefits will be presented in the context of the 2017 International Arctic Observations Assessment Framework. These case studies will highlight specific observing products and capability groups where investment is needed to contribute to the development of a sustained portfolio of Arctic observing systems.
A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
NASA Astrophysics Data System (ADS)
Sadegh, M.; Moftakhari, H.; AghaKouchak, A.
2017-12-01
Many natural hazards are driven by multiple forcing variables, and concurrent/consecutive extreme events significantly increase the risk of infrastructure/system failure. It is a common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST) that comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return period and design levels and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas, 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities, 3. weighted average scenario analysis based on an expected event, and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
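The gap between univariate and joint ("AND") return periods that motivates this kind of multivariate analysis can be illustrated with a Monte Carlo Gaussian-copula sketch. MhAST itself fits 17 marginal families and 26 copulas; the single Gaussian copula and the 0.8 correlation here are hypothetical choices for illustration:

```python
import math
import random

random.seed(7)

def joint_and_exceedance(p1, p2, rho, n=200_000):
    """Monte Carlo estimate of P(U1 > 1-p1 AND U2 > 1-p2) under a Gaussian copula."""
    hits = 0
    for _ in range(n):
        z1 = random.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
        # Map correlated normals to the copula's uniform scale: u = Phi(z).
        u1 = 0.5 * (1 + math.erf(z1 / math.sqrt(2)))
        u2 = 0.5 * (1 + math.erf(z2 / math.sqrt(2)))
        hits += (u1 > 1 - p1) and (u2 > 1 - p2)
    return hits / n

# Two drivers, each with a 10-year univariate return period (annual p = 0.1).
p_ind = 0.1 * 0.1                          # independence: a 100-year joint event
p_dep = joint_and_exceedance(0.1, 0.1, rho=0.8)
T_joint = 1.0 / p_dep                      # "AND" return period with dependence
```

With strong positive dependence, the joint event is several times more frequent than the independence assumption suggests, which is exactly why treating one "ruling" driver in isolation can understate compound risk.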
ERIC Educational Resources Information Center
Hilz, Christoph; Ehrenfeld, John R.
1991-01-01
Several policy frameworks for managing hazardous waste import/export are examined with respect to economic issues, environmental sustainability, and administrative feasibility and effectiveness. Several recommendations for improving the present instrument and implementing process are offered. (Author/CW)
Contextualizing Community in Teacher Bible Talk
ERIC Educational Resources Information Center
Avni, Sharon
2013-01-01
This paper explores the interactions surrounding Bible teaching as a means of understanding how Jewish youth are discursively implicated within ideologies of community. Drawing on theoretical frameworks from linguistic anthropology and interactional sociolinguistics, I present a micro-analysis of a classroom lesson on the book of Leviticus to…
Actor Interdependence in Collaborative Telelearning.
ERIC Educational Resources Information Center
Wasson, Barbara; Bourdeau, Jacqueline
This paper presents a model of collaborative telelearning and describes how coordination theory has provided a framework for the analysis of actor (inter)dependencies in this scenario. The model is intended to inform the instructional design of learning scenarios, the technological design of the telelearning environment, and the design of…
Leadership for Community Engagement--A Distributed Leadership Perspective
ERIC Educational Resources Information Center
Liang, Jia G.; Sandmann, Lorilee R.
2015-01-01
This article presents distributed leadership as a framework for analysis, showing how the phenomenon complements formal higher education structures by mobilizing leadership from various sources, formal and informal. This perspective more accurately portrays the reality of leading engaged institutions. Using the application data from 224…
Philosophical Analysis of the Question "What Is English?"
ERIC Educational Resources Information Center
Nystrand, Philip Martin
This dissertation combines linguistics, sociology, aesthetics, and psychology to explore the philosophical and pedagogical assumptions which currently provide the framework for answering the question, "What is English?" The theories of Piaget, Langer, Mead, and Berger and Luckmann are used to define two essential terms: presentational and…
Alvarez, Stéphanie; Timler, Carl J.; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A.; Groot, Jeroen C. J.
2018-01-01
Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that the methodological decisions on data collection, variable selection, data-reduction and clustering techniques can bear a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation by using an example from Zambia’s Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that the typology development should be guided by a hypothesis on the local agriculture features and the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. Using a well-defined hypothesis and the presented methodological framework, which consolidates the hypothesis through local expert knowledge for the creation of typologies, warrants development of less subjective and more contextualized quantitative farm typologies. PMID:29763422
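The statistical core of such a typology — standardize the farm variables, then cluster — can be sketched as follows. The farm records are invented, and single-linkage agglomerative clustering stands in for the paper's PCA-plus-HCA workflow; the sketch shows the shape of the pipeline, not the authors' implementation:

```python
import math

# Toy farm records: (farm size in ha, livestock units, off-farm income share).
farms = [(0.5, 1, 0.6), (0.7, 2, 0.5), (0.6, 1, 0.7),
         (5.0, 12, 0.1), (6.2, 15, 0.0), (5.5, 10, 0.2)]

def standardize(data):
    """Z-score each variable so no single unit dominates the distances."""
    cols = list(zip(*data))
    out = []
    for col in cols:
        m = sum(col) / len(col)
        sd = math.sqrt(sum((v - m) ** 2 for v in col) / len(col))
        out.append([(v - m) / sd for v in col])
    return [tuple(row) for row in zip(*out)]

def agglomerate(points, k):
    """Single-linkage agglomerative clustering down to k clusters (farm indices)."""
    clusters = [[i] for i in range(len(points))]
    def linkage(a, b):
        return min(math.dist(points[i], points[j]) for i in a for j in b)
    while len(clusters) > k:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: linkage(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

types = agglomerate(standardize(farms), k=2)
```

The paper's central point survives even in this toy: the resulting types depend heavily on which variables enter `farms` and how `k` is chosen, which is why the authors anchor those decisions in an explicit hypothesis.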
Sadegh Amalnick, Mohsen; Zarrin, Mansour
2017-03-13
Purpose The purpose of this paper is to present an integrated framework for performance evaluation and analysis of human resources (HR) with respect to the factors of a health, safety, environment and ergonomics (HSEE) management system, as well as the criteria of the European Federation for Quality Management (EFQM), one of the well-known business excellence models. Design/methodology/approach In this study, an intelligent algorithm based on an adaptive neuro-fuzzy inference system (ANFIS) along with fuzzy data envelopment analysis (FDEA) is developed and employed to assess the performance of the company. Furthermore, the impact of the factors on the company's performance as well as their strengths and weaknesses are identified by conducting a sensitivity analysis on the results. Similarly, a design of experiments is performed to prioritize the factors in order of importance. Findings The results show that the EFQM model has a far greater impact upon the company's performance than the HSEE management system. According to the obtained results, it can be argued that integration of HSEE and EFQM leads to performance improvement in the company. Practical implications In the current study, the required data for executing the proposed framework are collected via valid questionnaires filled in by the staff of an aviation industry company located in Tehran, Iran. Originality/value Managing HR performance results in improved usability, maintainability and reliability and, finally, in a significant reduction in the commercial aviation accident rate. Also, studying the factors affecting HR performance helps authorities participate in developing systems that help operators better manage human error. This paper for the first time presents an intelligent framework based on ANFIS, FDEA and statistical tests for HR performance assessment and analysis, with the ability to handle the uncertainty and vagueness existing in real-world environments.
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
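The AHP step of such an evaluation can be illustrated with a small sketch. The three criteria and the pairwise judgments below are hypothetical, not taken from the paper; the priority vector is approximated with the standard row-geometric-mean method and checked with Saaty's consistency ratio:

```python
import numpy as np

# Hypothetical AHP pairwise-comparison matrix for three framework criteria
# (say, integration, usability, extensibility); entries use Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Approximate the priority vector with normalized row geometric means
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency ratio: estimate lambda_max from A @ w; RI = 0.58 for n = 3
lam = float(np.mean((A @ weights) / weights))
ci = (lam - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58  # a CR below 0.1 is conventionally acceptable

print(weights.round(3), round(cr, 3))
```

In a full AHP/QFD evaluation these criterion weights would then feed the QFD matrix that scores each candidate MDO framework against the weighted requirements.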
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.
2015-01-01
The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.
Cox, R; Lowe, D R
1996-05-01
Most studies of sandstone provenance involve modal analysis of framework grains using techniques that exclude the fine-grained breakdown products of labile mineral grains and rock fragments, usually termed secondary matrix or pseudomatrix. However, the data presented here demonstrate that, when the proportion of pseudomatrix in a sandstone exceeds 10%, standard petrographic analysis can lead to incorrect provenance interpretation. Petrographic schemes for provenance analysis such as QFL and QFR should not therefore be applied to sandstones containing more than 10% secondary matrix. Pseudomatrix is commonly abundant in sandstones, and this is therefore a problem for provenance analysis. The difficulty can be alleviated by the use of whole-rock chemistry in addition to petrographic analysis. Combination of chemical and point-count data permits the construction of normative compositions that approximate original framework grain compositions. Provenance analysis is also complicated in many cases by fundamental compositional alteration during weathering and transport. Many sandstones, particularly shallow marine deposits, have undergone vigorous reworking, which may destroy unstable mineral grains and rock fragments. In such cases it may not be possible to retrieve provenance information by either petrographic or chemical means. Because of this, pseudomatrix-rich sandstones should be routinely included in chemical-petrological provenance analysis. Because of the many factors, both pre- and post-depositional, that operate to increase the compositional maturity of sandstones, petrologic studies must include a complete inventory of matrix proportions, grain size and sorting parameters, and an assessment of depositional setting.
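The construction of normative framework-grain compositions from combined whole-rock chemistry and point-count data can be sketched, in spirit, as a constrained unmixing problem. The end-member oxide compositions and whole-rock values below are invented for illustration, and the generic non-negative least-squares fit shown here is an assumption, not the authors' exact scheme:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical end-member oxide compositions (wt%) for quartz, feldspar and
# lithic grains (columns), over SiO2, Al2O3, K2O (rows). Values illustrative.
E = np.array([
    [100.0, 64.8, 55.0],   # SiO2
    [  0.0, 18.3, 20.0],   # Al2O3
    [  0.0, 16.9,  5.0],   # K2O
])

# Whole-rock chemistry of a pseudomatrix-rich sandstone (same oxides)
rock = np.array([80.0, 9.0, 6.0])

# Non-negative least squares recovers approximate grain proportions, i.e. a
# "normative" Q-F-L composition that also counts grains altered to pseudomatrix
frac, _ = nnls(E, rock)
q, f, l = frac / frac.sum()
print(round(q, 2), round(f, 2), round(l, 2))
```

The point of such a calculation, per the abstract, is that the chemically derived normative Q-F-L reflects the original framework mode even when standard point counting would misclassify degraded labile grains as matrix.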
Plioutsias, Anastasios; Karanikas, Nektarios; Chatzimihailidou, Maria Mikela
2018-03-01
Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights. © 2017 Society for Risk Analysis.
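The reported positive association between drone prices and the extent of requirements met can be illustrated with a rank-correlation sketch. The prices and requirement counts below are invented, not data for the 19 models studied; the sketch assumes SciPy is available:

```python
import numpy as np
from scipy import stats

# Hypothetical data for a handful of drone models: retail price (USD) and the
# number of STPA-derived safety requirements each model meets (out of 70).
price = np.array([450, 800, 1200, 1500, 2900, 3300, 5400, 7600])
reqs_met = np.array([18, 22, 25, 24, 35, 33, 41, 47])

# Spearman's rank correlation avoids assuming the price-requirements
# relationship is linear, only that it is monotonic
rho, p_value = stats.spearmanr(price, reqs_met)
print(round(rho, 3), round(p_value, 4))
```

A significant positive rho in such a test would mirror the paper's finding that more expensive models tend to satisfy more of the STPA-generated requirements.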
Design of a Modular Monolithic Implicit Solver for Multi-Physics Applications
NASA Technical Reports Server (NTRS)
Carton De Wiart, Corentin; Diosady, Laslo T.; Garai, Anirban; Burgess, Nicholas; Blonigan, Patrick; Ekelschot, Dirk; Murman, Scott M.
2018-01-01
The design of a modular multi-physics high-order space-time finite-element framework is presented together with its extension to allow monolithic coupling of different physics. One of the main objectives of the framework is to perform efficient high-fidelity simulations of capsule/parachute systems. This problem requires simulating multiple physics including, but not limited to, the compressible Navier-Stokes equations, the dynamics of a moving body with mesh deformations and adaptation, the linear shell equations, non-reflective boundary conditions and wall modeling. The solver is based on high-order space-time finite-element methods. Continuous, discontinuous and C1-discontinuous Galerkin methods are implemented, allowing one to discretize various physical models. Tangent and adjoint sensitivity analyses are also targeted in order to conduct gradient-based optimization, error estimation, mesh adaptation, and flow control, adding another layer of complexity to the framework. The decisions made to tackle these challenges are presented. The discussion focuses first on the "single-physics" solver and later on its extension to the monolithic coupling of different physics. The implementation of different physics modules, relevant to the capsule/parachute system, is also presented. Finally, examples of coupled computations are presented, paving the way to the simulation of the full capsule/parachute system.
NASA Technical Reports Server (NTRS)
Plitau, Denis; Prasad, Narasimha S.
2012-01-01
The Active Sensing of CO2 Emissions over Nights, Days and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. A description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.
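The line-by-line calculation at the heart of such a framework can be sketched with a single toy absorption line. The line position, strength, half-width and column density below are illustrative numbers, not HITRAN values, and a real framework would use Voigt or more advanced lineshapes rather than the simple Lorentzian shown:

```python
import numpy as np

# Toy line-by-line sketch: optical depth across one Lorentzian CO2 line, then
# an on-line/off-line differential absorption optical depth of the kind used
# in IPDA lidar XCO2 retrievals. All numbers are invented for illustration.
def lorentz(nu, nu0, s, gamma):
    """Absorption cross-section (cm^2) for line strength s and half-width gamma."""
    return s * (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)

nu = np.linspace(6360.9, 6361.1, 2001)        # wavenumber grid (cm^-1)
sigma = lorentz(nu, nu0=6361.0, s=1e-23, gamma=0.07)

n_col = 8.0e21                                # CO2 column density (molec cm^-2)
tau = sigma * n_col                           # optical depth spectrum

on = tau[np.argmin(np.abs(nu - 6361.0))]      # on-line optical depth
off = tau[0]                                  # off-line optical depth
print(round(on - off, 4))                     # differential absorption optical depth
```

Swapping the `lorentz` routine for an alternative lineshape function is exactly the kind of interchangeability the abstract describes for testing lineshape models in the retrievals.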
NASA Astrophysics Data System (ADS)
Huang, T.; Alarcon, C.; Quach, N. T.
2014-12-01
Capture, curation, and analysis are the typical activities performed at any given Earth Science data center. Modern data management systems must be adaptable to heterogeneous science data formats, scalable to meet the mission's quality of service requirements, and able to manage the life-cycle of any given science data product. Designing a scalable data management system doesn't happen overnight. It takes countless hours of refining, refactoring, retesting, and re-architecting. The Horizon data management and workflow framework, developed at the Jet Propulsion Laboratory, is a portable, scalable, and reusable framework for developing high-performance data management and product generation workflow systems to automate data capturing, data curation, and data analysis activities. NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC)'s Data Management and Archive System (DMAS) is its core data infrastructure that handles capturing and distribution of hundreds of thousands of satellite observations each day, around the clock. DMAS is an application of the Horizon framework. The NASA Global Imagery Browse Services (GIBS) is NASA's Earth Observing System Data and Information System (EOSDIS) solution for making high-resolution global imagery available to the science communities. The Imagery Exchange (TIE), an application of the Horizon framework, is a core subsystem for GIBS responsible for data capturing and imagery generation automation to support EOSDIS's 12 distributed active archive centers and 17 Science Investigator-led Processing Systems (SIPS). This presentation discusses our ongoing effort in refining, refactoring, retesting, and re-architecting the Horizon framework to enable data-intensive science and its applications.
ERIC Educational Resources Information Center
Jimenez, Evelyn
2013-01-01
This capstone project applied Clark and Estes' (2008) gap analysis framework to identify performance gaps, develop perceived root causes, validate the causes, and formulate research-based solutions to present to Trojan High School. The purpose was to examine ways to increase the academic achievement of ELL students, specifically Latinos, by…
ERIC Educational Resources Information Center
Leadley, S. M., Ed.; Pignone, M. M., Ed.
Inadequacies in the quality and quantity of human services for Northeastern rural area residents prompted the seminar from which these transcripts are derived. Presented via chronological order, these transcripts reflect development of a framework and methodology for analysis of community service systems. Major seminar objectives are identified…
A framework for analysis of large database of old art paintings
NASA Astrophysics Data System (ADS)
Da Rugna, Jérôme; Chareyron, Gaël; Pillay, Ruven; Joly, Morwena
2011-03-01
For many years, many museums and countries have organized the high-definition digitization of their own collections, generating massive data for each object. In this paper, we focus only on art painting collections. Nevertheless, we face a very large database with heterogeneous data: the image collection includes very old and recent scans of negative photos, digital photos, multi- and hyperspectral acquisitions, X-ray acquisitions, and also front, back and lateral photos. Moreover, we have noted that art paintings suffer from much degradation: cracks, softening, artifacts, human damage and corruption over time. Considering that, it appears necessary to develop specific approaches and methods dedicated to digital art painting analysis. Consequently, this paper presents a complete framework devoted to evaluating, comparing and benchmarking image-processing algorithms for digital art painting analysis.
Salmon, Paul M; Lenne, Michael G; Walker, Guy H; Stanton, Neville A; Filtness, Ashleigh
2014-01-01
Collisions between different types of road users at intersections form a substantial component of the road toll. This paper presents an analysis of driver, cyclist, motorcyclist and pedestrian behaviour at intersections that involved the application of an integrated suite of ergonomics methods, the Event Analysis of Systemic Teamwork (EAST) framework, to on-road study data. EAST was used to analyse behaviour at three intersections using data derived from an on-road study of driver, cyclist, motorcyclist and pedestrian behaviour. The analysis shows the differences in behaviour and cognition across the different road user groups and pinpoints instances where this may be creating conflicts between different road users. The role of intersection design in creating these differences in behaviour and resulting conflicts is discussed. It is concluded that currently intersections are not designed in a way that supports behaviour across the four forms of road user studied. Interventions designed to improve intersection safety are discussed. Practitioner Summary: Intersection safety currently represents a key road safety issue worldwide. This paper presents a novel application of a framework of ergonomics methods for studying differences in road user behaviour at intersections. The findings support development of interventions that consider all road users as opposed to one group in isolation.