Sample records for analytic framework based

  1. Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK

    ERIC Educational Resources Information Center

    Rienties, Bart; Boroowa, Avinash; Cross, Simon; Kubiak, Chris; Mayles, Kevin; Murphy, Sam

    2016-01-01

    There is an urgent need to develop an evidence-based framework for learning analytics whereby stakeholders can manage, evaluate, and make decisions about which types of interventions work well and under which conditions. In this article, we will work towards developing a foundation of an Analytics4Action Evaluation Framework (A4AEF) that is…

  2. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time as well as simplifying analytical procedures for geoscientists. PMID:25742012
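The map/reduce aggregation pattern this abstract describes can be sketched in a few lines; the cell IDs, values, and per-cell mean operator below are illustrative stand-ins, not the paper's actual geoscience operators or its HBase/MapReduce deployment:

```python
from functools import reduce

# Hypothetical input: (cell_id, observation) pairs, as they might be
# emitted from a distributed store such as HBase.
records = [("A", 2.0), ("B", 4.0), ("A", 6.0), ("B", 8.0), ("A", 1.0)]

# Map: emit (key, (sum, count)) partials that can be combined in any order.
mapped = [(k, (v, 1)) for k, v in records]

# Reduce: merge partials per key; the merge is associative, which is what
# lets a real MapReduce runtime parallelize it across workers.
def merge(acc, kv):
    k, (s, c) = kv
    s0, c0 = acc.get(k, (0.0, 0))
    acc[k] = (s0 + s, c0 + c)
    return acc

partials = reduce(merge, mapped, {})
means = {k: s / c for k, (s, c) in partials.items()}
print(means)  # per-cell means: {'A': 3.0, 'B': 6.0}
```

Because the partials commute, the same merge runs unchanged whether the data lives on one machine or is sharded across a cluster.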

  3. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time as well as simplifying analytical procedures for geoscientists.

  4. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  5. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    ERIC Educational Resources Information Center

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…

  6. RT-18: Value of Flexibility. Phase 1

    DTIC Science & Technology

    2010-09-25

    an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory...framework that is mathematically consistent, domain independent and applicable under varying information levels. This report presents our advances in...During this period, we also explored the development of an analytical framework based on sound mathematical constructs. A review of the current state

  7. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  8. Value of Flexibility - Phase 1

    DTIC Science & Technology

    2010-09-25

    weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs... mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to...research activities is in developing a coherent value based definition of flexibility that is based on an analytical framework that is mathematically

  9. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  10. Real-Time Analytics for the Healthcare Industry: Arrhythmia Detection.

    PubMed

    Agneeswaran, Vijay Srinivas; Mukherjee, Joydeb; Gupta, Ashutosh; Tonpay, Pranay; Tiwari, Jayati; Agarwal, Nitin

    2013-09-01

    It is time for the healthcare industry to move from the era of "analyzing our health history" to the age of "managing the future of our health." In this article, we illustrate the importance of real-time analytics across the healthcare industry by providing a generic mechanism to reengineer traditional analytics expressed in the R programming language into Storm-based real-time analytics code. This is a powerful abstraction, since most data scientists use R to write their analytics and are unclear on how to make them work in real time on high-velocity data. Our paper focuses on the applications necessary to a healthcare analytics scenario, specifically on the importance of electrocardiogram (ECG) monitoring. A physician can use our framework to compare ECG reports by categorization and consequently detect arrhythmia. The framework reads the ECG signals and uses a machine-learning-based categorizer that runs within a Storm environment to compare different ECG signals. The paper also presents performance studies of the framework to illustrate the throughput and accuracy trade-off in real-time analytics.
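The paper's Storm topology and trained categorizer are not reproduced here; as a minimal stand-in for the streaming step, the sketch below flags a beat stream as irregular when RR-interval variability inside a sliding window exceeds a threshold. The window size and tolerance are made-up illustrative parameters, not values from the paper:

```python
from collections import deque

# Hypothetical stand-in for a Storm bolt: flag each beat once the window
# is full, based on the spread of recent RR intervals (milliseconds).
def rr_flags(rr_intervals_ms, window=5, tolerance_ms=120):
    """Yield (index, flagged) pairs for each beat after warm-up."""
    win = deque(maxlen=window)
    for i, rr in enumerate(rr_intervals_ms):
        win.append(rr)
        if len(win) == window:
            spread = max(win) - min(win)
            yield i, spread > tolerance_ms

regular = [800, 810, 795, 805, 800, 798]
flags = [f for _, f in rr_flags(regular)]
print(flags)  # [False, False] for a steady rhythm
```

In a real deployment this generator body would sit inside a bolt's `execute` method and the classifier would be the learned model, but the windowed-stream shape is the same.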

  11. An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.

    ERIC Educational Resources Information Center

    Lee, Chung-Shing

    2001-01-01

    Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)

  12. A Data Protection Framework for Learning Analytics

    ERIC Educational Resources Information Center

    Cormack, Andrew

    2016-01-01

    Most studies on the use of digital student data adopt an ethical framework derived from human-subject research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on using learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses…

  13. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of datasets is made possible by four technologies: cloud computing, relational database processing, support from NoSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. It enables the computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
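The abstract mentions computing reliability indices per component and span. A common first-order formulation (the Cornell index, not necessarily the variant used in the paper) is β = (μR − μS) / √(σR² + σS²) for resistance R and load effect S; the numbers below are illustrative only:

```python
import math

def cornell_beta(mu_r, sigma_r, mu_s, sigma_s):
    """First-order (Cornell) reliability index for resistance R vs load S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

# Illustrative numbers, not from the paper: a span whose mean capacity
# exceeds mean demand by exactly 3 combined standard deviations.
beta = cornell_beta(mu_r=100.0, sigma_r=8.0, mu_s=70.0, sigma_s=6.0)
print(round(beta, 2))  # 3.0
```

A larger β means a smaller probability that demand exceeds capacity, which is what makes the index usable as a ranking signal in a risk-based decision process.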

  14. The Framework of Intervention Engine Based on Learning Analytics

    ERIC Educational Resources Information Center

    Sahin, Muhittin; Yurdugül, Halil

    2017-01-01

    Learning analytics primarily deals with the optimization of learning environments and the ultimate goal of learning analytics is to improve learning and teaching efficiency. Studies on learning analytics seem to have been made in the form of adaptation engine and intervention engine. Adaptation engine studies are quite widespread, but intervention…

  15. An Active Learning Exercise for Introducing Agent-Based Modeling

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  16. Analytical method of waste allocation in waste management systems: Concept, method and case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergeron, Francis C., E-mail: francis.b.c@videotron.ca

    Waste is no longer a rejected item to be disposed of but increasingly a secondary resource to be exploited, which influences waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the "analytical method of the waste allocation process" (AMWAP), based on the concept of the "waste allocation process" (WAP), defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system which may be integrated into the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP and from the development of the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.

  17. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC) the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA) the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among others, parallel data analysis, metadata management, virtual file system tasks, maps generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
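Ophidia's array-based primitives collapse dimensions of a data cube server-side. The toy below is a serial sketch of one such reduction (a mean over the time axis) with made-up values and no MPI parallelism; cube layout and operator names are illustrative, not Ophidia's API:

```python
# Toy stand-in for a datacube reduction primitive. The cube is indexed
# [lat][lon][time]; real deployments shard such cubes across MPI ranks.
cube = [
    [[1.0, 3.0], [2.0, 4.0]],
    [[5.0, 7.0], [6.0, 8.0]],
]

def reduce_time(cube, op=lambda vals: sum(vals) / len(vals)):
    """Collapse the innermost (time) axis with the given aggregate."""
    return [[op(cell) for cell in row] for row in cube]

means = reduce_time(cube)
print(means)  # [[2.0, 3.0], [6.0, 7.0]]
```

Swapping `op` for `max` or `min` gives the other common reductions without touching the traversal, which is the point of exposing reductions as primitives.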

  18. Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data

    NASA Astrophysics Data System (ADS)

    Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.

    2017-10-01

    We present a new open-source framework for storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. The framework consists of Python scripts and C++ programs and stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to the most widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit those rates to different analytical expressions. Another important feature of this framework is the ability to calculate transport properties based on the cross-section data and supplied distribution functions. In addition, this framework allows the export of chemical reaction descriptions in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding VSim (particle-in-cell simulation code) and USim (unstructured multi-fluid code) input blocks with appropriate cross-sections.
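As a rough sketch of one computation such a framework automates, the Maxwellian rate coefficient is k = ∫ σ(v) v f(v) dv over the Maxwellian speed distribution f(v). The constant cross section and temperature below are illustrative placeholders (a fitted σ(v) would replace the constant); for constant σ the integral should reduce to σ⟨v⟩ with ⟨v⟩ = √(8kT/πm), which serves as a built-in check:

```python
import math

KB = 1.380649e-23       # Boltzmann constant, J/K
ME = 9.1093837015e-31   # electron mass, kg

def rate_constant(sigma_m2, temp_k, mass_kg=ME, n=20000, vmax_factor=8.0):
    """k = ∫ sigma * v * f(v) dv for a Maxwellian speed distribution,
    via the trapezoidal rule. Constant sigma stands in for a fitted sigma(v)."""
    vth = math.sqrt(2 * KB * temp_k / mass_kg)
    vmax = vmax_factor * vth      # tail beyond ~8 v_th is negligible
    dv = vmax / n
    total = 0.0
    for i in range(n + 1):
        v = i * dv
        f = (4 * math.pi * v**2
             * (mass_kg / (2 * math.pi * KB * temp_k)) ** 1.5
             * math.exp(-mass_kg * v**2 / (2 * KB * temp_k)))
        w = 0.5 if i in (0, n) else 1.0
        total += w * sigma_m2 * v * f * dv
    return total

sigma = 1e-20   # m^2, illustrative
T = 11600.0     # K, roughly 1 eV
k_num = rate_constant(sigma, T)
k_ref = sigma * math.sqrt(8 * KB * T / (math.pi * ME))  # sigma * <v>
print(abs(k_num / k_ref - 1) < 1e-3)  # True
```

The same quadrature loop works unchanged once `sigma_m2` becomes a function of `v`, which is the case the framework actually targets.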

  19. Using Learning Analytics for Preserving Academic Integrity

    ERIC Educational Resources Information Center

    Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena

    2017-01-01

    This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…

  20. Evaluation of Copper-1,3,5-benzenetricarboxylate Metal-organic Framework (Cu-MOF) as a Selective Sorbent for Lewis-base Analytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Scott D.; Eckberg, Alison D.; Thallapally, Praveen K.

    2011-09-01

    The metal-organic framework Cu-BTC was evaluated for its ability to selectively interact with Lewis-base analytes, including explosives, by examining retention on GC columns packed with Chromosorb W HP that contained 3.0% SE-30 along with various loadings of Cu-BTC. SEM images of the support material showed the characteristic Cu-BTC crystals embedded in the SE-30 coating on the diatomaceous support. Results indicated that the Cu-BTC-containing stationary phase had limited thermal stability (220°C) and strong general retention for analytes. Kováts index calculations showed selective retention (amounting to about 300 Kováts units) relative to n-alkanes for many small Lewis-base analytes on a column that contained 0.75% Cu-BTC compared to an SE-30 control. Short columns that contained lower loadings of Cu-BTC (0.10%) were necessary to elute explosives and related analytes; however, selectivity was not observed for aromatic compounds (including nitroaromatics) or nitroalkanes. Observed retention characteristics are discussed.
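The Kováts index calculation mentioned above follows the standard isothermal formula I = 100·[n + (log t′x − log t′n) / (log t′n+1 − log t′n)], where t′ are adjusted retention times bracketed by n-alkanes with n and n+1 carbons. The retention times below are illustrative, not values from the paper:

```python
import math

def kovats_index(t_x, t_n, t_n1, n):
    """Isothermal Kováts retention index from adjusted retention times:
    t_x for the analyte, t_n and t_n1 for bracketing n-alkanes (n, n+1 C)."""
    return 100 * (n + (math.log10(t_x) - math.log10(t_n))
                  / (math.log10(t_n1) - math.log10(t_n)))

# Illustrative adjusted retention times (minutes), not from the paper.
idx = kovats_index(t_x=6.0, t_n=4.0, t_n1=9.0, n=8)
print(round(idx))  # 850
```

A selectivity shift of "about 300 Kováts units" then means the analyte retains as if it had roughly three more carbons on the Cu-BTC column than on the control.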

  1. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…

  2. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model that explains a certain phenomenon. Argumentation is considered to be the process of critically evaluating such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models is presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, Toulmin's framework for structurally analysing arguments is contrasted with the approach presented here. It is demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second, more complex argumentative sequence is analysed according to the proposed analytical scheme to give a broader impression of its potential in practical use.

  3. Evaluation Framework for NASA's Educational Outreach Programs

    NASA Technical Reports Server (NTRS)

    Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie

    1999-01-01

    The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to do a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.

  4. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.

  5. Analyzing Electronic Question/Answer Services: Framework and Evaluations of Selected Services.

    ERIC Educational Resources Information Center

    White, Marilyn Domas, Ed.

    This report develops an analytical framework based on systems analysis for evaluating electronic question/answer or AskA services operated by a wide range of types of organizations, including libraries. Version 1.0 of this framework was applied in June 1999 to a selective sample of 11 electronic question/answer services, which cover a range of…

  6. Analysis of Naval NETWAR FORCEnet Enterprise: Implications for Capabilities Based Budgeting

    DTIC Science & Technology

    2006-12-01

    of this background information and projecting how ADNS is likely to succeed in the NNFE framework, two fundamental research questions were addressed...background information and projecting how ADNS is likely to succeed in the NNFE framework, two fundamental research questions were addressed. The...Business Approach...Figure 8. Critical Assumption for Common Analytical Framework

  7. Teacher Identity and Numeracy: Developing an Analytic Lens for Understanding Numeracy Teacher Identity

    ERIC Educational Resources Information Center

    Bennison, Anne; Goos, Merrilyn

    2013-01-01

    This paper reviews recent literature on teacher identity in order to propose an operational framework that can be used to investigate the formation and development of numeracy teacher identities. The proposed framework is based on Van Zoest and Bohl's (2005) framework for mathematics teacher identity with a focus on those characteristics thought…

  8. Pilot testing of SHRP 2 reliability data and analytical products: Florida. [supporting datasets

    DOT National Transportation Integrated Search

    2014-01-01

    SHRP 2 initiated the L38 project to pilot test products from five of the program's completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and a decision-making framework. The L38 project...

  9. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    NASA Astrophysics Data System (ADS)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing the spatiotemporal distribution patterns and dynamics of different industries can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis is a challenging task that requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such visual analytics. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use an enterprise registration dataset for Mainland China from 1960 to 2015 that contains fine-grained location information (i.e., coordinates of each individual enterprise) to demonstrate the feasibility of this framework. The experiment results show that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime and disease.
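The standard deviational ellipse has a well-known closed form (mean center, rotation angle, and two axis standard deviations), though conventions for the rotation angle vary and the exact variant used in the paper may differ. The serial pure-Python sketch below shows the computation the paper parallelizes with Apache Spark; the sample points are made up:

```python
import math

def sde(points):
    """Standard deviational ellipse of 2-D points:
    returns (center, theta_rad, sigma_x, sigma_y).
    Serial sketch of what the paper distributes over Spark workers."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    dx = [x - cx for x, _ in points]
    dy = [y - cy for _, y in points]
    sxx = sum(d * d for d in dx)
    syy = sum(d * d for d in dy)
    sxy = sum(a * b for a, b in zip(dx, dy))
    if sxy == 0:                       # axis-aligned spread
        theta = 0.0 if sxx >= syy else math.pi / 2
    else:
        theta = math.atan(((sxx - syy)
                           + math.sqrt((sxx - syy) ** 2 + 4 * sxy**2))
                          / (2 * sxy))
    c, s = math.cos(theta), math.sin(theta)
    sig_x = math.sqrt(sum((a * c - b * s) ** 2 for a, b in zip(dx, dy)) / n)
    sig_y = math.sqrt(sum((a * s + b * c) ** 2 for a, b in zip(dx, dy)) / n)
    return (cx, cy), theta, sig_x, sig_y

# Made-up points spread twice as far along x as along y.
center, theta, sx, sy = sde([(1, 0), (-1, 0), (0, 0.5), (0, -0.5)])
print(center, round(theta, 3), round(sx, 3), round(sy, 3))
```

Only the sums (centroid, sxx, syy, sxy) depend on the data, and each is a plain reduction, which is why the computation maps cleanly onto a Spark `reduce`.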

  10. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  11. Facilitating Multiple Intelligences through Multimodal Learning Analytics

    ERIC Educational Resources Information Center

    Perveen, Ayesha

    2018-01-01

    This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students by considering their potential of being multiple intelligences based on Howard Gardner's 1983 theory of multiple intelligences. The study first emphasizes the need to facilitate students as…

  12. How to evaluate population management? Transforming the Care Continuum Alliance population health guide toward a broadly applicable analytical framework.

    PubMed

    Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A

    2015-04-01

    Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to improve population health is increasing. One potentially successful strategy is population management (PM). PM strives to address the health needs of the at-risk population and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The population health guide of the Care Continuum Alliance (CCA), which recently changed its name to the Population Health Alliance (PHA), provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed specifically for PM and describes the core elements of the PM concept on the basis of six successive, interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. We refined the quantitative methods and operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature, although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning; identifies the key functions; and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  14. Non-linear analytic and coanalytic problems (L_p-theory, Clifford analysis, examples)

    NASA Astrophysics Data System (ADS)

    Dubinskii, Yu A.; Osipenko, A. S.

    2000-02-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.

  15. The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James; et al.

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework scans for new data and applies batches of analytics to the data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The result products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution.
The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other, e.g., large environmental datasets that may be analyzed for many purposes.
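
    The preprocess-once, many-analytics pattern can be sketched in a few lines; function names and values below are illustrative, not the Matsu Wheel API.

```python
# Each incoming scene is read and preprocessed exactly once, then every
# registered analytic runs against the shared preprocessed result.
preprocess_calls = 0

def preprocess(scene):
    """Stand-in for one expensive read/calibration pass over a scene."""
    global preprocess_calls
    preprocess_calls += 1
    return {"id": scene, "bands": [1.0, 2.0, 3.0]}  # toy calibrated bands

def anomaly_detector(data):
    return max(data["bands"]) > 2.5       # toy rare-signature test

def land_cover_classifier(data):
    return "water" if sum(data["bands"]) < 10 else "land"

analytics = [anomaly_detector, land_cover_classifier]  # easily extended

reports = {}
for scene in ["EO1-001", "EO1-002"]:
    data = preprocess(scene)                       # one pass per scene...
    reports[scene] = [a(data) for a in analytics]  # ...shared by all analytics
```

    Adding a third analytic changes only the `analytics` list; the number of preprocessing passes stays fixed, which is the source of the efficiency gain described above.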

  16. Estimating Aquifer Properties Using Sinusoidal Pumping Tests

    NASA Astrophysics Data System (ADS)

    Rasmussen, T. C.; Haborak, K. G.; Young, M. H.

    2001-12-01

    We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
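
    At each observation well, the parameter-estimation step reduces to extracting the amplitude and phase lag of the sinusoidal head response, which is a linear least-squares problem. A minimal sketch with synthetic, noise-free data (all numbers invented):

```python
import numpy as np

omega = 2 * np.pi / 3600.0          # stimulus frequency (rad/s), hypothetical
t = np.linspace(0, 7200, 200)       # observation times (s)
true_amp, true_phase = 0.5, 0.6     # drawdown amplitude (m) and phase lag (rad)
head = true_amp * np.cos(omega * t - true_phase)

# h(t) = A cos(wt) + B sin(wt) + C is linear in (A, B, C)
G = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
A, B, C = np.linalg.lstsq(G, head, rcond=None)[0]

amp = np.hypot(A, B)                # recovered amplitude
phase = np.arctan2(B, A)            # recovered phase lag
```

    The decay of amplitude and growth of phase lag with radial distance from the pumping well are then matched against the analytical solutions to estimate the aquifer's hydraulic parameters.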

  17. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options.

    PubMed

    Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D

    2013-10-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.

  18. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options☆

    PubMed Central

    Reed, M.S.; Podesta, G.; Fazey, I.; Geeson, N.; Hessel, R.; Hubacek, K.; Letson, D.; Nainggolan, D.; Prell, C.; Rickenbach, M.G.; Ritsema, C.; Schwilch, G.; Stringer, L.C.; Thomas, A.D.

    2013-01-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change. PMID:25844020

  19. Functional Analytic Psychotherapy Is a Framework for Implementing Evidence-Based Practices: The Example of Integrated Smoking Cessation and Depression Treatment

    ERIC Educational Resources Information Center

    Holman, Gareth; Kohlenberg, Robert J.; Tsai, Mavis; Haworth, Kevin; Jacobson, Emily; Liu, Sarah

    2012-01-01

    Depression and cigarette smoking are recurrent, interacting problems that co-occur at high rates and--especially when depression is chronic--are difficult to treat and associated with costly health consequences. In this paper we present an integrative therapeutic framework for concurrent treatment of these problems based on evidence-based…

  20. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project at NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  1. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

    Large-scale ad hoc analytics of genomic data is popular using the R programming language supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticians can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimal effort, and how both the resources and the data required by a job can be managed. An open-source, light-weight framework for executing R-scripts using Bioconductor packages, referred to as `RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  2. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    NASA Astrophysics Data System (ADS)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, yet big climate data pose a grand challenge to climatologists in efficiently managing and analyzing them. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built over the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to provide a web portal that facilitates the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
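
    The chunk-skipping idea behind the spatiotemporal index can be sketched in plain Python (illustrative only; ClimateSpark itself is a Spark/Scala system, and the chunk metadata here is invented):

```python
# Chunks carry their bounding box and time span; a query touches only
# the chunks that intersect it, avoiding unnecessary reads.
chunks = [
    {"id": 0, "lat": (0, 30), "lon": (0, 90),   "year": (1980, 1999)},
    {"id": 1, "lat": (0, 30), "lon": (90, 180), "year": (1980, 1999)},
    {"id": 2, "lat": (0, 30), "lon": (0, 90),   "year": (2000, 2019)},
]

def overlaps(a, b):
    """True if closed intervals a and b intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def query(lat, lon, year):
    """Return ids of chunks intersecting the query box -- only these are read."""
    return [c["id"] for c in chunks
            if overlaps(c["lat"], lat) and overlaps(c["lon"], lon)
            and overlaps(c["year"], year)]

hits = query(lat=(10, 20), lon=(0, 45), year=(1990, 1995))
```

    Here only one of three chunks would be read and decompressed; at scale, pruning by the index is what keeps I/O proportional to the query rather than to the archive.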

  3. Closed-form solutions and scaling laws for Kerr frequency combs

    PubMed Central

    Renninger, William H.; Rakich, Peter T.

    2016-01-01

    A single closed-form analytical solution of the driven nonlinear Schrödinger equation is developed, reproducing a large class of the behaviors in Kerr-comb systems, including bright-solitons, dark-solitons, and a large class of periodic wavetrains. From this analytical framework, a Kerr-comb area theorem and a pump-detuning relation are developed, providing new insights into soliton- and wavetrain-based combs along with concrete design guidelines for both. This new area theorem reveals significant deviation from the conventional soliton area theorem, which is crucial to understanding cavity solitons in certain limits. Moreover, these closed-form solutions represent the first step towards an analytical framework for wavetrain formation, and reveal new parameter regimes for enhanced Kerr-comb performance. PMID:27108810

  4. A Move-Analytic Contrastive Study on the Introductions of American and Philippine Master's Theses in Architecture

    ERIC Educational Resources Information Center

    Lintao, Rachelle B.; Erfe, Jonathan P.

    2012-01-01

    This study purports to foster the understanding of profession-based academic writing in two different cultural conventions by examining the rhetorical moves employed by American and Philippine thesis introductions in Architecture using Swales' 2004 Revised CARS move-analytic model as framework. Twenty (20) Master's thesis introductions in…

  5. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    ERIC Educational Resources Information Center

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  6. Learning Analytics for Communities of Inquiry

    ERIC Educational Resources Information Center

    Kovanovic, Vitomir; Gaševic, Dragan; Hatala, Marek

    2014-01-01

    This paper describes doctoral research that focuses on the development of a learning analytics framework for inquiry-based digital learning. Building on the Community of Inquiry model (CoI)--a foundation commonly used in the research and practice of digital learning and teaching--this research builds on the existing body of knowledge in two…

  7. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youn, H; Jeon, H; Nam, J

    Purpose: To investigate the feasibility of an analytic framework to estimate patients' absorbed dose distributions owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer's law. In sequence, all voxels constructing the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can effectively estimate a patient's exposure owing to cone-beam CT scans for image-guided radiation treatment. Therefore, we expect that a patient's over-exposure during IGRT might be prevented by our framework.
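
    The primary-dose step can be illustrated with a one-ray Beer's law sketch: the fluence entering each voxel is attenuated exponentially, and the energy absorbed in a voxel is the fluence lost there. The attenuation coefficients below are invented values, not a calibrated HU mapping.

```python
import math

mu = [0.02, 0.15, 0.02, 0.02]   # per-voxel attenuation (1/mm), illustrative
d = 2.0                          # path length through each voxel (mm)

fluence_in = 1.0                 # normalized incident fluence
absorbed = []
for m in mu:
    fluence_out = fluence_in * math.exp(-m * d)   # Beer's law over one voxel
    absorbed.append(fluence_in - fluence_out)     # energy deposited in the voxel
    fluence_in = fluence_out                      # what remains continues down-ray
```

    The dense second voxel (a stand-in for an inhomogeneous insert) absorbs the most, and absorbed energy plus transmitted fluence sum to the incident fluence, as conservation requires.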

  8. Value Driven Outcomes (VDO): a pragmatic, modular, and extensible software framework for understanding and improving health care costs and outcomes

    PubMed Central

    Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S

    2015-01-01

    Objective To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). Materials and methods In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. Results A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. Conclusions The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. PMID:25324556
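
    The hours-based allocation rule described above can be sketched as follows; the numbers and field names are illustrative, not the VDO schema.

```python
# A unit's labor cost is split across patient encounters in proportion
# to the hours each encounter spent in the unit.
unit_labor_cost = 9000.0                               # total unit labor cost
hours = {"enc-1": 10.0, "enc-2": 5.0, "enc-3": 15.0}   # hours per encounter

total_hours = sum(hours.values())
allocated = {enc: unit_labor_cost * h / total_hours for enc, h in hours.items()}
```

    The same proportional pattern applies to the other drivers mentioned (medication costs by utilization, radiology costs by study minutes); the full allocation is guaranteed to sum back to the unit's total cost.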

  9. 77 FR 47767 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security U.S. Customs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-10

    ... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY: Privacy Office... Homeland Security/U.S. Customs and Border Protection, DHS/CBP--017 Analytical Framework for Intelligence... Analytical Framework for Intelligence (AFI) System of Records'' from one or more provisions of the Privacy...

  10. Analytical modeling and feasibility study of a multi-GPU cloud-based server (MGCS) framework for non-voxel-based dose calculations.

    PubMed

    Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A

    2017-04-01

    In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of remote computing power for parallelization and acceleration of computationally and time-intensive radiotherapy tasks in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 Gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. The analytically predicted computation times matched experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. The MGCS implementation reproduced ground-truth dose computations with negligible differences by distributing the work among several processes and implementing optimization strategies. The results showed that a cloud-based computation engine was a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can provide access to advanced technology and computational methods to many clinics, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
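
    An Amdahl-style analytical model in the spirit described above predicts wall time as parallelizable compute plus non-parallelizable transfer; the constants are invented for illustration, not the paper's measured values.

```python
def predicted_time(n_gpus, compute_s=140.0, transfer_s=10.0):
    """Predicted wall time (s): compute scales with GPU count, transfer does not."""
    return compute_s / n_gpus + transfer_s

speedup_14 = predicted_time(1) / predicted_time(14)
```

    As the transfer term shrinks relative to compute (tasks far outweighing memory operations), the speedup approaches the proportional-to-GPU-count limit noted in the abstract; here the fixed transfer cost caps it well below 14x.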

  11. Social Exclusion and Education Inequality: Towards an Integrated Analytical Framework for the Urban-Rural Divide in China

    ERIC Educational Resources Information Center

    Wang, Li

    2012-01-01

    The aim of this paper is to build a capability-based framework, drawing upon the strengths of other approaches, which is applicable to the complexity of the urban-rural divide in education in China. It starts with a brief introduction to the capability approach. This is followed by a discussion of how the rights-based approach and resource-based…

  12. Using connectivity to identify climatic drivers of local adaptation: a response to Macdonald et al.

    PubMed

    Prunier, Jérôme G; Blanchet, Simon

    2018-04-30

    Macdonald et al. (Ecol. Lett., 21, 2018, 207-216) proposed an analytical framework for identifying evolutionary processes underlying trait-environment relationships observed in natural populations. Here, we propose an expanded and refined framework based on simulations and bootstrap-based approaches, and we elaborate on an important statistical caveat common to most datasets. © 2018 John Wiley & Sons Ltd/CNRS.

  13. A genetic algorithm-based job scheduling model for big data analytics.

    PubMed

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

    Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework; it implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually independent. The existing work mainly focuses on executing jobs in sequence, which is often inefficient and energy-intensive. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
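
    A toy sketch of the idea, with invented durations standing in for the estimation module's predictions: a chromosome encodes a job order, jobs are assigned greedily to the least-loaded of two identical nodes, and fitness is the resulting makespan. This is a generic permutation GA, not the paper's model.

```python
import random

random.seed(0)
durations = [5, 9, 3, 7, 4, 8, 2, 6]   # predicted job run times (invented)

def makespan(order):
    """Finish time when jobs are greedily placed on the least-loaded node."""
    loads = [0, 0]
    for j in order:
        loads[loads.index(min(loads))] += durations[j]
    return max(loads)

def crossover(a, b):
    """Order crossover: prefix of a, remaining jobs in b's order."""
    cut = random.randrange(1, len(a))
    return a[:cut] + [j for j in b if j not in a[:cut]]

def mutate(order):
    i, k = random.sample(range(len(order)), 2)
    order[i], order[k] = order[k], order[i]

pop = [random.sample(range(len(durations)), len(durations)) for _ in range(20)]
for _ in range(50):
    pop.sort(key=makespan)                 # elitism: keep the fittest orders
    survivors = pop[:10]
    children = [crossover(random.choice(survivors), random.choice(survivors))
                for _ in range(10)]
    for c in children:
        if random.random() < 0.3:
            mutate(c)
    pop = survivors + children

best = min(pop, key=makespan)
```

    The makespan of any valid order is bounded below by half the total work (22 here) and above by the classic list-scheduling guarantee, so the GA's job is to close the gap between the two.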

  14. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and life sciences. In this paper, TED analytical implementation is described and discussed, and their analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are herein addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is herein demonstrated, describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within a point-of-care framework.

  15. Service line analytics in the new era.

    PubMed

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: (1) updated service line definitions; (2) the ability to analyze and trend service line net patient revenues by payment source; (3) access to accurate service line cost information across multiple dimensions, with drill-through capabilities; (4) the ability to redesign key reports based on changing requirements; and (5) clear assignment of accountability.

  16. Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.

    2018-03-01

    An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures of policy-relevant outcomes of interest; in our example, high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy-relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy-relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.
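
    Consequence-driven discovery can be sketched as filtering a factorial ensemble by an outcome threshold; the toy outcome function below is invented for illustration and stands in for a full Global Change Assessment Model run.

```python
from itertools import product

# Three assumption dimensions, three levels each (multipliers, invented)
levels = {"ag_productivity": [0.5, 1.0, 1.5],
          "pop_growth":      [0.8, 1.0, 1.2],
          "econ_growth":     [0.9, 1.0, 1.1]}

def mitigation_cost(ag, pop, econ):
    # Toy proxy: cost rises with demand drivers, falls with productivity
    return pop * econ / ag

# Full factorial ensemble of scenario assumption combinations
ensemble = [dict(zip(levels, combo)) for combo in product(*levels.values())]

# Keep only the scenarios exceeding a user-specified consequence threshold
high_cost = [s for s in ensemble
             if mitigation_cost(s["ag_productivity"], s["pop_growth"],
                                s["econ_growth"]) > 1.5]
```

    In this toy setup every high-cost scenario shares the pessimistic agricultural-productivity assumption, echoing the kind of "most predictive assumption" finding the framework surfaces from the real 33,750-member ensemble.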

  17. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.

  18. Value Driven Outcomes (VDO): a pragmatic, modular, and extensible software framework for understanding and improving health care costs and outcomes.

    PubMed

    Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S

    2015-01-01

    To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.
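
The time-based allocation idea described above can be sketched in a few lines. This is an illustrative toy (invented numbers and field names, not the VDO schema): a unit's labor cost is spread over patient encounters in proportion to hours spent in the unit, while medication acquisition costs follow utilization directly.

```python
# Hypothetical cost-allocation sketch, not the VDO implementation.
unit_labor_cost = 12000.0  # total labor cost for a hospital unit, one day

# hours each encounter spent in the unit (invented data)
hours_in_unit = {"enc-001": 6.0, "enc-002": 18.0, "enc-003": 24.0}

# allocate labor cost proportionally to time in the unit
total_hours = sum(hours_in_unit.values())
labor_allocation = {
    enc: unit_labor_cost * h / total_hours for enc, h in hours_in_unit.items()
}

# actual medication acquisition costs are allocated by utilization
med_costs = {"enc-001": 42.50, "enc-002": 310.00, "enc-003": 0.0}

# per-encounter cost is the sum of the allocated components
encounter_cost = {
    enc: labor_allocation[enc] + med_costs[enc] for enc in hours_in_unit
}
print(encounter_cost["enc-002"])
```

The same proportional pattern extends to other cost pools, e.g. radiology cost allocated by study minutes.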

  19. Multiaxis sensing using metal organic frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talin, Albert Alec; Allendorf, Mark D.; Leonard, Francois

    2017-01-17

    A sensor device including a sensor substrate; and a thin film comprising a porous metal organic framework (MOF) on the substrate that presents more than one transduction mechanism when exposed to an analyte. A method including exposing a porous metal organic framework (MOF) on a substrate to an analyte; and identifying more than one transduction mechanism in response to the exposure to the analyte.

  20. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    ERIC Educational Resources Information Center

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  1. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives, allowing for efficient use of gradient-based optimization methods on engine cycle models without requiring finite-difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
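
The advantage claimed above can be illustrated with a toy gradient descent (this is a hedged sketch, not the Pycycle or OpenMDAO API): an analytic derivative costs no extra model evaluations, whereas a central finite difference costs two model evaluations per gradient call and introduces step-size error.

```python
# Toy comparison: analytic vs. finite-difference gradients (hypothetical objective).

def f(x, counter):
    counter[0] += 1                      # count model evaluations
    return (x - 3.0) ** 2 + 2.0          # stand-in for a cycle-analysis objective

def analytic_grad(x):
    return 2.0 * (x - 3.0)               # exact derivative, no model evals

def fd_grad(x, counter, h=1e-6):
    # central difference: two extra model evaluations per gradient call
    return (f(x + h, counter) - f(x - h, counter)) / (2.0 * h)

def descend(grad, x=0.0, lr=0.1, steps=200):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_analytic = descend(analytic_grad)
calls = [0]
x_fd = descend(lambda x: fd_grad(x, calls))
print(x_analytic, x_fd, calls[0])        # both converge near 3.0; FD pays 400 evals
```

Real cycle models are far more expensive per evaluation, which is why the analytic route pays off so strongly.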

  2. Digging beneath the Surface: Analyzing the Complexity of Instructors' Participation in Asynchronous Discussion

    ERIC Educational Resources Information Center

    Clarke, Lane Whitney; Bartholomew, Audrey

    2014-01-01

    The purpose of this study was to investigate instructor participation in asynchronous discussions through an in-depth content analysis of instructors' postings and comments using the Community of Inquiry (COI) framework (Garrison et al., 2001). We developed an analytical tool based on this framework in order to better understand what instructors…

  3. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.

  4. Patterns of Work and Family Involvement among Single and Dual Earner Couples: Two Competing Analytical Approaches.

    ERIC Educational Resources Information Center

    Yogev, Sara; Brett, Jeanne

    This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…

  5. A trajectory generation framework for modeling spacecraft entry in MDAO

    NASA Astrophysics Data System (ADS)

    D'Souza, Sarah N.; Sarigul-Klijn, Nesrin

    2016-04-01

    In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was made adaptable through the use of high-fidelity equations of motion and drag-based analytical bank profiles. Within this framework, a novel technique was implemented that resolves the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and a set of transition event conditions that are flight-feasible and implementable in a Generalized Entry Guidance algorithm.

  6. High-Performance Data Analytics Beyond the Relational and Graph Data Models with GEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Minutoli, Marco; Bhatt, Shreyansh

    Graphs represent an increasingly popular data model for data analytics, since they can naturally represent relationships and interactions between entities. Relational databases and their pure table-based data model are not well suited to storing and processing sparse data. Consequently, graph databases have gained interest in the last few years and the Resource Description Framework (RDF) became the standard data model for graph data. Nevertheless, while RDF is well suited to analyzing the relationships between the entities, it is not efficient in representing their attributes and properties. In this work we propose the adoption of a new hybrid data model, based on attributed graphs, that aims at overcoming the limitations of the pure relational and graph data models. We present how we have re-designed the GEMS data-analytics framework to fully take advantage of the proposed hybrid data model. To improve analysts' productivity, in addition to a C++ API for applications development, we adopt GraQL as the input query language. We validate our approach by implementing a set of queries on net-flow data and we compare our framework's performance against Neo4j. Experimental results show significant performance improvement over Neo4j, up to several orders of magnitude when increasing the size of the input data.
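
The attributed-graph idea can be illustrated with a toy structure (invented field names; this is not the GEMS internals or GraQL): vertices and edges each carry a property map, so attributes live on the graph itself rather than in separate relational tables.

```python
# Toy attributed graph over net-flow-like data (hypothetical schema).
graph = {
    "vertices": {
        "10.0.0.1": {"kind": "host", "os": "linux"},
        "10.0.0.7": {"kind": "host", "os": "windows"},
    },
    "edges": [
        {"src": "10.0.0.1", "dst": "10.0.0.7",
         "kind": "netflow", "bytes": 5120, "port": 443},
    ],
}

# example query: total bytes received per destination over port 443,
# filtering on edge attributes directly instead of joining tables
totals = {}
for e in graph["edges"]:
    if e["kind"] == "netflow" and e["port"] == 443:
        totals[e["dst"]] = totals.get(e["dst"], 0) + e["bytes"]
print(totals)
```

In a pure RDF model the `bytes` and `port` attributes would each become extra triples, which is exactly the representational overhead the hybrid model avoids.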

  7. Optimization of storage tank locations in an urban stormwater drainage system using a two-stage approach.

    PubMed

    Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris

    2017-12-15

    Storage is important for flood mitigation and non-point source pollution control. However, seeking a cost-effective design scheme for storage tanks is complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the storm water management model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains the preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method based on the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives. Different rainfall scenarios are considered to test its robustness. The results demonstrate that the optimal framework is feasible, and the optimization is fast based on the preliminary scheme. The optimized scheme is better than the preliminary scheme for reducing runoff and pollutant loads under a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme of storage tanks or low impact development (LID) controls. Copyright © 2017 Elsevier Ltd. All rights reserved.
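
The first-stage ranking step can be sketched as a weighted scoring of flooding nodes. Everything here is invented for illustration (the node data and the weights, which stand in for weights an AHP pairwise-comparison matrix would produce; this is a weighted-sum surrogate, not the paper's calibrated AHP):

```python
# Hypothetical node ranking by normalized flood depth and flood duration.
nodes = {
    "N1": {"depth_m": 0.8, "duration_h": 2.0},
    "N2": {"depth_m": 0.3, "duration_h": 5.0},
    "N3": {"depth_m": 1.2, "duration_h": 1.0},
}
weights = {"depth_m": 0.6, "duration_h": 0.4}   # assumed AHP-derived weights

def normalize(key):
    # scale each indicator by its maximum so the two are comparable
    hi = max(n[key] for n in nodes.values())
    return {name: n[key] / hi for name, n in nodes.items()}

norm = {k: normalize(k) for k in weights}
scores = {
    name: sum(weights[k] * norm[k][name] for k in weights) for name in nodes
}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)   # worst-flooded nodes first: candidates for storage tanks
```

The top-ranked nodes would then seed the preliminary scheme that the second-stage pattern search refines.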

  8. An Intrusion Detection System Based on Multi-Level Clustering for Hierarchical Wireless Sensor Networks

    PubMed Central

    Butun, Ismail; Ra, In-Ho; Sankar, Ravi

    2015-01-01

    In this work, an intrusion detection system (IDS) framework based on multi-level clustering for hierarchical wireless sensor networks is proposed. The framework employs two types of intrusion detection approaches: (1) “downward-IDS (D-IDS)” to detect the abnormal behavior (intrusion) of the subordinate (member) nodes; and (2) “upward-IDS (U-IDS)” to detect the abnormal behavior of the cluster heads. By using analytical calculations, the optimum parameters for the D-IDS (number of maximum hops) and U-IDS (monitoring group size) of the framework are evaluated and presented. PMID:26593915

  9. Microfluidic paper-based device for colorimetric determination of glucose based on a metal-organic framework acting as peroxidase mimetic.

    PubMed

    Ortiz-Gómez, Inmaculada; Salinas-Castillo, Alfonso; García, Amalia García; Álvarez-Bermejo, José Antonio; de Orbe-Payá, Ignacio; Rodríguez-Diéguez, Antonio; Capitán-Vallvey, Luis Fermín

    2017-12-13

    This work presents a microfluidic paper-based analytical device (μPAD) for glucose determination using a supported metal-organic framework (MOF) acting as a peroxidase mimic. The catalytic action of glucose oxidase (GOx) on glucose causes the formation of H₂O₂, and the MOF causes the oxidation of 3,3',5,5'-tetramethylbenzidine (TMB) by H₂O₂ to form a blue-green product with an absorption peak at 650 nm in the detection zone. A digital camera and a smartphone iOS app are used for the quantitation of glucose, with the S coordinate of the HSV color space as the analytical parameter. Different factors such as the concentration of TMB, GOx and MOF, pH and buffer, sample volume, reaction time and reagent position in the μPAD were optimized. Under optimal conditions, the value of the S coordinate increases linearly with glucose concentration up to 150 μmol·L⁻¹, with a 2.5 μmol·L⁻¹ detection limit. The μPAD remains stable for 21 days under conventional storage conditions. Such an enzyme mimetic-based assay for glucose determination, using the Fe-MIL-101 MOF implemented in a microfluidic paper-based device, possesses advantages over enzyme-based assays in terms of cost, durability and stability compared to other existing glucose determination methods. The procedure was applied to the determination of glucose in (spiked) serum and urine. Graphical abstract: Schematic representation of a microfluidic paper-based analytical device using a metal-organic framework as a peroxidase mimic for colorimetric glucose detection with digital camera or smartphone iOS app readout.
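
The HSV-based readout can be sketched with the standard RGB-to-HSV conversion. The calibration slope and intercept below are invented for illustration (the paper's calibration is empirical), and the RGB triple stands in for an averaged detection-zone reading:

```python
# Hedged sketch of the S-coordinate colorimetric readout (not the paper's app).
import colorsys

def s_coordinate(rgb):
    # averaged detection-zone color, scaled to [0, 1] for colorsys
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return s

# hypothetical linear calibration, valid only within the linear range
slope, intercept = 0.004, 0.05          # S units per (µmol/L), blank S value

def glucose_umol_per_l(rgb):
    return (s_coordinate(rgb) - intercept) / slope

reading = (90, 180, 160)                # invented blue-green detection-zone pixel
print(round(glucose_umol_per_l(reading), 1))
```

In practice the two calibration constants would be fitted from standards, and readings above the linear range (150 µmol/L here) would require dilution.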

  10. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.

  11. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high-performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework within Python-based ecosystems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented.
Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.

  12. Simulation and modeling of the temporal performance of path-based restoration schemes in planar mesh networks

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Manish; McCaughan, Leon; Olkhovets, Anatoli; Korotky, Steven K.

    2006-12-01

    We formulate an analytic framework for the restoration performance of path-based restoration schemes in planar mesh networks. We analyze various switch architectures and signaling schemes and model their total restoration interval. We also evaluate the network global expectation value of the time to restore a demand as a function of network parameters. We analyze a wide range of nominally capacity-optimal planar mesh networks and find our analytic model to be in good agreement with numerical simulation data.

  13. An Analytical Framework for the Steady State Impact of Carbonate Compensation on Atmospheric CO2

    NASA Astrophysics Data System (ADS)

    Omta, Anne Willem; Ferrari, Raffaele; McGee, David

    2018-04-01

    The deep-ocean carbonate ion concentration impacts the fraction of the marine calcium carbonate production that is buried in sediments. This gives rise to the carbonate compensation feedback, which is thought to restore the deep-ocean carbonate ion concentration on multimillennial timescales. We formulate an analytical framework to investigate the impact of carbonate compensation under various changes in the carbon cycle relevant for anthropogenic change and glacial cycles. Using this framework, we show that carbonate compensation amplifies by 15-20% the changes in atmospheric CO2 resulting from a redistribution of carbon between the atmosphere and ocean (e.g., due to changes in temperature, salinity, or nutrient utilization). A counterintuitive result emerges when the impact of organic matter burial in the ocean is examined. The organic matter burial first leads to a slight decrease in atmospheric CO2 and an increase in the deep-ocean carbonate ion concentration. Subsequently, enhanced calcium carbonate burial leads to outgassing of carbon from the ocean to the atmosphere, which is quantified by our framework. Results from simulations with a multibox model including the minor acids and bases important for the ocean-atmosphere exchange of carbon are consistent with our analytical predictions. We discuss the potential role of carbonate compensation in glacial-interglacial cycles as an example of how our theoretical framework may be applied.

  14. Progress towards and barriers to implementation of a risk framework for US federal wildland fire policy and decision making

    Treesearch

    David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert

    2011-01-01

    In this paper we review progress towards the implementation of a risk management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...

  15. The Climate Data Analytic Services (CDAS) Framework.

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2016-12-01

    Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a python script, a unix-like shell client, or a javascript-based web application. Client packages in python, scala, or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.

  16. The Analytic Onion: Examining Training Issues from Different Levels of Analysis. Interim Technical Paper for Period July 1989-June 1991.

    ERIC Educational Resources Information Center

    Lamb, Theodore A.; Chin, Keric B. O.

    This paper proposes a conceptual framework based on different levels of analysis using the metaphor of the layers of an onion to help organize and structure thinking on research issues concerning training. It discusses the core of the "analytic onion," the biological level, and seven levels of analysis that surround that core: the individual, the…

  17. A Strategy for Incorporating Learning Analytics into the Design and Evaluation of a K-12 Science Curriculum

    ERIC Educational Resources Information Center

    Monroy, Carlos; Rangel, Virginia Snodgrass; Whitaker, Reid

    2014-01-01

    In this paper, we discuss a scalable approach for integrating learning analytics into an online K-12 science curriculum. A description of the curriculum and the underlying pedagogical framework is followed by a discussion of the challenges to be tackled as part of this integration. We include examples of data visualization based on teacher usage…

  18. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis.

    PubMed

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions.

  19. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis

    PubMed Central

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156

  20. Medication information leaflets for patients: the further validation of an analytic linguistic framework.

    PubMed

    Clerehan, Rosemary; Hirsh, Di; Buchbinder, Rachelle

    2009-01-01

    While clinicians may routinely use patient information leaflets about drug therapy, a poorly conceived leaflet has the potential to do harm. We previously developed a novel approach to analysing leaflets about a rheumatoid arthritis drug, using an analytic method based on systemic functional linguistics. The aim of the present study was to verify the validity of the linguistic framework by applying it to two further arthritis drug leaflets. The findings confirmed the applicability of the framework and were used to refine it. A new stage or 'move' in the genre was identified. While the function of many of the moves appeared to be 'to instruct' the patient, the instruction was often unclear. The role relationships expressed in the text were critical to the meaning. As with our previous study, judged on their lexical density, the leaflets resembled academic text. The framework can provide specific tools to assess and produce medication information leaflets to support readers in taking medication. Future work could utilize the framework to evaluate information on other treatments and procedures or on healthcare information more widely.

  1. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    USGS Publications Warehouse

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen for ecosystem services to be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework.
Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and the use of ecosystem services in decision making.

  2. Video-Based Analyses of Motivation and Interaction in Science Classrooms

    NASA Astrophysics Data System (ADS)

    Moeller Andersen, Hanne; Nielsen, Birgitte Lund

    2013-04-01

    An analytical framework for examining students' motivation was developed and used for analyses of video excerpts from science classrooms. The framework was developed in an iterative process involving theories on motivation and video excerpts from a 'motivational event' where students worked in groups. Subsequently, the framework was used for an analysis of students' motivation in the whole class situation. A cross-case analysis was carried out illustrating characteristics of students' motivation dependent on the context. This research showed that students' motivation to learn science is stimulated by a range of different factors, with autonomy, relatedness and belonging apparently being the main sources of motivation. The teacher's combined use of questions, uptake and high level evaluation was very important for students' learning processes and motivation, especially students' self-efficacy. By coding and analysing video excerpts from science classrooms, we were able to demonstrate that the analytical framework helped us gain new insights into the effect of teachers' communication and other elements on students' motivation.

  3. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    PubMed

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis; that all job holders should participate in analysing their own workplace; and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  4. New York State energy-analytic information system: first-stage implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allentuck, J.; Carroll, O.; Fiore, L.

    1979-09-01

    So that energy policy by state government may be formulated within the constraints imposed by policy determined at the national level - yet reflect the diverse interests of its citizens - large quantities of data and sophisticated analytic capabilities are required. This report presents the design of an energy-information/analytic system for New York State, the data for a base year, 1976, and projections of these data. At the county level, 1976 energy supply-demand data and electric generating plant data are provided as well. Data-base management is based on System 2000. Three computerized models provide the system's basic analytic capacity. The Brookhaven Energy System Network Simulator provides an integrating framework, while a price-response model and a weather-sensitive energy demand model furnish a short-term energy response estimation capability. The operation of these computerized models is described. 62 references, 25 figures, 39 tables.

  5. DIVE: A Graph-based Visual Analytics Framework for Big Data

    PubMed Central

    Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie

    2014-01-01

    The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197

  6. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    NASA Technical Reports Server (NTRS)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. Drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments while taking into account risk, expected returns and inherent dependencies across NAS programs. The framework can be expanded to incorporate multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the FAA's entire investment program.
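
    The efficient risk-return frontier referred to above can be sketched with a standard mean-variance calculation. In the sketch below, the three "programs", their expected returns, and their covariances are hypothetical illustrations, not values from the paper:

```python
import numpy as np

# Hypothetical expected returns and covariance for three NAS programs
mu = np.array([0.06, 0.09, 0.12])           # expected returns
cov = np.array([[0.010, 0.002, 0.001],
                [0.002, 0.030, 0.004],
                [0.001, 0.004, 0.060]])     # return covariances

def frontier_point(target):
    """Minimum-variance weights achieving a target expected return
    (closed-form solution with only the two equality constraints:
    weights sum to 1, expected return equals target)."""
    inv = np.linalg.inv(cov)
    ones = np.ones(len(mu))
    A = ones @ inv @ ones
    B = ones @ inv @ mu
    C = mu @ inv @ mu
    det = A * C - B ** 2
    lam = (C - B * target) / det
    gam = (A * target - B) / det
    return inv @ (lam * ones + gam * mu)

for r in (0.07, 0.09, 0.11):
    w = frontier_point(r)
    risk = np.sqrt(w @ cov @ w)
    print(f"target {r:.2f}: weights {np.round(w, 3)}, risk {risk:.4f}")
```

    With realistic constraints (no short positions, budget categories), the problem becomes a quadratic program, but the closed form above is enough to trace the shape of the frontier.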

  7. A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.

    PubMed

    Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie

    2017-11-01

    The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. High Technology Service Value Maximization through an MCDM-Based Innovative e-Business Model

    NASA Astrophysics Data System (ADS)

    Huang, Chi-Yo; Tzeng, Gwo-Hshiung; Ho, Wen-Rong; Chuang, Hsiu-Tyan; Lue, Yeou-Feng

    The emergence of the Internet has changed high-technology marketing channels thoroughly in the past decade, and e-commerce has become one of the most efficient channels, through which high-technology firms may skip intermediaries and reach end customers directly. However, defining appropriate e-business models for commercializing new high-technology products or services through the Internet is not easy. To overcome the above-mentioned problems, a novel analytic framework based on the concept of high-technology customers' competence set expansion by leveraging high-technology service firms' capabilities and resources, as well as novel multiple criteria decision making (MCDM) techniques, is proposed in order to define an appropriate e-business model. An empirical study of a silicon intellectual property (SIP) commercialization e-business model based on MCDM techniques is provided to verify the effectiveness of this novel analytic framework. The analysis successfully assisted a Taiwanese IC design service firm in defining an e-business model for maximizing its customers' SIP transactions. In the future, the novel MCDM framework can be applied successfully to novel business model definitions in the high-technology industry.

  9. High-Contrast Gratings based Spoof Surface Plasmons

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Liangliang; Xu, Bingzheng; Ning, Pingping; Chen, Chen; Xu, Jia; Chen, Xinlei; Gu, Changqing; Qing, Quan

    2016-02-01

    In this work, we explore the existence of spoof surface plasmons (SSPs) supported by deep-subwavelength high-contrast gratings (HCGs) on a perfect electric conductor plane. The dispersion relation of the HCGs-based SSPs is derived analytically by combining multimode network theory with the rigorous mode matching method; it has nearly the same form as, and can be degenerated into, that of the SSPs arising from deep-subwavelength metallic gratings (MGs). Numerical simulations validate the analytical dispersion relation, and an effective medium approximation is also presented that yields the same analytical dispersion formula. This work sets up a unified theoretical framework for SSPs and opens up new vistas in surface plasmon optics.

  10. Organizational culture and organizational effectiveness: a meta-analytic investigation of the competing values framework's theoretical suppositions.

    PubMed

    Hartnell, Chad A; Ou, Amy Yi; Kinicki, Angelo

    2011-07-01

    We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial performance). The paper also tests theoretical suppositions undergirding the CVF by investigating the framework's nomological validity and proposed internal structure (i.e., interrelationships among culture types). Results based on data from 84 empirical studies with 94 independent samples indicate that clan, adhocracy, and market cultures are differentially and positively associated with the effectiveness criteria, though not always as hypothesized. The findings provide mixed support for the CVF's nomological validity and fail to support aspects of the CVF's proposed internal structure. We propose an alternative theoretical approach to the CVF and delineate directions for future research.
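
    The meta-analytic aggregation step underlying results like these can be sketched as a Fisher-z weighted average of study-level correlations. This is a generic fixed-effects sketch; the paper's actual procedure (and any corrections, e.g. for unreliability) may differ, and the study values below are hypothetical:

```python
import math

def meta_correlation(studies):
    """Fixed-effects mean correlation via Fisher's z transform.
    studies: list of (r, n) pairs - correlation and sample size."""
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)          # Fisher z transform of r
        w = n - 3                  # inverse-variance weight, var(z) = 1/(n-3)
        num += w * z
        den += w
    return math.tanh(num / den)    # back-transform mean z to r

# Hypothetical study results: (correlation, sample size)
studies = [(0.30, 120), (0.45, 80), (0.25, 200)]
print(round(meta_correlation(studies), 3))
```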

  11. Harmonizing community-based health worker programs for HIV: a narrative review and analytic framework.

    PubMed

    De Neve, Jan-Walter; Boudreaux, Chantelle; Gill, Roopan; Geldsetzer, Pascal; Vaikath, Maria; Bärnighausen, Till; Bossert, Thomas J

    2017-07-03

    Many countries have created community-based health worker (CHW) programs for HIV. In most of these countries, several national and non-governmental initiatives have been implemented, raising questions of how well these different approaches address health problems and use health resources in a compatible way. While these questions have led to a general policy initiative to promote harmonization across programs, there is a need for countries to develop a more coherent and organized approach to CHW programs and to generate evidence about the most efficient and effective strategies to ensure their optimal, sustained performance. We conducted a narrative review of the existing published and gray literature on the harmonization of CHW programs. We searched for and noted evidence on definitions, models, and/or frameworks of harmonization; theoretical arguments or hypotheses about the effects of CHW program fragmentation; and empirical evidence. Based on this evidence, we defined harmonization, introduced three priority areas for harmonization, and identified a conceptual framework for analyzing harmonization of CHW programs that can be used to support their expanding role in HIV service delivery. We identified and described the major issues and relationships surrounding the harmonization of CHW programs, including key characteristics, facilitators, and barriers for each of the priority areas of harmonization, and used our analytic framework to map overarching findings. In a separate article, we apply this approach to CHW programs supporting HIV services across four countries in Southern Africa. There is a large number and immense diversity of CHW programs for HIV, including the integration of HIV components into countries' existing national programs along with the development of multiple, stand-alone CHW programs.
We defined (i) coordination among stakeholders, (ii) integration into the broader health system, and (iii) assurance of a CHW program's sustainability as priority areas of harmonization. While harmonization is likely a complex political process, with, in many cases, incremental steps toward improvement, a wide range of facilitators are available to decision-makers. These can be categorized using an analytic framework assessing the (i) health issue, (ii) intervention itself, (iii) stakeholders, (iv) health system, and (v) broad context. There is a need to address fragmentation of CHW programs to advance and sustain CHW roles and responsibilities for HIV. This study provides a narrative review and analytic framework to understand the process by which harmonization of CHW programs might be achieved and to test the assumption that harmonization is needed to improve CHW performance.

  12. Analysing task design and students' responses to context-based problems through different analytical frameworks

    NASA Astrophysics Data System (ADS)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks, as well as students' responses to exemplary tasks, to identify a suitable framework for the future design and analysis of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) in the Natural Science Programme, i.e. possible future scientists, were investigated to explore learning outcomes when they solve chemistry tasks, both conventional and context-based chemistry problems. Design and methods: A typical chemistry examination test was analysed, first the test items themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using the HOCS/LOCS and MHC-C frameworks was applied to both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable for identifying students' strategies, which mainly focus on recall of memorized facts when solving chemistry test items. Almost all test items also assessed lower-order thinking.
Combining the frameworks with the chemistry syllabus proved successful for analysing both the test items and students' responses in a systematic way. The framework can therefore be applied in the design of new tasks, in the analysis and assessment of students' responses, and as a tool for teachers to scaffold students in their problem-solving process. Conclusions: This paper offers implications for practice and future research, both for developing new context-based problems in a structured way and for providing analytical tools to investigate students' higher-order thinking in their responses to these tasks.

  13. A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.

    2015-12-01

    Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as Celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as Celery or Spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in Python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a Python script or application, or a JavaScript-based web application. Client packages in Python or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits.
It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and examine variability.
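
    As a rough illustration of how a client might invoke such a service, the sketch below builds a WPS 1.0 key-value Execute request. The endpoint URL, operation identifier, and input names are hypothetical placeholders, since the abstract does not specify them:

```python
from urllib.parse import urlencode

# Hypothetical CDAS endpoint; the real service URL and operation
# identifiers are deployment-specific and not given in the abstract.
CDAS_URL = "https://example.nasa.gov/cdas/wps"

def wps_execute_url(identifier, inputs):
    """Build a WPS 1.0 key-value-pair Execute request for a server-side
    analytic, e.g. a time-average kernel applied to one variable."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    query = urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "datainputs": data_inputs,
    })
    return f"{CDAS_URL}?{query}"

url = wps_execute_url("cdas.average", {
    "variable": "tas",                 # e.g. surface air temperature
    "domain": "lat:-90:90,lon:0:360",  # hypothetical domain syntax
    "axes": "time",                    # average over the time axis
})
print(url)
```

    The point of the server-side design is that only this small request, not the multi-terabyte reanalysis data, crosses the network.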

  14. Assessing sustainability effect of infrastructure transportation projects using systems-based analytic framework.

    DOT National Transportation Integrated Search

    2015-07-01

    Sustainability means providing for the necessities of today without endangering the necessities of tomorrow within the technical, environmental, economic, social/cultural, and individual contexts. However, the assessment tools available to study the ...

  15. Industrial Internet of Things-Based Collaborative Sensing Intelligence: Framework and Research Challenges.

    PubMed

    Chen, Yuanfang; Lee, Gyu Myoung; Shu, Lei; Crespi, Noel

    2016-02-06

    The development of an efficient and cost-effective solution to a complex problem (e.g., dynamic detection of toxic gases) is an important research issue in industrial applications of the Internet of Things (IoT). An industrial intelligent ecosystem enables the collection of massive data from various devices (e.g., sensor-embedded wireless devices) dynamically collaborating with humans. Effective collaborative analytics based on the massive data collected from humans and devices is essential for improving the efficiency of industrial production and services. In this study, we propose a collaborative sensing intelligence (CSI) framework, combining collaborative intelligence and industrial sensing intelligence. The proposed CSI facilitates cooperative analytics by integrating massive spatio-temporal data from different sources and time points. The key challenges and open issues in deploying the CSI to achieve intelligent and efficient industrial production and services are also discussed.

  16. Industrial Internet of Things-Based Collaborative Sensing Intelligence: Framework and Research Challenges

    PubMed Central

    Chen, Yuanfang; Lee, Gyu Myoung; Shu, Lei; Crespi, Noel

    2016-01-01

    The development of an efficient and cost-effective solution to a complex problem (e.g., dynamic detection of toxic gases) is an important research issue in industrial applications of the Internet of Things (IoT). An industrial intelligent ecosystem enables the collection of massive data from various devices (e.g., sensor-embedded wireless devices) dynamically collaborating with humans. Effective collaborative analytics based on the massive data collected from humans and devices is essential for improving the efficiency of industrial production and services. In this study, we propose a collaborative sensing intelligence (CSI) framework, combining collaborative intelligence and industrial sensing intelligence. The proposed CSI facilitates cooperative analytics by integrating massive spatio-temporal data from different sources and time points. The key challenges and open issues in deploying the CSI to achieve intelligent and efficient industrial production and services are also discussed. PMID:26861345

  17. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code.
Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature of programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative or rote solution approaches, which can be influenced by the problem representation. Third, a set of solution approaches---many of which were identified in this study---describes what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices. Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.

  18. 77 FR 33683 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security, U.S. Customs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-07

    ... Border Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY... Framework for Intelligence (AFI) System of Records'' and this proposed rulemaking. In this proposed... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records.'' AFI enhances DHS's...

  19. Analytical Overview of the European and Russian Qualifications Frameworks with a Focus on Doctoral Degree Level

    ERIC Educational Resources Information Center

    Chigisheva, Oksana; Bondarenko, Anna; Soltovets, Elena

    2017-01-01

    The paper provides analytical insights into highly acute issues concerning preparation and adoption of Qualifications Frameworks being an adequate response to the growing interactions at the global labor market and flourishing of knowledge economy. Special attention is paid to the analyses of transnational Meta Qualifications Frameworks (A…

  20. Degrees of School Democracy: A Holistic Framework

    ERIC Educational Resources Information Center

    Woods, Philip A.; Woods, Glenys J.

    2012-01-01

    This article outlines an analytical framework that enables analysis of degrees of democracy in a school or other organizational setting. It is founded in a holistic conception of democracy, which is a model of working together that aspires to truth, goodness, and meaning and the participation of all. We suggest that the analytical framework can be…

  1. Two logics of policy intervention in immigrant integration: an institutionalist framework based on capabilities and aspirations.

    PubMed

    Lutz, Philipp

    2017-01-01

    The effectiveness of immigrant integration policies has gained considerable attention across Western democracies dealing with ethnically and culturally diverse societies. However, the findings on what type of policy produces more favourable integration outcomes remain inconclusive. The conflation of normative and analytical assumptions on integration is a major challenge for causal analysis of integration policies. This article applies actor-centered institutionalism as a new framework for the analysis of immigrant integration outcomes in order to separate two different mechanisms of policy intervention. Conceptualising integration outcomes as a function of capabilities and aspirations allows us to separate assumptions about policy intervention in assimilation and multiculturalism, the two main types of policy approaches. The article illustrates that assimilation is an incentive-based policy and primarily designed to increase immigrants' aspirations, whereas multiculturalism is an opportunity-based policy and primarily designed to increase immigrants' capabilities. Conceptualising causal mechanisms of policy intervention clarifies the link between normative concepts of immigrant integration and analytical concepts of policy effectiveness.

  2. Development of an analytical model for estimating global terrestrial carbon assimilation using a rate-limitation framework

    NASA Astrophysics Data System (ADS)

    Donohue, Randall; Yang, Yuting; McVicar, Tim; Roderick, Michael

    2016-04-01

    A fundamental question in climate and ecosystem science is "how does climate regulate the land surface carbon budget?" To better answer that question, here we develop an analytical model for estimating mean annual terrestrial gross primary productivity (GPP), which is the largest carbon flux over land, based on a rate-limitation framework. Actual GPP (climatological mean from 1982 to 2010) is calculated as a function of the balance between two GPP potentials defined by the climate (i.e., precipitation and solar radiation) and a third parameter that encodes other environmental variables and modifies the GPP-climate relationship. The developed model was tested at three spatial scales using different GPP sources, i.e., (1) observed GPP from 94 flux-sites, (2) modelled GPP (using the model-tree-ensemble approach) at 48654 (0.5 degree) grid-cells and (3) at 32 large catchments across the globe. Results show that the proposed model could account for the spatial GPP patterns, with a root-mean-square error of 0.70, 0.65 and 0.3 g C m-2 d-1 and R2 of 0.79, 0.92 and 0.97 for the flux-site, grid-cell and catchment scales, respectively. This analytical GPP model shares a similar form with the Budyko hydroclimatological model, which opens the possibility of a general analytical framework to analyze the linked carbon-water-energy cycles.
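
    The "balance between two potentials" described above resembles the generalized Budyko curve that the abstract itself draws a parallel to. A minimal sketch under that assumption follows; the exact functional form and the parameter value are illustrative, not taken from the paper:

```python
def gpp(gpp_p, gpp_r, n=2.0):
    """Budyko-style balance of a precipitation-defined GPP potential
    (gpp_p) and a radiation-defined potential (gpp_r); n stands in for
    the third parameter encoding other environmental modifiers of the
    GPP-climate relationship. All values in g C m^-2 d^-1."""
    return gpp_p * gpp_r / (gpp_p ** n + gpp_r ** n) ** (1.0 / n)

# Actual GPP approaches the smaller potential when one resource is
# strongly limiting, and sits below both when they are comparable.
print(gpp(2.0, 10.0))   # water-limited: close to, but below, 2.0
print(gpp(5.0, 5.0))    # co-limited: well below either potential
```

    The form is symmetric in the two potentials and never exceeds either of them, which is what makes a single curve usable across wet, dry, and intermediate sites.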

  3. The path dependency theory: analytical framework to study institutional integration. The case of France.

    PubMed

    Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique

    2010-06-30

    The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France.

  4. Analytical approach for collective diffusion: One-dimensional lattice with the nearest neighbor and the next nearest neighbor lateral interactions

    NASA Astrophysics Data System (ADS)

    Tarasenko, Alexander

    2018-01-01

    Diffusion of particles adsorbed on a homogeneous one-dimensional lattice is investigated using a theoretical approach and MC simulations. The analytical dependencies calculated in the framework of this approach are tested against the numerical data. The perfect coincidence of the data obtained by these different methods demonstrates the correctness of the approach, which is based on the theory of the non-equilibrium statistical operator.
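
    A minimal Monte Carlo sketch of such a lattice-gas simulation is shown below. It keeps only hard-core site exclusion and omits the paper's nearest-neighbor and next-nearest-neighbor lateral interactions, which would enter as Metropolis acceptance factors on each hop:

```python
import random

def simulate(L=200, n_particles=50, steps=20000, seed=1):
    """Monte Carlo hops of hard-core particles on a periodic 1D lattice.
    Simplified sketch: site exclusion only, no lateral interactions.
    Returns the ensemble-averaged tracer mean-square displacement
    (sublinear in time for single-file geometry, where particles
    cannot pass each other)."""
    rng = random.Random(seed)
    pos = rng.sample(range(L), n_particles)   # occupied sites
    occupied = set(pos)
    disp = [0] * n_particles                  # unwrapped displacements
    for _ in range(steps):
        i = rng.randrange(n_particles)        # pick a random particle
        d = rng.choice((-1, 1))               # pick a hop direction
        target = (pos[i] + d) % L
        if target not in occupied:            # hard-core exclusion
            occupied.remove(pos[i])
            occupied.add(target)
            pos[i] = target
            disp[i] += d
    return sum(x * x for x in disp) / n_particles

print(simulate())
```

    Collective (chemical) diffusion, the paper's subject, is extracted from density-fluctuation relaxation rather than from tracer displacements, but the hop-and-exclude kernel above is the common core of both measurements.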

  5. 8D likelihood effective Higgs couplings extraction framework in h → 4ℓ

    DOE PAGES

    Chen, Yi; Di Marco, Emanuele; Lykken, Joe; ...

    2015-01-23

    We present an overview of a comprehensive analysis framework aimed at performing direct extraction of all possible effective Higgs couplings to neutral electroweak gauge bosons in the decay to electrons and muons, the so-called 'golden channel'. Our framework is based primarily on a maximum likelihood method constructed from analytic expressions of the fully differential cross sections for h → 4ℓ and for the dominant irreducible qq̄ → 4ℓ background, where 4ℓ = 2e2μ, 4e, 4μ. Detector effects are included by an explicit convolution of these analytic expressions with the appropriate transfer function over all center-of-mass variables. Utilizing the full set of observables, we construct an unbinned detector-level likelihood which is continuous in the effective couplings. We consider possible ZZ, Zγ, and γγ couplings simultaneously, allowing for general CP odd/even admixtures. A broad overview is given of how the convolution is performed, and we discuss the principles and theoretical basis of the framework. This framework can be used in a variety of ways to study Higgs couplings in the golden channel using data obtained at the LHC and other future colliders.

  6. Risk-based evaluation of commercial motor vehicle roadside violations : process and results

    DOT National Transportation Integrated Search

    2010-09-01

    This report provides an analytic framework for evaluating the Atlanta Congestion Reduction Demonstration (CRD) under the United States Department of Transportation (U.S. DOT) Urban Partnership Agreement (UPA) and CRD Programs. The Atlanta CRD project...

  7. An Analytical Impact Assessment Framework for Wildlife to Inform the Siting and Permitting of Wind Energy Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Jesse D.M.

    In the United States overall electrical generation capacity is expected to increase by 10-25 gigawatts (GW) per year to meet increases in demand. Wind energy is a key component of state and federal renewable energy standards, and central to the Department of Energy's 20% by 2030 wind production goals. Increased wind energy development may present increased resource conflict with avian wildlife, and environmental permitting has been identified as a potential obstacle to expansion in the sector. ICF developed an analytical framework to help applicants and agencies examine potential impacts in support of facility siting and permitting. A key objective of our work was to develop a framework that is scalable from the local to the national level, and one that is generalizable across the different scales at which biological communities operate, from local influences to meta-populations. The intent was to allow natural resource managers to estimate the cumulative impacts of turbine strikes and habitat changes on long-term population performance in the context of a species' demography, genetic potential, and life history. We developed three types of models based on our literature review and participation in the scientific review processes. First, the conceptual model was developed as a general description of the analytical framework. Second, we developed the analytical framework based on the relationships between concepts, and the functions presented in the scientific literature. Third, we constructed an application of the model by parameterizing the framework using data from and relevant to the Altamont Pass Wind Resource Area (APWRA), and an existing golden eagle population model. We developed managed source code, database create statements, and written documentation to allow for the reproduction of each phase of the analysis.
ICF identified a potential template adaptive management system in the form of the US Fish & Wildlife Service (USFWS) Adaptive Harvest Management (AHM) program, and developed recommendations for the structure and function of a similar wind-facility related program. We provided a straw-man implementation of the analytical framework based on assumptions for APWRA-wide golden eagle fatalities, and presented a statistical examination of the model performance. APWRA-wide fatality rates appear substantial at all scales examined, from the local APWRA population to the Bird Conservation Region. Documented fatality rates significantly influenced population performance in terms of non-territorial non-breeding birds. Breeder, Juvenile, Subadult, and Adult abundance were mostly unaffected by Baseline APWRA-wide fatality rates. However, increased variability in fatality rates would likely have impacts on long-term population performance, and would result in a substantially larger loss of resources. We developed four recommendations for future study. First, we recommend establishment of concept experts through the existing system of non-profits, regulatory agencies, academia, and industry in the wind energy sector. Second, we recommend the development of a central or distributed shared data repository, and the establishment of guidelines for data sharing and transparency. Third, we recommend development of a forum and process for model selection at the local and national level. Last, we recommend experimental implementation of the prescribed system at broader scales, and refinement of the expectations for modeling and adaptive management.

  8. Branch length estimation and divergence dating: estimates of error in Bayesian and maximum likelihood frameworks.

    PubMed

    Schwartz, Rachel S; Mueller, Rachel L

    2010-01-11

    Estimates of divergence dates between species improve our understanding of processes ranging from nucleotide substitution to speciation. Such estimates are frequently based on molecular genetic differences between species; therefore, they rely on accurate estimates of the number of such differences (i.e. substitutions per site, measured as branch length on phylogenies). We used simulations to determine the effects of dataset size, branch length heterogeneity, branch depth, and analytical framework on branch length estimation across a range of branch lengths. We then reanalyzed an empirical dataset for plethodontid salamanders to determine how inaccurate branch length estimation can affect estimates of divergence dates. The accuracy of branch length estimation varied with branch length, dataset size (both number of taxa and sites), branch length heterogeneity, branch depth, dataset complexity, and analytical framework. For simple phylogenies analyzed in a Bayesian framework, branches were increasingly underestimated as branch length increased; in a maximum likelihood framework, longer branch lengths were somewhat overestimated. Longer datasets improved estimates in both frameworks; however, when the number of taxa was increased, estimation accuracy for deeper branches was less than for tip branches. Increasing the complexity of the dataset produced more misestimated branches in a Bayesian framework; however, in an ML framework, more branches were estimated more accurately. Using ML branch length estimates to re-estimate plethodontid salamander divergence dates generally resulted in an increase in the estimated age of older nodes and a decrease in the estimated age of younger nodes. Branch lengths are misestimated in both statistical frameworks for simulations of simple datasets. However, for complex datasets, length estimates are quite accurate in ML (even for short datasets), whereas few branches are estimated accurately in a Bayesian framework. 
Our reanalysis of empirical data demonstrates the magnitude of effects of Bayesian branch length misestimation on divergence date estimates. Because the length of branches for empirical datasets can be estimated most reliably in an ML framework when branches are <1 substitution/site and datasets are ≥1 kb, we suggest that divergence date estimates using datasets, branch lengths, and/or analytical techniques that fall outside of these parameters should be interpreted with caution.
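
    The abstract's notion of branch length as substitutions per site can be made concrete with the simplest distance correction. The sketch below is not the study's Bayesian/ML machinery; it is the standard Jukes-Cantor correction, which maps an observed proportion of differing sites to an estimated number of substitutions per site, accounting for multiple hits at a site:

```python
import math

def jukes_cantor_distance(p):
    """Estimated substitutions per site from the observed proportion p of
    differing sites, under the Jukes-Cantor model: d = -(3/4) ln(1 - 4p/3).
    The estimate saturates (d -> infinity) as p approaches 3/4."""
    if not 0 <= p < 0.75:
        raise ValueError("p must be in [0, 0.75) under Jukes-Cantor")
    return -0.75 * math.log(1 - 4 * p / 3)

# A hypothetical 1 kb alignment with 100 differing sites: observed p = 0.1,
# but the corrected branch length is slightly longer than 0.1 because some
# sites have been hit more than once.
d = jukes_cantor_distance(100 / 1000)
```

The correction grows rapidly for long branches, which is one reason estimates for branches near or above 1 substitution/site are fragile.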

  9. Exploring the Argumentation Pattern in Modeling-Based Learning about Apparent Motion of Mars

    ERIC Educational Resources Information Center

    Park, Su-Kyeong

    2016-01-01

    This study proposed an analytic framework for coding students' dialogic argumentation and investigated the characteristics of the small-group argumentation pattern observed in modeling-based learning. The participants were 122 second grade high school students in South Korea divided into an experimental and a comparison group. Modeling-based…

  10. Trajectory-Oriented Approach to Managing Traffic Complexity: Operational Concept and Preliminary Metrics Definition

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Vivona, Robert; Garcia-Chico, Jose L.

    2008-01-01

    This document describes preliminary research on a distributed, trajectory-oriented approach for traffic complexity management. The approach is to manage traffic complexity in a distributed control environment, based on preserving trajectory flexibility and minimizing constraints. In particular, the document presents an analytical framework to study trajectory flexibility and the impact of trajectory constraints on it. The document proposes preliminary flexibility metrics that can be interpreted and measured within the framework.

  11. Predicted range expansion of Chinese tallow tree (Triadica sebifera) in forestlands of the southern United States

    Treesearch

    Hsiao-Hsuan Wang; William Grant; Todd Swannack; Jianbang Gan; William Rogers; Tomasz Koralewski; James Miller; John W. Taylor Jr.

    2011-01-01

    We present an integrated approach for predicting future range expansion of an invasive species (Chinese tallow tree) that incorporates statistical forecasting and analytical techniques within a spatially explicit, agent-based, simulation framework.

  12. Network analytics for adverse outcome pathways

    EPA Science Inventory

    Adverse Outcome Pathways (AOPs) organize toxicological knowledge from the molecular level up to the population level, providing evidence-based causal linkages at each step. The AOPWiki serves as a repository of AOPs. With the international adoption of the AOP framework, the AOPw...

  13. Manuscript 116 Mechanisms: DNA Reactive Agents

    EPA Science Inventory

    ABSTRACT The U.S. Environmental Protection Agency’s Guidelines for Carcinogen Risk Assessment (2005) uses an analytical framework for conducting a quantitative cancer risk assessment that is based on mode of action/key events and human relevance. The approach stresses the enh...

  14. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.

  15. A Network Method of Measuring Affiliation-Based Peer Influence: Assessing the Influences of Teammates' Smoking on Adolescent Smoking

    ERIC Educational Resources Information Center

    Fujimoto, Kayo; Unger, Jennifer B.; Valente, Thomas W.

    2012-01-01

    Using a network analytic framework, this study introduces a new method to measure peer influence based on adolescents' affiliations or 2-mode social network data. Exposure based on affiliations is referred to as the "affiliation exposure model." This study demonstrates the methodology using data on young adolescent smoking being influenced by…
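
    A minimal sketch of the affiliation exposure idea, using made-up data and assuming the common formulation in which an adolescent's exposure is the co-affiliation-weighted share of smokers among their teammates (the paper's exact weighting may differ):

```python
import numpy as np

# 2-mode (adolescent x team) affiliation matrix B and a smoking indicator;
# all values are invented for illustration.
B = np.array([
    [1, 0],   # adolescent 0 is on team 0
    [1, 0],
    [1, 1],   # adolescent 2 is on both teams
    [0, 1],
])
smoking = np.array([1, 0, 0, 1])

# Co-affiliation matrix: entry (i, j) counts teams shared by i and j;
# the diagonal (self-affiliation) is zeroed out.
M = B @ B.T
np.fill_diagonal(M, 0)

# Affiliation exposure: weighted proportion of an adolescent's
# co-affiliates who smoke.
exposure = (M @ smoking) / M.sum(axis=1)
```

For example, adolescent 2 shares teams with everyone, two of whom smoke, giving an exposure of 2/3.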

  16. A hierarchical framework for investigating epiphyte assemblages: networks, meta-communities, and scale.

    PubMed

    Burns, K C; Zotz, G

    2010-02-01

    Epiphytes are an important component of many forested ecosystems, yet our understanding of epiphyte communities lags far behind that of terrestrial-based plant communities. This discrepancy is exacerbated by the lack of a theoretical context to assess patterns in epiphyte community structure. We attempt to fill this gap by developing an analytical framework to investigate epiphyte assemblages, which we then apply to a data set on epiphyte distributions in a Panamanian rain forest. On a coarse scale, interactions between epiphyte species and host tree species can be viewed as bipartite networks, similar to pollination and seed dispersal networks. On a finer scale, epiphyte communities on individual host trees can be viewed as meta-communities, or suites of local epiphyte communities connected by dispersal. Similar analytical tools are typically employed to investigate species interaction networks and meta-communities, thus providing a unified analytical framework to investigate coarse-scale (network) and fine-scale (meta-community) patterns in epiphyte distributions. Coarse-scale analysis of the Panamanian data set showed that most epiphyte species interacted with fewer host species than expected by chance. Fine-scale analyses showed that epiphyte species richness on individual trees was lower than null model expectations. Therefore, epiphyte distributions were clumped at both scales, perhaps as a result of dispersal limitations. Scale-dependent patterns in epiphyte species composition were observed. Epiphyte-host networks showed evidence of negative co-occurrence patterns, which could arise from adaptations among epiphyte species to avoid competition for host species, while most epiphyte meta-communities were distributed at random. Application of our "meta-network" analytical framework in other locales may help to identify general patterns in the structure of epiphyte assemblages and their variation in space and time.

  17. Authentic Oral Language Production and Interaction in CALL: An Evolving Conceptual Framework for the Use of Learning Analytics within the SpeakApps Project

    ERIC Educational Resources Information Center

    Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine

    2014-01-01

    This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…

  18. Organisational Role Climates: Success-Failure Configurations in Educational Leadership (or Are Educational Administrators Doomed to Succeed?)

    ERIC Educational Resources Information Center

    Inbar, Dan E.

    1980-01-01

    Presents an analytical framework based on a threefold classification--unequivocal failure, "satisficing," and unequivocal success--and four basic role climates--apathetic, frustrating, tense, and tranquil--that is applied to the elementary school principalship. (Author/WD)

  19. Analytical modeling of the dynamics of brushless dc motors for aerospace applications: A conceptual framework

    NASA Technical Reports Server (NTRS)

    Demerdash, N. A. O.

    1976-01-01

    The modes of operation of the brushless d.c. machine and its corresponding characteristics (current flow, torque-position, etc.) are presented. The foundations and basic principles on which the preliminary numerical model is based, are discussed.

  20. Teachers' Learning While Constructing Technology-Based Instructional Resources

    ERIC Educational Resources Information Center

    Polly, Drew

    2011-01-01

    Grounded in a constructionist paradigm, this study examined elementary school teachers' learning while creating technology-rich instructional materials. Sixteen teachers at an elementary school were interviewed about their experience. Using the components of Technological Pedagogical and Content Knowledge as an analytical framework, inductive…

  1. Framework for assessing key variable dependencies in loose-abrasive grinding and polishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, J.S.; Aikens, D.M.; Brown, N.J.

    1995-12-01

    This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.

  2. Using Analytics to Transform a Problem-Based Case Library: An Educational Design Research Approach

    ERIC Educational Resources Information Center

    Schmidt, Matthew; Tawfik, Andrew A.

    2018-01-01

    This article describes the iterative design, development, and evaluation of a case-based learning environment focusing on an ill-structured sales management problem. We discuss our processes and situate them within the broader framework of educational design research. The learning environment evolved over the course of three design phases. A…

  3. Confronting Analytical Dilemmas for Understanding Complex Human Interactions in Design-Based Research from a Cultural-Historical Activity Theory (CHAT) Framework

    ERIC Educational Resources Information Center

    Yamagata-Lynch, Lisa C.

    2007-01-01

    Understanding human activity in real-world situations often involves complicated data collection, analysis, and presentation methods. This article discusses how Cultural-Historical Activity Theory (CHAT) can inform design-based research practices that focus on understanding activity in real-world situations. I provide a sample data set with…

  4. The added value of thorough economic evaluation of telemedicine networks.

    PubMed

    Le Goff-Pronost, Myriam; Sicotte, Claude

    2010-02-01

    This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
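
    The NPV stage of such a framework can be sketched with illustrative numbers; the cash flows and discount rate below are invented, not taken from the study, but they reproduce the qualitative finding that a network unprofitable over a short study window can become viable over an extended horizon:

```python
def npv(rate, cash_flows):
    """Net present value of cash_flows[t] received at the end of year t
    (t = 0 is the initial, undiscounted outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical telemedicine network: upfront investment of 100k, then a
# net benefit of 20k per year, discounted at 5%.
four_years = npv(0.05, [-100_000] + [20_000] * 4)    # negative: threshold not reached
eight_years = npv(0.05, [-100_000] + [20_000] * 8)   # positive: viable over longer use
```

The same function could be extended with the framework's other steps (break-even threshold, external effects) by adjusting the cash-flow series.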

  5. Time and frequency domain characteristics of detrending-operation-based scaling analysis: Exact DFA and DMA frequency responses

    NASA Astrophysics Data System (ADS)

    Kiyono, Ken; Tsujimoto, Yutaka

    2016-07-01

    We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
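
    As a concrete reference point for the detrending operation analyzed in this work, a minimal DFA implementation (ordinary least-squares polynomial detrending in non-overlapping windows; this sketch does not reproduce the paper's analytical frequency-response treatment) might look like:

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: for each window length n in `scales`,
    compute the root-mean-square deviation F(n) of the integrated signal
    around a local polynomial trend of the given order."""
    y = np.cumsum(x - np.mean(x))            # integrated (profile) series
    F = []
    for n in scales:
        msq = []
        for w in range(len(y) // n):         # non-overlapping windows
            seg = y[w * n:(w + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)

# For white noise the scaling exponent alpha (slope of log F vs log n)
# should be close to 0.5.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

The paper's contribution is precisely the exact frequency response of the detrending step inside this loop, which connects F(n) to the power spectrum.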

  6. Time and frequency domain characteristics of detrending-operation-based scaling analysis: Exact DFA and DMA frequency responses.

    PubMed

    Kiyono, Ken; Tsujimoto, Yutaka

    2016-07-01

    We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.

  7. Modelling vortex-induced fluid-structure interaction.

    PubMed

    Benaroya, Haym; Gabbai, Rene D

    2008-04-13

    The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators, one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's-principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails (i) formulating generalized equations of motion, as a superset of the flow-oscillator models, and (ii) developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations. This will allow modelling of multiple degree-of-freedom systems. The extensions derived generalize Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model. Based on different assumptions, one can derive a variety of flow-oscillator models.

  8. Sandplay therapy with couples within the framework of analytical psychology.

    PubMed

    Albert, Susan Carol

    2015-02-01

    Sandplay therapy with couples is discussed within an analytical framework. Guidelines are proposed as a means of developing this relatively new area within sandplay therapy, and as a platform to open a wider discussion to bring together sandplay therapy and couple therapy. Examples of sand trays created during couple therapy are also presented to illustrate the transformations during the therapeutic process. © 2015, The Society of Analytical Psychology.

  9. The path dependency theory: analytical framework to study institutional integration. The case of France

    PubMed Central

    Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique

    2010-01-01

    Background The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. Purpose PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. Methods A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Results Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Conclusion Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France. PMID:20689740

  10. Optimal Medical Equipment Maintenance Service Proposal Decision Support System combining Activity Based Costing (ABC) and the Analytic Hierarchy Process (AHP).

    PubMed

    da Rocha, Leticia; Sloane, Elliot; M Bassani, Jose

    2005-01-01

    This study describes a framework to support the choice of the maintenance service (in-house or third party contract) for each category of medical equipment based on: a) the real medical equipment maintenance management system currently used by the biomedical engineering group of the public health system of the Universidade Estadual de Campinas located in Brazil to control the medical equipment maintenance service, b) the Activity Based Costing (ABC) method, and c) the Analytic Hierarchy Process (AHP) method. Results show the cost and performance related to each type of maintenance service. Decision-makers can use these results to evaluate possible strategies for the categories of equipment.

  11. Safety risk assessment using analytic hierarchy process (AHP) during planning and budgeting of construction projects.

    PubMed

    Aminbakhsh, Saman; Gunduz, Murat; Sonmez, Rifat

    2013-09-01

    The inherent and unique risks on construction projects quite often present key challenges to contractors. Health and safety risks are among the most significant risks in construction projects since the construction industry is characterized by a relatively high injury and death rate compared to other industries. In construction project management, safety risk assessment is an important step toward identifying potential hazards and evaluating the risks associated with the hazards. Adequate prioritization of safety risks during risk assessment is crucial for planning, budgeting, and management of safety related risks. In this paper, a safety risk assessment framework is presented based on the theory of cost of safety (COS) model and the analytic hierarchy process (AHP). The main contribution of the proposed framework is that it presents a robust method for prioritization of safety risks in construction projects to create a rational budget and to set realistic goals without compromising safety. The framework provides a decision tool for the decision makers to determine the adequate accident/injury prevention investments while considering the funding limits. The proposed safety risk framework is illustrated using a real-life construction project and the advantages and limitations of the framework are discussed. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
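
    The AHP prioritization step described above can be sketched as follows. The 3x3 pairwise comparison matrix below is invented for illustration (it is not the paper's data): priority weights are the normalized principal eigenvector, and the consistency ratio checks that the expert judgments are acceptably coherent before the weights are used for budgeting:

```python
import numpy as np

# Hypothetical pairwise comparisons of three safety risk categories
# (e.g. falls, struck-by, electrocution) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights: principal right eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), compared with the
# random index RI (0.58 for n = 3); CR < 0.1 is conventionally acceptable.
n = A.shape[0]
lambda_max = eigvals[k].real
CR = ((lambda_max - n) / (n - 1)) / 0.58
```

In a COS-based framework, the resulting weights would feed the allocation of a fixed prevention budget across the risk categories.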

  12. The Aggregate Exposure Pathway (AEP): A conceptual framework for advancing exposure science research and applications

    EPA Science Inventory

    Historically, risk assessment has relied upon toxicological data to obtain hazard-based reference levels, which are subsequently compared to exposure estimates to determine whether an unacceptable risk to public health may exist. Recent advances in analytical methods, biomarker ...

  13. Parameter Estimation of Computationally Expensive Watershed Models Through Efficient Multi-objective Optimization and Interactive Decision Analytics

    NASA Astrophysics Data System (ADS)

    Akhtar, Taimoor; Shoemaker, Christine

    2016-04-01

    Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating that no single optimal parameterization exists. Hence, many experts prefer a manual approach to calibration, where the inherent multi-objective nature of the problem is addressed through an interactive, subjective, time-intensive, and complex decision-making process. Multi-objective optimization can efficiently identify multiple plausible calibration alternatives and assist calibration experts during parameter estimation. However, two key challenges limit its use in the parameter estimation process: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selecting one from the numerous calibration alternatives it provides is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration that specifically addresses these challenges. HAMS employs a three-stage framework for parameter estimation. Stage 1 uses an efficient surrogate multi-objective algorithm, GOMORS, to identify numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS lies in Stages 2 and 3, where an interactive visual and metric-based analytics framework serves as a decision support tool for choosing a single calibration from the numerous alternatives identified in Stage 1. Stage 2 provides a goodness-of-fit, metric-based interactive framework for identifying a small subset (typically fewer than 10) of meaningful and diverse calibration alternatives from those obtained in Stage 1.
Stage 3 uses an interactive visual analytics framework for decision support in selecting one parameter combination from the alternatives identified in Stage 2. HAMS is applied to calibrate the flow parameters of a SWAT (Soil and Water Assessment Tool) model designed to simulate flow in the Cannonsville watershed in upstate New York. Results from this application indicate that efficient multi-objective optimization and interactive visual and metric-based analytics can bridge the gap between the effective use of automatic and manual strategies for parameter estimation of computationally expensive watershed models.
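The Stage 1/Stage 2 flow described above can be sketched in miniature: from a set of candidate calibrations scored on two error metrics, keep the nondominated (Pareto) ones, then greedily pick a small, mutually distant subset. This is a hypothetical illustration of the general idea, not the authors' GOMORS or HAMS code.

```python
# Toy sketch of Pareto filtering plus diverse-subset selection (minimization).

def nondominated(points):
    """Return the points not dominated by any other point."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

def diverse_subset(points, k):
    """Greedy max-min selection of k mutually distant points."""
    chosen = [min(points)]  # seed with the lexicographically smallest point
    while len(chosen) < k and len(chosen) < len(points):
        # pick the candidate farthest from its nearest already-chosen point
        best = max(
            (p for p in points if p not in chosen),
            key=lambda p: min(
                sum((a - b) ** 2 for a, b in zip(p, c)) for c in chosen
            ),
        )
        chosen.append(best)
    return chosen
```

Here each tuple is a calibration scored on two conflicting error metrics; `diverse_subset` mimics Stage 2's goal of handing the expert a handful of meaningfully different alternatives.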

  14. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate science and astrophysics, scientific data are often n-dimensional and require tools that support specialized data types and primitives if they are to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities, and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats; and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, and (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (e.g. max, min, avg), array concatenation, algebraic expressions, and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. 
Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements, and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases, concerning analytics workflows for sea situational awareness, fire danger prevention, and climate change and biodiversity, will be discussed in detail.
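As a toy illustration of two of the cube primitives the abstract lists (sub-setting along a dimension and aggregation over it), assuming nothing about Ophidia's actual API, consider a dense 3-D cube stored as nested lists indexed `[time][lat][lon]`:

```python
# Hypothetical miniature versions of two data-cube primitives:
# slicing along the time axis, and averaging over it.

def subset(cube, t0, t1):
    """Slice the cube along the time axis (half-open interval [t0, t1))."""
    return cube[t0:t1]

def aggregate_time(cube):
    """Average over time, returning a 2-D [lat][lon] field."""
    nt = len(cube)
    nlat, nlon = len(cube[0]), len(cube[0][0])
    return [
        [sum(cube[t][i][j] for t in range(nt)) / nt for j in range(nlon)]
        for i in range(nlat)
    ]
```

Real frameworks execute such operators in parallel over distributed chunks; the point here is only the shape of the operations a workflow chains together.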

  15. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework.

    PubMed

    Teeguarden, Justin G; Tan, Yu-Mei; Edwards, Stephen W; Leonard, Jeremy A; Anderson, Kim A; Corley, Richard A; Kile, Molly L; Simonich, Staci M; Stone, David; Tanguay, Robert L; Waters, Katrina M; Harper, Stacey L; Williams, David E

    2016-05-03

    Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the "systems approaches" used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making.

  16. Analytic Frameworks for Assessing Dialogic Argumentation in Online Learning Environments

    ERIC Educational Resources Information Center

    Clark, Douglas B; Sampson, Victor; Weinberger, Armin; Erkens, Gijsbert

    2007-01-01

    Over the last decade, researchers have developed sophisticated online learning environments to support students engaging in dialogic argumentation. This review examines five categories of analytic frameworks for measuring participant interactions within these environments focusing on (1) formal argumentation structure, (2) conceptual quality, (3)…

  17. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.

    PubMed

    Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-11-18

    Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. Our objective was to design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios, utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method through the Artemis project, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrate the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds, and its patients can generally be classified into 5 types, including surgical and premature babies. The arrival rate is estimated at 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which yields 311 live algorithms for the whole NICU running on the framework. 
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated at 8.6 TB per year. There will always be 34.9 patients in the SickKids NICU on average. Currently, 46% of patients cannot be admitted to the SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients could be accommodated. For such provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support in tertiary care. We demonstrate how to size the equipment needed in the cloud for that architecture based on a realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show in principle how this can be performed and, furthermore, that it can be replicated for any critical care setting within a tertiary institution.
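The occupancy and rejection figures quoted above are broadly consistent with a classic Erlang-loss view of the unit, where the offered load is arrival rate times mean stay and blocked arrivals are lost. The sketch below is our own illustration of that style of capacity reasoning, not part of the Artemis-IC framework:

```python
# Hedged illustration: sizing a fixed-capacity unit with the Erlang-B model.

def erlang_b(servers, offered_load):
    """Blocking probability via the standard Erlang-B recursion."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

arrival_rate = 4.5   # patients/day (figure from the abstract)
mean_stay = 16.0     # days (figure from the abstract)
beds = 36
a = arrival_rate * mean_stay          # offered load in Erlangs (72)
blocking = erlang_b(beds, a)          # fraction of arrivals turned away
occupancy = a * (1.0 - blocking)      # mean number of occupied beds
```

With these inputs the model yields roughly half of arrivals blocked and about 35 occupied beds on average, in the same range as the abstract's reported 46% rejection and 34.9-patient average census.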

  18. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework

    PubMed Central

    McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-01-01

    Background: Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. Objective: To design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. Methods: We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios, utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method through the Artemis project, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). Results: We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids’ NICU has 36 beds, and its patients can generally be classified into 5 types, including surgical and premature babies. The arrival rate is estimated at 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which yields 311 live algorithms for the whole NICU running on the framework. 
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated at 8.6 TB per year. There will always be 34.9 patients in the SickKids NICU on average. Currently, 46% of patients cannot be admitted to the SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients could be accommodated. For such provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Conclusions: Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support in tertiary care. We demonstrate how to size the equipment needed in the cloud for that architecture based on a realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show in principle how this can be performed and, furthermore, that it can be replicated for any critical care setting within a tertiary institution. PMID:26582268

  19. Estimation of the limit of detection using information theory measures.

    PubMed

    Fonollosa, Jordi; Vergara, Alexander; Huerta, Ramón; Marco, Santiago

    2014-01-31

    Definitions of the limit of detection (LOD) based on the probability of false positive and/or false negative errors have been proposed over recent years. Although such definitions are straightforward and valid for any kind of analytical system, the proposed methodologies to estimate the LOD are usually restricted to signals with Gaussian noise. Additionally, there is a general misconception that two systems with the same LOD provide the same amount of information on the source, regardless of the prior probability of presenting a blank or analyte sample. Based upon an analogy between an analytical system and a binary communication channel, in this paper we show that the amount of information that can be extracted from an analytical system depends on the probability of presenting the two different possible states. We propose a new definition of LOD, built on information theory tools, that deals with noise of any kind and allows prior knowledge to be introduced easily. Unlike most traditional LOD estimation approaches, the proposed definition is based on the amount of information that the chemical instrumentation system provides about the chemical information source. Our findings indicate that benchmarking analytical systems by their ability to provide information about the presence or absence of the analyte (our proposed approach) is a more general and proper framework, while converging to the usual values when dealing with Gaussian noise.
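The binary-channel analogy in this abstract can be made concrete: treat blank/analyte as the channel input and detected/not-detected as the output, and score the system by the mutual information between them. A minimal sketch (our own construction, with hypothetical error rates, not the authors' estimator):

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def channel_information(prior, fpr, fnr):
    """Mutual information (bits) between sample state and detector output,
    modelling the analytical system as a binary asymmetric channel.

    prior: P(analyte present); fpr/fnr: false positive/negative rates.
    """
    p_detect = prior * (1 - fnr) + (1 - prior) * fpr   # P(output = detected)
    # I(X;Y) = H(Y) - H(Y|X)
    return h(p_detect) - prior * h(fnr) - (1 - prior) * h(fpr)
```

A perfect detector with equiprobable states carries 1 bit per sample; a coin-flip detector carries 0, regardless of its nominal LOD, which is the abstract's central point.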

  20. Beyond Compartmentalization: A Relational Approach towards Agency and Vulnerability of Young Migrants

    ERIC Educational Resources Information Center

    Huijsmans, Roy

    2012-01-01

    Based on fieldwork material from Lao People's Democratic Republic, this paper introduces an analytical framework that transcends compartmentalized approaches towards migration involving young people. The notions of fluid and institutionalized forms of migration illuminate key differences and commonalities in the relational fabric underpinning…

  1. Analysis of the Image of Scientists Portrayed in the Lebanese National Science Textbooks

    NASA Astrophysics Data System (ADS)

    Yacoubian, Hagop A.; Al-Khatib, Layan; Mardirossian, Taline

    2017-07-01

    This article presents an analysis of how scientists are portrayed in the Lebanese national science textbooks. The purpose of this study was twofold: first, to develop a comprehensive analytical framework that can serve as a tool to analyze the image of scientists portrayed in educational resources; second, to analyze the image of scientists portrayed in the Lebanese national science textbooks used in Basic Education. An analytical framework, based on an extensive review of the relevant literature, was constructed and served as the tool for analyzing the textbooks. Grounded in documented stereotypes, the framework focused on the individual and work-related characteristics of scientists. Fifteen science textbooks were analyzed using both quantitative and qualitative measures. Our analysis of the textbooks showed the presence of a number of stereotypical images. The scientists are predominantly white males of European descent. Non-Western scientists, including Lebanese and/or Arab scientists, are mostly absent from the textbooks. In addition, the scientists are portrayed as rational individuals who work alone, conduct experiments in their labs by following the scientific method, and operate within Eurocentric paradigms. External factors do not influence their work. They are engaged in an enterprise that is objective, that aims at discovering the truth out there, and that involves dealing with direct evidence. Implications for science education are discussed.

  2. Barriers to implementation of risk management for federal wildland fire management agencies in the United States

    Treesearch

    Dave Calkin; Matthew P. Thompson; Alan A. Ager; Mark Finney

    2010-01-01

    In this presentation we review progress towards the implementation of a risk-based management framework for U.S. Federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical...

  3. Real-time video analysis for retail stores

    NASA Astrophysics Data System (ADS)

    Hassan, Ehtesham; Maurya, Avinash K.

    2015-03-01

    With advances in video processing technologies, we can capture subtle human responses in a retail store environment that play a decisive role in store management. In this paper, we present a novel surveillance-video-based analytics system for retail stores targeting localized and global traffic estimates. Developing an intelligent system for human traffic estimation in real-life settings poses a challenging problem because of the variation and noise involved. In this direction, we begin with a novel human tracking system built on an intelligent combination of motion-based and image-level object detection. We demonstrate an initial evaluation of this approach on an available standard dataset, yielding promising results. Exact traffic estimation in a retail store requires correct separation of customers from service providers. We present a role-based human classification framework using a Gaussian mixture model for this task. A novel feature descriptor named graded colour histogram is defined for object representation. Using our role-based human classification and tracking system, we define a novel, computationally efficient framework for generating two types of analytics: region-specific people counts and dwell-time estimation. The system has been extensively evaluated and tested on four hours of real-life video captured from a retail store.

  4. Reasoning across Ontologically Distinct Levels: Students' Understandings of Molecular Genetics

    ERIC Educational Resources Information Center

    Duncan, Ravit Golan; Reiser, Brian J.

    2007-01-01

    In this article we apply a novel analytical framework to explore students' difficulties in understanding molecular genetics--a domain that is particularly challenging to learn. Our analytical framework posits that reasoning in molecular genetics entails mapping across ontologically distinct levels--an information level containing the genetic…

  5. The Illness Narratives of Health Managers: Developing an Analytical Framework

    ERIC Educational Resources Information Center

    Exworthy, Mark

    2011-01-01

    This paper examines the personal experience of illness and healthcare by health managers through their illness narratives. By synthesising a wider literature of illness narratives and health management, an analytical framework is presented, which considers the impact of illness narratives, comprising the logic of illness narratives, the actors…

  6. Incorporating equity considerations in transport infrastructure evaluation: Current practice and a proposed methodology.

    PubMed

    Thomopoulos, N; Grant-Muller, S; Tight, M R

    2009-11-01

    Interest has re-emerged in how to incorporate equity considerations into the appraisal of transport projects, and of large road infrastructure projects in particular. This paper offers a way forward in addressing some of the theoretical and practical concerns that have made it difficult to date to incorporate equity concerns into the appraisal of such projects. Initially, an overview of current European practice in appraising equity considerations in transport is offered, based on an extensive literature review. Acknowledging the value of a framework approach, research towards introducing a theoretical framework is then presented. The proposed framework is based on the well-established multi-criteria analysis (MCA) Analytic Hierarchy Process and is contrasted with a CBA-based approach. The framework outlined here offers an additional support tool to decision makers, who will be able to differentiate choices based on their views on specific equity principles and equity types. It also holds the potential to become a valuable tool for evaluators, thanks to the option of assessing the predefined equity perspectives of decision makers against both the project objectives and the estimated project impacts. This framework may also be of further value to evaluators outside transport.

  7. The Effectiveness of Digital Game-Based Vocabulary Learning: A Framework-Based View of Meta-Analysis

    ERIC Educational Resources Information Center

    Chen, Meng-Hua; Tseng, Wen-Ta; Hsiao, Tsung-Yuan

    2018-01-01

    This study presents the results of a meta-analytic study about the effects of digital game-based learning (DGBL) on vocabulary. The results of the study showed that the effects of DGBL on vocabulary learning may vary with game design features (Q = 5.857, df = 1, p = 0.016), but not with learners' age (Q = 0.906, df = 1, p = 0.341) or linguistic…

  8. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP-based methodology. The coastal risk vulnerability mapping considers multiple causative factors: sea-level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology, and distance to urban areas. The FAHP methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high, and very high levels of vulnerability to coastal risk. The high-vulnerability areas are situated in the east at Monika and Sablette beaches. This technical approach relies on the efficiency of GIS tools combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
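For readers unfamiliar with how AHP-style criteria weights are computed, here is a crisp (non-fuzzy) sketch using the row geometric-mean method; a fuzzy AHP carries triangular fuzzy numbers through analogous steps before defuzzifying. The pairwise judgments below are hypothetical, not values from the paper:

```python
from math import prod

def ahp_weights(pairwise):
    """Criterion weights from a reciprocal pairwise-comparison matrix
    using the row geometric-mean method."""
    n = len(pairwise)
    gmeans = [prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gmeans)
    return [g / total for g in gmeans]                      # normalize to sum 1
```

For a two-criterion matrix where the first criterion is judged twice as important as the second (`[[1, 2], [1/2, 1]]`), the method returns weights of 2/3 and 1/3.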

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yao; Balaprakash, Prasanna; Meng, Jiayuan

    We present Raexplore, a performance modeling framework for architecture exploration. Raexplore enables rapid, automated, and systematic search of the architecture design space by combining hardware-counter-based performance characterization and analytical performance modeling. We demonstrate Raexplore for two recent manycore processors, the IBM Blue Gene/Q compute chip and the Intel Xeon Phi, targeting a set of scientific applications. Our framework is able to capture complex interactions between architectural components including the instruction pipeline, cache, and memory, and achieves a 3–22% error for same-architecture and cross-architecture performance predictions. Furthermore, we apply our framework to assess the two processors, and discover and evaluate a list of architectural scaling options for future processor designs.

  10. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and lack ready-to-use analytical expressions analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and the antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions, called VICx, PDMx, and TOPMODELx, are also extended with a spatial description of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For given total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fractions of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths (cumulative values for the storm duration) and the average unit-area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
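The SCS-CN runoff curve that the abstract generalizes has a compact closed form. A minimal sketch in US customary units, with the standard initial-abstraction ratio λ = 0.2 (this is the textbook curve, not the authors' extended expressions):

```python
def scs_cn_runoff(p_in, cn, lam=0.2):
    """Event runoff depth Q (inches) from the SCS-CN curve:
    Q = (P - Ia)^2 / (P - Ia + S), with S = 1000/CN - 10 and Ia = lam * S;
    Q = 0 when rainfall P does not exceed the initial abstraction Ia."""
    s = 1000.0 / cn - 10.0   # potential maximum retention (inches)
    ia = lam * s             # initial abstraction (inches)
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)
```

For example, a fully impervious surface (CN = 100) converts all rainfall to runoff, while CN = 80 with 3 inches of rain yields 1.25 inches of runoff.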

  11. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selecting appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise that might not always be readily available. In this paper, we propose using semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  12. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selecting appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise that might not always be readily available. In this paper, we propose using semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  13. The Aggregate Exposure Pathway (AEP): A conceptual framework for advancing exposure science research and transforming risk assessment

    EPA Science Inventory

    Recent advances in analytical methods, biomarker discovery, cell-based assay development, computational tools, sensor/monitor, and omics technology have enabled new streams of exposure and toxicity data to be generated at higher volumes and speed. These new data offer the opport...

  14. Institutionalized Mutuality in Canada-China Management Education Collaboration

    ERIC Educational Resources Information Center

    Wei, Shuguang; Liu, Xianjun

    2015-01-01

    This paper examines the Canada-China Management Education Program (CCMEP, 1983-1996) between the University of Toronto (UT) and Huazhong University of Science and Technology (HUST). In this paper, we create a "Three Levels/Four Parameters" analytical framework, based on the concept of mutuality from Johan Galtung (1980) and the concept…

  15. An uncertainty analysis of wildfire modeling [Chapter 13

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  16. Vocational Training in Germany: Modernisation and Responsiveness.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This volume presents a report on recent developments and current policy objectives in vocational education and training in Germany. The study is based on a conceptual and analytical framework jointly elaborated by the Organisation for Economic Cooperation and Development Secretariat and representatives of member countries. Part I is "The…

  17. Cash on Demand: A Framework for Managing a Cash Liquidity Position.

    ERIC Educational Resources Information Center

    Augustine, John H.

    1995-01-01

    A well-run college or university will seek to accumulate and maintain an appropriate cash reserve or liquidity position. A rigorous analytic process for estimating the size and cost of a liquidity position, based on judgments about the institution's operating risks and opportunities, is outlined. (MSE)

  18. Vocational Training in the Netherlands: Reform and Innovation.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This volume presents a report on recent developments and current policy objectives in vocational education and training in the Netherlands. The study is based on a conceptual and analytical framework jointly elaborated by the Organisation for Economic Cooperation and Development Secretariat and representatives of member countries. Organized in two…

  19. Assessing Proposals for New Global Health Treaties: An Analytic Framework.

    PubMed

    Hoffman, Steven J; Røttingen, John-Arne; Frenk, Julio

    2015-08-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties.

  20. Assessing Proposals for New Global Health Treaties: An Analytic Framework

    PubMed Central

    Røttingen, John-Arne; Frenk, Julio

    2015-01-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties. PMID:26066926

  1. Analytic continuation of quantum Monte Carlo data by stochastic analytical inference.

    PubMed

    Fuchs, Sebastian; Pruschke, Thomas; Jarrell, Mark

    2010-05-01

    We present an algorithm for the analytic continuation of imaginary-time quantum Monte Carlo data which is strictly based on principles of Bayesian statistical inference. Within this framework we are able to obtain an explicit expression for the calculation of a weighted average over possible energy spectra, which can be evaluated by standard Monte Carlo simulations, yielding as a by-product the distribution function of the regularization parameter. Our algorithm thus avoids the usual ad hoc assumptions introduced in similar algorithms to fix the regularization parameter. We apply the algorithm to imaginary-time quantum Monte Carlo data and compare the resulting energy spectra with those from a standard maximum-entropy calculation.
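
The paper's key move, averaging candidate spectra over the regularization parameter instead of fixing it ad hoc, can be illustrated with a toy numerical sketch. All values below (the log-posterior and the candidate spectra) are made-up stand-ins, not the paper's algorithm or data:

```python
import numpy as np

# Unnormalized log posterior over three candidate values of the
# regularization parameter alpha (illustrative numbers).
log_post = np.array([-2.0, -0.5, -3.0])
spectra = np.array([        # one candidate spectrum per alpha value
    [0.2, 0.5, 0.3],
    [0.1, 0.7, 0.2],
    [0.3, 0.4, 0.3],
])

# Normalize the posterior over alpha (softmax, shifted for stability)...
w = np.exp(log_post - log_post.max())
w /= w.sum()

# ...and form the posterior-weighted average spectrum, rather than
# committing to the single spectrum of one "best" alpha.
avg_spectrum = w @ spectra
print(w.round(3), avg_spectrum.round(3))
```

Because each candidate spectrum is normalized and the weights sum to one, the averaged spectrum remains normalized.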

  2. Methods for integrating moderation and mediation: a general analytical framework using moderated path analysis.

    PubMed

    Edwards, Jeffrey R; Lambert, Lisa Schurer

    2007-03-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
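
The framework's central quantity, the conditional indirect effect of X on Y through M at a given moderator value z, i.e. (a1 + a3·z)·b1, can be sketched on simulated data. All variable names, coefficients, and the simulation itself are illustrative, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated first-stage moderated mediation: Z moderates the X -> M
# path, and M carries the effect on to Y (illustrative coefficients).
X = rng.normal(size=n)
Z = rng.normal(size=n)
M = 0.5 * X + 0.3 * Z + 0.4 * X * Z + rng.normal(size=n)
Y = 0.6 * M + rng.normal(size=n)

def ols(y, cols):
    """Least-squares coefficients, intercept first."""
    D = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(D, y, rcond=None)[0]

a = ols(M, [X, Z, X * Z])   # a[1]: X path, a[3]: X*Z interaction
b = ols(Y, [M, X])          # b[1]: M -> Y path, controlling for X

# Conditional indirect effect at Z one SD below and above the mean.
effects = {z: (a[1] + a[3] * z) * b[1] for z in (-1.0, 1.0)}
print({z: round(v, 2) for z, v in effects.items()})
```

With these simulated coefficients the indirect effect is markedly stronger at high Z than at low Z, which is exactly the pattern the moderated-path approach is designed to expose.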

  3. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  4. Gating Mechanisms of Mechanosensitive Channels of Large Conductance, I: A Continuum Mechanics-Based Hierarchical Framework

    PubMed Central

    Chen, Xi; Cui, Qiang; Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun

    2008-01-01

    A hierarchical simulation framework that integrates information from molecular dynamics (MD) simulations into a continuum model is established to study the mechanical response of mechanosensitive channel of large-conductance (MscL) using the finite element method (FEM). The proposed MD-decorated FEM (MDeFEM) approach is used to explore the detailed gating mechanisms of the MscL in Escherichia coli embedded in a palmitoyloleoylphosphatidylethanolamine lipid bilayer. In Part I of this study, the framework of MDeFEM is established. The transmembrane and cytoplasmic helices are taken to be elastic rods, the loops are modeled as springs, and the lipid bilayer is approximated by a three-layer sheet. The mechanical properties of the continuum components, as well as their interactions, are derived from molecular simulations based on atomic force fields. In addition, an analytical closed-form continuum model and an elastic network model are established to complement the MDeFEM approach and to capture the most essential features of gating. In Part II of this study, the detailed gating mechanisms of E. coli-MscL under various types of loading are presented and compared with experiments, structural model, and all-atom simulations, as well as the analytical models established in Part I. It is envisioned that such a hierarchical multiscale framework will find great value in the study of a variety of biological processes involving complex mechanical deformations such as muscle contraction and mechanotransduction. PMID:18390626

  5. A Network Method of Measuring Affiliation-based Peer Influence: Assessing the Influences of Teammates’ Smoking on Adolescent Smoking

    PubMed Central

    Fujimoto, Kayo; Unger, Jennifer B.; Valente, Thomas W.

    2011-01-01

    Using a network analytic framework, this study introduces a new method to measure peer influence based on adolescents’ affiliations or two-mode social network data. Exposure based on affiliations is referred to as the “affiliation exposure model.” This study demonstrates the methodology using data on young adolescent smoking being influenced by joint participation in school-based organized sports activities with smokers. The analytic sample consisted of 1260 American adolescents aged 10 to 13 in middle schools, and the results of the longitudinal regression analyses showed that adolescents were more likely to smoke as they were increasingly exposed to teammates who smoke. This study illustrates the importance of peer influence via affiliation through team sports. PMID:22313152
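
The affiliation exposure model described above can be sketched as a one-mode projection of the two-mode (person × team) affiliation matrix: a person's exposure is the co-membership-weighted average of their teammates' smoking. The tiny matrix below is illustrative, not the study's data:

```python
import numpy as np

# Two-mode affiliation matrix: rows = adolescents, columns = teams
# (1 = member). Illustrative data only.
A = np.array([
    [1, 0],   # person 0 is on team 0
    [1, 1],   # person 1 is on both teams
    [0, 1],   # person 2 is on team 1
])
smokes = np.array([1.0, 0.0, 1.0])  # smoking status of each person

# Co-affiliation counts (one-mode projection); zero the diagonal so a
# person is not counted as "exposed" to their own behavior.
W = A @ A.T
np.fill_diagonal(W, 0)

# Affiliation exposure: co-membership-weighted average of peers' smoking.
row_sums = W.sum(axis=1)
exposure = np.divide(W @ smokes, row_sums,
                     out=np.zeros_like(smokes), where=row_sums > 0)
print(exposure)
```

Here person 1, who shares teams with two smokers, gets exposure 1.0, while the two smokers themselves (whose only co-member is a non-smoker) get exposure 0.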

  6. Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention

    ERIC Educational Resources Information Center

    West, Deborah; Heath, David; Huijser, Henk

    2016-01-01

    This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…

  7. Organizational Culture and Organizational Effectiveness: A Meta-Analytic Investigation of the Competing Values Framework's Theoretical Suppositions

    ERIC Educational Resources Information Center

    Hartnell, Chad A.; Ou, Amy Yi; Kinicki, Angelo

    2011-01-01

    We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial…

  8. Unintended Revelations in History Textbooks: The Precarious Authenticity and Historical Continuity of the Slovak Nation

    ERIC Educational Resources Information Center

    Šulíková, Jana

    2016-01-01

    Purpose: This article proposes an analytical framework that helps to identify and challenge misconceptions of ethnocentrism found in pre-tertiary teaching resources for history and the social sciences in numerous countries. Design: Drawing on nationalism studies, the analytical framework employs ideas known under the umbrella terms of…

  9. Analytical Framework for Identifying and Differentiating Recent Hitchhiking and Severe Bottleneck Effects from Multi-Locus DNA Sequence Data

    DOE PAGES

    Sargsyan, Ori

    2012-05-25

    Hitchhiking and severe bottleneck effects have an impact on the dynamics of genetic diversity of a population by inducing homogenization at a single locus and at the genome-wide scale, respectively. As a result, identification and differentiation of the signatures of such events from DNA sequence data at a single locus is challenging. This study develops an analytical framework for identifying and differentiating recent homogenization events at multiple neutral loci in low recombination regions. The dynamics of genetic diversity at a locus after a recent homogenization event is modeled according to the infinite-sites mutation model and the Wright-Fisher model of reproduction with constant population size. In this setting, I derive analytical expressions for the distribution, mean, and variance of the number of polymorphic sites in a random sample of DNA sequences from a locus affected by a recent homogenization event. Based on this framework, three likelihood-ratio-based tests are presented for identifying and differentiating recent homogenization events at multiple loci. Lastly, I apply the framework to two data sets. First, I consider human DNA sequences from four non-coding loci on different chromosomes for inferring evolutionary history of modern human populations. The results suggest, in particular, that recent homogenization events at the loci are identifiable when the effective human population size is 50000 or greater in contrast to 10000, and the estimates of the recent homogenization events agree with the “Out of Africa” hypothesis. Second, I use HIV DNA sequences from HIV-1-infected patients to infer the times of HIV seroconversions. The estimates are contrasted with other estimates derived as the mid-time point between the last HIV-negative and first HIV-positive screening tests. Finally, the results show that significant discrepancies can exist between the estimates.
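
The paper derives the post-homogenization distribution of polymorphic sites; the equilibrium point of reference under the infinite-sites model is the classic Watterson expectation, which a recent homogenization event depresses. A minimal sketch of that baseline (theta and n here are illustrative values, not the paper's estimates):

```python
# Equilibrium baseline under the infinite-sites model: the expected
# number of segregating (polymorphic) sites in a sample of n sequences
# is E[S] = theta * sum_{i=1}^{n-1} 1/i (Watterson's result). A recent
# homogenization event pushes the observed S below this expectation.
def expected_segregating_sites(theta: float, n: int) -> float:
    return theta * sum(1.0 / i for i in range(1, n))

print(expected_segregating_sites(2.0, 10))
```

Comparing an observed site count against this expectation is the intuition behind the likelihood-ratio tests; the paper's actual tests use the full post-homogenization distribution rather than the equilibrium mean.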

  10. Network Community Detection based on the Physarum-inspired Computational Framework.

    PubMed

    Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili

    2016-12-13

    Community detection is a crucial and essential problem in the structure analytics of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from the optimization-based algorithms to the heuristic-based algorithms, have been proposed for solving such a problem. Due to the inherent complexity of identifying network structure, how to design an effective algorithm with a higher accuracy and a lower computational cost still remains an open problem. Inspired by the computational capability and positive feedback mechanism in the wake of foraging process of Physarum, which is a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, the inter-community edges can be identified from the intra-community edges in a network and the positive feedback of solving process in an algorithm can be further enhanced, which are used to improve the efficiency of original optimization-based and heuristic-based community detection algorithms, respectively. Some typical algorithms (e.g., genetic algorithm, ant colony optimization algorithm, and Markov clustering algorithm) and real-world datasets have been used to estimate the efficiency of our proposed computational framework. Experiments show that the algorithms optimized by Physarum-inspired computational framework perform better than the original ones, in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.
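
The quality of a detected partition is commonly scored with Newman's modularity, the objective that many of the optimization-based algorithms mentioned above maximize. A minimal sketch on a toy two-community graph (the graph and partition are illustrative, not from the paper):

```python
import numpy as np

# Toy graph: two triangles joined by one inter-community edge.
A = np.zeros((6, 6))
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1

labels = np.array([0, 0, 0, 1, 1, 1])  # candidate community assignment

# Newman's modularity:
# Q = (1/2m) * sum_ij (A_ij - k_i * k_j / 2m) * delta(c_i, c_j)
k = A.sum(axis=1)                  # degrees
m2 = A.sum()                       # 2m (each edge counted twice)
same = labels[:, None] == labels[None, :]
Q = ((A - np.outer(k, k) / m2) * same).sum() / m2
print(round(Q, 3))
```

The single edge (2, 3) is the inter-community edge a Physarum-style scheme would try to identify; deleting it from the delta term is what keeps Q well above zero for this partition.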

  11. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (the general one and the aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
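
The AHP step described above turns pairwise judgments about framework criteria into priority weights, with a consistency check on the judgments. A minimal sketch using the geometric-mean method (the criteria and the comparison matrix are hypothetical, not the paper's data):

```python
import numpy as np

# Pairwise comparisons of three framework criteria on Saaty's 1-9 scale
# (hypothetical judgments: criterion 1 moderately beats 2, strongly beats 3).
P = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
])

# AHP priority weights via the geometric-mean (row) method.
g = P.prod(axis=1) ** (1.0 / len(P))
weights = g / g.sum()

# Consistency ratio CR = CI / RI, with Saaty's random index RI(3) = 0.58;
# CR below 0.1 is conventionally taken as acceptably consistent.
lam = (P @ weights / weights).mean()
ci = (lam - len(P)) / (len(P) - 1)
cr = ci / 0.58
print(weights.round(3), round(cr, 3))
```

The resulting weights can then feed the QFD stage, where they scale the relationship scores between requirements and candidate framework solutions.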

  12. Manage Toward Success - Utilization of Analytics in Acquisition Decision Making

    DTIC Science & Technology

    2015-04-01

    on the concept of knowledge-based acquisition described by the GAO. In the GAO (2005) report for National Aeronautics and Space Administration (NASA)...acquisition programs, GAO recommended to NASA, and NASA subsequently concurred, that transition to a knowledge-based acquisition framework will...Certification and Accreditation Process; ERAM = Enterprise Risk Assessment Manager; EVMS = Earned Value Management System; GOV = Government; POA&M = Plan of

  13. A new analytical framework of 'continuum of prevention and care' to maximize HIV case detection and retention in care in Vietnam

    PubMed Central

    2012-01-01

    Background: The global initiative ‘Treatment 2.0’ calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However limited systematic assessment has been conducted in countries with concentrated HIV epidemic. We aimed to assess HIV service availability and service connectedness in Vietnam. Methods: We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability including geographical distribution and decentralization and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents including existing service delivery data. Results: Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and the links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects, and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services.
    Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Conclusions: Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerate the transition from project-based approach to integrated service delivery in line with the ‘Treatment 2.0’ initiative. PMID:23272730

  14. On the pursuit of a nuclear development capability: The case of the Cuban nuclear program

    NASA Astrophysics Data System (ADS)

    Benjamin-Alvarado, Jonathan Calvert

    1998-09-01

    While there have been many excellent descriptive accounts of modernization schemes in developing states, energy development studies based on prevalent modernization theory have been rare. Moreover, heretofore there have been very few analyses of efforts to develop a nuclear energy capability by developing states. Rarely have these analyses employed social science research methodologies. The purpose of this study was to develop a general analytical framework, based on such a methodology to analyze nuclear energy development and to utilize this framework for the study of the specific case of Cuba's decision to develop nuclear energy. The analytical framework developed focuses on a qualitative tracing of the process of Cuban policy objectives and implementation to develop a nuclear energy capability, and analyzes the policy in response to three models of modernization offered to explain the trajectory of policy development. These different approaches are the politically motivated modernization model, the economic and technological modernization model and the economic and energy security model. Each model provides distinct and functionally differentiated expectations for the path of development toward this objective. Each model provides expected behaviors to external stimuli that would result in specific policy responses. In the study, Cuba's nuclear policy responses to stimuli from domestic constraints and intensities, institutional development, and external influences are analyzed. The analysis revealed that in pursuing the nuclear energy capability, Cuba primarily responded by filtering most of the stimuli through the twin objectives of economic rationality and technological advancement. 
Based upon the Cuban policy responses to the domestic and international stimuli, the study concluded that the economic and technological modernization model of nuclear energy development offered a more complete explanation of the trajectory of policy development than either the politically-motivated or economic and energy security models. The findings of this case pose some interesting questions for the general study of energy programs in developing states. By applying the analytical framework employed in this study to a number of other cases, perhaps the understanding of energy development schemes may be expanded through future research.

  15. A new analytical framework of 'continuum of prevention and care' to maximize HIV case detection and retention in care in Vietnam.

    PubMed

    Fujita, Masami; Poudel, Krishna C; Do, Thi Nhan; Bui, Duc Duong; Nguyen, Van Kinh; Green, Kimberly; Nguyen, Thi Minh Thu; Kato, Masaya; Jacka, David; Cao, Thi Thanh Thuy; Nguyen, Thanh Long; Jimba, Masamine

    2012-12-29

    The global initiative 'Treatment 2.0' calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However limited systematic assessment has been conducted in countries with concentrated HIV epidemic. We aimed to assess HIV service availability and service connectedness in Vietnam. We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability including geographical distribution and decentralization and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents including existing service delivery data. Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and the links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects, and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services. 
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerate the transition from project-based approach to integrated service delivery in line with the 'Treatment 2.0' initiative.

  16. The sexual erotic market as an analytical framework for understanding erotic-affective exchanges in interracial sexually intimate and affective relationships.

    PubMed

    Vigoya, Mara Viveros

    2015-01-01

    This paper examines the way in which erotic-affective exchanges in interracial relationships have been analysed in Latin America. It considers how race, gender and class operate within a market of values such that erotic, affective and economic status are shaped by racial, gender and class hierarchies. In this paper I analyse historical and social arrangements that embody the region's political economy of race and sex. Such a perspective allows me to address the simultaneous co-existence of socio-racial exclusion and inclusion and the repressive and productive effects of power, attraction and anxiety as aspects of lived experiences in relation to sexuality. From there, I outline an analytical framework that references an erotic or pleasure-based market in which capital and other resources are exchanged from a structural perspective stressing relationship alliances. I conclude by identifying the scope and limits of such an approach.

  17. Analytical Expressions for Thermo-Osmotic Permeability of Clays

    NASA Astrophysics Data System (ADS)

    Gonçalvès, J.; Ji Yu, C.; Matray, J.-M.; Tremosa, J.

    2018-01-01

    In this study, a new formulation for the thermo-osmotic permeability of natural pore solutions containing monovalent and divalent cations is proposed. The mathematical formulation proposed here is based on the theoretical framework supporting thermo-osmosis which relies on water structure alteration in the pore space of surface-charged materials caused by solid-fluid electrochemical interactions. The ionic content balancing the surface charge of clay minerals causes a disruption in the hydrogen bond network when more structured water is present at the clay surface. Analytical expressions based on our heuristic model are proposed and compared to the available data for NaCl solutions. It is shown that the introduction of divalent cations reduces the thermo-osmotic permeability by one third compared to the monovalent case. The analytical expressions provided here can be used to advantage for safety calculations in deep underground nuclear waste repositories.

  18. Expanding Students' Analytical Frameworks through the Study of Graphic Novels

    ERIC Educational Resources Information Center

    Connors, Sean P.

    2015-01-01

    When teachers work with students to construct a metalanguage that they can draw on to describe and analyze graphic novels, and then invite students to apply that metalanguage in the service of composing multimodal texts of their own, teachers broaden students' analytical frameworks. In the process of doing so, teachers empower students. In this…

  19. PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving

    ERIC Educational Resources Information Center

    OECD Publishing, 2017

    2017-01-01

    What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…

  20. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    PubMed

    Rachid, G; El Fadel, M

    2013-08-15

    This paper presents a SWOT analysis of SEA systems in the Middle East North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA system through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. Copyright © 2013 Elsevier Ltd. All rights reserved.
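
The MADM step described above can be sketched with one of the simplest multi-attribute schemes, simple additive weighting: score each country on predefined criteria, weight the criteria, and rank by weighted sum. The scores, weights, and country labels below are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical performance matrix: rows = countries, columns = criteria
# (legal framework, institutional capacity, procedural integration).
scores = np.array([
    [0.8, 0.6, 0.4],   # country A
    [0.5, 0.7, 0.6],   # country B
    [0.3, 0.4, 0.2],   # country C
])
weights = np.array([0.5, 0.3, 0.2])  # criterion weights, summing to 1

# Simple additive weighting: weighted sum per country, ranked best-first.
totals = scores @ weights
ranking = np.argsort(-totals)
print(totals.round(2), ranking)
```

More elaborate MADM variants (e.g. normalizing criteria measured on different scales before weighting) follow the same matrix-times-weights pattern.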

  1. Pattern search in multi-structure data: a framework for the next-generation evidence-based medicine

    NASA Astrophysics Data System (ADS)

    Sukumar, Sreenivas R.; Ainsworth, Keela C.

    2014-03-01

    With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. Addressing this need, we pose and answer the following questions: (i) How can we jointly analyze and explore measurement data in context with qualitative domain knowledge? (ii) How can we search and hypothesize patterns (not known a priori) from such multi-structure data? (iii) How can we build predictive models by integrating weakly-associated multi-relational multi-structure data? We propose a framework towards answering these questions. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.

  2. Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.

    PubMed

    Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie

    2017-12-01

    Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.

  3. Interoperable Data Sharing for Diverse Scientific Disciplines

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  4. Completing the link between exposure science and toxicology for improved environmental health decision making: The aggregate exposure pathway framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu -Mei; Edwards, Stephen W.

    Here, driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making.

  5. Completing the link between exposure science and toxicology for improved environmental health decision making: The aggregate exposure pathway framework

    DOE PAGES

    Teeguarden, Justin G.; Tan, Yu -Mei; Edwards, Stephen W.; ...

    2016-01-13

    Here, driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making.

  6. Employability as a New Mission? Organizational Changes in Chinese Vocational Colleges

    ERIC Educational Resources Information Center

    Yang, Po; Lin, Xiao Ying

    2014-01-01

    The purpose of this study is to analyze the recent development of Chinese vocational colleges from two perspectives: the adoption of employability as a new institutional mission and organizational changes in six areas. The analysis is based on a multiple-case study. The analytical frameworks are developed from sociological theory and…

  7. Renaming Teaching Practice through Teacher Reflection Using Critical Incidents on a Virtual Training Course

    ERIC Educational Resources Information Center

    Badia, Antoni; Becerril, Lorena

    2016-01-01

    This study approaches teacher learning from a dialogical viewpoint where lecturers' voices used in a training course context reflect how lecturers generated new professional discourse. The design of the training course considered the analysis of several critical incidents (CIs) in online teaching. An analytical framework based on lecturers'…

  8. Experts workshop on the ecotoxicological risk assessment of ionizable organic chemicals: Towards a science-based framework for chemical assessment

    EPA Science Inventory

    There is a growing need to develop analytical methods and tools that can be applied to assess the environmental risks associated with charged, polar, and ionisable organic chemicals, such as those used as active pharmaceutical ingredients, biocides, and surface active chemicals. ...

  9. Assessing Vocal Performances Using Analytical Assessment: A Case Study

    ERIC Educational Resources Information Center

    Gynnild, Vidar

    2016-01-01

    This study investigated ways to improve the appraisal of vocal performances within a national academy of music. Since a criterion-based assessment framework had already been adopted, the conceptual foundation of an assessment rubric was used as a guide in an action research project. The group of teachers involved wanted to explore thinking…

  10. Does Our Diversity Talk Match Our Walk? Aligning Institutional Goals and Practice

    ERIC Educational Resources Information Center

    Horn, Catherine L.; Marin, Patricia

    2017-01-01

    To help admissions professionals achieve their diversity objectives, a strategy should be adopted that encourages universities to be more broadly self-reflective across a range of interrelated admissions policies and practices. Educational scrutiny is a data-analytic framework for campus-based understanding of the extent to which internal policies…

  11. Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.

    PubMed

    Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J

    2017-12-01

    Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.
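
    The contrast between the two estimation frameworks can be made concrete with a small numerical sketch. The study effects and variances below are invented for illustration; the functions implement the textbook inverse-variance fixed-effect pool and the DerSimonian-Laird moment estimator commonly used for random effects.

```python
# Hypothetical illustration: inverse-variance (fixed-effect) pooling vs. the
# DerSimonian-Laird random-effects estimator the authors critique.
import math

def fixed_effect(effects, variances):
    """Inverse-variance weighted pooled estimate and its variance."""
    w = [1.0 / v for v in variances]
    est = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    return est, 1.0 / sum(w)

def dersimonian_laird(effects, variances):
    """Random-effects pooling with the DL moment estimate of tau^2."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fe, _ = fixed_effect(effects, variances)
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    est = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return est, 1.0 / sum(w_star)

# Made-up log-odds-ratios and within-study variances for five studies.
y = [0.1, 0.5, -0.2, 0.4, 0.9]
v = [0.04, 0.02, 0.09, 0.03, 0.12]
fe, fe_var = fixed_effect(y, v)
re, re_var = dersimonian_laird(y, v)
print(f"fixed effect: {fe:.3f} (SE {math.sqrt(fe_var):.3f})")
print(f"random effects: {re:.3f} (SE {math.sqrt(re_var):.3f})")
```

    With these heterogeneous made-up inputs the two methods give different pooled estimates, and the random-effects standard error is wider; that trade-off in error estimation is the heart of the authors' argument.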

  12. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach.
    This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
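
    As a minimal illustration of the "simple map-reduce methods" mentioned above, the toy below runs the map, shuffle, and reduce phases of a word count in-process; a real deployment would of course use Hadoop or Twister rather than this sketch.

```python
# Minimal in-process sketch of the map-reduce pattern (word count).
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit (key, 1) pairs, as a Hadoop mapper would.
    return [(word.lower(), 1) for word in record.split()]

def shuffle(pairs):
    # Group values by key, as the framework's shuffle stage does.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Combine each key's values into a final result.
    return {key: sum(values) for key, values in groups.items()}

records = ["big data analytics", "big graph analytics"]
counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, records))))
print(counts)  # {'big': 2, 'data': 1, 'analytics': 2, 'graph': 1}
```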

  13. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics

    PubMed Central

    2017-01-01

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use-case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473
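
    A minimal sketch of the kind of rule-based volume-anomaly inference described above (not the authors' implementation): flag a traffic sample when it deviates from a rolling baseline by more than a configurable number of standard deviations. The window size and threshold stand in for the operational parameters the framework lets an administrator customize.

```python
# Toy rule-based anomaly inference over monitored traffic volumes.
import statistics

def flag_anomalies(volumes, window=5, z_threshold=3.0):
    """Return indices whose volume deviates by more than z_threshold
    standard deviations from the mean of the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(volumes)):
        baseline = volumes[i - window:i]
        mu = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        if sigma > 0 and abs(volumes[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Steady traffic with one suspicious burst at index 8.
traffic = [100, 102, 98, 101, 99, 103, 100, 97, 450, 101]
print(flag_anomalies(traffic))  # [8]
```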

  14. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.

    PubMed

    Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier

    2017-10-21

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use-case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration.

  15. Knowledge Distribution and Power Relations in HIV-Related Education and Prevention for Gay Men: An Application of Bernstein to Australian Community-Based Pedagogical Devices

    ERIC Educational Resources Information Center

    McInnes, David; Murphy, Dean

    2011-01-01

    This paper seeks to make a theoretical and analytic intervention into the field of HIV-related education and prevention by applying the pedagogy framework of Basil Bernstein to a series of pedagogical devices developed and used in community-based programmes targeting gay men in Australia. The paper begins by outlining why it is such an…

  16. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
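
    The flavor of such probabilistic evaluation can be sketched as follows, under the simplifying (and here assumed) model that each on-line check detects a given fault independently with a known probability; this is a caricature of the matrix-based approach, not the paper's model.

```python
# Illustrative only: system-level detection probability when check i catches
# a given fault with probability p_i and checks act independently -- the
# fault escapes only if every check misses it.
def detection_probability(check_probs):
    """P(at least one check catches the fault)."""
    miss = 1.0
    for p in check_probs:
        miss *= (1.0 - p)
    return 1.0 - miss

# Three concurrent error-detection checks with hypothetical coverages.
print(round(detection_probability([0.9, 0.7, 0.5]), 3))  # 0.985
```

    Note how modest individual coverages combine into a high system-level figure, which is why probabilistic estimates can differ sharply from worst-case deterministic ones.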

  17. Towards an Analytical Framework for Understanding the Development of a Quality Assurance System in an International Joint Programme

    ERIC Educational Resources Information Center

    Zheng, Gaoming; Cai, Yuzhuo; Ma, Shaozhuang

    2017-01-01

    This paper intends to construct an analytical framework for understanding quality assurance in international joint programmes and to test it in a case analysis of a European--Chinese joint doctoral degree programme. The development of a quality assurance system for an international joint programme is understood as an institutionalization process…

  18. Grade 8 students' capability of analytical thinking and attitude toward science through teaching and learning about soil and its pollution based on a science technology and society (STS) approach

    NASA Astrophysics Data System (ADS)

    Boonprasert, Lapisarin; Tupsai, Jiraporn; Yuenyong, Chokchai

    2018-01-01

    This study reported Grade 8 students' analytical thinking and attitude toward science in teaching and learning about soil and its pollution through the science technology and society (STS) approach. The participants were 36 Grade 8 students in Naklang, Nongbualumphu, Thailand. The teaching and learning about soil and its pollution through the STS approach was carried out for 6 weeks. The soil and its pollution unit was developed based on the framework of Yuenyong (2006), which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision-making, and (5) socialization. Students' analytical thinking and attitude toward science were assessed during their learning through participant observation, an analytical thinking test, students' tasks, and journal writing. The findings revealed that students could develop their capability of analytical thinking. They demonstrated characteristics of analytical thinking such as classifying, comparing and contrasting, reasoning, interpreting, collecting data, and decision making. Students' journal writing reflected that the STS class on soil and its pollution motivated them. The paper discusses implications for science teaching and learning through STS in Thailand.

  19. Understanding public perceptions of biotechnology through the "Integrative Worldview Framework".

    PubMed

    De Witt, Annick; Osseweijer, Patricia; Pierce, Robin

    2015-07-03

    Biotechnological innovations prompt a range of societal responses that demand understanding. Research has shown such responses are shaped by individuals' cultural worldviews. We aim to demonstrate how the Integrative Worldview Framework (IWF) can be used for analyzing perceptions of biotechnology, by reviewing (1) research on public perceptions of biotechnology and (2) analyses of the stakeholder debate on the bio-based economy, using the IWF as an analytical lens. This framework operationalizes the concept of worldview and distinguishes between traditional, modern, and postmodern worldviews, among others. Applied to these literatures, the framework illuminates how these worldviews underlie major societal responses, thereby providing a unifying understanding of the literature on perceptions of biotechnology. We conclude that the IWF has relevance for informing research on perceptions of socio-technical changes, generating insight into the paradigmatic gaps in social science, and facilitating reflexive and inclusive policy-making and debates on these timely issues. © The Author(s) 2015.

  20. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges

    PubMed Central

    Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin

    2016-01-01

    Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). 
    The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516

  1. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges.

    PubMed

    Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin

    2016-01-01

    The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.

  2. Quantitative diagnosis and prognosis framework for concrete degradation due to alkali-silica reaction

    NASA Astrophysics Data System (ADS)

    Mahadevan, Sankaran; Neal, Kyle; Nath, Paromita; Bao, Yanqing; Cai, Guowei; Orme, Peter; Adams, Douglas; Agarwal, Vivek

    2017-02-01

    This research is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in nuclear power plants that are subjected to physical, chemical, environment, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. The current work focuses on degradation caused by ASR (alkali-silica reaction). Controlled concrete specimens with reactive aggregate are prepared to develop accelerated ASR degradation. Different monitoring techniques — infrared thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM) — are studied for ASR diagnosis of the specimens. Both DIC and mechanical measurements record the specimen deformation caused by ASR gel expansion. Thermography is used to compare the thermal response of pristine and damaged concrete specimens and generate a 2-D map of the damage (i.e., ASR gel and cracked area), thus facilitating localization and quantification of damage. NIRAS and VAM are two separate vibration-based techniques that detect nonlinear changes in dynamic properties caused by the damage. The diagnosis results from multiple techniques are then fused using a Bayesian network, which also helps to quantify the uncertainty in the diagnosis. Prognosis of ASR degradation is then performed based on the current state of degradation obtained from diagnosis, by using a coupled thermo-hydro-mechanical-chemical (THMC) model for ASR degradation. This comprehensive approach of monitoring, data analytics, and uncertainty-quantified diagnosis and prognosis will facilitate the development of a quantitative, risk informed framework that will support continuous assessment and risk management of structural health and performance.
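
    As a toy stand-in for the Bayesian fusion step described above, the sketch below combines two imperfect binary damage indicators via Bayes' rule; the sensitivities, false-alarm rates, and prior are made-up numbers, not values from the study.

```python
# Toy Bayesian fusion of independent binary damage indicators.
def posterior_damage(prior, readings, sensitivities, false_alarm_rates):
    """P(damage | independent binary readings) via Bayes' rule."""
    like_damage, like_healthy = 1.0, 1.0
    for r, sens, fa in zip(readings, sensitivities, false_alarm_rates):
        # Likelihood of this reading under each hypothesis.
        like_damage *= sens if r else (1.0 - sens)
        like_healthy *= fa if r else (1.0 - fa)
    num = like_damage * prior
    return num / (num + like_healthy * (1.0 - prior))

# DIC and thermography both report damage; prior belief in ASR damage is 20%.
p = posterior_damage(prior=0.2,
                     readings=[True, True],
                     sensitivities=[0.9, 0.8],
                     false_alarm_rates=[0.1, 0.15])
print(round(p, 3))  # 0.923
```

    Two concordant positives sharply raise the posterior, while a disagreeing indicator pulls it back down, which is the sense in which fusion also quantifies diagnostic uncertainty.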

  3. Analytical learning and term-rewriting systems

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.
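
    The notion of a term-rewriting system underlying this family of languages can be made concrete with a toy: terms as nested tuples, rules as pattern/action pairs, and normalization as repeated rewriting until no rule applies. This is only an illustrative sketch, not the AL-2 algorithm.

```python
# Minimal term-rewriting sketch: rewrite the leftmost-outermost redex until
# no rule applies (normal form), illustrated with Peano addition.
def rewrite_once(term, rules):
    """Apply the first matching rule at the root, else recurse into children."""
    for pattern, action in rules:
        if pattern(term):
            return action(term), True
    if isinstance(term, tuple):
        head, *args = term
        for i, arg in enumerate(args):
            new, changed = rewrite_once(arg, rules)
            if changed:
                return (head, *args[:i], new, *args[i + 1:]), True
    return term, False

def normalize(term, rules, max_steps=100):
    for _ in range(max_steps):
        term, changed = rewrite_once(term, rules)
        if not changed:
            return term
    raise RuntimeError("no normal form within step limit")

# Peano addition: add(0, y) -> y ; add(s(x), y) -> s(add(x, y)).
rules = [
    (lambda t: isinstance(t, tuple) and t[0] == "add" and t[1] == "0",
     lambda t: t[2]),
    (lambda t: isinstance(t, tuple) and t[0] == "add"
         and isinstance(t[1], tuple) and t[1][0] == "s",
     lambda t: ("s", ("add", t[1][1], t[2]))),
]
two, one = ("s", ("s", "0")), ("s", "0")
print(normalize(("add", two, one), rules))  # ('s', ('s', ('s', '0')))
```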

  4. Integrated Analytic and Linearized Inverse Kinematics for Precise Full Body Interactions

    NASA Astrophysics Data System (ADS)

    Boulic, Ronan; Raunhardt, Daniel

    Despite the large success of games grounded in movement-based interactions, the current state of full body motion capture technologies still prevents the exploitation of precise interactions with complex environments. This paper focuses on ensuring a precise spatial correspondence between the user and the avatar. We build upon our past effort in human postural control with a Prioritized Inverse Kinematics framework. One of its key advantages is to ease the dynamic combination of postural and collision avoidance constraints. However, its reliance on a linearized approximation of the problem makes it vulnerable to the well-known full extension singularity of the limbs. In such a context the tracking performance is reduced and/or less believable intermediate postural solutions are produced. We address this issue by introducing a new type of analytic constraint that smoothly integrates within the prioritized Inverse Kinematics framework. The paper first recalls the background of full body 3D interactions and the advantages and drawbacks of the linearized IK solution. Then the Flexion-EXTension constraint (FLEXT for short) is introduced for the partial position control of limb-like articulated structures. Comparative results illustrate the interest of this new type of integrated analytical and linearized IK control.
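
    The full-extension singularity that motivates the FLEXT constraint is easy to see in a 2-link analytic IK sketch (law of cosines). The clamping margin below is our own illustrative workaround, not the paper's method: pulling an out-of-reach target just inside the workspace keeps the elbow away from the singular straight configuration.

```python
# Illustrative 2-link planar analytic IK with clamping near full extension.
import math

def two_link_ik(x, y, l1, l2, margin=1e-3):
    """Return (shoulder, elbow) angles reaching (x, y), elbow-down.
    Targets inside the inner workspace boundary are not handled here."""
    reach = math.hypot(x, y)
    # Clamp just inside full extension to avoid the elbow singularity.
    max_reach = (l1 + l2) * (1.0 - margin)
    if reach > max_reach:
        scale = max_reach / reach
        x, y, reach = x * scale, y * scale, max_reach
    cos_elbow = (reach**2 - l1**2 - l2**2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# A target beyond reach is pulled back to just inside the workspace.
s, e = two_link_ik(3.0, 0.0, 1.0, 1.0)
end_x = math.cos(s) + math.cos(s + e)  # forward-kinematics check
print(round(end_x, 3))  # 1.998
```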

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaurov, Alexander A., E-mail: kaurov@uchicago.edu

    The methods for studying the epoch of cosmic reionization vary from full radiative transfer simulations to purely analytical models. While numerical approaches are computationally expensive and are not suitable for generating many mock catalogs, analytical methods are based on assumptions and approximations. We explore the interconnection between both methods. First, we ask how the analytical framework of excursion set formalism can be used for statistical analysis of numerical simulations and visual representation of the morphology of ionization fronts. Second, we explore the methods of training the analytical model on a given numerical simulation. We present a new code which emerged from this study. Its main application is to match the analytical model with a numerical simulation. Then, it allows one to generate mock reionization catalogs with volumes exceeding the original simulation quickly and computationally inexpensively, meanwhile reproducing large-scale statistical properties. These mock catalogs are particularly useful for cosmic microwave background polarization and 21 cm experiments, where large volumes are required to simulate the observed signal.
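
    The excursion-set idea referenced above can be caricatured in one dimension: a site counts as ionized if its density, averaged over any smoothing scale up to some maximum, crosses a barrier. All numbers below are illustrative, not taken from the paper.

```python
# 1-D toy of the excursion-set picture: mark a site ionized if the density
# averaged over ANY of the given smoothing half-widths exceeds a barrier.
import random

def ionized_map(density, scales, barrier):
    n = len(density)
    flags = [False] * n
    for r in scales:
        for i in range(n):
            window = density[max(0, i - r): i + r + 1]
            if sum(window) / len(window) > barrier:
                flags[i] = True
    return flags

random.seed(1)
density = [random.gauss(0, 1) for _ in range(200)]  # toy Gaussian field
flags = ionized_map(density, scales=[1, 2, 4, 8], barrier=0.5)
print(f"ionized fraction: {sum(flags) / len(flags):.2f}")
```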

  6. Linking theory with qualitative research through study of stroke caregiving families.

    PubMed

    Pierce, Linda L; Steiner, Victoria; Cervantez Thompson, Teresa L; Friedemann, Marie-Luise

    2014-01-01

    This theoretical article outlines the deliberate process of applying a qualitative data analysis method rooted in Friedemann's Framework of Systemic Organization through the study of a web-based education and support intervention for stroke caregiving families. Directed by Friedemann's framework, the analytic method involved developing, refining, and using a coding rubric to explore interactive patterns between caregivers and care recipients from this 3-month feasibility study using this education and support intervention. Specifically, data were gathered from the intervention's web-based discussion component between caregivers and the nurse specialist, as well as from telephone caregiver interviews. A theoretical framework guided the process of developing and refining this coding rubric for the purpose of organizing data; but, more importantly, guided the investigators' thought processes, allowing them to extract rich information from the data set, as well as synthesize this information to generate a broad understanding of the caring situation. © 2013 Association of Rehabilitation Nurses.

  7. A Crowdsensing Based Analytical Framework for Perceptional Degradation of OTT Web Browsing.

    PubMed

    Li, Ke; Wang, Hai; Xu, Xiaolong; Du, Yu; Liu, Yuansheng; Ahmad, M Omair

    2018-05-15

    Service perception analysis is crucial for understanding both user experience and network quality, as well as for the maintenance and optimization of mobile networks. Given the rapid development of the mobile Internet and over-the-top (OTT) services, the conventional network-centric mode of network operation and maintenance is no longer effective. Therefore, developing an approach to evaluate and optimize users' service perception has become increasingly important. Meanwhile, the development of a new sensing paradigm, mobile crowdsensing (MCS), makes it possible to evaluate and analyze users' OTT service perception from the end-user's point of view rather than from the network side. In this paper, the key factors that impact users' end-to-end OTT web browsing service perception are analyzed by monitoring crowdsourced user perceptions. The intrinsic relationships among the key factors and the interactions between key quality indicators (KQIs) are evaluated from several perspectives. Moreover, an analytical framework of perceptional degradation and a detailed algorithm are proposed, whose goal is to identify the major factors that impact the perceptional degradation of the web browsing service, as well as the significance of their contributions. Finally, a case study is presented that shows the effectiveness of the proposed method using a dataset crowdsensed from a large number of smartphone users in a real mobile network. The proposed analytical framework forms a valuable solution for mobile network maintenance and optimization and can help improve web browsing service perception and network quality.

  8. GraphReduce: Large-Scale Graph Analytics on Accelerator-Based HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Dipanjan; Agarwal, Kapil; Song, Shuaiwen

    2015-09-30

    Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device’s internal memory capacity. GraphReduce adopts a combination of both edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs, with efficient graph data movement between the host and the device.
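
    The Gather-Apply-Scatter (GAS) model that GraphReduce builds on can be illustrated with a minimal, single-threaded sketch (hypothetical Python, not GraphReduce's GPU API): each superstep gathers contributions along in-edges, applies an update at every vertex, and scatters the new values for the next round. The PageRank-style update rule and the toy graph are invented for illustration.

```python
# Minimal single-machine sketch of the Gather-Apply-Scatter (GAS) model.
# GraphReduce runs this pattern on GPUs over out-of-core partitions; this
# toy version only illustrates the three phases on an in-memory graph.

def gas_step(edges, values, num_vertices, damping=0.85):
    """One GAS superstep of a PageRank-like computation.

    edges: list of (src, dst) pairs; values: current vertex values.
    """
    out_degree = [0] * num_vertices
    for src, _ in edges:
        out_degree[src] += 1

    # Gather: each vertex accumulates contributions from its in-edges.
    gathered = [0.0] * num_vertices
    for src, dst in edges:
        gathered[dst] += values[src] / out_degree[src]

    # Apply: combine the gathered sum with the damping term.
    new_values = [(1 - damping) / num_vertices + damping * g for g in gathered]

    # Scatter: a real engine pushes updates along out-edges; here the
    # returned list simply feeds the next superstep.
    return new_values

edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
values = [1 / 3] * 3
for _ in range(20):
    values = gas_step(edges, values, 3)
print(round(sum(values), 6))  # total probability mass stays at 1.0
```

In a GPU engine the gather and apply loops become massively parallel kernels, and the edge list is streamed in partitions that fit device memory, which is the out-of-core aspect the abstract describes.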

  9. Planning and Evaluation of New Academic Library Services by Means of Web-Based Conjoint Analysis

    ERIC Educational Resources Information Center

    Decker, Reinhold; Hermelbracht, Antonia

    2006-01-01

    New product development is an omnipresent challenge to modern libraries in the information age. Therefore, we present the design and selected results of a comprehensive research project aiming at the systematic and user-oriented planning of academic library services by means of conjoint analysis. The applicability of the analytical framework used…

  10. Earth Systems Science: An Analytic Framework

    ERIC Educational Resources Information Center

    Finley, Fred N.; Nam, Younkeyong; Oughton, John

    2011-01-01

    Earth Systems Science (ESS) is emerging rapidly as a discipline and is being used to replace the older earth science education that has been taught as unrelated disciplines--geology, meteorology, astronomy, and oceanography. ESS is complex and is based on the idea that the earth can be understood as a set of interacting natural and social systems.…

  11. Forestry implications of agricultural short-rotation woody crops in the USA

    Treesearch

    Peter J. Ince; Alexander N. Moiseyev

    2002-01-01

    The purpose of this chapter is to discuss forestry implications of SRWC based on an economic analysis. As with the development of paper recycling, anticipating forestry implications of agricultural SRWC will depend in part on anticipating market conditions and economic impacts of technological developments. This chapter presents an analytic framework and market outlook...

  12. Developing a University Contribution to Teacher Education: Creating an Analytical Space for Learning Narratives

    ERIC Educational Resources Information Center

    Hanley, Chris; Brown, Tony

    2017-01-01

    What might a distinct university contribution to teacher education look like? This paper tracks a group of prospective teachers making the transition from undergraduate to teacher on a one-year school-based postgraduate course. The study employs a practitioner research methodological framework where teacher learning is understood as a process of…

  13. Learning on the Job: A Cultural Historical Activity Theory Approach to Initial Teacher Education across Four Secondary School Subject Departments

    ERIC Educational Resources Information Center

    Douglas, Alaster Scott

    2011-01-01

    This article considers how one may integrate ethnographic data generation with research questions and an analytic framework that are strongly theoretically informed by Cultural Historical Activity Theory (CHAT). Generating data through participant observation of school-based, student teacher education activity and interviewing all those involved…

  14. Reconceptualizing Educational Productivity for New South Wales Public Schools: An Empirical Application of Modified Quadriform Analytics

    ERIC Educational Resources Information Center

    Rolle, R. Anthony

    2016-01-01

    Little is known about the educational productivity of public schooling organizations when examined outside of market-based, cost-minimization frameworks. The purpose of this research was to extend the literature that supports the appropriateness of measuring levels of the economic efficiency of public schools via an alternative approach, utilizing…

  15. Towards a Context-Aware Proactive Decision Support Framework

    DTIC Science & Technology

    2013-11-15

    initiative that has developed text analytic technology that crosses the semantic gap into the area of event recognition and representation. The...recognizing operational context, and techniques for recognizing context shift. Additional research areas include:
    • Adequately capturing users...Universal Interaction Context Ontology [12] might serve as a foundation
    • Instantiating formal models of decision making based on information seeking

  16. Primary Class Size Reduction: How Policy Space, Physical Space, and Spatiality Shape What Happens in Real Schools

    ERIC Educational Resources Information Center

    Bascia, Nina; Faubert, Brenton

    2012-01-01

    This article reviews the literature base on class size reduction and proposes a new analytic framework that we believe provides practically useful explanations of how primary class size reduction works. It presents descriptions of classroom practice and grounded explanations for how class size reduction affects educational core activities by…

  17. The Motivational Effects of Success or Failure in Urban Elementary School Teaching

    ERIC Educational Resources Information Center

    Waterman, Bradford H.

    2012-01-01

    This study describes teachers' experiences of success and failure in teaching through interviews. The analytical framework for this study was based on Activity Theory (Leont'ev, 1978), and the research methods were developed by Herzberg et al. (1959). The inclusion of factors identified by Seligman (2006) and Maslach (1982) allowed for…

  18. Assessment of Children's Psychological Development and Data Analytic Framework in New York City Infant Day Care Study.

    ERIC Educational Resources Information Center

    Golden, Mark

    This report briefly describes the procedures for assessing children's psychological development and the data analytic framework used in the New York City Infant Day Care Study. This study is a 5-year, longitudinal investigation in which infants in group and family day care programs and infants reared at home are compared. Children in the study are…

  19. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, the common approach used in most moderation analyses is to add analytic interactions involving the predictor and moderator, in the form of cross-variable products, and test the significance of such terms. The narrow scope of this procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in the interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
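
    The conceptual point above can be made concrete with a toy numeric example (hypothetical data, not the authors' proposed procedure): a moderator changes the slope of the outcome on the predictor, which shows up directly when the slope is estimated within each moderator subgroup rather than only through a product term.

```python
# Toy illustration of moderation: a binary moderator m changes the slope
# of y on x. Estimating the slope within each subgroup exposes the
# modified relationship directly. Data are invented for illustration.

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

xs = list(range(10))
# Hypothetical subgroups: slope 1.0 when m = 0, slope 3.0 when m = 1.
y_m0 = [1.0 * x + 2 for x in xs]
y_m1 = [3.0 * x + 2 for x in xs]

print(slope(xs, y_m0))  # 1.0
print(slope(xs, y_m1))  # 3.0
```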

  20. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  1. Evidence-Based and Value-Based Decision Making About Healthcare Design: An Economic Evaluation of the Safety and Quality Outcomes.

    PubMed

    Zadeh, Rana; Sadatsafavi, Hessam; Xue, Ryan

    2015-01-01

    This study describes a vision and framework that can facilitate the implementation of the evidence-based design (EBD) scientific knowledge base in the design, construction, and operation of healthcare facilities, and clarify the related safety and quality outcomes for stakeholders. The proposed framework pairs EBD with value-driven decision making and aims to improve communication among stakeholders by providing a common analytical language. Recent EBD research indicates that the design and operation of healthcare facilities contribute to an organization's operational success by improving safety, quality, and efficiency. However, because little information is available about the financial returns of evidence-based investments, such investments are readily eliminated during the capital-investment decision-making process. To model the proposed framework, we used engineering economy tools to evaluate the return on investment in six successful cases, identified by a literature review, in which facility design and operation interventions resulted in reductions in hospital-acquired infections, patient falls, staff injuries, and patient anxiety. In the evidence-based cases, calculated net present values, internal rates of return, and payback periods indicated that the long-term benefits of the interventions substantially outweighed their costs. This article explains a framework for developing a research-based and value-based communication language on specific interventions across the planning, design and construction, operation, and evaluation stages. Evidence-based and value-based design frameworks can be applied to communicate the life-cycle costs and savings of EBD interventions to stakeholders, thereby contributing to more informed decision making and the optimization of healthcare infrastructure. © The Author(s) 2015.
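
    The engineering-economy measures named above (net present value, payback period) can be sketched as follows. The cash-flow figures are invented for illustration and are not taken from the six reviewed cases.

```python
# Hedged sketch of two engineering-economy measures used in the study:
# NPV and simple payback for a hypothetical facility-design intervention.
# All cash flows are invented, not from the reviewed cases.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the time-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """Years until cumulative (undiscounted) cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None  # investment never recovered

# Hypothetical: a 100k outlay, 40k/year savings from avoided infections
# and falls over five years.
flows = [-100_000, 40_000, 40_000, 40_000, 40_000, 40_000]
print(round(npv(0.05, flows)))   # positive at a 5% discount rate
print(payback_period(flows))     # 3 (years)
```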

  2. The Identification of Factors Affecting the Development and Practice of School-Based Counseling in Different National Contexts: A Grounded Theory Study Using a Worldwide Sample of Descriptive Journal Articles and Book Chapters

    ERIC Educational Resources Information Center

    Martin, Ian; Lauterbach, Alexandra; Carey, John

    2015-01-01

    A grounded theory methodology was used to analyze articles and book chapters describing the development and practice of school-based counseling in 25 different countries in order to identify the factors that affect development and practice. An 11-factor analytic framework was developed. Factors include: Cultural Factors, National Needs, Larger…

  3. Developing weighted criteria to evaluate lean reverse logistics through analytical network process

    NASA Astrophysics Data System (ADS)

    Zagloel, Teuku Yuri M.; Hakim, Inaki Maulida; Krisnawardhani, Rike Adyartie

    2017-11-01

    Reverse logistics is the part of the supply chain that brings materials from consumers back to the manufacturer in order to gain added value or ensure proper disposal. Nowadays, most companies still face several problems in reverse logistics implementation, which lead to high waste along reverse logistics processes. To overcome this problem, Madsen [Framework for Reverse Lean Logistics to Enable Green Manufacturing, Eco Design 2009: 6th International Symposium on Environmentally Conscious Design and Inverse Manufacturing, Sapporo, 2009] developed a lean reverse logistics framework as a step toward eliminating waste by implementing lean on reverse logistics. However, the resulting framework sets aside the criteria used to evaluate its performance. This research aims to determine weighted criteria that can be used as a basis for reverse logistics evaluation by considering lean principles. The resulting criteria will ensure reverse logistics are kept free of waste and thus implemented efficiently. The Analytical Network Process (ANP) is used in this research to determine the weighted criteria. The result shows that the criteria used for evaluating lean reverse logistics are Innovation and Learning (35%), Economic (30%), Process Flow Management (14%), Customer Relationship Management (13%), Environment (6%), and Social (2%).
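
    Once ANP has produced the criterion weights above, evaluating a candidate reverse-logistics configuration reduces to a weighted sum of criterion scores. A minimal sketch (the per-criterion scores are hypothetical; the weights are those reported in the abstract):

```python
# Weighted evaluation using the ANP criterion weights reported in the
# abstract. The per-criterion scores (0-100) are hypothetical examples.

WEIGHTS = {
    "Innovation and Learning": 0.35,
    "Economic": 0.30,
    "Process Flow Management": 0.14,
    "Customer Relationship Management": 0.13,
    "Environment": 0.06,
    "Social": 0.02,
}

def lean_rl_score(scores):
    """Weighted overall score for a reverse-logistics configuration."""
    return sum(WEIGHTS[c] * s for c, s in scores.items())

example = {
    "Innovation and Learning": 70,
    "Economic": 80,
    "Process Flow Management": 60,
    "Customer Relationship Management": 75,
    "Environment": 90,
    "Social": 85,
}
print(round(lean_rl_score(example), 2))  # 73.75
```

Because the weights sum to 1, the overall score stays on the same 0-100 scale as the criterion scores, making alternative configurations directly comparable.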

  4. Research Ethics Review: Identifying Public Policy and Program Gaps

    PubMed Central

    Strosberg, Martin A.; Gefenas, Eugenijus; Famenka, Andrei

    2014-01-01

    We present an analytical framework for use by fellows of the Fogarty International Center–sponsored Advanced Certificate Program in Research Ethics for Central and Eastern Europe to identify gaps in the public policies establishing research ethics review systems that impede them from doing their job of protecting human research subjects. The framework, illustrated by examples from post-Communist countries, employs a logic model based on the public policy and public management literature. This paper is part of a collection of papers analyzing the Fogarty International Center’s International Research Ethics Education and Curriculum program. PMID:24782068

  5. Analyte detection with Cu-BTC metal-organic framework thin films by means of mass-sensitive and work-function-based readout.

    PubMed

    Davydovskaya, Polina; Ranft, Annekatrin; Lotsch, Bettina V; Pohle, Roland

    2014-07-15

    Metal-organic frameworks (MOFs) constitute a new generation of porous crystalline materials, which have recently come into focus as analyte-specific active elements in thin-film sensor devices. Cu-BTC--also known as HKUST-1--is one of the most theoretically and experimentally investigated members of the MOF family. Its capability to selectively adsorb different gas molecules renders this material a promising candidate for applications in chemical gas and vapor sensing. Here, we explore details of the host-guest interactions between HKUST-1 and various analytes under different environmental conditions and study the vapor adsorption mechanism by mass-sensitive and work-function-based readouts. These complementary transduction mechanisms were successfully applied for the detection of low ppm (2 to 50 ppm) concentrations of different alcohols (methanol, ethanol, 1-propanol, and 2-propanol) adsorbed into Cu-BTC thin films. Evaluation of the results allows for the comparison of the amounts of adsorbed vapors and the contribution of each vapor to the changes of the electronic properties of Cu-BTC. The influence of the length of the alcohol chain (C1-C3) and geometry (1-propanol, 2-propanol) as well as their polarity on the sensing performance was investigated, revealing that in dry air, short chain alcohols are more likely adsorbed than long chain alcohols, whereas in humid air, this preference is changed, and the sensitivity toward alcohols is generally decreased. The adsorption mechanism is revealed to differ for dry and humid atmospheres, changing from a site-specific binding of alcohols to the open metal sites under dry conditions to weak physisorption of the analytes dissolved in surface-adsorbed water reservoirs in humid air, with the signal strength being governed by their relative concentration.

  6. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  7. EvoGraph: On-The-Fly Efficient Mining of Evolving Graphs on GPU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Dipanjan; Song, Shuaiwen

    With the prevalence of the World Wide Web and social networks, there has been a growing interest in high performance analytics for constantly-evolving dynamic graphs. Modern GPUs provide a massive amount of parallelism for efficient graph processing, but challenges remain due to their lack of support for the near real-time streaming nature of dynamic graphs. Specifically, due to the current high volume and velocity of graph data combined with the complexity of user queries, traditional processing methods of first storing the updates and then repeatedly running static graph analytics on a sequence of versions or snapshots are deemed undesirable and computationally infeasible on GPUs. We present EvoGraph, a highly efficient and scalable GPU-based dynamic graph analytics framework.

  8. EpiK: A Knowledge Base for Epidemiological Modeling and Analytics of Infectious Diseases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, S. M. Shamimul; Fox, Edward A.; Bisset, Keith

    Computational epidemiology seeks to develop computational methods to study the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Recent advances in computing and data sciences have led to the development of innovative modeling environments to support this important goal. The datasets used to drive the dynamic models, as well as the data produced by these models, present unique challenges owing to their size, heterogeneity and diversity. These datasets form the basis of effective and easy-to-use decision support and analytical environments. As a result, it is important to develop scalable data management systems to store, manage and integrate these datasets. In this paper, we develop EpiK—a knowledge base that facilitates the development of decision support and analytical environments to support epidemic science. An important goal is to develop a framework that links the input as well as output datasets to facilitate effective spatio-temporal and social reasoning that is critical in planning and intervention analysis before and during an epidemic. The data management framework links modeling workflow data and its metadata using a controlled vocabulary. The metadata captures information about storage, the mapping between the linked model and the physical layout, and relationships to support services. EpiK is designed to support agent-based modeling and analytics frameworks—aggregate models can be seen as special cases and are thus supported. We use semantic web technologies to create a representation of the datasets that encapsulates both the location and the schema heterogeneity. The choice of RDF as a representation language is motivated by the diversity and growth of the datasets that need to be integrated.
    A query bank is developed—the queries capture a broad range of questions that can be posed and answered during a typical case study pertaining to disease outbreaks. The queries are constructed using the SPARQL Protocol and RDF Query Language (SPARQL) over EpiK. EpiK can hide schema and location heterogeneity while efficiently supporting queries that span the computational epidemiology modeling pipeline, from model construction to simulation output. We show that the performance of benchmark queries varies significantly with respect to the choice of hardware underlying the database and the RDF engine.

  9. EpiK: A Knowledge Base for Epidemiological Modeling and Analytics of Infectious Diseases

    DOE PAGES

    Hasan, S. M. Shamimul; Fox, Edward A.; Bisset, Keith; ...

    2017-11-06

    Computational epidemiology seeks to develop computational methods to study the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Recent advances in computing and data sciences have led to the development of innovative modeling environments to support this important goal. The datasets used to drive the dynamic models, as well as the data produced by these models, present unique challenges owing to their size, heterogeneity and diversity. These datasets form the basis of effective and easy-to-use decision support and analytical environments. As a result, it is important to develop scalable data management systems to store, manage and integrate these datasets. In this paper, we develop EpiK—a knowledge base that facilitates the development of decision support and analytical environments to support epidemic science. An important goal is to develop a framework that links the input as well as output datasets to facilitate effective spatio-temporal and social reasoning that is critical in planning and intervention analysis before and during an epidemic. The data management framework links modeling workflow data and its metadata using a controlled vocabulary. The metadata captures information about storage, the mapping between the linked model and the physical layout, and relationships to support services. EpiK is designed to support agent-based modeling and analytics frameworks—aggregate models can be seen as special cases and are thus supported. We use semantic web technologies to create a representation of the datasets that encapsulates both the location and the schema heterogeneity. The choice of RDF as a representation language is motivated by the diversity and growth of the datasets that need to be integrated.
    A query bank is developed—the queries capture a broad range of questions that can be posed and answered during a typical case study pertaining to disease outbreaks. The queries are constructed using the SPARQL Protocol and RDF Query Language (SPARQL) over EpiK. EpiK can hide schema and location heterogeneity while efficiently supporting queries that span the computational epidemiology modeling pipeline, from model construction to simulation output. We show that the performance of benchmark queries varies significantly with respect to the choice of hardware underlying the database and the RDF engine.

  10. A tiered, integrated biological and chemical monitoring framework for contaminants of emerging concern in aquatic ecosystems.

    PubMed

    Maruya, Keith A; Dodder, Nathan G; Mehinto, Alvine C; Denslow, Nancy D; Schlenk, Daniel; Snyder, Shane A; Weisberg, Stephen B

    2016-07-01

    The chemical-specific risk-based paradigm that informs monitoring and assessment of environmental contaminants does not apply well to the many thousands of new chemicals that are being introduced into ambient receiving waters. We propose a tiered framework that incorporates bioanalytical screening tools and diagnostic nontargeted chemical analysis to more effectively monitor for contaminants of emerging concern (CECs). The framework is based on a comprehensive battery of in vitro bioassays to first screen for a broad spectrum of CECs and nontargeted analytical methods to identify bioactive contaminants missed by the currently favored targeted analyses. Water quality managers in California have embraced this strategy with plans to further develop and test this framework in regional and statewide pilot studies on waterbodies that receive discharge from municipal wastewater treatment plants and stormwater runoff. In addition to directly informing decisions, the data obtained using this framework can be used to construct and validate models that better predict CEC occurrence and toxicity. The adaptive interplay among screening results, diagnostic assessment and predictive modeling will allow managers to make decisions based on the most current and relevant information, instead of extrapolating from parameters with questionable linkage to CEC impacts. Integr Environ Assess Manag 2016;12:540-547. © 2015 SETAC.

  11. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.

    Driven by major scientific advances in analytical methods, biomonitoring, and computational exposure assessment, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the computationally enabled “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. The AEP framework offers an intuitive approach to successful organization of exposure science data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source-to-outcome continuum and setting the stage for more efficient integration of exposure science and toxicity testing information. Together these frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based or exposure-based decisions.

  12. An Analytical Solution for the Impact of Vegetation Changes on Hydrological Partitioning Within the Budyko Framework

    NASA Astrophysics Data System (ADS)

    Zhang, Shulei; Yang, Yuting; McVicar, Tim R.; Yang, Dawen

    2018-01-01

    Vegetation change is a critical factor that profoundly affects the terrestrial water cycle. Here we derive an analytical solution for the impact of vegetation changes on hydrological partitioning within the Budyko framework. This is achieved by deriving an analytical expression between leaf area index (LAI) change and change in the Budyko land surface parameter (n), through the combination of a steady state ecohydrological model with an analytical carbon cost-benefit model for plant rooting depth. Using China, where vegetation coverage has experienced dramatic changes over the past two decades, as a study case, we quantify the impact of LAI changes on hydrological partitioning during 1982-2010 and predict the future influence of these changes for the 21st century using climate model projections. Results show that LAI change exerts an increasing influence on hydrological partitioning as the climate becomes drier. In semiarid and arid China, increased LAI has led to substantial streamflow reductions over the past three decades (on average -8.5% in the 1990s and -11.7% in the 2000s compared to the 1980s baseline), and this decreasing trend in streamflow is projected to continue toward the end of this century due to predicted LAI increases. Our results call for caution regarding the large-scale revegetation activities currently being implemented in arid and semiarid China, which may result in serious future water scarcity issues there. The analytical model developed here is physically based and suitable for simultaneously assessing both vegetation-change and climate-change induced changes to streamflow globally.
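
    The abstract does not give the study's exact equations, but the Choudhury-Yang form of the Budyko curve, with land-surface parameter n, is the standard one in this literature, so the qualitative result can be sketched under that assumption: increasing n (denser vegetation, deeper roots) shifts partitioning from streamflow toward evaporation. All numbers below are illustrative.

```python
# Sketch of hydrological partitioning under the Choudhury-Yang form of
# the Budyko curve, E = P * E0 / (P**n + E0**n)**(1/n). This form and
# the parameter values are assumptions for illustration, not the
# paper's exact equations.

def evaporation(P, E0, n):
    """Mean annual evaporation from precipitation P and potential E0."""
    return P * E0 / (P ** n + E0 ** n) ** (1.0 / n)

def streamflow(P, E0, n):
    """Water-balance residual: streamflow Q = P - E."""
    return P - evaporation(P, E0, n)

# Hypothetical semiarid catchment (mm/yr): greening raises n, so a
# larger share of precipitation evaporates and streamflow declines.
P, E0 = 450.0, 1200.0
q_before = streamflow(P, E0, 1.8)
q_after = streamflow(P, E0, 2.1)  # n increased with LAI
print(round(q_before, 1), round(q_after, 1))
```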

  13. xQuake: A Modern Approach to Seismic Network Analytics

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.; Aikin, K. E.

    2017-12-01

    While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation describes the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices, as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, essentially a self-organizing graph database: an xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach to event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; this is not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g. using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable seismic community support in further developing its capabilities.

  14. Extracting Effective Higgs Couplings in the Golden Channel

    DOE PAGES

    Chen, Yi; Vega-Morales, Roberto

    2014-04-08

    Kinematic distributions in Higgs decays to four charged leptons, the so-called ‘golden channel’, are a powerful probe of the tensor structure of its couplings to neutral electroweak gauge bosons. In this study we construct the first part of a comprehensive analysis framework designed to maximize the information contained in this channel in order to perform direct extraction of the various possible Higgs couplings. We first complete an earlier analytic calculation of the leading order fully differential cross sections for the golden channel signal and background to include the 4e and 4μ final states with interference between identical final states. We also examine the relative fractions of the different possible combinations of scalar-tensor couplings by integrating the fully differential cross section over all kinematic variables, as well as show various doubly differential spectra for both the signal and background. From these analytic expressions we then construct a ‘generator level’ analysis framework based on the maximum likelihood method. We demonstrate the ability of our framework to perform multi-parameter extractions of all the possible effective couplings of a spin-0 scalar to pairs of neutral electroweak gauge bosons, including any correlations. Furthermore, this framework provides a powerful method for the study of these couplings and can be readily adapted to include the relevant detector and systematic effects, which we demonstrate in an accompanying study to follow.

  15. Modelling altered revenue function based on varying power consumption distribution and electricity tariff charge using data analytics framework

    NASA Astrophysics Data System (ADS)

    Zainudin, W. N. R. A.; Ramli, N. A.

    2017-09-01

    In 2010, the Energy Commission (EC) introduced Incentive Based Regulation (IBR) to ensure a sustainable Malaysian Electricity Supply Industry (MESI), promote transparent and fair returns, encourage maximum efficiency, and maintain a policy-driven end-user tariff. To cater for such a revolutionary transformation, a sophisticated system for generating a policy-driven electricity tariff structure is greatly needed. Hence, this study presents a data analytics framework that generates an altered revenue function based on a varying power consumption distribution and tariff charge function. For the purpose of this study, the power consumption distribution is proxied by the proportion of household consumption and electricity consumed in kWh, and the tariff charge function is proxied by a three-tiered increasing block tariff (IBT). The altered revenue function is useful for indicating whether changes in the power consumption distribution and tariff charges have a positive or negative impact on the economy. The methodology begins by defining revenue as a function of the power consumption distribution and the tariff charge function. The proportion of household consumption and the tariff charge function are then derived within certain intervals of electricity power. Any changes in those proportions are conjectured to contribute to changes in the revenue function; thus, these changes can potentially indicate whether changes in the power consumption distribution and tariff charge function have a positive or negative impact on TNB revenue. Based on the findings of this study, major changes in the tariff charge function seem to affect the altered revenue function more than the power consumption distribution does. However, the paper concludes that both the power consumption distribution and the tariff charge function can influence TNB revenue to a great extent.
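
    The revenue construction described above can be sketched directly: revenue is a function of the household consumption distribution and a three-tiered increasing block tariff. The tier boundaries, rates, and distribution below are hypothetical, not the actual tariff schedule.

```python
# Hedged sketch of the altered-revenue idea: revenue as a function of a
# household consumption distribution and a three-tiered increasing block
# tariff (IBT). Tier caps, rates, and the distribution are hypothetical.

TIERS = [(200, 0.218), (300, 0.334), (float("inf"), 0.516)]  # (kWh cap, rate)

def bill(kwh):
    """Charge for one household under the three-tier IBT."""
    charge, lower = 0.0, 0.0
    for cap, rate in TIERS:
        block = min(kwh, cap) - lower  # kWh billed in this tier
        if block <= 0:
            break
        charge += block * rate
        lower = cap
    return charge

def revenue(distribution):
    """Total revenue from a {kWh level: number of households} mapping."""
    return sum(n * bill(kwh) for kwh, n in distribution.items())

base = {150: 400, 350: 350, 700: 250}     # households at each usage level
shifted = {150: 300, 350: 400, 700: 300}  # consumption shifts upward
print(round(revenue(base), 2), round(revenue(shifted), 2))
```

Comparing the two totals shows how a shift in the consumption distribution alone, with the tariff function held fixed, alters revenue, which is the comparison the framework is built to make.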

  16. Ethics and Justice in Learning Analytics

    ERIC Educational Resources Information Center

    Johnson, Jeffrey Alan

    2017-01-01

    The many complex challenges posed by learning analytics can best be understood within a framework of structural justice, which focuses on the ways in which the informational, operational, and organizational structures of learning analytics influence students' capacities for self-development and self-determination. This places primary…

  17. Reading Multimodal Texts: Perceptual, Structural and Ideological Perspectives

    ERIC Educational Resources Information Center

    Serafini, Frank

    2010-01-01

    This article presents a tripartite framework for analyzing multimodal texts. The three analytical perspectives presented include: (1) perceptual, (2) structural, and (3) ideological analytical processes. Using Anthony Browne's picturebook "Piggybook" as an example, assertions are made regarding what each analytical perspective brings to the…

  18. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    PubMed Central

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-01-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of TOF scanners, with their reduced sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (DIRECT: Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias vs. variance performance to iterative TOF reconstruction with a matched resolution model. PMID:27032968

  19. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    NASA Astrophysics Data System (ADS)

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-05-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of time-of-flight (TOF) scanners, with their reduced sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (DIRECT: Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias versus variance performance to iterative TOF reconstruction with a matched resolution model.

  20. An analytic model for accurate spring constant calibration of rectangular atomic force microscope cantilevers.

    PubMed

    Li, Rui; Ye, Hongfei; Zhang, Weisheng; Ma, Guojun; Su, Yewang

    2015-10-29

    Spring constant calibration of the atomic force microscope (AFM) cantilever is of fundamental importance for quantifying the force between the AFM cantilever tip and the sample. Calibration within the framework of thin plate theory undoubtedly has higher accuracy and broader scope than calibration within the well-established beam theory. However, accurate analytic determination of the constant based on thin plate theory has been perceived as an extremely difficult problem. In this paper, we implement thin plate theory-based analytic modeling of the static behavior of rectangular AFM cantilevers, which reveals that the three-dimensional effect and the Poisson effect play important roles in the accurate determination of the spring constants. A quantitative scaling law is found: the normalized spring constant depends only on the Poisson's ratio, the normalized dimensions, and the normalized load coordinate. Both the literature and our refined finite element model validate the present results. The developed model is expected to serve as a benchmark for accurate calibration of rectangular AFM cantilevers.
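
    For context, the "well-established beam theory" baseline that this record's plate-theory model refines is the classical Euler-Bernoulli result for a rectangular cantilever. The sketch below shows only that textbook baseline with illustrative silicon dimensions; it does not reproduce the paper's plate-theory corrections.

```python
def beam_spring_constant(E, w, t, L):
    """Normal spring constant of a rectangular cantilever from
    Euler-Bernoulli beam theory: k = E * w * t**3 / (4 * L**3).
    Ignores the Poisson and three-dimensional effects that the
    plate-theory model in the paper accounts for."""
    return E * w * t**3 / (4 * L**3)

# Illustrative silicon cantilever (dimensions are assumptions)
E = 169e9                       # Young's modulus of Si, Pa
w, t, L = 30e-6, 2e-6, 450e-6   # width, thickness, length in metres
k = beam_spring_constant(E, w, t, L)  # on the order of 0.1 N/m
```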

  1. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    PubMed Central

    Teeguarden, Justin. G.; Tan, Yu-Mei; Edwards, Stephen W.; Leonard, Jeremy A.; Anderson, Kim A.; Corley, Richard A.; Harding, Anna K; Kile, Molly L.; Simonich, Staci M; Stone, David; Tanguay, Robert L.; Waters, Katrina M.; Harper, Stacey L.; Williams, David E.

    2016-01-01

    Driven by major scientific advances in analytical methods, biomonitoring, computational tools, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the Aggregate Exposure Pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the Adverse Outcome Pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source-to-outcome continuum for more efficient integration of exposure assessment and hazard identification. Together, the two pathways form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making. PMID:26759916

  2. A Working Framework for Enabling International Science Data System Interoperability

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  3. Thinking graphically: Connecting vision and cognition during graph comprehension.

    PubMed

    Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A

    2008-03-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved

  4. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  5. Perceptions of the Nature and 'Goodness' of Argument among College Students, Science Teachers, and Scientists

    NASA Astrophysics Data System (ADS)

    Abi-El-Mona, Issam; Abd-El-Khalick, Fouad

    2011-03-01

    This study aimed to elucidate college freshmen science students, secondary science teachers, and scientists' perceptions of 'scientific' argument; to compare participants' perceptions with Stephen Toulmin's analytical framework of argument; and to characterize the criteria that participants deployed when assessing the 'quality' or 'goodness' of arguments. Thirty students, teachers, and scientists-with 10 members in each group-participated in two semi-structured individual interviews. During the first interview, participants generated an argument in response to a socioscientific issue. In the second interview, each participant 'evaluated' three arguments generated by a member from each participant group without being privy to the arguer's group membership. Interview transcripts were qualitatively analyzed. The findings point to both similarities and differences between participants' conceptions of argument and those based on Toulmin's analytical framework. Participants used an array of common and idiosyncratic criteria to judge the quality or goodness of argument. Finally, contrary to expectations, participants independently agreed that the 'best' arguments were those generated by participant science teachers.

  6. The Impact of Preservice Teachers' Experiences in a Video-Enhanced Training Program on Their Teaching: A Case Study in Physical Education

    ERIC Educational Resources Information Center

    Gaudin, Cyrille; Chaliès, Sébastien; Amathieu, Jérôme

    2018-01-01

    This case study documents the influence of preservice teachers' experiences in a Video-Enhanced Training Program (VETP) on their teaching. The conceptual framework of this VETP comes from a research program in cultural anthropology based on Wittgenstein's analytical philosophy. Influence was identified during self-confrontation interviews with…

  7. Framing Teacher Preparation Research: An Overview of the Field, Part 1

    ERIC Educational Resources Information Center

    Cochran-Smith, Marilyn; Villegas, Ana Maria

    2015-01-01

    This is the first of a two-part article that aims to chart the contemporary landscape of research on teacher preparation and certification. It is based on a review of more than 1,500 studies published between 2000 and 2012. Part 1 provides information about how the review was conducted and describes the theoretical/analytic framework the authors…

  8. Preparing to Teach a Slavery Past: History Teachers and Educators as Navigators of Historical Distance

    ERIC Educational Resources Information Center

    Klein, Stephan

    2017-01-01

    Using an analytical framework based on the concept of historical distance, this article explores how Dutch history teachers and educators navigate between the past and the present when making curriculum decisions on the sensitive topic of the Transatlantic Slave Trade and Slavery. Four history teachers and 2 museum educators were selected on the…

  9. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  10. Using the AHP in a Workshop Setting to Elicit and Prioritize Fire Research Needs

    Treesearch

    Daniel L. Schmoldt; David L. Peterson

    1997-01-01

    The benefits of convening a group of knowledgeable specialists together in a workshop setting to tackle a difficult problem can often be offset by an over-abundance of unfocused and rambling discussion and by counterproductive group dynamics. In light of this workshop paradox, we have created a generic workshop framework based on the analytic hierarchy process, that...
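
    The analytic hierarchy process (AHP) named in this record reduces a pairwise-comparison matrix of criteria to a priority weight vector. A minimal sketch using the row geometric-mean approximation to the principal eigenvector (the matrix below is hypothetical, not from the workshop):

```python
import math

def ahp_priorities(M):
    """Priority weights from a pairwise-comparison matrix, via the
    row geometric-mean approximation to the principal eigenvector."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Illustrative 3-criteria matrix on Saaty's 1-9 scale:
# A is 3x as important as B and 5x as important as C; B is 2x C.
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_priorities(M)  # weights sum to 1; A receives the largest share
```

    In a workshop setting, each participant's comparison matrix yields such a weight vector, and the vectors are then aggregated to prioritize the elicited research needs.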

  11. Jointly optimizing selection of fuel treatments and siting of forest biomass-based energy production facilities for landscape-scale fire hazard reduction.

    Treesearch

    Peter J. Daugherty; Jeremy S. Fried

    2007-01-01

    Landscape-scale fuel treatments for forest fire hazard reduction potentially produce large quantities of material suitable for biomass energy production. The analytic framework FIA BioSum addresses this situation by developing detailed data on forest conditions and production under alternative fuel treatment prescriptions, and computes haul costs to alternative sites...

  12. We Make Spirit By Walking: An Application of Kovel's Spirituality to the Life and Work of Committed Environmentalists.

    ERIC Educational Resources Information Center

    McDonald, Barbara

    An analytical framework based on Joel Kovel's five meditations on spirit (spirit power, spirit being, spirit meaning, spirit and desire, and divine spirit) was used to explore the spirituality of committed environmental activists. A purposeful and snowball sampling method was used to interview 18 individuals with a strong commitment to…

  13. Occupational and Qualification Structures in the Field of Environmental Protection in the Metal and Chemical Industries in the United Kingdom.

    ERIC Educational Resources Information Center

    European Centre for the Development of Vocational Training, Berlin (Germany).

    A study analyzed the occupational structure and qualifications associated with the field of environmental protection in the metal and chemical industries in the United Kingdom. The analysis included nine case studies based on interviews with firms in the chemicals and metals sectors. Information was gathered within an analytical framework that…

  14. On the exactness of effective Floquet Hamiltonians employed in solid-state NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Garg, Rajat; Ramachandran, Ramesh

    2017-05-01

    Development of theoretical models based on analytic theory has remained an active pursuit in molecular spectroscopy for its utility both in the design of experiments as well as in the interpretation of spectroscopic data. In particular, the role of "Effective Hamiltonians" in the evolution of theoretical frameworks is well known across all forms of spectroscopy. Nevertheless, a constant revalidation of the approximations employed in the theoretical frameworks is necessitated by the constant improvements on the experimental front in addition to the complexity posed by the systems under study. Here in this article, we confine our discussion to the derivation of effective Floquet Hamiltonians based on the contact transformation procedure. While the importance of the effective Floquet Hamiltonians in the qualitative description of NMR experiments has been realized in simpler cases, its extension in quantifying spectral data deserves a cautious approach. With this objective, the validity of the approximations employed in the derivation of the effective Floquet Hamiltonians is re-examined through a comparison with exact numerical methods under differing experimental conditions. The limitations arising from the existing analytic methods are outlined along with remedial measures for improving the accuracy of the derived effective Floquet Hamiltonians.

  15. Uncertainty Quantification for Polynomial Systems via Bernstein Expansions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper presents a unifying framework for uncertainty quantification for systems having polynomial response metrics that depend on both aleatory and epistemic uncertainties. The proposed approach, which is based on the Bernstein expansions of polynomials, enables bounding the range of moments and failure probabilities of response metrics as well as finding supersets of the extreme epistemic realizations where the limits of such ranges occur. These bounds and supersets, whose analytical structure renders them free of approximation error, can be made arbitrarily tight with additional computational effort. Furthermore, this framework enables determining the importance of particular uncertain parameters according to the extent to which they affect the first two moments of response metrics and failure probabilities. This analysis enables determining the parameters that should be considered uncertain as well as those that can be assumed to be constants without incurring significant error. The analytical nature of the approach eliminates the numerical error that characterizes the sampling-based techniques commonly used to propagate aleatory uncertainties, as well as the possibility of underpredicting the range of the statistic of interest that may result from searching for the best- and worst-case epistemic values via nonlinear optimization or sampling.
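
    The core property the record relies on can be shown in a few lines: the Bernstein coefficients of a polynomial on [0, 1] enclose its range, with no sampling error. The conversion below from power-basis coefficients is the standard textbook one; the example polynomial is an illustration, not drawn from the paper.

```python
from math import comb

def bernstein_coeffs(a):
    """Bernstein coefficients on [0, 1] of p(x) = sum(a[k] * x**k).
    min(b) <= p(x) <= max(b) holds for all x in [0, 1]; the enclosure
    tightens under subdivision or degree elevation."""
    n = len(a) - 1
    return [sum(comb(i, k) / comb(n, k) * a[k] for k in range(i + 1))
            for i in range(n + 1)]

# p(x) = 1 - 3x + 2x^2; its true range on [0, 1] is [-1/8, 1]
b = bernstein_coeffs([1, -3, 2])
lo, hi = min(b), max(b)  # guaranteed enclosure: [-0.5, 1] contains [-1/8, 1]
```

    Note that the first and last coefficients equal the endpoint values p(0) and p(1), which is why the enclosure is exact at the interval ends.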

  16. Designing monitoring programs for chemicals of emerging concern in potable reuse--what to include and what not to include?

    PubMed

    Drewes, J E; Anderson, P; Denslow, N; Olivieri, A; Schlenk, D; Snyder, S A; Maruya, K A

    2013-01-01

    This study discusses a proposed process to prioritize chemicals for reclaimed water monitoring programs, the selection of analytical methods required for their quantification, the toxicological relevance of chemicals of emerging concern to human health, and related issues. Given that thousands of chemicals are potentially present in reclaimed water and that information about those chemicals is rapidly evolving, a transparent, science-based framework was developed to guide the prioritization of which compounds of emerging concern (CECs) should be included in reclaimed water monitoring programs. The recommended framework includes four steps: (1) compile environmental concentrations (e.g., the measured environmental concentration, or MEC) of CECs in the source water for reuse projects; (2) develop a monitoring trigger level (MTL) for each of these compounds (or groups thereof) based on toxicological relevance; (3) compare the environmental concentration (e.g., MEC) to the MTL; CECs with an MEC/MTL ratio greater than 1 should be prioritized for monitoring, and compounds with a ratio less than 1 should only be considered if they represent viable treatment process performance indicators; and (4) screen the priority list to ensure that a commercially available, robust analytical method exists for each compound.
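
    The four-step screening logic above reduces to a simple ratio filter. A minimal sketch, with hypothetical compound names and concentrations:

```python
# Steps (1)-(4) as a filter: monitor CECs whose measured environmental
# concentration (MEC) exceeds the monitoring trigger level (MTL) and for
# which a robust analytical method exists. All values are hypothetical.

def prioritize(cecs, has_method):
    """cecs: {name: (MEC, MTL)} in consistent units (e.g., ng/L).
    has_method: names with a commercially available analytical method.
    Returns (name, ratio) pairs to monitor, highest ratio first."""
    monitor = [(name, mec / mtl)
               for name, (mec, mtl) in cecs.items()
               if mec / mtl > 1 and name in has_method]
    return sorted(monitor, key=lambda item: -item[1])

cecs = {"compound_A": (120.0, 50.0),   # ratio 2.4 -> monitor
        "compound_B": (10.0, 400.0),   # ratio 0.025 -> skip
        "compound_C": (90.0, 30.0)}    # ratio 3.0, but no method yet
plan = prioritize(cecs, has_method={"compound_A", "compound_B"})
```

    A compound like compound_C, excluded only for lack of a method, would flag the need for analytical method development rather than being dropped outright.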

  17. An integrative framework for sensor-based measurement of teamwork in healthcare

    PubMed Central

    Rosen, Michael A; Dietz, Aaron S; Yang, Ting; Priebe, Carey E; Pronovost, Peter J

    2015-01-01

    There is a strong link between teamwork and patient safety. Emerging evidence supports the efficacy of teamwork improvement interventions. However, the availability of reliable, valid, and practical measurement tools and strategies is commonly cited as a barrier to long-term sustainment and spread of these teamwork interventions. This article describes the potential value of sensor-based technology as a methodology to measure and evaluate teamwork in healthcare. The article summarizes the teamwork literature within healthcare, including team improvement interventions and measurement. Current applications of sensor-based measurement of teamwork are reviewed to assess the feasibility of employing this approach in healthcare. The article concludes with a discussion highlighting current application needs and gaps and relevant analytical techniques to overcome the challenges to implementation. Compelling studies exist documenting the feasibility of capturing a broad array of team input, process, and output variables with sensor-based methods. Implications of this research are summarized in a framework for development of multi-method team performance measurement systems. Sensor-based measurement within healthcare can unobtrusively capture information related to social networks, conversational patterns, physical activity, and an array of other meaningful information without having to directly observe or periodically survey clinicians. However, trust and privacy concerns present challenges that need to be overcome through engagement of end users in healthcare. Initial evidence exists to support the feasibility of sensor-based measurement to drive feedback and learning across individual, team, unit, and organizational levels. Future research is needed to refine methods, technologies, theory, and analytical strategies. PMID:25053579

  18. The Strategic Management of Accountability in Nonprofit Organizations: An Analytical Framework.

    ERIC Educational Resources Information Center

    Kearns, Kevin P.

    1994-01-01

    Offers a framework stressing the strategic and tactical choices facing nonprofit organizations and discusses policy and management implications. Claims framework is a useful tool for conducting accountability audits and conceptual foundation for discussions of public policy. (Author/JOW)

  19. Metal-Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform.

    PubMed

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T; Ohodnicki, Paul R

    2018-02-23

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and the high performance of nanoporous metal-organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method, which is critical for the manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (

  20. Adaptive steganography

    NASA Astrophysics Data System (ADS)

    Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.

    2002-04-01

    Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound on the steganographic capacity of LSB-based steganographic techniques for a given probability of false detection. In this paper we look at adaptive steganographic techniques, which take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique, demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
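
    For readers unfamiliar with the baseline this record builds on, LSB embedding replaces each pixel's least-significant bit with a message bit. The sketch below shows only that non-adaptive baseline; an adaptive scheme, as discussed in the record, would additionally select which pixels to use based on local image content or a known steganalysis detector.

```python
# Minimal (non-adaptive) LSB embedding: sequential pixels, one bit each.

def embed_lsb(pixels, bits):
    """Replace the LSB of the first len(bits) pixels with message bits."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear LSB, then set it to the bit
    return out

def extract_lsb(pixels, n):
    """Read back the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]

cover = [148, 203, 96, 55, 180, 77]   # toy 8-bit grayscale pixel values
msg = [1, 0, 1, 1]
stego = embed_lsb(cover, msg)         # each pixel changes by at most 1
assert extract_lsb(stego, 4) == msg
```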

  1. Curriculum Innovation for Marketing Analytics

    ERIC Educational Resources Information Center

    Wilson, Elizabeth J.; McCabe, Catherine; Smith, Robert S.

    2018-01-01

    College graduates need better preparation for and experience in data analytics for higher-quality problem solving. Using the curriculum innovation framework of Borin, Metcalf, and Tietje (2007) and case study research methods, we offer rich insights about one higher education institution's work to address the marketing analytics skills gap.…

  2. A Framework for Learning Analytics Using Commodity Wearable Devices

    PubMed Central

    Lu, Yu; Zhang, Sen; Zhang, Zhiqiang; Xiao, Wendong; Yu, Shengquan

    2017-01-01

    We advocate for and introduce LEARNSense, a framework for learning analytics using commodity wearable devices to capture a learner’s physical actions and accordingly infer learner context (e.g., student activities and engagement status in class). Our work is motivated by the observations that: (a) the fine-grained individual-specific learner actions are crucial to understand learners and their context information; (b) sensor data available on the latest wearable devices (e.g., wrist-worn and eye wear devices) can effectively recognize learner actions and help to infer learner context information; (c) the commodity wearable devices that are widely available on the market can provide a hassle-free and non-intrusive solution. Following the above observations and under the proposed framework, we design and implement a sensor-based learner context collector running on the wearable devices. The latest data mining and sensor data processing techniques are employed to detect different types of learner actions and context information. Furthermore, we detail all of the above efforts by offering a novel and exemplary use case: it successfully provides the accurate detection of student actions and infers the student engagement states in class. The specifically designed learner context collector has been implemented on the commodity wrist-worn device. Based on the collected and inferred learner information, the novel intervention and incentivizing feedback are introduced into the system service. A comprehensive evaluation with real-world experiments, surveys, and interviews demonstrates the effectiveness and impact of the proposed framework and this use case. The F1 score for the student action classification tasks achieves 0.9, and the system can effectively differentiate the three defined learner states. Finally, the survey results show that the learners are satisfied with the use of our system (mean score of 3.7 with a standard deviation of 0.55). PMID:28613236

  3. A Framework for Learning Analytics Using Commodity Wearable Devices.

    PubMed

    Lu, Yu; Zhang, Sen; Zhang, Zhiqiang; Xiao, Wendong; Yu, Shengquan

    2017-06-14

    We advocate for and introduce LEARNSense, a framework for learning analytics using commodity wearable devices to capture a learner's physical actions and accordingly infer learner context (e.g., student activities and engagement status in class). Our work is motivated by the observations that: (a) the fine-grained individual-specific learner actions are crucial to understand learners and their context information; (b) sensor data available on the latest wearable devices (e.g., wrist-worn and eye wear devices) can effectively recognize learner actions and help to infer learner context information; (c) the commodity wearable devices that are widely available on the market can provide a hassle-free and non-intrusive solution. Following the above observations and under the proposed framework, we design and implement a sensor-based learner context collector running on the wearable devices. The latest data mining and sensor data processing techniques are employed to detect different types of learner actions and context information. Furthermore, we detail all of the above efforts by offering a novel and exemplary use case: it successfully provides the accurate detection of student actions and infers the student engagement states in class. The specifically designed learner context collector has been implemented on the commodity wrist-worn device. Based on the collected and inferred learner information, the novel intervention and incentivizing feedback are introduced into the system service. A comprehensive evaluation with real-world experiments, surveys, and interviews demonstrates the effectiveness and impact of the proposed framework and this use case. The F1 score for the student action classification tasks achieves 0.9, and the system can effectively differentiate the three defined learner states. Finally, the survey results show that the learners are satisfied with the use of our system (mean score of 3.7 with a standard deviation of 0.55).

  4. The RISE Framework: Using Learning Analytics to Automatically Identify Open Educational Resources for Continuous Improvement

    ERIC Educational Resources Information Center

    Bodily, Robert; Nyland, Rob; Wiley, David

    2017-01-01

    The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…

  5. Quantifying risks with exact analytical solutions of derivative pricing distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin

    2017-04-01

    Derivative (i.e., option) pricing is essential for modern financial instruments. Despite previous efforts, the exact analytical forms of derivative pricing distributions remain challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain exact analytical solutions of the statistical distribution for bond and bond option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option price, characterized by the distribution tail, and their association with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
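    For context, the Vasicek short-rate model dr = a(b - r)dt + sigma dW admits a well-known closed-form (affine) zero-coupon bond price. A minimal sketch of that standard expectation-level result (the paper goes further, deriving the full pricing distribution via path integrals; parameter values below are illustrative, not from the paper):

```python
import math

def vasicek_bond_price(r0, a, b, sigma, T):
    """Closed-form Vasicek zero-coupon bond price P(0, T), with P(T, T) = 1.

    Standard affine solution P = A(T) * exp(-B(T) * r0), where
    B(T) = (1 - exp(-a*T)) / a.
    """
    B = (1.0 - math.exp(-a * T)) / a
    A = math.exp((b - sigma**2 / (2.0 * a**2)) * (B - T)
                 - sigma**2 * B**2 / (4.0 * a))
    return A * math.exp(-B * r0)

# Hypothetical parameters: 3% short rate, 4% long-run mean, 5-year maturity.
price = vasicek_bond_price(r0=0.03, a=0.5, b=0.04, sigma=0.01, T=5.0)
```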

  6. Integrated pathway-based transcription regulation network mining and visualization based on gene expression profiles.

    PubMed

    Kibinge, Nelson; Ono, Naoaki; Horie, Masafumi; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Saito, Akira; Kanaya, Shigehiko

    2016-06-01

    Conventionally, workflows examining transcription regulation networks from gene expression data involve distinct analytical steps. There is a need for pipelines that unify data mining and inference deduction into a single framework to enhance interpretation and hypothesis generation. We propose a workflow that merges network construction with gene expression data mining, focusing on regulation processes in the context of transcription-factor-driven gene regulation. The pipeline implements pathway-based modularization of expression profiles into functional units to improve biological interpretation. The integrated workflow was implemented as a web application (TransReguloNet) with functions that enable pathway visualization and comparison of transcription factor activity between sample conditions defined in the experimental design. The pipeline merges differential expression, network construction, pathway-based abstraction, clustering, and visualization. The framework was applied in the analysis of actual expression datasets related to lung, breast, and prostate cancer. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. An Illumination- and Temperature-Dependent Analytical Model for Copper Indium Gallium Diselenide (CIGS) Solar Cells

    DOE PAGES

    Sun, Xingshu; Silverman, Timothy; Garris, Rebekah; ...

    2016-07-18

    In this study, we present a physics-based analytical model for copper indium gallium diselenide (CIGS) solar cells that describes the illumination- and temperature-dependent current-voltage (I-V) characteristics and accounts for the statistical shunt variation of each cell. The model is derived by solving the drift-diffusion transport equation so that its parameters are physical and, therefore, can be obtained from independent characterization experiments. The model is validated against CIGS I-V characteristics as a function of temperature and illumination intensity. This physics-based model can be integrated into a large-scale simulation framework to optimize the performance of solar modules, as well as predict the long-term output yields of photovoltaic farms under different environmental conditions.
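    As a point of reference for how illumination- and temperature-dependent I-V characteristics with shunt losses are typically parameterized, here is a minimal single-diode sketch (a generic textbook approximation, not the paper's drift-diffusion-derived model; parameter values are hypothetical):

```python
import math

def diode_current(V, J_ph, J_0, n, R_sh, T=298.15):
    """Generic single-diode current density J(V) for an illuminated cell.

    J_ph: photocurrent (A/cm^2); J_0: saturation current; n: ideality
    factor; R_sh: shunt resistance (ohm*cm^2); T: temperature (K).
    """
    V_t = 8.617e-5 * T  # thermal voltage kT/q, in volts
    return J_ph - J_0 * (math.exp(V / (n * V_t)) - 1.0) - V / R_sh

# At short circuit (V = 0) the current equals the photocurrent;
# a finite shunt resistance drains current at forward bias.
J_sc = diode_current(0.0, J_ph=0.035, J_0=1e-12, n=1.5, R_sh=1e3)
```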

  8. A proposed framework for the interpretation of biomonitoring data

    PubMed Central

    Boogaard, Peter J; Money, Chris D

    2008-01-01

    Biomonitoring, the determination of chemical substances in human body fluids or tissues, is applied more and more frequently, while detection limits are decreasing steadily. As a consequence, many data with potential relevance for public health are generated, although they do not necessarily allow interpretation in terms of health relevance. The European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) formed a dedicated task force to build a framework for the interpretation of biomonitoring data. The framework that was developed evaluates biomonitoring data based on their analytical integrity, their ability to describe dose (toxicokinetics), their ability to relate to effects, and an overall evaluation and weight-of-evidence analysis. This framework was subsequently evaluated with a number of case studies and was shown to provide a rational basis for advancing discussions on human biomonitoring, allowing better use and application of this type of data in human health risk assessment. PMID:18541066

  9. Critical thinking: a two-phase framework.

    PubMed

    Edwards, Sharon L

    2007-09-01

    This article provides a comprehensive review of how a two-phase framework can promote and engage nurses in the concepts of critical thinking. Nurse educators are required to integrate critical thinking into their teaching strategies, as it is widely recognised as an important part of student nurses becoming analytical, qualified practitioners. The two-phase framework can be incorporated in the classroom using enquiry-based scenarios, or used to investigate situations that arise from practice, for reflection, analysis, theorising, or to explore issues. This paper proposes a two-phase framework for incorporation in the classroom and practice to promote critical thinking. Phase 1 aims to make it easier for nurses to organise and expound the often complex and abstract ideas that arise when using critical thinking, and to identify more than one solution to a problem by using a variety of cues to facilitate action. Phase 2 encourages nurses to be accountable and responsible, to justify decisions, and to be creative and innovative in implementing change.

  10. Using Framework Analysis in nursing research: a worked example.

    PubMed

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  11. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  12. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  13. Theory of precipitation effects on dead cylindrical fuels

    Treesearch

    Michael A. Fosberg

    1972-01-01

    Numerical and analytical solutions of the Fickian diffusion equation were used to determine the effects of precipitation on dead cylindrical forest fuels. The analytical solution provided a physical framework. The numerical solutions were then used to refine the analytical solution through a similarity argument. The theoretical solutions predicted realistic rates of...

  14. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    PubMed

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP, based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative analysis of heterogeneous data types in the development of complex botanicals such as polyphenols for eventual clinical and translational applications.

  15. An Affordance-Based Framework for Human Computation and Human-Computer Collaboration.

    PubMed

    Crouser, R J; Chang, R

    2012-12-01

    Visual Analytics is "the science of analytical reasoning facilitated by visual interactive interfaces". The goal of this field is to develop tools and methodologies for approaching problems whose size and complexity render them intractable without the close coupling of both human and machine analysis. Researchers have explored this coupling in many venues: VAST, Vis, InfoVis, CHI, KDD, IUI, and more. While there have been myriad promising examples of human-computer collaboration, there exists no common language for comparing systems or describing the benefits afforded by designing for such collaboration. We argue that this area would benefit significantly from consensus about the design attributes that define and distinguish existing techniques. In this work, we have reviewed 1,271 papers from many of the top-ranking conferences in visual analytics, human-computer interaction, and visualization. From these, we have identified 49 papers that are representative of the study of human-computer collaborative problem-solving, and we provide a thorough overview of the current state of the art. Our analysis has uncovered key patterns of design hinging on human- and machine-intelligence affordances, and it also indicates unexplored avenues in the study of this area. The results of this analysis provide a common framework for understanding these seemingly disparate branches of inquiry, which we hope will motivate future work in the field.

  16. Analytical solutions of travel time to a pumping well with variable evapotranspiration.

    PubMed

    Chen, Tian-Fei; Wang, Xu-Sheng; Wan, Li; Li, Hailong

    2014-01-01

    Analytical solutions of groundwater travel time to a pumping well in an unconfined aquifer have been developed in previous studies; however, the change in evapotranspiration was not considered. Here, we develop a mathematical model of unconfined flow toward a discharge well with redistribution of groundwater evapotranspiration for travel time analysis. The dependency of groundwater evapotranspiration on the depth to the water table is described using a linear formula with an extinction depth. Analytical solutions for groundwater level and travel time are obtained. For a typical hypothetical example, these solutions perfectly agree with numerical simulation results based on MODFLOW and MODPATH. As indicated in a dimensionless framework, a lumped parameter proportional to the pumping rate controls the distributions of the groundwater evapotranspiration rate and the travel time along the radial direction. © 2013, National Ground Water Association.
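    The linear depth-dependence with an extinction depth described above can be sketched as a simple piecewise-linear relation, akin to the one used in MODFLOW's evapotranspiration package (variable names and values below are ours, for illustration only):

```python
def et_rate(depth_to_water, et_max, extinction_depth):
    """Linear ET-depth relation: full rate at the surface, declining
    linearly to zero at and below the extinction depth."""
    if depth_to_water >= extinction_depth:
        return 0.0
    return et_max * (1.0 - depth_to_water / extinction_depth)

# Hypothetical values: maximum ET of 5 mm/day, extinction depth of 4 m.
rates = [et_rate(d, et_max=5.0, extinction_depth=4.0) for d in (0.0, 2.0, 4.0)]
```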

  17. Nature-based supportive care opportunities: a conceptual framework.

    PubMed

    Blaschke, Sarah; O'Callaghan, Clare C; Schofield, Penelope

    2018-03-22

    Given preliminary evidence for positive health outcomes related to contact with nature for cancer populations, research is warranted to ascertain possible strategies for incorporating nature-based care opportunities into oncology contexts as additional strategies for addressing multidimensional aspects of cancer patients' health and recovery needs. The objective of this study was to consolidate existing research related to nature-based supportive care opportunities and generate a conceptual framework for discerning relevant applications in the supportive care setting. Drawing on research investigating nature-based engagement in oncology contexts, a two-step analytic process was used to construct a conceptual framework for guiding nature-based supportive care design and future research. Concept analysis methodology generated new representations of understanding by extracting and synthesising salient concepts. Newly formulated concepts were transposed to findings from related research about patient-reported and healthcare expert-developed recommendations for nature-based supportive care in oncology. Five theoretical concepts (themes) were formulated describing patients' reasons for engaging with nature and the underlying needs these interactions address. These included: connecting with what is genuinely valued, distancing from the cancer experience, meaning-making and reframing the cancer experience, finding comfort and safety, and vital nurturance. Eight shared patient and expert recommendations were compiled, which address the identified needs through nature-based initiatives. Eleven additional patient-reported recommendations attend to beneficial and adverse experiential qualities of patients' nature-based engagement and complete the framework. The framework outlines salient findings about helpful nature-based supportive care opportunities for ready access by healthcare practitioners, designers, researchers and patients themselves. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. Toward an Analytic Framework of Interdisciplinary Reasoning and Communication (IRC) Processes in Science

    NASA Astrophysics Data System (ADS)

    Shen, Ji; Sung, Shannon; Zhang, Dongmei

    2015-11-01

    Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse content, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes: integration, translation, transfer, and transformation, and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.

  19. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of the data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread use in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and computation of related confidence limits. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation or interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
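    The left-censored case is commonly handled by flipping values about a constant larger than the data maximum, which turns "below detection limit" values into right-censored ones so the product-limit estimator applies. A minimal pure-Python sketch of this idea (the data, flipping constant, and function names are hypothetical, not from the described S-language software):

```python
def km_survival(times, observed):
    """Kaplan-Meier product-limit estimator for right-censored data.

    times    -- event/censoring times
    observed -- True if the event was observed, False if censored
    Returns a list of (time, S(t)) steps at each observed event time.
    """
    data = sorted(zip(times, observed))
    n_at_risk = len(data)
    s = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, ob in data[i:] if tt == t and ob)      # events at t
        c = sum(1 for tt, ob in data[i:] if tt == t and not ob)  # censored at t
        if d:
            s *= 1.0 - d / n_at_risk
            steps.append((t, s))
        n_at_risk -= d + c
        i += d + c
    return steps

# Left-censored concentrations: flip about a constant M larger than the
# maximum value, estimate survival, then map back via F(x) = S(M - x).
concs    = [1.2, 0.5, 0.5, 2.0, 3.1, 0.8]          # hypothetical data
detected = [True, False, False, True, True, True]  # False = "<DL" value
M = 10.0
steps = km_survival([M - x for x in concs], detected)
```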

  20. Electrocardiographic interpretation skills of cardiology residents: are they competent?

    PubMed

    Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C

    2014-12-01

    Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 cardiology residents from all 3 years of a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348), including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted with an analytic framework and those interpreted without one (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, significant deficiencies in ECG interpretation remain among cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  1. Psychological effects of Tai Chi Chuan.

    PubMed

    Jimenez, P J; Melendez, A; Albers, U

    2012-01-01

    This article reviews the scientific studies which have been carried out at the international level on the psychological benefits that Tai Chi Chuan (TCC) brings to those who practice it. It analyzes the framework in which the research was performed, the real benefits that this activity achieves and their causes. The present article brings a new analytical perspective to the reviews carried out to date in regard to classifying and analyzing the psychological variables involved in the practice of TCC and offers a homogeneous framework within which to develop research in this field based on the model proposed by Spirduso et al. (2005). Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  2. Surface-enhanced chiroptical spectroscopy with superchiral surface waves.

    PubMed

    Pellegrini, Giovanni; Finazzi, Marco; Celebrano, Michele; Duò, Lamberto; Biagioni, Paolo

    2018-07-01

    We study the chiroptical properties of one-dimensional photonic crystals supporting superchiral surface waves by introducing a simple formalism based on the Fresnel reflection matrix. We show that the proposed framework provides useful insights on the behavior of all the relevant chiroptical quantities, allowing for a deeper understanding of surface-enhanced chiral sensing platforms based on one-dimensional photonic crystals. Finally, we analyze and discuss the limitations of such platforms as the surface concentration of the target chiral analytes is gradually increased. © 2018 Wiley Periodicals, Inc.

  3. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education.

    PubMed

    Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-10-06

    Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach.

  4. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

    Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators’ decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840

  5. Teacher Change in an Era of Neo-Liberal Policies: A Neoinstitutional Analysis of Teachers' Perceptions of Their Professional Change

    ERIC Educational Resources Information Center

    Ramberg, Magnus Rye

    2014-01-01

    The aim of this article is to explore how neo-institutional theory may be applied as an analytical framework to investigate the relationships between teachers' perceptions on their professional change on the one hand, and the numerous change efforts embedded in recent neo-liberal educational policies in Norway on the other. Based on biographical…

  6. Development Process of a Praxeology for Supporting the Teaching of Proofs in a CAS Environment Based on Teachers' Experience in a Professional Development Course

    ERIC Educational Resources Information Center

    Zehavi, Nurit; Mann, Giora

    2011-01-01

    This paper presents the development process of a "praxeology" (theory-of-practice) for supporting the teaching of proofs in a CAS environment. The characteristics of the praxeology were elaborated within the frame of a professional development course for teaching analytic geometry with CAS. The theoretical framework draws on Chevallard's…

  7. Development Of Antibody-Based Fiber-Optic Sensors

    NASA Astrophysics Data System (ADS)

    Tromberg, Bruce J.; Sepaniak, Michael J.; Vo-Dinh, Tuan

    1988-06-01

    The speed and specificity characteristic of immunochemical complex formation has encouraged the development of numerous antibody-based analytical techniques. The scope and versatility of these established methods can be enhanced by combining the principles of conventional immunoassay with laser-based fiber-optic fluorimetry. This merger of spectroscopy and immunochemistry provides the framework for the construction of highly sensitive and selective fiber-optic devices (fluoroimmuno-sensors) capable of in-situ detection of drugs, toxins, and naturally occurring biochemicals. Fluoroimmuno-sensors (FIS) employ an immobilized reagent phase at the sampling terminus of a single quartz optical fiber. Laser excitation of antibody-bound analyte produces a fluorescence signal which is either directly proportional (as in the case of natural fluorophor and "antibody sandwich" assays) or inversely proportional (as in the case of competitive-binding assays) to analyte concentration. Factors which influence analysis time, precision, linearity, and detection limits include the nature (solid or liquid) and amount of the reagent phase, the method of analyte delivery (passive diffusion, convection, etc.), and whether equilibrium or non-equilibrium assays are performed. Data will be presented for optical fibers whose sensing termini utilize: (1) covalently-bound solid antibody reagent phases, and (2) membrane-entrapped liquid antibody reagents. Assays for large-molecular weight proteins (antigens) and small-molecular weight, carcinogenic, polynuclear aromatics (haptens) will be considered. In this manner, the influence of a system's chemical characteristics and measurement requirements on sensor design, and the consequence of various sensor designs on analytical performance will be illustrated.

  8. Determination of a risk management primer at petroleum-contaminated sites: developing new human health risk assessment strategy.

    PubMed

    Park, In-Sun; Park, Jae-Woo

    2011-01-30

    Total petroleum hydrocarbon (TPH) is an important environmental contaminant that is toxic to human and environmental receptors. However, human health risk assessment for petroleum, oil, and lubricant (POL)-contaminated sites is especially challenging because TPH is not a single compound, but rather a mixture of numerous substances. To address this concern, this study recommends a new human health risk assessment strategy for POL-contaminated sites. The strategy is based on a newly modified TPH fractionation method and includes an improved analytical protocol. The proposed TPH fractionation method is composed of ten fractions (i.e., aliphatic and aromatic EC8-10, EC10-12, EC12-16, EC16-22, and EC22-40). Physicochemical properties and toxicity values of each fraction were newly defined in this study. A stepwise ultrasonication-based analytical process was established to measure the TPH fractions. Analytical results were compared with those from the TPH Criteria Working Group (TPHCWG) Direct Method. Better analytical efficiencies for TPH and for the aliphatic and aromatic fractions were achieved when contaminated soil samples were analyzed with the new analytical protocol. Finally, a human health risk assessment was performed based on the developed tiered risk assessment framework. Results showed that a detailed quantitative risk assessment should be conducted to determine scientifically and economically appropriate cleanup target levels, although the phase II process is useful for determining the potency of human health risks posed by POL contamination. Copyright © 2010 Elsevier B.V. All rights reserved.

  9. The NIH analytical methods and reference materials program for dietary supplements.

    PubMed

    Betz, Joseph M; Fisher, Kenneth D; Saldanha, Leila G; Coates, Paul M

    2007-09-01

    The quality of botanical products is a major source of uncertainty for consumers, clinicians, regulators, and researchers. Definitions of quality abound, and include specifications for sanitation, adventitious agents (pesticides, metals, weeds), and content of natural chemicals. Because dietary supplements (DS) are often complex mixtures, they pose analytical challenges, and method validation may be difficult. In response to product quality concerns and the need for validated and publicly available methods for DS analysis, the US Congress directed the Office of Dietary Supplements (ODS) at the National Institutes of Health (NIH) to accelerate an ongoing methods validation process, and the Dietary Supplements Methods and Reference Materials Program was created. The program was constructed from stakeholder input and incorporates several federal procurement and granting mechanisms in a coordinated and interlocking framework. The framework facilitates the validation of analytical methods, analytical standards, and reference materials.

  10. Recent progress of chiral stationary phases for separation of enantiomers in gas chromatography.

    PubMed

    Xie, Sheng-Ming; Yuan, Li-Ming

    2017-01-01

    Chromatography techniques based on chiral stationary phases are widely used for the separation of enantiomers. In particular, gas chromatography has developed rapidly in recent years owing to merits such as fast analysis, low consumption of stationary phase and analyte, and high column efficiency, which make it a good choice for chiral separation in diverse industries. This article summarizes recent progress on novel chiral stationary phases for gas chromatography based on cyclofructan derivatives and chiral porous materials, including chiral metal-organic frameworks, chiral porous organic frameworks, chiral inorganic mesoporous materials, and chiral porous organic cages, covering original research papers published since 2010. The chiral recognition properties and separation mechanisms toward enantiomers are also introduced. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Boosting Population Quits Through Evidence-Based Cessation Treatment and Policy

    PubMed Central

    Abrams, David B.; Graham, Amanda L.; Levy, David T.; Mabry, Patricia L.; Orleans, C. Tracy

    2015-01-01

    Only large increases in adult cessation will rapidly reduce population smoking prevalence. Evidence-based smoking-cessation treatments and treatment policies exist but are underutilized. More needs to be done to coordinate the widespread, efficient dissemination and implementation of effective treatments and policies. This paper is the first in a series of three to demonstrate the impact of an integrated, comprehensive systems approach to cessation treatment and policy. This paper provides an analytic framework and selected literature review that guide the two subsequent computer simulation modeling papers to show how critical leverage points may have an impact on reductions in smoking prevalence. Evidence is reviewed from the U.S. Public Health Service 2008 clinical practice guideline and other sources regarding the impact of five cessation treatment policies on quit attempts, use of evidence-based treatment, and quit rates. Cessation treatment policies would: (1) expand cessation treatment coverage and provider reimbursement; (2) mandate adequate funding for the use and promotion of evidence-based state-sponsored telephone quitlines; (3) support healthcare systems changes to prompt, guide, and incentivize tobacco treatment; (4) support and promote evidence-based treatment via the Internet; and (5) improve individually tailored, stepped-care approaches and the long-term effectiveness of evidence-based treatments. This series of papers provides an analytic framework to inform heuristic simulation models in order to take a new look at ways to markedly increase population smoking cessation by implementing a defined set of treatments and treatment-related policies with the potential to improve motivation to quit, evidence-based treatment use, and long-term effectiveness. PMID:20176308

  12. Which is more effective for suppressing an infectious disease: imperfect vaccination or defense against contagion?

    NASA Astrophysics Data System (ADS)

    Kuga, Kazuki; Tanimoto, Jun

    2018-02-01

    We consider two imperfect ways to protect against an infectious disease such as influenza, namely vaccination giving only partial immunity and a defense against contagion such as wearing a mask. We build up a new analytic framework considering those two cases instead of the perfect vaccination conventionally assumed as a premise, with the assumption of an infinite and well-mixed population. Our framework also considers three different strategy-updating rules based on evolutionary game theory: conventional pairwise comparison with one randomly selected agent, another concept of pairwise comparison referring to a social average, and direct alternative selection not depending on the usual copying concept. We successfully obtain a phase diagram in which vaccination coverage at equilibrium can be compared when assuming the model of either imperfect vaccination or a defense against contagion. The obtained phase diagram reveals that a defense against contagion is marginally inferior to an imperfect vaccination as long as the same coefficient value is used.
    Highlights:
    - We build a new analytical framework for a vaccination game combined with the susceptible-infected-recovered (SIR) model.
    - Our model can evaluate imperfect provisions such as vaccination giving only partial immunity and a defense against contagion.
    - We obtain a phase diagram with which to compare the quantitative effects of partial vaccination and a defense against contagion.
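    A minimal numerical sketch can illustrate the kind of comparison this abstract describes: the same coverage and the same protection coefficient applied either as an all-or-nothing ("effectiveness") vaccine or as a leaky ("efficiency") defense such as a mask. The parameter values and the simple Euler SIR integration below are illustrative assumptions, not the paper's evolutionary-game framework.

```python
def final_epidemic_size(beta=0.8, gamma=0.2, coverage=0.5, eff=0.6,
                        mode="vaccine", steps=10000, dt=0.05):
    """Euler integration of a well-mixed SIR model with an imperfect
    protective measure (a sketch; parameters are illustrative).

    mode="vaccine": all-or-nothing model -- a fraction eff of the
        vaccinated gain full immunity, the rest remain fully susceptible.
    mode="mask": leaky model -- all protected individuals remain
        susceptible, with susceptibility multiplied by (1 - eff).
    """
    if mode == "vaccine":
        s_prot = (1.0 - eff) * coverage  # vaccinees for whom it failed
        red = 1.0                        # ...who are fully susceptible
    else:
        s_prot = coverage                # mask wearers stay susceptible
        red = 1.0 - eff                  # ...at reduced susceptibility
    s_free = 1.0 - coverage
    s0 = s_prot + s_free
    i = 1e-4  # small seed infection
    for _ in range(steps):
        new_prot = beta * red * s_prot * i * dt
        new_free = beta * s_free * i * dt
        s_prot -= new_prot
        s_free -= new_free
        i += new_prot + new_free - gamma * i * dt
    return s0 - s_prot - s_free  # fraction of the population ever infected

size_vac = final_epidemic_size(mode="vaccine")
size_mask = final_epidemic_size(mode="mask")
```

    In this toy model the leaky defense yields the larger final epidemic size at equal coefficient values, qualitatively consistent with the abstract's finding that a defense against contagion is inferior to an imperfect vaccination.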

  13. Morphology predicts species' functional roles and their degree of specialization in plant-frugivore interactions.

    PubMed

    Dehling, D Matthias; Jordano, Pedro; Schaefer, H Martin; Böhning-Gaese, Katrin; Schleuning, Matthias

    2016-01-27

    Species' functional roles in key ecosystem processes such as predation, pollination or seed dispersal are determined by the resource use of consumer species. An interaction between resource and consumer species usually requires trait matching (e.g. a congruence in the morphologies of interaction partners). Species' morphology should therefore determine species' functional roles in ecological processes mediated by mutualistic or antagonistic interactions. We tested this assumption for Neotropical plant-bird mutualisms. We used a new analytical framework that assesses a species's functional role based on the analysis of the traits of its interaction partners in a multidimensional trait space. We employed this framework to test (i) whether there is correspondence between the morphology of bird species and their functional roles and (ii) whether morphologically specialized birds fulfil specialized functional roles. We found that morphological differences between bird species reflected their functional differences: (i) bird species with different morphologies foraged on distinct sets of plant species and (ii) morphologically distinct bird species fulfilled specialized functional roles. These findings encourage further assessments of species' functional roles through the analysis of their interaction partners, and the proposed analytical framework facilitates a wide range of novel analyses for network and community ecology. © 2016 The Author(s).

  14. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  15. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increasing data demands on computing require redesigning VA tools to consider performance and reliability when analyzing exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
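    To make the two atomic operators concrete, here is a deliberately minimal, hypothetical rendering in Python; the function names and the edge-list representation are assumptions for illustration, not the paper's actual algebra.

```python
from collections import defaultdict

def select(edges, node_pred):
    """Selection: keep only edges whose endpoints satisfy a predicate."""
    return [(u, v) for (u, v) in edges if node_pred(u) and node_pred(v)]

def aggregate(edges, group_of):
    """Aggregation: merge nodes into groups and count inter-group edges,
    producing a smaller summary graph suitable for visual exploration."""
    summary = defaultdict(int)
    for u, v in edges:
        gu, gv = group_of(u), group_of(v)
        if gu != gv:
            summary[tuple(sorted((gu, gv)))] += 1
    return dict(summary)

edges = [("a1", "a2"), ("a1", "b1"), ("a2", "b2"), ("b1", "b2")]
# Select the subgraph induced by cluster "a", then aggregate by cluster.
sub = select(edges, lambda n: n.startswith("a"))
summary = aggregate(edges, lambda n: n[0])
# sub is [("a1", "a2")]; summary is {("a", "b"): 2}
```

    Composing such operators is what makes the algebra useful for scalability: aggregation collapses a large graph to a few summary nodes before any visual rendering is attempted.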

  16. Is harm reduction profitable? An analytical framework for corporate social responsibility based on an epidemic model of addictive consumption.

    PubMed

    Massin, Sophie

    2012-06-01

    This article aims to help resolve the apparent paradox of producers of addictive goods who claim to be socially responsible while marketing a product clearly identified as harmful. It argues that reputation effects are crucial here and that determining whether harm reduction practices are costly or profitable for producers can help assess the sincerity of their discourse. An analytical framework based on an epidemic model of addictive consumption that includes a deterrent effect of heavy use on initiation is developed. This framework enables us to establish a clear distinction between a simple responsible discourse and genuine harm reduction practices and, among harm reduction practices, between use reduction practices and micro harm reduction practices. Using simulations based on tobacco sales in France from 1950 to 2008, we explore the impact of three corresponding types of action (communication on damage, restraint of selling practices, and development of safer products) on total sales and on the social cost. We notably find that restraining selling practices toward light users, that is, preventing light users from escalating to heavy use, can be profitable for the producer, especially at early stages of the epidemic, but that such practices also contribute to increasing the social cost. These results suggest that the existence of a deterrent effect of heavy use on the initiation of the consumption of an addictive good can shed new light on important issues, such as the motivations for corporate social responsibility and the definition of responsible actions in the particular case of harm reduction. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Variational and perturbative formulations of quantum mechanical/molecular mechanical free energy with mean-field embedding and its analytical gradients.

    PubMed

    Yamamoto, Takeshi

    2008-12-28

    Conventional quantum chemical solvation theories are based on the mean-field embedding approximation. That is, the electronic wavefunction is calculated in the presence of the mean field of the environment. In this paper a direct quantum mechanical/molecular mechanical (QM/MM) analog of such a mean-field theory is formulated based on variational and perturbative frameworks. In the variational framework, an appropriate QM/MM free energy functional is defined and is minimized in terms of the trial wavefunction that best approximates the true QM wavefunction in a statistically averaged sense. An analytical free energy gradient is obtained, which takes the form of the gradient of the effective QM energy calculated in the averaged MM potential. In the perturbative framework, the above variational procedure is shown to be equivalent to the first-order expansion of the QM energy (in the exact free energy expression) about the self-consistent reference field. This helps clarify the relation between the variational procedure and the exact QM/MM free energy as well as existing QM/MM theories. Based on this, several ways are discussed for evaluating non-mean-field effects (i.e., statistical fluctuations of the QM wavefunction) that are neglected in the mean-field calculation. As an illustration, the method is applied to an SN2 Menshutkin reaction in water, NH3 + CH3Cl → NH3CH3+ + Cl-, for which free energy profiles are obtained at the Hartree-Fock, MP2, B3LYP, and BHHLYP levels by integrating the free energy gradient. Non-mean-field effects are evaluated to be <0.5 kcal/mol using a Gaussian fluctuation model for the environment, which suggests that those effects are rather small for the present reaction in water.
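    The mean-field variational construction described above can be written schematically as follows (notation simplified from the paper; β = 1/kBT, R denotes the MM environment coordinates, and the precise definitions should be taken from the original work):

```latex
% Mean-field QM/MM free energy functional (schematic)
\mathcal{A}[\Psi] \;=\;
  \langle \Psi |\, \hat{H}_{\mathrm{QM}} \,| \Psi \rangle
  \;-\; \frac{1}{\beta}\,
  \ln \!\int \! d\mathbf{R}\;
  e^{-\beta \left[ \langle \Psi |\, \hat{V}_{\mathrm{QM/MM}}(\mathbf{R}) \,| \Psi \rangle
                   \,+\, V_{\mathrm{MM}}(\mathbf{R}) \right]}
```

    Making this functional stationary with respect to a normalized trial wavefunction yields an effective eigenproblem of the form (Ĥ_QM + ⟨V̂_QM/MM⟩)|Ψ⟩ = ε|Ψ⟩, where ⟨·⟩ is the Boltzmann average over the MM configurations; this is the sense in which the wavefunction is computed "in the averaged MM potential."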

  18. Metal-organic framework based in-syringe solid-phase extraction for the on-site sampling of polycyclic aromatic hydrocarbons from environmental water samples.

    PubMed

    Zhang, Xiaoqiong; Wang, Peiyi; Han, Qiang; Li, Hengzhen; Wang, Tong; Ding, Mingyu

    2018-04-01

    In-syringe solid-phase extraction is a promising sample pretreatment method for the on-site sampling of water samples because of its outstanding advantages of portability, simple operation, short extraction time, and low cost. In this work, a novel in-syringe solid-phase extraction device using metal-organic frameworks as the adsorbent was fabricated for the on-site sampling of polycyclic aromatic hydrocarbons from environmental waters. Trace polycyclic aromatic hydrocarbons were effectively extracted with the self-made device, followed by gas chromatography with mass spectrometry analysis. Owing to the excellent adsorption performance of metal-organic frameworks, the analytes could be completely adsorbed during one adsorption cycle, thus effectively shortening the extraction time. Moreover, the adsorbed analytes could remain stable on the device for at least 7 days, revealing the potential of the self-made device for on-site sampling of degradable compounds in remote regions. The limit of detection ranged from 0.20 to 1.9 ng/L under the optimum conditions. Satisfactory recoveries varying from 84.4 to 104.5% and relative standard deviations below 9.7% were obtained in the analysis of real samples. The results of this study promote the application of metal-organic frameworks in sample preparation and demonstrate the great potential of in-syringe solid-phase extraction for the on-site sampling of trace contaminants in environmental waters. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. A theoretical framework for analyzing the effect of external change on tidal dynamics in estuaries

    NASA Astrophysics Data System (ADS)

    CAI, H.; Savenije, H.; Toffolon, M.

    2013-12-01

    The most densely populated areas of the world are usually located in coastal areas near estuaries. As a result, estuaries are often subject to intense human interventions, such as dredging for navigation, dam construction, and fresh-water withdrawal, which in some areas have led to serious deterioration of invaluable ecosystems. Hence it is important to understand the influence of such interventions on tidal dynamics in these areas. In this study, we present a consistent theoretical framework for tidal hydrodynamics that can be used as a rapid assessment technique to assist policy makers and managers in making considered decisions for the protection and management of the estuarine environment when assessing the effect of human interventions in estuaries. Analytical solutions to the one-dimensional St. Venant equations for the tidal hydrodynamics in convergent unbounded estuaries with negligible river discharge can be cast in the form of a set of four implicit dimensionless equations for phase lag, velocity amplitude, damping, and wave celerity, as a function of two localized parameters describing friction and convergence. This method allows for the comparison of different analytical approaches by rewriting their solutions in the same format. In this study, classical and more recent formulations are compared, showing the differences and similarities associated with their specific simplifications. The envelope method, which is based on consideration of the dynamics at high water and low water, can be used to derive damping equations that use different friction approximations. This yields as many analytical solutions and thereby allows one to build a consistent theoretical framework. Analysis of the asymptotic behaviour of the equations shows that an equilibrium tidal amplitude exists, reflecting the balance between friction and channel convergence. The framework is subsequently extended to take into account the effect of river discharge. Hence, the analytical solutions are applicable even in the upstream part of an estuary, where the influence of river discharge is considerable. The proposed analytical solutions are transparent and practical, allowing a quantitative and qualitative assessment of human interventions (e.g., dredging, flow reduction) on tidal dynamics. Moreover, they are rapid assessment techniques that enable users to set up a simple model and to understand the functioning of the system with a minimum of required information. The analytical model is illustrated in three large-scale estuaries significantly influenced by human activities: the Scheldt estuary in the Netherlands and the Modaomen and Yangtze estuaries in China. In these estuaries, the correspondence with observations is good, which suggests that the proposed model is a useful, realistic, and reliable instrument for quick detection of the effect of human interventions on tidal dynamics and subsequent environmental issues, such as salt intrusion.

  20. Analytical closed-form solutions to the elastic fields of solids with dislocations and surface stress

    NASA Astrophysics Data System (ADS)

    Ye, Wei; Paliwal, Bhasker; Ougazzaden, Abdallah; Cherkaoui, Mohammed

    2013-07-01

    The concept of eigenstrain is adopted to derive a general analytical framework for solving the elastic field of 3D anisotropic solids with general defects while accounting for surface stress. The formulation shows that the elastic constants and geometrical features of the surface play an important role in determining the elastic fields of the solid. As an application, analytical closed-form solutions to the stress fields of an infinite isotropic circular nanowire are obtained. The stress fields are compared with the classical solutions and with those of the complex variable method. The stress fields from this work demonstrate the impact of surface stress as the size of the nanowire shrinks, an effect that becomes negligible at the macroscopic scale. Compared with the power-series solutions of the complex variable method, the analytical solutions in this work provide a better platform and are more flexible in various applications. More importantly, the proposed analytical framework substantially advances the study of general 3D anisotropic materials with surface effects.

  1. A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.

    2013-01-01

    This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…

  2. Information of Complex Systems and Applications in Agent Based Modeling.

    PubMed

    Bao, Lei; Fritchman, Joseph C

    2018-04-18

    Information about a system's internal interactions is important for modeling the system's dynamics. This study examines the finer categories of the definition of information and explores the features of a type of local information that describes the internal interactions of a system. Based on the results, a dual-space agent and information modeling framework (AIM) is developed by explicitly distinguishing an information space from the material space. The two spaces can evolve both independently and interactively. The dual-space framework can provide new analytic methods for agent-based models (ABMs). Three examples are presented: money distribution, individuals' economic evolution, and an artificial stock market. The results are analyzed in the dual space, which more clearly shows the interactions and evolutions within and between the information and material spaces. The outcomes demonstrate the wide-ranging applicability of dual-space AIMs for modeling and analyzing a broad range of interactive and intelligent systems.
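    As a sketch of the first example (money distribution), here is a common textbook random-exchange economy in plain Python; the exchange rule and parameter values are illustrative assumptions, not the paper's exact model or its dual-space analysis.

```python
import random

def money_exchange(n_agents=500, m0=100.0, steps=200_000, seed=1):
    """A minimal random-exchange economy: at each step a random pair of
    agents meets and one unit of money moves from one to the other if the
    payer can afford it. Total money is conserved by construction."""
    rng = random.Random(seed)
    money = [m0] * n_agents  # everyone starts with the same endowment
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i != j and money[i] >= 1.0:
            money[i] -= 1.0
            money[j] += 1.0
    return money

wealth = money_exchange()
```

    Even though every interaction is symmetric, the stationary wealth distribution of such models becomes strongly unequal, which is the kind of emergent, interaction-driven outcome an ABM is meant to expose.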

  3. A new approach to the concept of "relevance" in information retrieval (IR).

    PubMed

    Kagolovsky, Y; Möhr, J R

    2001-01-01

    The concept of "relevance" is fundamental to information science in general and to information retrieval in particular. Although "relevance" is extensively used in the evaluation of information retrieval, there are considerable problems in reaching agreement on its definition, meaning, evaluation, and application in information retrieval. There are a number of different views on "relevance" and its use for evaluation. Based on a review of the literature, the main problems associated with the concept of "relevance" in information retrieval are identified. The authors argue that a solution to these problems can be based on a conceptual IR framework built using a systems-analytic approach to IR. Using this framework, different kinds of "relevance" relationships in the IR process are identified, and a methodology for evaluating "relevance" based on methods of semantics capturing and comparison is proposed.

  4. Highly stable aluminum-based metal-organic frameworks as biosensing platforms for assessment of food safety.

    PubMed

    Liu, Chun-Sen; Sun, Chun-Xiao; Tian, Jia-Yue; Wang, Zhuo-Wei; Ji, Hong-Fei; Song, Ying-Pan; Zhang, Shuai; Zhang, Zhi-Hong; He, Ling-Hao; Du, Miao

    2017-05-15

    Two unique immunosensors made of aluminum-based metal-organic frameworks (MOFs), namely 515- and 516-MOF, with 4,4',4''-nitrilotribenzoic acid (H3NTB) were successfully obtained to efficiently assess food safety. The as-prepared 515- and 516-MOFs exhibited superior thermal and physicochemical stability, high electrochemical activity, and good biocompatibility. Of the two, 516-MOF showed the better biosensing ability toward analytes, as determined by electrochemical techniques. The developed 516-MOF-based electrochemical biosensor not only demonstrated high sensitivity, with low detection limits of 0.70 and 0.40 pg/mL toward vomitoxin and salbutamol, respectively, but also showed good selectivity in the presence of other interferences. Therefore, with the advantages of high sensitivity, good selectivity, and simple operation, this new strategy is believed to have great potential for the simple and convenient detection of poisonous and harmful residues in food. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  6. Analytical connection between thresholds and immunization strategies of SIS model in random networks

    NASA Astrophysics Data System (ADS)

    Zhou, Ming-Yang; Xiong, Wen-Man; Liao, Hao; Wang, Tong; Wei, Zong-Wen; Fu, Zhong-Qian

    2018-05-01

    Devising effective strategies for hindering the propagation of viruses and protecting the population against epidemics is critical for public security and health. Despite a number of studies on this topic based on the susceptible-infected-susceptible (SIS) model, we still lack a general framework for comparing different immunization strategies in completely random networks. Here, we address this problem by suggesting a novel method based on heterogeneous mean-field theory for the SIS model. Our method establishes the relationship between the epidemic thresholds and different immunization strategies in completely random networks. Moreover, we provide an analytical argument that the targeted large-degree strategy achieves the best performance in random networks with arbitrary degree distribution. Experimental results demonstrate the effectiveness of the proposed method in both artificial and real-world networks.
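    The standard heterogeneous mean-field result behind such comparisons can be sketched numerically. The threshold formula lambda_c = <k>/<k^2> is the textbook uncorrelated mean-field expression for SIS; the degree sequence below and the simplified immunization routine (which just recomputes the moments over surviving nodes, ignoring edges lost by their neighbours) are illustrative assumptions, not the paper's derivation.

```python
import random

def hmf_threshold(degrees):
    """Heterogeneous mean-field SIS epidemic threshold for an uncorrelated
    network: lambda_c = <k> / <k^2>."""
    n = len(degrees)
    k1 = sum(degrees) / n
    k2 = sum(d * d for d in degrees) / n
    return k1 / k2

def surviving_degrees(degrees, fraction, targeted, seed=0):
    """Degree sequence of the non-immunized nodes under targeted
    (largest-degree-first) or uniformly random immunization."""
    n_keep = int(len(degrees) * (1.0 - fraction))
    if targeted:
        return sorted(degrees)[:n_keep]  # immunize the hubs
    return random.Random(seed).sample(degrees, n_keep)

# Illustrative heavy-tailed (capped power-law) degree sequence
rng = random.Random(42)
degrees = [min(int(2 * rng.random() ** -0.5), 500) for _ in range(5000)]

base = hmf_threshold(degrees)
rand = hmf_threshold(surviving_degrees(degrees, 0.2, targeted=False))
targ = hmf_threshold(surviving_degrees(degrees, 0.2, targeted=True))
# Targeted immunization removes the hubs that dominate <k^2>, so it raises
# the epidemic threshold far more than random immunization does.
```

    The comparison shows, in miniature, why the targeted large-degree strategy performs best: it attacks the second moment of the degree distribution, which is what keeps the threshold low in heterogeneous networks.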

  7. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    PubMed

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated by these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. The AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily in automation operations that were more goal-directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  8. A dislocation-based crystal plasticity framework for dynamic ductile failure of single crystals

    NASA Astrophysics Data System (ADS)

    Nguyen, Thao; Luscher, D. J.; Wilkerson, J. W.

    2017-11-01

    A framework for dislocation-based viscoplasticity and dynamic ductile failure has been developed to model high strain rate deformation and damage in single crystals. The rate-dependence of the crystal plasticity formulation is based on the physics of relativistic dislocation kinetics suited for extremely high strain rates. The damage evolution is based on the dynamics of void growth, which are governed by both micro-inertia as well as dislocation kinetics and dislocation substructure evolution. An averaging scheme is proposed in order to approximate the evolution of the dislocation substructure both at the macroscale and in its spatial distribution at the microscale. Additionally, a concept of a single equivalent dislocation density that effectively captures the collective influence of dislocation density on all active slip systems is proposed here. Together, these concepts and approximations enable the use of semi-analytic solutions for void growth dynamics developed by Wilkerson and Ramesh (2014), which greatly reduce the computational overhead that would otherwise be required. The resulting homogenized framework has been implemented into a commercially available finite element package, and a validation study against a suite of direct numerical simulations was carried out.

  9. Environmental Stewardship: A Conceptual Review and Analytical Framework.

    PubMed

    Bennett, Nathan J; Whitty, Tara S; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  10. Environmental Stewardship: A Conceptual Review and Analytical Framework

    NASA Astrophysics Data System (ADS)

    Bennett, Nathan J.; Whitty, Tara S.; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H.

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  11. Completing the Link between Exposure Science and ...

    EPA Pesticide Factsheets

    Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source-to-outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. The HEASD research program supports G…

  12. ACCELERATING MR PARAMETER MAPPING USING SPARSITY-PROMOTING REGULARIZATION IN PARAMETRIC DIMENSION

    PubMed Central

    Velikina, Julia V.; Alexander, Andrew L.; Samsonov, Alexey

    2013-01-01

    MR parameter mapping requires sampling along an additional (parametric) dimension, which often limits its clinical appeal due to a several-fold increase in scan times compared to conventional anatomic imaging. Data undersampling combined with parallel imaging is an attractive way to reduce scan time in such applications. However, the inherent SNR penalties of parallel MRI due to noise amplification often limit its utility even at moderate acceleration factors, requiring regularization by prior knowledge. In this work, we propose a novel regularization strategy, which utilizes the smoothness of signal evolution in the parametric dimension within a compressed sensing framework (p-CS) to provide accurate and precise estimation of parametric maps from undersampled data. The performance of the method was demonstrated with variable flip angle T1 mapping and compared favorably to two representative reconstruction approaches, image space-based total variation regularization and an analytical model-based reconstruction. The proposed p-CS regularization was found to provide efficient suppression of noise amplification and preservation of parameter mapping accuracy without explicit utilization of analytical signal models. The developed method may facilitate acceleration of quantitative MRI techniques that are not suitable for model-based reconstruction because of complex signal models or because signal deviations from the expected analytical model exist. PMID:23213053
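The core idea above, regularizing undersampled parametric data by penalizing roughness along the parametric dimension, can be sketched in a toy linear setting. This is an illustrative assumption-laden stand-in (random measurement matrix, Tikhonov closed form, a synthetic exponential decay), not the authors' p-CS reconstruction:

```python
import numpy as np

# Toy smoothness-regularized recovery along a parametric dimension.
# All sizes, the measurement matrix A, and lambda are illustrative.

rng = np.random.default_rng(0)
T = 50                                  # points along the parametric dimension
t = np.linspace(0, 1, T)
x_true = np.exp(-3 * t)                 # smooth signal evolution (e.g. T1 decay)

M = 30                                  # undersampled measurements (M < T)
A = rng.normal(size=(M, T)) / np.sqrt(M)
y = A @ x_true + 0.01 * rng.normal(size=M)

# Second-difference operator penalizes curvature in the parametric dimension.
D = np.diff(np.eye(T), n=2, axis=0)

def reconstruct(lam):
    """Closed-form Tikhonov solution x = (A^T A + lam D^T D)^{-1} A^T y."""
    return np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ y)

x_reg = reconstruct(lam=1.0)
x_ls = np.linalg.lstsq(A, y, rcond=None)[0]   # unregularized baseline

err_reg = np.linalg.norm(x_reg - x_true)
err_ls = np.linalg.norm(x_ls - x_true)
print(err_reg < err_ls)   # the smoothness prior should help when undersampled
```

Because the true signal evolution is smooth, the curvature penalty recovers it from fewer measurements than the unregularized least-squares baseline.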

  13. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method, which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response time and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive index based sensing mechanism for the MOF-integrated optical fiber platform which results in an amplification of inherent optical absorption present within the MOF-based sensing layer with increasing values of effective refractive index associated with adsorption of gases.

  14. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE PAGES

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.; ...

    2018-01-18

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method, which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response time and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive index based sensing mechanism for the MOF-integrated optical fiber platform which results in an amplification of inherent optical absorption present within the MOF-based sensing layer with increasing values of effective refractive index associated with adsorption of gases.

  15. Information Tailoring Enhancements for Large Scale Social Data

    DTIC Science & Technology

    2016-03-15

    Work performed within this reporting period: implemented temporal analysis algorithms for advanced analytics in Scraawl; implemented the backend web service design for temporal analysis and created a prototype GUI web service for the Scraawl analytics dashboard; upgraded the Scraawl computational framework to increase…

  16. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  17. Army Global Basing Posture: An Analytic Framework for Maximizing Responsiveness and Effectiveness

    DTIC Science & Technology

    2015-01-01

    …potential changes in the national security environment and to evaluate U.S. Army stationing, prepositioning, and security cooperation activities… This report should be of interest to those concerned with U.S. global posture and national security strategy, especially as it pertains to U.S. land power.

  18. Ray tracing the Wigner distribution function for optical simulations

    NASA Astrophysics Data System (ADS)

    Mout, Marco; Wick, Michael; Bociort, Florian; Petschulat, Joerg; Urbach, Paul

    2018-01-01

    We study a simulation method that uses the Wigner distribution function to incorporate wave optical effects in an established framework based on geometrical optics, i.e., a ray tracing engine. We use the method to calculate point spread functions and show that it is accurate for paraxial systems but produces unphysical results in the presence of aberrations. The cause of these anomalies is explained using an analytical model.

  19. Impacts of Worldview, Implicit Assumptions, Biases, and Groupthink on Israeli Operational Plans in 1973

    DTIC Science & Technology

    2013-05-23

    …is called worldview. It determines how individuals interpret everything. In his book Toward a Theory of Cultural Linguistics, Gary B. Palmer explains… from person to person and organization to organization. Although analytical frameworks provide a common starting point… It is at this point, when overwhelmed, that planners reach out to theory and make determinations based on implicit assumptions and unconscious cognitive biases.

  20. Defense Industrial Base: An Overview of an Emerging Issue

    DTIC Science & Technology

    1993-03-01

    …supplies it needs to rapidly increase the production of weapons and supporting equipment in wartime. This lack of access is primarily considered a… Committee on Technology and Security, Joint Economic Committee, we are attempting to develop a proposed analytical framework for assessing the national… industry's continuing ability to develop and produce weapon systems using the most advanced technology. According to recent studies, a growing number…

  1. Many-objective reservoir policy identification and refinement to reduce policy inertia and myopia in water management

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P.

    2014-04-01

    This study contributes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification, many-objective optimization under uncertainty, and visual analytics to characterize current operations and discover key trade-offs between alternative policies for balancing competing demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. We have identified a baseline operating policy for the Conowingo Dam that closely reproduces the dynamics of current releases and flows for the Lower Susquehanna and thus can be used to represent the preference structure guiding current operations. Starting from this baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover new operating policies that better balance the trade-offs within the Lower Susquehanna. Our results confirm that the baseline operating policy, which only considers deterministic historical inflows, significantly overestimates the system's reliability in meeting the reservoir's competing demands. Our proposed framework removes this bias by successfully identifying alternative reservoir policies that are more robust to hydroclimatic uncertainties while also better addressing the trade-offs across the Conowingo Dam's multisector services.
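The trade-off discovery step described above rests on identifying non-dominated (Pareto-optimal) policies across competing objectives. A minimal sketch, with made-up placeholder policy scores rather than Susquehanna data:

```python
# Keep only the non-dominated policies when all objectives are minimized.
# Policy rows and objective values below are illustrative placeholders.

def pareto_front(scores):
    """Return indices of non-dominated rows (minimization on all columns)."""
    front = []
    for i, a in enumerate(scores):
        dominated = any(
            all(b[k] <= a[k] for k in range(len(a))) and b != a
            for j, b in enumerate(scores) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# objectives: [supply deficit, cooling deficit, -hydropower, env-flow violation]
policies = [
    [0.10, 0.20, -5.0, 0.30],   # baseline-like policy
    [0.05, 0.25, -4.0, 0.10],   # trades hydropower for environmental flows
    [0.12, 0.22, -4.5, 0.35],   # dominated by policy 0 on every objective
]
print(pareto_front(policies))   # → [0, 1]
```

Evolutionary many-objective optimizers such as those used in the study generate and filter candidate policies in exactly this dominance sense, only at far larger scale.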

  2. A framework for inference about carnivore density from unstructured spatial sampling of scat using detector dogs

    USGS Publications Warehouse

    Thompson, Craig M.; Royle, J. Andrew; Garner, James D.

    2012-01-01

    Wildlife management often hinges upon an accurate assessment of population density. Although undeniably useful, many of the traditional approaches to density estimation such as visual counts, livetrapping, or mark–recapture suffer from a suite of methodological and analytical weaknesses. Rare, secretive, or highly mobile species exacerbate these problems through the reality of small sample sizes and movement on and off study sites. In response to these difficulties, there is growing interest in the use of non-invasive survey techniques, which provide the opportunity to collect larger samples with minimal increases in effort, as well as the application of analytical frameworks that are not reliant on large sample size arguments. One promising survey technique, the use of scat detecting dogs, offers a greatly enhanced probability of detection while at the same time generating new difficulties with respect to non-standard survey routes, variable search intensity, and the lack of a fixed survey point for characterizing non-detection. In order to account for these issues, we modified an existing spatially explicit, capture–recapture model for camera trap data to account for variable search intensity and the lack of fixed, georeferenced trap locations. We applied this modified model to a fisher (Martes pennanti) dataset from the Sierra National Forest, California, and compared the results (12.3 fishers/100 km2) to more traditional density estimates. We then evaluated model performance using simulations at 3 levels of population density. Simulation results indicated that estimates based on the posterior mode were relatively unbiased. We believe that this approach provides a flexible analytical framework for reconciling the inconsistencies between detector dog survey data and density estimation procedures.

  3. Development of a robust analytical framework for assessing landbird trends, dynamics and relationships with environmental covariates in the North Coast and Cascades Network

    USGS Publications Warehouse

    Ray, Chris; Saracco, James; Jenkins, Kurt J.; Huff, Mark; Happe, Patricia J.; Ransom, Jason I.

    2017-01-01

    During 2015-2016, we completed development of a new analytical framework for landbird population monitoring data from the National Park Service (NPS) North Coast and Cascades Inventory and Monitoring Network (NCCN). This new tool for analysis combines several recent advances in modeling population status and trends using point-count data and is designed to supersede the approach previously slated for analysis of trends in the NCCN and other networks, including the Sierra Nevada Network (SIEN). Advances supported by the new model-based approach include 1) the use of combined data on distance and time of detection to estimate detection probability without assuming perfect detection at zero distance, 2) seamless accommodation of variation in sampling effort and missing data, and 3) straightforward estimation of the effects of downscaled climate and other local habitat characteristics on spatial and temporal trends in landbird populations. No changes in the current field protocol are necessary to facilitate the new analyses. We applied several versions of the new model to data from each of 39 species recorded in the three mountain parks of the NCCN, estimating trends and climate relationships for each species during 2005-2014. Our methods and results are also reported in a manuscript in revision for the journal Ecosphere (hereafter, Ray et al.). Here, we summarize the methods and results outlined in depth by Ray et al., discuss benefits of the new analytical framework, and provide recommendations for its application to synthetic analyses of long-term data from the NCCN and SIEN. All code necessary for implementing the new analyses is provided within the Appendices to this report, in the form of fully annotated scripts written in the open-access programming languages R and JAGS.

  4. An Analytical Framework for Studying Small-Number Effects in Catalytic Reaction Networks: A Probability Generating Function Approach to Chemical Master Equations

    PubMed Central

    Nakagawa, Masaki; Togashi, Yuichi

    2016-01-01

    Cell activities primarily depend on chemical reactions, especially those mediated by enzymes, and this has led to these activities being modeled as catalytic reaction networks. Although deterministic ordinary differential equations of concentrations (rate equations) have been widely used for modeling purposes in the field of systems biology, it has been pointed out that these catalytic reaction networks may behave in a way that is qualitatively different from such deterministic representation when the number of molecules for certain chemical species in the system is small. Apart from this, representing these phenomena by simple binary (on/off) systems that omit the quantities would also not be feasible. As recent experiments have revealed the existence of rare chemical species in cells, the importance of being able to model potential small-number phenomena is being recognized. However, most preceding studies were based on numerical simulations, and theoretical frameworks to analyze these phenomena have not been sufficiently developed. Motivated by the small-number issue, this work aimed to develop an analytical framework for the chemical master equation describing the distributional behavior of catalytic reaction networks. For simplicity, we considered networks consisting of two-body catalytic reactions. We used the probability generating function method to obtain the steady-state solutions of the chemical master equation without specifying the parameters. We obtained the time evolution equations of the first- and second-order moments of concentrations, and the steady-state analytical solution of the chemical master equation under certain conditions. These results led to the rank conservation law, the connecting state to the winner-takes-all state, and analysis of 2-molecules M-species systems. A possible interpretation of the theoretical conclusion for actual biochemical pathways is also discussed. PMID:27047384
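The small-number behavior the authors analyze lives in the chemical master equation. As a warm-up with a known analytic steady state (not the paper's two-body catalytic networks), a birth-death process simulated with the Gillespie algorithm can be checked against its exact Poisson stationary mean k/γ; rates here are illustrative:

```python
import random

# Minimal Gillespie (stochastic simulation algorithm) for a birth-death
# process: 0 -> X at rate k, X -> 0 at rate gamma per molecule.
# The master equation's exact steady state is Poisson(k/gamma).

def gillespie_birth_death(k=5.0, gamma=1.0, t_end=2000.0, seed=1):
    rng = random.Random(seed)
    t, n = 0.0, 0
    occupancy = 0.0                 # time-weighted sum of n
    while t < t_end:
        rate = k + gamma * n        # total propensity
        dt = rng.expovariate(rate)
        occupancy += n * min(dt, t_end - t)
        t += dt
        if rng.random() < k / rate: # choose birth vs death event
            n += 1
        else:
            n -= 1
    return occupancy / t_end        # time-averaged molecule count

mean_n = gillespie_birth_death()
print(abs(mean_n - 5.0) < 0.5)      # analytic steady-state mean is k/gamma = 5
```

For the catalytic networks in the paper, the same event-driven simulation applies with two-body propensities, and the probability generating function method plays the role the Poisson solution plays here.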

  5. Graphene/graphene oxide and their derivatives in the separation/isolation and preconcentration of protein species: A review.

    PubMed

    Chen, Xuwei; Hai, Xin; Wang, Jianhua

    2016-05-30

    The distinctive electrical, chemical, and optical properties of graphene/graphene oxide-based materials have made them popular in the field of analytical chemistry. Their large surface area offers excellent capacity to anchor target analytes, making them powerful sorbents for the adsorption and preconcentration of trace-level analytes of interest in sample preparation. The large delocalized π-electron system of the graphene framework provides strong affinity for species containing aromatic rings, such as proteins, and the abundant active sites on its surface offer the chance to modulate adsorption towards a specific protein via functional modification/decoration. This review provides an overview of current research on graphene/graphene oxide-based materials as attractive and powerful adsorption media for the separation/isolation and preconcentration of protein species from biological sample matrixes. These practices aim at providing protein samples of high purity for further investigation and application, or at achieving a certain extent of enrichment prior to quantitative assay. In addition, the challenges and future perspectives in the related research fields are discussed.

  6. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    PubMed

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging has high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response, which faces two major challenges: the "big data" challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for micro-anatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
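The partition-merge idea can be illustrated in miniature: partition objects into grid tiles (the map step), then run a local spatial join within each tile (the reduce step). This toy sketch uses invented coordinates and a brute-force per-tile join, not the authors' MapReduce system, and it ignores pairs that straddle tile boundaries, which a real implementation handles with boundary replication:

```python
from collections import defaultdict

def tile_of(x, y, size=10.0):
    """Grid tile id for a point; `size` is an illustrative tile width."""
    return (int(x // size), int(y // size))

def partition(points, size=10.0):
    """Map step: group points by tile."""
    tiles = defaultdict(list)
    for p in points:
        tiles[tile_of(p[0], p[1], size)].append(p)
    return tiles

def local_join(nuclei, vessels, radius=2.0):
    """Reduce step: (nucleus, vessel) pairs within `radius`, per tile."""
    out = []
    for u in nuclei:
        for v in vessels:
            if (u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2 <= radius ** 2:
                out.append((u, v))
    return out

nuclei = [(1.0, 1.0), (12.0, 3.0)]
vessels = [(2.0, 1.5), (25.0, 25.0)]
tn, tv = partition(nuclei), partition(vessels)
pairs = []
for tile in tn:                       # merge: join only within shared tiles
    pairs += local_join(tn[tile], tv.get(tile, []))
print(pairs)                          # → [((1.0, 1.0), (2.0, 1.5))]
```

The payoff is that each tile's join is independent, which is what lets the pipeline parallelize cleanly across MapReduce workers.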

  7. A Multivariate Distance-Based Analytic Framework for Connectome-Wide Association Studies

    PubMed Central

    Shehzad, Zarrar; Kelly, Clare; Reiss, Philip T.; Craddock, R. Cameron; Emerson, John W.; McMahon, Katie; Copland, David A.; Castellanos, F. Xavier; Milham, Michael P.

    2014-01-01

    The identification of phenotypic associations in high-dimensional brain connectivity data represents the next frontier in the neuroimaging connectomics era. Exploration of brain-phenotype relationships remains limited by statistical approaches that are computationally intensive, depend on a priori hypotheses, or require stringent correction for multiple comparisons. Here, we propose a computationally efficient, data-driven technique for connectome-wide association studies (CWAS) that provides a comprehensive voxel-wise survey of brain-behavior relationships across the connectome; the approach identifies voxels whose whole-brain connectivity patterns vary significantly with a phenotypic variable. Using resting state fMRI data, we demonstrate the utility of our analytic framework by identifying significant connectivity-phenotype relationships for full-scale IQ and assessing their overlap with existent neuroimaging findings, as synthesized by openly available automated meta-analysis (www.neurosynth.org). The results appeared to be robust to the removal of nuisance covariates (i.e., mean connectivity, global signal, and motion) and varying brain resolution (i.e., voxelwise results are highly similar to results using 800 parcellations). We show that CWAS findings can be used to guide subsequent seed-based correlation analyses. Finally, we demonstrate the applicability of the approach by examining CWAS for three additional datasets, each encompassing a distinct phenotypic variable: neurotypical development, Attention-Deficit/Hyperactivity Disorder diagnostic status, and L-dopa pharmacological manipulation. For each phenotype, our approach to CWAS identified distinct connectome-wide association profiles, not previously attainable in a single study utilizing traditional univariate approaches. As a computationally efficient, extensible, and scalable method, our CWAS framework can accelerate the discovery of brain-behavior relationships in the connectome. PMID:24583255
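The distance-based association at the heart of this kind of analysis can be sketched with a pseudo-F statistic on inter-subject distances plus a permutation test, in the spirit of MDMR. Subject counts, connectivity features, and the phenotype below are simulated placeholders, not the study's fMRI data:

```python
import numpy as np

# Pseudo-F on Gower-centered squared distances, with a permutation p-value.

rng = np.random.default_rng(42)
n, p = 40, 100                         # subjects, connectivity features
pheno = rng.normal(size=n)             # e.g. a standardized phenotype score
conn = rng.normal(size=(n, p))
conn[:, 0] += 3.0 * pheno              # one feature carries the association

def pseudo_f(dist2, x):
    """McArdle-Anderson pseudo-F for one regressor plus an intercept."""
    n = len(x)
    X = np.column_stack([np.ones(n), x])
    H = X @ np.linalg.inv(X.T @ X) @ X.T            # hat matrix
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ dist2 @ J                        # Gower-centered matrix
    num = np.trace(H @ G @ H)
    den = np.trace((np.eye(n) - H) @ G @ (np.eye(n) - H))
    return num / (den / (n - 2))

dist2 = ((conn[:, None, :] - conn[None, :, :]) ** 2).sum(-1)
f_obs = pseudo_f(dist2, pheno)
perm = [pseudo_f(dist2, rng.permutation(pheno)) for _ in range(199)]
p_val = (1 + sum(f >= f_obs for f in perm)) / 200
print(f_obs > 1.0, p_val < 0.05)
```

In a voxel-wise CWAS this test is repeated per voxel, with each voxel's "features" being its whole-brain connectivity pattern across subjects.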

  8. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data

    PubMed Central

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H.

    2013-01-01

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven by not only geospatial problems in numerous fields, but also emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such enormous amount of data with fast response, which is faced with two major challenges: the “big data” challenge and the high computation complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for microanatomic objects. To reduce query response time, we propose cost based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce. PMID:24501719

  9. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic, structured methodology.

  10. Analytic Closed-Form Solution of a Mixed Layer Model for Stratocumulus Clouds

    NASA Astrophysics Data System (ADS)

    Akyurek, Bengu Ozge

    Stratocumulus clouds play an important role in climate cooling and are hard to predict using global climate and weather forecast models. Thus, previous studies in the literature use observations and numerical simulation tools, such as large-eddy simulation (LES), to solve the governing equations for the evolution of stratocumulus clouds. In contrast to the previous works, this work provides an analytic closed-form solution to the cloud thickness evolution of stratocumulus clouds in a mixed-layer model framework. With a focus on application over coastal lands, the diurnal cycle of cloud thickness and whether or not clouds dissipate are of particular interest. An analytic solution enables the sensitivity analysis of implicitly interdependent variables and extrema analysis of cloud variables that are hard to achieve using numerical solutions. In this work, the sensitivity of inversion height, cloud-base height, and cloud thickness with respect to initial and boundary conditions, such as Bowen ratio, subsidence, surface temperature, and initial inversion height, are studied. A critical initial cloud thickness value that can be dissipated pre- and post-sunrise is provided. Furthermore, an extrema analysis is provided to obtain the minima and maxima of the inversion height and cloud thickness within 24 h. The proposed solution is validated against LES results under the same initial and boundary conditions. Then, the proposed analytic framework is extended to incorporate multiple vertical columns that are coupled by advection through wind flow. This enables a bridge between the micro-scale and the mesoscale relations. The effect of advection on cloud evolution is studied and a sensitivity analysis is provided.

  11. Two-condition within-participant statistical mediation analysis: A path-analytic framework.

    PubMed

    Montoya, Amanda K; Hayes, Andrew F

    2017-03-01

    Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in each of 2 different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. By so doing, it is apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths, the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses.
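The estimator can be sketched on difference scores: a = mean mediator difference, b = slope of the outcome difference on the centered mediator difference, indirect = a*b, with a percentile bootstrap interval. This is a simplified illustration on simulated data; it omits the mediator-sum covariate of the full specification and is not the authors' macro:

```python
import random
import statistics

random.seed(7)
n = 60
m_diff = [1.0 + random.gauss(0, 1) for _ in range(n)]        # M2 - M1
y_diff = [0.8 * m + random.gauss(0, 0.5) for m in m_diff]    # Y2 - Y1

def indirect(md, yd):
    a = statistics.fmean(md)             # "a" path: mean mediator shift
    mc = [m - a for m in md]             # center the mediator differences
    b = sum(m * y for m, y in zip(mc, yd)) / sum(m * m for m in mc)
    return a * b                         # indirect effect as a product of paths

est = indirect(m_diff, y_diff)
boot = []
for _ in range(1000):                    # percentile bootstrap over participants
    idx = [random.randrange(n) for _ in range(n)]
    boot.append(indirect([m_diff[i] for i in idx], [y_diff[i] for i in idx]))
boot.sort()
ci = (boot[24], boot[974])               # 95% percentile interval
print(est, ci)
```

A confidence interval excluding zero supports a claim of mediation without any of the discrete component tests the older approach requires.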

  12. The philosophical "mind-body problem" and its relevance for the relationship between psychiatry and the neurosciences.

    PubMed

    Van Oudenhove, Lukas; Cuypers, Stefaan E

    2010-01-01

    Parallel to psychiatry, "philosophy of mind" investigates the relationship between mind (mental domain) and body/brain (physical domain). Unlike older forms of philosophy of mind, contemporary analytical philosophy is not exclusively based on introspection and conceptual analysis, but also draws upon the empirical methods and findings of the sciences. This article outlines the conceptual framework of the "mind-body problem" as formulated in contemporary analytical philosophy and argues that this philosophical debate has potentially far-reaching implications for psychiatry as a clinical-scientific discipline, especially for its own autonomy and its relationship to neurology/neuroscience. This point is illustrated by a conceptual analysis of the five principles formulated in Kandel's 1998 article "A New Intellectual Framework for Psychiatry." Kandel's position in the philosophical mind-body debate is ambiguous, ranging from reductive physicalism (psychophysical identity theory) to non-reductive physicalism (in which the mental "supervenes" on the physical) to epiphenomenalist dualism or even emergent dualism. We illustrate how these diverging interpretations result in radically different views on the identity of psychiatry and its relationship with the rapidly expanding domain of neurology/neuroscience.

  13. On Connectivity of Wireless Sensor Networks with Directional Antennas

    PubMed Central

    Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and the channel randomness. Since existing directional antenna models trade off accuracy in reflecting realistic antennas against computational complexity, we propose a new analytical directional antenna model called the iris model to balance accuracy against complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model of network connectivity is accurate, and our iris antenna model can provide a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081
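
    The flavor of this kind of connectivity analysis can be shown with a deliberately simple stand-in for the iris model: nodes as a 2-D Poisson field, each covering an ideal circular sector of beamwidth theta and range r. A node is isolated exactly when the Poisson void probability applies to its covered sector. This keyhole/sector-style toy (our assumption, not the paper's model) ignores antenna gain and channel randomness.

    ```python
    import math

    def p_isolated(density, r, beamwidth=2 * math.pi):
        """Probability that a node has no neighbor inside its antenna sector,
        for a Poisson node field with `density` nodes per unit area,
        transmission range r, and sector beamwidth in radians."""
        sector_area = 0.5 * beamwidth * r ** 2   # area of a circular sector
        return math.exp(-density * sector_area)  # Poisson void probability
    ```

    With beamwidth 2*pi this reduces to the classical omnidirectional result exp(-density * pi * r**2); narrowing the beam at fixed range raises the isolation probability. Real directional antennas also extend range via gain, so accuracy requires coupling beamwidth and gain, which is the gap models like the iris model aim to fill.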

  14. Developing an Analytical Framework for Argumentation on Energy Consumption Issues

    ERIC Educational Resources Information Center

    Jin, Hui; Mehl, Cathy E.; Lan, Deborah H.

    2015-01-01

    In this study, we aimed to develop a framework for analyzing the argumentation practice of high school students and high school graduates. We developed the framework in a specific context--how energy consumption activities such as changing diet, converting forests into farmlands, and choosing transportation modes affect the carbon cycle. The…

  15. A Theoretical Framework of the Relation between Socioeconomic Status and Academic Achievement of Students

    ERIC Educational Resources Information Center

    Lam, Gigi

    2014-01-01

    A socio-psychological analytical framework will be adopted to illuminate the relation between socioeconomic status and academic achievement. The framework puts the emphasis to incorporate micro familial factors into macro factor of the tracking system. Initially, children of the poor families always lack major prerequisite: diminution of cognitive…

  16. European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective

    ERIC Educational Resources Information Center

    Bouder, Annie

    2008-01-01

    Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework has three perspectives: historical; analytical; and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…

  17. Dynamic motion planning of 3D human locomotion using gradient-based optimization.

    PubMed

    Kim, Hyung Joo; Wang, Qian; Rahmatalla, Salam; Swan, Colby C; Arora, Jasbir S; Abdel-Malek, Karim; Assouline, Jose G

    2008-06-01

    Since humans can walk with an infinite variety of postures and limb movements, there is no unique solution to the modeling problem to predict human gait motions. Accordingly, we test herein the hypothesis that the redundancy of human walking mechanisms makes solving for human joint profiles and force time histories an indeterminate problem best solved by inverse dynamics and optimization methods. A new optimization-based human-modeling framework is thus described for predicting three-dimensional human gait motions on level and inclined planes. The basic unknowns in the framework are the joint motion time histories of a 25-degree-of-freedom human model and its six global degrees of freedom. The joint motion histories are calculated by minimizing an objective function such as deviation of the trunk from upright posture that relates to the human model's performance. A variety of important constraints are imposed on the optimization problem, including (1) satisfaction of dynamic equilibrium equations by requiring the model's zero moment point (ZMP) to lie within the instantaneous geometrical base of support, (2) foot collision avoidance, (3) limits on ground-foot friction, and (4) vanishing yawing moment. Analytical forms of objective and constraint functions are presented and discussed for the proposed human-modeling framework in which the resulting optimization problems are solved using gradient-based mathematical programming techniques. When the framework is applied to the modeling of bipedal locomotion on level and inclined planes, acyclic human walking motions that are smooth and realistic as opposed to less natural robotic motions are obtained. The aspects of the modeling framework requiring further investigation and refinement, as well as potential applications of the framework in biomechanics, are discussed.
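
    The overall optimization structure in this record (minimize a posture objective subject to inequality constraints such as the ZMP condition) can be illustrated with a tiny stand-in. The sketch below uses plain gradient descent with a quadratic penalty rather than the paper's gradient-based mathematical programming; the toy problem, names, and dimensions are all illustrative assumptions.

    ```python
    import numpy as np

    def penalty_descent(q0, grad_f, c, grad_c, n_iter=2000, lr=1e-2, rho=10.0):
        """Minimize f(q) subject to c(q) <= 0 by gradient descent on the
        penalized objective f(q) + rho * max(c(q), 0)**2."""
        q = np.asarray(q0, dtype=float).copy()
        for _ in range(n_iter):
            g = grad_f(q)
            viol = max(c(q), 0.0)
            if viol > 0.0:                       # penalize constraint violation
                g = g + rho * 2.0 * viol * grad_c(q)
            q -= lr * g
        return q

    # Toy problem: pull q toward (1, 0) while keeping q[0] <= 0.5
    target = np.array([1.0, 0.0])
    q_opt = penalty_descent(
        q0=np.zeros(2),
        grad_f=lambda q: 2.0 * (q - target),     # f = ||q - target||^2
        c=lambda q: q[0] - 0.5,                  # constraint q[0] <= 0.5
        grad_c=lambda q: np.array([1.0, 0.0]),
    )
    ```

    With a finite penalty weight the solution slightly violates the constraint (here q[0] settles near 6/11, just above 0.5); only rho going to infinity recovers the hard constraint, which is one reason frameworks like the one described rely on proper mathematical programming solvers rather than penalties.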

  18. Beyond compartmentalization: a relational approach towards agency and vulnerability of young migrants.

    PubMed

    Huijsmans, Roy

    2012-01-01

    Based on fieldwork material from Lao People's Democratic Republic, this paper introduces an analytical framework that transcends compartmentalized approaches towards migration involving young people. The notions of fluid and institutionalized forms of migration illuminate key differences and commonalities in the relational fabric underpinning empirically diverse migration scenarios. Applying this framework to the role of networks in becoming a young migrant, this chapter sheds light on young migrants' differential scope for exercising agency. This redirects concerns about young migrants away from descriptive and static factors towards their relational position in the process of migration, which shapes their agency and vulnerability. Copyright © 2012 Wiley Periodicals, Inc., A Wiley Company.

  19. Mathematical modeling of synthesis gas fueled electrochemistry and transport including H2/CO co-oxidation and surface diffusion in solid oxide fuel cell

    NASA Astrophysics Data System (ADS)

    Bao, Cheng; Jiang, Zeyi; Zhang, Xinxin

    2015-10-01

    Fuel flexibility is a significant advantage of solid oxide fuel cell (SOFC). A comprehensive macroscopic framework is proposed for synthesis gas (syngas) fueled electrochemistry and transport in SOFC anode with two main novelties, i.e. analytical H2/CO electrochemical co-oxidation, and correction of gas species concentration at triple phase boundary considering competitive adsorption and surface diffusion. Starting from an analytical approximation of the decoupled charge and mass transfer, we present analytical solutions of two defined variables, i.e. hydrogen current fraction and enhancement factor. Giving an explicit answer (rather than case-by-case numerical calculation) on what percentage of the current output is contributed by H2 or CO and on how great a role the water gas shift reaction plays, this approach establishes for the first time an adaptive superposition mechanism of H2-fuel and CO-fuel electrochemistry for syngas fuel. Based on the diffusion equivalent circuit model, assuming series-connected resistances of surface diffusion and bulk diffusion, the model predicts well at high fuel utilization by keeping a fixed porosity/tortuosity ratio. The model has been validated by experimental polarization behaviors in a wide range of operation on a button cell for H2-H2O-CO-CO2-N2 fuel systems. The framework could be helpful to narrow the gap between macro-scale and meso-scale SOFC modeling.

  20. Urban Partnership Agreement and Congestion Reduction Demonstration : National Evaluation Framework

    DOT National Transportation Integrated Search

    2008-11-21

    This report provides an analytical framework for evaluating six deployments under the United States Department of Transportation (U.S. DOT) Urban Partnership Agreement (UPA) and Congestion Reduction Demonstration (CRD) Programs. The six UPA/CRD sites...

  1. Scaling Student Success with Predictive Analytics: Reflections after Four Years in the Data Trenches

    ERIC Educational Resources Information Center

    Wagner, Ellen; Longanecker, David

    2016-01-01

    The metrics used in the US to track students do not include adults and part-time students. This has led to the development of a massive data initiative--the Predictive Analytics Reporting (PAR) framework--that uses predictive analytics to trace the progress of all types of students in the system. This development has allowed actionable,…

  2. A survey on platforms for big data analytics.

    PubMed

    Singh, Dilpreet; Reddy, Chandan K

    The primary purpose of this paper is to provide an in-depth analysis of different platforms available for performing big data analytics. This paper surveys different hardware platforms available for big data analytics and assesses the advantages and drawbacks of each of these platforms based on various metrics such as scalability, data I/O rate, fault tolerance, real-time processing, data size supported and iterative task support. In addition to the hardware, a detailed description of the software frameworks used within each of these platforms is also discussed along with their strengths and drawbacks. Some of the critical characteristics described here can potentially aid the readers in making an informed decision about the right choice of platforms depending on their computational needs. Using a star ratings table, a rigorous qualitative comparison between different platforms is also discussed for each of the six characteristics that are critical for the algorithms of big data analytics. In order to provide more insights into the effectiveness of each of the platforms in the context of big data analytics, specific implementation level details of the widely used k-means clustering algorithm on various platforms are also described in the form of pseudocode.
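
    Since the survey anchors its platform comparison in k-means, it helps to recall how compact the serial algorithm itself is; what distinguishes the platforms is how the assignment and update steps below are partitioned and parallelized. A minimal NumPy sketch of Lloyd's algorithm (our illustration, not the paper's pseudocode):

    ```python
    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        """Lloyd's algorithm: alternate nearest-centroid assignment and
        centroid recomputation until the centroids stop moving."""
        rng = np.random.default_rng(seed)
        # initialize centroids by sampling k distinct data points
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            # assignment step: label each point with its nearest centroid
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # update step: recompute each centroid as its cluster mean
            new = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centroids[j]
                            for j in range(k)])
            if np.allclose(new, centroids):
                break
            centroids = new
        return centroids, labels
    ```

    On a distributed platform, the assignment step is embarrassingly parallel over data partitions, while the update step requires an aggregation (e.g. a reduce of per-partition sums and counts), which is exactly where metrics like data I/O rate and iterative task support start to matter.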

  3. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  4. A new model for fluid velocity slip on a solid surface.

    PubMed

    Shu, Jian-Jun; Teo, Ji Bin Melvin; Chan, Weng Kong

    2016-10-12

    A general adsorption model is developed to describe the interactions between near-wall fluid molecules and solid surfaces. This model serves as a framework for the theoretical modelling of boundary slip phenomena. Based on this adsorption model, a new general model for the slip velocity of fluids on solid surfaces is introduced. The slip boundary condition at a fluid-solid interface has hitherto been considered separately for gases and liquids. In this paper, we show that the slip velocity in both gases and liquids may originate from dynamical adsorption processes at the interface. A unified analytical model that is valid for both gas-solid and liquid-solid slip boundary conditions is proposed based on surface science theory. The corroboration with the experimental data extracted from the literature shows that the proposed model provides an improved prediction compared to existing analytical models for gases at higher shear rates and close agreement for liquid-solid interfaces in general.

  5. Design and Implementation of an Architectural Framework for Web Portals in a Ubiquitous Pervasive Environment

    PubMed Central

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal’s gateway for the information flow that causes network traffic over the Internet. Moreover, it provides real time/dynamic access to the stored information, but not access to the real time information. This inherent functionality of web portals limits their role for resource constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that the local queries could be solved locally rather than having to route them over the Internet. Moreover, our framework enables one-to-one device communication for real time information flow. To provide an in-depth analysis, firstly, we provide an analytical model for query processing at the servers for our framework-oriented web portal. At the end, we have deployed a testbed, as one of the world’s largest IP based wireless sensor networks testbed, and real time measurements are observed that prove the efficacy and workability of the proposed framework. PMID:22346693

  6. Design and implementation of an architectural framework for web portals in a ubiquitous pervasive environment.

    PubMed

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal's gateway for the information flow that causes network traffic over the Internet. Moreover, it provides real time/dynamic access to the stored information, but not access to the real time information. This inherent functionality of web portals limits their role for resource constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that the local queries could be solved locally rather than having to route them over the Internet. Moreover, our framework enables one-to-one device communication for real time information flow. To provide an in-depth analysis, firstly, we provide an analytical model for query processing at the servers for our framework-oriented web portal. At the end, we have deployed a testbed, as one of the world's largest IP based wireless sensor networks testbed, and real time measurements are observed that prove the efficacy and workability of the proposed framework.

  7. Rainbow: A Framework for Analysing Computer-Mediated Pedagogical Debates

    ERIC Educational Resources Information Center

    Baker, Michael; Andriessen, Jerry; Lund, Kristine; van Amelsvoort, Marie; Quignard, Matthieu

    2007-01-01

    In this paper we present a framework for analysing when and how students engage in a specific form of interactive knowledge elaboration in CSCL environments: broadening and deepening understanding of a space of debate. The framework is termed "Rainbow," as it comprises seven principal analytical categories, to each of which a colour is assigned,…

  8. Green Framework and Its Role in Sustainable City Development (by Example of Yekaterinburg)

    NASA Astrophysics Data System (ADS)

    Maltseva, A.

    2017-11-01

    The article focuses on the destruction of the city green framework in Yekaterinburg. A strategy for its recovery by means of a bioactive core, represented by a botanic garden, has been proposed. An analytical framework for changes in the proportion of green territories within the total city area has been described.

  9. Integrated corridor management initiative : demonstration phase evaluation - final national evaluation framework.

    DOT National Transportation Integrated Search

    2012-05-01

    This report provides an analytical framework for evaluating the two field deployments under the United States Department of Transportation (U.S. DOT) Integrated Corridor Management (ICM) Initiative Demonstration Phase. The San Diego Interstate 15 cor...

  10. Getting inside acupuncture trials - Exploring intervention theory and rationale

    PubMed Central

    2011-01-01

    Background Acupuncture can be described as a complex intervention. In reports of clinical trials the mechanism of acupuncture (that is, the process by which change is effected) is often left unstated or not known. This is problematic in assisting understanding of how acupuncture might work and in drawing together evidence on the potential benefits of acupuncture. Our aim was to aid the identification of the assumed mechanisms underlying the acupuncture interventions in clinical trials by developing an analytical framework to differentiate two contrasting approaches to acupuncture (traditional acupuncture and Western medical acupuncture). Methods Based on the principles of realist review, an analytical framework to differentiate these two contrasting approaches was developed. In order to see how useful the framework was in uncovering the theoretical rationale, it was applied to a set of trials of acupuncture for fatigue and vasomotor symptoms, identified from a wider literature review of acupuncture and early stage breast cancer. Results When examined for the degree to which a study demonstrated adherence to a theoretical model, two of the fourteen selected studies could be considered TA, five MA, with the remaining seven not fitting into any recognisable model. When examined by symptom, five of the nine vasomotor studies, all from one group of researchers, are arguably in the MA category, and two a TA model; in contrast, none of the five fatigue studies could be classed as either MA or TA and all studies had a weak rationale for the chosen treatment for fatigue. Conclusion Our application of the framework to the selected studies suggests that it is a useful tool to help uncover the therapeutic rationale of acupuncture interventions in clinical trials, for distinguishing between TA and MA approaches and for exploring issues of model validity. English language acupuncture trials frequently fail to report enough detail relating to the intervention. We advocate using this framework to aid reporting, along with further testing and refinement of the framework. PMID:21414187

  11. Distinctive aspects of the evolution of galactic magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yar-Mukhamedov, D., E-mail: danial.su@gmail.com

    2016-11-15

    We perform an in-depth analysis of the evolution of galactic magnetic fields within a semi-analytic galaxy formation and evolution framework, determine various distinctive aspects of the evolution process, and obtain analytic solutions for a wide range of possible evolution scenarios.

  12. Strategic, Analytic and Operational Domains of Information Management.

    ERIC Educational Resources Information Center

    Diener, Richard AV

    1992-01-01

    Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…

  13. Highly sensitive and selective fluoride detection in water through fluorophore release from a metal-organic framework

    PubMed Central

    Hinterholzinger, Florian M.; Rühle, Bastian; Wuttke, Stefan; Karaghiosoff, Konstantin; Bein, Thomas

    2013-01-01

    The detection, differentiation and visualization of compounds such as gases, liquids or ions are key challenges for the design of selective optical chemosensors. Optical chemical sensors employ a transduction mechanism that converts a specific analyte recognition event into an optical signal. Here we report a novel concept for fluoride ion sensing where a porous crystalline framework serves as a host for a fluorescent reporter molecule. The detection is based on the decomposition of the host scaffold which induces the release of the fluorescent dye molecule. Specifically, the hybrid composite of the metal-organic framework NH2-MIL-101(Al) and fluorescein acting as reporter shows an exceptional turn-on fluorescence in aqueous fluoride-containing solutions. Using this novel strategy, the optical detection of fluoride is extremely sensitive and highly selective in the presence of many other anions. PMID:24008779

  14. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting

    PubMed Central

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen; Wald, Lawrence L.

    2017-01-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization. PMID:26915119

  15. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    PubMed

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.
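
    One ingredient named in this record, the variable projection method, can be illustrated in isolation: for each candidate tissue parameter (a dictionary column d), the nuisance linear scale c in ||x - c*d||^2 has a closed form, so it can be projected out and the search reduced to the parameter alone. The sketch below is a stand-alone illustration of that single step, not the paper's full ADMM reconstruction; names are ours.

    ```python
    import numpy as np

    def varpro_match(x, D):
        """Match a voxel signal x to the best column of dictionary D.
        For a fixed column d, the optimal scale is c = <d, x> / ||d||^2,
        so the residual is minimized by the column maximizing |<d, x>| / ||d||."""
        corr = np.abs(D.conj().T @ x)            # |<d_j, x>| for each atom
        norms = np.linalg.norm(D, axis=0)
        j = int(np.argmax(corr / norms))         # best-matching parameter index
        c = (D[:, j].conj() @ x) / norms[j] ** 2 # closed-form scale (projected out)
        return j, c
    ```

    In MR fingerprinting the columns would be Bloch-simulated signal evolutions and c would absorb proton density; the record's point is that conventional dictionary matching of this kind coincides with the first iteration of the ML algorithm when initialized with a gridding reconstruction.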

  16. An integrative framework for sensor-based measurement of teamwork in healthcare.

    PubMed

    Rosen, Michael A; Dietz, Aaron S; Yang, Ting; Priebe, Carey E; Pronovost, Peter J

    2015-01-01

    There is a strong link between teamwork and patient safety. Emerging evidence supports the efficacy of teamwork improvement interventions. However, the availability of reliable, valid, and practical measurement tools and strategies is commonly cited as a barrier to long-term sustainment and spread of these teamwork interventions. This article describes the potential value of sensor-based technology as a methodology to measure and evaluate teamwork in healthcare. The article summarizes the teamwork literature within healthcare, including team improvement interventions and measurement. Current applications of sensor-based measurement of teamwork are reviewed to assess the feasibility of employing this approach in healthcare. The article concludes with a discussion highlighting current application needs and gaps and relevant analytical techniques to overcome the challenges to implementation. Compelling studies exist documenting the feasibility of capturing a broad array of team input, process, and output variables with sensor-based methods. Implications of this research are summarized in a framework for development of multi-method team performance measurement systems. Sensor-based measurement within healthcare can unobtrusively capture information related to social networks, conversational patterns, physical activity, and an array of other meaningful information without having to directly observe or periodically survey clinicians. However, trust and privacy concerns present challenges that need to be overcome through engagement of end users in healthcare. Initial evidence exists to support the feasibility of sensor-based measurement to drive feedback and learning across individual, team, unit, and organizational levels. Future research is needed to refine methods, technologies, theory, and analytical strategies. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. 

  17. Tensor body: real-time reconstruction of the human body and avatar synthesis from RGB-D.

    PubMed

    Barmpoutis, Angelos

    2013-10-01

    Real-time 3-D reconstruction of the human body has many applications in anthropometry, telecommunications, gaming, fashion, and other areas of human-computer interaction. In this paper, a novel framework is presented for reconstructing the 3-D model of the human body from a sequence of RGB-D frames. The reconstruction is performed in real time while the human subject moves arbitrarily in front of the camera. The method employs a novel parameterization of cylindrical-type objects using Cartesian tensor and b-spline bases along the radial and longitudinal dimension respectively. The proposed model, dubbed tensor body, is fitted to the input data using a multistep framework that involves segmentation of the different body regions, robust filtering of the data via a dynamic histogram, and energy-based optimization with positive-definite constraints. A Riemannian metric on the space of positive-definite tensor splines is analytically defined and employed in this framework. The efficacy of the presented methods is demonstrated in several real-data experiments using the Microsoft Kinect sensor.

  18. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    PubMed

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents: food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework, the Key Events Dose-Response Framework (KEDRF), for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  19. Capacity-Delay Trade-Off in Collaborative Hybrid Ad-Hoc Networks with Coverage Sensing.

    PubMed

    Chen, Lingyu; Luo, Wenbin; Liu, Chen; Hong, Xuemin; Shi, Jianghong

    2017-01-26

    The integration of ad hoc device-to-device (D2D) communications and open-access small cells can result in a networking paradigm called the hybrid ad hoc network, which is particularly promising in delivering delay-tolerant data. The capacity-delay performance of hybrid ad hoc networks has been studied extensively under a popular framework called scaling law analysis. These studies, however, do not take into account aspects of interference accumulation and queueing delay and, therefore, may lead to over-optimistic results. Moreover, focusing on the average measures, existing works fail to give finer-grained insights into the distribution of delays. This paper proposes an alternative analytical framework based on queueing theoretic models and physical interference models. We apply this framework to study the capacity-delay performance of a collaborative cellular D2D network with coverage sensing and two-hop relay. The new framework allows us to fully characterize the delay distribution in the transform domain and pinpoint the impacts of coverage sensing, user and base station densities, transmit power, user mobility and packet size on the capacity-delay trade-off. We show that under the condition of queueing equilibrium, the maximum throughput capacity per device saturates to an upper bound of 0.7239 λb/λu bits/s/Hz, where λb and λu are the densities of base stations and mobile users, respectively.
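
    The notion of queueing equilibrium invoked in this record can be illustrated with the simplest case, an M/M/1 queue: a relay's buffer is stable only while the packet arrival rate stays below its service rate, and the mean sojourn time diverges as the ratio approaches one. This is a generic textbook illustration, not the paper's model, which couples many such queues with physical interference.

    ```python
    def mm1_mean_delay(arrival_rate, service_rate):
        """Mean sojourn time (queueing + service) of an M/M/1 queue.
        Queueing equilibrium requires arrival_rate < service_rate."""
        if arrival_rate >= service_rate:
            raise ValueError("no equilibrium: queue grows without bound")
        return 1.0 / (service_rate - arrival_rate)
    ```

    For M/M/1 the sojourn time is exponential with rate (service_rate - arrival_rate), whose Laplace transform is the simplest instance of the transform-domain delay characterization the record refers to; note also how delay blows up near saturation, which is why throughput bounds are stated "under the condition of queueing equilibrium".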

  20. Capacity-Delay Trade-Off in Collaborative Hybrid Ad-Hoc Networks with Coverage Sensing

    PubMed Central

    Chen, Lingyu; Luo, Wenbin; Liu, Chen; Hong, Xuemin; Shi, Jianghong

    2017-01-01

    The integration of ad hoc device-to-device (D2D) communications and open-access small cells can result in a networking paradigm called the hybrid ad hoc network, which is particularly promising in delivering delay-tolerant data. The capacity-delay performance of hybrid ad hoc networks has been studied extensively under a popular framework called scaling law analysis. These studies, however, do not take into account aspects of interference accumulation and queueing delay and, therefore, may lead to over-optimistic results. Moreover, focusing on the average measures, existing works fail to give finer-grained insights into the distribution of delays. This paper proposes an alternative analytical framework based on queueing theoretic models and physical interference models. We apply this framework to study the capacity-delay performance of a collaborative cellular D2D network with coverage sensing and two-hop relay. The new framework allows us to fully characterize the delay distribution in the transform domain and pinpoint the impacts of coverage sensing, user and base station densities, transmit power, user mobility and packet size on the capacity-delay trade-off. We show that under the condition of queueing equilibrium, the maximum throughput capacity per device saturates to an upper bound of 0.7239 λb/λu bits/s/Hz, where λb and λu are the densities of base stations and mobile users, respectively. PMID:28134769

  1. Joseph V. Brady: Synthesis Reunites What Analysis Has Divided

    ERIC Educational Resources Information Center

    Thompson, Travis

    2012-01-01

    Joseph V. Brady (1922-2011) created behavior-analytic neuroscience and the analytic framework for understanding how the external and internal neurobiological environments and mechanisms interact. Brady's approach offered synthesis as well as analysis. He embraced Findley's approach to constructing multioperant behavioral repertoires that found…

  2. A dislocation-based crystal plasticity framework for dynamic ductile failure of single crystals

    DOE PAGES

    Nguyen, Thao; Luscher, D. J.; Wilkerson, J. W.

    2017-08-02

    We developed a framework for dislocation-based viscoplasticity and dynamic ductile failure to model high strain rate deformation and damage in single crystals. The rate-dependence of the crystal plasticity formulation is based on the physics of relativistic dislocation kinetics suited for extremely high strain rates. The damage evolution is based on the dynamics of void growth, which are governed by micro-inertia as well as dislocation kinetics and dislocation substructure evolution. Furthermore, an averaging scheme is proposed in order to approximate the evolution of the dislocation substructure at the macroscale as well as its spatial distribution at the microscale. In addition, a concept of a single equivalent dislocation density that effectively captures the collective influence of dislocation density on all active slip systems is proposed here. Together, these concepts and approximations enable the use of semi-analytic solutions for void growth dynamics developed in [J. Wilkerson and K. Ramesh. A dynamic void growth model governed by dislocation kinetics. J. Mech. Phys. Solids, 70:262–280, 2014.], which greatly reduce the computational overhead that would otherwise be required. The resulting homogenized framework has been implemented into a commercially available finite element package, and a validation study against a suite of direct numerical simulations was carried out.

  3. A research and experimentation framework for exploiting VoI-based methods within analyst workflows in tactical operation centers

    NASA Astrophysics Data System (ADS)

    Sadler, Laurel

    2017-05-01

    In today's battlefield environments, analysts are inundated with real-time data received from the tactical edge that must be evaluated and used for managing and modifying current missions as well as planning for future missions. This paper describes a framework that facilitates a Value of Information (VoI)-based data analytics tool for information object (IO) analysis in a tactical command and control (C2) environment, which reduces analyst workload by providing automated or analyst-assisted applications. It allows the analyst to adjust parameters for data matching of the IOs that will be received and provides agents for further filtering or fusing of the incoming data. It allows analyst enhancements, markup, and/or comments to be attached to the incoming IOs, which can then be re-disseminated using the VoI-based dissemination service. The analyst may also adjust the underlying parameters before re-dissemination of an IO, which will subsequently adjust the value of the IO based on the new or additional information that has been added, possibly increasing the value from the original. The framework is flexible and extendable, providing an easy-to-use, dynamically changing command and control decision aid that focuses and enhances the analyst workflow.

  4. Modeling convection-diffusion-reaction systems for microfluidic molecular communications with surface-based receivers in Internet of Bio-Nano Things

    PubMed Central

    Kuscu, Murat; Akan, Ozgur B.

    2018-01-01

    We consider a microfluidic molecular communication (MC) system, where concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for information and communication technology (ICT), as they enable an optimisation framework to develop advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow resulting in a parabolic velocity profile, and a finite number of ligand receptors leading to receiver saturation. The model also captures the effects of the reactive surface depletion layer resulting from mass transport limitations and the moving reaction boundary originating from the passage of a finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed-form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to numerical results obtained by solving the exact system model with COMSOL Multiphysics. PMID:29415019

  5. Modeling convection-diffusion-reaction systems for microfluidic molecular communications with surface-based receivers in Internet of Bio-Nano Things.

    PubMed

    Kuscu, Murat; Akan, Ozgur B

    2018-01-01

    We consider a microfluidic molecular communication (MC) system, where concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for information and communication technology (ICT), as they enable an optimisation framework to develop advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow resulting in a parabolic velocity profile, and a finite number of ligand receptors leading to receiver saturation. The model also captures the effects of the reactive surface depletion layer resulting from mass transport limitations and the moving reaction boundary originating from the passage of a finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed-form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to numerical results obtained by solving the exact system model with COMSOL Multiphysics.

  6. Gauge-origin independent formalism of two-component relativistic framework based on unitary transformation in nuclear magnetic shielding constant

    NASA Astrophysics Data System (ADS)

    Hayami, Masao; Seino, Junji; Nakai, Hiromi

    2018-03-01

    This article proposes a gauge-origin independent formalism of the nuclear magnetic shielding constant in the two-component relativistic framework based on the unitary transformation. The proposed scheme introduces the gauge factor and the unitary transformation into the atomic orbitals. The two-component relativistic equation is formulated by block-diagonalizing the Dirac Hamiltonian together with gauge factors. This formulation is available for arbitrary relativistic unitary transformations. Then, the infinite-order Douglas-Kroll-Hess (IODKH) transformation is applied to the present formulation. Next, the analytical derivatives of the IODKH Hamiltonian for the evaluation of the nuclear magnetic shielding constant are derived. Results obtained from the numerical assessments demonstrate that the present formulation removes the gauge-origin dependence completely. Furthermore, the formulation with the IODKH transformation gives results that are close to those in four-component and other two-component relativistic schemes.

  7. Emergence of grouping in multi-resource minority game dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Zi-Gang; Zhang, Ji-Qiang; Dong, Jia-Qi; Huang, Liang; Lai, Ying-Cheng

    2012-10-01

    Complex systems arising in a modern society typically have many resources and strategies available for their dynamical evolution. To explore quantitatively the behaviors of such systems, we propose a class of models to investigate Minority Game (MG) dynamics with multiple strategies. In particular, agents tend to choose the least used strategies based on available local information. A striking finding is the emergence of grouping states defined in terms of distinct strategies. We develop an analytic theory based on the mean-field framework to understand the ``bifurcations'' of the grouping states. The grouping phenomenon has also been identified in the Shanghai Stock-Market system, and we discuss its prevalence in other real-world systems. Our work demonstrates that complex systems obeying the MG rules can spontaneously self-organize into certain divided states, and our model represents a basic and general mathematical framework to address this kind of phenomenon in social, economic and political systems.
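
    The least-used-strategy rule described in the abstract can be illustrated with a toy simulation. This is a deliberately simplified sketch, not the authors' model: it uses global rather than local information and a random tie-break, just to show how agents redistribute themselves across strategies round by round:

```python
import random

def simulate_least_used(n_agents=50, n_strategies=3, rounds=100, seed=1):
    """Toy multi-strategy minority-game dynamic: each round, every agent
    adopts one of the strategies least used in the previous round."""
    rng = random.Random(seed)
    choices = [rng.randrange(n_strategies) for _ in range(n_agents)]
    for _ in range(rounds):
        counts = [choices.count(s) for s in range(n_strategies)]
        least = min(counts)
        least_used = [s for s in range(n_strategies) if counts[s] == least]
        # Agents pick randomly among the (possibly tied) least-used strategies.
        choices = [rng.choice(least_used) for _ in range(n_agents)]
    return [choices.count(s) for s in range(n_strategies)]

counts = simulate_least_used()
print(counts)
```

    Even this crude version shows agents collapsing onto, and oscillating between, subsets of strategies, which is the flavor of the grouping states the paper analyzes with a mean-field theory.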

  8. Analysis of laser shock experiments on precompressed samples using a quartz reference and application to warm dense hydrogen and helium

    DOE PAGES

    Brygoo, Stephanie; Millot, Marius; Loubeyre, Paul; ...

    2015-11-16

    Megabar (1 Mbar = 100 GPa) laser shocks on precompressed samples allow reaching unprecedented high densities and moderately high ~10³–10⁴ K temperatures. We describe in this paper a complete analysis framework for the velocimetry (VISAR) and pyrometry (SOP) data produced in these experiments. Since the precompression increases the initial density of both the sample of interest and the quartz reference for pressure-density, reflectivity, and temperature measurements, we describe analytical corrections based on available experimental data on warm dense silica and density-functional-theory based molecular dynamics computer simulations. Finally, using our improved analysis framework, we report a re-analysis of previously published data on warm dense hydrogen and helium, compare the newly inferred pressure, density, and temperature data with most advanced equation of state models and provide updated reflectivity values.

  9. Human exposure and internal dose assessments of acrylamide in food.

    PubMed

    Dybing, E; Farmer, P B; Andersen, M; Fennell, T R; Lalljie, S P D; Müller, D J G; Olin, S; Petersen, B J; Schlatter, J; Scholz, G; Scimeca, J A; Slimani, N; Törnqvist, M; Tuijtelaars, S; Verger, P

    2005-03-01

    This review provides a framework contributing to the risk assessment of acrylamide in food. It is based on the outcome of the ILSI Europe FOSIE process, a risk assessment framework for chemicals in foods, and adds to the overall framework by focusing especially on exposure assessment and internal dose assessment of acrylamide in food. Since the finding that acrylamide is formed in food during heat processing and preparation, much effort has been (and still is being) put into understanding its mechanism of formation, developing analytical methods and determining levels in food, and evaluating its toxicity and potential human health consequences. Although several exposure estimations have been proposed, a systematic review of key information relevant to exposure assessment is currently lacking. The European and North American branches of the International Life Sciences Institute, ILSI, discussed critical aspects of exposure assessment, parameters influencing the outcome of exposure assessment and summarised data relevant to the acrylamide exposure assessment to aid the risk characterisation process. This paper reviews the data on acrylamide levels in food, including its formation and analytical methods, the determination of human consumption patterns, dietary intake of the general population, estimation of maximum intake levels and identification of groups with potentially high intakes. Possible options and consequences of mitigation efforts to reduce exposure are discussed. Furthermore, the association of intake levels with biomarkers of exposure and internal dose, considering aspects of bioavailability, is reviewed, and a physiologically-based toxicokinetic (PBTK) model is described that provides a good description of the kinetics of acrylamide in the rat. Each of the sections concludes with a summary of remaining gaps and uncertainties.

  10. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
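
    One standard way to combine local fracture probabilities into a whole-ribcage risk, as the abstract describes, is a Poisson-binomial recursion over independent elements. The sketch below is a generic illustration under an independence assumption, with made-up probabilities; it is not the paper's exact statistical model:

```python
def fracture_count_distribution(probs):
    """Poisson-binomial DP: probs[i] is the fracture probability of rib
    element i (assumed independent). Returns dist, where dist[k] is the
    probability of exactly k fractures across all elements."""
    dist = [1.0]
    for p in probs:
        new = [0.0] * (len(dist) + 1)
        for k, q in enumerate(dist):
            new[k] += q * (1.0 - p)      # element does not fracture
            new[k + 1] += q * p          # element fractures
        dist = new
    return dist

def prob_at_least(probs, k):
    """P(at least k fractures), e.g. k=3 as a crude AIS3+ proxy."""
    return sum(fracture_count_distribution(probs)[k:])

# Hypothetical local probabilities for four rib locations
p = [0.1, 0.2, 0.05, 0.15]
print(round(prob_at_least(p, 1), 4))  # P(>=1 fracture)
```

    The same recursion scales to many elements and arbitrary per-element probabilities, which is why it is a common building block for strain-based injury-risk aggregation.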

  11. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    PubMed

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in two phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into an analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as other top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed based on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.
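
    The analytic hierarchy process step can be sketched with the row geometric-mean approximation to the principal-eigenvector priority weights. The 3x3 comparison matrix below is hypothetical (it is not the study's data), and the criterion names are illustrative only:

```python
def ahp_weights(matrix):
    """Approximate AHP priority weights from a reciprocal pairwise-comparison
    matrix using the row geometric-mean method, a standard approximation to
    the principal eigenvector."""
    n = len(matrix)
    gmeans = []
    for row in matrix:
        prod = 1.0
        for x in row:
            prod *= x
        gmeans.append(prod ** (1.0 / n))
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical comparison: internal processes vs. financial vs. customer
M = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 0.5, 1.0]]
w = ahp_weights(M)
print([round(x, 3) for x in w])
```

    With this matrix the first criterion receives the largest weight, mirroring the study's finding that internal-process measures dominated respondents' priorities.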

  12. On the optimal design of molecular sensing interfaces with lipid bilayer assemblies - A knowledge based approach

    NASA Astrophysics Data System (ADS)

    Siontorou, Christina G.

    2012-12-01

    Biosensors are analytical devices that incorporate a biochemical recognition system (biological, biologically derived or biomimetic: enzyme, antibody, DNA, receptor, etc.) in close contact with a physicochemical transducer (electrochemical, optical, piezoelectric, conductimetric, etc.) that converts the biochemical information, produced by the specific biological recognition reaction (analyte-biomolecule binding), into a chemical or physical output signal related to the concentration of the analyte in the measuring sample. The biosensing concept is based on natural chemoreception mechanisms, which are feasible over, within, or by means of a biological membrane, i.e., a structured lipid bilayer incorporating or attached to proteinaceous moieties that regulate the molecular recognition events that trigger ion flux changes (facilitated or passive) through the bilayer. Creating functional structures that are similar to natural signal transduction systems, correlating and interrelating the physicochemical transducer compatibly and successfully with the lipid film that is self-assembled on its surface while embedding the reconstituted biological recognition system, and at the same time satisfying the basic conditions for measuring device development (simplicity, easy handling, ease of fabrication), is far from trivial. The aim of the present work is to present a methodological framework for designing such molecular sensing interfaces, functioning within a knowledge-based system built on an ontological platform that supplies sub-system options, compatibilities, and optimization parameters.

  13. Metal-organic framework MIL-101 as sorbent based on double-pumps controlled on-line solid-phase extraction coupled with high-performance liquid chromatography for the determination of flavonoids in environmental water samples.

    PubMed

    Liu, Yue; Hu, Jia; Li, Yan; Li, Xiao-Shuang; Wang, Zhong-Liang

    2016-10-01

    A novel method with high sensitivity for the rapid determination of chrysin, apigenin and luteolin in environmental water samples was developed by double-pump controlled on-line solid-phase extraction (SPE) coupled with high-performance liquid chromatography (HPLC). In the developed technique, the metal-organic framework MIL-101 was synthesized and applied as a sorbent for SPE. The as-synthesized MIL-101 was characterized by scanning electron microscopy, X-ray diffraction spectrometry, thermal gravimetric analysis and micropore physisorption analysis. The MIL-101 exhibited fast kinetics in the adsorption of chrysin, apigenin and luteolin. On-line SPE of chrysin, apigenin and luteolin was performed by loading a sample solution at a flow rate of 1.0 mL/min for 10 min. The extracted analytes were subsequently eluted into a ZORBAX Bonus-RP analytical column (25 cm long × 4.6 mm i.d.) for HPLC separation under isocratic conditions with a mobile phase (MeOH:ACN:0.02 M H3PO4 = 35:35:30) at a flow rate of 1.0 mL/min. Experimental conditions, including ionic strength, sample pH, sample loading rate, sample loading time and analyte desorption time, were further optimized to obtain efficient preconcentration and high-precision determination of the analytes mentioned above. The method achieved the merits of simplicity, rapidity, sensitivity, wide linear range and high sample throughput. A possible mechanism for the adsorption of flavonoids on MIL-101 was proposed. The developed method has been applied to determine trace chrysin, apigenin and luteolin in a variety of environmental water samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  15. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance

    PubMed Central

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405

  16. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance.

    PubMed

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y; Fairchild, Geoffrey; Hyman, James M; Kiang, Richard; Morse, Andrew P; Pancerella, Carmen M; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  17. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE PAGES

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; ...

    2016-01-28

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  18. Big data analytics in healthcare: promise and potential.

    PubMed

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.

  19. Many-Objective Reservoir Policy Identification and Refinement to Reduce Institutional Myopia in Water Management

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P. M.

    2013-12-01

    Institutional inertia strongly limits our ability to adapt water reservoir operations to better manage growing water demands as well as their associated uncertainties in a changing climate. Although it has long been recognized that these systems are generally framed in heterogeneous socio-economic contexts involving a myriad of conflicting, non-commensurable operating objectives, our broader understanding of the multiobjective consequences of current operating rules as well as their vulnerability to hydroclimatic uncertainties is severely limited. This study proposes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification and many-objective optimization under uncertainty to characterize current operations and discover key tradeoffs between alternative policies for balancing evolving demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to the system's competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. Initially our proposed framework uses available streamflow observations to implicitly identify the Conowingo Dam's current but unknown operating policy. This baseline policy is identified by fitting radial basis functions to existing system dynamics. Our assumption in the baseline policy is that the dam operator is represented as a rational agent seeking to maximize primary operational objectives (i.e., guaranteeing the public water supply and maximizing the hydropower revenue). The quality of the identified baseline policy is evaluated by its ability to replicate historical release dynamics. 
Once identified, the historical baseline policy then provides a means of representing the decision preferences guiding current operations. Our results show that the estimated policy closely captures the dynamics of current releases and flows for the Lower Susquehanna. After identifying the historical baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover improved operating policies. Our Lower Susquehanna results confirm that the system's current history-based operations are negatively biased to overestimate the reliability of the reservoir's multi-sector services. Moreover, our proposed framework has successfully identified alternative reservoir policies that are more robust to hydroclimatic uncertainties while being capable of better addressing the tradeoffs across the Conowingo Dam's multi-sector services.
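
    Identifying a baseline policy by fitting radial basis functions to observed system dynamics, as described above, amounts to a least-squares fit of RBF weights. The sketch below is a generic Gaussian-RBF fit on synthetic storage/release data; the function names, centers, width and data are all illustrative, not the study's parameterization:

```python
import numpy as np

def fit_rbf_policy(inputs, releases, centers, width):
    """Least-squares fit of an RBF release policy
    r(x) = sum_i w_i * exp(-||x - c_i||^2 / width^2)
    to observed (state, release) pairs."""
    X = np.asarray(inputs, dtype=float)            # (n_obs, n_features)
    C = np.asarray(centers, dtype=float)           # (n_centers, n_features)
    # Design matrix of Gaussian basis responses, shape (n_obs, n_centers).
    Phi = np.exp(-np.sum((X[:, None, :] - C[None, :, :]) ** 2, axis=2) / width ** 2)
    w, *_ = np.linalg.lstsq(Phi, np.asarray(releases, dtype=float), rcond=None)
    return w, Phi

# Synthetic example: release roughly proportional to normalized storage
storage = [[0.1], [0.4], [0.7], [1.0]]
release = [1.0, 4.0, 7.0, 10.0]
centers = [[0.0], [0.5], [1.0]]
w, Phi = fit_rbf_policy(storage, release, centers, width=0.5)
pred = Phi @ w
print(np.round(pred, 2))
```

    In the study's setting, the fitted policy's ability to reproduce historical releases (here, how closely `pred` tracks `release`) is the quality criterion for the identified baseline.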

  20. Multi-phase-field method for surface tension induced elasticity

    NASA Astrophysics Data System (ADS)

    Schiedung, Raphael; Steinbach, Ingo; Varnik, Fathollah

    2018-01-01

    A method, based on the multi-phase-field framework, is proposed that adequately accounts for the effects of a coupling between surface free energy and elastic deformation in solids. The method is validated via a number of analytically solvable problems. In addition to stress states at mechanical equilibrium in complex geometries, the underlying multi-phase-field framework naturally allows us to account for the influence of surface energy induced stresses on phase transformation kinetics. This issue, which is of fundamental importance on the nanoscale, is demonstrated in the limit of fast diffusion for a solid sphere, which melts due to the well-known Gibbs-Thomson effect. This melting process is slowed down when coupled to surface energy induced elastic deformation.

  1. Quantifying drivers of wild pig movement across multiple spatial and temporal scales

    USGS Publications Warehouse

    Kay, Shannon L.; Fischer, Justin W.; Monaghan, Andrew J.; Beasley, James C.; Boughton, Raoul; Campbell, Tyler A.; Cooper, Susan M.; Ditchkoff, Stephen S.; Hartley, Stephen B.; Kilgo, John C.; Wisely, Samantha M.; Wyckoff, A. Christy; Vercauteren, Kurt C.; Pipen, Kim M.

    2017-01-01

    The analytical framework we present can be used to assess movement patterns arising from multiple data sources for a range of species while accounting for spatio-temporal correlations. Our analyses show the magnitude by which reaction norms can change based on the temporal scale of response data, illustrating the importance of appropriately defining temporal scales of both the movement response and covariates depending on the intended implications of research (e.g., predicting effects of movement due to climate change versus planning local-scale management). We argue that consideration of multiple spatial scales within the same framework (rather than comparing across separate studies post-hoc) gives a more accurate quantification of cross-scale spatial effects by appropriately accounting for error correlation.

  2. Designing perfect linear polarization converters using perfect electric and magnetic conducting surfaces

    PubMed Central

    Zhou, Gaochao; Tao, Xudong; Shen, Ze; Zhu, Guanghao; Jin, Biaobing; Kang, Lin; Xu, Weiwei; Chen, Jian; Wu, Peiheng

    2016-01-01

    We propose a general framework for the design of a perfect linear polarization converter that works in the transmission mode. Using an intuitive picture based on the method of bi-directional polarization mode decomposition, it is shown that when the device under consideration simultaneously possesses two complementary symmetry planes, one being equivalent to a perfect electric conducting surface and the other to a perfect magnetic conducting surface, linear polarization conversion can occur with an efficiency of 100% in the absence of absorptive losses. The proposed framework is validated by two design examples that operate near 10 GHz, where the numerical, experimental and analytic results are in good agreement. PMID:27958313

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pei, Zongrui; Stocks, George Malcolm

    The sensitivity in predicting glide behaviour of dislocations has been a long-standing problem in the framework of the Peierls-Nabarro model. The predictions of both the model itself and the analytic formulas based on it are too sensitive to the input parameters. In order to reveal the origin of this important problem in materials science, a new empirical-parameter-free formulation is proposed in the same framework. Unlike previous formulations, it includes only a limited small set of parameters, all of which can be determined by convergence tests. Under special conditions the new formulation reduces to its classic counterpart. In the light of this formulation, new relationships between Peierls stresses and the input parameters are identified, where the sensitivity is greatly reduced or even removed.

  4. Study designs for identification of rare disease variants in complex diseases: the utility of family-based designs.

    PubMed

    Ionita-Laza, Iuliana; Ottman, Ruth

    2011-11-01

    The recent progress in sequencing technologies makes possible large-scale medical sequencing efforts to assess the importance of rare variants in complex diseases. The results of such efforts depend heavily on the use of efficient study designs and analytical methods. We introduce here a unified framework for association testing of rare variants in family-based designs or designs based on unselected affected individuals. This framework allows us to quantify the enrichment in rare disease variants in families containing multiple affected individuals and to investigate the optimal design of studies aiming to identify rare disease variants in complex traits. We show that for many complex diseases with small values for the overall sibling recurrence risk ratio, such as Alzheimer's disease and most cancers, sequencing affected individuals with a positive family history of the disease can be extremely advantageous for identifying rare disease variants. In contrast, for complex diseases with large values of the sibling recurrence risk ratio, sequencing unselected affected individuals may be preferable.

  5. Eyes of the Storm: Can Fusion Centers Play a Crucial Role During the Response Phase of Natural Disasters Through Collaborative Relationships With Emergency Operations Centers?

    DTIC Science & Technology

    2014-09-01

    Hispanic, street, and outlaw motorcycle gang activity in the Commonwealth. The VFC manages the suspicious activity reporting (SAR) initiative... management. This thesis examined three case studies of fusion center disaster responses through a collaborative-based analytical framework. The resulting

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace Tyner

    The project had three main objectives: to build and incorporate an explicit biomass energy sector within the GTAP analytical framework and data base; to provide an analysis of the impact of renewable fuel standards and other policies in the U.S. and E.U., as well as alternative biofuel policies in other parts of the world, on changes in production, prices, consumption, trade and poverty; and to evaluate environmental impacts of alternative policies for bioenergy development. Progress and outputs related to each objective are reported.

  7. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    PubMed Central

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools.
This paper concludes by summarizing the potential usage of the MapReduce programming framework and the Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
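    The Map/Reduce division of labor that the abstract describes can be illustrated with a single-process sketch. Real Hadoop jobs distribute the map, shuffle, and reduce phases across cluster nodes; the record contents below are made-up stand-ins for clinical text:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Map: emit (key, value) pairs -- here, a count of 1 per token.
    for token in record.split():
        yield token.lower(), 1

def shuffle(pairs):
    # Shuffle: group all intermediate values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reduce: aggregate the values collected for one key.
    return key, sum(values)

records = ["ICD9 250.00 diabetes", "diabetes type 2", "ICD9 401.9 hypertension"]
pairs = chain.from_iterable(map_phase(r) for r in records)
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(result["diabetes"])  # → 2
```

    In Hadoop, the same Mapper/Reducer pair would run in parallel on HDFS blocks, with the framework handling the shuffle, fault tolerance, and replication described above.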

  8. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    PubMed

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools.
This paper concludes by summarizing the potential usage of the MapReduce programming framework and the Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.

  9. Metal-Organic Framework Modified Glass Substrate for Analysis of Highly Volatile Chemical Warfare Agents by Paper Spray Mass Spectrometry.

    PubMed

    Dhummakupt, Elizabeth S; Carmany, Daniel O; Mach, Phillip M; Tovar, Trenton M; Ploskonka, Ann M; Demond, Paul S; DeCoste, Jared B; Glaros, Trevor

    2018-03-07

    Paper spray mass spectrometry has been shown to successfully analyze chemical warfare agent (CWA) simulants. However, due to the volatility differences between the simulants and real G-series (i.e., sarin, soman) CWAs, analysis from an untreated paper substrate proved difficult. To extend the analytical lifetime of these G-agents, metal-organic frameworks (MOFs) were successfully integrated onto the paper spray substrates to increase adsorption and desorption. In this study, several MOFs and nanoparticles were tested to extend the analytical lifetimes of sarin, soman, and cyclosarin on paper spray substrates. It was found that the addition of either UiO-66 or HKUST-1 to the paper substrate increased the analytical lifetime of the G-agents from less than 5 min to at least 50 min of detectability.

  10. A channel-based framework for steering, non-locality and beyond

    NASA Astrophysics Data System (ADS)

    Hoban, Matty J.; Belén Sainz, Ana

    2018-05-01

    Non-locality and steering are both non-classical phenomena witnessed in nature as a result of quantum entanglement. It is now well-established that one can study non-locality independently of the formalism of quantum mechanics, in the so-called device-independent framework. With regards to steering, although one cannot study it completely independently of the quantum formalism, ‘post-quantum steering’ has been described, which is steering that cannot be reproduced by measurements on entangled states but does not lead to superluminal signalling. In this work we present a framework based on the study of quantum channels in which one can study steering (and non-locality) in quantum theory and beyond. In this framework, we show that various kinds of steering, whether quantum or post-quantum, are directly related to particular families of quantum channels that have been previously introduced by Beckman et al (2001 Phys. Rev. A 64 052309). Utilizing this connection we also demonstrate new analytical examples of post-quantum steering, give a quantum channel interpretation of almost quantum non-locality and steering, easily recover and generalize the celebrated Gisin–Hughston–Jozsa–Wootters theorem, and initiate the study of post-quantum Buscemi non-locality and non-classical teleportation. In this way, we see post-quantum non-locality and steering as just two aspects of a more general phenomenon.

  11. Cultural Cleavage and Criminal Justice.

    ERIC Educational Resources Information Center

    Scheingold, Stuart A.

    1978-01-01

    Reviews major theories of criminal justice, proposes an alternative analytic framework which focuses on cultural factors, applies this framework to several cases, and discusses implications of a cultural perspective for rule of law values. Journal available from Office of Publication, Department of Political Science, University of Florida,…

  12. Institutional Racist Melancholia: A Structural Understanding of Grief and Power in Schooling

    ERIC Educational Resources Information Center

    Vaught, Sabina E.

    2012-01-01

    In this article, Sabina Vaught undertakes the theoretical and analytical project of conceptually integrating "Whiteness as property", a key structural framework of Critical Race Theory (CRT), and "melancholia", a framework originally emerging from psychoanalysis. Specifically, Vaught engages "Whiteness as property" as…

  13. Matrix-enhanced secondary ion mass spectrometry: The Alchemist's solution?

    NASA Astrophysics Data System (ADS)

    Delcorte, Arnaud

    2006-07-01

    Because of the requirements of large molecule characterization and high-lateral resolution SIMS imaging, the possibility of improving molecular ion yields by the use of specific sample preparation procedures has recently generated a renewed interest in the static SIMS community. In comparison with polyatomic projectiles, however, signal enhancement by a matrix might appear to some as the alchemist's versus the scientist's solution to the current problems of organic SIMS. In this contribution, I would like to discuss critically the pros and cons of matrix-enhanced SIMS procedures, in the new framework that includes polyatomic ion bombardment. This discussion is based on a short review of the experimental and theoretical developments achieved in the last decade with respect to the three following approaches: (i) blending the analyte with a low-molecular weight organic matrix (MALDI-type preparation procedure); (ii) mixing alkali/noble metal salts with the analyte; (iii) evaporating a noble metal layer on the analyte sample surface (organic molecules, polymers).

  14. Ontology for customer centric digital services and analytics

    NASA Astrophysics Data System (ADS)

    Keat, Ng Wai; Shahrir, Mohammad Shazri

    2017-11-01

    In computer science research, ontologies are commonly utilised to create a unified abstract across many rich and different fields. In this paper, we apply the concept to the customer centric domain of digital services analytics and present an analytics solution ontology. The essence is based on a traditional Entity Relationship Diagram (ERD), which was then abstracted to cover wider areas of customer centric digital services. The ontology we developed covers both static aspects (customer identifiers) and dynamic aspects (customer's temporal interactions). The structure of the customer scape is modeled with classes that represent different types of customer touch points, ranging from digital touch points to digital stamps that represent physical analogies. The dynamic aspects of customer centric digital service are modeled with a set of classes, with their importance represented in different associations involving the establishment and termination of the target interaction. The realized ontology can be used in the development of frameworks for customer centric applications, and for the specification of a common data format used by cooperating digital service applications.

  15. Combining numerical simulations with time-domain random walk for pathogen risk assessment in groundwater

    NASA Astrophysics Data System (ADS)

    Cvetkovic, V.; Molin, S.

    2012-02-01

    We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology is based on the analytical results presented in [1,2] for utilising colloid filtration theory in a time-domain random walk (TDRW) framework. It is shown that in uniform flow, the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. We also show that spatial variability of the attachment rate may be significant; however, it appears to affect risk in a different manner depending on whether the flow is uniform or radially converging. In spite of the fact that numerous issues remain open regarding pathogen transport in aquifers on the field scale, the methodology presented here may be useful for screening purposes, and may also serve as a basis for future studies that would include greater complexity.
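    The coupling of advective travel-time segments with first-order attachment from colloid filtration theory can be sketched as a simple time-domain random walk. The exponential segment distribution, the attachment rate, and all parameter values below are illustrative assumptions for the sketch, not those of the cited analytical models [1,2]:

```python
import math
import random

def tdrw_survival(n_particles, n_segments, mean_dt, k_att, seed=1):
    """Time-domain random walk sketch: sum random advective segment travel
    times along each trajectory and attenuate pathogen mass by first-order
    attachment, exp(-k_att * t), in the spirit of colloid filtration theory."""
    rng = random.Random(seed)
    surviving = 0.0
    for _ in range(n_particles):
        # Total advective travel time over the trajectory's segments.
        t = sum(rng.expovariate(1.0 / mean_dt) for _ in range(n_segments))
        surviving += math.exp(-k_att * t)  # fraction not attached en route
    return surviving / n_particles

# Fraction of pathogen mass reaching a receptor (e.g., a well), on average.
frac = tdrw_survival(n_particles=5000, n_segments=10, mean_dt=1.0, k_att=0.2)
print(round(frac, 3))
```

    A heterogeneous attachment rate could be mimicked by drawing `k_att` per segment, which is one way to probe the spatial-variability effect the abstract discusses.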

  16. Why does the sign problem occur in evaluating the overlap of HFB wave functions?

    NASA Astrophysics Data System (ADS)

    Mizusaki, Takahiro; Oi, Makito; Shimizu, Noritaka

    2018-04-01

    For the overlap matrix element between Hartree-Fock-Bogoliubov states, there are two analytically different formulae: one with the square root of the determinant (the Onishi formula) and the other with the Pfaffian (Robledo's Pfaffian formula). The former formula is two-valued as a complex function and hence leaves the sign of the norm overlap undetermined (the so-called sign problem of the Onishi formula). On the other hand, the latter formula does not suffer from the sign problem. The derivations of these two formulae are so different that it is unclear why the resultant formulae possess different analytical properties. In this paper, we discuss why the difference occurs by means of a consistent framework based on the linked cluster theorem and the product-sum identity for the Pfaffian. Through this discussion, we elucidate the source of the sign problem in the Onishi formula. We also point out that different summation methods of series expansions may result in analytically different formulae.
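    The sign ambiguity at issue can be seen in miniature: for an even-dimensional antisymmetric matrix, det A = pf(A)^2, so a square-root-of-determinant formula recovers only |pf(A)|. The toy Pfaffian below (a recursive expansion for small matrices, not Robledo's formula for HFB overlaps) makes the point:

```python
import math

def pfaffian(A):
    """Pfaffian of an even-dimensional antisymmetric matrix via expansion
    along the first row (adequate for small matrices)."""
    n = len(A)
    if n == 0:
        return 1.0
    total = 0.0
    for j in range(1, n):
        # Strike out rows/columns 0 and j, with alternating sign.
        rest = [r for r in range(n) if r not in (0, j)]
        minor = [[A[p][q] for q in rest] for p in rest]
        total += (-1) ** (j - 1) * A[0][j] * pfaffian(minor)
    return total

# A 4x4 antisymmetric matrix chosen so its Pfaffian is negative.
a, b, c, d, e, f = 1.0, 2.0, 3.0, 4.0, 5.0, -10.0
A = [[0, a, b, c],
     [-a, 0, d, e],
     [-b, -d, 0, f],
     [-c, -e, -f, 0]]
pf = pfaffian(A)               # a*f - b*e + c*d = -8
sqrt_det = math.sqrt(pf ** 2)  # what a determinant-based route yields: +8
print(pf, sqrt_det)            # → -8.0 8.0: the square root loses the sign
```

    Tracking the Pfaffian itself, as in Robledo's formula, keeps the sign that the square root of the determinant cannot resolve.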

  17. Phylogenetic analysis accounting for age-dependent death and sampling with applications to epidemics.

    PubMed

    Lambert, Amaury; Alexander, Helen K; Stadler, Tanja

    2014-07-07

    The reconstruction of phylogenetic trees based on viral genetic sequence data sequentially sampled from an epidemic provides estimates of the past transmission dynamics, by fitting epidemiological models to these trees. To our knowledge, none of the epidemiological models currently used in phylogenetics can account for recovery rates and sampling rates dependent on the time elapsed since transmission, i.e. age of infection. Here we introduce an epidemiological model where infectives leave the epidemic, by either recovery or sampling, after some random time which may follow an arbitrary distribution. We derive an expression for the likelihood of the phylogenetic tree of sampled infectives under our general epidemiological model. The analytic concept developed in this paper will facilitate inference of past epidemiological dynamics and provide an analytical framework for performing very efficient simulations of phylogenetic trees under our model. The main idea of our analytic study is that the non-Markovian epidemiological model giving rise to phylogenetic trees growing vertically as time goes by can be represented by a Markovian "coalescent point process" growing horizontally by the sequential addition of pairs of coalescence and sampling times. As examples, we discuss two special cases of our general model, described in terms of influenza and HIV epidemics. Though phrased in epidemiological terms, our framework can also be used for instance to fit macroevolutionary models to phylogenies of extant and extinct species, accounting for general species lifetime distributions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Analytical steady-state solutions for water-limited cropping systems using saline irrigation water

    NASA Astrophysics Data System (ADS)

    Skaggs, T. H.; Anderson, R. G.; Corwin, D. L.; Suarez, D. L.

    2014-12-01

    Due to the diminishing availability of good quality water for irrigation, it is increasingly important that irrigation and salinity management tools be able to target submaximal crop yields and support the use of marginal quality waters. In this work, we present a steady-state irrigated systems modeling framework that accounts for reduced plant water uptake due to root zone salinity. Two explicit, closed-form analytical solutions for the root zone solute concentration profile are obtained, corresponding to two alternative functional forms of the uptake reduction function. The solutions express a general relationship between irrigation water salinity, irrigation rate, crop salt tolerance, crop transpiration, and (using standard approximations) crop yield. Example applications are illustrated, including the calculation of irrigation requirements for obtaining targeted submaximal yields, and the generation of crop-water production functions for varying irrigation waters, irrigation rates, and crops. Model predictions are shown to be mostly consistent with existing models and available experimental data. Yet the new solutions possess advantages over available alternatives, including: (i) the solutions were derived from a complete physical-mathematical description of the system, rather than based on an ad hoc formulation; (ii) the analytical solutions are explicit and can be evaluated without iterative techniques; (iii) the solutions permit consideration of two common functional forms of salinity induced reductions in crop water uptake, rather than being tied to one particular representation; and (iv) the utilized modeling framework is compatible with leading transient-state numerical models.
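    The general relationship the solutions express, between irrigation water salinity, irrigation (leaching) rate, crop salt tolerance, and yield, can be caricatured in a few lines. The square-root concentration factor and the threshold-slope (Maas-Hoffman-style) uptake response below are illustrative assumptions for the sketch, not the paper's closed-form solutions, and the crop parameters are made up:

```python
def relative_yield(ec_iw, leaching_fraction, threshold, slope):
    """Steady-state sketch: mean root-zone salinity rises as the leaching
    fraction shrinks, and relative yield declines linearly once salinity
    exceeds a crop-specific tolerance threshold."""
    # Crude steady-state concentration factor for the average root zone.
    ec_soil = ec_iw / leaching_fraction ** 0.5
    if ec_soil <= threshold:
        return 1.0
    return max(0.0, 1.0 - slope * (ec_soil - threshold))

# Illustrative crop: threshold 2.5 dS/m, 9.9% yield loss per dS/m above it.
for lf in (0.05, 0.15, 0.30):
    y = relative_yield(ec_iw=1.5, leaching_fraction=lf,
                       threshold=2.5, slope=0.099)
    print(lf, round(y, 3))
```

    Inverting such a relationship for a targeted submaximal yield is exactly the kind of irrigation-requirement calculation the abstract describes, though the paper does it with explicit analytical solutions rather than this caricature.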

  19. Defense Resource Management Studies: Introduction to Capability and Acquisition Planning Processes

    DTIC Science & Technology

    2010-08-01

    interchangeable and useful in a common contextual framework. Currently, both simulations use a common scenario, the same fictitious country, and...culture, legal framework, and institutions. • Incorporate Principles of Good Governance and Respect for Human Rights: Stress accountability and...Preparing for the assessments requires defining the missions to be analyzed; subdividing the mission definitions to provide a framework for analytic work

  20. A comprehensive test of clinical reasoning for medical students: An olympiad experience in Iran.

    PubMed

    Monajemi, Alireza; Arabshahi, Kamran Soltani; Soltani, Akbar; Arbabi, Farshid; Akbari, Roghieh; Custers, Eugene; Hadadgar, Arash; Hadizadeh, Fatemeh; Changiz, Tahereh; Adibi, Peyman

    2012-01-01

    Although some tests for clinical reasoning assessment are now available, the theories of medical expertise have not played a major role in this field. In this paper, illness script theory was chosen as a theoretical framework, and contemporary clinical reasoning tests were put together based on this theoretical model. This paper is a qualitative study performed with an action research approach. This style of research is performed in a context where authorities focus on promoting their organizations' performance and is carried out in the form of teamwork called participatory research. Results are presented in four parts as basic concepts, clinical reasoning assessment, test framework, and scoring. We concluded that no single test could thoroughly assess clinical reasoning competency, and therefore a battery of clinical reasoning tests is needed. This battery should cover all three parts of the clinical reasoning process: script activation, selection, and verification. In addition, both analytical and non-analytical reasoning, as well as both diagnostic and management reasoning, should be evenly taken into consideration in this battery. This paper explains the process of designing and implementing the battery of clinical reasoning tests in the Olympiad for medical sciences students through action research.

  1. Framework for behavioral analytics in anomaly identification

    NASA Astrophysics Data System (ADS)

    Touma, Maroun; Bertino, Elisa; Rivera, Brian; Verma, Dinesh; Calo, Seraphin

    2017-05-01

    Behavioral Analytics (BA) relies on digital breadcrumbs to build user profiles and create clusters of entities that exhibit a large degree of similarity. The prevailing assumption is that an entity will assimilate the group behavior of the cluster it belongs to. Our understanding of BA and its application in different domains continues to evolve and is a direct result of the growing interest in Machine Learning research. When trying to detect security threats, we use BA techniques to identify anomalies, defined in this paper as deviations from the group behavior. Early research papers in this field reveal a high number of false positives, where a security alert is triggered based on deviation from the cluster's learned behavior even though the activity is still within the norm of what the system defines as acceptable behavior. Further, domain-specific security policies tend to be narrow and inadequately represent what an entity can do. Hence, they: a) limit the amount of useful data during the learning phase; and b) lead to violations of policy during the execution phase. In this paper, we propose a framework for future research on the role of policies and behavioral security in a coalition setting, with emphasis on anomaly detection and individuals' deviation from group activities.
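    The working definition above, an anomaly is a deviation from the learned group behavior, can be sketched with a centroid-distance rule. The two features and the 2-sigma threshold are illustrative assumptions, not the paper's method:

```python
import math

def centroid(points):
    """Component-wise mean of a list of equal-length feature vectors."""
    dims = len(points[0])
    return [sum(p[i] for p in points) / len(points) for i in range(dims)]

def flag_anomalies(cluster, threshold_sigmas=2.0):
    """Flag entities whose distance from their cluster's centroid exceeds
    the mean distance by threshold_sigmas standard deviations."""
    c = centroid(cluster)
    dists = [math.dist(p, c) for p in cluster]
    mu = sum(dists) / len(dists)
    sd = (sum((d - mu) ** 2 for d in dists) / len(dists)) ** 0.5
    return [d > mu + threshold_sigmas * sd for d in dists]

# Nine similar login profiles (hour-of-day, GB transferred) plus one outlier.
cluster = [(9, 1.0), (10, 1.2), (9, 0.9), (11, 1.1), (10, 1.0),
           (9, 1.3), (10, 0.8), (11, 1.0), (10, 1.1), (3, 9.0)]
flags = flag_anomalies(cluster)
print(flags)  # only the (3, 9.0) entity is flagged
```

    The false-positive problem the abstract highlights corresponds to how this threshold is set: too tight, and in-norm behavior near the cluster edge triggers alerts.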

  2. The remarkable environmental rebound effect of electric cars: a microeconomic approach.

    PubMed

    Font Vivanco, David; Freire-González, Jaume; Kemp, René; van der Voet, Ester

    2014-10-21

    This article presents a stepwise, refined, and practical analytical framework to model the microeconomic environmental rebound effect (ERE) stemming from cost differences of electric cars in terms of changes in multiple life cycle environmental indicators. The analytical framework is based on marginal consumption analysis and hybrid life cycle assessment (LCA). The article makes a novel contribution through a reinterpretation of the traditional rebound effect and methodological refinements. It also provides novel empirical results about the ERE for plug-in hybrid electric (PHE), full-battery electric (FBE), and hydrogen fuel cell (HFC) cars for Europe. The ERE is found to have a remarkable impact on product-level environmental scores. For the PHE car, the ERE causes a marginal increase in demand and environmental pressures due to a small decrease in the cost of using this technology. For FBE and HFC cars, the high capital costs cause a noteworthy decrease in environmental pressures for some indicators (negative rebound effect). The results corroborate the concern over the high influence of cost differences for environmental assessment, and they prompt sustainable consumption policies to consider markets and prices as tools rather than as an immutable background.

  3. On the relationship between the causal-inference and meta-analytic paradigms for the validation of surrogate endpoints.

    PubMed

    Alonso, Ariel; Van der Elst, Wim; Molenberghs, Geert; Buyse, Marc; Burzykowski, Tomasz

    2015-03-01

    The increasing cost of drug development has raised the demand for surrogate endpoints when evaluating new drugs in clinical trials. However, over the years, it has become clear that surrogate endpoints need to be statistically evaluated and deemed valid before they can be used as substitutes for "true" endpoints in clinical studies. Nowadays, two paradigms, based on causal inference and meta-analysis, dominate the scene. Nonetheless, although the literature emanating from these paradigms is wide, until now the relationship between them has largely been left unexplored. In the present work, we discuss the conceptual framework underlying both approaches and study the relationship between them using theoretical elements and the analysis of a real case study. Furthermore, we show, on the one hand, that the meta-analytic approach can be embedded within a causal-inference framework and, on the other, that it can be heuristically justified why surrogate endpoints successfully evaluated using this approach will often be appealing from a causal-inference perspective as well. A newly developed and user-friendly R package, Surrogate, is provided to carry out the evaluation exercise. © 2014, The International Biometric Society.

  4. Adaptive learning in complex reproducing kernel Hilbert spaces employing Wirtinger's subgradients.

    PubMed

    Bouboulis, Pantelis; Slavakis, Konstantinos; Theodoridis, Sergios

    2012-03-01

    This paper presents a wide framework for non-linear online supervised learning tasks in the context of complex-valued signal processing. The (complex) input data are mapped into a complex reproducing kernel Hilbert space (RKHS), where the learning phase takes place. Both pure complex kernels and real kernels (via the complexification trick) can be employed. Moreover, any convex, continuous and not necessarily differentiable function can be used to measure the loss between the output of the specific system and the desired response. The only requirement is that the subgradient of the adopted loss function be available in analytic form. In order to derive the subgradients analytically, the principles of the (recently developed) Wirtinger's calculus in complex RKHS are exploited. Furthermore, both linear and widely linear (in RKHS) estimation filters are considered. To cope with the problem of increasing memory requirements, which is present in almost all online schemes in RKHS, a sparsification scheme based on projection onto closed balls has been adopted. We demonstrate the effectiveness of the proposed framework in a non-linear channel identification task, a non-linear channel equalization problem and a quadrature phase shift keying equalization scheme, using both circular and non-circular synthetic signal sources.
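    A toy version of online learning in a complex RKHS can be sketched with a naive kernel LMS that keeps every sample as a dictionary center and sets a complex coefficient from the instantaneous error. This stands in for, but is far cruder than, the Wirtinger-subgradient machinery and the sparsification scheme the paper develops; the target map, kernel width, and step size are made up:

```python
import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    # Real Gaussian kernel on complex samples via |x - y|^2; complex
    # expansion coefficients then realize a complexified non-linear filter.
    return np.exp(-np.abs(x - y) ** 2 / (2.0 * sigma ** 2))

def kernel_lms(inputs, desired, step=0.2, sigma=1.0):
    """Naive complex kernel LMS: each input becomes a center whose complex
    coefficient is step * (instantaneous error)."""
    centers, coeffs, abs_errors = [], [], []
    for x, d in zip(inputs, desired):
        y = sum(a * gauss_kernel(x, c, sigma) for a, c in zip(coeffs, centers))
        e = d - y
        abs_errors.append(abs(e))
        centers.append(x)
        coeffs.append(step * e)
    return abs_errors

rng = np.random.default_rng(3)
x = rng.normal(size=400) + 1j * rng.normal(size=400)
d = (0.8 - 0.3j) * x + 0.2 * x ** 2          # mildly non-linear complex target
err = kernel_lms(x, d)
print(float(np.mean(err[:50])), float(np.mean(err[-50:])))
```

    The mean absolute error over late samples falls below the early ones as the expansion learns the map; the paper's sparsification (projection onto closed balls) exists precisely because this dictionary otherwise grows without bound.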

  5. Analytically exploiting noise correlations inside the feedback loop to improve locked-oscillator performance.

    PubMed

    Sastrawan, J; Jones, C; Akhalwaya, I; Uys, H; Biercuk, M J

    2016-08-01

    We introduce concepts from optimal estimation to the stabilization of precision frequency standards limited by noisy local oscillators. We develop a theoretical framework casting various measures for frequency standard variance in terms of frequency-domain transfer functions, capturing the effects of feedback stabilization via a time series of Ramsey measurements. Using this framework, we introduce an optimized hybrid predictive feedforward measurement protocol that employs results from multiple past measurements and transfer-function-based calculations of measurement covariance to improve the accuracy of corrections within the feedback loop. In the presence of common non-Markovian noise processes these measurements will be correlated in a calculable manner, providing a means to capture the stochastic evolution of the local oscillator frequency during the measurement cycle. We present analytic calculations and numerical simulations of oscillator performance under competing feedback schemes and demonstrate benefits in both correction accuracy and long-term oscillator stability using hybrid feedforward. Simulations verify that, in the presence of uncompensated dead time and noise with significant spectral weight near the inverse cycle time, predictive feedforward outperforms traditional feedback, providing a path towards developing a class of stabilization software routines for frequency standards limited by noisy local oscillators.
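    The feedback-versus-feedforward distinction can be caricatured with a toy loop: a local oscillator whose frequency offset drifts deterministically each cycle, a plain feedback correction that subtracts each Ramsey measurement, and a "predictive" variant that also integrates past measurements into a drift estimate and pre-compensates it. All of this is an illustrative assumption, not the paper's protocol or noise model:

```python
import random

def simulate(n_cycles, drift, noise_sigma, predictive, gain=0.5, seed=7):
    """Toy locked oscillator: the offset gains a deterministic drift plus
    white noise each cycle; the measured offset is corrected away, and the
    predictive variant additionally pre-compensates an integrated drift
    estimate (a crude feedforward). Returns the mean post-transient error."""
    rng = random.Random(seed)
    offset, drift_est = 0.0, 0.0
    residuals = []
    for _ in range(n_cycles):
        offset += drift + rng.gauss(0.0, noise_sigma)  # free-running evolution
        m = offset                                      # ideal Ramsey readout
        residuals.append(abs(m))
        correction = m + drift_est if predictive else m
        offset -= correction
        if predictive:
            drift_est += gain * m                       # integrate residual error
    tail = residuals[10:]                               # skip the transient
    return sum(tail) / len(tail)

fb = simulate(500, drift=1.0, noise_sigma=0.05, predictive=False)
ff = simulate(500, drift=1.0, noise_sigma=0.05, predictive=True)
print(round(fb, 3), round(ff, 3))
```

    Plain feedback lags the drift by one full cycle, so its residual stays near the per-cycle drift, while the drift-compensating loop settles to the noise floor, a cartoon of why using multiple past measurements helps when dead time is uncompensated.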

  6. ICSNPathway: identify candidate causal SNPs and pathways from genome-wide association study by one analytical framework.

    PubMed

    Zhang, Kunlin; Chang, Suhua; Cui, Sijia; Guo, Liyuan; Zhang, Liuyan; Wang, Jing

    2011-07-01

    Genome-wide association study (GWAS) is widely utilized to identify genes involved in human complex disease or some other trait. One key challenge for GWAS data interpretation is to identify causal SNPs and provide profound evidence on how they affect the trait. Current research focuses on identifying candidate causal variants among the most significant SNPs of a GWAS, but support for the underlying biological mechanisms, as represented by pathways, is lacking. Although pathway-based analysis (PBA) has been designed to identify disease-related pathways by analyzing the full list of SNPs from GWAS, it does not emphasize interpreting causal SNPs. To our knowledge, so far there is no web server available to solve this challenge for GWAS data interpretation within one analytical framework. ICSNPathway is developed to identify candidate causal SNPs and their corresponding candidate causal pathways from GWAS by integrating linkage disequilibrium (LD) analysis, functional SNP annotation and PBA. ICSNPathway provides a feasible solution to bridge the gap between GWAS and disease mechanism study by generating hypotheses of the form SNP → gene → pathway(s). The ICSNPathway server is freely available at http://icsnpathway.psych.ac.cn/.

  7. Translating Learning into Numbers: A Generic Framework for Learning Analytics

    ERIC Educational Resources Information Center

    Greller, Wolfgang; Drachsler, Hendrik

    2012-01-01

    With the increase in available educational data, it is expected that Learning Analytics will become a powerful means to inform and support learners, teachers and their institutions in better understanding and predicting personal learning needs and performance. However, the processes and requirements behind the beneficial application of Learning…

  8. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    ERIC Educational Resources Information Center

    Dawson, Shane; Siemens, George

    2014-01-01

    The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional "literacy" skills towards an enhanced set of…

  9. How is genetic testing evaluated? A systematic review of the literature.

    PubMed

    Pitini, Erica; De Vito, Corrado; Marzuillo, Carolina; D'Andrea, Elvira; Rosso, Annalisa; Federici, Antonio; Di Maria, Emilio; Villari, Paolo

    2018-05-01

    Given the rapid development of genetic tests, an assessment of their benefits, risks, and limitations is crucial for public health practice. We performed a systematic review aimed at identifying and comparing the existing evaluation frameworks for genetic tests. We searched PUBMED, SCOPUS, ISI Web of Knowledge, Google Scholar, Google, and gray literature sources for any documents describing such frameworks. We identified 29 evaluation frameworks published between 2000 and 2017, mostly based on the ACCE Framework (n = 13 models), or on the HTA process (n = 6), or both (n = 2). Others refer to the Wilson and Jungner screening criteria (n = 3) or to a mixture of different criteria (n = 5). Due to the widespread use of the ACCE Framework, the most frequently used evaluation criteria are analytic and clinical validity, clinical utility and ethical, legal and social implications. Less attention is given to the context of implementation. An economic dimension is always considered, but not in great detail. Consideration of delivery models, organizational aspects, and consumer viewpoint is often lacking. A deeper analysis of such context-related evaluation dimensions may strengthen a comprehensive evaluation of genetic tests and support the decision-making process.

  10. Vapor-liquid equilibrium and equation of state of two-dimensional fluids from a discrete perturbation theory

    NASA Astrophysics Data System (ADS)

    Trejos, Víctor M.; Santos, Andrés; Gámez, Francisco

    2018-05-01

    The interest in describing the properties of fluids of restricted dimensionality is growing for both theoretical and practical reasons. In this work, we have first developed an analytical expression for the Helmholtz free energy of the two-dimensional square-well fluid in the Barker-Henderson framework. This equation of state is based on an approximate analytical radial distribution function for d-dimensional hard-sphere fluids (1 ≤ d ≤ 3) and is validated against existing and new simulation results. The resulting equation of state is implemented in a discrete perturbation theory able to account for general potential shapes. The prototypical Lennard-Jones and Yukawa fluids are tested in their two-dimensional versions against available and new simulation data, with semiquantitative agreement.
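    The structure of such a perturbation expansion can be sketched schematically for the 2D square well, $u(r) = -\epsilon$ for $\sigma < r < \lambda\sigma$ (this is the generic Barker-Henderson form, not the paper's final expressions):

```latex
\frac{A}{N k_B T} = a_{\mathrm{HS}}(\eta)
  + \beta\, a_1(\eta) + \beta^2 a_2(\eta) + \cdots,
\qquad
a_1 = \pi \rho \int_{\sigma}^{\lambda\sigma} u(r)\, g_{\mathrm{HS}}(r)\, r\, \mathrm{d}r,
```

    where $\beta = 1/k_B T$, $g_{\mathrm{HS}}$ is the hard-disk radial distribution function, $\eta$ the packing fraction, and the ideal-gas contribution is omitted; the $2\pi r\,\mathrm{d}r$ area element is what distinguishes the two-dimensional first-order term from its familiar three-dimensional counterpart.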

  11. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  12. Framework for Deploying a Virtualized Computing Environment for Collaborative and Secure Data Analytics

    PubMed Central

    Meyer, Adrian; Green, Laura; Faulk, Ciearro; Galla, Stephen; Meyer, Anne-Marie

    2016-01-01

    Introduction: Large amounts of health data generated by a wide range of health care applications across a variety of systems have the potential to offer valuable insight into populations and health care systems, but robust and secure computing and analytic systems are required to leverage this information. Framework: We discuss our experiences deploying a Secure Data Analysis Platform (SeDAP), and provide a framework to plan, build and deploy a virtual desktop infrastructure (VDI) to enable innovation and collaboration and to operate within academic funding structures. It outlines 6 core components: Security, Ease of Access, Performance, Cost, Tools, and Training. Conclusion: A platform like SeDAP is not successful simply through technical excellence and performance. Its adoption depends on a collaborative environment where researchers and users plan and evaluate the requirements of all aspects. PMID:27683665

  13. Discovering Tradeoffs, Vulnerabilities, and Dependencies within Water Resources Systems

    NASA Astrophysics Data System (ADS)

    Reed, P. M.

    2015-12-01

    There is growing recognition of, and interest in, using emerging computational tools to discover the tradeoffs that arise across complex combinations of infrastructure options, adaptive operations, and signposts. As a field concerned with "deep uncertainties", it is logically consistent to acknowledge more directly that our choices for dealing with computationally demanding simulations, advanced search algorithms, and sensitivity analysis tools are themselves subject to failures that could adversely bias our understanding of how systems' vulnerabilities change with proposed actions. Balancing simplicity against complexity in our computational frameworks is nontrivial given that we are often exploring high-impact, irreversible decisions. It is not always clear that accepted models even encompass important failure modes. Moreover, as models become more complex and computationally demanding, the benefits and consequences of simplifications are often untested. This presentation discusses our efforts to address these challenges through our "many-objective robust decision making" (MORDM) framework for the design and management of water resources systems. The MORDM framework has four core components: (1) elicited problem conception and formulation, (2) parallel many-objective search, (3) interactive visual analytics, and (4) negotiated selection of robust alternatives. Problem conception and formulation is the process of abstracting a practical design problem into a mathematical representation. We build on emerging work in visual analytics to exploit interactive visualization of both the design space and the objective space in multiple heterogeneous linked views that permit exploration and discovery. Many-objective search produces tradeoff solutions from potentially competing problem formulations, each of which can consider up to ten conflicting objectives given current computational search capabilities. Negotiated design selection uses interactive visualization, reformulation, and optimization to discover desirable designs for implementation. Multi-city urban water supply portfolio planning is used to illustrate the MORDM framework.

  14. The Fiscal Consequences Attributed to Changes in Morbidity and Mortality Linked to Investments in Health Care: A Government Perspective Analytic Framework.

    PubMed

    Connolly, Mark P; Kotsopoulos, Nikolaos; Postma, Maarten J; Bhatt, Aomesh

    2017-02-01

    Governments have an enormous economic and political stake in the health of their populations. Population health is not only fundamental to economic growth but also affects short-term and long-term government expenditure on health care, disability, and other social programs and influences direct and indirect tax receipts. Fiscal transfers between citizen and state are mostly ignored in conventional welfare economics analyses based on the hypothesis that there are no winners or losers through transference of wealth. However, from the government perspective, this position is flawed, as disability costs and lost taxes attributed to poor health and reduced productive output represent real costs that pose budgetary and growth implications. To address the value of health and health care investments for government, we have developed a fiscal health analytic framework that captures how changes in morbidity and mortality influence tax revenue and transfer costs (e.g., disability, allowances, ongoing health costs). The framework can be used to evaluate the marginal impact of discrete investments or a mix of interventions in health care to inform governmental budgetary consequences. In this context, the framework can be considered as a fiscal budget impact and/or cost-benefit analysis model that accounts for how morbidity and mortality linked to specific programs represent both ongoing costs and tax revenue for government. Mathematical models identical to those used in cost-effectiveness analyses can be employed in fiscal analysis to reflect how disease progression influences public accounts (e.g., tax revenue and transfers). Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
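    The budget-impact logic described above can be sketched as a discounted government cash-flow calculation (all figures hypothetical; `net_fiscal_impact` is an illustrative helper, not the authors' model):

```python
def net_fiscal_impact(cash_flows, r=0.03):
    """Discounted net government cash flow for a cohort.

    cash_flows: list of (tax_revenue, transfer_costs, health_costs) per
    year, all in the same currency units; r is the government discount
    rate. Transfers cover items such as disability payments and allowances.
    """
    return sum((tax - transfers - health) / (1 + r) ** t
               for t, (tax, transfers, health) in enumerate(cash_flows))

# Hypothetical cohort: an intervention restores work capacity, raising tax
# receipts and lowering disability transfers relative to no treatment.
no_treatment = [(4_000, 6_000, 3_000)] * 10
treatment    = [(9_000, 1_000, 2_000)] * 10
marginal_benefit = net_fiscal_impact(treatment) - net_fiscal_impact(no_treatment)
```

    The marginal difference between the two scenarios is the fiscal analogue of the incremental cost in a conventional cost-effectiveness analysis, which is why the abstract notes that identical disease-progression models can drive both.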

  15. Effects of environmental change on agriculture, nutrition and health: A framework with a focus on fruits and vegetables

    PubMed Central

    Tuomisto, Hanna L.; Scheelbeek, Pauline F.D.; Chalabi, Zaid; Green, Rosemary; Smith, Richard D.; Haines, Andy; Dangour, Alan D.

    2017-01-01

    Environmental changes are likely to affect agricultural production over the next decades. The interactions between environmental change, agricultural yields and crop quality, and the critical pathways to future diets and health outcomes are largely undefined. There are currently no quantitative models to test the impact of multiple environmental changes on nutrition and health outcomes. Using an interdisciplinary approach, we developed a framework to link the multiple interactions between environmental change, agricultural productivity and crop quality, population-level food availability, dietary intake and health outcomes, with a specific focus on fruits and vegetables. The main components of the framework consist of: i) socio-economic and societal factors, ii) environmental change stressors, iii) interventions and policies, iv) food system activities, v) food and nutrition security, and vi) health and well-being outcomes. The framework, based on currently available evidence, provides an overview of the multidimensional and complex interactions with feedback between environmental change, production of fruits and vegetables, diets and health, and forms the analytical basis for future modelling and scenario testing. PMID:29511740

  16. On Governance, Embedding and Marketing: Reflections on the Construction of Alternative Sustainable Food Networks.

    PubMed

    Roep, Dirk; Wiskerke, Johannes S C

    Based on the reconstruction of the development of 14 food supply chain initiatives in 7 European countries, we developed a conceptual framework that demonstrates that the process of increasing the sustainability of food supply chains is rooted in strategic choices regarding governance, embedding, and marketing and in the coordination of these three dimensions, which are inextricably interrelated. The framework also shows that when seeking to further develop an initiative (e.g., through scaling up or product diversification) these interrelations need continuous rebalancing. We argue that the framework can serve different purposes: it can be used as an analytical tool by researchers studying food supply chain dynamics, as a policy tool by policymakers who want to support the development of sustainable food supply chains, and as a reflexive tool by practitioners and their advisors to help them to position themselves, develop a clear strategy, find the right allies, develop their skills, and build the capacities that they need. In this paper, we elaborate upon the latter function of the framework and illustrate this briefly with empirical evidence from three of the initiatives that we studied.

  17. Devising a consensus definition and framework for non-technical skills in healthcare to support educational design: A modified Delphi study.

    PubMed

    Gordon, Morris; Baker, Paul; Catchpole, Ken; Darbyshire, Daniel; Schocken, Dawn

    2015-01-01

    Non-technical skills are a subset of human factors that focus on the individual and promote safety through teamwork and awareness. There is no widely adopted competency- or outcome-based framework for non-technical skills training in healthcare. The authors set out to devise such a framework using a modified Delphi approach. An exhaustive list of published and team-suggested items was presented to the expert panel for ranking and to propose a definition. In the second round, a focused list was presented, as well as the proposed definition elements. The finalised framework was sent to the panel for review. Sixteen experts participated. The final framework consists of 16 competencies for all and eight specific competencies for team leaders. The consensus definition describes non-technical skills as "a set of social (communication and team work) and cognitive (analytical and personal behaviour) skills that support high quality, safe, effective and efficient inter-professional care within the complex healthcare system". The authors have produced a new competency framework, through the work of an international expert panel, that is not discipline-specific and can be used by curriculum developers, educational innovators and clinical teachers to support developments in the field.

  18. Deriving Appropriate Educational Program Costs in Illinois.

    ERIC Educational Resources Information Center

    Parrish, Thomas B.; Chambers, Jay G.

    This document describes the comprehensive analytical framework for school finance used by the Illinois State Board of Education to assist policymakers in their decisions about equitable distribution of state aid and appropriate levels of resources to meet the varying educational requirements of differing student populations. This framework, the…

  19. Analyzing Agricultural Technology Systems: A Research Report.

    ERIC Educational Resources Information Center

    Swanson, Burton E.

    The International Program for Agricultural Knowledge Systems (INTERPAKS) research team is developing a descriptive and analytic framework to examine and assess agricultural technology systems. The first part of the framework is an inductive methodology that organizes data collection and orders data for comparison between countries. It requires and…

  20. An overview of ethical frameworks in public health: can they be supportive in the evaluation of programs to prevent overweight?

    PubMed Central

    2010-01-01

    Background The prevention of overweight sometimes raises complex ethical questions. Ethical public health frameworks may be helpful in evaluating programs or policy for overweight prevention. We give an overview of the purpose, form and contents of such public health frameworks and investigate to what extent they are useful for evaluating programs to prevent overweight and/or obesity. Methods Our search for frameworks consisted of three steps. Firstly, we asked experts in the field of ethics and public health for the frameworks they were aware of. Secondly, we performed a search in Pubmed. Thirdly, we checked literature references in the articles on frameworks we found. In total, we thus found six ethical frameworks. We assessed the area on which the available ethical frameworks focus, the users they target, the type of policy or intervention they propose to address, and their aim. Further, we looked at their structure and content, that is, tools for guiding the analytic process, the main ethical principles or values, possible criteria for dealing with ethical conflicts, and the concrete policy issues they are applied to. Results All frameworks aim to support public health professionals or policymakers. Most of them provide a set of values or principles that serve as a standard for evaluating policy. Most frameworks articulate both the positive ethical foundations for public health and ethical constraints or concerns. Some frameworks offer analytic tools for guiding the evaluative process. Procedural guidelines and concrete criteria for solving important ethical conflicts in the particular area of the prevention of overweight or obesity are mostly lacking. Conclusions Public health ethical frameworks may be supportive in the evaluation of overweight prevention programs or policy, but seem to lack practical guidance to address ethical conflicts in this particular area. PMID:20969761

  1. Invariant Tori in the Secular Motions of the Three-body Planetary Systems

    NASA Astrophysics Data System (ADS)

    Locatelli, Ugo; Giorgilli, Antonio

    We consider the applicability of the KAM theorem to a realistic three-body problem. In the framework of the dynamics averaged over the fast angles for the Sun-Jupiter-Saturn system, we can prove the perpetual stability of the orbit. The proof is based on semi-numerical algorithms requiring both explicit algebraic manipulations of series and analytical estimates. The proof is made rigorous by using interval arithmetic to control the numerical errors.
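    The interval-arithmetic step can be illustrated with a toy sketch (illustrative only; this `Interval` class is hypothetical, and a rigorous implementation of the kind used in such proofs must apply directed rounding, which plain Python floats do not):

```python
class Interval:
    """Toy interval type: propagates a lower/upper bound through arithmetic,
    so every exact result is guaranteed to lie inside the output interval."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product interval is bounded by the extreme endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __contains__(self, x):
        return self.lo <= x <= self.hi

a, b = Interval(1.0, 2.0), Interval(3.0, 4.0)
c = a * b  # every product x*y with x in a, y in b lies in c
```

    Evaluating a truncated series with such intervals instead of floats yields a rigorous enclosure of the result, which is how numerical computations can feed a mathematically rigorous stability proof.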

  2. On the dispersion relations for an inhomogeneous waveguide with attenuation

    NASA Astrophysics Data System (ADS)

    Vatul'yan, A. O.; Yurlov, V. O.

    2016-09-01

    Some general laws concerning the structure of dispersion relations for solid inhomogeneous waveguides with attenuation are studied. An approach based on the analysis of a first-order matrix differential equation is presented in the framework of the concept of complex moduli. Some laws concerning the structure of components of the dispersion set for a viscoelastic inhomogeneous cylindrical waveguide are studied analytically and numerically, and the asymptotics of components of the dispersion set are constructed for arbitrary inhomogeneity laws in the low-frequency region.

  3. Radioactivity and Environmental Security in the Oceans: New Research and Policy Priorities in the Arctic and North Atlantic

    DTIC Science & Technology

    1993-06-09

    within the framework of an update for the computer database "DiaNIK" which has been developed at the Vernadsky Institute of Geochemistry and Analytical...chemical thermodynamic data for minerals and mineral-forming substances. The structure of thermodynamic database "DiaNIK" is based on the principles...in the database . A substantial portion of the thermodynamic values recommended by "DiaNIK" experts for the substances in User Version 3.1 resulted from

  4. A visual analytic framework for data fusion in investigative intelligence

    NASA Astrophysics Data System (ADS)

    Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David

    2014-05-01

    Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework toward an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.

  5. Principles of Catholic Social Teaching, Critical Pedagogy, and the Theory of Intersectionality: An Integrated Framework to Examine the Roles of Social Status in the Formation of Catholic Teachers

    ERIC Educational Resources Information Center

    Eick, Caroline Marie; Ryan, Patrick A.

    2014-01-01

    This article discusses the relevance of an analytic framework that integrates principles of Catholic social teaching, critical pedagogy, and the theory of intersectionality to explain attitudes toward marginalized youth held by Catholic students preparing to become teachers. The framework emerges from five years of action research data collected…

  6. Collaborative visual analytics of radio surveys in the Big Data era

    NASA Astrophysics Data System (ADS)

    Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.

    2017-06-01

    Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered with a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform - allowing the research process to continue wherever you are.

  7. Competency Analytics Tool: Analyzing Curriculum Using Course Competencies

    ERIC Educational Resources Information Center

    Gottipati, Swapna; Shankararaman, Venky

    2018-01-01

    The applications of learning outcomes and competency frameworks have brought better clarity to engineering programs in many universities. Several frameworks have been proposed to integrate outcomes and competencies into course design, delivery and assessment. However, in many cases, competencies are course-specific and their overall impact on the…

  8. A Methodological Framework to Analyze Stakeholder Preferences and Propose Strategic Pathways for a Sustainable University

    ERIC Educational Resources Information Center

    Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda

    2016-01-01

    Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…

  9. Collaborative Rhetorical Structure: A Discourse Analysis Method for Analyzing Student Collaborative Inquiry via Computer Conferencing

    ERIC Educational Resources Information Center

    Kou, Xiaojing

    2011-01-01

    Various formats of online discussion have proven valuable for enhancing learning and collaboration in distance and blended learning contexts. However, despite their capacity to reveal essential processes in collaborative inquiry, current mainstream analytical frameworks, such as the cognitive presence framework (Garrison, Anderson, & Archer,…

  10. A Cognitive Framework for the Analysis of Online Chemistry Courses

    ERIC Educational Resources Information Center

    Evans, Karen L.; Leinhardt, Gaea

    2008-01-01

    Many students now are receiving instruction in online environments created by universities, museums, corporations, and even students. What features of a given online course contribute to its effectiveness? This paper addresses that query by proposing and applying an analytic framework to five online introductory chemistry courses. Introductory…

  11. Optimizing the Long-Term Retention of Skills: Structural and Analytic Approaches to Skill Maintenance

    DTIC Science & Technology

    1990-08-01

    evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed

  12. Managing Offshore Branch Campuses: An Analytical Framework for Institutional Strategies

    ERIC Educational Resources Information Center

    Shams, Farshid; Huisman, Jeroen

    2012-01-01

    The aim of this article is to develop a framework that encapsulates the key managerial complexities of running offshore branch campuses. In the transnational higher education (TNHE) literature, several managerial ramifications and impediments have been addressed by scholars and practitioners. However, the strands of the literature are highly…

  13. A Decision Analytic Approach to Exposure-Based Chemical Prioritization

    PubMed Central

    Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.

    2013-01-01

    The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
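    One way to sketch such a decision-analytic integration is a weighted sum over normalised criteria (a generic multi-criteria scheme for illustration; the actual ExpoCast criteria, weights, and aggregation may differ, and `prioritize` is a hypothetical helper):

```python
import numpy as np

def prioritize(scores, weights):
    """Rank alternatives by a weighted sum of min-max normalised criteria.

    scores:  (n_chemicals, n_criteria) array; higher = more exposure concern.
    weights: (n_criteria,) elicited importance weights, summing to 1.
    Returns chemical indices ordered from highest to lowest priority.
    """
    s = np.asarray(scores, dtype=float)
    lo, hi = s.min(axis=0), s.max(axis=0)
    # Min-max normalise each criterion to [0, 1]; guard constant columns.
    norm = (s - lo) / np.where(hi > lo, hi - lo, 1.0)
    total = norm @ np.asarray(weights, dtype=float)
    return np.argsort(total)[::-1]
```

    The weights are where expert judgment enters the otherwise mechanistic scoring, matching the abstract's point about integrating statistical data with elicited expertise.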

  14. MetaKTSP: a meta-analytic top scoring pair method for robust cross-study validation of omics prediction analysis.

    PubMed

    Kim, SungHwan; Lin, Chien-Wei; Tseng, George C

    2016-07-01

    Supervised machine learning is widely applied to transcriptomic data to predict disease diagnosis, prognosis or survival. Robust and interpretable classifiers with high accuracy are usually favored for their clinical and translational potential. The top scoring pair (TSP) algorithm is an example that applies a simple rank-based procedure to identify rank-altered gene pairs for classifier construction. Although many classification methods perform well in cross-validation of a single expression profile, performance usually degrades considerably in cross-study validation (i.e. the prediction model is established in the training study and applied to an independent test study) for all machine learning methods, including TSP. This failure of cross-study validation has largely diminished the potential translational and clinical value of the models. The purpose of this article is to develop a meta-analytic top scoring pair (MetaKTSP) framework that combines multiple transcriptomic studies and generates a robust prediction model applicable to independent test studies. We proposed two frameworks, averaging TSP scores or combining P-values from individual studies, to select the top gene pairs for model construction. We applied the proposed methods to simulated data sets and three large-scale real applications in breast cancer, idiopathic pulmonary fibrosis and pan-cancer methylation. The results showed superior cross-study validation accuracy and biomarker selection for the new meta-analytic framework. In conclusion, combining multiple omics data sets in the public domain increases the robustness and accuracy of the classification model, which will ultimately improve disease understanding and clinical treatment decisions to benefit patients. An R package, MetaKTSP, is available online (http://tsenglab.biostat.pitt.edu/software.htm). Supplementary data are available at Bioinformatics online.
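    The rank-based TSP statistic at the core of this framework is simple to compute. A minimal sketch, assuming two-class labels and a samples-by-genes expression matrix (function names are illustrative, not the MetaKTSP API):

```python
import numpy as np

def tsp_score(X, y, i, j):
    """Top scoring pair statistic for genes i and j:
    |P(X_i < X_j | class 0) - P(X_i < X_j | class 1)|,
    estimated from sample frequencies. Being rank-based, it is
    invariant to monotone normalization of each sample."""
    lt = X[:, i] < X[:, j]
    p0 = lt[y == 0].mean()
    p1 = lt[y == 1].mean()
    return abs(p0 - p1)

def mean_tsp_score(studies, i, j):
    """One of the two aggregation schemes described above:
    average the TSP score of pair (i, j) across studies."""
    return np.mean([tsp_score(X, y, i, j) for X, y in studies])
```

    Pairs with the largest aggregated scores would then be retained for the classifier, mirroring the score-averaging variant described in the abstract.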

  15. Understanding selective molecular recognition in integrated carbon nanotube-polymer sensors by simulating physical analyte binding on carbon nanotube-polymer scaffolds.

    PubMed

    Lin, Shangchao; Zhang, Jingqing; Strano, Michael S; Blankschtein, Daniel

    2014-08-28

    Macromolecular scaffolds made of polymer-wrapped single-walled carbon nanotubes (SWCNTs) have been explored recently (Zhang et al., Nature Nanotechnology, 2013) as a new class of molecular-recognition motifs. However, selective analyte recognition is still challenging and lacks the underlying fundamental understanding needed for its practical implementation in biological sensors. In this report, we combine coarse-grained molecular dynamics (CGMD) simulations, physical adsorption/binding theories, and photoluminescence (PL) experiments to provide molecular insight into the selectivity of such sensors towards a large set of biologically important analytes. We find that the physical binding affinities of the analytes on a bare SWCNT partially correlate with their distribution coefficients in a bulk water/octanol system, suggesting that the analyte hydrophobicity plays a key role in determining the binding affinities of the analytes considered, along with the various specific interactions between the analytes and the polymer anchor groups. Two distinct categories of analytes are identified to demonstrate a complex picture for the correlation between optical sensor signals and the simulated binding affinities. Specifically, a good correlation was found between the sensor signals and the physical binding affinities of the three hormones (estradiol, melatonin, and thyroxine), the neurotransmitter (dopamine), and the vitamin (riboflavin) to the SWCNT-polymer scaffold. The four amino acids (aspartate, glycine, histidine, and tryptophan) and the two monosaccharides (fructose and glucose) considered were identified as blank analytes which are unable to induce sensor signals. The results indicate great success of our physical adsorption-based model in explaining the ranking in sensor selectivities. The combined framework presented here can be used to screen and select polymers that can potentially be used for creating synthetic molecular recognition motifs.

  16. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it a control strategy can be set.

  17. BIRCH: a user-oriented, locally-customizable, bioinformatics system.

    PubMed

    Fristensky, Brian

    2007-02-09

    Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.

  19. Educational Change in Post-Conflict Contexts: Reflections on the South African Experience 20 Years Later

    ERIC Educational Resources Information Center

    Christie, Pam

    2016-01-01

    Reflecting on South African experience, this paper develops an analytical framework using the work of Henri Lefebvre and Nancy Fraser to understand why socially just arrangements may be so difficult to achieve in post-conflict reconstruction. The paper uses Lefebvre's analytic to trace three sets of entangled practices…

  20. Triple Helix Systems: An Analytical Framework for Innovation Policy and Practice in the Knowledge Society

    ERIC Educational Resources Information Center

    Ranga, Marina; Etzkowitz, Henry

    2013-01-01

    This paper introduces the concept of Triple Helix systems as an analytical construct that synthesizes the key features of university--industry--government (Triple Helix) interactions into an "innovation system" format, defined according to systems theory as a set of components, relationships and functions. Among the components of Triple…

  1. Methods for Integrating Moderation and Mediation: A General Analytical Framework Using Moderated Path Analysis

    ERIC Educational Resources Information Center

    Edwards, Jeffrey R.; Lambert, Lisa Schurer

    2007-01-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated…
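    The core quantity in moderated path analysis is an indirect effect whose path coefficients depend on the moderator. A hedged sketch using ordinary least squares, restricted to first-stage moderation with the standard conditional indirect effect (a1 + a3*W)*b1 (variable roles and this single-path setup are assumptions, not necessarily the article's exact model):

```python
import numpy as np

def conditional_indirect_effect(x, w, m, y_out, w_value):
    """Fit  M = a0 + a1*X + a2*W + a3*X*W  and  Y = b0 + b1*M + b2*X
    by least squares, then return the indirect effect of X on Y
    through M at moderator value w_value: (a1 + a3*w_value) * b1."""
    n = len(x)
    Xm = np.column_stack([np.ones(n), x, w, x * w])
    a = np.linalg.lstsq(Xm, m, rcond=None)[0]
    Xy = np.column_stack([np.ones(n), m, x])
    b = np.linalg.lstsq(Xy, y_out, rcond=None)[0]
    return (a[1] + a[3] * w_value) * b[1]
```

    In practice the sampling distribution of this product term is usually obtained by bootstrapping rather than from a normal-theory standard error.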

  2. Learning Analytics as a Counterpart to Surveys of Student Experience

    ERIC Educational Resources Information Center

    Borden, Victor M. H.; Coates, Hamish

    2017-01-01

    Analytics derived from the student learning environment provide new insights into the collegiate experience; they can be used as a supplement to or, to some extent, in place of traditional surveys. To serve this purpose, however, greater attention must be paid to conceptual frameworks and to advancing institutional systems, activating new…

  3. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  4. Applying Learning Analytics for the Early Prediction of Students' Academic Performance in Blended Learning

    ERIC Educational Resources Information Center

    Lu, Owen H. T.; Huang, Anna Y. Q.; Huang, Jeff C. H.; Lin, Albert J. Q.; Ogata, Hiroaki; Yang, Stephen J. H.

    2018-01-01

    Blended learning combines online digital resources with traditional classroom activities and enables students to attain higher learning performance through well-defined interactive strategies involving online and traditional learning activities. Learning analytics is a conceptual framework and is a part of our Precision education used to analyze…

  5. A Complete Validated Learning Analytics Framework: Designing Issues from Data Preparation Perspective

    ERIC Educational Resources Information Center

    Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing

    2018-01-01

    With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…

  6. Translating the Theoretical into Practical: A Logical Framework of Functional Analytic Psychotherapy Interactions for Research, Training, and Clinical Purposes

    ERIC Educational Resources Information Center

    Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.

    2012-01-01

    Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…

  7. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component of the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.

  8. Pattern matching through Chaos Game Representation: bridging numerical and discrete data structures for biological sequence analysis

    PubMed Central

    2012-01-01

    Background Chaos Game Representation (CGR) is an iterated function that bijectively maps discrete sequences into a continuous domain. As a result, discrete sequences can be the object of statistical and topological analyses otherwise reserved to numerical systems. Characteristically, CGR coordinates of substrings sharing an L-long suffix will be located within 2^-L distance of each other. In the two decades since its original proposal, CGR has been generalized beyond its original focus on genomic sequences and has been successfully applied to a wide range of problems in bioinformatics. This report explores the possibility that it can be further extended to approach algorithms that rely on discrete, graph-based representations. Results The exploratory analysis described here consisted of selecting foundational string problems and refactoring them using CGR-based algorithms. We found that CGR can take the role of suffix trees and emulate sophisticated string algorithms, efficiently solving exact and approximate string matching problems such as finding all palindromes and tandem repeats, and matching with mismatches. The common feature of these problems is that they use longest common extension (LCE) queries as subtasks of their procedures, which we show to have a constant time solution with CGR. Additionally, we show that CGR can be used as a rolling hash function within the Rabin-Karp algorithm. Conclusions The analysis of biological sequences relies on algorithmic foundations facing mounting challenges, both logistic (performance) and analytical (lack of a unifying mathematical framework). CGR is found to provide the latter and to promise the former: graph-based data structures for sequence analysis operations are entailed by numerical-based data structures produced by CGR maps, providing a unifying analytical framework for a diversity of pattern matching problems. PMID:22551152
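    The CGR map itself is a one-line contraction. A minimal sketch for DNA sequences, using the conventional corner assignment (the paper's generalizations go well beyond this):

```python
def cgr_point(seq):
    """Chaos Game Representation: starting from the unit-square
    centre, iterate x_k = (x_{k-1} + corner(s_k)) / 2 over the
    sequence. Sequences sharing an L-long suffix land within 2^-L
    of each other in each coordinate, the contraction property
    that underlies the suffix-tree emulation described above."""
    corners = {'A': (0.0, 0.0), 'C': (0.0, 1.0),
               'G': (1.0, 1.0), 'T': (1.0, 0.0)}
    x, y = 0.5, 0.5
    for ch in seq:
        cx, cy = corners[ch]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
    return x, y
```

    Because each symbol halves the distance between any two trajectories, proximity of final CGR points directly encodes shared suffixes.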

  9. Infinite slope stability under steady unsaturated seepage conditions

    USGS Publications Warehouse

    Lu, Ning; Godt, Jonathan W.

    2008-01-01

    We present a generalized framework for the stability of infinite slopes under steady unsaturated seepage conditions. The analytical framework allows the water table to be located at any depth below the ground surface and variation of soil suction and moisture content above the water table under steady infiltration conditions. The framework also explicitly considers the effect of weathering and porosity increase near the ground surface on changes in the friction angle of the soil. The factor of safety is conceptualized as a function of the depth within the vadose zone and can be reduced to the classical analytical solution for subaerial infinite slopes in the saturated zone. Slope stability analyses with hypothetical sandy and silty soils are conducted to illustrate the effectiveness of the framework. These analyses indicate that for hillslopes of both sandy and silty soils, failure can occur above the water table under steady infiltration conditions, which is consistent with some field observations that cannot be predicted by the classical infinite slope theory. A case study of shallow slope failures of sandy colluvium on steep coastal hillslopes near Seattle, Washington, is presented to examine the predictive utility of the proposed framework.
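    The classical subaerial limit that the generalized framework reduces to has a compact closed form. A sketch of that baseline only (the paper's unsaturated extension adds suction-stress and weathering terms not reproduced here; parameter names are illustrative):

```python
import math

def fs_infinite_slope(c, phi_deg, beta_deg, gamma, z):
    """Classical infinite-slope factor of safety:
    FS = tan(phi)/tan(beta) + c / (gamma * z * sin(beta) * cos(beta))
    c: effective cohesion [kPa], phi_deg: friction angle [deg],
    beta_deg: slope angle [deg], gamma: unit weight [kN/m^3],
    z: vertical depth of the slip surface [m]."""
    phi = math.radians(phi_deg)
    beta = math.radians(beta_deg)
    return (math.tan(phi) / math.tan(beta)
            + c / (gamma * z * math.sin(beta) * math.cos(beta)))
```

    For a cohesionless soil this reduces to tan(phi)/tan(beta), so failure (FS = 1) occurs when the slope angle reaches the friction angle; the cohesion term explains why the factor of safety decreases with depth.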

  10. Structural controllability of unidirectional bipartite networks

    NASA Astrophysics Data System (ADS)

    Nacher, Jose C.; Akutsu, Tatsuya

    2013-04-01

    The interactions between fundamental life molecules, people and social organisations build complex architectures that often result in undesired behaviours. Despite all of the advances made in our understanding of network structures over the past decade, similar progress has not been achieved in the controllability of real-world networks. In particular, an analytical framework to address the controllability of bipartite networks is still absent. Here, we present a dominating set (DS)-based approach to bipartite network controllability that identifies the topologies that are relatively easy to control with the minimum number of driver nodes. Our theoretical calculations, assisted by computer simulations and an evaluation of real-world networks, offer a promising framework for controlling unidirectional bipartite networks. Our analysis should also open a new avenue for reverting undesired behaviours in unidirectional bipartite networks at will.
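    The dominating set idea can be made concrete with a standard greedy heuristic: every node must either be a driver (in the DS) or be adjacent to one. A sketch on a plain undirected adjacency view (greedy gives only a logarithmic-factor approximation of the true minimum DS, and this toy ignores the unidirectional bipartite specifics of the paper):

```python
def greedy_dominating_set(adj):
    """Greedy approximation of a minimum dominating set.
    adj: dict mapping node -> set of neighbours. In the DS-based
    controllability picture, the returned nodes play the role of
    driver nodes from which every node is reachable in one hop."""
    undominated = set(adj)
    ds = []
    while undominated:
        # pick the node covering the most not-yet-dominated nodes
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        ds.append(best)
        undominated -= {best} | adj[best]
    return ds
```

    On a star graph the greedy choice immediately selects the hub, matching the intuition that high-degree nodes make networks cheap to control under this criterion.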

  11. Origin of the sensitivity in modeling the glide behaviour of dislocations

    DOE PAGES

    Pei, Zongrui; Stocks, George Malcolm

    2018-03-26

    The sensitivity in predicting glide behaviour of dislocations has been a long-standing problem in the framework of the Peierls-Nabarro model: the predictions of both the model itself and the analytic formulas based on it are too sensitive to the input parameters. In order to reveal the origin of this important problem in materials science, a new empirical-parameter-free formulation is proposed in the same framework. Unlike previous formulations, it includes only a small set of parameters, all of which can be determined by convergence tests. Under special conditions the new formulation reduces to its classic counterpart. In the light of this formulation, new relationships between Peierls stresses and the input parameters are identified, where the sensitivity is greatly reduced or even removed.

  12. Sound Processing Features for Speaker-Dependent and Phrase-Independent Emotion Recognition in Berlin Database

    NASA Astrophysics Data System (ADS)

    Anagnostopoulos, Christos Nikolaos; Vovoli, Eftichia

    An emotion recognition framework based on sound processing could improve services in human-computer interaction. Various quantitative speech features obtained from sound processing of acted speech were tested to determine whether they are sufficient to discriminate between seven emotions. Multilayered perceptrons were trained to classify gender and emotions on the basis of a 24-input vector, which provides information about the prosody of the speaker over the entire sentence using statistics of sound features. Several experiments were performed and the results were presented analytically. Emotion recognition was successful when speakers and utterances were “known” to the classifier. However, severe misclassifications occurred in the utterance-independent framework. Nevertheless, the proposed feature vector achieved promising results for utterance-independent recognition of high- and low-arousal emotions.
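    A sentence-level statistics vector of the kind described above can be sketched as follows; the particular statistics and the three feature streams chosen here (pitch, energy, durations) are assumptions for illustration, not the paper's exact 24 features:

```python
import numpy as np

def prosody_feature_vector(f0, energy, durations):
    """Build a 24-dimensional prosody vector: eight summary
    statistics for each of three frame-level streams (pitch f0,
    energy, segment durations). Such fixed-length vectors are what
    a multilayer perceptron classifier would consume."""
    def stats(x):
        x = np.asarray(x, dtype=float)
        return [x.mean(), x.std(), x.min(), x.max(),
                x.max() - x.min(), np.median(x),
                np.percentile(x, 25), np.percentile(x, 75)]
    return np.array(stats(f0) + stats(energy) + stats(durations))
```

    Collapsing variable-length utterances to fixed-length statistics is what makes whole-sentence prosody usable as input to a standard feed-forward classifier.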

  13. Encapsulation of Hemin in Metal-Organic Frameworks for Catalyzing the Chemiluminescence Reaction of the H2O2-Luminol System and Detecting Glucose in the Neutral Condition.

    PubMed

    Luo, Fenqiang; Lin, Yaolin; Zheng, Liyan; Lin, Xiaomei; Chi, Yuwu

    2015-06-03

    Novel metal-organic framework (MOF)-based solid catalysts have been synthesized by encapsulating hemin into the HKUST-1 MOF material, and are applied here for the first time in the chemiluminescence field, with outstanding performance. The functionalized MOFs not only retain the excellent catalytic activity inherited from hemin but can also be reused over many cycles as solid peroxidase mimics under neutral conditions. The synthesized Hemin@HKUST-1 composites have been used to develop practical sensors for H2O2 and glucose with wide response ranges and low detection limits. It is envisioned that catalyst-functionalized MOFs for chemiluminescence sensing will have promising applications in green, selective, and sensitive detection of target analytes.

  14. TB in Vulnerable Populations

    PubMed Central

    Ugarte-Gil, César; Caro, Godofredo; Aylas, Rula; Castro, César; Lema, Claudia

    2016-01-01

    This article analyzes the factors associated with vulnerability of the Ashaninka, the most populous indigenous Peruvian Amazonian people, to tuberculosis (TB). By applying a human rights-based analytical framework that assesses public policy against human rights standards and principles, and by offering a step-by-step framework for a full assessment of compliance, it provides evidence of the relationship between the incidence of TB among the Ashaninka and Peru’s poor level of compliance with its human rights obligations. The article argues that one of the main reasons for the historical vulnerability of the Ashaninka to diseases such as TB is a lack of political will on the part of the national government to increase public health spending, ensure that resources reach the most vulnerable population, and adopt and invest in a culturally appropriate health system. PMID:27780999

  15. Revisiting the use of 'place' as an analytic tool for elucidating geographic issues central to Canadian rural palliative care.

    PubMed

    Giesbrecht, Melissa; Crooks, Valorie A; Castleden, Heather; Schuurman, Nadine; Skinner, Mark W; Williams, Allison M

    2016-09-01

    In 2010, Castleden and colleagues published a paper in this journal using the concept of 'place' as an analytic tool to understand the nature of palliative care provision in a rural region in British Columbia, Canada. This publication was based upon pilot data collected for a larger research project that has since been completed. With the addition of 40 semi-structured interviews with users and providers of palliative care in four other rural communities located across Canada, we revisit Castleden and colleagues' (2010) original framework. Applying the concept of place to the full dataset confirmed the previously published findings, but also revealed two new place-based dimensions related to experiences of rural palliative care in Canada: (1) borders and boundaries; and (2) 'making' place for palliative care progress. These new findings offer a refined understanding of the complex interconnections between various dimensions of place and palliative care in rural Canada.

  16. Convergence in full motion video processing, exploitation, and dissemination and activity based intelligence

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Lewis, Gina

    2012-06-01

    Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.

  17. The Impact of the Mode of Thought in Complex Decisions: Intuitive Decisions are Better

    PubMed Central

    Usher, Marius; Russo, Zohar; Weyers, Mark; Brauner, Ran; Zakay, Dan

    2011-01-01

    A number of recent studies have reported that decision quality is enhanced under conditions of inattention or distraction (unconscious thought; Dijksterhuis, 2004; Dijksterhuis and Nordgren, 2006; Dijksterhuis et al., 2006). These reports have generated considerable controversy, for both experimental (problems of replication) and theoretical reasons (interpretation). Here we report the results of four experiments. The first experiment replicates the unconscious thought effect, under conditions that validate and control the subjective criterion of decision quality. The second and third experiments examine the impact of a mode of thought manipulation (without distraction) on decision quality in immediate decisions. Here we find that intuitive or affective manipulations improve decision quality compared to analytic/deliberation manipulations. The fourth experiment combines the two methods (distraction and mode of thought manipulations) and demonstrates enhanced decision quality, in a situation that attempts to preserve ecological validity. The results are interpreted within a framework that is based on two interacting subsystems of decision-making: an affective/intuition based system and an analytic/deliberation system. PMID:21716605

  18. The rise of environmental analytical chemistry as an interdisciplinary activity.

    PubMed

    Brown, Richard

    2009-07-01

    Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research in which value is added by the close cooperation of different disciplines. This editorial discusses the rise of environmental analytical chemistry as an interdisciplinary activity, outlines the scope of the Analytical Chemistry and Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and considers the appropriateness of TSWJ's domain format for covering interdisciplinary research. Contributions of new data, methods, case studies, and instrumentation, or new interpretations and developments of existing work, relating to analytical and/or environmental chemistry are welcome in the Analytical and Environmental Chemistry domains and will be considered equally.

  19. Reflectance from images: a model-based approach for human faces.

    PubMed

    Fuchs, Martin; Blanz, Volker; Lensch, Hendrik; Seidel, Hans-Peter

    2005-01-01

    In this paper, we present an image-based framework that acquires the reflectance properties of a human face. A range scan of the face is not required. Based on a morphable face model, the system estimates the 3D shape and establishes point-to-point correspondence across images taken from different viewpoints and across different individuals' faces. This provides a common parameterization of all reconstructed surfaces that can be used to compare and transfer BRDF data between different faces. Shape estimation from images compensates for deformations of the face during the measurement process, such as facial expressions. In the common parameterization, regions of homogeneous materials on the face surface can be defined a priori. We apply analytical BRDF models to express the reflectance properties of each region and we estimate their parameters in a least-squares fit from the image data. For each of the surface points, the diffuse component of the BRDF is locally refined, which provides high detail. We present results for multiple analytical BRDF models, rendered at novel orientations and lighting conditions.
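    The per-region least-squares fit has a particularly simple closed form when restricted to the locally refined diffuse component. A sketch of that reduction only (the paper fits full multi-lobe analytical BRDF models; the Lambertian-only model and function name here are simplifying assumptions):

```python
import numpy as np

def fit_diffuse_albedo(normals, light_dirs, intensities):
    """Least-squares fit of the diffuse (Lambertian) coefficient k_d
    in the model I = k_d * max(n . l, 0), given per-observation unit
    normals, unit light directions, and measured intensities.
    The closed form is k_d = (s . I) / (s . s) with s = max(n.l, 0)."""
    s = np.maximum(np.einsum('ij,ij->i', normals, light_dirs), 0.0)
    return float(s @ intensities / (s @ s))
```

    In the full pipeline this fit would be run per material region, with the specular lobes estimated jointly rather than ignored as here.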

  20. Galaxy Formation At Extreme Redshifts: Semi-Analytic Model Predictions And Challenges For Observations

    NASA Astrophysics Data System (ADS)

    Yung, L. Y. Aaron; Somerville, Rachel S.

    2017-06-01

    The well-established Santa Cruz semi-analytic galaxy formation framework has been shown to be quite successful at explaining observations of the local Universe, as well as making predictions for low-redshift observations. Recently, metallicity-based gas partitioning and H2-based star formation recipes have been implemented in our model, replacing the legacy cold-gas based recipe. We then use our revised model to explore the high-redshift Universe and make predictions up to z = 15. Although our model is only calibrated to observations from the local universe, our predictions agree remarkably well with mid- to high-redshift observational constraints available to date, including rest-frame UV luminosity functions and the reionization history as constrained by CMB and IGM observations. We provide predictions for individual and statistical galaxy properties at a wide range of redshifts (z = 4 - 15), including objects that are too distant or too faint to be detected with current facilities. Using our model predictions, we also provide forecasted luminosity functions and other observables for upcoming studies with JWST.

  1. A mathematical description of the inclusive fitness theory.

    PubMed

    Wakano, Joe Yuichiro; Ohtsuki, Hisashi; Kobayashi, Yutaka

    2013-03-01

    Recent developments in the inclusive fitness theory have revealed that the direction of evolution can be analytically predicted in a wider class of models than previously thought, such as those models dealing with network structure. This paper aims to provide a mathematical description of the inclusive fitness theory. Specifically, we provide a general framework based on a Markov chain that can implement basic models of inclusive fitness. Our framework is based on the probability distribution of "offspring-to-parent map", from which the key concepts of the theory, such as fitness function, relatedness and inclusive fitness, are derived in a straightforward manner. We prove theorems showing that inclusive fitness always provides a correct prediction on which of two competing genes more frequently appears in the long run in the Markov chain. As an application of the theorems, we prove a general formula of the optimal dispersal rate in the Wright's island model with recurrent mutations. We also show the existence of the critical mutation rate, which does not depend on the number of islands and below which a positive dispersal rate evolves. Our framework can also be applied to lattice or network structured populations.

  2. Examining the Value of a Scaffolded Critique Framework to Promote Argumentative and Explanatory Writings Within an Argument-Based Inquiry Approach

    NASA Astrophysics Data System (ADS)

    Jang, Jeong-yoon; Hand, Brian

    2017-12-01

    This study investigated the value of using a scaffolded critique framework to promote two different types of writing—argumentative writing and explanatory writing—with different purposes within an argument-based inquiry approach known as the Science Writing Heuristic (SWH) approach. A quasi-experimental design with sixth and seventh grade students taught by two teachers was used. A total of 170 students participated in the study, with 87 in the control group (four classes) and 83 in the treatment group (four classes). All students used the SWH templates, a form of argumentative writing, to guide their written work, and completed these templates during the SWH investigations of each unit. After completing the SWH investigations, both groups of students were asked to complete a summary writing task, a form of explanatory writing, at the end of each unit. All students' writing samples were scored using analytical frameworks developed for the study. The results indicated that the treatment group performed significantly better on the explanatory writing task than the control group. In addition, the results of the partial correlation suggested a strong, statistically significant positive relationship between argumentative writing and explanatory writing.

  3. Modelling the protocol stack in NCS with deterministic and stochastic petri net

    NASA Astrophysics Data System (ADS)

    Hui, Chen; Chunjie, Zhou; Weifeng, Zhu

    2011-06-01

    The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and improved system performance. Field testing is currently unrealistic for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capability. Traditionally, separation between the different layers of the OSI model has been common practice. Nevertheless, such a layered modelling and analysis framework precludes global optimisation of protocol reconfiguration. In this article, we propose a general modelling and analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time constraint, task interrelation, processor and bus sub-models from the upper and lower layers (application, data link and physical layers). Cross-layer design helps to overcome the inadequacy of global optimisation by sharing information between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.

  4. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, which will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide the capabilities needed for better decision support by utilizing high performance computing (HPC) and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  5. Conceptual framework for outcomes research studies of hepatitis C: an analytical review

    PubMed Central

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473

  6. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    The objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast into-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis data will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  7. The role of data fusion in predictive maintenance using digital twin

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih

    2018-04-01

    The modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend useful life cycles and reduce life cycle costs. Multiphysics modeling together with data-driven analytics generates a new paradigm called the "Digital Twin." The digital twin is a living model of the physical asset or system, which continually adapts to operational changes based on collected online data and information, and can forecast the future of its physical counterpart. This paper reviews the overall framework for developing a digital twin coupled with industrial Internet of Things technology to advance the autonomy of aerospace platforms. Data fusion techniques play a particularly significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.

  8. Automated robot-assisted surgical skill evaluation: Predictive analytics approach.

    PubMed

    Fard, Mahtab J; Ameri, Sattar; Darin Ellis, R; Chinnam, Ratna B; Pandya, Abhilash K; Klein, Michael D

    2018-02-01

    Surgical skill assessment has predominantly been a subjective task. Recently, technological advances such as robot-assisted surgery have created great opportunities for objective surgical evaluation. In this paper, we introduce a predictive framework for objective skill assessment based on movement trajectory data. Our aim is to build a classification framework to automatically evaluate the performance of surgeons with different levels of expertise. Eight global movement features are extracted from movement trajectory data captured by a da Vinci robot for surgeons with two levels of expertise - novice and expert. Three classification methods - k-nearest neighbours, logistic regression and support vector machines - are applied. The result shows that the proposed framework can classify surgeons' expertise as novice or expert with an accuracy of 82.3% for knot tying and 89.9% for a suturing task. This study demonstrates and evaluates the ability of machine learning methods to automatically classify expert and novice surgeons using global movement features. Copyright © 2017 John Wiley & Sons, Ltd.
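
    The nearest-neighbour idea behind the classification framework can be sketched in a few lines. The two features, cluster centres, and labels below are illustrative assumptions, not the paper's actual eight-feature da Vinci data:

```python
import math
import random

random.seed(0)

# Hypothetical global movement features per surgical trial. The paper extracts
# eight such features from trajectory data; for brevity we simulate two
# (say, path length and completion time) with assumed cluster centres.
def make_trials(n, centre, sigma, label):
    return [([random.gauss(m, sigma) for m in centre], label) for _ in range(n)]

train = (make_trials(20, [1.0, 30.0], 0.5, "expert") +
         make_trials(20, [2.0, 60.0], 0.5, "novice"))

def knn_predict(features, train, k=3):
    """Classify a trial by majority vote among its k nearest neighbours."""
    nearest = sorted(train, key=lambda t: math.dist(features, t[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

print(knn_predict([1.1, 32.0], train))  # expert-like movement profile
print(knn_predict([2.1, 58.0], train))  # novice-like movement profile
```

    In practice the features would be standardized and the classifier cross-validated, as the study does when comparing k-nearest neighbours, logistic regression and support vector machines.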

  9. TDS exposure project: application of the analytic hierarchy process for the prioritization of substances to be analyzed in a total diet study.

    PubMed

    Papadopoulos, A; Sioen, I; Cubadda, F; Ozer, H; Basegmez, H I Oktay; Turrini, A; Lopez Esteban, M T; Fernandez San Juan, P M; Sokolić-Mihalak, D; Jurkovic, M; De Henauw, S; Aureli, F; Vin, K; Sirot, V

    2015-02-01

    The objective of this article is to develop a general method based on the analytic hierarchy process (AHP) methodology to rank the substances to be studied in a Total Diet Study (TDS). The method was tested for different substances and groups of substances (N = 113) for which the TDS approach has been considered relevant. This work was performed by a group of seven experts from different European countries, representing institutes involved in the TDS EXPOSURE project. The AHP methodology is based on a score system that takes into account experts' judgments, quantified by assigning comparative scores to the different identified issues. The ten substances of highest interest in the framework of a TDS are trace elements (methylmercury, cadmium, inorganic arsenic, lead, aluminum, inorganic mercury), dioxins, furans and polychlorinated biphenyls (PCBs), and some additives (sulfites and nitrites). The priority list depends on both the national situation (geographical variations, consumer concern, etc.) and the availability of data. Thus, the list depends on the objectives of the TDS and on achievable analytical performance. Moreover, such a list is highly variable with time and new data (e.g. social context, vulnerable population groups, emerging substances, new toxicological data or health-based guidance values). Copyright © 2014 Elsevier Ltd. All rights reserved.
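
    The core AHP calculation, turning a reciprocal pairwise-comparison matrix of expert judgments into priority weights, can be sketched as follows. The matrix values and the three-substance subset are hypothetical, not the study's actual judgments:

```python
import math

# Hypothetical Saaty-scale pairwise comparison matrix for three candidate
# substances (illustrative values, not the study's actual expert judgments):
# A[i][j] encodes how strongly substance i outranks substance j.
substances = ["methylmercury", "cadmium", "sulfites"]
A = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

def ahp_weights(matrix):
    """Approximate the principal eigenvector via normalized row geometric means."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(A)
ranking = sorted(zip(substances, w), key=lambda pair: -pair[1])
for name, weight in ranking:
    print(f"{name}: {weight:.3f}")
```

    The row-geometric-mean approximation is a common shortcut to the principal eigenvector; a full AHP application would also compute a consistency ratio on the judgments.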

  10. Micromechanics Analysis Code (MAC) User Guide: Version 1.0

    NASA Technical Reports Server (NTRS)

    Wilt, T. E.; Arnold, S. M.

    1994-01-01

    The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide for the recently developed, computationally efficient and comprehensive micromechanics analysis code (MAC), whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model, is described. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control), and thermomechanical load histories can be imposed; (2) different integration algorithms may be selected; (3) a variety of constituent constitutive models may be utilized and/or implemented; and (4) a variety of fiber architectures may be easily accessed through their corresponding representative volume elements.

  11. Micromechanics Analysis Code (MAC). User Guide: Version 2.0

    NASA Technical Reports Server (NTRS)

    Wilt, T. E.; Arnold, S. M.

    1996-01-01

    The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide for the recently developed, computationally efficient and comprehensive micromechanics analysis code (MAC), whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model, is described. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control) and thermomechanical load histories can be imposed, (2) different integration algorithms may be selected, (3) a variety of constituent constitutive models may be utilized and/or implemented, and (4) a variety of fiber and laminate architectures may be easily accessed through their corresponding representative volume elements.

  12. Intuition: A Concept Analysis.

    PubMed

    Chilcote, Deborah R

    2017-01-01

    The purpose of this article is to conceptually examine intuition; identify the importance of intuition in nursing education, clinical practice, and patient care; encourage acceptance of the use of intuition; and add to the body of nursing knowledge. Nurses often report using intuition when making clinical decisions. Intuition is a rapid, unconscious process based in global knowledge that views the patient holistically while synthesizing information to improve patient outcomes. However, with the advent of evidence-based practice (EBP), the use of intuition has become undervalued in nursing. Walker and Avant's framework was used to analyze intuition. A literature search from 1987 to 2014 was conducted using the following keywords: intuition, intuition and nursing, clinical decision making, clinical decision making and intuition, patient outcomes, EBP, and analytical thinking. The use of intuition is reported by nurses, but is not legitimized within the nursing profession. Defining attributes of intuition are an unconscious, holistic knowledge gathered without using an analytical process and knowledge derived through synthesis, not analysis. Consequences include verification of intuition through an analytical process and translating that knowledge into a course of action. This article supports the use of intuition in nursing by offering clarity to the concept, adds to the nursing knowledge base, encourages a holistic view of the patient during clinical decision making, and encourages nurse educators to promote the use of intuition. © 2016 Wiley Periodicals, Inc.

  13. A theoretical framework for convergence and continuous dependence of estimates in inverse problems for distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Ito, K.

    1988-01-01

    Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.

  14. Technical Note for 8D Likelihood Effective Higgs Couplings Extraction Framework in the Golden Channel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yi; Di Marco, Emanuele; Lykken, Joe

    2014-10-17

    In this technical note we present details on various aspects of the framework introduced in arXiv:1401.2077, aimed at extracting effective Higgs couplings in the h → 4ℓ 'golden channel'. Since it is the primary feature of the framework, we focus in particular on the convolution integral that takes us from 'truth' level to 'detector' level, and on the numerical and analytic techniques used to obtain it. We also briefly discuss other aspects of the framework.

  15. Analytical optimization of demand management strategies across all urban water use sectors

    NASA Astrophysics Data System (ADS)

    Friedman, Kenneth; Heaney, James P.; Morales, Miguel; Palenchar, John

    2014-07-01

    An effective urban water demand management program can greatly influence both peak and average demand and therefore long-term water supply and infrastructure planning. Although a theoretical framework for evaluating residential indoor demand management has been well established, little has been done to evaluate other water use sectors, such as residential irrigation, in a compatible manner that allows integrating the results into an overall solution. This paper presents a systematic procedure for evaluating the optimal blend of single-family residential irrigation demand management strategies to achieve a specified goal, based on performance functions derived from parcel-level tax assessor's data linked to customer-level monthly water billing data. This framework is then generalized to apply to any urban water sector, as exponential functions can be fit to all resulting cumulative water savings functions. Two alternative formulations are presented: maximize net benefits, or minimize total costs subject to satisfying a target water savings. Explicit analytical solutions are presented for both formulations based on appropriate exponential best fits of the performance functions. A direct result of this solution is the dual variable, which represents the marginal cost of water saved at a specified target water savings goal. A case study of 16,303 single-family irrigators in the Gainesville Regional Utilities service area, utilizing high-quality tax assessor and monthly billing data along with parcel-level GIS data, provides an illustrative example of these techniques. Spatial clustering of targeted homes can easily be performed in GIS to identify priority demand management areas.
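
    The analytical structure described, exponential savings functions, a target-savings constraint, and a dual variable giving the marginal cost of saved water, can be sketched numerically. The strategy names, coefficients, and target below are illustrative assumptions, not the paper's fitted data:

```python
import math

# Hypothetical exponential performance functions, one per strategy:
# savings(x) = a * (1 - exp(-b * x)) units of water saved for x dollars spent.
# Names and coefficients are illustrative assumptions, not the paper's data.
strategies = {"irrigation audit": (500.0, 0.002), "rebate program": (300.0, 0.005)}

def spend_at_marginal_cost(a, b, lam):
    """Spending where marginal savings a*b*exp(-b*x) equal 1/lam (0 if never)."""
    return max(math.log(a * b * lam) / b, 0.0) if a * b * lam > 1 else 0.0

def total_savings(lam):
    return sum(a * (1 - math.exp(-b * spend_at_marginal_cost(a, b, lam)))
               for a, b in strategies.values())

def dual_variable(target):
    """Bisect on lam, the dual variable: the marginal cost of saved water."""
    lo, hi = 1e-6, 1e6
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if total_savings(mid) < target:
            lo = mid
        else:
            hi = mid
    return hi

lam = dual_variable(target=400.0)
# For these coefficients the exact optimum is lam = 700/400 = 1.75.
print(f"marginal cost at the 400-unit target: {lam:.2f} $ per unit saved")
```

    At the optimum every active strategy is funded until its marginal savings per dollar equal 1/lam, which is the analytical condition behind the paper's closed-form solutions.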

  16. Virtual machine-based simulation platform for mobile ad-hoc network-based cyber infrastructure

    DOE PAGES

    Yoginath, Srikanth B.; Perumalla, Kayla S.; Henz, Brian J.

    2015-09-29

    In modeling and simulating complex systems such as mobile ad-hoc networks (MANETs) in defense communications, it is a major challenge to reconcile multiple important considerations: the rapidity of unavoidable changes to the software (network layers and applications), the difficulty of modeling critical, implementation-dependent behavioral effects, the need to sustain larger scale scenarios, and the desire for faster simulations. Here we present our approach to successfully reconciling them using a virtual time-synchronized, virtual machine (VM)-based parallel execution framework that accurately lifts both the devices and the network communications to a virtual time plane while retaining full fidelity. At the core of our framework is a scheduling engine that operates at the level of a hypervisor scheduler, offering a unique ability to execute multi-core guest nodes over multi-core host nodes in an accurate, virtual time-synchronized manner. In contrast to other related approaches that suffer from either speed or accuracy issues, our framework provides MANET node-wise scalability, high fidelity of software behaviors, and time-ordering accuracy. The design and development of this framework is presented, and an actual implementation based on the widely used Xen hypervisor system is described. Benchmarks with synthetic and actual applications are used to identify the benefits of our approach. The time inaccuracy of traditional emulation methods is demonstrated, in comparison with the accurate execution of our framework, verified against the theoretically correct results expected from analytical models of the same scenarios. In the largest high-fidelity tests, we perform virtual time-synchronized simulation of 64-node VM-based full-stack, actual software behaviors of MANETs containing a mix of static and mobile (unmanned airborne vehicle) nodes, hosted on a 32-core host, with full fidelity of unmodified ad-hoc routing protocols, unmodified application executables, and user-controllable physical-layer effects including inter-device wireless signal strength, reachability, and connectivity.

  18. Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics

    PubMed Central

    Hey, Jody; Nielsen, Rasmus

    2007-01-01

    In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to their probability. Although not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231
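
    As a generic illustration of the Markov chain Monte Carlo machinery involved (not the authors' genealogy sampler), a minimal Metropolis sampler for a one-dimensional unnormalized posterior looks like this; the toy target density is an assumption chosen so the correct answer is known:

```python
import math
import random

random.seed(1)

# Generic Metropolis sampler sketch (illustrative only; not the authors'
# genealogy sampler): draw from a one-dimensional unnormalized posterior.
def unnorm_posterior(theta):
    return math.exp(-0.5 * theta * theta)  # toy target: standard normal

def metropolis(n_steps, step=1.0):
    theta, samples = 0.0, []
    for _ in range(n_steps):
        proposal = theta + random.gauss(0.0, step)
        # accept with probability min(1, ratio of posterior densities);
        # the unknown normalizing constant cancels in the ratio
        if random.random() < unnorm_posterior(proposal) / unnorm_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis(20000)
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(round(mean, 1), round(var, 1))  # should be near the target's 0 and 1
```

    The same accept/reject logic underlies sampling over genealogies; the paper's contribution is to integrate the remaining parameters out analytically so the chain yields an approximate joint posterior usable as a likelihood.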

  19. A framework for understanding waste management studies in construction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu Weisheng, E-mail: wilsonlu@hku.hk; Yuan Hongping, E-mail: hp.yuan@polyu.edu.hk

    2011-06-15

    During the past decades, construction and demolition (C&D) waste issues have received increasing attention from both practitioners and researchers around the world. A plethora of research relating to C&D waste management (WM) has been published in scholarly journals. However, a comprehensive understanding of C&D WM research is somehow absent in spite of this proliferation. The aim of this paper is to develop a framework that helps readers understand C&D WM research as archived in selected journals. Papers under the topic of C&D WM are retrieved based on a set of rigorous procedures. The information in these papers is then analyzed with the assistance of the Qualitative Social Research (QSR) software package NVivo. A framework for understanding C&D WM research is created based on the analytic results. By following the framework, a bibliometric analysis of research in C&D WM is presented, followed by an in-depth literature analysis. It is found that C&D waste generation, reduction, and recycling are the three major topics in the discipline of C&D WM. Future research is recommended to (a) investigate C&D waste issues in wider scopes including design, maintenance and demolition, (b) develop a unified measurement for waste generation so that WM performance can be compared across various economies, and (c) enhance the effectiveness of WM approaches (e.g. waste charging schemes) based on new WM concepts (e.g. Extended Producer Responsibility). In addition to these research findings, the approach for producing the research framework can serve as a useful reference for other studies that attempt to understand the research of a given discipline.

  20. A framework for understanding waste management studies in construction.

    PubMed

    Lu, Weisheng; Yuan, Hongping

    2011-06-01

    During the past decades, construction and demolition (C&D) waste issues have received increasing attention from both practitioners and researchers around the world. A plethora of research relating to C&D waste management (WM) has been published in scholarly journals. However, a comprehensive understanding of C&D WM research is somehow absent in spite of this proliferation. The aim of this paper is to develop a framework that helps readers understand C&D WM research as archived in selected journals. Papers under the topic of C&D WM are retrieved based on a set of rigorous procedures. The information in these papers is then analyzed with the assistance of the Qualitative Social Research (QSR) software package NVivo. A framework for understanding C&D WM research is created based on the analytic results. By following the framework, a bibliometric analysis of research in C&D WM is presented, followed by an in-depth literature analysis. It is found that C&D generation, reduction, and recycling are the three major topics in the discipline of C&D WM. Future research is recommended to (a) investigate C&D waste issues in wider scopes including design, maintenance and demolition, (b) develop a unified measurement for waste generation so that WM performance can be compared across various economies, and (c) enhance the effectiveness of WM approaches (e.g. waste charging schemes) based on new WM concepts (e.g. Extended Producer Responsibility). In addition to these research findings, the approach for producing the research framework can serve as a useful reference for other studies that attempt to understand the research of a given discipline. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Integrating Sediment Connectivity into Water Resources Management Through a Graph Theoretic, Stochastic Modeling Framework.

    NASA Astrophysics Data System (ADS)

    Schmitt, R. J. P.; Castelletti, A.; Bizzi, S.

    2014-12-01

    Understanding sediment transport processes at the river basin scale, their temporal spectra, and their spatial patterns is key to identifying and minimizing morphologic risks associated with channel adjustment processes. This work contributes a stochastic framework for modeling bed-load connectivity based on recent advances in the field (e.g., Bizzi & Lerner, 2013; Czuba & Foufoula-Georgiou, 2014). It presents river managers with novel indicators of reach-scale vulnerability to channel adjustment in large river networks with sparse hydrologic and sediment observations. The framework comprises four steps. First, based on a distributed hydrological model and remotely sensed information, the framework identifies a representative grain size class for each reach. Second, sediment residence time distributions are calculated for each reach in a Monte Carlo approach applying standard sediment transport equations driven by local hydraulic conditions. Third, a network analysis defines the up- and downstream connectivity for various travel times, resulting in characteristic up/downstream connectivity signatures for each reach; channel vulnerability indicators quantify the imbalance between up- and downstream connectivity for each travel time domain, representing the process-dependent latency of morphologic response. Last, based on the stochastic core of the model, a sensitivity analysis identifies drivers of change and major sources of uncertainty in order to target key detrimental processes and to guide effective gathering of additional data. The application, limitations, and integration into a decision analytic framework are demonstrated for a major part of the Red River Basin in Northern Vietnam (179,000 km²). Here, a plethora of anthropic alterations, ranging from large reservoir construction to land-use changes, results in major downstream deterioration and calls for concerted sediment management strategies to mitigate current and limit future morphologic alterations.

  2. An Analytical Framework for the Cross-Country Comparison of Higher Education Governance

    ERIC Educational Resources Information Center

    Dobbins, Michael; Knill, Christoph; Vogtle, Eva Maria

    2011-01-01

    In this article we provide an integrated framework for the analysis of higher education governance which allows us to more systematically trace the changes that European higher education systems are currently undergoing. We argue that, despite highly insightful previous analyses, there is a need for more specific empirically observable indicators…

  3. A Human Dimensions Framework: Guidelines for Conducting Social Assessments

    Treesearch

    Alan D. Bright; H. Ken Cordell; Anne P. Hoover; Michael A Tarrant

    2003-01-01

    This paper provides a framework and guidelines for identifying and organizing human dimension information for use in forest planning. It synthesizes concepts from a variety of social science disciplines and connects them with measurable indicators for use in analysis and reporting. Suggestions of analytical approaches and sources of data for employment of the...

  4. A Data Analytical Framework for Improving Real-Time, Decision Support Systems in Healthcare

    ERIC Educational Resources Information Center

    Yahav, Inbal

    2010-01-01

    In this dissertation we develop a framework that combines data mining, statistics and operations research methods for improving real-time decision support systems in healthcare. Our approach consists of three main concepts: data gathering and preprocessing, modeling, and deployment. We introduce the notion of offline and semi-offline modeling to…

  5. How Do Mathematicians Learn Math?: Resources and Acts for Constructing and Understanding Mathematics

    ERIC Educational Resources Information Center

    Wilkerson-Jerde, Michelle H.; Wilensky, Uri J.

    2011-01-01

    In this paper, we present an analytic framework for investigating expert mathematical learning as the process of building a "network of mathematical resources" by establishing relationships between different components and properties of mathematical ideas. We then use this framework to analyze the reasoning of ten mathematicians and mathematics…

  6. Focus for Area Development Analysis: Urban Orientation of Counties.

    ERIC Educational Resources Information Center

    Bluestone, Herman

    The orientation of counties to metropolitan systems and urban centers is identified by population density and percentage of urban population. This analytical framework differentiates 6 kinds of counties, ranging from most urban-oriented (group 1) to least urban-oriented (group 6). With this framework, it can be seen that the economic well-being of…

  7. Analyzing Educators' Online Interactions: A Framework of Online Learning Support Roles

    ERIC Educational Resources Information Center

    Nacu, Denise C.; Martin, Caitlin K.; Pinkard, Nichole; Gray, Tené

    2016-01-01

    While the potential benefits of participating in online learning communities are documented, so too are inequities in terms of how different populations access and use them. We present the online learning support roles (OLSR) framework, an approach using both automated analytics and qualitative interpretation to identify and explore online…

  8. Mind-Sets Matter: A Meta-Analytic Review of Implicit Theories and Self-Regulation

    ERIC Educational Resources Information Center

    Burnette, Jeni L.; O'Boyle, Ernest H.; VanEpps, Eric M.; Pollack, Jeffrey M.; Finkel, Eli J.

    2013-01-01

    This review builds on self-control theory (Carver & Scheier, 1998) to develop a theoretical framework for investigating associations of implicit theories with self-regulation. This framework conceptualizes self-regulation in terms of 3 crucial processes: goal setting, goal operating, and goal monitoring. In this meta-analysis, we included…

  9. Supporting Multidisciplinary Networks through Relationality and a Critical Sense of Belonging: Three "Gardening Tools" and the "Relational Agency Framework"

    ERIC Educational Resources Information Center

    Duhn, Iris; Fleer, Marilyn; Harrison, Linda

    2016-01-01

    This article focuses on the "Relational Agency Framework" (RAF), an analytical tool developed for an Australian review and evaluation study of an early years' policy initiative. We draw on Anne Edwards's concepts of "relational expertise", "building common knowledge" and "relational agency" to explore how…

  10. An Analytic Framework to Support E.Learning Strategy Development

    ERIC Educational Resources Information Center

    Marshall, Stephen J.

    2012-01-01

    Purpose: The purpose of this paper is to discuss and demonstrate the relevance of a new conceptual framework for leading and managing the development of learning and teaching to e.learning strategy development. Design/methodology/approach: After reviewing and discussing the research literature on e.learning in higher education institutions from…

  11. University Reform and Institutional Autonomy: A Framework for Analysing the Living Autonomy

    ERIC Educational Resources Information Center

    Maassen, Peter; Gornitzka, Åse; Fumasoli, Tatiana

    2017-01-01

    In this article we discuss recent university reforms aimed at enhancing university autonomy, highlighting various tensions in the underlying reform ideologies. We examine how the traditional interpretation of university autonomy has been expanded in the reform rationales. An analytical framework for studying how autonomy is interpreted and used…

  12. High School Students' Informal Reasoning on a Socio-Scientific Issue: Qualitative and Quantitative Analyses

    ERIC Educational Resources Information Center

    Wu, Ying-Tien; Tsai, Chin-Chung

    2007-01-01

    Recently, the significance of learners' informal reasoning on socio-scientific issues has received increasing attention among science educators. To gain deeper insights into this important issue, an integrated analytic framework was developed in this study. With this framework, 71 Grade 10 students' informal reasoning about nuclear energy usage…

  13. A Framework for Describing Mathematics Discourse in Instruction and Interpreting Differences in Teaching

    ERIC Educational Resources Information Center

    Adler, Jill; Ronda, Erlina

    2015-01-01

    We describe and use an analytical framework to document mathematics discourse in instruction (MDI), and interpret differences in mathematics teaching. MDI is characterised by four interacting components in the teaching of a mathematics lesson: exemplification (occurring through a sequence of examples and related tasks), explanatory talk (talk that…

  14. Tracking the debate around marine protected areas: key issues and the BEG framework.

    PubMed

    Thorpe, Andy; Bavinck, Maarten; Coulthard, Sarah

    2011-04-01

    Marine conservation is often criticized for a mono-disciplinary approach, which delivers fragmented solutions to complex problems with differing interpretations of success. As a means of reflecting on the breadth and range of scientific research on the management of the marine environment, this paper develops an analytical framework to gauge the foci of policy documents and published scientific work on Marine Protected Areas. We evaluate the extent to which MPA research articles delineate objectives around three domains: biological-ecological [B]; economic-social [E]; and governance-management [G]. This permits us to develop an analytic [BEG] framework which we then test on a sample of selected journal article cohorts. While the framework reveals the dominance of biologically focussed research [B], analysis also reveals a growing frequency of the use of governance/management terminology in the literature over the last 15 years, which may be indicative of a shift towards more integrated consideration of governance concerns. However, consideration of the economic/social domain appears to lag behind biological and governance concerns in both frequency and presence in MPA literature.
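
The domain-tagging idea behind the BEG framework can be sketched as a simple term tally. The keyword lists below are invented for illustration; the paper's actual coding scheme is not reproduced here.

```python
# Hypothetical keyword lists per BEG domain; the authors' real coding
# scheme is richer than this illustrative dictionary.
BEG_TERMS = {
    "B": {"species", "habitat", "biodiversity", "ecosystem", "fish"},
    "E": {"income", "livelihood", "cost", "employment", "tourism"},
    "G": {"governance", "enforcement", "stakeholder", "policy", "compliance"},
}

def beg_profile(text):
    """Count occurrences of each domain's terms in a document."""
    words = text.lower().split()
    return {d: sum(w.strip(".,;:()") in terms for w in words)
            for d, terms in BEG_TERMS.items()}

profile = beg_profile(
    "MPA governance, enforcement and stakeholder policy affect fish habitat."
)
dominant = max(profile, key=profile.get)  # domain with the highest tally
```

Applied across a cohort of abstracts, such tallies would reproduce the kind of domain-frequency trends the paper reports, e.g. the rising share of governance/management terminology.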

  15. Deterministic and fuzzy-based methods to evaluate community resilience

    NASA Astrophysics Data System (ADS)

    Kammouh, Omar; Noori, Ali Zamani; Taurino, Veronica; Mahin, Stephen A.; Cimellaro, Gian Paolo

    2018-04-01

    Community resilience is becoming a growing concern for authorities and decision makers. This paper introduces two indicator-based methods to evaluate the resilience of communities based on the PEOPLES framework. PEOPLES is a multi-layered framework that defines community resilience using seven dimensions. Each dimension is described through a set of resilience indicators collected from the literature, and each indicator is linked to a measure allowing analytical computation of its performance. The first method requires data on previous disasters as input and returns a performance function for each indicator and for the whole community. The second method exploits knowledge-based fuzzy modeling, which allows a quantitative evaluation of the PEOPLES indicators using descriptive knowledge rather than deterministic data, while accounting for the uncertainty involved in the analysis. The output of the fuzzy-based method is a resilience index for each indicator as well as a resilience index for the community. The paper also introduces an open-source online tool implementing the first method. A case study illustrating the application of the first method and the usage of the tool is also provided.
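
A minimal sketch of how a fuzzy, knowledge-based indicator evaluation can work: a descriptive 0-10 score is fuzzified through triangular membership functions and defuzzified into a 0-1 index. The fuzzy sets, centroids, and scale below are invented assumptions, not the PEOPLES rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over a 0-10 descriptive score (tiny offsets keep
# the end-points inside their sets) and an illustrative index per set.
SETS = {"low": (-1e-9, 0.0, 5.0), "medium": (0.0, 5.0, 10.0), "high": (5.0, 10.0, 10.0 + 1e-9)}
CENTROIDS = {"low": 0.2, "medium": 0.5, "high": 0.9}

def resilience_index(score):
    """Defuzzify a descriptive 0-10 score into a 0-1 resilience index."""
    weights = {name: tri(score, *abc) for name, abc in SETS.items()}
    total = sum(weights.values())
    # Weighted average of set centroids (centroid defuzzification)
    return sum(CENTROIDS[n] * w for n, w in weights.items()) / total
```

A score halfway between "medium" and "high" (7.5) blends the two centroids equally, which is the behaviour that lets descriptive knowledge substitute for deterministic data.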

  16. Linguistics and the Study of Literature. Linguistics in the Undergraduate Curriculum, Appendix 4-D.

    ERIC Educational Resources Information Center

    Steward, Ann Harleman

    Linguistics gives the student of literature an analytical tool whose sole purpose is to describe faithfully the workings of language. It provides a theoretical framework, an analytical method, and a vocabulary for communicating its insights--all designed to serve concerns other than literary interpretation and evaluation, but all useful for…

  17. A Two-Stage Approach to Synthesizing Covariance Matrices in Meta-Analytic Structural Equation Modeling

    ERIC Educational Resources Information Center

    Cheung, Mike W. L.; Chan, Wai

    2009-01-01

    Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…

  18. Using Functional Analytic Therapy to Train Therapists in Acceptance and Commitment Therapy, a Conceptual and Practical Framework

    ERIC Educational Resources Information Center

    Schoendorff, Benjamin; Steinwachs, Joanne

    2012-01-01

    How can therapists be effectively trained in clinical functional contextualism? In this conceptual article we propose a new way of training therapists in Acceptance and Commitment Therapy skills using tools from Functional Analytic Psychotherapy in a training context functionally similar to the therapeutic relationship. FAP has been successfully…

  19. Dissociable meta-analytic brain networks contribute to coordinated emotional processing.

    PubMed

    Riedel, Michael C; Yanes, Julio A; Ray, Kimberly L; Eickhoff, Simon B; Fox, Peter T; Sutherland, Matthew T; Laird, Angela R

    2018-06-01

    Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli is integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks. © 2018 Wiley Periodicals, Inc.
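
The core grouping step, hierarchical clustering of modeled activation maps, can be sketched with a naive average-linkage agglomeration over toy binary maps. The Jaccard distance, the four-region maps, and the helper names are illustrative assumptions, not the study's actual pipeline.

```python
def jaccard_dist(a, b):
    """1 - Jaccard similarity of two binary activation maps."""
    inter = sum(x and y for x, y in zip(a, b))
    union = sum(x or y for x, y in zip(a, b))
    return 1.0 - inter / union if union else 0.0

def agglomerate(maps, n_clusters):
    """Naive average-linkage agglomerative clustering of experiment indices."""
    clusters = [[i] for i in range(len(maps))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Average pairwise distance between the two clusters
                d = sum(jaccard_dist(maps[a], maps[b])
                        for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters

# Toy binary maps over 4 "regions": experiments 0-1 share one activation
# pattern, experiments 2-3 share another.
maps = [(1, 1, 0, 0), (1, 1, 1, 0), (0, 0, 1, 1), (0, 1, 1, 1)]
groups = agglomerate(maps, n_clusters=2)
```

Cutting the resulting hierarchy at five clusters instead of two is, conceptually, how the study arrives at its five meta-analytic groupings before running behavioral inference on each.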

  20. A dimensionless approach for the runoff peak assessment: effects of the rainfall event structure

    NASA Astrophysics Data System (ADS)

    Gnecco, Ilaria; Palla, Anna; La Barbera, Paolo

    2018-02-01

    The present paper proposes a dimensionless analytical framework to investigate the impact of the rainfall event structure on the hydrograph peak. To this end, a methodology to describe the rainfall event structure is proposed, based on similarity with the depth-duration-frequency (DDF) curves. The rainfall input consists of a constant hyetograph in which all possible outcomes in the sample space of rainfall structures can be condensed. Soil abstractions are modelled using the Soil Conservation Service method, and instantaneous unit hydrograph (IUH) theory is applied to derive the dimensionless form of the hydrograph; the two-parameter gamma distribution is selected to test the proposed methodology. The dimensionless approach allows the analytical framework to be applied to any case study (i.e. natural catchment) for which the model assumptions are valid (i.e. a linear, causative and time-invariant system). A set of analytical expressions is derived for a constant-intensity hyetograph to assess the maximum runoff peak for a given rainfall event structure, irrespective of the specific catchment (such as the return period associated with the reference rainfall event). The curve of the maximum values of the runoff peak reveals a local minimum corresponding to the design hyetograph derived from the statistical DDF curve. A specific catchment application is discussed in order to point out the implications of the dimensionless procedure and to provide numerical examples of rainfall structures with respect to observed rainfall events; finally, their effects on the hydrograph peak are examined.
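
The modelling chain described above (constant hyetograph, SCS abstraction, gamma IUH convolution) can be sketched numerically as follows. This is a dimensional toy version of the paper's dimensionless derivation; all parameter values and the standard 0.2S initial-abstraction assumption are illustrative.

```python
import math

def scs_runoff(p_mm, cn):
    """SCS Curve Number cumulative effective rainfall [mm] for cumulative depth p_mm."""
    s = 25400.0 / cn - 254.0   # potential maximum retention [mm]
    ia = 0.2 * s               # conventional initial abstraction
    return (p_mm - ia) ** 2 / (p_mm - ia + s) if p_mm > ia else 0.0

def gamma_iuh(t, n, k):
    """Two-parameter gamma IUH ordinate at time t [1/h] (shape n, scale k [h])."""
    return (t / k) ** (n - 1) * math.exp(-t / k) / (k * math.gamma(n))

def hydrograph_peak(i_mm_h, duration_h, cn, n, k, dt=0.05):
    """Peak of effective rainfall convolved with the gamma IUH [mm/h]."""
    steps = int(duration_h / dt)
    # Incremental effective rainfall [mm] per dt from the constant hyetograph
    eff = [scs_runoff(i_mm_h * (j + 1) * dt, cn)
           - scs_runoff(i_mm_h * j * dt, cn) for j in range(steps)]
    peak = 0.0
    for m in range(int((duration_h + 10 * n * k) / dt)):
        t = m * dt
        q = sum(e * gamma_iuh(t - j * dt, n, k)
                for j, e in enumerate(eff) if t > j * dt)
        peak = max(peak, q)
    return peak

peak = hydrograph_peak(i_mm_h=20.0, duration_h=2.0, cn=80.0, n=3.0, k=1.0)
```

Sweeping the event structure (intensity-duration pairs of equal DDF frequency) through such a routine is what traces the runoff-peak curve whose local minimum the paper identifies at the DDF-derived design hyetograph.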
