Closed-form solutions and scaling laws for Kerr frequency combs
Renninger, William H.; Rakich, Peter T.
2016-01-01
A single closed-form analytical solution of the driven nonlinear Schrödinger equation is developed, reproducing a large class of the behaviors observed in Kerr-comb systems, including bright solitons, dark solitons, and a broad family of periodic wavetrains. From this analytical framework, a Kerr-comb area theorem and a pump-detuning relation are developed, providing new insights into soliton- and wavetrain-based combs along with concrete design guidelines for both. This new area theorem reveals significant deviation from the conventional soliton area theorem, which is crucial to understanding cavity solitons in certain limits. Moreover, these closed-form solutions represent the first step towards an analytical framework for wavetrain formation, and reveal new parameter regimes for enhanced Kerr-comb performance. PMID:27108810
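For orientation, analyses of this kind are usually posed on the damped, driven NLSE in its mean-field (Lugiato-Lefever) form; one common normalization, not necessarily the authors' exact notation, is

```latex
\frac{\partial \psi}{\partial \tau} = -(1 + i\alpha)\psi + i|\psi|^{2}\psi - i\frac{\beta}{2}\frac{\partial^{2}\psi}{\partial \theta^{2}} + F
```

where \psi is the intracavity field, \alpha the pump-cavity detuning, \beta the group-velocity dispersion, and F the normalized pump drive; bright solitons, dark solitons, and periodic wavetrains all arise as stationary solutions of this single equation.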
ERIC Educational Resources Information Center
Yogev, Sara; Brett, Jeanne
This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…
The Earth Data Analytic Services (EDAS) Framework
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2017-12-01
Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
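As a rough illustration of the access pattern described above (direct web service calls against a WPS endpoint), the sketch below issues a WPS Execute request with the Python requests library. The endpoint URL, operation identifier, and datainputs vocabulary are illustrative assumptions, not the documented EDAS API.

```python
# Hypothetical WPS Execute call against an EDAS-style analytics endpoint.
import requests

ENDPOINT = "https://edas.example.nasa.gov/wps"  # placeholder URL (assumption)

params = {
    "service": "WPS",
    "request": "Execute",
    "identifier": "workflow.average",  # hypothetical operation name
    # Illustrative datainputs: average the variable "tas" from a collection.
    "datainputs": '[variable=[{"uri":"collection://merra2","name":"tas"}]]',
}

response = requests.get(ENDPOINT, params=params, timeout=300)
response.raise_for_status()
print(response.text)  # WPS XML response describing the result or its status
```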
The Climate Data Analytic Services (CDAS) Framework.
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2016-12-01
Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. Client packages in Python, Scala, or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
Gkoumas, Spyridon; Villanueva-Perez, Pablo; Wang, Zhentian; Romano, Lucia; Abis, Matteo; Stampanoni, Marco
2016-01-01
In X-ray grating interferometry, dark-field contrast arises due to partial extinction of the detected interference fringes. This is also called visibility reduction and is attributed to small-angle scattering from unresolved structures in the imaged object. In recent years, analytical quantitative frameworks of dark-field contrast have been developed for highly diluted monodisperse microsphere suspensions with volume fractions of at most 6%. These frameworks assume that scattering particles are separated by large enough distances, which make any interparticle scattering interference negligible. In this paper, we start from the small-angle scattering intensity equation and, by linking Fourier space and real space, we introduce the structure factor and thus extend the analytical and experimental quantitative interpretation of dark-field contrast to a range of suspensions with volume fractions reaching 40%. The structure factor accounts for interparticle scattering interference. Without introducing any additional fitting parameters, we successfully predict the experimental values measured at the TOMCAT beamline, Swiss Light Source. Finally, we apply this theoretical framework to an experiment probing a range of system correlation lengths by acquiring dark-field images at different energies. This proposed method has the potential to be applied in single-shot mode using a polychromatic X-ray tube setup and a single-photon-counting energy-resolving detector. PMID:27734931
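The key relation behind this extension is the standard small-angle scattering factorization for monodisperse spheres, in which the structure factor multiplies the dilute-limit intensity:

```latex
I(q) \;=\; N\,(\Delta\rho)^{2}\,V_{p}^{2}\,P(q)\,S(q)
```

Here P(q) is the single-sphere form factor and S(q) the structure factor; in the dilute limit S(q) \to 1, recovering the earlier frameworks, while at volume fractions approaching 40% the interparticle interference encoded in S(q) materially changes the visibility reduction.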
Differential Validation of a Path Analytic Model of University Dropout.
ERIC Educational Resources Information Center
Winteler, Adolf
Tinto's conceptual schema of college dropout forms the theoretical framework for the development of a model of university student dropout intention. This study validated Tinto's model in two different departments within a single university. Analyses were conducted on a sample of 684 college freshmen in the Education and Economics Department. A…
bigSCale: an analytical framework for big-scale single-cell data.
Iacono, Giovanni; Mereu, Elisabetta; Guillaumet-Adkins, Amy; Corominas, Roser; Cuscó, Ivon; Rodríguez-Esteban, Gustavo; Gut, Marta; Pérez-Jurado, Luis Alberto; Gut, Ivo; Heyn, Holger
2018-06-01
Single-cell RNA sequencing (scRNA-seq) has significantly deepened our insights into complex tissues, with the latest techniques capable of processing tens of thousands of cells simultaneously. Analyzing increasing numbers of cells, however, generates extremely large data sets, extending processing time and challenging computing resources. Current scRNA-seq analysis tools are not designed to interrogate large data sets and often lack sensitivity to identify marker genes. With bigSCale, we provide a scalable analytical framework to analyze millions of cells, which addresses the challenges associated with large data sets. To handle the noise and sparsity of scRNA-seq data, bigSCale uses large sample sizes to estimate an accurate numerical model of noise. The framework further includes modules for differential expression analysis, cell clustering, and marker identification. A directed convolution strategy allows processing of extremely large data sets, while preserving transcript information from individual cells. We evaluated the performance of bigSCale using both a biological model of aberrant gene expression in patient-derived neuronal progenitor cells and simulated data sets, which underlines the speed and accuracy in differential expression analysis. To test its applicability for large data sets, we applied bigSCale to assess 1.3 million cells from the mouse developing forebrain. Its directed down-sampling strategy accumulates information from single cells into index cell transcriptomes, thereby defining cellular clusters with improved resolution. Accordingly, index cell clusters identified rare populations, such as reelin (Reln)-positive Cajal-Retzius neurons, for which we report previously unrecognized heterogeneity associated with distinct differentiation stages, spatial organization, and cellular function. Together, bigSCale presents a solution to address future challenges of large single-cell data sets. © 2018 Iacono et al.; Published by Cold Spring Harbor Laboratory Press.
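A minimal sketch of the index-cell idea, with a synthetic count matrix and plain k-means standing in for bigSCale's actual directed down-sampling:

```python
# Illustrative only -- not the bigSCale implementation. Neighboring cells are
# grouped and their raw counts summed into "index cell" transcriptomes,
# compressing the matrix while preserving transcript information.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
counts = rng.poisson(0.3, size=(2_000, 500))  # cells x genes (synthetic)

n_index_cells = 100  # each small cluster of cells becomes one index cell
labels = KMeans(n_clusters=n_index_cells, n_init=3,
                random_state=0).fit_predict(np.log1p(counts))

# Index-cell transcriptome = summed counts of its member cells.
index_counts = np.zeros((n_index_cells, counts.shape[1]), dtype=np.int64)
np.add.at(index_counts, labels, counts)
print(index_counts.shape)  # (100, 500)
```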
2014-07-01
powder X-ray diffraction (PXRD), thermogravimetric analysis (TGA), and Fourier transform infrared (FTIR) spectroscopy. SUBJECT TERMS: Metal organic framework...the inclusion by using a variety of analytical techniques, such as powder X-ray diffraction (PXRD), thermogravimetric analysis (TGA), Fourier...Characterizations Analysis of the MOF and the complexes with the MOF and the guest molecules was performed using an Agilent GC-MS (Model 6890N GC and Model 5973N
Sargsyan, Ori
2012-05-25
Hitchhiking and severe bottleneck effects have an impact on the dynamics of genetic diversity of a population by inducing homogenization at a single locus and at the genome-wide scale, respectively. As a result, identification and differentiation of the signatures of such events from DNA sequence data at a single locus is challenging. This study develops an analytical framework for identifying and differentiating recent homogenization events at multiple neutral loci in low recombination regions. The dynamics of genetic diversity at a locus after a recent homogenization event is modeled according to the infinite-sites mutation model and the Wright-Fisher model of reproduction with constant population size. In this setting, I derive analytical expressions for the distribution, mean, and variance of the number of polymorphic sites in a random sample of DNA sequences from a locus affected by a recent homogenization event. Based on this framework, three likelihood-ratio based tests are presented for identifying and differentiating recent homogenization events at multiple loci. Lastly, I apply the framework to two data sets. First, I consider human DNA sequences from four non-coding loci on different chromosomes for inferring the evolutionary history of modern human populations. The results suggest, in particular, that recent homogenization events at the loci are identifiable when the effective human population size is 50,000 or greater, in contrast to 10,000, and that the estimated times of the recent homogenization events agree with the "Out of Africa" hypothesis. Second, I use HIV DNA sequences from HIV-1-infected patients to infer the times of HIV seroconversions. The estimates are contrasted with other estimates derived as the mid-time point between the last HIV-negative and first HIV-positive screening tests. Finally, the results show that significant discrepancies can exist between the estimates.
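For context, the constant-size Wright-Fisher baseline that these post-homogenization distributions generalize is Watterson's classical result for the expected number of polymorphic (segregating) sites in a sample of n sequences under the infinite-sites model:

```latex
E[S] \;=\; \theta \sum_{i=1}^{n-1} \frac{1}{i}, \qquad \theta = 4 N_e \mu
```

where N_e is the effective population size and \mu the locus mutation rate per generation; a recent homogenization event truncates the genealogy and hence depresses S below this expectation.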
A novel numerical framework for self-similarity in plasticity: Wedge indentation in single crystals
NASA Astrophysics Data System (ADS)
Juul, K. J.; Niordson, C. F.; Nielsen, K. L.; Kysar, J. W.
2018-03-01
A novel numerical framework for analyzing self-similar problems in plasticity is developed and demonstrated. Self-similar problems of this kind include processes such as stationary cracks, void growth, and indentation. The proposed technique offers a simple and efficient method for handling this class of complex problems by avoiding issues related to traditional Lagrangian procedures. Moreover, the proposed technique allows for focusing the mesh in the region of interest. In the present paper, the technique is exploited to analyze the well-known wedge indentation problem of an elastic-viscoplastic single crystal. However, the framework may be readily adapted to any constitutive law of interest. The main focus herein is the development of the self-similar framework, while the indentation study serves primarily as verification of the technique by comparison to existing numerical and analytical studies. In this study, the three most common metal crystal structures are investigated, namely the face-centered cubic (FCC), body-centered cubic (BCC), and hexagonal close-packed (HCP) crystal structures, and the stress and slip rate fields around the moving contact point singularity are presented.
NASA Astrophysics Data System (ADS)
Akhtar, Taimoor; Shoemaker, Christine
2016-04-01
Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating the non-existence of a single optimal parameterization. Hence, many experts prefer a manual approach to calibration where the inherent multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive and complex decision making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in the parameter estimation process, which include: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selection of one from numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address the above-mentioned challenges. HAMS employs a 3-stage framework for parameter estimation. Stage 1 incorporates the use of an efficient surrogate multi-objective algorithm, GOMORS, for identification of numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS is embedded in Stages 2 and 3, where an interactive visual and metric-based analytics framework is available as a decision support tool to choose a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides a goodness-of-fit, metric-based interactive framework for identification of a small subset (typically fewer than 10) of meaningful and diverse calibration alternatives from the numerous alternatives obtained in Stage 1. Stage 3 incorporates the use of an interactive visual analytics framework for decision support in selection of one parameter combination from the alternatives identified in Stage 2. HAMS is applied for calibration of flow parameters of a SWAT (Soil and Water Assessment Tool) model designed to simulate flow in the Cannonsville watershed in upstate New York. Results from the application of HAMS to Cannonsville indicate that efficient multi-objective optimization and interactive visual and metric-based analytics can bridge the gap between the effective use of both automatic and manual strategies for parameter estimation of computationally expensive watershed models.
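A toy sketch of the Stage-2 screening step, assuming the alternatives are already scored on calibration criteria; the metric names and the use of k-means to enforce diversity are illustrative choices, not a reproduction of HAMS:

```python
# Reduce many Stage-1 calibration alternatives to a small, diverse subset by
# clustering in objective space and keeping one representative per cluster.
import numpy as np
from sklearn.cluster import KMeans

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1 is a perfect fit)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias (0 is unbiased)."""
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

# Placeholder objective matrix: rows = alternatives, cols = (1 - NSE, |pbias|).
objectives = np.random.default_rng(1).random((200, 2))

k = 8  # small shortlist, "typically fewer than 10"
km = KMeans(n_clusters=k, n_init=10, random_state=1).fit(objectives)
subset = [int(np.argmin(np.linalg.norm(objectives - c, axis=1)))
          for c in km.cluster_centers_]
print(sorted(set(subset)))  # indices of the diverse shortlist
```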
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g., grid computing and graphical processing unit (GPU) computing), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. The paper concludes by summarizing the potential usage of the MapReduce programming framework and the Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
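The two borrowed functional-programming tasks are easy to see in miniature; the sketch below counts toy clinical codes serially, and Hadoop's contribution is to distribute exactly this map-shuffle-reduce pattern across HDFS-backed nodes:

```python
# Minimal serial illustration of the Map and Reduce phases.
from itertools import groupby
from operator import itemgetter

records = ["I10", "E11", "I10", "J45", "E11", "I10"]  # toy diagnosis codes

mapped = [(code, 1) for code in records]        # Map: emit (key, 1) pairs
mapped.sort(key=itemgetter(0))                  # Shuffle: group by key
counts = {key: sum(v for _, v in group)         # Reduce: sum per key
          for key, group in groupby(mapped, key=itemgetter(0))}
print(counts)  # {'E11': 2, 'I10': 3, 'J45': 1}
```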
ERIC Educational Resources Information Center
Rienties, Bart; Boroowa, Avinash; Cross, Simon; Kubiak, Chris; Mayles, Kevin; Murphy, Sam
2016-01-01
There is an urgent need to develop an evidence-based framework for learning analytics whereby stakeholders can manage, evaluate, and make decisions about which types of interventions work well and under which conditions. In this article, we will work towards developing a foundation of an Analytics4Action Evaluation Framework (A4AEF) that is…
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g., climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g., OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g., IPython) and the straightforward adoption of a strong set of related libraries (e.g., SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed task orchestration, and fine grain at the level of a single data analytics cluster instance) will be presented and discussed.
A dislocation-based crystal plasticity framework for dynamic ductile failure of single crystals
NASA Astrophysics Data System (ADS)
Nguyen, Thao; Luscher, D. J.; Wilkerson, J. W.
2017-11-01
A framework for dislocation-based viscoplasticity and dynamic ductile failure has been developed to model high strain rate deformation and damage in single crystals. The rate-dependence of the crystal plasticity formulation is based on the physics of relativistic dislocation kinetics suited for extremely high strain rates. The damage evolution is based on the dynamics of void growth, which are governed by both micro-inertia as well as dislocation kinetics and dislocation substructure evolution. An averaging scheme is proposed in order to approximate the evolution of the dislocation substructure at the macroscale as well as its spatial distribution at the microscale. Additionally, a concept of a single equivalent dislocation density that effectively captures the collective influence of dislocation density on all active slip systems is proposed here. Together, these concepts and approximations enable the use of semi-analytic solutions for void growth dynamics developed in (Wilkerson and Ramesh, 2014), which greatly reduce the computational overhead that would otherwise be required. The resulting homogenized framework has been implemented into a commercially available finite element package, and a validation study against a suite of direct numerical simulations was carried out.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY: Privacy Office... Homeland Security/U.S. Customs and Border Protection, DHS/CBP--017 Analytical Framework for Intelligence... Analytical Framework for Intelligence (AFI) System of Records'' from one or more provisions of the Privacy...
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. Additionally, a service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
Canuet, Lucien; Védrenne, Nicolas; Conan, Jean-Marc; Petit, Cyril; Artaud, Geraldine; Rissons, Angelique; Lacan, Jerome
2018-01-01
In the framework of satellite-to-ground laser downlinks, an analytical model describing the variations of the instantaneous coupled flux into a single-mode fiber after correction of the incoming wavefront by partial adaptive optics (AO) is presented. Expressions for the probability density function and the cumulative distribution function, as well as for the average fading duration and fading duration distribution of the corrected coupled flux, are given. These results are of prime interest for the computation of metrics related to coded transmissions over correlated channels, and they are compared with end-to-end wave-optics simulations for a geosynchronous satellite (GEO)-to-ground and a low earth orbit satellite (LEO)-to-ground scenario. Finally, the impact of different AO performances on the aforementioned fading duration distribution is analytically investigated for both scenarios.
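The fading-duration statistics mentioned above can be illustrated numerically. The sketch below substitutes a generic correlated log-normal channel for the paper's partial-AO model and measures the mean length of below-threshold runs:

```python
# Monte-Carlo sketch: estimate mean fading duration of a correlated channel.
# The AR(1)/log-normal channel model is an assumption for illustration only.
import numpy as np

rng = np.random.default_rng(42)
n, rho = 50_000, 0.99
g = np.empty(n)
g[0] = rng.standard_normal()
for i in range(1, n):  # AR(1) process -> temporally correlated fading
    g[i] = rho * g[i - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
flux = np.exp(0.5 * g)  # surrogate coupled flux (arbitrary units)

threshold = np.quantile(flux, 0.05)  # 5% fade probability
below = flux < threshold
edges = np.diff(below.astype(int))
starts = np.where(edges == 1)[0] + 1   # fade onsets
ends = np.where(edges == -1)[0] + 1    # fade exits
if below[0]:
    starts = np.r_[0, starts]
if below[-1]:
    ends = np.r_[ends, n]
print("mean fade duration (samples):", (ends - starts).mean())
```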
Vector solitons in a laser passively mode-locked by single-wall carbon nanotubes
NASA Astrophysics Data System (ADS)
Wong, Jia Haur; Wu, Kan; Liu, Huan Huan; Ouyang, Chunmei; Wang, Honghai; Aditya, Sheel; Shum, Ping; Fu, Songnian; Kelleher, E. J. R.; Chernov, A.; Obraztsova, E. D.
2011-04-01
Polarization-rotation-locked vector solitons (PRLVSs) are experimentally observed for the first time in a fiber ring laser passively mode-locked by a single-wall carbon nanotube (SWCNT) saturable absorber. Period doubling of these solitons at certain birefringence values has also been observed. We show that fine adjustment of the intracavity birefringence can swing the PRLVSs from the period-doubled to the period-one state without a simultaneous reduction in the pump strength. The timing jitter for both states has also been measured experimentally and discussed analytically using the theoretical framework provided by the Haus model.
Quality Indicators for Learning Analytics
ERIC Educational Resources Information Center
Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus
2014-01-01
This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…
Huo, Zhiguang; Ding, Ying; Liu, Silvia; Oesterreich, Steffi; Tseng, George
2016-01-01
Disease phenotyping by omics data has become a popular approach that potentially can lead to better personalized treatment. Identifying disease subtypes via unsupervised machine learning is the first step towards this goal. In this paper, we extend a sparse K-means method towards a meta-analytic framework to identify novel disease subtypes when expression profiles of multiple cohorts are available. The lasso regularization and meta-analysis identify a unique set of gene features for subtype characterization. An additional pattern matching reward function guarantees consistent subtype signatures across studies. The method was evaluated by simulations and leukemia and breast cancer data sets. The identified disease subtypes from meta-analysis were characterized with improved accuracy and stability compared to single study analysis. The breast cancer model was applied to an independent METABRIC dataset and generated improved survival difference between subtypes. These results provide a basis for diagnosis and development of targeted treatments for disease subgroups. PMID:27330233
Werling, Donna M; Brand, Harrison; An, Joon-Yong; Stone, Matthew R; Zhu, Lingxue; Glessner, Joseph T; Collins, Ryan L; Dong, Shan; Layer, Ryan M; Markenscoff-Papadimitriou, Eirene; Farrell, Andrew; Schwartz, Grace B; Wang, Harold Z; Currall, Benjamin B; Zhao, Xuefang; Dea, Jeanselle; Duhn, Clif; Erdman, Carolyn A; Gilson, Michael C; Yadav, Rachita; Handsaker, Robert E; Kashin, Seva; Klei, Lambertus; Mandell, Jeffrey D; Nowakowski, Tomasz J; Liu, Yuwen; Pochareddy, Sirisha; Smith, Louw; Walker, Michael F; Waterman, Matthew J; He, Xin; Kriegstein, Arnold R; Rubenstein, John L; Sestan, Nenad; McCarroll, Steven A; Neale, Benjamin M; Coon, Hilary; Willsey, A Jeremy; Buxbaum, Joseph D; Daly, Mark J; State, Matthew W; Quinlan, Aaron R; Marth, Gabor T; Roeder, Kathryn; Devlin, Bernie; Talkowski, Michael E; Sanders, Stephan J
2018-05-01
Genomic association studies of common or rare protein-coding variation have established robust statistical approaches to account for multiple testing. Here we present a comparable framework to evaluate rare and de novo noncoding single-nucleotide variants, insertion/deletions, and all classes of structural variation from whole-genome sequencing (WGS). Integrating genomic annotations at the level of nucleotides, genes, and regulatory regions, we define 51,801 annotation categories. Analyses of 519 autism spectrum disorder families did not identify association with any categories after correction for 4,123 effective tests. Without appropriate correction, biologically plausible associations are observed in both cases and controls. Despite excluding previously identified gene-disrupting mutations, coding regions still exhibited the strongest associations. Thus, in autism, the contribution of de novo noncoding variation is probably modest in comparison to that of de novo coding variants. Robust results from future WGS studies will require large cohorts and comprehensive analytical strategies that consider the substantial multiple-testing burden.
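The category-based testing logic can be sketched as follows, with synthetic counts standing in for the study's 51,801 annotation categories and its more sophisticated statistics:

```python
# Hedged sketch: per-category case/control burden comparison with Bonferroni
# correction over the number of categories tested.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(7)
n_categories = 1_000
case_counts = rng.poisson(5.0, n_categories)     # de novo variants, cases
control_counts = rng.poisson(5.0, n_categories)  # de novo variants, controls

alpha = 0.05 / n_categories  # correct for the effective number of tests
hits = 0
for a, b in zip(case_counts, control_counts):
    if a + b == 0:
        continue
    # Null hypothesis: a variant is equally likely to fall in either group.
    if binomtest(int(a), int(a + b), 0.5, alternative="greater").pvalue < alpha:
        hits += 1
print(f"{hits} categories pass correction (about 0 expected under the null)")
```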
Multiaxis sensing using metal organic frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talin, Albert Alec; Allendorf, Mark D.; Leonard, Francois
2017-01-17
A sensor device including a sensor substrate; and a thin film comprising a porous metal organic framework (MOF) on the substrate that presents more than one transduction mechanism when exposed to an analyte. A method including exposing a porous metal organic framework (MOF) on a substrate to an analyte; and identifying more than one transduction mechanism in response to the exposure to the analyte.
ERIC Educational Resources Information Center
Drachsler, H.; Kalz, M.
2016-01-01
The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…
Lavoie, Josée; Boulton, Amohia; Dwyer, Judith
2010-01-01
Contracting in health care is a mechanism used by the governments of Canada, Australia and New Zealand to improve the participation of marginalized populations in primary health care and improve responsiveness to local needs. As a result, complex contractual environments have emerged. The literature on contracting in health has tended to focus on the pros and cons of classical versus relational contracts from the funder's perspective. This article proposes an analytical framework to explore the strengths and weaknesses of contractual environments that depend on a number of classical contracts, a single relational contract or a mix of the two. Examples from indigenous contracting environments are used to inform the elaboration of the framework. Results show that contractual environments that rely on a multiplicity of specific contracts are administratively onerous, while constraining opportunities for local responsiveness. Contractual environments dominated by a single relational contract produce a more flexible and administratively streamlined system.
NASA Astrophysics Data System (ADS)
Wollman, Adam J. M.; Miller, Helen; Foster, Simon; Leake, Mark C.
2016-10-01
Staphylococcus aureus is an important pathogen, giving rise to antimicrobial resistance in cell strains such as methicillin-resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We use a new combination of several existing image analysis tools to detect cellular and subcellular morphological features relevant to cell division from images of live pathogens sampled on millisecond time scales, with single-molecule detection precision. We demonstrate this approach using a fluorescent reporter, GFP fused to the protein EzrA, that localises to a mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials which target the cell division machinery, but may also have more general application in detecting morphologically complex structures of fluorescently labelled proteins present in clusters of other types of cells.
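A highly simplified version of such a segmentation pipeline, using generic scikit-image primitives rather than the authors' millisecond single-molecule pipeline:

```python
# Generic sketch: denoise, threshold, and label fluorescent cell clusters;
# each region's orientation is a crude proxy for a division-plane candidate
# (a division plane lies roughly perpendicular to a dividing cell's long axis).
import numpy as np
from skimage import filters, measure, morphology

image = np.random.rand(256, 256)  # placeholder for a fluorescence frame

smoothed = filters.gaussian(image, sigma=2)
mask = smoothed > filters.threshold_otsu(smoothed)
mask = morphology.remove_small_objects(mask, min_size=64)

labels = measure.label(mask)
for region in measure.regionprops(labels):
    print(region.label, region.centroid, region.orientation)
```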
NASA Astrophysics Data System (ADS)
Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan
2016-04-01
Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to the main memory. On the other hand, in applications that do not exploit GPUs, CPU usage is dominant while the GPUs sit idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analyses in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources therein. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in-situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous while executing GPU-enabled configurations of CESM, when the CPUs would otherwise be idle during portions of the runtime. Our results demonstrate that it is more efficient to use the HFP framework to offload these tasks to the GPU than to compute them in the main application. We observe increased resource utilization and overall productivity by using the HFP framework for the end-to-end workflow.
Nonlinear viscoelastic characterization of structural adhesives
NASA Technical Reports Server (NTRS)
Rochefort, M. A.; Brinson, H. F.
1983-01-01
Measurements of the nonlinear viscoelastic behavior of two adhesives, FM-73 and FM-300, are presented and discussed. Analytical methods to quantify the measurements are given and fitted into the framework of an accelerated testing and analysis procedure. The single-integral model used is shown to function well and is analogous to a time-temperature-stress superposition procedure (TTSSP). Advantages and disadvantages of the creep power law method used in this study are given.
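A representative single-integral form consistent with this approach (though not necessarily the exact model fitted in the paper) is Schapery's nonlinear viscoelastic law with a power-law creep kernel:

```latex
\varepsilon(t) \;=\; g_{0} D_{0}\,\sigma(t) \;+\; g_{1}\!\int_{0}^{t} \Delta D(\psi - \psi')\,\frac{d\,(g_{2}\sigma)}{d\tau}\,d\tau,
\qquad \Delta D(\psi) = D_{1}\,\psi^{n}
```

where the stress-dependent factors g_0, g_1, g_2 and the reduced time \psi = \int_0^t dt'/a_\sigma play the role that temperature shift factors play in time-temperature superposition, which is why the procedure is analogous to a time-temperature-stress superposition.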
A dislocation-based crystal plasticity framework for dynamic ductile failure of single crystals
Nguyen, Thao; Luscher, D. J.; Wilkerson, J. W.
2017-08-02
We developed a framework for dislocation-based viscoplasticity and dynamic ductile failure to model high strain rate deformation and damage in single crystals. The rate-dependence of the crystal plasticity formulation is based on the physics of relativistic dislocation kinetics suited for extremely high strain rates. The damage evolution is based on the dynamics of void growth, which are governed by both micro-inertia as well as dislocation kinetics and dislocation substructure evolution. Furthermore, an averaging scheme is proposed in order to approximate the evolution of the dislocation substructure at the macroscale as well as its spatial distribution at the microscale. In addition, a concept of a single equivalent dislocation density that effectively captures the collective influence of dislocation density on all active slip systems is proposed here. Together, these concepts and approximations enable the use of semi-analytic solutions for void growth dynamics developed in [J. Wilkerson and K. Ramesh. A dynamic void growth model governed by dislocation kinetics. J. Mech. Phys. Solids, 70:262–280, 2014.], which greatly reduce the computational overhead that would otherwise be required. The resulting homogenized framework has been implemented into a commercially available finite element package, and a validation study against a suite of direct numerical simulations was carried out.
Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl
2010-01-01
The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and the use of ecosystem services in decision making.
Cammi, R
2009-10-28
We present a general formulation of the coupled-cluster (CC) theory for a molecular solute described within the framework of the polarizable continuum model (PCM). The PCM-CC theory is derived in its complete form, called the PTED scheme, in which the correlated electronic density is used to obtain a self-consistent reaction field, and in an approximate form, called the PTE scheme, in which the PCM-CC equations are solved assuming the fixed Hartree-Fock solvent reaction field. Explicit forms for the PCM-CC-PTED equations are derived at the single and double (CCSD) excitation level of the cluster operator. At the same level, explicit equations for the analytical first derivatives of the PCM basic energy functional are presented, and analytical second derivatives are also discussed. The corresponding PCM-CCSD-PTE equations are given as a special case of the full theory.
NASA Technical Reports Server (NTRS)
Bhadra, Dipasis; Morser, Frederick R.
2006-01-01
In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework to undertake projects that may address some of the noted deficiencies. By drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments, taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to take in multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the FAA's entire investment program.
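The corporate-finance machinery invoked here reduces, in its simplest form, to tracing a mean-variance frontier; the toy sketch below uses three made-up programs whose interdependencies enter through a covariance matrix:

```python
# Toy risk-return frontier over random portfolios of three hypothetical
# programs; all numbers are illustrative assumptions.
import numpy as np

mu = np.array([0.05, 0.08, 0.12])         # expected returns per program
cov = np.array([[0.010, 0.002, 0.001],    # program interdependencies
                [0.002, 0.040, 0.010],
                [0.001, 0.010, 0.090]])

rng = np.random.default_rng(3)
frontier = {}
for _ in range(20_000):
    w = rng.dirichlet(np.ones(3))         # random budget allocation
    ret, risk = float(w @ mu), float(np.sqrt(w @ cov @ w))
    key = round(risk, 3)                  # bucket portfolios by risk level
    frontier[key] = max(ret, frontier.get(key, -np.inf))

for risk in sorted(frontier)[:5]:         # lowest-risk end of the frontier
    print(f"risk {risk:.3f} -> best return {frontier[risk]:.4f}")
```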
A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.
Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf
2018-01-01
Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.
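The functional mixed model at the core of this framework has the standard Morris-Carroll form (written here generically; the paper's robust variant swaps the Gaussian assumptions for heavy-tailed ones):

```latex
Y_{i}(t) \;=\; \sum_{j=1}^{p} X_{ij}\,B_{j}(t) \;+\; \sum_{k=1}^{m} Z_{ik}\,U_{k}(t) \;+\; E_{i}(t)
```

where Y_i(t) is the i-th echo envelope, the fixed-effect functions B_j(t) carry the terrain and channel effects of interest, and the random-effect functions U_k(t) capture the hierarchical structure induced by the experimental design.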
A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.
2015-12-01
Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: runs the computation locally on the WPS server or remotely using tools such as Celery or SLURM. Compute Engine Manager: runs the computation serially or distributed over nodes using a parallelization framework such as Celery or Spark. Decomposition Manager: manages strategies for distributing the data over nodes. Data Manager: handles the import of domain data from long-term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: a kernel is an encapsulated computational unit which executes a processor's compute task; each kernel is implemented in Python exploiting existing analysis packages (e.g., CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script or application, or a JavaScript-based web application. Client packages in Python or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
Statistical mechanics of ribbons under bending and twisting torques.
Sinha, Supurna; Samuel, Joseph
2013-11-20
We present an analytical study of ribbons subjected to an external torque. We first describe the elastic response of a ribbon within a purely mechanical framework. We then study the role of thermal fluctuations in modifying its elastic response. We predict the moment-angle relation of bent and twisted ribbons. Such a study is expected to shed light on the role of twist in DNA looping and on bending elasticity of twisted graphene ribbons. Our quantitative predictions can be tested against future single molecule experiments.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... Border Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY... Framework for Intelligence (AFI) System of Records'' and this proposed rulemaking. In this proposed... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records.'' AFI enhances DHS's...
ERIC Educational Resources Information Center
Chigisheva, Oksana; Bondarenko, Anna; Soltovets, Elena
2017-01-01
The paper provides analytical insights into highly acute issues concerning preparation and adoption of Qualifications Frameworks being an adequate response to the growing interactions at the global labor market and flourishing of knowledge economy. Special attention is paid to the analyses of transnational Meta Qualifications Frameworks (A…
Degrees of School Democracy: A Holistic Framework
ERIC Educational Resources Information Center
Woods, Philip A.; Woods, Glenys J.
2012-01-01
This article outlines an analytical framework that enables analysis of degrees of democracy in a school or other organizational setting. It is founded in a holistic conception of democracy, which is a model of working together that aspires to truth, goodness, and meaning and the participation of all. We suggest that the analytical framework can be…
Canis, Laure; Linkov, Igor; Seager, Thomas P
2010-11-15
The unprecedented uncertainty associated with engineered nanomaterials greatly expands the need for research regarding their potential environmental consequences. However, decision-makers such as regulatory agencies, product developers, or other nanotechnology stakeholders may not find the results of such research directly informative of decisions intended to mitigate environmental risks. To help interpret research findings and prioritize new research needs, there is an acute need for structured decision-analytic aids that are operable in a context of extraordinary uncertainty. Whereas existing stochastic decision-analytic techniques explore uncertainty only in decision-maker preference information, this paper extends model uncertainty to technology performance. As an illustrative example, the framework is applied to the case of single-wall carbon nanotubes. Four different synthesis processes (arc, high pressure carbon monoxide, chemical vapor deposition, and laser) are compared based on five salient performance criteria. A probabilistic rank ordering of preferred processes is determined using outranking normalization and a linear-weighted sum for different weighting scenarios including completely unknown weights and four fixed-weight sets representing hypothetical stakeholder views. No single process pathway dominates under all weight scenarios, but it is likely that some inferior process technologies could be identified as low priorities for further research.
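The completely-unknown-weights scenario can be made concrete with a small Monte Carlo sketch: sample weights uniformly from the simplex and record how often each process is ranked first. The score matrix below is a random placeholder, not the paper's data:

```python
# Probabilistic rank ordering under unknown criterion weights.
import numpy as np

processes = ["arc", "HiPCO", "CVD", "laser"]
# rows = processes, cols = 5 criteria, normalized so larger is better
scores = np.random.default_rng(11).random((4, 5))

rng = np.random.default_rng(12)
wins = np.zeros(len(processes))
for _ in range(100_000):
    w = rng.dirichlet(np.ones(5))   # uniform draw from the weight simplex
    wins[np.argmax(scores @ w)] += 1

for name, p in zip(processes, wins / wins.sum()):
    print(f"P({name} ranked first) = {p:.3f}")
```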
A Discounting Framework for Choice With Delayed and Probabilistic Rewards
Green, Leonard; Myerson, Joel
2005-01-01
When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach. PMID:15367080
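The hyperbola-like function referred to here is, in Green and Myerson's notation,

```latex
V = \frac{A}{(1 + kD)^{s}} \qquad \text{and} \qquad V = \frac{A}{(1 + h\,\Theta)^{s}}, \quad \Theta = \frac{1-p}{p}
```

where V is the subjective value of an amount A delayed by D or received with probability p, \Theta is the odds against receipt, and k, h, and s are fitted parameters; the shared functional form is what permits a common analytical approach, even though the two kinds of discounting need not reflect a single underlying process.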
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
NASA Astrophysics Data System (ADS)
Ebrahimkhanlou, Arvin; Salamone, Salvatore
2017-09-01
Tracking edge-reflected acoustic emission (AE) waves can allow the localization of their sources. Specifically, in bounded isotropic plate structures, only one sensor may be used to perform these source localizations. The primary goal of this paper is to develop a three-step probabilistic framework to quantify the uncertainties associated with such single-sensor localizations. According to this framework, a probabilistic approach is first used to estimate the direct distances between AE sources and the sensor. Then, an analytical model is used to reconstruct the envelope of edge-reflected AE signals based on the source-to-sensor distance estimates and their first arrivals. Finally, the correlation between the probabilistically reconstructed envelopes and the recorded AE signals is used to estimate confidence contours for the location of AE sources. To validate the proposed framework, Hsu-Nielsen pencil lead break (PLB) tests were performed on the surface as well as the edges of an aluminum plate. The localization results show that the estimated confidence contours surround the actual source locations. In addition, the performance of the framework was tested in a noisy environment simulated by two dummy transducers and an arbitrary wave generator. The results show that in low-noise environments, the shape and size of the confidence contours depend on the sources and their locations, whereas in highly noisy environments the size of the confidence contours increases monotonically with the noise floor. These results suggest that the proposed probabilistic framework can provide more comprehensive information regarding the location of AE sources.
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
Smith, Justin D.
2013-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis); overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord: comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support among those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874
Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol
2009-01-01
Web portals function as a single point of access to information on the World Wide Web (WWW). A web portal always contacts the portal's gateway for information flow, which causes network traffic over the Internet. Moreover, it provides real-time/dynamic access to stored information, but not access to real-time information. This inherent functionality of web portals limits their role for resource-constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We introduce the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than being routed over the Internet. Moreover, our framework enables one-to-one device communication for real-time information flow. To provide an in-depth analysis, we first develop an analytical model for query processing at the servers for our framework-oriented web portal. Finally, we deployed a testbed, one of the world's largest IP-based wireless sensor network testbeds, and observed real-time measurements that demonstrate the efficacy and workability of the proposed framework. PMID:22346693
Analytical optimization of demand management strategies across all urban water use sectors
NASA Astrophysics Data System (ADS)
Friedman, Kenneth; Heaney, James P.; Morales, Miguel; Palenchar, John
2014-07-01
An effective urban water demand management program can greatly influence both peak and average demand and therefore long-term water supply and infrastructure planning. Although a theoretical framework for evaluating residential indoor demand management has been well established, little has been done to evaluate other water use sectors such as residential irrigation in a compatible manner for integrating these results into an overall solution. This paper presents a systematic procedure to evaluate the optimal blend of single family residential irrigation demand management strategies to achieve a specified goal, based on performance functions derived from parcel-level tax assessor's data linked to customer-level monthly water billing data. This framework is then generalized to apply to any urban water sector, as exponential functions can be fit to all resulting cumulative water savings functions. Two alternative formulations are presented: maximize net benefits, or minimize total costs subject to satisfying a target water savings. Explicit analytical solutions are presented for both formulations based on appropriate exponential best fits of performance functions. A direct result of this solution is the dual variable, which represents the marginal cost of water saved at a specified target water savings goal. A case study of 16,303 single family irrigators in Gainesville Regional Utilities, utilizing high-quality tax assessor and monthly billing data along with parcel-level GIS data, provides an illustrative example of these techniques. Spatial clustering of targeted homes can easily be performed in GIS to identify priority demand management areas.
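One plausible way to formalize the cost-minimization variant with exponential performance functions, sketched here under assumed notation (spending x_i on strategy i yields cumulative savings S_i(x_i) = a_i(1 - e^{-b_i x_i}); the paper's exact parameterization may differ):

```latex
\min_{x \ge 0} \; \sum_i x_i
\quad \text{s.t.} \quad
\sum_i a_i \bigl(1 - e^{-b_i x_i}\bigr) = S^{*}
```

Stationarity of the Lagrangian gives a_i b_i e^{-b_i x_i} = 1/lambda, hence x_i = ln(lambda a_i b_i)/b_i, and the multiplier lambda is exactly the dual variable mentioned above: the marginal cost of water saved at the target S*.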
ERIC Educational Resources Information Center
Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine
2014-01-01
This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…
Lloyd, Jeffrey T.; Clayton, John D.; Austin, Ryan A.; ...
2015-07-10
Background: The shock response of metallic single crystals can be captured using a micro-mechanical description of the thermoelastic-viscoplastic material response; however, using such a description within the context of traditional numerical methods may introduce physical artifacts. Advantages and disadvantages of complex material descriptions, in particular the viscoplastic response, must be framed within approximations introduced by numerical methods. Methods: Three methods of modeling the shock response of metallic single crystals are summarized: finite difference simulations, steady wave simulations, and algebraic solutions of the Rankine-Hugoniot jump conditions. For the former two numerical techniques, a dislocation density based framework describes the rate- and temperature-dependent shear strength on each slip system. For the latter analytical technique, a simple (two-parameter) rate- and temperature-independent linear hardening description is necessarily invoked to enable simultaneous solution of the governing equations. For all models, the same nonlinear thermoelastic energy potential incorporating elastic constants of up to order 3 is applied. Results: Solutions are compared for plate impact of highly symmetric orientations (all three methods) and low symmetry orientations (numerical methods only) of aluminum single crystals shocked to 5 GPa (weak shock regime) and 25 GPa (overdriven regime). Conclusions: For weak shocks, results of the two numerical methods are very similar, regardless of crystallographic orientation. For strong shocks, artificial viscosity affects the finite difference solution, and effects of transverse waves for the lower symmetry orientations not captured by the steady wave method become important. The analytical solution, which can only be applied to highly symmetric orientations, provides reasonable accuracy with regard to prediction of most variables in the final shocked state but, by construction, does not provide insight into the shock structure afforded by the numerical methods.
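For reference, the Rankine-Hugoniot jump conditions solved algebraically in the third method are, in standard shock-physics notation (shock speed U_s, particle velocity u_p, specific volume v = 1/rho):

```latex
\rho_0 U_s = \rho\,(U_s - u_p),
\qquad
P - P_0 = \rho_0 U_s u_p,
\qquad
E - E_0 = \tfrac{1}{2}\,(P + P_0)\,(v_0 - v)
```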
Real-Time Analytics for the Healthcare Industry: Arrhythmia Detection.
Agneeswaran, Vijay Srinivas; Mukherjee, Joydeb; Gupta, Ashutosh; Tonpay, Pranay; Tiwari, Jayati; Agarwal, Nitin
2013-09-01
It is time for the healthcare industry to move from the era of "analyzing our health history" to the age of "managing the future of our health." In this article, we illustrate the importance of real-time analytics across the healthcare industry by providing a generic mechanism to reengineer traditional analytics expressed in the R programming language into Storm-based real-time analytics code. This is a powerful abstraction, since most data scientists use R to write their analytics and are not clear on how to make them work in real time on high-velocity data. Our paper focuses on the applications necessary to a healthcare analytics scenario, specifically the importance of electrocardiogram (ECG) monitoring: a physician can use our framework to compare ECG reports by categorization and consequently detect arrhythmia. The framework reads ECG signals and uses a machine-learning-based categorizer running within a Storm environment to compare different ECG signals. The paper also presents performance studies of the framework to illustrate the throughput and accuracy trade-off in real-time analytics.
Microchip-Based Single-Cell Functional Proteomics for Biomedical Applications
Lu, Yao; Yang, Liu; Wei, Wei; Shi, Qihui
2017-01-01
Cellular heterogeneity has been widely recognized, but only recently have single-cell tools become available that allow characterizing heterogeneity at the genomic and proteomic levels. We review the technological advances in microchip-based toolkits for single-cell functional proteomics. Each of these tools has distinct advantages and limitations, and a few have advanced toward being applied to address biological or clinical problems that fail to be addressed by traditional population-based methods. High-throughput single-cell proteomic assays generate high-dimensional data sets that contain new information and thus require developing new analytical frameworks to extract new biology. In this review article, we highlight a few biological and clinical applications in which microchip-based single-cell proteomic tools provide unique advantages. The examples include resolving functional heterogeneity and dynamics of immune cells, dissecting cell-cell interactions by creating well-controlled on-chip microenvironments, capturing high-resolution snapshots of immune system functions in patients for better immunotherapy, and elucidating phosphoprotein signaling networks in cancer cells for guiding effective molecularly targeted therapies. PMID:28280819
Framework for assessing key variable dependencies in loose-abrasive grinding and polishing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Aikens, D.M.; Brown, N.J.
1995-12-01
This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.
Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework
ERIC Educational Resources Information Center
Ranjan, Jayanthi; Bhatnagar, Vishal
2011-01-01
Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…
RT-18: Value of Flexibility. Phase 1
2010-09-25
During this period, we explored the development of an analytical framework based on sound mathematical constructs. A review of the current state of the art showed that there is little unifying theory ... a framework that is mathematically consistent, domain independent, and applicable under varying information levels. This report presents our advances in ...
Sandplay therapy with couples within the framework of analytical psychology.
Albert, Susan Carol
2015-02-01
Sandplay therapy with couples is discussed within an analytical framework. Guidelines are proposed as a means of developing this relatively new area within sandplay therapy, and as a platform to open a wider discussion to bring together sandplay therapy and couple therapy. Examples of sand trays created during couple therapy are also presented to illustrate the transformations during the therapeutic process. © 2015, The Society of Analytical Psychology.
A comprehensive test of clinical reasoning for medical students: An olympiad experience in Iran.
Monajemi, Alireza; Arabshahi, Kamran Soltani; Soltani, Akbar; Arbabi, Farshid; Akbari, Roghieh; Custers, Eugene; Hadadgar, Arash; Hadizadeh, Fatemeh; Changiz, Tahereh; Adibi, Peyman
2012-01-01
Although some tests for clinical reasoning assessment are now available, the theories of medical expertise have not played a major role in this field. In this paper, illness script theory was chosen as a theoretical framework, and contemporary clinical reasoning tests were put together based on this theoretical model. This paper is a qualitative study performed with an action research approach. This style of research is performed in a context where authorities focus on promoting their organizations' performance and is carried out in the form of teamwork called participatory research. Results are presented in four parts: basic concepts, clinical reasoning assessment, test framework, and scoring. We concluded that no single test can thoroughly assess clinical reasoning competency, and therefore a battery of clinical reasoning tests is needed. This battery should cover all three parts of the clinical reasoning process: script activation, selection, and verification. In addition, both analytical and non-analytical reasoning, and both diagnostic and management reasoning, should be evenly taken into consideration in this battery. This paper explains the process of designing and implementing the battery of clinical reasoning tests in the Olympiad for medical sciences students through action research.
Analytic Frameworks for Assessing Dialogic Argumentation in Online Learning Environments
ERIC Educational Resources Information Center
Clark, Douglas B; Sampson, Victor; Weinberger, Armin; Erkens, Gijsbert
2007-01-01
Over the last decade, researchers have developed sophisticated online learning environments to support students engaging in dialogic argumentation. This review examines five categories of analytic frameworks for measuring participant interactions within these environments focusing on (1) formal argumentation structure, (2) conceptual quality, (3)…
Modelling vortex-induced fluid-structure interaction.
Benaroya, Haym; Gabbai, Rene D
2008-04-13
The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators: one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails formulating generalized equations of motion as a superset of the flow-oscillator models, and developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations that allows modelling of multiple-degree-of-freedom systems. The extensions derived generalize Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model; based on different assumptions, one can derive a variety of flow-oscillator models.
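A representative member of the flow-oscillator class discussed above couples a structural oscillator to a van der Pol wake variable. This generic form, common in wake-oscillator models of vortex-induced vibration, is shown for orientation only and is not the authors' specific derivation:

```latex
m\ddot{x} + c\dot{x} + kx = F\,q,
\qquad
\ddot{q} + \varepsilon\,\omega_s\,(q^{2} - 1)\,\dot{q} + \omega_s^{2}\,q = A\,\ddot{x}
```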
Preissl, Sebastian; Fang, Rongxin; Huang, Hui; Zhao, Yuan; Raviram, Ramya; Gorkin, David U; Zhang, Yanxiao; Sos, Brandon C; Afzal, Veena; Dickel, Diane E; Kuan, Samantha; Visel, Axel; Pennacchio, Len A; Zhang, Kun; Ren, Bing
2018-03-01
Analysis of chromatin accessibility can reveal transcriptional regulatory sequences, but heterogeneity of primary tissues poses a significant challenge in mapping the precise chromatin landscape in specific cell types. Here we report single-nucleus ATAC-seq, a combinatorial barcoding-assisted single-cell assay for transposase-accessible chromatin that is optimized for use on flash-frozen primary tissue samples. We apply this technique to the mouse forebrain through eight developmental stages. Through analysis of more than 15,000 nuclei, we identify 20 distinct cell populations corresponding to major neuronal and non-neuronal cell types. We further define cell-type-specific transcriptional regulatory sequences, infer potential master transcriptional regulators and delineate developmental changes in forebrain cellular composition. Our results provide insight into the molecular and cellular dynamics that underlie forebrain development in the mouse and establish technical and analytical frameworks that are broadly applicable to other heterogeneous tissues.
An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.
ERIC Educational Resources Information Center
Lee, Chung-Shing
2001-01-01
Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)
Reasoning across Ontologically Distinct Levels: Students' Understandings of Molecular Genetics
ERIC Educational Resources Information Center
Duncan, Ravit Golan; Reiser, Brian J.
2007-01-01
In this article we apply a novel analytical framework to explore students' difficulties in understanding molecular genetics--a domain that is particularly challenging to learn. Our analytical framework posits that reasoning in molecular genetics entails mapping across ontologically distinct levels--an information level containing the genetic…
The Illness Narratives of Health Managers: Developing an Analytical Framework
ERIC Educational Resources Information Center
Exworthy, Mark
2011-01-01
This paper examines the personal experience of illness and healthcare by health managers through their illness narratives. By synthesising a wider literature of illness narratives and health management, an analytical framework is presented, which considers the impact of illness narratives, comprising the logic of illness narratives, the actors…
A Data Protection Framework for Learning Analytics
ERIC Educational Resources Information Center
Cormack, Andrew
2016-01-01
Most studies on the use of digital student data adopt an ethical framework derived from human-subject research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on using learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses…
Durstewitz, Daniel
2017-06-01
The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however; rather, they were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable recovery of relevant aspects of the nonlinear dynamics underlying observed neuronal time series, and directly link these to computational properties.
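A schematic of the PLRNN state-space form (notation simplified; the paper's observation model for kernel-smoothed spike data differs in detail): latent states evolve under a piecewise-linear (ReLU) map with Gaussian noise and are linked linearly to the observations:

```latex
z_t = A z_{t-1} + W \max(z_{t-1}, 0) + h + \varepsilon_t,
\qquad \varepsilon_t \sim \mathcal{N}(0, \Sigma),
\qquad
x_t = B z_t + \eta_t,
\qquad \eta_t \sim \mathcal{N}(0, \Gamma)
```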
Thermostating extended Lagrangian Born-Oppenheimer molecular dynamics.
Martínez, Enrique; Cawkwell, Marc J; Voter, Arthur F; Niklasson, Anders M N
2015-04-21
Extended Lagrangian Born-Oppenheimer molecular dynamics is developed and analyzed for applications in canonical (NVT) simulations. Three different approaches are considered: the Nosé and Andersen thermostats and Langevin dynamics. We have tested the temperature distribution under different conditions of self-consistent field (SCF) convergence and time step and compared the results to analytical predictions. We find that the simulations based on the extended Lagrangian Born-Oppenheimer framework provide accurate canonical distributions even under approximate SCF convergence, often requiring only a single diagonalization per time step, whereas regular Born-Oppenheimer formulations exhibit unphysical fluctuations unless a sufficiently high degree of convergence is reached at each time step. The thermostated extended Lagrangian framework thus offers an accurate approach to sample processes in the canonical ensemble at a fraction of the computational cost of regular Born-Oppenheimer molecular dynamics simulations.
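Of the three thermostats considered, the Langevin approach is the simplest to state. In its standard form, a friction term and a fluctuating force obeying the fluctuation-dissipation relation keep the sampled distribution canonical at temperature T:

```latex
M\ddot{R} = -\nabla U(R) - \gamma M \dot{R} + \sqrt{2\gamma k_B T M}\;\xi(t),
\qquad
\langle \xi(t)\,\xi(t') \rangle = \delta(t - t')
```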
Assessing Proposals for New Global Health Treaties: An Analytic Framework.
Hoffman, Steven J; Røttingen, John-Arne; Frenk, Julio
2015-08-01
We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties.
Edwards, Jeffrey R; Lambert, Lisa Schurer
2007-03-01
Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
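A schematic of the combined framework (illustrative notation, not the authors' exact equations): a moderator Z enters both the first-stage (X to M) and second-stage (M to Y) paths, so the indirect effect of X on Y becomes a function of Z:

```latex
M = a_0 + a_1 X + a_2 Z + a_3 XZ + e_M,
\qquad
Y = b_0 + b_1 M + b_2 X + b_3 Z + b_4 MZ + b_5 XZ + e_Y,
```

with conditional indirect effect at moderator value Z given by omega(Z) = (a_1 + a_3 Z)(b_1 + b_4 Z), which reduces to the familiar unmoderated product a_1 b_1 when a_3 = b_4 = 0.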
Roberts, James H.; Anderson, Gregory B.; Angermeier, Paul
2016-01-01
Projects to assess environmental impact or restoration success in rivers focus on project-specific questions but can also provide valuable insights for future projects. Both restoration actions and impact assessments can become “adaptive” by using the knowledge gained from long-term monitoring and analysis to revise the actions, monitoring, conceptual model, or interpretation of findings so that subsequent actions or assessments are better informed. Assessments of impact or restoration success are especially challenging when the indicators of interest are imperiled species and/or the impacts being addressed are complex. From 1997 to 2015, we worked closely with two federal agencies to monitor habitat availability for and population density of Roanoke logperch (Percina rex), an endangered fish, in a 24-km-long segment of the upper Roanoke River, VA. We primarily used a Before-After-Control-Impact analytical framework to assess potential impacts of a river channelization project on the P. rex population. In this paper, we summarize how our extensive monitoring facilitated the evolution of our (a) conceptual understanding of the ecosystem and fish population dynamics; (b) choices of ecological indicators and analytical tools; and (c) conclusions regarding the magnitude, mechanisms, and significance of observed impacts. Our experience with this case study taught us important lessons about how to adaptively develop and conduct a monitoring program, which we believe are broadly applicable to assessments of environmental impact and restoration success in other rivers. In particular, we learned that (a) pre-treatment planning can enhance monitoring effectiveness, help avoid unforeseen pitfalls, and lead to more robust conclusions; (b) developing adaptable conceptual and analytical models early was crucial to organizing our knowledge, guiding our study design, and analyzing our data; (c) catchment-wide processes that we did not monitor, or initially consider, had profound implications for interpreting our findings; and (d) using multiple analytical frameworks, with varying assumptions, led to clearer interpretation of findings than the use of a single framework alone. Broader integration of these guiding principles into monitoring studies, though potentially challenging, could lead to more scientifically defensible assessments of project effects.
Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention
ERIC Educational Resources Information Center
West, Deborah; Heath, David; Huijser, Henk
2016-01-01
This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…
ERIC Educational Resources Information Center
Hartnell, Chad A.; Ou, Amy Yi; Kinicki, Angelo
2011-01-01
We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial…
ERIC Educational Resources Information Center
Šulíková, Jana
2016-01-01
Purpose: This article proposes an analytical framework that helps to identify and challenge misconceptions of ethnocentrism found in pre-tertiary teaching resources for history and the social sciences in numerous countries. Design: Drawing on nationalism studies, the analytical framework employs ideas known under the umbrella terms of…
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Calculation of the time resolution of the J-PET tomograph using kernel density estimation
NASA Astrophysics Data System (ADS)
Raczyński, L.; Wiślicki, W.; Krzemień, W.; Kowalski, P.; Alfs, D.; Bednarski, T.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gorgol, M.; Hiesmayr, B.; Jasińska, B.; Kamińska, D.; Korcyl, G.; Kozik, T.; Krawczyk, N.; Kubicz, E.; Mohammed, M.; Pawlik-Niedźwiecka, M.; Niedźwiecki, S.; Pałka, M.; Rudy, Z.; Rundel, O.; Sharma, N. G.; Silarski, M.; Smyrski, J.; Strzelecki, A.; Wieczorek, A.; Zgardzińska, B.; Zieliński, M.; Moskal, P.
2017-06-01
In this paper we estimate the time resolution of the J-PET scanner built from plastic scintillators. We incorporate the method of signal processing using the Tikhonov regularization framework and the kernel density estimation method. We obtain simple, closed-form analytical formulae for time resolution. The proposed method is validated using signals registered by means of the single detection unit of the J-PET tomograph built from a 30 cm long plastic scintillator strip. It is shown that the experimental and theoretical results obtained for the J-PET scanner equipped with vacuum tube photomultipliers are consistent.
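The kernel density estimate underlying the closed-form time-resolution formulae takes the standard form (Gaussian kernel shown for concreteness; the bandwidth h and sample notation are generic, not the paper's exact symbols):

```latex
\hat{f}_h(t) = \frac{1}{n h} \sum_{i=1}^{n} K\!\left(\frac{t - t_i}{h}\right),
\qquad
K(u) = \frac{1}{\sqrt{2\pi}}\, e^{-u^{2}/2}
```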
Methods for detection of GMOs in food and feed.
Marmiroli, Nelson; Maestri, Elena; Gullì, Mariolina; Malcevschi, Alessio; Peano, Clelia; Bordoni, Roberta; De Bellis, Gianluca
2008-10-01
This paper reviews aspects relevant to detection and quantification of genetically modified (GM) material within the feed/food chain. The GM crop regulatory framework at the international level is evaluated with reference to traceability and labelling. Current analytical methods for the detection, identification, and quantification of transgenic DNA in food and feed are reviewed. These methods include quantitative real-time PCR, multiplex PCR, and multiplex real-time PCR. Particular attention is paid to methods able to identify multiple GM events in a single reaction and to the development of microdevices and microsensors, though they have not been fully validated for application.
Big data and high-performance analytics in structural health monitoring for bridge management
NASA Astrophysics Data System (ADS)
Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed
2016-04-01
Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of data sets is made possible by four technologies: cloud computing, relational database processing, support from NOSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature, although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up (scoping, analysis, and learning), identifies the key functions, and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
Lin, Shangchao; Zhang, Jingqing; Strano, Michael S; Blankschtein, Daniel
2014-08-28
Macromolecular scaffolds made of polymer-wrapped single-walled carbon nanotubes (SWCNTs) have been explored recently (Zhang et al., Nature Nanotechnology, 2013) as a new class of molecular-recognition motifs. However, selective analyte recognition is still challenging and lacks the underlying fundamental understanding needed for its practical implementation in biological sensors. In this report, we combine coarse-grained molecular dynamics (CGMD) simulations, physical adsorption/binding theories, and photoluminescence (PL) experiments to provide molecular insight into the selectivity of such sensors towards a large set of biologically important analytes. We find that the physical binding affinities of the analytes on a bare SWCNT partially correlate with their distribution coefficients in a bulk water/octanol system, suggesting that the analyte hydrophobicity plays a key role in determining the binding affinities of the analytes considered, along with the various specific interactions between the analytes and the polymer anchor groups. Two distinct categories of analytes are identified to demonstrate a complex picture for the correlation between optical sensor signals and the simulated binding affinities. Specifically, a good correlation was found between the sensor signals and the physical binding affinities of the three hormones (estradiol, melatonin, and thyroxine), the neurotransmitter (dopamine), and the vitamin (riboflavin) to the SWCNT-polymer scaffold. The four amino acids (aspartate, glycine, histidine, and tryptophan) and the two monosaccharides (fructose and glucose) considered were identified as blank analytes which are unable to induce sensor signals. The results indicate great success of our physical adsorption-based model in explaining the ranking in sensor selectivities. The combined framework presented here can be used to screen and select polymers that can potentially be used for creating synthetic molecular recognition motifs.
Expanding Students' Analytical Frameworks through the Study of Graphic Novels
ERIC Educational Resources Information Center
Connors, Sean P.
2015-01-01
When teachers work with students to construct a metalanguage that they can draw on to describe and analyze graphic novels, and then invite students to apply that metalanguage in the service of composing multimodal texts of their own, teachers broaden students' analytical frameworks. In the process of doing so, teachers empower students. In this…
ERIC Educational Resources Information Center
OECD Publishing, 2017
2017-01-01
What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…
Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.
Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie
2017-12-01
Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.
Saunders, Christina T; Blume, Jeffrey D
2017-10-26
Mediation analysis explores the degree to which an exposure's effect on an outcome is diverted through a mediating variable. We describe a classical regression framework for conducting mediation analyses in which estimates of causal mediation effects and their variance are obtained from the fit of a single regression model. The vector of changes in exposure pathway coefficients, which we named the essential mediation components (EMCs), is used to estimate standard causal mediation effects. Because these effects are often simple functions of the EMCs, an analytical expression for their model-based variance follows directly. Given this formula, it is instructive to revisit the performance of routinely used variance approximations (e.g., delta method and resampling methods). Requiring the fit of only one model reduces the computation time required for complex mediation analyses and permits the use of a rich suite of regression tools that are not easily implemented on a system of three equations, as would be required in the Baron-Kenny framework. Using data from the BRAIN-ICU study, we provide examples to illustrate the advantages of this framework and compare it with the existing approaches. © The Author 2017. Published by Oxford University Press.
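For comparison with the single-model EMC approach described above, the sketch below computes the classical two-model product-of-coefficients estimate and its first-order (Sobel/delta-method) variance, one of the routinely used approximations the authors revisit. The simulated data, coefficient values, and variable names are purely illustrative.

```python
# Classical two-equation mediation: X -> M (coefficient a) and M -> Y given X
# (coefficient b); the indirect effect is a*b, with a first-order delta-method
# variance. This is the comparator, not the paper's single-model EMC fit.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                       # exposure
m = 0.5 * x + rng.normal(size=n)             # mediator
y = 0.3 * m + 0.2 * x + rng.normal(size=n)   # outcome

fit_m = sm.OLS(m, sm.add_constant(x)).fit()
fit_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()

a, var_a = fit_m.params[1], fit_m.cov_params()[1, 1]
b, var_b = fit_y.params[2], fit_y.cov_params()[2, 2]

indirect = a * b
var_ind = b**2 * var_a + a**2 * var_b   # Sobel (first-order delta method)
print(f"indirect effect = {indirect:.3f} (SE {np.sqrt(var_ind):.3f})")
```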
The dynamics of adapting, unregulated populations and a modified fundamental theorem.
O'Dwyer, James P
2013-01-06
A population in a novel environment will accumulate adaptive mutations over time, and the dynamics of this process depend on the underlying fitness landscape: the fitness of and mutational distance between possible genotypes in the population. Despite its fundamental importance for understanding the evolution of a population, inferring this landscape from empirical data has been problematic. We develop a theoretical framework to describe the adaptation of a stochastic, asexual, unregulated, polymorphic population undergoing beneficial, neutral and deleterious mutations on a correlated fitness landscape. We generate quantitative predictions for the change in the mean fitness and within-population variance in fitness over time, and find a simple, analytical relationship between the distribution of fitness effects arising from a single mutation, and the change in mean population fitness over time: a variant of Fisher's 'fundamental theorem' which explicitly depends on the form of the landscape. Our framework can therefore be thought of in three ways: (i) as a set of theoretical predictions for adaptation in an exponentially growing phase, with applications in pathogen populations, tumours or other unregulated populations; (ii) as an analytically tractable problem to potentially guide theoretical analysis of regulated populations; and (iii) as a basis for developing empirical methods to infer general features of a fitness landscape.
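A schematic statement of the kind of result described in the abstract, shown only to fix ideas: Fisher's classical variance term plus a mutational-input term, where r-bar is mean (Malthusian) fitness, U the mutation rate, and rho(s) the distribution of fitness effects of single mutations. The paper's exact form depends on the landscape correlations and is not reproduced here.

```latex
\frac{d\bar{r}}{dt}
= \underbrace{\operatorname{Var}(r)}_{\text{Fisher's theorem}}
+ \underbrace{U \int s\,\rho(s)\,ds}_{\text{mutational input}}
```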
Kim, SungHwan; Lin, Chien-Wei; Tseng, George C
2016-07-01
Supervised machine learning is widely applied to transcriptomic data to predict disease diagnosis, prognosis or survival. Robust and interpretable classifiers with high accuracy are usually favored for their clinical and translational potential. The top scoring pair (TSP) algorithm is an example that applies a simple rank-based algorithm to identify rank-altered gene pairs for classifier construction. Although many classification methods perform well in cross-validation of a single expression profile, performance usually drops greatly in cross-study validation (i.e. when the prediction model is established in the training study and applied to an independent test study) for all machine learning methods, including TSP. The failure of cross-study validation has largely diminished the potential translational and clinical value of the models. The purpose of this article is to develop a meta-analytic top scoring pair (MetaKTSP) framework that combines multiple transcriptomic studies and generates a robust prediction model applicable to independent test studies. We proposed two frameworks, by averaging TSP scores or by combining P-values from individual studies, to select the top gene pairs for model construction. We applied the proposed methods in simulated data sets and three large-scale real applications in breast cancer, idiopathic pulmonary fibrosis and pan-cancer methylation. The results showed superior cross-study validation accuracy and biomarker selection for the new meta-analytic framework. In conclusion, combining multiple omics data sets in the public domain increases the robustness and accuracy of the classification model, which will ultimately improve disease understanding and clinical treatment decisions to benefit patients. An R package MetaKTSP is available online (http://tsenglab.biostat.pitt.edu/software.htm). ctseng@pitt.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
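A minimal sketch of the rank-based TSP score and the score-averaging flavor of meta-analytic pair selection described above. Data shapes, names, and the toy data are illustrative; the published MetaKTSP R package implements the actual framework.

```python
# TSP score for a gene pair (i, j) in one study, then averaged over studies.
import numpy as np
from itertools import combinations

def tsp_score(expr, labels, i, j):
    """|P(X_i < X_j | class 0) - P(X_i < X_j | class 1)| for one study.
    expr: genes x samples array; labels: 0/1 per sample."""
    lt = expr[i] < expr[j]
    return abs(lt[labels == 0].mean() - lt[labels == 1].mean())

def meta_tsp(studies, n_pairs=5):
    """Average TSP scores over studies; return the top-scoring pairs."""
    n_genes = studies[0][0].shape[0]
    scored = []
    for i, j in combinations(range(n_genes), 2):
        mean_score = np.mean([tsp_score(e, y, i, j) for e, y in studies])
        scored.append((mean_score, i, j))
    return sorted(scored, reverse=True)[:n_pairs]

# Toy use: three studies, each a (genes x samples) matrix with 0/1 labels,
# where the pair (3, 7) is made informative by construction.
rng = np.random.default_rng(2)
studies = []
for _ in range(3):
    e = rng.normal(size=(20, 40))
    y = np.repeat([0, 1], 20)
    e[3, y == 1] += 1.5
    e[7, y == 1] -= 1.5
    studies.append((e, y))
print(meta_tsp(studies)[0])   # expected to recover the (3, 7) pair
```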
Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A
2015-04-01
Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to stimulate population health is increasing. A possibly successful strategy is population management (PM). PM strives to address the health needs of the population at risk and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The population health guide of the Care Continuum Alliance (CCA), which recently changed its name to the Population Health Alliance (PHA), provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed specifically for PM and describes the core elements of the PM concept in six successive, interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. Quantitative methods are refined, and we operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Fock space, symbolic algebra, and analytical solutions for small stochastic systems.
Santos, Fernando A N; Gadêlha, Hermes; Gaffney, Eamonn A
2015-12-01
Randomness is ubiquitous in nature. From single-molecule biochemical reactions to macroscale biological systems, stochasticity permeates individual interactions and often regulates emergent properties of the system. While such systems are regularly studied from a modeling viewpoint using stochastic simulation algorithms, numerous potential analytical tools can be inherited from statistical and quantum physics, replacing randomness due to quantum fluctuations with low-copy-number stochasticity. Nevertheless, classical studies remained limited to the abstract level, demonstrating a more general applicability and equivalence between systems in physics and biology rather than exploiting the physics tools to study biological systems. Here the Fock space representation, used in quantum mechanics, is combined with the symbolic algebra of creation and annihilation operators to consider explicit solutions for the chemical master equations describing small, well-mixed, biochemical, or biological systems. This is illustrated with an exact solution for a Michaelis-Menten single enzyme interacting with limited substrate, including a consideration of very short time scales, which emphasizes when stiffness is present even for small copy numbers. Furthermore, we present a general matrix representation for Michaelis-Menten kinetics with an arbitrary number of enzymes and substrates that, following diagonalization, leads to the solution of this ubiquitous, nonlinear enzyme kinetics problem. For this, a flexible symbolic Maple code is provided, demonstrating the prospective advantages of this framework compared to stochastic simulation algorithms. This further highlights the possibilities for analytically based studies of stochastic systems in biology and chemistry using tools from theoretical quantum physics.
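To make the matrix-representation idea concrete, the following is a minimal sketch (not the paper's Maple code) of the chemical master equation for a single enzyme with a limited substrate pool, solved by matrix exponentiation; rate constants and copy numbers are illustrative assumptions.

    # Single-enzyme Michaelis-Menten CME: states (s, c) with s free substrate
    # copies and c = 1 if the enzyme is bound. Rates k1, km1, k2 are assumed.
    import numpy as np
    from scipy.linalg import expm

    S0 = 5                        # initial substrate copies
    k1, km1, k2 = 1.0, 0.5, 0.3   # binding, unbinding, catalysis rates

    states = [(s, c) for s in range(S0 + 1) for c in (0, 1) if s + c <= S0]
    index = {st: i for i, st in enumerate(states)}

    Q = np.zeros((len(states), len(states)))   # generator, Q[j, i] = rate i -> j
    for (s, c), i in index.items():
        if s > 0 and c == 0:                   # E + S -> C
            Q[index[(s - 1, 1)], i] += k1 * s
        if c == 1:
            Q[index[(s + 1, 0)], i] += km1     # C -> E + S
            Q[index[(s, 0)], i] += k2          # C -> E + P
    Q -= np.diag(Q.sum(axis=0))                # diagonal conserves probability

    p0 = np.zeros(len(states)); p0[index[(S0, 0)]] = 1.0
    for t in (0.1, 1.0, 10.0):
        p = expm(Q * t) @ p0                   # exact distribution at time t
        mean_product = sum(p[i] * (S0 - s - c) for (s, c), i in index.items())
        print(f"t = {t:5.1f}  mean product copies = {mean_product:.3f}")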
ERIC Educational Resources Information Center
Zheng, Gaoming; Cai, Yuzhuo; Ma, Shaozhuang
2017-01-01
This paper intends to construct an analytical framework for understanding quality assurance in international joint programmes and to test it in a case analysis of a European--Chinese joint doctoral degree programme. The development of a quality assurance system for an international joint programme is understood as an institutionalization process…
The Framework of Intervention Engine Based on Learning Analytics
ERIC Educational Resources Information Center
Sahin, Muhittin; Yurdugül, Halil
2017-01-01
Learning analytics primarily deals with the optimization of learning environments and the ultimate goal of learning analytics is to improve learning and teaching efficiency. Studies on learning analytics seem to have been made in the form of adaptation engine and intervention engine. Adaptation engine studies are quite widespread, but intervention…
NASA Astrophysics Data System (ADS)
Sakellariou, J. S.; Fassois, S. D.
2017-01-01
The identification of a single global model for a stochastic dynamical system operating under various conditions is considered. Each operating condition is assumed to have a pseudo-static effect on the dynamics and be characterized by a single measurable scheduling variable. Identification is accomplished within a recently introduced Functionally Pooled (FP) framework, which offers a number of advantages over Linear Parameter Varying (LPV) identification techniques. The focus of the work is on the extension of the framework to include the important FP-ARMAX model case. Compared to their simpler FP-ARX counterparts, FP-ARMAX models are much more general and offer improved flexibility in describing various types of stochastic noise, but at the same time lead to a more complicated, non-quadratic, estimation problem. Prediction Error (PE), Maximum Likelihood (ML), and multi-stage estimation methods are postulated, and the PE estimator optimality, in terms of consistency and asymptotic efficiency, is analytically established. The postulated estimators are numerically assessed via Monte Carlo experiments, while the effectiveness of the approach and its superiority over its FP-ARX counterpart are demonstrated via an application case study pertaining to simulated railway vehicle suspension dynamics under various mass loading conditions.
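As a rough illustration of the functional pooling idea, the sketch below fits the simpler FP-ARX counterpart mentioned above: AR coefficients are modelled as polynomials in the scheduling variable k, and data from all operating conditions enter one pooled least-squares problem. Model orders, coefficient functions, and noise level are illustrative assumptions, not those of the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    ks = np.linspace(0.2, 1.0, 8)        # scheduling variable per operating condition
    T, na, basis_order = 500, 2, 2       # samples, AR order, polynomial basis order

    def true_a(k):                       # assumed coefficient functions a_i(k)
        return np.array([1.2 - 0.5 * k, -0.6 + 0.3 * k**2])

    rows, targets = [], []
    for k in ks:
        y = np.zeros(T)
        for t in range(na, T):
            y[t] = true_a(k) @ y[t - na:t][::-1] + 0.1 * rng.standard_normal()
        for t in range(na, T):
            # pooled regressors: y[t-i] * k**j for each lag i and basis power j
            rows.append([y[t - i] * k**j
                         for i in range(1, na + 1) for j in range(basis_order + 1)])
            targets.append(y[t])

    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    print(theta.reshape(na, basis_order + 1))   # row i ~ [a_i0, a_i1, a_i2]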
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Thao; Luscher, D. J.; Wilkerson, J. W.
We developed a framework for dislocation-based viscoplasticity and dynamic ductile failure to model high strain rate deformation and damage in single crystals. The rate-dependence of the crystal plasticity formulation is based on the physics of relativistic dislocation kinetics suited for extremely high strain rates. The damage evolution is based on the dynamics of void growth, which are governed by both micro-inertia as well as dislocation kinetics and dislocation substructure evolution. Furthermore, an averaging scheme is proposed in order to approximate the evolution of the dislocation substructure at the macroscale as well as its spatial distribution at the microscale. In addition, a concept of a single equivalent dislocation density that effectively captures the collective influence of dislocation density on all active slip systems is proposed here. Together, these concepts and approximations enable the use of semi-analytic solutions for void growth dynamics developed in [J. Wilkerson and K. Ramesh. A dynamic void growth model governed by dislocation kinetics. J. Mech. Phys. Solids, 70:262–280, 2014.], which greatly reduce the computational overhead that would otherwise be required. The resulting homogenized framework has been implemented into a commercially available finite element package, and a validation study against a suite of direct numerical simulations was carried out.
Integrated nanoplasmonic quantum interfaces for room-temperature single-photon sources
NASA Astrophysics Data System (ADS)
Peyskens, Frédéric; Englund, Dirk; Chang, Darrick
2017-12-01
We describe a general analytical framework of a nanoplasmonic cavity-emitter system interacting with a dielectric photonic waveguide. Taking into account emitter quenching and dephasing, our model directly reveals the single-photon extraction efficiency η as well as the indistinguishability I of photons coupled into the waveguide mode. Rather than minimizing the cavity modal volume, our analysis predicts an optimum modal volume to maximize η that balances waveguide coupling and spontaneous emission rate enhancement. Surprisingly, our model predicts that near-unity indistinguishability is possible, but this requires a much smaller modal volume, implying a fundamental performance trade-off between high η and I at room temperature. Finally, we show that maximizing ηI requires driving the system in the weak coupling regime, because quenching effects and decreased waveguide coupling drastically reduce η in the strong coupling regime.
A Multivariate Distance-Based Analytic Framework for Connectome-Wide Association Studies
Shehzad, Zarrar; Kelly, Clare; Reiss, Philip T.; Craddock, R. Cameron; Emerson, John W.; McMahon, Katie; Copland, David A.; Castellanos, F. Xavier; Milham, Michael P.
2014-01-01
The identification of phenotypic associations in high-dimensional brain connectivity data represents the next frontier in the neuroimaging connectomics era. Exploration of brain-phenotype relationships remains limited by statistical approaches that are computationally intensive, depend on a priori hypotheses, or require stringent correction for multiple comparisons. Here, we propose a computationally efficient, data-driven technique for connectome-wide association studies (CWAS) that provides a comprehensive voxel-wise survey of brain-behavior relationships across the connectome; the approach identifies voxels whose whole-brain connectivity patterns vary significantly with a phenotypic variable. Using resting state fMRI data, we demonstrate the utility of our analytic framework by identifying significant connectivity-phenotype relationships for full-scale IQ and assessing their overlap with existent neuroimaging findings, as synthesized by openly available automated meta-analysis (www.neurosynth.org). The results appeared to be robust to the removal of nuisance covariates (i.e., mean connectivity, global signal, and motion) and varying brain resolution (i.e., voxelwise results are highly similar to results using 800 parcellations). We show that CWAS findings can be used to guide subsequent seed-based correlation analyses. Finally, we demonstrate the applicability of the approach by examining CWAS for three additional datasets, each encompassing a distinct phenotypic variable: neurotypical development, Attention-Deficit/Hyperactivity Disorder diagnostic status, and L-dopa pharmacological manipulation. For each phenotype, our approach to CWAS identified distinct connectome-wide association profiles, not previously attainable in a single study utilizing traditional univariate approaches. As a computationally efficient, extensible, and scalable method, our CWAS framework can accelerate the discovery of brain-behavior relationships in the connectome. PMID:24583255
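The voxel-wise test can be pictured with the following hedged sketch of multivariate distance matrix regression (MDMR), the machinery typically underlying such distance-based CWAS: pairwise distances between subjects' connectivity maps are related to a phenotype through a pseudo-F statistic with a permutation p-value. Data here are synthetic.

    import numpy as np

    rng = np.random.default_rng(1)
    n, v = 40, 200                          # subjects, connectivity features
    pheno = rng.standard_normal(n)          # phenotype, e.g. standardized IQ
    maps = rng.standard_normal((n, v)) + 0.3 * pheno[:, None]   # per-subject maps

    # Gower-centered inner-product matrix from pairwise Euclidean distances
    D = np.linalg.norm(maps[:, None, :] - maps[None, :, :], axis=-1)
    J = np.eye(n) - np.ones((n, n)) / n
    G = J @ (-0.5 * D**2) @ J

    X = np.column_stack([np.ones(n), pheno])    # design matrix
    m = X.shape[1]

    def pseudo_F(Xd):
        Hd = Xd @ np.linalg.solve(Xd.T @ Xd, Xd.T)   # hat matrix
        R = np.eye(n) - Hd
        return (np.trace(Hd @ G @ Hd) / (m - 1)) / (np.trace(R @ G @ R) / (n - m))

    F = pseudo_F(X)
    perm = [pseudo_F(np.column_stack([np.ones(n), rng.permutation(pheno)]))
            for _ in range(999)]
    p_val = (1 + sum(f >= F for f in perm)) / (1 + len(perm))
    print(f"pseudo-F = {F:.2f}, permutation p = {p_val:.3f}")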
Sheldon, Michael R
2016-01-01
Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.
ERIC Educational Resources Information Center
Golden, Mark
This report briefly describes the procedures for assessing children's psychological development and the data analytic framework used in the New York City Infant Day Care Study. This study is a 5-year, longitudinal investigation in which infants in group and family day care programs and infants reared at home are compared. Children in the study are…
Value of Flexibility - Phase 1
2010-09-25
weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs. ... A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to ... research activities is in developing a coherent value-based definition of flexibility that is based on an analytical framework that is mathematically ...
A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions
Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.
2009-01-01
Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
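For contrast, this is a minimal sketch of the conventional cross-product procedure the authors argue is too narrow: regress the outcome on the predictor, the moderator, and their product, then test the product term. Data and coefficients are synthetic.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 300
    x = rng.standard_normal(n)                      # predictor
    m = rng.standard_normal(n)                      # putative moderator
    y = 0.5 * x + 0.2 * m + 0.4 * x * m + rng.standard_normal(n)

    X = sm.add_constant(np.column_stack([x, m, x * m]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)       # [const, x, m, x:m]
    print(fit.pvalues[3])   # significance test of the interaction term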
Integrating count and detection–nondetection data to model population dynamics
Zipkin, Elise F.; Rossman, Sam; Yackulic, Charles B.; Wiens, David; Thorson, James T.; Davis, Raymond J.; Grant, Evan H. Campbell
2017-01-01
There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture–recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection–nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection–nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection–nondetection data (1995–2014) with newly collected count data (2015–2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance.
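A minimal static sketch of the key linkage (not the authors' full dynamic model): both data types share one site-level abundance N and one individual detection probability p, with counts distributed Binomial(N, p), detection-nondetection distributed Bernoulli(1 - (1 - p)^N), and N marginalized over a truncated Poisson prior.

    import numpy as np
    from scipy.stats import poisson, binom

    def joint_loglik(lam, p, counts, detections, Nmax=100):
        N = np.arange(Nmax + 1)
        like = poisson.pmf(N, lam)               # prior over true abundance N
        for y in counts:                         # repeated count surveys
            like = like * binom.pmf(y, N, p)
        p_star = 1.0 - (1.0 - p) ** N            # P(detect >= 1 individual | N)
        for w in detections:                     # repeated detect/nondetect surveys
            like = like * (p_star if w else 1.0 - p_star)
        return np.log(like.sum())                # marginalize shared N once

    # evaluate at illustrative parameter values for one site
    print(joint_loglik(lam=4.0, p=0.3, counts=[2, 1, 3], detections=[1, 1, 0]))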
Global dynamic optimization approach to predict activation in metabolic pathways.
de Hijas-Liste, Gundián M; Klipp, Edda; Balsa-Canto, Eva; Banga, Julio R
2014-01-06
During the last decade, a number of authors have shown that the genetic regulation of metabolic networks may follow optimality principles. Optimal control theory has been successfully used to compute optimal enzyme profiles considering simple metabolic pathways. However, applying this optimal control framework to more general networks (e.g. branched networks, or networks incorporating enzyme production dynamics) yields problems that are analytically intractable and/or numerically very challenging. Further, these previous studies have only considered a single-objective framework. In this work we consider a more general multi-objective formulation and we present solutions based on recent developments in global dynamic optimization techniques. We illustrate the performance and capabilities of these techniques considering two sets of problems. First, we consider a set of single-objective examples of increasing complexity taken from the recent literature. We analyze the multimodal character of the associated nonlinear optimization problems, and we also evaluate different global optimization approaches in terms of numerical robustness, efficiency and scalability. Second, we consider generalized multi-objective formulations for several examples, and we show how this framework results in more biologically meaningful results. The proposed strategy was used to solve a set of single-objective case studies related to unbranched and branched metabolic networks of different levels of complexity. All problems were successfully solved in reasonable computation times with our global dynamic optimization approach, reaching solutions which were comparable to or better than those reported in previous literature. Further, we considered, for the first time, multi-objective formulations, illustrating how activation in metabolic pathways can be explained in terms of the best trade-offs between conflicting objectives. This new methodology can be applied to metabolic networks with arbitrary topologies, nonlinear dynamics and constraints.
An analytical procedure to assist decision-making in a government research organization
H. Dean Claxton; Giuseppe Rensi
1972-01-01
An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...
Optimization of single-base-pair mismatch discrimination in oligonucleotide microarrays
NASA Technical Reports Server (NTRS)
Urakawa, Hidetoshi; El Fantroussi, Said; Smidt, Hauke; Smoot, James C.; Tribou, Erik H.; Kelly, John J.; Noble, Peter A.; Stahl, David A.
2003-01-01
The discrimination between perfect-match and single-base-pair-mismatched nucleic acid duplexes was investigated by using oligonucleotide DNA microarrays and nonequilibrium dissociation rates (melting profiles). DNA and RNA versions of two synthetic targets corresponding to the 16S rRNA sequences of Staphylococcus epidermidis (38 nucleotides) and Nitrosomonas eutropha (39 nucleotides) were hybridized to perfect-match probes (18-mer and 19-mer) and to a set of probes having all possible single-base-pair mismatches. The melting profiles of all probe-target duplexes were determined in parallel by using an imposed temperature step gradient. We derived an optimum wash temperature for each probe and target by using a simple formula to calculate a discrimination index for each temperature of the step gradient. This optimum corresponded to the output of an independent analysis using a customized neural network program. These results together provide an experimental and analytical framework for optimizing mismatch discrimination among all probes on a DNA microarray.
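A hedged reconstruction of the screening logic: given perfect-match (PM) and mismatch (MM) intensities recorded at each step of the wash-temperature gradient, compute a discrimination index per temperature and pick the maximizing temperature. The index formula and the synthetic melting profiles are assumptions; the paper's exact formula may differ.

    import numpy as np

    temps = np.arange(20.0, 62.0, 2.0)               # wash temperatures (C)
    pm = np.exp(-0.5 * ((temps - 20) / 25) ** 2)     # synthetic PM melting profile
    mm = np.exp(-0.5 * ((temps - 20) / 15) ** 2)     # MM duplex melts earlier

    d_index = (pm - mm) / pm           # assumed index: fractional PM/MM separation
    t_opt = temps[np.argmax(d_index * pm)]   # weight by remaining PM signal
    print("optimal wash temperature:", t_opt, "C")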
Rizvi, Abbas H.; Camara, Pablo G.; Kandror, Elena K.; Roberts, Thomas J.; Schieren, Ira; Maniatis, Tom; Rabadan, Raul
2017-01-01
Transcriptional programs control cellular lineage commitment and differentiation during development. Understanding cell fate has been advanced by studying single-cell RNA-seq, but is limited by the assumptions of current analytic methods regarding the structure of data. We present single-cell topological data analysis (scTDA), an algorithm for topology-based computational analyses to study temporal, unbiased transcriptional regulation. Compared to other methods, scTDA is a non-linear, model-independent, unsupervised statistical framework that can characterize transient cellular states. We applied scTDA to the analysis of murine embryonic stem cell (mESC) differentiation in vitro in response to inducers of motor neuron differentiation. scTDA resolved asynchrony and continuity in cellular identity over time, and identified four transient states (pluripotent, precursor, progenitor, and fully differentiated cells) based on changes in stage-dependent combinations of transcription factors, RNA-binding proteins and long non-coding RNAs. scTDA can be applied to study asynchronous cellular responses to either developmental cues or environmental perturbations. PMID:28459448
DOE Office of Scientific and Technical Information (OSTI.GOV)
Youn, H; Jeon, H; Nam, J
Purpose: To investigate the feasibility of an analytic framework to estimate patients' absorbed dose distribution owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. We then calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer's law. In sequence, all voxels constituting the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicates that our framework can be an effective monitor of a patient's exposure owing to cone-beam CT scans for image-guided radiation treatment. We therefore expect that patient over-exposure during IGRT might be prevented by our framework.
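A much-simplified sketch of the primary-dose step as described: attenuation coefficients are deduced from HU values (here assumed as mu = mu_water * (1 + HU/1000)), the beam is attenuated voxel by voxel along a ray via Beer's law, and the energy removed in a voxel is taken as absorbed there. Spectrum, bowtie filtration, and scatter are omitted; all numbers are illustrative.

    import numpy as np

    mu_water = 0.02      # 1/mm, assumed effective value at CBCT energies
    voxel = 1.0          # mm path length through each voxel
    hu_ray = np.array([0, 0, 50, 300, 300, -700, 0, 0])   # HU along one ray

    mu = mu_water * (1.0 + hu_ray / 1000.0)
    # fluence entering voxel i: attenuation by all voxels before it
    fluence_in = np.exp(-voxel * np.concatenate(([0.0], np.cumsum(mu)))[:-1])
    absorbed = fluence_in * (1.0 - np.exp(-mu * voxel))   # fraction stopped per voxel
    print(np.round(absorbed, 4))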
Putting climate impact estimates to work: the empirical approach of the American Climate Prospectus
NASA Astrophysics Data System (ADS)
Jina, A.; Hsiang, S. M.; Kopp, R. E., III; Rasmussen, D.; Rising, J.
2014-12-01
The American Climate Prospectus (ACP), the technical analysis underlying the Risky Business project, quantitatively assesses climate risks posed to the United States' economy in a number of sectors [1]. Four of these - crop yield, crime, labor productivity, and mortality - draw upon research which identifies social impacts using contemporary variability in climate. We first identify a group of rigorous studies that use climate variability to identify responses to temperature and precipitation, while controlling for unobserved differences between locations. To incorporate multiple studies from a single sector, we employ a meta-analytical approach that draws on Bayesian methods commonly used in medical research and previously implemented in [2]. We generate a series of aggregate response functions for each sector using this meta-analytical method. We combine response functions with downscaled physical climate projections to estimate climate impacts out to the end of the century, incorporating uncertainty from statistical estimates, weather, climate models, and different emissions scenarios. Incorporating multiple studies in a single estimation framework allows us to directly compare impacts across the economy. We find that increased mortality has the largest effect on the US economy, followed by costs associated with decreased labor productivity. Agricultural losses and increases in crime contribute lesser but nonetheless substantial costs, and agriculture, notably, shows many areas benefitting from projected climate changes. The ACP also presents results throughout the 21st century. The dynamics of each of the impact categories differ, with, for example, mortality showing little change until the end of the century, but crime showing a monotonic increase from the present day. The ACP approach can expand to include new findings in current sectors, new sectors, and new geographical areas of interest. It represents an analytical framework that can incorporate empirical studies into a broad characterization of climate impacts across an economy, ensuring that each individual study can contribute to guiding policy priorities on climate change. References: [1] T. Houser et al. (2014), American Climate Prospectus, www.climateprospectus.org. [2] Hsiang, Burke, and Miguel (2013), Science.
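The pooling step can be illustrated as follows; the ACP draws on Bayesian hierarchical methods, whereas this sketch shows the closely related classical DerSimonian-Laird random-effects estimate, which conveys the same precision-weighting logic. Study inputs are illustrative.

    import numpy as np

    beta = np.array([0.8, 1.1, 0.5, 1.4])   # per-study effect estimates
    se = np.array([0.2, 0.3, 0.25, 0.5])    # per-study standard errors

    w = 1.0 / se**2                          # fixed-effect (inverse-variance) weights
    beta_fe = np.sum(w * beta) / np.sum(w)
    Q = np.sum(w * (beta - beta_fe) ** 2)    # heterogeneity statistic
    k = len(beta)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1.0 / (se**2 + tau2)              # random-effects weights
    beta_re = np.sum(w_re * beta) / np.sum(w_re)
    print(f"pooled effect = {beta_re:.3f}, tau^2 = {tau2:.3f}")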
Estimating Aquifer Properties Using Sinusoidal Pumping Tests
NASA Astrophysics Data System (ADS)
Rasmussen, T. C.; Haborak, K. G.; Young, M. H.
2001-12-01
We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
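A sketch of the confined-aquifer (Theis-analogue) case: for pumping Q(t) = Q0 sin(wt), the steady-periodic drawdown has complex amplitude (Q0 / (2 pi T)) K0(r sqrt(i w S / T)), whose modulus and argument give the amplitude and phase lag at distance r. Parameter values are illustrative assumptions.

    import numpy as np
    from scipy.special import kv   # modified Bessel K, complex argument supported

    Q0 = 1e-3                 # pumping amplitude (m^3/s)
    Tr = 1e-3                 # transmissivity (m^2/s)
    S = 1e-4                  # storativity (-)
    w = 2 * np.pi / 3600.0    # one-hour period (rad/s)

    r = np.array([1.0, 10.0, 50.0])          # observation distances (m)
    H = Q0 / (2 * np.pi * Tr) * kv(0, r * np.sqrt(1j * w * S / Tr))
    print("amplitude (m):", np.abs(H))
    print("phase lag (rad):", -np.angle(H))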
NASA Astrophysics Data System (ADS)
Das, Anusheela; Chaudhury, Srabanti
2015-11-01
Metal nanoparticles are heterogeneous catalysts and have a multitude of non-equivalent, catalytic sites on the nanoparticle surface. The product dissociation step in such reaction schemes can follow multiple pathways. Proposed here for the first time is a completely analytical theoretical framework, based on the first passage time distribution, that incorporates the effect of heterogeneity in nanoparticle catalysis explicitly by considering multiple, non-equivalent catalytic sites on the nanoparticle surface. Our results show that in nanoparticle catalysis, the effect of dynamic disorder is manifested even at limiting substrate concentrations in contrast to an enzyme that has only one well-defined active site.
Transmission eigenchannels for coherent phonon transport
NASA Astrophysics Data System (ADS)
Klöckner, J. C.; Cuevas, J. C.; Pauly, F.
2018-04-01
We present a procedure to determine transmission eigenchannels for coherent phonon transport in nanoscale devices using the framework of nonequilibrium Green's functions. We illustrate our procedure by analyzing a one-dimensional chain, where all steps can be carried out analytically. More importantly, we show how the procedure can be combined with ab initio calculations to provide a better understanding of phonon heat transport in realistic atomic-scale junctions. In particular, we study the phonon eigenchannels in a gold metallic atomic-size contact and different single-molecule junctions based on molecules such as an alkane chain, a brominated benzene-diamine, where destructive phonon interference effects take place, and a C60 junction.
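A self-contained sketch of the one-dimensional test case: transmission of a uniform harmonic chain from nonequilibrium Green's functions, using the closed-form surface Green's function of a semi-infinite chain. For a uniform chain the transmission should be 1 inside the phonon band and 0 outside, which the sketch reproduces; units and parameters are illustrative.

    import numpy as np

    K, m = 1.0, 1.0                        # spring constant and mass
    eps, hop = 2 * K / m, -K / m           # dynamical-matrix onsite / hopping
    eta = 1e-9                             # small imaginary part (retarded GF)

    def surface_g(w):
        z = w**2 + 1j * eta
        root = np.sqrt((z - eps) ** 2 - 4 * hop**2)
        g = (z - eps - root) / (2 * hop**2)
        if g.imag > 0:                     # pick the retarded branch, Im g <= 0
            g = (z - eps + root) / (2 * hop**2)
        return g

    def transmission(w, n_dev=4):
        D = (np.diag([eps] * n_dev) + np.diag([hop] * (n_dev - 1), 1)
             + np.diag([hop] * (n_dev - 1), -1)).astype(complex)
        sigL = np.zeros_like(D); sigR = np.zeros_like(D)
        sigL[0, 0] = hop**2 * surface_g(w)     # lead self-energies on edge sites
        sigR[-1, -1] = hop**2 * surface_g(w)
        G = np.linalg.inv((w**2 + 1j * eta) * np.eye(n_dev) - D - sigL - sigR)
        gamL = 1j * (sigL - sigL.conj().T)
        gamR = 1j * (sigR - sigR.conj().T)
        return np.trace(gamL @ G @ gamR @ G.conj().T).real   # caroli formula

    for w in (0.5, 1.0, 1.5, 2.5):         # band edge at w = 2*sqrt(K/m) = 2
        print(f"omega = {w:3.1f}  T = {transmission(w):.3f}")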
Malkin, B Z; Lummen, T T A; van Loosdrecht, P H M; Dhalenne, G; Zakirov, A R
2010-07-14
The experimental temperature dependence (T = 2-300 K) of single crystal bulk and site susceptibilities of rare earth titanate pyrochlores R2Ti2O7 (R = Sm, Eu, Gd, Tb, Dy, Ho, Er, Yb) is analyzed in the framework of crystal field theory and a mean field approximation. Analytical expressions for the site and bulk susceptibilities of the pyrochlore lattice are derived taking into account long range dipole-dipole interactions and anisotropic exchange interactions between the nearest neighbor rare earth ions. The sets of crystal field parameters and anisotropic exchange coupling constants have been determined and their variations along the lanthanide series are discussed.
Corbin, Laura J; Tan, Vanessa Y; Hughes, David A; Wade, Kaitlin H; Paul, Dirk S; Tansey, Katherine E; Butcher, Frances; Dudbridge, Frank; Howson, Joanna M; Jallow, Momodou W; John, Catherine; Kingston, Nathalie; Lindgren, Cecilia M; O'Donavan, Michael; O'Rahilly, Stephen; Owen, Michael J; Palmer, Colin N A; Pearson, Ewan R; Scott, Robert A; van Heel, David A; Whittaker, John; Frayling, Tim; Tobin, Martin D; Wain, Louise V; Smith, George Davey; Evans, David M; Karpe, Fredrik; McCarthy, Mark I; Danesh, John; Franks, Paul W; Timpson, Nicholas J
2018-02-19
Detailed phenotyping is required to deepen our understanding of the biological mechanisms behind genetic associations. In addition, the impact of potentially modifiable risk factors on disease requires analytical frameworks that allow causal inference. Here, we discuss the characteristics of Recall-by-Genotype (RbG) as a study design aimed at addressing both these needs. We describe two broad scenarios for the application of RbG: studies using single variants and those using multiple variants. We consider the efficacy and practicality of the RbG approach, provide a catalogue of UK-based resources for such studies and present an online RbG study planner.
Analytical method of waste allocation in waste management systems: Concept, method and case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergeron, Francis C., E-mail: francis.b.c@videotron.ca
Waste is no longer a rejected item to be disposed of but increasingly a secondary resource to exploit, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the "analytical method of the waste allocation process" (AMWAP), based on the concept of the "waste allocation process" defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP and the development of the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.
ERIC Educational Resources Information Center
Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting
2016-01-01
This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…
Hassell, Kerry M; LeBlanc, Yves; McLuckey, Scott A
2009-11-01
Charge inversion ion/ion reactions can convert several cation types associated with a single analyte molecule to a single anion type for subsequent mass analysis. Specifically, analyte ions present with one of a variety of cationizing agents, such as an excess proton, excess sodium ion, or excess potassium ion, can all be converted to the deprotonated molecule, provided that a stable anion can be generated for the analyte. Multiply deprotonated species that are capable of exchanging a proton for a metal ion serve as the reagent anions for the reaction. This process is demonstrated here for warfarin and for a glutathione conjugate. Examples for several other glutathione conjugates are provided as supplementary material to demonstrate the generality of the reaction. In the case of glutathione conjugates, multiple metal ions can be associated with the singly-charged analyte due to the presence of two carboxylate groups. The charge inversion reaction involves the removal of the excess cationizing agent, as well as any metal ions associated with anionic groups to yield a singly deprotonated analyte molecule. The ability to convert multiple cation types to a single anion type is analytically desirable in cases in which the analyte signal is distributed among several cation types, as is common in the electrospray ionization of solutions with relatively high salt contents. For analyte species that undergo efficient charge inversion, such as glutathione conjugates, there is the additional potential advantage for significantly improved signal-to-noise ratios when species that give rise to 'chemical noise' in the positive ion spectrum do not undergo efficient charge inversion.
Hierarchical Bayesian inference of the initial mass function in composite stellar populations
NASA Astrophysics Data System (ADS)
Dries, M.; Trager, S. C.; Koopmans, L. V. E.; Popping, G.; Somerville, R. S.
2018-03-01
The initial mass function (IMF) is a key ingredient in many studies of galaxy formation and evolution. Although the IMF is often assumed to be universal, there is continuing evidence that it is not universal. Spectroscopic studies that derive the IMF of the unresolved stellar populations of a galaxy often assume that this spectrum can be described by a single stellar population (SSP). To alleviate these limitations, in this paper we have developed a unique hierarchical Bayesian framework for modelling composite stellar populations (CSPs). Within this framework, we use a parametrized IMF prior to regulate a direct inference of the IMF. We use this new framework to determine the number of SSPs that is required to fit a set of realistic CSP mock spectra. The CSP mock spectra that we use are based on semi-analytic models and have an IMF that varies as a function of stellar velocity dispersion of the galaxy. Our results suggest that using a single SSP biases the determination of the IMF slope to a higher value than the true slope, although the trend with stellar velocity dispersion is overall recovered. If we include more SSPs in the fit, the Bayesian evidence increases significantly and the inferred IMF slopes of our mock spectra converge, within the errors, to their true values. Most of the bias is already removed by using two SSPs instead of one. We show that we can reconstruct the variable IMF of our mock spectra for signal-to-noise ratios exceeding ~75.
Development Of Antibody-Based Fiber-Optic Sensors
NASA Astrophysics Data System (ADS)
Tromberg, Bruce J.; Sepaniak, Michael J.; Vo-Dinh, Tuan
1988-06-01
The speed and specificity characteristic of immunochemical complex formation has encouraged the development of numerous antibody-based analytical techniques. The scope and versatility of these established methods can be enhanced by combining the principles of conventional immunoassay with laser-based fiber-optic fluorimetry. This merger of spectroscopy and immunochemistry provides the framework for the construction of highly sensitive and selective fiber-optic devices (fluoroimmuno-sensors) capable of in-situ detection of drugs, toxins, and naturally occurring biochemicals. Fluoroimmuno-sensors (FIS) employ an immobilized reagent phase at the sampling terminus of a single quartz optical fiber. Laser excitation of antibody-bound analyte produces a fluorescence signal which is either directly proportional (as in the case of natural fluorophor and "antibody sandwich" assays) or inversely proportional (as in the case of competitive-binding assays) to analyte concentration. Factors which influence analysis time, precision, linearity, and detection limits include the nature (solid or liquid) and amount of the reagent phase, the method of analyte delivery (passive diffusion, convection, etc.), and whether equilibrium or non-equilibrium assays are performed. Data will be presented for optical fibers whose sensing termini utilize: (1) covalently-bound solid antibody reagent phases, and (2) membrane-entrapped liquid antibody reagents. Assays for large-molecular weight proteins (antigens) and small-molecular weight, carcinogenic, polynuclear aromatics (haptens) will be considered. In this manner, the influence of a system's chemical characteristics and measurement requirements on sensor design, and the consequence of various sensor designs on analytical performance will be illustrated.
Park, In-Sun; Park, Jae-Woo
2011-01-30
Total petroleum hydrocarbon (TPH) is an important environmental contaminant that is toxic to human and environmental receptors. However, human health risk assessment for petroleum, oil, and lubricant (POL)-contaminated sites is especially challenging because TPH is not a single compound, but rather a mixture of numerous substances. To address this concern, this study recommends a new human health risk assessment strategy for POL-contaminated sites. The strategy is based on a newly modified TPH fractionation method and includes an improved analytical protocol. The proposed TPH fractionation method is composed of ten fractions (e.g., aliphatic and aromatic EC8-10, EC10-12, EC12-16, EC16-22 and EC22-40). Physicochemical properties and toxicity values of each fraction were newly defined in this study. The stepwise ultrasonication-based analytical process was established to measure TPH fractions. Analytical results were compared with those from the TPH Criteria Working Group (TPHCWG) Direct Method. Better analytical efficiencies in TPH, aliphatic, and aromatic fractions were achieved when contaminated soil samples were analyzed with the new analytical protocol. Finally, a human health risk assessment was performed based on the developed tiered risk assessment framework. Results showed that a detailed quantitative risk assessment should be conducted to determine scientifically and economically appropriate cleanup target levels, although the phase II process is useful for determining the potency of human health risks posed by POL-contamination. Copyright © 2010 Elsevier B.V. All rights reserved.
Ethics and Justice in Learning Analytics
ERIC Educational Resources Information Center
Johnson, Jeffrey Alan
2017-01-01
The many complex challenges posed by learning analytics can best be understood within a framework of structural justice, which focuses on the ways in which the informational, operational, and organizational structures of learning analytics influence students' capacities for self-development and self-determination. This places primary…
Reading Multimodal Texts: Perceptual, Structural and Ideological Perspectives
ERIC Educational Resources Information Center
Serafini, Frank
2010-01-01
This article presents a tripartite framework for analyzing multimodal texts. The three analytical perspectives presented include: (1) perceptual, (2) structural, and (3) ideological analytical processes. Using Anthony Browne's picturebook "Piggybook" as an example, assertions are made regarding what each analytical perspective brings to the…
Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework
Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.
2016-01-01
Iterative reconstruction algorithms are routinely used for clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models the widespread use of TOF scanners with less sensitivity to noise and data imperfections make analytic algorithms even more promising. In our previous work we have developed a novel iterative reconstruction approach (Direct Image Reconstruction for TOF) providing convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias vs. variance performance to iterative TOF reconstruction with a matched resolution model. PMID:27032968
The Strategic Management of Accountability in Nonprofit Organizations: An Analytical Framework.
ERIC Educational Resources Information Center
Kearns, Kevin P.
1994-01-01
Offers a framework stressing the strategic and tactical choices facing nonprofit organizations and discusses policy and management implications. Claims framework is a useful tool for conducting accountability audits and conceptual foundation for discussions of public policy. (Author/JOW)
NASA Astrophysics Data System (ADS)
Duan, Suqin Q.; Wright, Jonathon S.; Romps, David M.
2018-02-01
Atmospheric water-vapor isotopes have been proposed as a potentially powerful constraint on convection, which plays a critical role in Earth's present and future climate. It is shown here, however, that the mean tropical profile of HDO in the free troposphere does not usefully constrain the mean convective entrainment rate or precipitation efficiency. This is demonstrated using a single-column analytical model of atmospheric water isotopes. The model has three parameters: the entrainment rate, the precipitation efficiency, and the distance that evaporating condensates fall. At a given relative humidity, the possible range of HDO is small: its range is comparable to both the measurement uncertainty in the mean tropical profile and the structural uncertainty of a single-column model. Therefore, the mean tropical HDO profile is unlikely to add information about convective processes in a bulk-plume framework that cannot already be learned from relative humidity alone.
RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.
Varghese, Blesson; Patel, Ishan; Barker, Adam
2015-01-01
Large-scale ad hoc analytics of genomic data is popular using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimum effort, and how both the resources and the data required by a job can be managed. An open-source light-weight framework for executing R-scripts using Bioconductor packages, referred to as 'RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.
On effective temperature in network models of collective behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porfiri, Maurizio, E-mail: mporfiri@nyu.edu; Ariel, Gil, E-mail: arielg@math.biu.ac.il
Collective behavior of self-propelled units is studied analytically within the Vectorial Network Model (VNM), a mean-field approximation of the well-known Vicsek model. We propose a dynamical systems framework to study the stochastic dynamics of the VNM in the presence of general additive noise. We establish that a single parameter, which is a linear function of the circular mean of the noise, controls the macroscopic phase of the system—ordered or disordered. By establishing a fluctuation–dissipation relation, we posit that this parameter can be regarded as an effective temperature of collective behavior. The exact critical temperature is obtained analytically for systems with small connectivity, equivalent to low-density ensembles of self-propelled units. Numerical simulations are conducted to demonstrate the applicability of this new notion of effective temperature to the Vicsek model. The identification of an effective temperature of collective behavior is an important step toward understanding order–disorder phase transitions, informing consistent coarse-graining techniques and explaining the physics underlying the emergence of collective phenomena.
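For illustration, a minimal Vicsek-type simulation (the model the VNM approximates) showing the phenomenology: polarization decays as the angular noise amplitude, which plays the role of the effective temperature above, grows. All parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(3)
    N, L, r, v, steps = 300, 10.0, 1.0, 0.3, 400   # units, box, radius, speed

    def polarization(eta):
        pos = rng.uniform(0, L, (N, 2))
        th = rng.uniform(-np.pi, np.pi, N)
        for _ in range(steps):
            d = pos[:, None] - pos[None, :]
            d -= L * np.round(d / L)               # periodic minimal image
            nb = ((d**2).sum(-1) < r**2).astype(float)
            th = (np.arctan2(nb @ np.sin(th), nb @ np.cos(th))
                  + rng.uniform(-eta / 2, eta / 2, N))   # align + angular noise
            pos = (pos + v * np.column_stack([np.cos(th), np.sin(th)])) % L
        return np.hypot(np.cos(th).mean(), np.sin(th).mean())

    for eta in (0.5, 2.0, 4.0):
        print(f"noise = {eta:3.1f}  polarization = {polarization(eta):.2f}")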
NASA Astrophysics Data System (ADS)
Phipps, Marja; Lewis, Gina
2012-06-01
Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.
Broadband impedance boundary conditions for the simulation of sound propagation in the time domain.
Bin, Jonghoon; Yousuff Hussaini, M; Lee, Soogab
2009-02-01
An accurate and practical surface impedance boundary condition in the time domain has been developed for application to broadband-frequency simulation in aeroacoustic problems. To show the capability of this method, two kinds of numerical simulations are performed and compared with the analytical/experimental results: one is acoustic wave reflection by a monopole source over an impedance surface and the other is acoustic wave propagation in a duct with a finite impedance wall. Both single-frequency and broadband-frequency simulations are performed within the framework of linearized Euler equations. A high-order dispersion-relation-preserving finite-difference method and a low-dissipation, low-dispersion Runge-Kutta method are used for spatial discretization and time integration, respectively. The results show excellent agreement with the analytical/experimental results at various frequencies. The method accurately predicts both the amplitude and the phase of acoustic pressure and ensures the well-posedness of the broadband time-domain impedance boundary condition.
Curriculum Innovation for Marketing Analytics
ERIC Educational Resources Information Center
Wilson, Elizabeth J.; McCabe, Catherine; Smith, Robert S.
2018-01-01
College graduates need better preparation for and experience in data analytics for higher-quality problem solving. Using the curriculum innovation framework of Borin, Metcalf, and Tietje (2007) and case study research methods, we offer rich insights about one higher education institution's work to address the marketing analytics skills gap.…
ERIC Educational Resources Information Center
Bodily, Robert; Nyland, Rob; Wiley, David
2017-01-01
The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, the exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond option pricing for the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing characterized by the distribution tail and their associations to value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
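Any such path-integral distribution must reproduce the standard closed-form Vasicek zero-coupon bond price at its mean; for reference, a sketch of that textbook formula for dr = a(theta - r)dt + sigma dW, with illustrative parameters:

    import numpy as np

    def vasicek_bond_price(r, tau, a, theta, sigma):
        # P(tau) = A(tau) * exp(-B(tau) * r) with the usual Vasicek A and B
        B = (1.0 - np.exp(-a * tau)) / a
        lnA = ((B - tau) * (a**2 * theta - sigma**2 / 2.0) / a**2
               - sigma**2 * B**2 / (4.0 * a))
        return np.exp(lnA - B * r)

    print(vasicek_bond_price(r=0.03, tau=5.0, a=0.5, theta=0.04, sigma=0.01))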
Wu, Zheyang; Zhao, Hongyu
2013-01-01
For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies. PMID:23956610
Theory of precipitation effects on dead cylindrical fuels
Michael A. Fosberg
1972-01-01
Numerical and analytical solutions of the Fickian diffusion equation were used to determine the effects of precipitation on dead cylindrical forest fuels. The analytical solution provided a physical framework. The numerical solutions were then used to refine the analytical solution through a similarity argument. The theoretical solutions predicted realistic rates of...
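A minimal numerical sketch of the underlying physics: explicit finite differences for radial Fickian diffusion of moisture into a cylindrical fuel element whose surface is held at saturation during rain. Geometry, diffusivity and moisture values are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Explicit finite-difference solution of radial Fickian diffusion in a
# cylinder, dm/dt = D * (d2m/dr2 + (1/r) dm/dr). A saturated surface
# models moisture uptake during rain.
D = 1.0e-9          # diffusivity, m^2/s (assumed)
R = 0.005           # fuel radius, m (a 1-cm-diameter stick)
nr, dt = 50, 0.5    # grid points and time step, s
r = np.linspace(0.0, R, nr)
dr = r[1] - r[0]
assert dt < dr**2 / (4 * D), "explicit scheme stability"

m = np.full(nr, 0.10)   # initial moisture fraction
m[-1] = 0.35            # fiber-saturation boundary during rain

for _ in range(int(3600 / dt)):          # one hour of rain
    lap = np.zeros(nr)
    lap[1:-1] = (m[2:] - 2 * m[1:-1] + m[:-2]) / dr**2 \
        + (m[2:] - m[:-2]) / (2 * dr * r[1:-1])
    lap[0] = 4 * (m[1] - m[0]) / dr**2   # symmetry condition at the axis
    m[:-1] += dt * D * lap[:-1]          # surface node stays saturated
print(f"mean moisture after 1 h: {m.mean():.3f}")
```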
SLIVISU, an Interactive Visualisation Framework for Analysis of Geological Sea-Level Indicators
NASA Astrophysics Data System (ADS)
Klemann, V.; Schulte, S.; Unger, A.; Dransch, D.
2011-12-01
Supporting data analysis in the earth system sciences with advanced visualisation tools has become essential as the complexity, volume and variety of available data grow. For sea-level indicators (SLIs), analysis in earth-system applications such as regional- or global-scale modelling and simulation demands the consideration of large amounts of data - thousands of SLIs - and thus requires moving beyond the analysis of single sea-level curves. At the same time, a gross analysis by statistical methods is hindered by the heterogeneous and individual character of the single SLIs: their spatio-temporal context and often inconsistent information are difficult to handle or to represent objectively. A concept integrating automated analysis and visualisation is therefore needed, and visual analytics provides it. As an implementation of this concept, we present the visualisation framework SLIVISU, developed at GFZ, which is based on multiple linked views and provides a synoptic analysis of observational data, model configurations, model outputs and results of automated analysis in glacial isostatic adjustment. Originally a visualisation tool for an existing database of SLIs, it now serves as an analysis tool for the evaluation of model simulations in studies of glacial-isostatic adjustment.
Classification of Dynamical Diffusion States in Single Molecule Tracking Microscopy
Bosch, Peter J.; Kanger, Johannes S.; Subramaniam, Vinod
2014-01-01
Single molecule tracking of membrane proteins by fluorescence microscopy is a promising method to investigate dynamic processes in live cells. Translating the trajectories of proteins to biological implications, such as protein interactions, requires the classification of protein motion within the trajectories. Spatial information of protein motion may reveal where the protein interacts with cellular structures, because binding of proteins to such structures often alters their diffusion speed. For dynamic diffusion systems, we provide an analytical framework to determine in which diffusion state a molecule is residing during the course of its trajectory. We compare different methods for the quantification of motion to utilize this framework for the classification of two diffusion states (two populations with different diffusion speed). We found that a gyration quantification method and a Bayesian statistics-based method are the most accurate in diffusion-state classification for realistic experimentally obtained datasets, of which the gyration method is much less computationally demanding. After classification of the diffusion, the lifetime of the states can be determined, and images of the diffusion states can be reconstructed at high resolution. Simulations validate these applications. We apply the classification and its applications to experimental data to demonstrate the potential of this approach to obtain further insights into the dynamics of cell membrane proteins. PMID:25099798
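A minimal sketch of the gyration quantification idea: a sliding-window radius of gyration along a simulated two-state trajectory, thresholded into slow and fast states. The window size and threshold are illustrative; the paper calibrates these against realistic experimentally obtained datasets.

```python
import numpy as np

def gyration_radii(track, window=10):
    """Sliding-window radius of gyration along a 2D trajectory.

    track: (N, 2) array of positions. Small Rg flags slow/confined
    diffusion; large Rg flags fast diffusion.
    """
    rg = np.empty(len(track) - window)
    for i in range(len(rg)):
        w = track[i:i + window]
        rg[i] = np.sqrt(((w - w.mean(axis=0))**2).sum(axis=1).mean())
    return rg

# Toy trajectory: a slow segment followed by a fast segment.
rng = np.random.default_rng(1)
slow = np.cumsum(rng.normal(0, 0.02, (200, 2)), axis=0)
fast = slow[-1] + np.cumsum(rng.normal(0, 0.10, (200, 2)), axis=0)
rg = gyration_radii(np.vstack([slow, fast]))
states = rg > np.median(rg)          # two-state classification
print(f"fraction classified as fast: {states.mean():.2f}")
```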
NASA Astrophysics Data System (ADS)
Shen, Ji; Sung, Shannon; Zhang, Dongmei
2015-11-01
Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse contents, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes (integration, translation, transfer, and transformation) and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret the interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.
Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data
NASA Astrophysics Data System (ADS)
Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.
2017-10-01
We present a new open-source framework for the storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. The framework consists of Python scripts and C++ programs and stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to the most widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit these rates to different analytical expressions. Another important feature of the framework is the ability to calculate transport properties from the cross-section data and supplied distribution functions. In addition, the framework allows chemical reaction descriptions to be exported in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding input blocks for VSim (a particle-in-cell simulation code) and USim (an unstructured multi-fluid code) with appropriate cross-sections.
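A small sketch of one of the named capabilities: computing a Maxwellian rate coefficient k(T) = ⟨σv⟩ from tabulated cross-section data by direct numerical integration. The tabulated cross section here is a made-up threshold form just to exercise the routine, and the MUNCHKIN interfaces are not reproduced.

```python
import numpy as np
from scipy.integrate import trapezoid

ME = 9.1093837e-31     # electron mass, kg
EV = 1.602176634e-19   # J per eV

def maxwellian_rate(energy_eV, sigma_m2, T_eV, mass=ME):
    """Rate coefficient k(T) = <sigma*v> over a Maxwellian energy
    distribution, integrated numerically from tabulated cross sections."""
    E = energy_eV * EV
    kT = T_eV * EV
    v = np.sqrt(2.0 * E / mass)
    f = 2.0 * np.sqrt(E / np.pi) * kT**-1.5 * np.exp(-E / kT)
    return trapezoid(sigma_m2 * v * f, E)          # m^3/s

# Made-up cross section with a 10 eV threshold (illustrative only).
E_tab = np.linspace(0.1, 100.0, 2000)
sigma_tab = np.where(E_tab > 10.0, 1e-20 * (1.0 - 10.0 / E_tab), 0.0)
print(f"k(T = 5 eV) = {maxwellian_rate(E_tab, sigma_tab, 5.0):.3e} m^3/s")
```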
Electrocardiographic interpretation skills of cardiology residents: are they competent?
Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C
2014-12-01
Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 cardiology residents from all 3 years in a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348), including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted using an analytic framework vs ECGs interpreted without an analytic framework (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, there remain significant deficiencies in ECG interpretation among cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed.
Tian, Jingqi; Liu, Qian; Shi, Jinle; Hu, Jianming; Asiri, Abdullah M; Sun, Xuping; He, Yuquan
2015-09-15
Considerable recent attention has been paid to homogeneous fluorescent DNA detection using nanostructures as a universal "quencher", but it remains a great challenge to develop such a nanosensor with the combined benefits of low cost, high speed, sensitivity, and selectivity. In this work, we report the use of iron-based metal-organic framework nanorods as a highly efficient sensing platform for fluorescent DNA detection. It takes only about 4 min to complete the whole "mix-and-detect" process, with a low detection limit of 10 pM and strong discrimination of single point mutations. Control experiments reveal that the remarkable sensing behavior is a consequence of synergies between the metal center and the organic linker. This work elucidates how composition control of nanostructures can significantly impact their sensing properties, enabling new opportunities for the rational design of functional materials for analytical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhiyuan; Liu, Dong; Camacho-Bunquin, Jeffrey
A stable and structurally well-defined titanium alkoxide catalyst supported on a metal-organic framework (MOF) of UiO-67 topology (ANL1-Ti(OiPr)2) was synthesized and fully characterized by a variety of analytical and spectroscopic techniques, including BET, TGA, PXRD, XAS, DRIFT, SEM, and DFT computations. The Ti-functionalized MOF was demonstrated to be active for the catalytic hydroboration of a wide range of aldehydes and ketones with HBpin as the boron source. Compared to traditional homogeneous and supported hydroboration catalysts, ANL1-Ti(OiPr)2 is completely recyclable and reusable, making it a promising hydroboration catalyst alternative for green and sustainable chemical synthesis. DFT calculations suggest that the catalytic hydroboration proceeds via (1) hydride transfer between the active Ti-hydride species and a carbonyl moiety (the rate-determining step), and (2) alkoxide transfer (intramolecular σ-bond metathesis) to generate the boronate ester product.
The NIH analytical methods and reference materials program for dietary supplements.
Betz, Joseph M; Fisher, Kenneth D; Saldanha, Leila G; Coates, Paul M
2007-09-01
Quality of botanical products is a great uncertainty that consumers, clinicians, regulators, and researchers face. Definitions of quality abound, and include specifications for sanitation, adventitious agents (pesticides, metals, weeds), and content of natural chemicals. Because dietary supplements (DS) are often complex mixtures, they pose analytical challenges and method validation may be difficult. In response to product quality concerns and the need for validated and publicly available methods for DS analysis, the US Congress directed the Office of Dietary Supplements (ODS) at the National Institutes of Health (NIH) to accelerate an ongoing methods validation process, and the Dietary Supplements Methods and Reference Materials Program was created. The program was constructed from stakeholder input and incorporates several federal procurement and granting mechanisms in a coordinated and interlocking framework. The framework facilitates validation of analytical methods, analytical standards, and reference materials.
The origins of religious disbelief.
Norenzayan, Ara; Gervais, Will M
2013-01-01
Although most people are religious, there are hundreds of millions of religious disbelievers in the world. What is religious disbelief and how does it arise? Recent developments in the scientific study of religious beliefs and behaviors point to the conclusion that religious disbelief arises from multiple interacting pathways, traceable to cognitive, motivational, and cultural learning mechanisms. We identify four such pathways, leading to four distinct forms of atheism, which we term mindblind atheism, apatheism, inCREDulous atheism, and analytic atheism. Religious belief and disbelief share the same underlying pathways and can be explained within a single evolutionary framework that is grounded in both genetic and cultural evolution.
Issues in Developing a Normative Descriptive Model for Dyadic Decision Making
NASA Technical Reports Server (NTRS)
Serfaty, D.; Kleinman, D. L.
1984-01-01
Most research in modelling human information processing and decision making has been devoted to the case of the single human operator. In the present effort, concepts from the fields of organizational behavior, engineering psychology, team theory and mathematical modelling are merged in an attempt to consider first the case of two cooperating decisionmakers (the Dyad) in a multi-task environment. Rooted in the well-known Dynamic Decision Model (DDM), the normative descriptive approach brings basic cognitive and psychophysical characteristics inherent to human behavior into a team theoretic analytic framework. An experimental paradigm, involving teams in dynamic decision making tasks, is designed to produce the data with which to build the theoretical model.
Analytical solutions for efficient interpretation of single-well push-pull tracer tests
Single-well push-pull tracer tests have been used to characterize the extent, fate, and transport of subsurface contamination. Analytical solutions provide one alternative for interpreting test results. In this work, an exact analytical solution to two-dimensional equations descr...
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
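A minimal sketch of the AHP computation at the core of such a framework: criteria weights from the principal eigenvector of a pairwise comparison matrix, plus Saaty's consistency ratio. The criteria and judgments below are invented for illustration, not those elicited in the study.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical inspection criteria
# (e.g. teaching quality, leadership, resources), on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
print("weights:", np.round(w, 3), " CR:", round(ci / ri, 3))
```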
LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox
ERIC Educational Resources Information Center
Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich
2016-01-01
To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
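A toy sketch of the two atomic operators named above, selection and aggregation, over a dictionary-based graph; this illustrates the flavor of such an algebra and is not the authors' formalism.

```python
from collections import defaultdict

def select(nodes, edges, pred):
    """Selection: keep nodes satisfying a predicate, and induced edges."""
    kept = {n: a for n, a in nodes.items() if pred(a)}
    return kept, [(u, v) for u, v in edges if u in kept and v in kept]

def aggregate(nodes, edges, key):
    """Aggregation: merge nodes sharing an attribute into super-nodes."""
    groups = {n: nodes[n][key] for n in nodes}
    super_nodes = defaultdict(int)
    for n in nodes:
        super_nodes[groups[n]] += 1                  # node counts per group
    super_edges = {(groups[u], groups[v]) for u, v in edges
                   if groups[u] != groups[v]}
    return dict(super_nodes), sorted(super_edges)

nodes = {1: {"dept": "A"}, 2: {"dept": "A"}, 3: {"dept": "B"}}
edges = [(1, 2), (2, 3)]
print(aggregate(*select(nodes, edges, lambda a: True), key="dept"))
```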
Analytical closed-form solutions to the elastic fields of solids with dislocations and surface stress
NASA Astrophysics Data System (ADS)
Ye, Wei; Paliwal, Bhasker; Ougazzaden, Abdallah; Cherkaoui, Mohammed
2013-07-01
The concept of eigenstrain is adopted to derive a general analytical framework for solving the elastic field of 3D anisotropic solids with general defects while accounting for surface stress. The formulation shows that the elastic constants and geometrical features of the surface play an important role in determining the elastic fields of the solid. As an application, analytical closed-form solutions for the stress fields of an infinite isotropic circular nanowire are obtained. The stress fields are compared with the classical solutions and with those of the complex variable method. The stress fields from this work demonstrate the impact of surface stress as the nanowire shrinks, an impact that becomes negligible at the macroscopic scale. Compared with the power-series solutions of the complex variable method, the analytical solutions in this work provide a better platform and are more flexible in various applications. More importantly, the proposed analytical framework substantially advances the study of general 3D anisotropic materials with surface effects.
A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets
ERIC Educational Resources Information Center
Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.
2013-01-01
This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…
Bridging the Timescales of Single-Cell and Population Dynamics
NASA Astrophysics Data System (ADS)
Jafarpour, Farshid; Wright, Charles S.; Gudjonson, Herman; Riebling, Jedidiah; Dawson, Emma; Lo, Klevin; Fiebig, Aretha; Crosson, Sean; Dinner, Aaron R.; Iyer-Biswas, Srividya
2018-04-01
How are granular details of stochastic growth and division of individual cells reflected in smooth deterministic growth of population numbers? We provide an integrated, multiscale perspective of microbial growth dynamics by formulating a data-validated theoretical framework that accounts for observables at both single-cell and population scales. We derive exact analytical complete time-dependent solutions to cell-age distributions and population growth rates as functionals of the underlying interdivision time distributions, for symmetric and asymmetric cell division. These results provide insights into the surprising implications of stochastic single-cell dynamics for population growth. Using our results for asymmetric division, we deduce the time to transition from the reproductively quiescent (swarmer) to the replication-competent (stalked) stage of the Caulobacter crescentus life cycle. Remarkably, population numbers can spontaneously oscillate with time. We elucidate the physics leading to these population oscillations. For C. crescentus cells, we show that a simple measurement of the population growth rate, for a given growth condition, is sufficient to characterize the condition-specific cellular unit of time and, thus, yields the mean (single-cell) growth and division timescales, fluctuations in cell division times, the cell-age distribution, and the quiescence timescale.
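For the steady-state limit of the population scale, a sketch of the classical Euler-Lotka relation, which fixes the population growth rate k from the interdivision-time distribution f(t) via 1 = 2∫exp(-kt) f(t) dt for symmetric binary division. The distribution and parameters are illustrative, and the paper's exact time-dependent solutions go well beyond this steady-state link.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import brentq
from scipy.stats import gamma

# Interdivision-time distribution (illustrative): gamma with mean 30 min.
f = gamma(a=20.0, scale=1.5)
t = np.linspace(1e-3, 120.0, 20_000)
pdf = f.pdf(t)

def euler_lotka(k):
    # Root of 2 * integral exp(-k t) f(t) dt - 1 = 0 gives the growth rate.
    return 2.0 * trapezoid(np.exp(-k * t) * pdf, t) - 1.0

k = brentq(euler_lotka, 1e-6, 1.0)
print(f"growth rate {k:.4f}/min vs naive ln2/<t> = {np.log(2)/f.mean():.4f}/min")
```

Note the solved rate exceeds ln2/⟨t⟩ whenever division times fluctuate, one of the counterintuitive single-cell-to-population effects the abstract alludes to.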
Hellmich, Wibke; Greif, Dominik; Pelargus, Christoph; Anselmetti, Dario; Ros, Alexandra
2006-10-20
Single cell analytics is a key method in the framework of proteome research, allowing analyses that are not subject to ensemble-averaging, cell-cycle or heterogeneous cell-population effects. Our previous studies on single cell analysis in poly(dimethylsiloxane) microfluidic devices with native label-free laser induced fluorescence detection [W. Hellmich, C. Pelargus, K. Leffhalm, A. Ros, D. Anselmetti, Electrophoresis 26 (2005) 3689] were extended in order to improve separation efficiency and detection sensitivity. Here, we focus in particular on the influence of poly(oxyethylene) based coatings on the separation performance. In addition, the influence on background fluorescence is studied by varying the incident laser power as well as adapting the confocal volume to the microfluidic channel dimensions. Finally, the use of carbon black particles further improved the detection limit to 25 nM, thereby reaching the concentration ranges necessary for the label-free detection of low abundant proteins in single cells. On the basis of these results, we demonstrate the first electropherogram from an individual Spodoptera frugiperda (Sf9) cell with native label-free UV-LIF detection in a microfluidic chip.
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Design and development of a medical big data processing system based on Hadoop.
Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song
2015-03-01
Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional standalone systems incapable of processing these data. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single-node processing. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
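A minimal sketch of the Hadoop Streaming mapper/reducer pattern such a system builds on, here counting actions per user from tab-separated log lines. The log format and job wiring are assumptions for illustration, not the paper's actual algorithms.

```python
#!/usr/bin/env python
# Hadoop Streaming style mapper/reducer counting actions per user from
# log lines "user_id<TAB>action..." (format assumed). Typical wiring:
#   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py ...
import sys

def mapper(stream=sys.stdin):
    for line in stream:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 2:
            print(f"{fields[0]}\t1")            # emit (user_id, 1)

def reducer(stream=sys.stdin):
    current, count = None, 0
    for line in stream:                          # Hadoop sorts by key
        key, value = line.rstrip("\n").split("\t")
        if key != current and current is not None:
            print(f"{current}\t{count}")
            count = 0
        current = key
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if "map" in sys.argv[1:] else reducer()
```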
Clinical reasoning of junior doctors in emergency medicine: a grounded theory study.
Adams, E; Goyder, C; Heneghan, C; Brand, L; Ajjawi, R
2017-02-01
Emergency medicine (EM) has a high case turnover and acuity making it a demanding clinical reasoning domain especially for junior doctors who lack experience. We aimed to better understand their clinical reasoning using dual cognition as a guiding theory. EM junior doctors were recruited from six hospitals in the south of England to participate in semi-structured interviews (n=20) and focus groups (n=17) based on recall of two recent cases. Transcripts were analysed using a grounded theory approach to identify themes and to develop a model of junior doctors' clinical reasoning in EM. Within cases, clinical reasoning occurred in three phases. In phase 1 (case framing), initial case cues and first impressions were predominantly intuitive, but checked by analytical thought and determined the urgency of clinical assessment. In phase 2 (evolving reasoning), non-analytical single cue and pattern recognitions were common which were subsequently validated by specific analytical strategies such as use of red flags. In phase 3 (ongoing uncertainty) analytical self-monitoring and reassurance strategies were used to precipitate a decision regarding discharge. We found a constant dialectic between intuitive and analytical cognition throughout the reasoning process. Our model of clinical reasoning by EM junior doctors illustrates the specific contextual manifestations of the dual cognition theory. Distinct diagnostic strategies are identified and together these give EM learners and educators a framework and vocabulary for discussion and learning about clinical reasoning.
Modeling Progressive Failure of Bonded Joints Using a Single Joint Finite Element
NASA Technical Reports Server (NTRS)
Stapleton, Scott E.; Waas, Anthony M.; Bednarcyk, Brett A.
2010-01-01
Enhanced finite elements are elements with an embedded analytical solution which can capture detailed local fields, enabling more efficient, mesh-independent finite element analysis. In the present study, an enhanced finite element is applied to generate a general framework capable of modeling an array of joint types. The joint field equations are derived using the principle of minimum potential energy, and the resulting solutions for the displacement fields are used to generate shape functions and a stiffness matrix for a single joint finite element. This single finite element thus captures the detailed stress and strain fields within the bonded joint, but it can function within a broader structural finite element model. The costs associated with a fine mesh of the joint can thus be avoided while still obtaining a detailed solution for the joint. Additionally, the capability to model non-linear adhesive constitutive behavior has been included within the method, and progressive failure of the adhesive can be modeled by using a strain-based failure criteria and re-sizing the joint as the adhesive fails. Results of the model compare favorably with experimental and finite element results.
Fast analytical scatter estimation using graphics processing units.
Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris
2015-01-01
To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first order scatter in cone-beam image reconstruction improves the contrast to noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and with further acceleration and a method to account for multiple scatter may be useful for practical scatter correction schemes.
Environmental Stewardship: A Conceptual Review and Analytical Framework.
Bennett, Nathan J; Whitty, Tara S; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H
2018-04-01
There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.
A framework for characterizing eHealth literacy demands and barriers.
Chan, Connie V; Kaufman, David R
2011-11-17
Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.
Particle scattering off of right-handed dispersive waves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schreiner, C.; Kilian, P.; Spanier, F., E-mail: cschreiner@astro.uni-wuerzburg.de
Resonant scattering of fast particles off low frequency plasma waves is a major process determining transport characteristics of energetic particles in the heliosphere and contributing to their acceleration. Usually, only Alfvén waves are considered for this process, although dispersive waves are also present throughout the heliosphere. We investigate resonant interaction of energetic electrons with dispersive, right-handed waves. For the interaction of particles and a single wave, a variable transformation into the rest frame of the wave can be performed, where well-established analytic models derived in the framework of magnetostatic quasi-linear theory can be used as a reference to validate simulation results. However, this approach fails as soon as several dispersive waves are involved. Based on analytic solutions modeling the scattering amplitude in the magnetostatic limit, we present an approach to modify these equations for use in the plasma frame, aiming at a description of particle scattering in the presence of several waves. A particle-in-cell code is employed to study wave-particle scattering on a micro-physically correct level and to test the modified model equations. We investigate the interactions of electrons at different energies (from 1 keV to 1 MeV) and right-handed waves with various amplitudes. Differences between model and simulation arise in the case of high amplitudes or several waves. Analyzing the trajectories of single particles, we find no microscopic diffusion in the case of a single plasma wave, although a broadening of the particle distribution can be observed.
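A minimal sketch of the standard (non-relativistic) cyclotron resonance condition underlying such scattering analyses, omega - k_par*v_par = n*Omega, solved for the resonant parallel velocity; the simulations above treat the fully relativistic case, and the wave parameters here are illustrative.

```python
# Non-relativistic cyclotron resonance: omega - k_par * v_par = n * Omega.
# Frequencies are normalized to the gyrofrequency Omega and speeds to c.
def resonant_velocity(omega, k_par, n, Omega=1.0):
    """Resonant parallel velocity for harmonic number n."""
    return (omega - n * Omega) / k_par

omega, k_par = 0.15, 0.35        # right-handed (whistler-like) wave, assumed
for n in (-1, 1):
    v = resonant_velocity(omega, k_par, n)
    print(f"n={n:+d}: v_par/c = {v:+.3f}")
```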
Investigating the two-moment characterisation of subcellular biochemical networks.
Ullah, Mukhtar; Wolkenhauer, Olaf
2009-10-07
While ordinary differential equations (ODEs) form the conceptual framework for modelling many cellular processes, specific situations demand stochastic models to capture the influence of noise. The most common formulation of stochastic models for biochemical networks is the chemical master equation (CME). While stochastic simulations are a practical way to realise the CME, analytical approximations offer more insight into the influence of noise. Towards that end, the two-moment approximation (2MA) is a promising addition to the established analytical approaches including the chemical Langevin equation (CLE) and the related linear noise approximation (LNA). The 2MA approach directly tracks the mean and (co)variance which are coupled in general. This coupling is not obvious in CME and CLE and ignored by LNA and conventional ODE models. We extend previous derivations of 2MA by allowing (a) non-elementary reactions and (b) relative concentrations. Often, several elementary reactions are approximated by a single step. Furthermore, practical situations often require the use of relative concentrations. We investigate the applicability of the 2MA approach to the well-established fission yeast cell cycle model. Our analytical model reproduces the clustering of cycle times observed in experiments. This is explained through multiple resettings of M-phase promoting factor (MPF), caused by the coupling between mean and (co)variance, near the G2/M transition.
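A toy sketch of a 2MA-style coupled mean-variance system for the dimerization reaction 2A → ∅ (propensity c·x·(x-1)/2) with a normal (zero third central moment) closure, contrasted with the conventional ODE that ignores fluctuations. This illustrates the mean-(co)variance coupling discussed above; it is not the authors' fission yeast model, and the closure and rate constant are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

c = 0.01   # stochastic rate constant (illustrative)

def two_moment(t, y):
    mu, var = y
    dmu = -c * (var + mu**2 - mu)        # mean dynamics couple to variance
    dvar = 2.0 * c * (mu * (mu - 1.0) + 2.0 * var * (1.0 - mu))
    return [dmu, dvar]

sol = solve_ivp(two_moment, (0.0, 50.0), [100.0, 0.0], rtol=1e-8)
mu, var = sol.y[:, -1]
print(f"2MA at t=50: mean {mu:.2f}, std {np.sqrt(max(var, 0.0)):.2f}")

# Conventional ODE model for comparison (fluctuations ignored entirely):
ode = solve_ivp(lambda t, y: [-c * y[0] ** 2], (0.0, 50.0), [100.0], rtol=1e-8)
print(f"ODE at t=50: mean {ode.y[0, -1]:.2f}")
```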
Information Tailoring Enhancements for Large Scale Social Data
2016-03-15
Work performed within this reporting period: implemented temporal analysis algorithms for advanced analytics in Scraawl; implemented our backend web service design for temporal analysis and created a prototype GUI web service for the Scraawl analytics dashboard; upgraded the Scraawl computational framework to increase...
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Widanapathirana, Chathuranga
2014-01-01
Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…
Banerjee, Debasis; Wang, Hao; Plonka, Anna M; Emge, Thomas J; Parise, John B; Li, Jing
2016-08-08
Gate-opening is a unique and interesting phenomenon commonly observed in flexible porous frameworks, where the pore characteristics and/or crystal structures change in response to external stimuli such as adding or removing guest molecules. For gate-opening induced by gas adsorption, the pore-opening pressure often varies for different adsorbate molecules and thus can be exploited to selectively separate a gas mixture. A detailed understanding of this phenomenon is of fundamental importance to the design of industrially applicable gas-selective sorbents, yet it remains underinvestigated owing to the lack of direct structural evidence for such systems. We report a mechanistic study of the gas-induced gate-opening process of a microporous metal-organic framework, [Mn(ina)2] (ina=isonicotinate), associated with commensurate adsorption, by a combination of several analytical techniques including single crystal X-ray diffraction, in situ powder X-ray diffraction coupled with differential scanning calorimetry (XRD-DSC), and gas adsorption-desorption methods. Our study reveals that the pronounced and reversible gate opening/closing phenomena observed in [Mn(ina)2] are coupled with a structural transition that involves rotation of the organic linker molecules as a result of interaction of the framework with adsorbed gas molecules, including carbon dioxide and propane. The onset pressure required to open the gate correlates with the extent of such interaction.
Huang, Chao; Wu, Jie; Song, Chuanjun; Ding, Ran; Qiao, Yan; Hou, Hongwei; Chang, Junbiao; Fan, Yaoting
2015-06-28
Upon single-crystal-to-single-crystal (SCSC) oxidation/reduction, reversible structural transformations take place between an anionic porous zeolite-like Cu(I) framework and a topologically equivalent neutral Cu(I)Cu(II) mixed-valent framework. This unique conversion behavior makes the Cu(I) framework a redox-switchable catalyst for the direct arylation of heterocycle C-H bonds.
Propulsion System Modeling and Simulation
NASA Technical Reports Server (NTRS)
Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile
2002-01-01
The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with commercially available software called iSIGHT, a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer among the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.
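A minimal sketch of the response-surface step: sample a stand-in engine-cycle metric over two design parameters and fit a second-order polynomial surrogate by least squares. The metric, parameters and ranges are invented for illustration, and iSIGHT's actual integration machinery is not reproduced.

```python
import numpy as np

def engine_metric(opr, bpr):
    """Stand-in for an expensive cycle-code output (hypothetical)."""
    return (0.52 + 0.004 * opr - 0.0003 * (opr - 30) ** 2
            + 0.01 * bpr - 0.0008 * (bpr - 8) ** 2 - 0.0001 * opr * bpr)

rng = np.random.default_rng(2)
opr = rng.uniform(20, 40, 60)                # overall pressure ratio samples
bpr = rng.uniform(4, 12, 60)                 # bypass ratio samples
y = engine_metric(opr, bpr)

# Second-order response surface: 1, x1, x2, x1^2, x2^2, x1*x2.
X = np.column_stack([np.ones_like(opr), opr, bpr,
                     opr**2, bpr**2, opr * bpr])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
print(f"max abs fit error over samples: {np.abs(pred - y).max():.2e}")
```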
NASA Astrophysics Data System (ADS)
Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P.
2014-04-01
This study contributes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification, many-objective optimization under uncertainty, and visual analytics to characterize current operations and discover key trade-offs between alternative policies for balancing competing demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. We have identified a baseline operating policy for the Conowingo Dam that closely reproduces the dynamics of current releases and flows for the Lower Susquehanna and thus can be used to represent the preferences structure guiding current operations. Starting from this baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover new operating policies that better balance the trade-offs within the Lower Susquehanna. Our results confirm that the baseline operating policy, which only considers deterministic historical inflows, significantly overestimates the system's reliability in meeting the reservoir's competing demands. Our proposed framework removes this bias by successfully identifying alternative reservoir policies that are more robust to hydroclimatic uncertainties while also better addressing the trade-offs across the Conowingo Dam's multisector services.
NASA Astrophysics Data System (ADS)
Song, Y.; Gui, Z.; Wu, H.; Wei, Y.
2017-09-01
Analysing the spatiotemporal distribution patterns and dynamics of different industries can help us learn the macro-level development trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis is a challenging task that requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such visual analytics. The framework uses standard deviational ellipses (SDEs) and the shifting routes of gravity centers to show the spatial distribution and yearly development trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate processing. In the experiments, we use an enterprise registration dataset for Mainland China from 1960 to 2015 that contains fine-grained location information (i.e., coordinates of each individual enterprise) to demonstrate the feasibility of the framework. The experiment results show that the developed visual analytics method is helpful for understanding the multi-level patterns and development trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with large data volume, such as crime and disease.
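A minimal sketch of the two spatial statistics used, the gravity center and the standard deviational ellipse, computed here via the covariance eigendecomposition (equivalent to the classical SDE axes up to the conventional scaling). The point data are simulated, and the Spark parallelization is omitted.

```python
import numpy as np

def gravity_center(xy, weights=None):
    """(Weighted) mean center of a 2D point set."""
    return np.average(xy, axis=0, weights=weights)

def sd_ellipse(xy):
    """Standard deviational ellipse via eigendecomposition of the
    coordinate covariance: mean center, semi-axes, orientation (rad)."""
    center = gravity_center(xy)
    eigval, eigvec = np.linalg.eigh(np.cov((xy - center).T))
    angle = float(np.arctan2(eigvec[1, -1], eigvec[0, -1]))  # major axis
    return center, np.sqrt(eigval[::-1]), angle

# Simulated enterprise locations for two years, showing an eastward shift.
rng = np.random.default_rng(3)
y1995 = rng.normal([110.0, 34.0], [2.0, 1.0], (5000, 2))
y2015 = rng.normal([114.0, 33.5], [3.0, 1.2], (5000, 2))
for label, pts in (("1995", y1995), ("2015", y2015)):
    c, axes, ang = sd_ellipse(pts)
    print(label, "center", np.round(c, 2), "axes", np.round(axes, 2),
          "angle_deg", round(np.degrees(ang), 1))
```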
An Analytical Framework for the Steady State Impact of Carbonate Compensation on Atmospheric CO2
NASA Astrophysics Data System (ADS)
Omta, Anne Willem; Ferrari, Raffaele; McGee, David
2018-04-01
The deep-ocean carbonate ion concentration impacts the fraction of the marine calcium carbonate production that is buried in sediments. This gives rise to the carbonate compensation feedback, which is thought to restore the deep-ocean carbonate ion concentration on multimillennial timescales. We formulate an analytical framework to investigate the impact of carbonate compensation under various changes in the carbon cycle relevant for anthropogenic change and glacial cycles. Using this framework, we show that carbonate compensation amplifies by 15-20% changes in atmospheric CO2 resulting from a redistribution of carbon between the atmosphere and ocean (e.g., due to changes in temperature, salinity, or nutrient utilization). A counterintuitive result emerges when the impact of organic matter burial in the ocean is examined. The organic matter burial first leads to a slight decrease in atmospheric CO2 and an increase in the deep-ocean carbonate ion concentration. Subsequently, enhanced calcium carbonate burial leads to outgassing of carbon from the ocean to the atmosphere, which is quantified by our framework. Results from simulations with a multibox model including the minor acids and bases important for the ocean-atmosphere exchange of carbon are consistent with our analytical predictions. We discuss the potential role of carbonate compensation in glacial-interglacial cycles as an example of how our theoretical framework may be applied.
On Connectivity of Wireless Sensor Networks with Directional Antennas
Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.
2017-01-01
In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and the channel randomness. Since existing directional antenna models have their pros and cons in the accuracy of reflecting realistic antennas and the computational complexity, we propose a new analytical directional antenna model called the iris model to balance the accuracy against the complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model on the network connectivity is accurate, and our iris antenna model can provide a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081
Piegorsch, Walter W.; Lussier, Yves A.
2015-01-01
Motivation: The conventional approach to personalized medicine relies on molecular data analytics across multiple patients. The path to precision medicine lies with molecular data analytics that can discover interpretable single-subject signals (N-of-1). We developed a global framework, N-of-1-pathways, for a mechanism-anchored approach to single-subject gene expression data analysis. We previously employed a metric that could prioritize the statistical significance of a deregulated pathway in single subjects; however, it lacked quantitative interpretability (e.g. the equivalent of a gene expression fold-change). Results: In this study, we extend our previous approach with the application of the statistical Mahalanobis distance (MD) to quantify personal pathway-level deregulation. We demonstrate that this approach, N-of-1-pathways Paired Samples MD (N-OF-1-PATHWAYS-MD), detects deregulated pathways (empirical simulations) while not inflating the false-positive rate, using a study with biological replicates. Finally, we establish that N-OF-1-PATHWAYS-MD scores are biologically significant, clinically relevant and predictive of breast cancer survival (P < 0.05, n = 80 invasive carcinoma; TCGA RNA-sequences). Conclusion: N-of-1-pathways MD provides a practical approach towards precision medicine. The method generates the magnitude and the biological significance of personal deregulated pathway results derived solely from the patient's transcriptome. These pathways offer opportunities for deriving clinically actionable decisions that have the potential to complement the clinical interpretability of personal polymorphisms obtained from acquired or inherited DNA polymorphisms and mutations. In addition, the approach is applicable to diseases in which DNA changes may not be relevant, and thus expands the interpretable 'omics of single subjects (e.g. the personalome). Availability and implementation: http://www.lussierlab.net/publications/N-of-1-pathways. Contact: yves@email.arizona.edu or piegorsch@math.arizona.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072495
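A toy sketch of a paired-sample Mahalanobis-distance pathway score. With a single sample pair the full gene-gene covariance is not identifiable, so a diagonal covariance stands in for the paper's estimator, and the expression data are simulated.

```python
import numpy as np

rng = np.random.default_rng(4)
n_genes = 25
normal = rng.normal(5.0, 1.0, n_genes)            # log2 expression, one pathway
tumor = normal + rng.normal(0.8, 0.5, n_genes)    # pathway shifted in the tumor

diff = tumor - normal                             # paired differences per gene
# With one sample pair, the full gene-gene covariance is not identifiable;
# a diagonal (pooled-variance) covariance is a pragmatic stand-in.
cov = np.diag(np.full(n_genes, diff.var(ddof=1)))
md = float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))
print(f"pathway-level Mahalanobis distance: {md:.2f}")
```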
Developing an Analytical Framework for Argumentation on Energy Consumption Issues
ERIC Educational Resources Information Center
Jin, Hui; Mehl, Cathy E.; Lan, Deborah H.
2015-01-01
In this study, we aimed to develop a framework for analyzing the argumentation practice of high school students and high school graduates. We developed the framework in a specific context--how energy consumption activities such as changing diet, converting forests into farmlands, and choosing transportation modes affect the carbon cycle. The…
ERIC Educational Resources Information Center
Lam, Gigi
2014-01-01
A socio-psychological analytical framework will be adopted to illuminate the relation between socioeconomic status and academic achievement. The framework puts the emphasis to incorporate micro familial factors into macro factor of the tracking system. Initially, children of the poor families always lack major prerequisite: diminution of cognitive…
European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective
ERIC Educational Resources Information Center
Bouder, Annie
2008-01-01
Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework has three perspectives: historical; analytical; and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…
McDermott, Imelda; Checkland, Kath; Harrison, Stephen; Snow, Stephanie; Coleman, Anna
2013-01-01
The language used by National Health Service (NHS) "commissioning" managers when discussing their roles and responsibilities can be seen as a manifestation of "identity work", defined as a process of identifying. This paper aims to offer a novel approach to analysing "identity work" by triangulation of multiple analytical methods, combining analysis of the content of text with analysis of its form. Fairclough's discourse analytic methodology is used as a framework. Following Fairclough, the authors use analytical methods associated with Halliday's systemic functional linguistics. While analysis of the content of interviews provides some information about NHS Commissioners' perceptions of their roles and responsibilities, analysis of the form of discourse that they use provides a more detailed and nuanced view. Overall, the authors found that commissioning managers have a higher level of certainty about what commissioning is not rather than what commissioning is; GP managers have a high level of certainty of their identity as a GP rather than as a manager; and both GP managers and non-GP managers oscillate between multiple identities depending on the different situations they are in. This paper offers a novel approach to triangulation, based not on the usual comparison of multiple data sources, but rather based on the application of multiple analytical methods to a single source of data. This paper also shows the latent uncertainty about the nature of commissioning enterprise in the English NHS.
Use of multiple colorimetric indicators for paper-based microfluidic devices.
Dungchai, Wijitar; Chailapakul, Orawon; Henry, Charles S
2010-08-03
We report here the use of multiple indicators for a single analyte in paper-based microfluidic devices (microPADs), in an effort to improve the ability to visually discriminate between analyte concentrations. In existing microPADs, a single dye system is used for the measurement of a single analyte. In our approach, devices are designed to quantify analytes using multiple indicators for each analyte simultaneously, improving the accuracy of the assay. The use of multiple indicators for a single analyte allows different indicator colors to be generated over different analyte concentration ranges, as well as increasing the ability to visually discriminate colors. The principle of our devices is based on the oxidation of indicators by hydrogen peroxide produced by oxidase enzymes specific for each analyte. Each indicator reacts at different peroxide concentrations, and therefore different analyte concentrations, giving an extended range of operation. To demonstrate the utility of our approach, 4-aminoantipyrine with 3,5-dichloro-2-hydroxy-benzenesulfonic acid, o-dianisidine dihydrochloride, potassium iodide, acid black, and acid yellow were chosen as the indicators for simultaneous semi-quantitative measurement of glucose, lactate, and uric acid on a microPAD. Our approach was successfully applied to quantify glucose (0.5-20 mM), lactate (1-25 mM), and uric acid (0.1-7 mM) in clinically relevant ranges. The determination of glucose, lactate, and uric acid in control serum and urine samples was also performed to demonstrate the applicability of this device for biological sample analysis. Finally, results for the multi-indicator and single-indicator systems were compared using untrained readers to demonstrate the improvements in accuracy achieved with the new system. 2010 Elsevier B.V. All rights reserved.
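The multi-indicator logic lends itself to a simple illustration: each indicator develops at a different analyte level, so the pattern of developed colors encodes a concentration range. A hedged sketch, with hypothetical thresholds that are not the paper's calibration values:

```python
# Each indicator develops at a different hydrogen peroxide (hence analyte)
# level, so the pattern of developed colors encodes a concentration range.
# Thresholds below are hypothetical, not the paper's calibration.
INDICATORS = [                      # (indicator, activation threshold, mM)
    ("potassium iodide", 0.5),
    ("o-dianisidine", 2.0),
    ("4-AAP/DHBS", 5.0),
    ("acid yellow", 10.0),
]

def readout(conc_mM):
    """Return the indicators expected to have developed at conc_mM."""
    return [name for name, threshold in INDICATORS if conc_mM >= threshold]

for c in (0.2, 1.0, 6.0, 15.0):
    print(f"{c:5.1f} mM -> {readout(c) or ['none developed']}")
```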
Müllerová, Ludmila; Dubský, Pavel; Gaš, Bohuslav
2015-03-06
Interactions among analyte forms that undergo simultaneous dissociation/protonation and complexation with multiple selectors take the shape of a highly interconnected multi-equilibrium scheme. This makes it difficult to express the effective mobility of the analyte in these systems, which are often encountered in electrophoretic separations, unless a generalized model is introduced. In the first part of this series, we presented the theory of electromigration of a multivalent weakly acidic/basic/amphoteric analyte undergoing complexation with a mixture of an arbitrary number of selectors. In this work we demonstrate the validity of this concept experimentally. The theory leads to three useful perspectives, each of which is closely related to one originally formulated for simpler systems. If pH, ionic strength (IS) and the selector mixture composition are all kept constant, the system is treated as if only a single analyte form interacted with a single selector. If the pH changes at constant IS and mixture composition, the already well-established models of a weakly acidic/basic analyte interacting with a single selector can be employed. Varying the mixture composition at constant IS and pH leads to a situation where virtually a single analyte form interacts with a mixture of selectors. We show how to switch between the three perspectives in practice and confirm that they can be employed interchangeably according to the specific needs by measurements performed in single- and dual-selector systems at a pH where the analyte is fully dissociated, partly dissociated or fully protonated. A weak monoprotic analyte (R-flurbiprofen) and two selectors (native β-cyclodextrin and monovalent positively charged 6-monodeoxy-6-monoamino-β-cyclodextrin) serve as a model system. Copyright © 2015 Elsevier B.V. All rights reserved.
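As a rough illustration of the bookkeeping such a generalized model entails, the sketch below computes an effective mobility for a weak monoprotic acid complexing with a two-selector mixture, weighting each analyte form by its equilibrium fraction. All constants are invented for illustration; the paper's full multivalent treatment is more general:

```python
def complexed_mobility(mu_free, bindings, selectors):
    """Mobility of one analyte form averaged over its free and
    selector-complexed states.

    mu_free:   mobility of the free form
    bindings:  {selector: (K, mu_complex)} for this form
    selectors: {selector: concentration} of the mixture (mol/L)"""
    num = mu_free + sum(bindings[s][0] * c * bindings[s][1]
                        for s, c in selectors.items())
    den = 1.0 + sum(bindings[s][0] * c for s, c in selectors.items())
    return num / den

def effective_mobility(pH, pKa, anion, neutral, selectors):
    """Weak monoprotic acid: weight the anionic and neutral forms by
    their Henderson-Hasselbalch fractions (activity effects ignored).
    anion / neutral: (mu_free, bindings) tuples for each form."""
    frac_anion = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    mu_a = complexed_mobility(*anion, selectors)
    mu_n = complexed_mobility(*neutral, selectors)
    return frac_anion * mu_a + (1.0 - frac_anion) * mu_n

selectors = {"beta-CD": 5e-3, "amino-beta-CD": 2e-3}   # mol/L, invented
anion = (-20e-9, {"beta-CD": (300.0, -8e-9), "amino-beta-CD": (800.0, 5e-9)})
neutral = (0.0, {"beta-CD": (500.0, 0.0), "amino-beta-CD": (200.0, 0.0)})

mu = effective_mobility(pH=4.2, pKa=4.4, anion=anion, neutral=neutral,
                        selectors=selectors)
print(f"effective mobility ≈ {mu:.3e} m^2/V/s")
```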
Urban Partnership Agreement and Congestion Reduction Demonstration : National Evaluation Framework
DOT National Transportation Integrated Search
2008-11-21
This report provides an analytical framework for evaluating six deployments under the United States Department of Transportation (U.S. DOT) Urban Partnership Agreement (UPA) and Congestion Reduction Demonstration (CRD) Programs. The six UPA/CRD sites...
Scaling Student Success with Predictive Analytics: Reflections after Four Years in the Data Trenches
ERIC Educational Resources Information Center
Wagner, Ellen; Longanecker, David
2016-01-01
The metrics used in the US to track students do not include adults and part-time students. This has led to the development of a massive data initiative--the Predictive Analytics Reporting (PAR) framework--that uses predictive analytics to trace the progress of all types of students in the system. This development has allowed actionable,…
Assessment of Critical-Analytic Thinking
ERIC Educational Resources Information Center
Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.
2014-01-01
National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…
Earthdata Cloud Analytics Project
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Chris
2018-01-01
This presentation describes a nascent project at NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observing System Data and Information System) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.
Zhang, Yilong; Han, Sung Won; Cox, Laura M; Li, Huilin
2017-12-01
The human microbiome is the collection of microbes living in and on the various parts of our body. These microbes do not live alone: they act as an integrated microbial community, with extensive competition and cooperation, and contribute to human health in important ways. Most current analyses focus on examining microbial differences at a single time point, which does not adequately capture the dynamic nature of microbiome data. With the advent of high-throughput sequencing and analytical tools, we are able to probe the interdependent relationships among microbial species through longitudinal studies. Here, we propose a multivariate distance-based test to evaluate the association between key phenotypic variables and microbial interdependence utilizing repeatedly measured microbiome data. Extensive simulations were performed to evaluate the validity and efficiency of the proposed method. We also demonstrate the utility of the proposed test using a well-designed longitudinal murine experiment and a longitudinal human study. The proposed methodology has been implemented in a freely distributed open-source R package and Python code. © 2017 WILEY PERIODICALS, INC.
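One plausible reading of the distance-based idea, sketched under stated assumptions (Frobenius distance between per-subject taxa correlation matrices and a simple label-permutation test; the authors' actual statistic and implementation may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

def interdependence_distance(subjects):
    """subjects: list of (time x taxa) arrays, one per subject.
    Distance = Frobenius norm between subjects' taxa correlation
    matrices, a simple stand-in for a microbial-interdependence kernel."""
    corrs = [np.corrcoef(s, rowvar=False) for s in subjects]
    n = len(corrs)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = np.linalg.norm(corrs[i] - corrs[j])
    return D

def permutation_test(D, groups, n_perm=2000):
    """Are between-group distances larger than within-group distances?"""
    groups = np.asarray(groups)
    def stat(g):
        within = (D[np.ix_(g == 0, g == 0)].mean()
                  + D[np.ix_(g == 1, g == 1)].mean()) / 2.0
        return D[np.ix_(g == 0, g == 1)].mean() - within
    obs = stat(groups)
    perms = [stat(rng.permutation(groups)) for _ in range(n_perm)]
    return obs, (1 + sum(p >= obs for p in perms)) / (1 + n_perm)

# Toy longitudinal data: 10 subjects, 8 time points, 6 taxa.
subjects = [rng.normal(size=(8, 6)) for _ in range(10)]
obs, p = permutation_test(interdependence_distance(subjects), [0]*5 + [1]*5)
print(f"statistic = {obs:.3f}, permutation p = {p:.3f}")
```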
A unified framework for the evaluation of surrogate endpoints in mental-health clinical trials.
Molenberghs, Geert; Burzykowski, Tomasz; Alonso, Ariel; Assam, Pryseley; Tilahun, Abel; Buyse, Marc
2010-06-01
For a number of reasons, surrogate endpoints are considered instead of the so-called true endpoint in clinical studies, especially when such endpoints can be measured earlier, and/or with less burden for patient and experimenter. Surrogate endpoints may occur more frequently than their standard counterparts. For these reasons, it is not surprising that the use of surrogate endpoints in clinical practice is increasing. Building on the seminal work of Prentice (1) and Freedman et al. (2), Buyse et al. (3) framed the evaluation exercise within a meta-analytic setting, in an effort to overcome difficulties that necessarily surround evaluation efforts based on a single trial. In this article, we review the meta-analytic approach for continuous outcomes, discuss extensions to non-normal and longitudinal settings, as well as proposals to unify the somewhat disparate collection of validation measures currently on the market. Implications for design and for predicting the effect of treatment in a new trial, based on the surrogate, are discussed. A case study in schizophrenia is analysed.
Schwartz, Rachel S; Mueller, Rachel L
2010-01-11
Estimates of divergence dates between species improve our understanding of processes ranging from nucleotide substitution to speciation. Such estimates are frequently based on molecular genetic differences between species; therefore, they rely on accurate estimates of the number of such differences (i.e. substitutions per site, measured as branch length on phylogenies). We used simulations to determine the effects of dataset size, branch length heterogeneity, branch depth, and analytical framework on branch length estimation across a range of branch lengths. We then reanalyzed an empirical dataset for plethodontid salamanders to determine how inaccurate branch length estimation can affect estimates of divergence dates. The accuracy of branch length estimation varied with branch length, dataset size (both number of taxa and sites), branch length heterogeneity, branch depth, dataset complexity, and analytical framework. For simple phylogenies analyzed in a Bayesian framework, branches were increasingly underestimated as branch length increased; in a maximum likelihood framework, longer branch lengths were somewhat overestimated. Longer datasets improved estimates in both frameworks; however, when the number of taxa was increased, estimation accuracy for deeper branches was less than for tip branches. Increasing the complexity of the dataset produced more misestimated branches in a Bayesian framework; however, in an ML framework, more branches were estimated more accurately. Using ML branch length estimates to re-estimate plethodontid salamander divergence dates generally resulted in an increase in the estimated age of older nodes and a decrease in the estimated age of younger nodes. Branch lengths are misestimated in both statistical frameworks for simulations of simple datasets. However, for complex datasets, length estimates are quite accurate in ML (even for short datasets), whereas few branches are estimated accurately in a Bayesian framework. Our reanalysis of empirical data demonstrates the magnitude of effects of Bayesian branch length misestimation on divergence date estimates. Because the length of branches for empirical datasets can be estimated most reliably in an ML framework when branches are < 1 substitution/site and datasets are ≥ 1 kb, we suggest that divergence date estimates using datasets, branch lengths, and/or analytical techniques that fall outside of these parameters should be interpreted with caution.
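A toy Jukes-Cantor simulation makes the core difficulty visible: as the true branch length grows, the p-distance saturates and estimates become noisy. This is only an illustration of the phenomenon, not the substitution models or inference frameworks used in the study:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_p_distance(t, n_sites):
    """Fraction of differing sites for branch length t (subs/site)
    under Jukes-Cantor: p = 3/4 * (1 - exp(-4t/3))."""
    p = 0.75 * (1.0 - np.exp(-4.0 * t / 3.0))
    return rng.binomial(n_sites, p) / n_sites

def jc69_branch_length(p):
    """Invert the JC69 formula; estimates blow up as p approaches 3/4."""
    return -0.75 * np.log(1.0 - 4.0 * p / 3.0)

for t in (0.05, 0.5, 1.5):
    ests = [jc69_branch_length(simulate_p_distance(t, 1000))
            for _ in range(200)]
    print(f"true t = {t:.2f}  mean estimate = {np.mean(ests):.3f} "
          f"(sd {np.std(ests):.3f})")
```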
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Scott D.; Eckberg, Alison D.; Thallapally, Praveen K.
2011-09-01
The metal-organic framework Cu-BTC was evaluated for its ability to selectively interact with Lewis-base analytes, including explosives, by examining retention on GC columns packed with Chromosorb W HP that contained 3.0% SE-30 along with various loadings of Cu-BTC. SEM images of the support material showed the characteristic Cu-BTC crystals embedded in the SE-30 coating on the diatomaceous support. Results indicated that the Cu-BTC-containing stationary phase had limited thermal stability (220°C) and strong general retention for analytes. Kováts index calculations showed selective retention (amounting to about 300 Kováts units) relative to n-alkanes for many small Lewis-base analytes on a column thatmore » contained 0.75% Cu-BTC compared to an SE-30 control. Short columns that contained lower loadings of Cu-BTC (0.10%) were necessary to elute explosives and related analytes; however, selectivity was not observed for aromatic compounds (including nitroaromatics) or nitroalkanes. Observed retention characteristics are discussed.« less
Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih
2015-01-01
The current rapid growth of the Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, for which time-critical analytics and the clustering of knowledge granules represent compelling application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements a neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156
Crow, Megan; Paul, Anirban; Ballouz, Sara; Huang, Z Josh; Gillis, Jesse
2018-02-28
Single-cell RNA-sequencing (scRNA-seq) technology provides a new avenue to discover and characterize cell types; however, the experiment-specific technical biases and analytic variability inherent to current pipelines may undermine its replicability. Meta-analysis is further hampered by the use of ad hoc naming conventions. Here we demonstrate our replication framework, MetaNeighbor, that quantifies the degree to which cell types replicate across datasets, and enables rapid identification of clusters with high similarity. We first measure the replicability of neuronal identity, comparing results across eight technically and biologically diverse datasets to define best practices for more complex assessments. We then apply this to novel interneuron subtypes, finding that 24/45 subtypes have evidence of replication, which enables the identification of robust candidate marker genes. Across tasks we find that large sets of variably expressed genes can identify replicable cell types with high accuracy, suggesting a general route forward for large-scale evaluation of scRNA-seq data.
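A sketch of the neighbor-voting intuition behind cross-dataset replicability scoring, under stated assumptions (correlation-based votes and a rank AUROC; MetaNeighbor's actual scoring is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(5)

def auroc(scores, positives):
    """Rank-based AUROC (Mann-Whitney statistic)."""
    ranks = scores.argsort().argsort() + 1
    n_pos = positives.sum()
    n_neg = len(scores) - n_pos
    return (ranks[positives].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def neighbor_vote_auroc(expr_a, labels_a, expr_b, labels_b, cell_type):
    """Vote for each cell in dataset B = mean correlation to cells of
    `cell_type` in dataset A; AUROC asks whether B's cells of that
    type receive the highest votes (1.0 = perfectly replicable)."""
    ref = expr_a[labels_a == cell_type]                # (n_ref, n_genes)
    corr = np.corrcoef(np.vstack([ref, expr_b]))
    votes = corr[len(ref):, :len(ref)].mean(axis=1)
    return auroc(votes, labels_b == cell_type)

# Toy "datasets": 3 cell types sharing expression signatures.
genes, per_type = 50, 30
signatures = rng.normal(0, 2, (3, genes))

def make_dataset():
    labels = np.repeat(np.arange(3), per_type)
    return signatures[labels] + rng.normal(0, 1, (len(labels), genes)), labels

(ea, la), (eb, lb) = make_dataset(), make_dataset()
print(f"replicability AUROC, type 0: {neighbor_vote_auroc(ea, la, eb, lb, 0):.2f}")
```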
Zelovich, Tamar; Hansen, Thorsten; Liu, Zhen-Fei; ...
2017-03-02
A parameter-free version of the recently developed driven Liouville-von Neumann equation [T. Zelovich et al., J. Chem. Theory Comput. 10(8), 2927-2941 (2014)] for electronic transport calculations in molecular junctions is presented. The single driving rate, appearing as a fitting parameter in the original methodology, is replaced by a set of state-dependent broadening factors applied to the different single-particle lead levels. These broadening factors are extracted explicitly from the self-energy of the corresponding electronic reservoir and are fully transferable to any junction incorporating the same lead model. Furthermore, the performance of the method is demonstrated via tight-binding and extended Hückel calculations of simple junction models. Our analytic considerations and numerical results indicate that the developed methodology constitutes a rigorous framework for the design of "black-box" algorithms to simulate electron dynamics in open quantum systems out of equilibrium.
Analyzing Electronic Question/Answer Services: Framework and Evaluations of Selected Services.
ERIC Educational Resources Information Center
White, Marilyn Domas, Ed.
This report develops an analytical framework based on systems analysis for evaluating electronic question/answer or AskA services operated by a wide range of types of organizations, including libraries. Version 1.0 of this framework was applied in June 1999 to a selective sample of 11 electronic question/answer services, which cover a range of…
Rainbow: A Framework for Analysing Computer-Mediated Pedagogical Debates
ERIC Educational Resources Information Center
Baker, Michael; Andriessen, Jerry; Lund, Kristine; van Amelsvoort, Marie; Quignard, Matthieu
2007-01-01
In this paper we present a framework for analysing when and how students engage in a specific form of interactive knowledge elaboration in CSCL environments: broadening and deepening understanding of a space of debate. The framework is termed "Rainbow," as it comprises seven principal analytical categories, to each of which a colour is assigned,…
Analysis of Naval NETWAR FORCEnet Enterprise: Implications for Capabilities Based Budgeting
2006-12-01
Based on this background information and projecting how ADNS is likely to succeed in the NNFE framework, two fundamental research questions were addressed.
ERIC Educational Resources Information Center
Bennison, Anne; Goos, Merrilyn
2013-01-01
This paper reviews recent literature on teacher identity in order to propose an operational framework that can be used to investigate the formation and development of numeracy teacher identities. The proposed framework is based on Van Zoest and Bohl's (2005) framework for mathematics teacher identity with a focus on those characteristics thought…
Green Framework and Its Role in Sustainable City Development (by Example of Yekaterinburg)
NASA Astrophysics Data System (ADS)
Maltseva, A.
2017-11-01
The article focuses on the destruction of the city green framework in Yekaterinburg. A strategy for its recovery by means of a bioactive core, represented by a botanic garden, is proposed. An analytical framework for tracking changes in the proportion of green territories relative to the total city area is also described.
DOT National Transportation Integrated Search
2012-05-01
This report provides an analytical framework for evaluating the two field deployments under the United States Department of Transportation (U.S. DOT) Integrated Corridor Management (ICM) Initiative Demonstration Phase. The San Diego Interstate 15 cor...
Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.
Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil
2015-11-18
Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. The objective was to design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrate the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds and its patients can be classified generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, there will be 34.9 patients in the SickKids NICU. Currently, 46% of patients cannot be admitted to the SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support in tertiary care. We demonstrate how to size the equipment needed in the cloud for that architecture based on a realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and, furthermore, that it can be replicated for any critical care setting within a tertiary institution.
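The census and blocking figures quoted above can be sanity-checked with a standard loss-system calculation. The sketch below treats the NICU as an M/M/c/c (Erlang loss) queue, an assumption on our part; the authors' analytical model may differ in detail, but the numbers land close to the reported 34.9-patient average census and roughly half of arrivals blocked at 36 beds:

```python
def erlang_b(servers, offered_load):
    """Blocking probability from the standard Erlang-B recursion."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

arrivals_per_day, los_days = 4.5, 16.0      # figures from the abstract
load = arrivals_per_day * los_days          # 72 erlangs of offered load

for beds in (36, 90):
    blocking = erlang_b(beds, load)
    census = load * (1.0 - blocking)        # Little's law on admitted flow
    print(f"{beds} beds: blocking {blocking:.0%}, average census {census:.1f}")
```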
Distinctive aspects of the evolution of galactic magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yar-Mukhamedov, D., E-mail: danial.su@gmail.com
2016-11-15
We perform an in-depth analysis of the evolution of galactic magnetic fields within a semi-analytic galaxy formation and evolution framework, determine various distinctive aspects of the evolution process, and obtain analytic solutions for a wide range of possible evolution scenarios.
Strategic, Analytic and Operational Domains of Information Management.
ERIC Educational Resources Information Center
Diener, Richard AV
1992-01-01
Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…
Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D
2013-10-01
Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.
A consistent conceptual framework for applying climate metrics in technology life cycle assessment
NASA Astrophysics Data System (ADS)
Mallapragada, Dharik; Mignone, Bryan K.
2017-07-01
Comparing the potential climate impacts of different technologies is challenging for several reasons, including the fact that any given technology may be associated with emissions of multiple greenhouse gases when evaluated on a life cycle basis. In general, analysts must decide how to aggregate the climatic effects of different technologies, taking into account differences in the properties of the gases (differences in atmospheric lifetimes and instantaneous radiative efficiencies) as well as different technology characteristics (differences in emission factors and technology lifetimes). Available metrics proposed in the literature have incorporated these features in different ways and have arrived at different conclusions. In this paper, we develop a general framework for classifying metrics based on whether they measure: (a) cumulative or end point impacts, (b) impacts over a fixed time horizon or up to a fixed end year, and (c) impacts from a single emissions pulse or from a stream of pulses over multiple years. We then use the comparison between compressed natural gas and gasoline-fueled vehicles to illustrate how the choice of metric can affect conclusions about technologies. Finally, we consider tradeoffs involved in selecting a metric, show how the choice of metric depends on the framework that is assumed for climate change mitigation, and suggest which subset of metrics are likely to be most analytically self-consistent.
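To illustrate the classification above, the sketch below computes a cumulative-forcing metric for a single pulse versus a stream of annual pulses over a fixed horizon, for a short-lived and a long-lived gas. Radiative efficiencies and lifetimes are illustrative round numbers (and real CO2 decay is not single-exponential), so this is a sketch of the metric structure, not of published values:

```python
import numpy as np

def pulse_cumulative_forcing(re, tau, horizon):
    """Integral of re * exp(-t/tau) from 0 to horizon for a unit pulse."""
    return re * tau * (1.0 - np.exp(-horizon / tau))

def stream_cumulative_forcing(re, tau, horizon, emission_years):
    """A stream of annual unit pulses, each integrated up to the same
    fixed end year (the 'fixed end year' variant of the taxonomy)."""
    return sum(pulse_cumulative_forcing(re, tau, horizon - y)
               for y in range(emission_years))

H = 100                                   # evaluation horizon, years
gases = {"short-lived (CH4-like)": (1.0, 12.0),
         "long-lived (CO2-like proxy)": (0.01, 500.0)}

for name, (re, tau) in gases.items():
    p = pulse_cumulative_forcing(re, tau, H)
    s = stream_cumulative_forcing(re, tau, H, emission_years=30)
    print(f"{name}: pulse = {p:.2f}, 30-yr stream = {s:.2f}")
```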
Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)
NASA Astrophysics Data System (ADS)
Dubinskii, Yu A.; Osipenko, A. S.
2000-02-01
Two kinds of new mathematical models of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.
Quantifying drivers of wild pig movement across multiple spatial and temporal scales.
Kay, Shannon L; Fischer, Justin W; Monaghan, Andrew J; Beasley, James C; Boughton, Raoul; Campbell, Tyler A; Cooper, Susan M; Ditchkoff, Stephen S; Hartley, Steve B; Kilgo, John C; Wisely, Samantha M; Wyckoff, A Christy; VerCauteren, Kurt C; Pepin, Kim M
2017-01-01
The movement behavior of an animal is determined by extrinsic and intrinsic factors that operate at multiple spatio-temporal scales, yet much of our knowledge of animal movement comes from studies that examine only one or two scales concurrently. Understanding the drivers of animal movement across multiple scales is crucial for understanding the fundamentals of movement ecology, predicting changes in distribution, describing disease dynamics, and identifying efficient methods of wildlife conservation and management. We obtained over 400,000 GPS locations of wild pigs from 13 different studies spanning six states in southern U.S.A., and quantified movement rates and home range size within a single analytical framework. We used a generalized additive mixed model framework to quantify the effects of five broad predictor categories on movement: individual-level attributes, geographic factors, landscape attributes, meteorological conditions, and temporal variables. We examined effects of predictors across three temporal scales: daily, monthly, and using all data during the study period. We considered both local environmental factors such as daily weather data and distance to various resources on the landscape, as well as factors acting at a broader spatial scale such as ecoregion and season. We found meteorological variables (temperature and pressure), landscape features (distance to water sources), a broad-scale geographic factor (ecoregion), and individual-level characteristics (sex-age class), drove wild pig movement across all scales, but both the magnitude and shape of covariate relationships to movement differed across temporal scales. The analytical framework we present can be used to assess movement patterns arising from multiple data sources for a range of species while accounting for spatio-temporal correlations. Our analyses show the magnitude by which reaction norms can change based on the temporal scale of response data, illustrating the importance of appropriately defining temporal scales of both the movement response and covariates depending on the intended implications of research (e.g., predicting effects of movement due to climate change versus planning local-scale management). We argue that consideration of multiple spatial scales within the same framework (rather than comparing across separate studies post hoc) gives a more accurate quantification of cross-scale spatial effects by appropriately accounting for error correlation.
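As an illustration of the modelling idea (not the authors' code), a GAM relating a synthetic log movement rate to smooth weather/landscape terms and categorical ecoregion and sex-age effects might look as follows; this assumes the pygam package and omits the random-effect (mixed-model) structure and spatio-temporal correlation handling of the actual analysis:

```python
import numpy as np
from pygam import LinearGAM, s, f   # assumes the pygam package

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([
    rng.normal(20, 8, n),           # 0: air temperature (C)
    rng.normal(1013, 6, n),         # 1: barometric pressure (hPa)
    rng.exponential(500, n),        # 2: distance to water (m)
    rng.integers(0, 4, n),          # 3: ecoregion code (categorical)
    rng.integers(0, 3, n),          # 4: sex-age class (categorical)
])
# Synthetic log movement rate: warmer weather and proximity to water
# increase movement, with an offset for one demographic class.
y = (0.02 * X[:, 0] - 0.0005 * X[:, 2] + 0.3 * (X[:, 4] == 2)
     + rng.normal(0, 0.3, n))

# Smooth terms for continuous drivers, factor terms for categories.
gam = LinearGAM(s(0) + s(1) + s(2) + f(3) + f(4)).fit(X, y)
gam.summary()
```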
Pilot testing of SHRP 2 reliability data and analytical products: Florida. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
SHRP 2 initiated the L38 project to pilot test products from five of the program's completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and a decision-making framework. The L38 project...
Joseph v. Brady: Synthesis Reunites What Analysis Has Divided
ERIC Educational Resources Information Center
Thompson, Travis
2012-01-01
Joseph V. Brady (1922-2011) created behavior-analytic neuroscience and the analytic framework for understanding how the external and internal neurobiological environments and mechanisms interact. Brady's approach offered synthesis as well as analysis. He embraced Findley's approach to constructing multioperant behavioral repertoires that found…
Comparing volume of fluid and level set methods for evaporating liquid-gas flows
NASA Astrophysics Data System (ADS)
Palmore, John; Desjardins, Olivier
2016-11-01
This presentation demonstrates three numerical strategies for simulating liquid-gas flows undergoing evaporation. The practical aim of this work is to choose a framework capable of simulating the combustion of liquid fuels in an internal combustion engine. Each framework is analyzed with respect to its accuracy and computational cost. All simulations are performed using a conservative, finite-volume code for simulating reacting, multiphase flows under the low-Mach assumption. The strategies used in this study correspond to different methods for tracking the liquid-gas interface and handling the transport of the discontinuous momentum and vapor mass fraction fields. The first two strategies are based on conservative, geometric volume-of-fluid schemes using directionally split and un-split advection, respectively. The third strategy is the accurate conservative level set method. For all strategies, special attention is given to ensuring consistency between the fluxes of mass, momentum, and vapor fractions. The study performs three-dimensional simulations of an isolated droplet of a single-component fuel evaporating into air. Evaporation rates and vapor mass fractions are compared to analytical results.
Multi-Regge kinematics and the moduli space of Riemann spheres with marked points
Del Duca, Vittorio; Druc, Stefan; Drummond, James; ...
2016-08-25
We show that scattering amplitudes in planar N = 4 Super Yang-Mills in multi-Regge kinematics can naturally be expressed in terms of single-valued iterated integrals on the moduli space of Riemann spheres with marked points. As a consequence, scattering amplitudes in this limit can be expressed as convolutions that can easily be computed using Stokes' theorem. We apply this framework to MHV amplitudes to leading-logarithmic accuracy (LLA), and we prove that at L loops all MHV amplitudes are determined by amplitudes with up to L + 4 external legs. We also investigate non-MHV amplitudes, and we show that they can be obtained by convoluting the MHV results with a certain helicity flip kernel. We classify all leading singularities that appear at LLA in the Regge limit for arbitrary helicity configurations and any number of external legs. In conclusion, we use our new framework to obtain explicit analytic results at LLA for all MHV amplitudes up to five loops and all non-MHV amplitudes with up to eight external legs and four loops.
Banerjee, Debasis; Wang, Hao; Gong, Qihan; ...
2015-10-27
Here, the efficiency of physisorption-based separation of gas mixtures depends on the selectivity of the adsorbent, which is directly linked to the size, shape, polarizability and other physical properties of the adsorbed molecules. Commensurate adsorption is an interesting and important adsorption phenomenon, in which the adsorbed amount, location, and orientation of an adsorbate are commensurate with the crystal symmetry of the adsorbent. Understanding this phenomenon is important and beneficial, as it can provide vital information about adsorbate-adsorbent interactions and adsorption-desorption mechanisms. So far, only sporadic examples of commensurate adsorption have been reported in porous materials such as zeolites and metal-organic frameworks (MOFs). In this work we show for the first time direct structural evidence of a commensurate-to-incommensurate transition of linear hydrocarbon molecules (C2-C7) in a microporous MOF, by employing a number of analytical techniques including single crystal X-ray diffraction (SCXRD), in situ powder X-ray diffraction coupled with differential scanning calorimetry (PXRD-DSC), gas adsorption and molecular simulations.
A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.
Morag, Ido; Luria, Gil
2013-01-01
Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.
Pion single and double charge exchange in the resonance region: Dynamical corrections
NASA Astrophysics Data System (ADS)
Johnson, Mikkel B.; Siciliano, E. R.
1983-04-01
We consider pion-nucleus elastic scattering and single- and double-charge-exchange scattering to isobaric analog states near the (3,3) resonance within an isospin-invariant framework. We extend previous theories by introducing terms into the optical potential U that are quadratic in density and consistent with isospin invariance of the strong interaction. We study the sensitivity of single and double charge exchange angular distributions to parameters of the second-order potential both numerically, by integrating the Klein-Gordon equation, and analytically, by using semiclassical approximations that explicate the dependence of the exact numerical results on the parameters of U. The magnitude and shape of double charge exchange angular distributions are more sensitive to the isotensor term in U than has been hitherto appreciated. An examination of recent experimental data shows that puzzles in the shape of the 18O(π+, π-)18Ne angular distribution at 164 MeV and in the A dependence of the forward double charge exchange scattering on 18O, 26Mg, 42Ca, and 48Ca at the same energy may be resolved by adding an isotensor term in U. NUCLEAR REACTIONS Scattering theory for elastic, single-, and double-charge-exchange scattering to IAS in the region of the P33 resonance. Second-order effects on charge-exchange calculations of σ(A, θ).
Clerehan, Rosemary; Hirsh, Di; Buchbinder, Rachelle
2009-01-01
While clinicians may routinely use patient information leaflets about drug therapy, a poorly conceived leaflet has the potential to do harm. We previously developed a novel approach to analysing leaflets about a rheumatoid arthritis drug, using an analytic approach based on systemic functional linguistics. The aim of the present study was to verify the validity of the linguistic framework by applying it to two further arthritis drug leaflets. The findings confirmed the applicability of the framework and were used to refine it. A new stage or 'move' in the genre was identified. While the function of many of the moves appeared to be 'to instruct' the patient, the instruction was often unclear. The role relationships expressed in the text were critical to the meaning. As with our previous study, judged on their lexical density, the leaflets resembled academic text. The framework can provide specific tools to assess and produce medication information leaflets to support readers in taking medication. Future work could utilize the framework to evaluate information on other treatments and procedures or on healthcare information more widely.
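Lexical density, one of the framework's measures, has a simple operational form: the share of content words among all words. A naive sketch with an abbreviated, illustrative function-word list (systemic functional linguistics classifies words more carefully):

```python
# Naive lexical-density estimate: content words / total words.
FUNCTION_WORDS = {
    "the", "a", "an", "and", "or", "but", "if", "of", "to", "in", "on",
    "for", "with", "is", "are", "was", "were", "be", "been", "it", "this",
    "that", "you", "your", "not", "do", "does", "may", "can", "will",
}

def lexical_density(text):
    words = [w.strip(".,;:()").lower() for w in text.split()]
    words = [w for w in words if w]
    content = [w for w in words if w not in FUNCTION_WORDS]
    return len(content) / len(words)

sample = ("Take one tablet daily with food. Do not stop taking this "
          "medicine without asking your doctor.")
print(f"lexical density = {lexical_density(sample):.2f}")
```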
Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina
2016-01-01
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405
Big data analytics in healthcare: promise and potential.
Raghupathi, Wullianallur; Raghupathi, Viju
2014-01-01
To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however there remain challenges to overcome.
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. A cloud-based release, tested with OpenNebula, is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
NASA Technical Reports Server (NTRS)
Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James;
2016-01-01
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework scans for new data and applies batches of analytics as the data flow in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The result products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together, making efficient use of resources for processing hyperspectral satellite image data and other large environmental datasets that may be analyzed for many purposes.
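The "Wheel" pattern described above reduces, schematically, to scanning once for new scenes, preprocessing each scene once, and fanning the shared result out to every registered analytic. A hedged sketch with illustrative function names, not Project Matsu's actual code:

```python
from typing import Callable, Dict, List

ANALYTICS: List[Callable[[Dict], Dict]] = []

def analytic(fn):
    """Register an analytic to run on every wheel rotation."""
    ANALYTICS.append(fn)
    return fn

@analytic
def spectral_anomaly_detector(scene):
    return {"analytic": "anomaly", "n_pixels": len(scene["pixels"])}

@analytic
def land_cover_classifier(scene):
    return {"analytic": "landcover", "n_pixels": len(scene["pixels"])}

def wheel_rotation(new_scenes):
    for raw in new_scenes:                        # scan once for new data
        scene = {"id": raw["id"], "pixels": raw["data"]}  # preprocess once
        for fn in ANALYTICS:                      # fan out to all analytics
            print(scene["id"], fn(scene))

wheel_rotation([{"id": "EO1-2016-001", "data": [0.1, 0.4, 0.9]}])
```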
Facilitating Multiple Intelligences through Multimodal Learning Analytics
ERIC Educational Resources Information Center
Perveen, Ayesha
2018-01-01
This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students by considering their potential of being multiple intelligences based on Howard Gardner's 1983 theory of multiple intelligences. The study first emphasizes the need to facilitate students as…
Dhummakupt, Elizabeth S; Carmany, Daniel O; Mach, Phillip M; Tovar, Trenton M; Ploskonka, Ann M; Demond, Paul S; DeCoste, Jared B; Glaros, Trevor
2018-03-07
Paper spray mass spectrometry has been shown to successfully analyze chemical warfare agent (CWA) simulants. However, due to the volatility differences between the simulants and real G-series (i.e., sarin, soman) CWAs, analysis from an untreated paper substrate proved difficult. To extend the analytical lifetime of these G-agents, metal-organic frameworks (MOFs) were successfully integrated onto the paper spray substrates to increase adsorption and desorption. In this study, several MOFs and nanoparticles were tested to extend the analytical lifetimes of sarin, soman, and cyclosarin on paper spray substrates. It was found that the addition of either UiO-66 or HKUST-1 to the paper substrate increased the analytical lifetime of the G-agents from less than 5 min of detectability to at least 50 min.
Analyte discrimination from chemiresistor response kinetics.
Read, Douglas H; Martin, James E
2010-08-15
Chemiresistors are polymer-based sensors that transduce the sorption of a volatile organic compound into a resistance change. Like other polymer-based gas sensors that function through sorption, chemiresistors can be selective for analytes on the basis of the affinity of the analyte for the polymer. However, a single sensor cannot, in and of itself, discriminate between analytes, since a small concentration of an analyte that has a high affinity for the polymer might give the same response as a high concentration of another analyte with a low affinity. In this paper we use a field-structured chemiresistor to demonstrate that its response kinetics can be used to discriminate between analytes, even between those that have identical chemical affinities for the polymer phase of the sensor. The response kinetics is shown to be independent of the analyte concentration, and thus the magnitude of the sensor response, but is found to vary inversely with the analyte's saturation vapor pressure. Saturation vapor pressures often vary greatly from analyte to analyte, so analysis of the response kinetics offers a powerful method for obtaining analyte discrimination from a single sensor.
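Since the abstract reports that the response kinetics are concentration-independent and vary inversely with saturation vapor pressure, discrimination reduces to fitting a time constant. The sketch below, under the simplifying assumption of a first-order (single-exponential) sorption response and with purely illustrative numbers, shows the idea; it is not the authors' analysis code.

```python
# Sketch: discriminating analytes from chemiresistor response kinetics.
# Assumes a first-order (exponential) response; the fitted time constant
# tau is concentration-independent and taken to scale as 1/p_sat, as the
# abstract reports. All numbers are illustrative, not calibration data.
import numpy as np
from scipy.optimize import curve_fit

def response(t, dR, tau):
    return dR * (1.0 - np.exp(-t / tau))

rng = np.random.default_rng(0)
t = np.linspace(0, 30, 200)  # seconds
# Two hypothetical analytes with equal response magnitude, different kinetics:
for name, tau_true in [("volatile analyte", 2.0), ("low-volatility analyte", 8.0)]:
    r = response(t, 1.0, tau_true) + 0.01 * rng.standard_normal(t.size)
    (dR, tau), _ = curve_fit(response, t, r, p0=(0.5, 1.0))
    # tau identifies the analyte; concentration only rescales dR.
    print(f"{name}: fitted tau = {tau:.2f} s (magnitude dR = {dR:.2f})")
```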
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
Polsky, Colin; Grove, J. Morgan; Knudson, Chris; Groffman, Peter M.; Bettez, Neil; Cavender-Bares, Jeannine; Hall, Sharon J.; Heffernan, James B.; Hobbie, Sarah E.; Larson, Kelli L.; Morse, Jennifer L.; Neill, Christopher; Nelson, Kristen C.; Ogden, Laura A.; O’Neil-Dunne, Jarlath; Pataki, Diane E.; Roy Chowdhury, Rinku; Steele, Meredith K.
2014-01-01
Changes in land use, land cover, and land management present some of the greatest potential global environmental challenges of the 21st century. Urbanization, one of the principal drivers of these transformations, is commonly thought to be generating land changes that are increasingly similar. An implication of this multiscale homogenization hypothesis is that the ecosystem structure and function and human behaviors associated with urbanization should be more similar in certain kinds of urbanized locations across biogeophysical gradients than across urbanization gradients in places with similar biogeophysical characteristics. This paper introduces an analytical framework for testing this hypothesis, and applies the framework to the case of residential lawn care. This set of land management behaviors is often assumed—not demonstrated—to exhibit homogeneity. Multivariate analyses are conducted on telephone survey responses from a geographically stratified random sample of homeowners (n = 9,480), equally distributed across six US metropolitan areas. Two behaviors are examined: lawn fertilizing and irrigating. Limited support for strong homogenization is found at two scales (i.e., multi- and single-city; 2 of 36 cases), but significant support is found for homogenization at only one scale (22 cases) or at neither scale (12 cases). These results suggest that US lawn care behaviors are more differentiated in practice than in theory. Thus, even if the biophysical outcomes of urbanization are homogenizing, managing the associated sustainability implications may require a multiscale, differentiated approach because the underlying social practices appear relatively varied. The analytical approach introduced here should also be productive for other facets of urban-ecological homogenization. PMID:24616515
Åsberg, Dennis; Leśko, Marek; Samuelsson, Jörgen; Kaczmarski, Krzysztof; Fornstedt, Torgny
2014-10-03
This is the first investigation in a series that aims to enhance the scientific knowledge needed for reliable analytical method transfer between HPLC and UHPLC using the quality by design (QbD) framework. Here, we investigated the differences and similarities from a thermodynamic point of view between RP-LC separations conducted with 3.5 μm (HPLC) and 1.7 μm (UHPLC) C18 particles. Three different model solutes and one pharmaceutical compound were used: the uncharged cycloheptanone, the cationic benzyltriethylammonium chloride, the anionic sodium 2-naphthalene sulfonate, and the pharmaceutical compound omeprazole, which was anionic at the studied pH. Adsorption data were determined for the four solutes at varying fractions of organic modifier and in gradient elution in both the HPLC and UHPLC systems. From the adsorption data, the adsorption energy distribution of each compound was calculated and the adsorption isotherm model was estimated. We found that the adsorption energy distribution was similar, with only minor differences in degree of homogeneity, for HPLC and UHPLC stationary phases. The adsorption isotherm model did not change between HPLC and UHPLC, but the parameter values changed considerably, especially for the ionic compounds. The dependence on the organic modifier followed the same trend in HPLC as in UHPLC. These results indicate that the adsorption mechanism of a solute is the same on HPLC and UHPLC stationary phases, which simplifies the design of a single analytical method applicable to both HPLC and UHPLC conditions within the QbD framework. Copyright © 2014. Published by Elsevier B.V.
Parallel Aircraft Trajectory Optimization with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Falck, Robert D.; Gray, Justin S.; Naylor, Bret
2016-01-01
Trajectory optimization is an integral component of aerospace vehicle design, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the nonlinear analysis evaluations and the derivative computations themselves. The constraint aggregation results revealed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
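For readers unfamiliar with direct collocation, the toy below shows the basic pattern of implicit defect constraints on a time grid. It deliberately swaps in simple trapezoidal collocation and scipy in place of the paper's Legendre-Gauss-Lobatto scheme and OpenMDAO, minimizing control effort for a double integrator driven from rest at 0 to rest at 1.

```python
# Toy direct-collocation trajectory optimization (trapezoidal defects, not
# the Legendre-Gauss-Lobatto scheme of the paper): double integrator,
# minimize integrated control effort subject to dynamics and boundaries.
import numpy as np
from scipy.optimize import minimize

N, T = 21, 1.0
h = T / (N - 1)

def unpack(z):
    return z[:N], z[N:2*N], z[2*N:]           # position, velocity, control

def defects(z):
    p, v, u = unpack(z)
    f_p, f_v = v, u                           # dynamics: p' = v, v' = u
    dp = p[1:] - p[:-1] - 0.5 * h * (f_p[1:] + f_p[:-1])
    dv = v[1:] - v[:-1] - 0.5 * h * (f_v[1:] + f_v[:-1])
    bc = [p[0], v[0], p[-1] - 1.0, v[-1]]     # boundary conditions
    return np.concatenate([dp, dv, bc])

objective = lambda z: np.sum(unpack(z)[2] ** 2) * h
z0 = np.concatenate([np.linspace(0, 1, N), np.zeros(N), np.zeros(N)])
sol = minimize(objective, z0, constraints={"type": "eq", "fun": defects},
               method="SLSQP")
print("converged:", sol.success, " effort:", sol.fun)
```

The defect vector is the implicit-integration residual the abstract alludes to: each defect couples only neighboring nodes, which is exactly the sparsity that higher-order schemes exploit for parallelization.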
NASA Astrophysics Data System (ADS)
Donohue, Randall; Yang, Yuting; McVicar, Tim; Roderick, Michael
2016-04-01
A fundamental question in climate and ecosystem science is "how does climate regulate the land surface carbon budget?" To better answer that question, here we develop an analytical model for estimating mean annual terrestrial gross primary productivity (GPP), which is the largest carbon flux over land, based on a rate-limitation framework. Actual GPP (climatological mean from 1982 to 2010) is calculated as a function of the balance between two GPP potentials defined by the climate (i.e., precipitation and solar radiation) and a third parameter that encodes other environmental variables and modifies the GPP-climate relationship. The developed model was tested at three spatial scales using different GPP sources, i.e., (1) observed GPP from 94 flux-sites, (2) modelled GPP (using the model-tree-ensemble approach) at 48654 (0.5 degree) grid-cells and (3) at 32 large catchments across the globe. Results show that the proposed model could account for the spatial GPP patterns, with a root-mean-square error of 0.70, 0.65 and 0.3 g C m-2 d-1 and R2 of 0.79, 0.92 and 0.97 for the flux-site, grid-cell and catchment scales, respectively. This analytical GPP model shares a similar form with the Budyko hydroclimatological model, which opens the possibility of a general analytical framework to analyze the linked carbon-water-energy cycles.
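To make the Budyko analogy concrete, one Choudhury-Yang-type expression consistent with the description (two climate-defined potentials mediated by a single extra parameter) is

\[ \mathrm{GPP} \;=\; \frac{G_P \, G_R}{\left(G_P^{\,k} + G_R^{\,k}\right)^{1/k}}, \]

where G_P and G_R denote the precipitation- and radiation-defined GPP potentials and k encodes the other environmental variables. This is only an illustrative analogue; the paper's exact functional form may differ.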
Cultural Cleavage and Criminal Justice.
ERIC Educational Resources Information Center
Scheingold, Stuart A.
1978-01-01
Reviews major theories of criminal justice, proposes an alternative analytic framework which focuses on cultural factors, applies this framework to several cases, and discusses implications of a cultural perspective for rule of law values. Journal available from Office of Publication, Department of Political Science, University of Florida,…
Institutional Racist Melancholia: A Structural Understanding of Grief and Power in Schooling
ERIC Educational Resources Information Center
Vaught, Sabina E.
2012-01-01
In this article, Sabina Vaught undertakes the theoretical and analytical project of conceptually integrating "Whiteness as property", a key structural framework of Critical Race Theory (CRT), and "melancholia", a framework originally emerging from psychoanalysis. Specifically, Vaught engages "Whiteness as property" as…
Video-Based Analyses of Motivation and Interaction in Science Classrooms
NASA Astrophysics Data System (ADS)
Moeller Andersen, Hanne; Nielsen, Birgitte Lund
2013-04-01
An analytical framework for examining students' motivation was developed and used for analyses of video excerpts from science classrooms. The framework was developed in an iterative process involving theories on motivation and video excerpts from a 'motivational event' where students worked in groups. Subsequently, the framework was used for an analysis of students' motivation in the whole class situation. A cross-case analysis was carried out illustrating characteristics of students' motivation dependent on the context. This research showed that students' motivation to learn science is stimulated by a range of different factors, with autonomy, relatedness and belonging apparently being the main sources of motivation. The teacher's combined use of questions, uptake and high level evaluation was very important for students' learning processes and motivation, especially students' self-efficacy. By coding and analysing video excerpts from science classrooms, we were able to demonstrate that the analytical framework helped us gain new insights into the effect of teachers' communication and other elements on students' motivation.
ClimateSpark: An in-memory distributed computing framework for big climate data analytics
NASA Astrophysics Data System (ADS)
Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei
2018-06-01
The unprecedented growth of climate data creates new opportunities for climate studies, yet big climate data pose a grand challenge to climatologists seeking to manage and analyze them efficiently. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high-performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to provide a web portal that facilitates the interaction among climatologists, climate data, analytic operations, and computing resources (e.g., using SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
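The chunk-index idea generalizes beyond Spark. The plain-Python sketch below uses a hypothetical, simplified metadata layout (not the ClimateSpark API) to show how a query can be matched against per-chunk spatiotemporal bounds so that only intersecting chunks are ever read.

```python
# Sketch of a spatiotemporal chunk index: before any bytes are read, a
# query is matched against per-chunk metadata so that only intersecting
# chunks are scheduled. Hypothetical, simplified layout for illustration.
chunks = [
    {"var": "tas", "t": (0, 99),    "lat": (-90, 0),  "lon": (0, 180), "path": "c0"},
    {"var": "tas", "t": (100, 199), "lat": (0, 90),   "lon": (0, 180), "path": "c1"},
    {"var": "pr",  "t": (0, 99),    "lat": (-90, 90), "lon": (0, 360), "path": "c2"},
]

def overlaps(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

def select_chunks(var, t, lat, lon):
    return [c["path"] for c in chunks
            if c["var"] == var and overlaps(c["t"], t)
            and overlaps(c["lat"], lat) and overlaps(c["lon"], lon)]

# Only chunk c1 is read for this query; c0 fails the lat test, c2 the var.
print(select_chunks("tas", t=(150, 160), lat=(10, 20), lon=(30, 40)))
```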
Defense Resource Management Studies: Introduction to Capability and Acquisition Planning Processes
2010-08-01
interchangeable and useful in a common contextual framework. Currently, both simulations use a common scenario, the same fictitious country, and...culture, legal framework, and institutions. • Incorporate Principles of Good Governance and Respect for Human Rights: Stress accountability and...Preparing for the assessments requires defining the missions to be analyzed; subdividing the mission definitions to provide a framework for analytic work
Using Learning Analytics for Preserving Academic Integrity
ERIC Educational Resources Information Center
Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena
2017-01-01
This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…
An Active Learning Exercise for Introducing Agent-Based Modeling
ERIC Educational Resources Information Center
Pinder, Jonathan P.
2013-01-01
Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…
Translating Learning into Numbers: A Generic Framework for Learning Analytics
ERIC Educational Resources Information Center
Greller, Wolfgang; Drachsler, Hendrik
2012-01-01
With the increase in available educational data, it is expected that Learning Analytics will become a powerful means to inform and support learners, teachers and their institutions in better understanding and predicting personal learning needs and performance. However, the processes and requirements behind the beneficial application of Learning…
ERIC Educational Resources Information Center
Dawson, Shane; Siemens, George
2014-01-01
The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional "literacy" skills towards an enhanced set of…
Robin J. Tausch
2015-01-01
A theoretically based analytic model of plant growth in single-species conifer communities, based on the species fully occupying a site and fully using the site resources, is introduced. Model derivations result in a single equation that simultaneously describes changes over both different site conditions (or resources available) and over time for each variable for each...
Burns, K C; Zotz, G
2010-02-01
Epiphytes are an important component of many forested ecosystems, yet our understanding of epiphyte communities lags far behind that of terrestrial-based plant communities. This discrepancy is exacerbated by the lack of a theoretical context to assess patterns in epiphyte community structure. We attempt to fill this gap by developing an analytical framework to investigate epiphyte assemblages, which we then apply to a data set on epiphyte distributions in a Panamanian rain forest. On a coarse scale, interactions between epiphyte species and host tree species can be viewed as bipartite networks, similar to pollination and seed dispersal networks. On a finer scale, epiphyte communities on individual host trees can be viewed as meta-communities, or suites of local epiphyte communities connected by dispersal. Similar analytical tools are typically employed to investigate species interaction networks and meta-communities, thus providing a unified analytical framework to investigate coarse-scale (network) and fine-scale (meta-community) patterns in epiphyte distributions. Coarse-scale analysis of the Panamanian data set showed that most epiphyte species interacted with fewer host species than expected by chance. Fine-scale analyses showed that epiphyte species richness on individual trees was lower than null model expectations. Therefore, epiphyte distributions were clumped at both scales, perhaps as a result of dispersal limitations. Scale-dependent patterns in epiphyte species composition were observed. Epiphyte-host networks showed evidence of negative co-occurrence patterns, which could arise from adaptations among epiphyte species to avoid competition for host species, while most epiphyte meta-communities were distributed at random. Application of our "meta-network" analytical framework in other locales may help to identify general patterns in the structure of epiphyte assemblages and their variation in space and time.
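A minimal version of the co-occurrence analysis described above can be sketched with a checkerboard C-score and a row-shuffle null model. Two hedges apply: ecological practice often uses stricter fixed-fixed randomizations, and the incidence matrix below is random placeholder data, not the Panamanian data set.

```python
# Sketch of a co-occurrence null-model test of the kind behind "negative
# co-occurrence patterns": the checkerboard C-score is compared against a
# null in which each species' occurrences are shuffled across trees (row
# totals fixed, columns equiprobable). Placeholder data, not the real set.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
occ = (rng.random((20, 40)) < 0.15).astype(int)   # species x trees (fake)

def c_score(m):
    scores = []
    for i, j in combinations(range(m.shape[0]), 2):
        shared = int(np.dot(m[i], m[j]))
        ri, rj = m[i].sum(), m[j].sum()
        scores.append((ri - shared) * (rj - shared))
    return np.mean(scores)

obs = c_score(occ)
null = []
for _ in range(499):
    shuf = occ.copy()
    for row in shuf:
        rng.shuffle(row)                  # permute within each species
    null.append(c_score(shuf))
p = (np.sum(np.array(null) >= obs) + 1) / (len(null) + 1)
print(f"observed C-score {obs:.2f}, one-sided p = {p:.3f}")
```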
Light scattering of a Bessel beam by a nucleated biological cell: An eccentric sphere model
NASA Astrophysics Data System (ADS)
Wang, Jia Jie; Han, Yi Ping; Chang, Jiao Yong; Chen, Zhu Yang
2018-02-01
Within the framework of generalized Lorenz-Mie theory (GLMT), an eccentrically stratified dielectric sphere model illuminated by an arbitrarily incident Bessel beam is applied to investigate the scattering characteristics of a single nucleated biological cell. The Bessel beam propagating in an arbitrary direction is expanded in terms of vector spherical wave functions (VSWFs), where the beam shape coefficients (BSCs) are calculated rigorously in a closed analytical form. The effects of the half-cone angle of Bessel beam, the location of the particle in the beam, the size ratio of nucleus to cell, and the location of the nucleus inside the cell on the scattering properties of a nucleated cell are analyzed. The results provide useful references for optical diagnostic and imaging of particle having nucleated structure.
Photocouplings at the pole from pion photoproduction
Rönchen, D.; Döring, M.; Huang, F.; ...
2014-06-24
The reactions γp → π⁰p and γp → π⁺n are analyzed in a semi-phenomenological approach up to E ~ 2.3 GeV. Fits to differential cross section and single and double polarization observables are performed. A good overall reproduction of the available photoproduction data is achieved. The Jülich2012 dynamical coupled-channel model, which simultaneously describes elastic πN scattering and the world database of the reactions πN → ηN, KΛ, and KΣ, is employed as the hadronic interaction in the final state. Furthermore, the framework guarantees analyticity and, thus, allows for a reliable extraction of resonance parameters in terms of poles and residues. In particular, the photocouplings at the pole can be extracted and are presented.
Monte Carlo Solution to Find Input Parameters in Systems Design Problems
NASA Astrophysics Data System (ADS)
Arsham, Hossein
2013-06-01
Most engineering system designs, such as product, process, and service design, involve a framework for arriving at a target value for a set of experiments. This paper considers a stochastic approximation algorithm for estimating the controllable input parameter within a desired accuracy, given a target value for the performance function. Two different problems, what-if and goal-seeking problems, are explained and defined in an auxiliary simulation model, which represents a local response surface model in terms of a polynomial. A method of constructing this polynomial by a single run simulation is explained. An algorithm is given to select the design parameter for the local response surface model. Finally, the mean time to failure (MTTF) of a reliability subsystem is computed and compared with its known analytical MTTF value for validation purposes.
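The goal-seeking problem, choosing the controllable input so that the expected simulation output hits a target, is the classic setting for Robbins-Monro stochastic approximation. The sketch below is a textbook version with a stand-in simulate() function, not the paper's specific algorithm.

```python
# Generic Robbins-Monro stochastic approximation for a goal-seeking
# problem: find the controllable input theta so that the expected (noisy)
# simulation output equals a target. Textbook sketch; simulate() is a
# stand-in model, not the paper's simulation.
import numpy as np

rng = np.random.default_rng(42)

def simulate(theta):
    # noisy performance measure; the true mean response is 3*theta + 1
    return 3.0 * theta + 1.0 + rng.normal(0.0, 0.5)

target, theta = 10.0, 0.0
for n in range(1, 2001):
    a_n = 1.0 / n                      # step sizes satisfying Robbins-Monro
    theta += a_n * (target - simulate(theta))
print(f"estimated input: {theta:.3f} (exact root is 3.0)")
```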
A Framework for Integrating Environmental Justice in Regulatory Analysis
Nweke, Onyemaechi C.
2011-01-01
With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235
Meyer, Adrian; Green, Laura; Faulk, Ciearro; Galla, Stephen; Meyer, Anne-Marie
2016-01-01
Introduction: Large amounts of health data generated by a wide range of health care applications across a variety of systems have the potential to offer valuable insight into populations and health care systems, but robust and secure computing and analytic systems are required to leverage this information. Framework: We discuss our experiences deploying a Secure Data Analysis Platform (SeDAP) and provide a framework to plan, build, and deploy a virtual desktop infrastructure (VDI) that enables innovation and collaboration while operating within academic funding structures. It outlines six core components: Security, Ease of Access, Performance, Cost, Tools, and Training. Conclusion: A platform like SeDAP is not successful simply through technical excellence and performance. Its adoption depends on a collaborative environment where researchers and users plan and evaluate the requirements of all aspects. PMID:27683665
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, as well as provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging yields improved noise robustness and decreased acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
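The frame-theoretic view can be made concrete in a few lines: each structured-light pattern is a frame vector, each detector reading is an inner product of the scene with that pattern, and reconstruction (in the simplest, noise-free, redundant case) applies the canonical dual frame via the pseudoinverse. The numpy sketch below is illustrative only, not the authors' reconstruction method.

```python
# Minimal frame-theoretic single-pixel imaging sketch: each measurement is
# an inner product of the scene with an illumination pattern (a frame
# vector); with a redundant set of patterns, reconstruction applies the
# canonical dual frame, i.e., the pseudoinverse. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 16 * 16                                 # scene pixels
scene = rng.random(n)                       # unknown image, flattened

m = 2 * n                                   # redundant frame (m >= n)
Phi = rng.choice([0.0, 1.0], size=(m, n))   # binary illumination patterns
y = Phi @ scene                             # single-pixel detector readings

recon = np.linalg.pinv(Phi) @ y             # synthesis with canonical dual
print("max reconstruction error:", np.abs(recon - scene).max())
```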
Deriving Appropriate Educational Program Costs in Illinois.
ERIC Educational Resources Information Center
Parrish, Thomas B.; Chambers, Jay G.
This document describes the comprehensive analytical framework for school finance used by the Illinois State Board of Education to assist policymakers in their decisions about equitable distribution of state aid and appropriate levels of resources to meet the varying educational requirements of differing student populations. This framework, the…
Analyzing Agricultural Technology Systems: A Research Report.
ERIC Educational Resources Information Center
Swanson, Burton E.
The International Program for Agricultural Knowledge Systems (INTERPAKS) research team is developing a descriptive and analytic framework to examine and assess agricultural technology systems. The first part of the framework is an inductive methodology that organizes data collection and orders data for comparison between countries. It requires and…
Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios
NASA Astrophysics Data System (ADS)
Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.
2018-03-01
An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures for policy relevant outcomes of interest, specifically for our example high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.
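One way to operationalize "which scenario assumptions are most tied to an outcome" is to treat the ensemble as a supervised-learning problem and rank assumptions by predictive importance. The sketch below does this on synthetic stand-in data, not the GCAM ensemble, and assumes scikit-learn is available; variable names are illustrative.

```python
# Sketch of consequence-driven scenario discovery: given an ensemble of
# scenarios (rows = assumption combinations), rank which assumptions best
# predict a policy-relevant outcome class. Synthetic stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
names = ["ag_productivity", "population", "gdp_growth", "tech_cost"]
X = rng.random((5000, len(names)))            # scenario assumptions in [0,1]
# fake outcome: costs high when productivity is low and population high
high_cost = (0.8 * (1 - X[:, 0]) + 0.7 * X[:, 1]
             + 0.1 * rng.random(5000)) > 0.8

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, high_cost)
for name, imp in sorted(zip(names, clf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:16s} importance = {imp:.2f}")
```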
2010-01-01
Background The prevention of overweight sometimes raises complex ethical questions. Ethical public health frameworks may be helpful in evaluating programs or policy for overweight prevention. We give an overview of the purpose, form and contents of such public health frameworks and investigate to which extent they are useful for evaluating programs to prevent overweight and/or obesity. Methods Our search for frameworks consisted of three steps. Firstly, we asked experts in the field of ethics and public health for the frameworks they were aware of. Secondly, we performed a search in Pubmed. Thirdly, we checked literature references in the articles on frameworks we found. In total, we thus found six ethical frameworks. We assessed the area on which the available ethical frameworks focus, the users they target at, the type of policy or intervention they propose to address, and their aim. Further, we looked at their structure and content, that is, tools for guiding the analytic process, the main ethical principles or values, possible criteria for dealing with ethical conflicts, and the concrete policy issues they are applied to. Results All frameworks aim to support public health professionals or policymakers. Most of them provide a set of values or principles that serve as a standard for evaluating policy. Most frameworks articulate both the positive ethical foundations for public health and ethical constraints or concerns. Some frameworks offer analytic tools for guiding the evaluative process. Procedural guidelines and concrete criteria for solving important ethical conflicts in the particular area of the prevention of overweight or obesity are mostly lacking. Conclusions Public health ethical frameworks may be supportive in the evaluation of overweight prevention programs or policy, but seem to lack practical guidance to address ethical conflicts in this particular area. PMID:20969761
VERTPAK1. Code Verification Analytic Solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golis, M.J.
1983-04-01
VERTPAK1 is a package of analytical solutions used in verification of numerical codes that simulate fluid flow, rock deformation, and solute transport in fractured and unfractured porous media. VERTPAK1 contains the following: BAREN, an analytical solution developed by Barenblatt, Zheltov and Kochina (1960) for describing transient flow to a well penetrating a (double porosity) confined aquifer; GIBMAC, an analytical solution developed by McNamee and Gibson (1960) for describing consolidation of a semi-infinite soil medium subject to a strip (plane strain) or cylindrical (axisymmetric) loading; GRINRH, an analytical solution developed by Gringarten (1971) for describing transient flow to a partially penetrating well in a confined aquifer containing a single horizontal fracture; GRINRV, an analytical solution developed by Gringarten, Ramey, and Raghavan (1974) for describing transient flow to a fully penetrating well in a confined aquifer containing a single vertical fracture; HART, an analytical solution given by Nowacki (1962) and implemented by Hart (1981) for describing the elastic behavior of an infinite solid subject to a line heat source; LESTER, an analytical solution presented by Lester, Jansen, and Burkholder (1975) for describing one-dimensional transport of radionuclide chains through an adsorbing medium; STRELT, an analytical solution presented by Streltsova-Adams (1978) for describing transient flow to a fully penetrating well in a (double porosity) confined aquifer; and TANG, an analytical solution developed by Tang, Frind, and Sudicky (1981) for describing solute transport in a porous medium containing a single fracture.
Rapid ultrasensitive single particle surface-enhanced Raman spectroscopy using metallic nanopores.
Cecchini, Michael P; Wiener, Aeneas; Turek, Vladimir A; Chon, Hyangh; Lee, Sangyeop; Ivanov, Aleksandar P; McComb, David W; Choo, Jaebum; Albrecht, Tim; Maier, Stefan A; Edel, Joshua B
2013-10-09
Nanopore sensors embedded within thin dielectric membranes have been gaining significant interest due to their single molecule sensitivity and compatibility of detecting a large range of analytes, from DNA and proteins, to small molecules and particles. Building on this concept we utilize a metallic Au solid-state membrane to translocate and rapidly detect single Au nanoparticles (NPs) functionalized with 589 dye molecules using surface-enhanced resonance Raman spectroscopy (SERRS). We show that, due to the plasmonic coupling between the Au metallic nanopore surface and the NP, signal intensities are enhanced when probing analyte molecules bound to the NP surface. Although not single molecule, this nanopore sensing scheme benefits from the ability of SERRS to provide rich vibrational information on the analyte, improving on current nanopore-based electrical and optical detection techniques. We show that the full vibrational spectrum of the analyte can be detected with ultrahigh spectral sensitivity and a rapid temporal resolution of 880 μs.
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems to provide capabilities of detecting and tracking important objects, events, and their relationships in connection to an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework towards developing an alternative data fusion architecture. This idea is inspired by the recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.
Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique
2010-06-30
The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France.
ERIC Educational Resources Information Center
Eick, Caroline Marie; Ryan, Patrick A.
2014-01-01
This article discusses the relevance of an analytic framework that integrates principles of Catholic social teaching, critical pedagogy, and the theory of intersectionality to explain attitudes toward marginalized youth held by Catholic students preparing to become teachers. The framework emerges from five years of action research data collected…
Competency Analytics Tool: Analyzing Curriculum Using Course Competencies
ERIC Educational Resources Information Center
Gottipati, Swapna; Shankararaman, Venky
2018-01-01
The applications of learning outcomes and competency frameworks have brought better clarity to engineering programs in many universities. Several frameworks have been proposed to integrate outcomes and competencies into course design, delivery and assessment. However, in many cases, competencies are course-specific and their overall impact on the…
ERIC Educational Resources Information Center
Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda
2016-01-01
Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
ERIC Educational Resources Information Center
Kou, Xiaojing
2011-01-01
Various formats of online discussion have proven valuable for enhancing learning and collaboration in distance and blended learning contexts. However, despite their capacity to reveal essential processes in collaborative inquiry, current mainstream analytical frameworks, such as the cognitive presence framework (Garrison, Anderson, & Archer,…
A Cognitive Framework for the Analysis of Online Chemistry Courses
ERIC Educational Resources Information Center
Evans, Karen L.; Leinhardt, Gaea
2008-01-01
Many students now are receiving instruction in online environments created by universities, museums, corporations, and even students. What features of a given online course contribute to its effectiveness? This paper addresses that query by proposing and applying an analytic framework to five online introductory chemistry courses. Introductory…
1990-08-01
evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed
Managing Offshore Branch Campuses: An Analytical Framework for Institutional Strategies
ERIC Educational Resources Information Center
Shams, Farshid; Huisman, Jeroen
2012-01-01
The aim of this article is to develop a framework that encapsulates the key managerial complexities of running offshore branch campuses. In the transnational higher education (TNHE) literature, several managerial ramifications and impediments have been addressed by scholars and practitioners. However, the strands of the literature are highly…
Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.
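A probabilistic statement of the kind referred to above can be as simple as estimating the probability that a future reported result falls within acceptance limits, given validation estimates of bias and intermediate precision. The Monte Carlo sketch below uses a simplified normal error model with assumed numbers; it is not the authors' full methodology.

```python
# Sketch of a QbD-style probabilistic validation statement: from assumed
# validation estimates of bias and intermediate precision, estimate the
# probability that a future result falls within +/- lambda % of the truth.
# Simplified normal model; illustrative numbers only.
import numpy as np

rng = np.random.default_rng(3)
bias, sd_ip = 1.2, 2.0          # % bias and intermediate precision (assumed)
lam = 5.0                       # acceptance limit, in %

errors = rng.normal(bias, sd_ip, size=100_000)   # future measurement errors
p_ok = np.mean(np.abs(errors) <= lam)
print(f"P(|error| <= {lam}%) ~= {p_ok:.3f}")     # fit-for-purpose if high
```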
ERIC Educational Resources Information Center
Christie, Pam
2016-01-01
Reflecting on South African experience, this paper develops an analytical framework using the work of Henri Lefebvre and Nancy Fraser to understand why socially just arrangements may be so difficult to achieve in post-conflict reconstruction. The paper uses Lefebvre's analytic to trace three sets of entangled practices…
ERIC Educational Resources Information Center
Ranga, Marina; Etzkowitz, Henry
2013-01-01
This paper introduces the concept of Triple Helix systems as an analytical construct that synthesizes the key features of university--industry--government (Triple Helix) interactions into an "innovation system" format, defined according to systems theory as a set of components, relationships and functions. Among the components of Triple…
ERIC Educational Resources Information Center
Edwards, Jeffrey R.; Lambert, Lisa Schurer
2007-01-01
Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated…
Learning Analytics as a Counterpart to Surveys of Student Experience
ERIC Educational Resources Information Center
Borden, Victor M. H.; Coates, Hamish
2017-01-01
Analytics derived from the student learning environment provide new insights into the collegiate experience; they can be used as a supplement to or, to some extent, in place of traditional surveys. To serve this purpose, however, greater attention must be paid to conceptual frameworks and to advancing institutional systems, activating new…
Challenges of Using Learning Analytics Techniques to Support Mobile Learning
ERIC Educational Resources Information Center
Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide
2015-01-01
Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…
ERIC Educational Resources Information Center
Lintao, Rachelle B.; Erfe, Jonathan P.
2012-01-01
This study purports to foster the understanding of profession-based academic writing in two different cultural conventions by examining the rhetorical moves employed by American and Philippine thesis introductions in Architecture using Swales' 2004 Revised CARS move-analytic model as framework. Twenty (20) Master's thesis introductions in…
ERIC Educational Resources Information Center
Lu, Owen H. T.; Huang, Anna Y. Q.; Huang, Jeff C. H.; Lin, Albert J. Q.; Ogata, Hiroaki; Yang, Stephen J. H.
2018-01-01
Blended learning combines online digital resources with traditional classroom activities and enables students to attain higher learning performance through well-defined interactive strategies involving online and traditional learning activities. Learning analytics is a conceptual framework and is a part of our Precision education used to analyze…
A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories
ERIC Educational Resources Information Center
Duvvuri, Sri Devi; Gruca, Thomas S.
2010-01-01
Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…
ERIC Educational Resources Information Center
Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing
2018-01-01
With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…
Learning Analytics for Communities of Inquiry
ERIC Educational Resources Information Center
Kovanovic, Vitomir; Gaševic, Dragan; Hatala, Marek
2014-01-01
This paper describes doctoral research that focuses on the development of a learning analytics framework for inquiry-based digital learning. Building on the Community of Inquiry model (CoI)--a foundation commonly used in the research and practice of digital learning and teaching--this research builds on the existing body of knowledge in two…
ERIC Educational Resources Information Center
Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.
2012-01-01
Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…
Infinite slope stability under steady unsaturated seepage conditions
Lu, Ning; Godt, Jonathan W.
2008-01-01
We present a generalized framework for the stability of infinite slopes under steady unsaturated seepage conditions. The analytical framework allows the water table to be located at any depth below the ground surface and variation of soil suction and moisture content above the water table under steady infiltration conditions. The framework also explicitly considers the effect of weathering and porosity increase near the ground surface on changes in the friction angle of the soil. The factor of safety is conceptualized as a function of the depth within the vadose zone and can be reduced to the classical analytical solution for subaerial infinite slopes in the saturated zone. Slope stability analyses with hypothetical sandy and silty soils are conducted to illustrate the effectiveness of the framework. These analyses indicate that for hillslopes of both sandy and silty soils, failure can occur above the water table under steady infiltration conditions, which is consistent with some field observations that cannot be predicted by the classical infinite slope theory. A case study of shallow slope failures of sandy colluvium on steep coastal hillslopes near Seattle, Washington, is presented to examine the predictive utility of the proposed framework.
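For orientation, the depth-dependent factor of safety in this kind of framework combines the classical subaerial infinite-slope terms with a suction-stress contribution. A representative form (notation assumed here; the paper's exact expression should be consulted) is

\[ FS(z) \;=\; \frac{\tan\varphi'}{\tan\beta} \;+\; \frac{2c'}{\gamma\, z\, \sin 2\beta} \;-\; \frac{\sigma^{s}(z)\,\left(\tan\beta + \cot\beta\right)\tan\varphi'}{\gamma\, z}, \]

where β is the slope angle, φ′ the friction angle, c′ the effective cohesion, γ the soil unit weight, z the vertical depth, and σ^s(z) the suction stress, which is negative above the water table and therefore raises FS; setting σ^s = 0 recovers the classical saturated, subaerial solution.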
8D likelihood effective Higgs couplings extraction framework in h → 4ℓ
Chen, Yi; Di Marco, Emanuele; Lykken, Joe; ...
2015-01-23
We present an overview of a comprehensive analysis framework aimed at performing direct extraction of all possible effective Higgs couplings to neutral electroweak gauge bosons in the decay to electrons and muons, the so-called 'golden channel'. Our framework is based primarily on a maximum likelihood method constructed from analytic expressions of the fully differential cross sections for h → 4ℓ and for the dominant irreducible qq̄ → 4ℓ background, where 4ℓ = 2e2μ, 4e, 4μ. Detector effects are included by an explicit convolution of these analytic expressions with the appropriate transfer function over all center-of-mass variables. Utilizing the full set of observables, we construct an unbinned detector-level likelihood which is continuous in the effective couplings. We consider possible ZZ, Zγ, and γγ couplings simultaneously, allowing for general CP odd/even admixtures. A broad overview is given of how the convolution is performed, and we discuss the principles and theoretical basis of the framework. This framework can be used in a variety of ways to study Higgs couplings in the golden channel using data obtained at the LHC and other future colliders.
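Schematically, the unbinned detector-level likelihood described above has the familiar mixture-plus-convolution structure (parameterization and normalization details follow the paper, so this is only a sketch):

\[ \mathcal{L}(\vec{c}) \;=\; \prod_{i=1}^{N} \Big[\, f\, P_{\mathrm{sig}}(\hat{x}_{i} \mid \vec{c}) \;+\; (1-f)\, P_{\mathrm{bkg}}(\hat{x}_{i}) \,\Big], \qquad P(\hat{x} \mid \cdot) \;=\; \int dx \; P^{\mathrm{gen}}(x \mid \cdot)\, T(\hat{x} \mid x), \]

where x denotes the generator-level center-of-mass observables, \hat{x} their detector-level counterparts, T the transfer function, f the signal fraction, and \vec{c} the effective couplings.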
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, Yoojin
In this study, we have developed an analytical solution for thermal single-well injection-withdrawal tests in horizontally fractured reservoirs where fluid flow through the fracture is radial. The dimensionless forms of the governing equations and the initial and boundary conditions in the radial flow system can be written in a form identical to those in the linear flow system developed by Jung and Pruess [Jung, Y., and K. Pruess (2012), A Closed-Form Analytical Solution for Thermal Single-Well Injection-Withdrawal Tests, Water Resour. Res., 48, W03504, doi:10.1029/2011WR010979], and therefore the analytical solutions developed in Jung and Pruess (2012) can be applied to compute the time dependence of temperature recovery at the injection/withdrawal well in a horizontally oriented fracture with radial flow.
Perturbatively deformed defects in Pöschl-Teller-driven scenarios for quantum mechanics
NASA Astrophysics Data System (ADS)
Bernardini, Alex E.; da Rocha, Roldão
2016-07-01
Pöschl-Teller-driven solutions for quantum mechanical fluctuations are triggered by single scalar field theories obtained through a systematic perturbative procedure for generating deformed defects. The analytical properties concerning the quantum fluctuations in one dimension, zero-mode states, first- and second-excited states, and energy density profiles are all obtained from deformed topological and non-topological structures supported by real scalar fields. Results are first derived from an integrated λϕ⁴ theory, with corresponding generalizations applied to starting λχ⁴ and sine-Gordon theories. By focusing our calculations on structures supported by the λϕ⁴ theory, the outcome of our study suggests an exact quantitative correspondence to Pöschl-Teller-driven systems. Embedded into the perturbative quantum mechanics framework, such a correspondence turns into a helpful tool for computing excited states and continuous mode solutions, as well as their associated energy spectrum, for quantum fluctuations of perturbatively deformed structures. Perturbative deformations create distinct physical scenarios in the context of exactly solvable quantum systems and may also work as an analytical support for describing novel braneworld universes embedded into a 5-dimensional gravity bulk.
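The prototype of this correspondence is the standard λϕ⁴ kink: in dimensionless units where the kink is ϕ(x) = tanh x, fluctuations obey a Schrödinger equation with the reflectionless ℓ = 2 Pöschl-Teller well,

\[ \left[ -\frac{d^{2}}{dx^{2}} + 4 - \frac{6}{\cosh^{2} x} \right] \eta_{n}(x) \;=\; \omega_{n}^{2}\, \eta_{n}(x), \]

with zero mode η₀ ∝ sech²x (ω₀² = 0), a single excited bound state η₁ ∝ sinh x / cosh²x (ω₁² = 3), and a continuum for ω² ≥ 4. Conventions for the mass and coupling normalization vary, so the numerical factors here assume this particular rescaling.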
On the distribution of local dissipation scales in turbulent flows
NASA Astrophysics Data System (ADS)
May, Ian; Morshed, Khandakar; Venayagamoorthy, Karan; Dasi, Lakshmi
2014-11-01
Universality of dissipation scales in turbulence relies on self-similar scaling and large-scale independence. We show that the probability density function of dissipation scales, Q(η), is analytically defined by the two-point correlation function and the Reynolds number (Re). We also present a new analytical form for the two-point correlation function for the dissipation scales through a generalized definition of a directional Taylor microscale. Comparison of Q(η) predicted within this framework with published DNS data shows excellent agreement. It is shown that for finite Re no single similarity law exists, even for the case of homogeneous isotropic turbulence. Instead, a family of scalings is presented, defined by Re and a dimensionless local inhomogeneity parameter based on the spatial gradient of the rms velocity. For moderate-Re inhomogeneous flows, we note a strong directional dependence of Q(η) dictated by the principal Reynolds stresses. It is shown that the mode of the distribution Q(η) shifts significantly to sub-Kolmogorov scales along the inhomogeneous directions, as in wall-bounded turbulence. This work extends the classical Kolmogorov theory to finite-Re homogeneous isotropic turbulence as well as to the case of inhomogeneous anisotropic turbulence.
Marvel, Skylar W; To, Kimberly; Grimm, Fabian A; Wright, Fred A; Rusyn, Ivan; Reif, David M
2018-03-05
Drawing integrated conclusions from diverse source data requires synthesis across multiple types of information. The ToxPi (Toxicological Prioritization Index) is an analytical framework that was developed to enable integration of multiple sources of evidence by transforming data into integrated, visual profiles. Methodological improvements have advanced ToxPi and expanded its applicability, necessitating a new, consolidated software platform to provide functionality, while preserving flexibility for future updates. We detail the implementation of a new graphical user interface for ToxPi (Toxicological Prioritization Index) that provides interactive visualization, analysis, reporting, and portability. The interface is deployed as a stand-alone, platform-independent Java application, with a modular design to accommodate inclusion of future analytics. The new ToxPi interface introduces several features, from flexible data import formats (including legacy formats that permit backward compatibility) to similarity-based clustering to options for high-resolution graphical output. We present the new ToxPi interface for dynamic exploration, visualization, and sharing of integrated data models. The ToxPi interface is freely-available as a single compressed download that includes the main Java executable, all libraries, example data files, and a complete user manual from http://toxpi.org .
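To make the scoring idea above concrete, the Python sketch below unit-scales each evidence source and combines the columns into one integrated score per sample, as a ToxPi-style profile would. The data, weights, and one-metric-per-slice simplification are hypothetical; the actual ToxPi software aggregates multiple metrics within each slice and renders the result as a radial profile.

```python
import numpy as np

def toxpi_scores(data, slice_weights):
    """Illustrative ToxPi-style scoring: scale each data column to [0, 1],
    then combine the columns into a single weighted score per sample."""
    lo, hi = data.min(axis=0), data.max(axis=0)
    scaled = (data - lo) / np.where(hi > lo, hi - lo, 1.0)  # unit-scale each metric
    return scaled @ slice_weights                           # weighted combination

rng = np.random.default_rng(0)
data = rng.random((5, 3))            # 5 samples, 3 hypothetical evidence sources
weights = np.array([0.5, 0.3, 0.2])  # hypothetical slice weights (sum to 1)
print(toxpi_scores(data, weights))   # one integrated score per sample
```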
Basin stability measure of different steady states in coupled oscillators
NASA Astrophysics Data System (ADS)
Rakshit, Sarbendu; Bera, Bidesh K.; Majhi, Soumen; Hens, Chittaranjan; Ghosh, Dibakar
2017-04-01
In this report, we investigate the stabilization of saddle fixed points in coupled oscillators where the individual oscillators exhibit saddle fixed points. The coupled oscillators may have two structurally different types of suppressed states, namely amplitude death and oscillation death. The stabilization of the saddle equilibrium point refers to the amplitude death state, where oscillations cease and all the oscillators converge to the single stable steady state via an inverse pitchfork bifurcation. Due to the multistability features of oscillation death states, linear stability theory fails to analyze the stability of such states analytically, so we quantify all the states by the basin stability measure, a universal nonlocal nonlinear concept related to the volume of the basins of attraction. We also observe multi-clustered oscillation death states in a random network and measure them within the basin stability framework. To explore such phenomena we choose a network of coupled Duffing-Holmes and Lorenz oscillators which interact through mean-field coupling. We investigate how the basin stability of different steady states depends on mean-field density and coupling strength. We also analytically derive stability conditions for different steady states and confirm them by rigorous bifurcation analysis.
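The basin stability measure itself is straightforward to estimate by Monte Carlo: draw random initial conditions from a reference volume, integrate the dynamics, and count the fraction of trajectories that converge to the state of interest. The Python sketch below illustrates this on a simple bistable toy system (dx/dt = x − x³), not on the authors' coupled Duffing-Holmes/Lorenz network; all parameters are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Bistable toy system: dx/dt = x - x**3 has stable fixed points at x = +/-1.
def f(t, x):
    return x - x**3

def basin_stability(target, n_samples=500, box=(-2.0, 2.0), seed=1):
    """Estimate basin stability: the fraction of initial conditions drawn
    uniformly from `box` whose trajectories converge to `target`."""
    rng = np.random.default_rng(seed)
    hits = 0
    for x0 in rng.uniform(box[0], box[1], size=n_samples):
        sol = solve_ivp(f, (0.0, 50.0), [x0], rtol=1e-8)
        hits += abs(sol.y[0, -1] - target) < 1e-3
    return hits / n_samples

print(basin_stability(+1.0))  # ~0.5 for this symmetric sampling box
```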
A Geographically Explicit Genetic Model of Worldwide Human-Settlement History
Liu, Hua; Prugnolle, Franck; Manica, Andrea; Balloux, François
2006-01-01
Currently available genetic and archaeological evidence is generally interpreted as supportive of a recent single origin of modern humans in East Africa. However, this is where the near consensus on human settlement history ends, and considerable uncertainty clouds any more detailed aspect of human colonization history. Here, we present a dynamic genetic model of human settlement history coupled with explicit geographical distances from East Africa, the likely origin of modern humans. We search for the best-supported parameter space by fitting our analytical prediction to genetic data that are based on 52 human populations analyzed at 783 autosomal microsatellite markers. This framework allows us to jointly estimate the key parameters of the expansion of modern humans. Our best estimates suggest an initial expansion of modern humans ∼56,000 years ago from a small founding population of ∼1,000 effective individuals. Our model further points to high growth rates in newly colonized habitats. The general fit of the model with the data is excellent. This suggests that coupling analytical genetic models with explicit demography and geography provides a powerful tool for making inferences on human-settlement history. PMID:16826514
NASA Astrophysics Data System (ADS)
Podgornova, O.; Leaney, S.; Liang, L.
2018-07-01
Extracting medium properties from seismic data faces some limitations due to the finite frequency content of the data and the restricted spatial positions of the sources and receivers. Some distributions of the medium properties have little impact on the data (in some cases none). If these properties are used as the inversion parameters, then the inverse problem becomes overparametrized, leading to ambiguous results. We present an analysis of multiparameter resolution for the linearized inverse problem in the framework of elastic full-waveform inversion. We show that the spatial and multiparameter sensitivities are intertwined and that the non-sensitive properties are spatial distributions of some non-trivial combinations of the conventional elastic parameters. The analysis accounts for the Hessian information and the frequency content of the data; it is semi-analytical (in some scenarios analytical), easy to interpret, and enhances results of the widely used radiation pattern analysis. Single-type scattering is shown to have limited sensitivity, even for full-aperture data. Finite-frequency data lose multiparameter sensitivity at smooth and fine spatial scales. Also, we establish ways to quantify spatial-multiparameter coupling and demonstrate that the theoretical predictions agree well with the numerical results.
The breakup mechanism of biomolecular and colloidal aggregates in a shear flow
NASA Astrophysics Data System (ADS)
Ó Conchúir, Breanndán; Zaccone, Alessio
2014-03-01
The theory of self-assembly of colloidal particles in shear flow is incomplete. Previous analytical approaches have failed to capture the microscopic interplay between diffusion, shear, and intermolecular interactions which controls an aggregate's fate in shear. In this work we analytically solved the drift-diffusion equation for the breakup rate of a dimer in flow. Then, applying rigidity percolation theory, we found that the lifetime of a generic cluster formed under shear is controlled by the typical lifetime of a single bond in its interior, which in turn depends on the efficiency of the stress transmitted from other bonds in the cluster. We showed that aggregate breakup is a thermally activated process where the activation energy is controlled by the interplay between intermolecular forces and the shear drift, and where structural parameters determine whether cluster fragmentation or surface erosion prevails. In our latest work, we analyzed floppy modes and nonaffine deformations to derive a lower bound on the fractal dimension df below which aggregates are mechanically unstable, i.e., df ≈ 2.4 for large aggregates. This theoretical framework is in quantitative agreement with experiments and can be used for population balance modeling of colloidal and protein aggregation.
Beta-Poisson model for single-cell RNA-seq data analyses.
Vu, Trung Nghia; Wills, Quin F; Kalari, Krishna R; Niu, Nifang; Wang, Liewei; Rantalainen, Mattias; Pawitan, Yudi
2016-07-15
Single-cell RNA-sequencing technology allows detection of gene expression at the single-cell level. One typical feature of the data is a bimodality in the cellular distribution even for highly expressed genes, primarily caused by a proportion of non-expressing cells. The standard and the over-dispersed gamma-Poisson models that are commonly used in bulk-cell RNA-sequencing are not able to capture this property. We introduce a beta-Poisson mixture model that can capture the bimodality of the single-cell gene expression distribution. We further integrate the model into the generalized linear model framework in order to perform differential expression analyses. The whole analytical procedure is called BPSC. The results from several real single-cell RNA-seq datasets indicate that ∼90% of the transcripts are well characterized by the beta-Poisson model; the model fit from BPSC is better than the fit of the standard gamma-Poisson model in > 80% of the transcripts. Moreover, in differential expression analyses of simulated and real datasets, BPSC performs well against edgeR, a conventional method widely used in bulk-cell RNA-sequencing data, and against scde and MAST, two recent methods specifically designed for single-cell RNA-seq data. An R package BPSC for model fitting and differential expression analyses of single-cell RNA-seq data is available under the GPL-3 license at https://github.com/nghiavtr/BPSC. Contact: yudi.pawitan@ki.se or mattias.rantalainen@ki.se. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
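To see why a beta-Poisson form produces the bimodality described above, the Python sketch below draws per-cell counts whose Poisson rate is modulated by a Beta-distributed activity fraction. The parameterization and values are illustrative and differ in detail from BPSC's own model.

```python
import numpy as np

rng = np.random.default_rng(42)

def beta_poisson_sample(alpha, beta, lam, n_cells):
    """Draw single-cell counts from a beta-Poisson model: a Beta-distributed
    per-cell fraction p modulates a Poisson rate lam, yielding a bimodal
    mix of near-zero and highly expressing cells."""
    p = rng.beta(alpha, beta, size=n_cells)  # per-cell activity fraction
    return rng.poisson(lam * p)              # observed counts

counts = beta_poisson_sample(alpha=0.3, beta=1.0, lam=50.0, n_cells=1000)
print((counts == 0).mean())  # sizable zero fraction -> bimodal distribution
```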
A trajectory generation framework for modeling spacecraft entry in MDAO
NASA Astrophysics Data System (ADS)
D'Souza, Sarah N.; Sarigul-Klijn, Nesrin
2016-04-01
In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was designed to be adaptable through the use of high-fidelity equations of motion and drag-based analytical bank profiles. Within this framework, a novel technique was implemented that resolves the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and a set of transition event conditions that are flight-feasible and implementable in a Generalized Entry Guidance algorithm.
2017-01-01
Förster resonance energy transfer (FRET) measurements from a donor, D, to an acceptor, A, fluorophore are frequently used in vitro and in live cells to reveal information on the structure and dynamics of DA labeled macromolecules. Accurate descriptions of FRET measurements by molecular models are complicated because the fluorophores are usually coupled to the macromolecule via flexible long linkers allowing for diffusional exchange between multiple states with different fluorescence properties caused by distinct environmental quenching, dye mobilities, and variable DA distances. It is often assumed for the analysis of fluorescence intensity decays that DA distances and D quenching are uncorrelated (homogeneous quenching by FRET) and that the exchange between distinct fluorophore states is slow (quasistatic). This allows us to introduce the FRET-induced donor decay, εD(t), a function solely depending on the species fraction distribution of the rate constants of energy transfer by FRET, for a convenient joint analysis of fluorescence decays of FRET and reference samples by integrated graphical and analytical procedures. Additionally, we developed a simulation toolkit to model dye diffusion, fluorescence quenching by the protein surface, and FRET. A benchmark study with simulated fluorescence decays of 500 protein structures demonstrates that the quasistatic homogeneous model works very well and recovers for single conformations the average DA distances with an accuracy of < 2%. For more complex cases, where proteins adopt multiple conformations with significantly different dye environments (heterogeneous case), we introduce a general analysis framework and evaluate its power in resolving heterogeneities in DA distances. The developed fast simulation methods, relying on Brownian dynamics of a coarse-grained dye in its sterically accessible volume, allow us to incorporate structural information in the decay analysis for heterogeneous cases by relating dye states with protein conformations to pave the way for fluorescence and FRET-based dynamic structural biology. Finally, we present theories and simulations to assess the accuracy and precision of steady-state and time-resolved FRET measurements in resolving DA distances on the single-molecule and ensemble level and provide a rigorous framework for estimating approximation, systematic, and statistical errors. PMID:28709377
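Stated compactly (a sketch consistent with the abstract's description; the notation is assumed here, and the exact definitions are in the paper): under the quasistatic homogeneous assumption, the fluorescence decay of the DA-labeled sample factorizes into the donor-only reference decay and the FRET-induced donor decay,

```latex
f^{(DA)}_{D|D}(t) \;=\; f^{(D0)}_{D|D}(t)\,\varepsilon_D(t),
\qquad
\varepsilon_D(t) \;=\; \int_0^{\infty} p(k_{\mathrm{RET}})\, e^{-k_{\mathrm{RET}} t}\,\mathrm{d}k_{\mathrm{RET}},
```

so that εD(t) depends only on the species-fraction distribution p(k_RET) of the FRET rate constants, which is what permits the convenient joint analysis of FRET and reference decays described above.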
The rise of environmental analytical chemistry as an interdisciplinary activity.
Brown, Richard
2009-07-01
Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and the Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions of new data, methods, case studies, and instrumentation, or new interpretations and developments of existing data, case studies, methods, and instrumentation, relating to analytical and/or environmental chemistry, to the Analytical and Environmental Chemistry domains, are welcome and will be considered equally.
An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Zhou, Ning
With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
Hartnell, Chad A; Ou, Amy Yi; Kinicki, Angelo
2011-07-01
We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial performance). The paper also tests theoretical suppositions undergirding the CVF by investigating the framework's nomological validity and proposed internal structure (i.e., interrelationships among culture types). Results based on data from 84 empirical studies with 94 independent samples indicate that clan, adhocracy, and market cultures are differentially and positively associated with the effectiveness criteria, though not always as hypothesized. The findings provide mixed support for the CVF's nomological validity and fail to support aspects of the CVF's proposed internal structure. We propose an alternative theoretical approach to the CVF and delineate directions for future research.
Conceptual framework for outcomes research studies of hepatitis C: an analytical review
Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M
2016-01-01
Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473
Earth Science Data Fusion with Event Building Approach
NASA Technical Reports Server (NTRS)
Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.
2015-01-01
Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing, and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast into-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project is leveraging the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF reanalysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Jian; Saraf, Laxmikant V.; Schwenzer, Birgit
2012-05-25
Flexible anionic metal-organic frameworks transform to neutral heterobimetallic systems via single-crystal-to-single-crystal processes invoked by cation insertion. These transformations are directed by cooperative bond breakage and formation, resulting in expansion or contraction of the 3D framework by up to 33% due to the flexible nature of the organic linker. These MOFs display highly selective uptake of divalent transition metal cations (e.g., Co2+ and Ni2+) over alkali metal cations (Li+ and Na+).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golis, M.J.
1983-04-01
VERTPAK1 is a package of analytical solutions used in verification of numerical codes that simulate fluid flow, rock deformation, and solute transport in fractured and unfractured porous media. VERTPAK1 contains the following: BAREN, an analytical solution developed by Barenblatt, Zheltov, and Kochina (1960) for describing transient flow to a well penetrating a (double porosity) confined aquifer; GIBMAC, an analytical solution developed by McNamee and Gibson (1960) for describing consolidation of a semi-infinite soil medium subject to a strip (plane strain) or cylindrical (axisymmetric) loading; GRINRH, an analytical solution developed by Gringarten (1971) for describing transient flow to a partially penetrating well in a confined aquifer containing a single horizontal fracture; GRINRV, an analytical solution developed by Gringarten, Ramey, and Raghavan (1974) for describing transient flow to a fully penetrating well in a confined aquifer containing a single vertical fracture; HART, an analytical solution given by Nowacki (1962) and implemented by Hart (1981) for describing the elastic behavior of an infinite solid subject to a line heat source; LESTER, an analytical solution presented by Lester, Jansen, and Burkholder (1975) for describing one-dimensional transport of radionuclide chains through an adsorbing medium; STRELT, an analytical solution presented by Streltsova-Adams (1978) for describing transient flow to a fully penetrating well in a (double porosity) confined aquifer; and TANG, an analytical solution developed by Tang, Frind, and Sudicky (1981) for describing solute transport in a porous medium containing a single fracture.
Lateral Stability Simulation of a Rail Truck on Roller Rig
NASA Astrophysics Data System (ADS)
Dukkipati, Rao V.
The development of experimental facilities for rail vehicle testing is being complemented by analytic studies. The purpose of this effort has been to gain insight into the dynamics of rail vehicles in order to guide development of the Roller Rigs and to establish an analytic framework for the design and interpretation of tests to be conducted on Roller Rigs. The work described here represents initial efforts towards meeting these objectives. Generic linear models were developed of a freight car (with a characteristic North American three-piece truck) on tangent track. The models were developed using the generalized multibody dynamics software MEDYNA. Predictions were made of the theoretical linear-model hunting (lateral stability) characteristics of the freight car, i.e., the critical speeds and frequencies, for five different configurations: (a) freight car on track, (b) the freight car's front truck on the roller stand and its rear truck on track, (c) freight car on the roller rig, (d) a single truck on track, and (e) a single truck on the roller stand. These were compared with the Association of American Railroads' field test data for an 80-ton hopper car equipped with A-3 ride control trucks. Agreement was reached among all the analytical models, with all models indicating a range of hunting speeds within 2% from highest to lowest. The largest discrepancy, approximately 6%, was between the models and the field test data. Parametric study results using the linear model of the freight truck on the roller rig show that (a) increasing the roller radius increases the critical speed; (b) increasing the wheel initial cone angle decreases the hunting speed; (c) increasing the roller cant increases the hunting speed; (d) decrowning of the wheelset on the rollers does not affect the hunting speed but induces destabilizing longitudinal horizontal forces at the contact; and (e) lozenging of the wheelset on the rollers induces a yaw moment, and the hunting speed decreases with increasing wheelset yaw angle.
NASA Technical Reports Server (NTRS)
Banks, H. T.; Ito, K.
1988-01-01
Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yi; Di Marco, Emanuele; Lykken, Joe
2014-10-17
In this technical note we present details on various aspects of the framework introduced in arXiv:1401.2077 aimed at extracting effective Higgs couplings in the $h \to 4\ell$ `golden channel'. Since it is the primary feature of the framework, we focus in particular on the convolution integral which takes us from `truth' level to `detector' level, and on the numerical and analytic techniques used to obtain it. We also briefly discuss other aspects of the framework.
Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris
2017-12-15
Storage is important for flood mitigation and non-point source pollution control. However, finding a cost-effective design scheme for storage tanks is complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the storm water management model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load, and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains a preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); and (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method starting from the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives. Different rainfall scenarios are considered to test its robustness. The results demonstrate that the optimization framework is feasible and that the optimization converges quickly when started from the preliminary scheme. The optimized scheme is better than the preliminary scheme for reducing runoff and pollutant loads under a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme of storage tanks or low impact development (LID) controls. Copyright © 2017 Elsevier Ltd. All rights reserved.
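The AHP step above reduces to extracting criterion weights from a pairwise comparison matrix (its principal eigenvector) and scoring each node by its weighted, unit-scaled indicators. A minimal Python sketch, with entirely hypothetical judgments and node data:

```python
import numpy as np

# Pairwise comparison (Saaty scale): hypothetically, flood depth is judged
# three times as important as flood duration. AHP weights come from the
# principal eigenvector of this matrix.
A = np.array([[1.0, 3.0],
              [1 / 3.0, 1.0]])
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w /= w.sum()  # normalized criterion weights, here (0.75, 0.25)

# Rank flooding nodes by weighted, unit-scaled indicators.
nodes = {"N1": (1.2, 35.0), "N2": (0.4, 60.0), "N3": (0.9, 20.0)}  # (depth m, duration min)
X = np.array(list(nodes.values()))
X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
scores = X @ w
for name, s in sorted(zip(nodes, scores), key=lambda t: -t[1]):
    print(name, round(float(s), 3))
```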
An Analytical Framework for the Cross-Country Comparison of Higher Education Governance
ERIC Educational Resources Information Center
Dobbins, Michael; Knill, Christoph; Vogtle, Eva Maria
2011-01-01
In this article we provide an integrated framework for the analysis of higher education governance which allows us to more systematically trace the changes that European higher education systems are currently undergoing. We argue that, despite highly insightful previous analyses, there is a need for more specific empirically observable indicators…
A Human Dimensions Framework: Guidelines for Conducting Social Assessments
Alan D. Bright; H. Ken Cordell; Anne P. Hoover; Michael A Tarrant
2003-01-01
This paper provides a framework and guidelines for identifying and organizing human dimension information for use in forest planning. It synthesizes concepts from a variety of social science disciplines and connects them with measurable indicators for use in analysis and reporting. Suggestions of analytical approaches and sources of data for employment of the...
A Data Analytical Framework for Improving Real-Time, Decision Support Systems in Healthcare
ERIC Educational Resources Information Center
Yahav, Inbal
2010-01-01
In this dissertation we develop a framework that combines data mining, statistics and operations research methods for improving real-time decision support systems in healthcare. Our approach consists of three main concepts: data gathering and preprocessing, modeling, and deployment. We introduce the notion of offline and semi-offline modeling to…
How Do Mathematicians Learn Math?: Resources and Acts for Constructing and Understanding Mathematics
ERIC Educational Resources Information Center
Wilkerson-Jerde, Michelle H.; Wilensky, Uri J.
2011-01-01
In this paper, we present an analytic framework for investigating expert mathematical learning as the process of building a "network of mathematical resources" by establishing relationships between different components and properties of mathematical ideas. We then use this framework to analyze the reasoning of ten mathematicians and mathematics…
Focus for Area Development Analysis: Urban Orientation of Counties.
ERIC Educational Resources Information Center
Bluestone, Herman
The orientation of counties to metropolitan systems and urban centers is identified by population density and percentage of urban population. This analytical framework differentiates 6 kinds of counties, ranging from most urban-oriented (group 1) to least urban-oriented (group 6). With this framework, it can be seen that the economic well-being of…
Analyzing Educators' Online Interactions: A Framework of Online Learning Support Roles
ERIC Educational Resources Information Center
Nacu, Denise C.; Martin, Caitlin K.; Pinkard, Nichole; Gray, Tené
2016-01-01
While the potential benefits of participating in online learning communities are documented, so too are inequities in terms of how different populations access and use them. We present the online learning support roles (OLSR) framework, an approach using both automated analytics and qualitative interpretation to identify and explore online…
Mind-Sets Matter: A Meta-Analytic Review of Implicit Theories and Self-Regulation
ERIC Educational Resources Information Center
Burnette, Jeni L.; O'Boyle, Ernest H.; VanEpps, Eric M.; Pollack, Jeffrey M.; Finkel, Eli J.
2013-01-01
This review builds on self-control theory (Carver & Scheier, 1998) to develop a theoretical framework for investigating associations of implicit theories with self-regulation. This framework conceptualizes self-regulation in terms of 3 crucial processes: goal setting, goal operating, and goal monitoring. In this meta-analysis, we included…
ERIC Educational Resources Information Center
Duhn, Iris; Fleer, Marilyn; Harrison, Linda
2016-01-01
This article focuses on the "Relational Agency Framework" (RAF), an analytical tool developed for an Australian review and evaluation study of an early years' policy initiative. We explore Anne Edward's concepts of "relational expertise", "building common knowledge" and "relational agency" to explore how…
An Analytic Framework to Support E.Learning Strategy Development
ERIC Educational Resources Information Center
Marshall, Stephen J.
2012-01-01
Purpose: The purpose of this paper is to discuss and demonstrate the relevance of a new conceptual framework for leading and managing the development of learning and teaching to e.learning strategy development. Design/methodology/approach: After reviewing and discussing the research literature on e.learning in higher education institutions from…
University Reform and Institutional Autonomy: A Framework for Analysing the Living Autonomy
ERIC Educational Resources Information Center
Maassen, Peter; Gornitzka, Åse; Fumasoli, Tatiana
2017-01-01
In this article we discuss recent university reforms aimed at enhancing university autonomy, highlighting various tensions in the underlying reform ideologies. We examine how the traditional interpretation of university autonomy has been expanded in the reform rationales. An analytical framework for studying how autonomy is interpreted and used…
ERIC Educational Resources Information Center
Wu, Ying-Tien; Tsai, Chin-Chung
2007-01-01
Recently, the significance of learners' informal reasoning on socio-scientific issues has received increasing attention among science educators. To gain deeper insights into this important issue, an integrated analytic framework was developed in this study. With this framework, 71 Grade 10 students' informal reasoning about nuclear energy usage…
ERIC Educational Resources Information Center
Adler, Jill; Ronda, Erlina
2015-01-01
We describe and use an analytical framework to document mathematics discourse in instruction (MDI), and interpret differences in mathematics teaching. MDI is characterised by four interacting components in the teaching of a mathematics lesson: exemplification (occurring through a sequence of examples and related tasks), explanatory talk (talk that…
Tracking the debate around marine protected areas: key issues and the BEG framework.
Thorpe, Andy; Bavinck, Maarten; Coulthard, Sarah
2011-04-01
Marine conservation is often criticized for a mono-disciplinary approach, which delivers fragmented solutions to complex problems with differing interpretations of success. As a means of reflecting on the breadth and range of scientific research on the management of the marine environment, this paper develops an analytical framework to gauge the foci of policy documents and published scientific work on Marine Protected Areas. We evaluate the extent to which MPA research articles delineate objectives around three domains: biological-ecological [B], economic-social [E], and governance-management [G]. This permits us to develop an analytic [BEG] framework which we then test on a sample of selected journal article cohorts. While the framework reveals the dominance of biologically focussed research [B], analysis also reveals a growing frequency of the use of governance/management terminology in the literature over the last 15 years, which may be indicative of a shift towards more integrated consideration of governance concerns. However, consideration of the economic/social domain appears to lag behind biological and governance concerns in both frequency and presence in the MPA literature.
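One simple way to operationalize such a BEG classification, sketched in Python below, is to count domain-keyword hits in each abstract. The keyword lists here are hypothetical stand-ins; the paper's actual coding scheme is not reproduced.

```python
import re

# Hypothetical keyword lists for the three BEG domains.
DOMAINS = {
    "B": ["biodiversity", "habitat", "species", "ecology", "ecosystem"],
    "E": ["economic", "livelihood", "income", "social", "equity"],
    "G": ["governance", "management", "enforcement", "policy", "stakeholder"],
}

def beg_profile(text):
    """Count domain keyword hits in an abstract to gauge its B/E/G focus."""
    words = re.findall(r"[a-z]+", text.lower())
    return {d: sum(words.count(k) for k in kws) for d, kws in DOMAINS.items()}

abstract = ("We assess habitat recovery and species richness inside the MPA "
            "and discuss governance arrangements with local stakeholder groups.")
print(beg_profile(abstract))  # e.g. {'B': 2, 'E': 0, 'G': 2}
```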
Linguistics and the Study of Literature. Linguistics in the Undergraduate Curriculum, Appendix 4-D.
ERIC Educational Resources Information Center
Steward, Ann Harleman
Linguistics gives the student of literature an analytical tool whose sole purpose is to describe faithfully the workings of language. It provides a theoretical framework, an analytical method, and a vocabulary for communicating its insights--all designed to serve concerns other than literary interpretation and evaluation, but all useful for…
ERIC Educational Resources Information Center
Cheung, Mike W. L.; Chan, Wai
2009-01-01
Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…
ERIC Educational Resources Information Center
Schoendorff, Benjamin; Steinwachs, Joanne
2012-01-01
How can therapists be effectively trained in clinical functional contextualism? In this conceptual article we propose a new way of training therapists in Acceptance and Commitment Therapy skills using tools from Functional Analytic Psychotherapy in a training context functionally similar to the therapeutic relationship. FAP has been successfully…
Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam
2015-04-01
We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade ⩾ 3) and ED (Grade ⩾ 1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) modeling and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED models. As a proof of concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
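For reference, the generalized LKB NTCP model mentioned above has a standard closed form: NTCP = Φ((gEUD − TD50)/(m · TD50)) with gEUD = (Σᵢ vᵢ Dᵢ^{1/n})ⁿ. The Python sketch below evaluates it on a hypothetical dose-volume histogram; the parameter values, and the idea of letting a genetic covariate shift TD50, are illustrative assumptions rather than the paper's fitted model.

```python
import numpy as np
from scipy.stats import norm

def geud(dose, volume, n):
    """Generalized equivalent uniform dose for a (dose, fractional-volume) DVH."""
    return np.sum(volume * dose ** (1.0 / n)) ** n

def lkb_ntcp(dose, volume, td50, m, n):
    """Standard Lyman-Kutcher-Burman NTCP: probit of the reduced gEUD."""
    t = (geud(dose, volume, n) - td50) / (m * td50)
    return norm.cdf(t)

dose = np.array([10.0, 30.0, 50.0, 66.0])  # Gy, hypothetical DVH bins
vol = np.array([0.4, 0.3, 0.2, 0.1])       # fractional volumes (sum to 1)

# Hypothetical coupling: carriers of a risk variant get a lower effective TD50.
for carrier, td50 in [(False, 80.0), (True, 70.0)]:
    print(carrier, round(lkb_ntcp(dose, vol, td50, m=0.15, n=0.1), 4))
```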
SABRE hyperpolarization enables high-sensitivity 1H and 13C benchtop NMR spectroscopy.
Richardson, Peter M; Parrott, Andrew J; Semenova, Olga; Nordon, Alison; Duckett, Simon B; Halse, Meghan E
2018-06-19
Benchtop NMR spectrometers operating at low magnetic fields of 1-2 T with sub-ppm resolution show great promise as analytical platforms that can be used outside the traditional laboratory environment for industrial process monitoring. One current limitation that reduces the uptake of benchtop NMR is the low sensitivity associated with these modest detection fields. Here we demonstrate how para-hydrogen (p-H2) based signal amplification by reversible exchange (SABRE), a simple-to-implement hyperpolarization technique, enhances analyte detectability within the environment of a benchtop (1 T) NMR spectrometer so that informative 1H and 13C NMR spectra can be readily recorded for low-concentration analytes. SABRE-derived 1H NMR signal enhancements of up to 17 000-fold, corresponding to 1H polarization levels of P = 5.9%, were achieved for 26 mM pyridine in d4-methanol in a matter of seconds. Comparable enhancement levels can be achieved in both deuterated and protio solvents, but now the SABRE-enhanced analyte signals dominate due to the comparatively weak thermally polarized solvent response. The SABRE approach also enables the acquisition of 13C NMR spectra of analytes at natural isotopic abundance in a single scan, as evidenced by hyperpolarized 13C NMR spectra of tens of millimolar concentrations of 4-methylpyridine. Now the associated signal enhancement factors are up to 45 500-fold (P = 4.0%) and achieved in just 15 s. Integration of an automated SABRE polarization system with the benchtop NMR spectrometer framework produces renewable and reproducible NMR signal enhancements that can be exploited for the collection of multi-dimensional NMR spectra, exemplified here by a SABRE-enhanced 2D COSY NMR spectrum.
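The quoted figures are mutually consistent: at 1 T and room temperature the thermal 1H polarization is about 3.4 × 10⁻⁶, so P = 5.9% implies roughly a 17 000-fold enhancement. A quick check in Python (constants rounded; the high-temperature linear approximation for the polarization is assumed):

```python
# Thermal equilibrium 1H polarization in the high-temperature limit:
# P_thermal ~ gamma * hbar * B0 / (2 * kB * T)
gamma = 2.675e8     # 1H gyromagnetic ratio, rad s^-1 T^-1
hbar = 1.0546e-34   # reduced Planck constant, J s
kB = 1.3807e-23     # Boltzmann constant, J K^-1
B0, T = 1.0, 298.0  # benchtop field (tesla) and temperature (kelvin)

p_thermal = gamma * hbar * B0 / (2 * kB * T)
enhancement = 0.059 / p_thermal  # quoted SABRE polarization P = 5.9%
print(f"thermal polarization: {p_thermal:.2e}")      # ~3.4e-06
print(f"implied enhancement:  {enhancement:,.0f}x")  # ~17,000x
```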
Parvin, C A
1993-03-01
The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
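As the abstract notes, these within-run error-detection probabilities follow directly from the normal distribution. The Python sketch below compares, for two controls per run under an assumed Gaussian error model, the power of a mean rule against a reject-if-any-control-exceeds-±2SD rule as a function of systematic error; the limits and rule choices are illustrative, not the paper's exact rules.

```python
from scipy.stats import norm

def p_reject_mean_rule(bias_sd, k=2.0, n=2):
    """Mean-of-n rule: reject if the mean of n controls exceeds +/- k/sqrt(n) SD."""
    lim, s = k / n ** 0.5, 1.0 / n ** 0.5  # rejection limit and SD of the mean
    return norm.cdf((-lim - bias_sd) / s) + norm.sf((lim - bias_sd) / s)

def p_reject_any_rule(bias_sd, k=2.0, n=2):
    """1_ks rule: reject if any of n controls exceeds +/- k SD."""
    p_in = norm.cdf(k - bias_sd) - norm.cdf(-k - bias_sd)
    return 1.0 - p_in ** n

for bias in (0.0, 1.0, 2.0, 3.0):  # systematic error in SD units
    print(bias, round(p_reject_mean_rule(bias), 3), round(p_reject_any_rule(bias), 3))
```

At zero bias the two rules have false-rejection rates of about 4.6% and 8.9%, and their power curves cross as the systematic error grows, illustrating why no single rule dominates under all error conditions.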
Power spectral density of a single Brownian trajectory: what one can and cannot learn from it
NASA Astrophysics Data System (ADS)
Krapf, Diego; Marinari, Enzo; Metzler, Ralf; Oshanin, Gleb; Xu, Xinran; Squarcini, Alessio
2018-02-01
The power spectral density (PSD) of any time-dependent stochastic process X_t is a meaningful feature of its spectral content. In its textbook definition, the PSD is the Fourier transform of the covariance function of X_t over an infinitely large observation time T, that is, it is defined as an ensemble-averaged property taken in the limit T → ∞. A legitimate question is what information on the PSD can be reliably obtained from single-trajectory experiments, if one goes beyond the standard definition and analyzes the PSD of a single trajectory recorded for a finite observation time T. In quest for this answer, for a d-dimensional Brownian motion (BM) we calculate the probability density function of a single-trajectory PSD for arbitrary frequency f, finite observation time T, and arbitrary number k of projections of the trajectory on different axes. We show analytically that the scaling exponent for the frequency dependence of the PSD specific to an ensemble of BM trajectories can already be obtained from a single trajectory, while the numerical amplitude in the relation between the ensemble-averaged and single-trajectory PSDs is a fluctuating property which varies from realization to realization. The distribution of this amplitude is calculated exactly and discussed in detail. Our results are confirmed by numerical simulations and single-particle tracking experiments, with remarkably good agreement. In addition we consider a truncated Wiener representation of BM, and the case of a discrete-time lattice random walk. We highlight some differences in the behavior of a single-trajectory PSD for BM and for the two latter situations. The framework developed herein will allow for meaningful physical analysis of experimental stochastic trajectories.
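The headline result is easy to reproduce numerically: the periodogram of a single Brownian path already shows the ensemble f⁻² scaling, while its amplitude fluctuates from path to path. A short Python sketch (discretization and fitting band are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)
N, dt, D = 2 ** 16, 1e-3, 1.0
x = np.cumsum(rng.normal(0.0, np.sqrt(2 * D * dt), N))  # 1D Brownian path

# Single-trajectory periodogram: S(f, T) = |x_hat(f)|^2 / T
T = N * dt
xhat = np.fft.rfft(x) * dt
f = np.fft.rfftfreq(N, dt)
S = np.abs(xhat) ** 2 / T

# Fit the log-log slope over a mid-frequency band; expect approximately -2.
band = (f > 1.0) & (f < 100.0)
slope = np.polyfit(np.log(f[band]), np.log(S[band]), 1)[0]
print(f"spectral exponent ~ {slope:.2f}")  # near -2; prefactor varies per path
```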
Bayesian variable selection for post-analytic interrogation of susceptibility loci.
Chen, Siying; Nunez, Sara; Reilly, Muredach P; Foulkes, Andrea S
2017-06-01
Understanding the complex interplay among protein coding genes and regulatory elements requires rigorous interrogation with analytic tools designed for discerning the relative contributions of overlapping genomic regions. To this aim, we offer a novel application of Bayesian variable selection (BVS) for classifying genomic class level associations using existing large meta-analysis summary level resources. This approach is applied using the expectation maximization variable selection (EMVS) algorithm to typed and imputed SNPs across 502 protein coding genes (PCGs) and 220 long intergenic non-coding RNAs (lncRNAs) that overlap 45 known loci for coronary artery disease (CAD) using publicly available Global Lipids Genetics Consortium (GLGC) (Teslovich et al., 2010; Willer et al., 2013) meta-analysis summary statistics for low-density lipoprotein cholesterol (LDL-C). The analysis reveals 33 PCGs and three lncRNAs across 11 loci with >50% posterior probabilities for inclusion in an additive model of association. The findings are consistent with previous reports, while providing some new insight into the architecture of LDL-cholesterol to be investigated further. As genomic taxonomies continue to evolve, additional classes such as enhancer elements and splicing regions, can easily be layered into the proposed analysis framework. Moreover, application of this approach to alternative publicly available meta-analysis resources, or more generally as a post-analytic strategy to further interrogate regions that are identified through single point analysis, is straightforward. All coding examples are implemented in R version 3.2.1 and provided as supplemental material. © 2016, The International Biometric Society.
DIVE: A Graph-based Visual Analytics Framework for Big Data
Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie
2014-01-01
The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197
Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert
2015-07-01
Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input in the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.
Metal–organic and covalent organic frameworks as single-site catalysts
Rogge, S. M. J.; Bavykina, A.; Hajek, J.; Garcia, H.; Olivos-Suarez, A. I.; Sepúlveda-Escribano, A.; Vimont, A.; Clet, G.; Bazin, P.; Kapteijn, F.
2017-01-01
Heterogeneous single-site catalysts consist of isolated, well-defined, active sites that are spatially separated in a given solid and, ideally, structurally identical. In this review, the potential of metal–organic frameworks (MOFs) and covalent organic frameworks (COFs) as platforms for the development of heterogeneous single-site catalysts is reviewed thoroughly. In the first part of this article, synthetic strategies and progress in the implementation of such sites in these two classes of materials are discussed. Because these solids are excellent playgrounds to allow a better understanding of catalytic functions, we highlight the most important recent advances in the modelling and spectroscopic characterization of single-site catalysts based on these materials. Finally, we discuss the potential of MOFs as materials in which several single-site catalytic functions can be combined within one framework along with their potential as powerful enzyme-mimicking materials. The review is wrapped up with our personal vision on future research directions. PMID:28338128
Celik, Metin
2009-03-01
The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.
Transient photocurrent in molecular junctions: singlet switching on and triplet blocking.
Petrov, E G; Leonov, V O; Snitsarev, V
2013-05-14
A kinetic approach adapted to describe charge transmission in molecular junctions is used to analyze the photocurrent of a photochromic molecule under conditions of moderate light intensity. In the framework of the HOMO-LUMO model for the single-electron molecular states, analytic expressions describing the temporal behavior of the transient and steady-state sequential (hopping) as well as direct (tunnel) current components have been derived. The conditions at which the current components achieve their maximal values are indicated. It is shown that if the rates of charge transmission in the unbiased molecular diode are much lower than the intramolecular singlet-singlet excitation/de-excitation rate, and the threefold-degenerate triplet excited state of the molecule behaves like a trap blocking charge transmission, a large peak-like transient switch-on photocurrent can arise.
iSeq: Web-Based RNA-seq Data Analysis and Visualization.
Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng
2018-01-01
Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amounts of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn.
Addendum to "Colored-noise-induced discontinuous transitions in symbiotic ecosystems".
Sauga, Ako; Mankin, Romi
2005-06-01
A symbiotic ecosystem with Gompertz self-regulation and with adaptive competition between populations is studied by means of a N-species Lotka-Volterra stochastic model. The influence of fluctuating environment on the carrying capacity of a population is modeled as a dichotomous noise. The study is a follow up of previous investigations of symbiotic ecosystems subjected to the generalized Verhulst self-regulation [Phys. Rev. E 69, 061106 (2004); 65, 051108 (2002)]. In the framework of mean-field approximation the behavior of the solutions of the self-consistency equation for a stationary system is examined analytically in the full phase space of system parameters. Depending on the mutual interplay of symbiosis and competition of species, variation of noise parameters (amplitude, correlation time) can induce doubly unidirectional discontinuous transitions as well as single unidirectional discontinuous transitions of the mean population size.
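A minimal single-species illustration of the dichotomous-noise setup described above can be simulated directly: Gompertz growth whose carrying capacity switches between two levels as a telegraph process. The Python sketch below is purely illustrative (one species, invented parameters) and does not reproduce the paper's N-species mean-field analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(K_lo=50.0, K_hi=150.0, rate=0.5, r=1.0, dt=1e-2, steps=50_000):
    """Gompertz growth dx/dt = r*x*ln(K(t)/x), with the carrying capacity K
    switching between two levels as dichotomous (telegraph) noise of
    switching rate `rate`; Euler integration."""
    x, K = 100.0, K_hi
    total = 0.0
    for _ in range(steps):
        if rng.random() < rate * dt:  # Markov switching of the environment
            K = K_lo if K == K_hi else K_hi
        x += r * x * np.log(K / x) * dt
        total += x
    return total / steps              # time-averaged population size

print(round(simulate(), 2))
```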
Measuring public opinion on alcohol policy: a factor analytic study of a US probability sample.
Latimer, William W; Harwood, Eileen M; Newcomb, Michael D; Wagenaar, Alexander C
2003-03-01
Public opinion has been one factor affecting change in policies designed to reduce underage alcohol use. Extant research, however, has been criticized for using single survey items of unknown reliability to define adult attitudes on alcohol policy issues. The present investigation addresses a critical gap in the literature by deriving scales on public attitudes, knowledge, and concerns pertinent to alcohol policies designed to reduce underage drinking using a US probability sample survey of 7021 adults. Five attitudinal scales were derived from exploratory and confirmatory factor analyses addressing policies to: (1) regulate alcohol marketing, (2) regulate alcohol consumption in public places, (3) regulate alcohol distribution, (4) increase alcohol taxes, and (5) regulate youth access. The scales exhibited acceptable psychometric properties and were largely consistent with a rational framework which guided the survey construction.
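The scale-derivation step described above can be sketched in a few lines: fit an exploratory factor model to item responses and inspect the loadings for interpretable groupings. The Python example below uses synthetic responses with two planted attitude factors; the survey's actual items and five-factor solution are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(11)
n = 500
# Two hypothetical latent attitudes generate six survey items plus noise.
latent = rng.normal(size=(n, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
items = latent @ loadings.T + 0.3 * rng.normal(size=(n, 6))

fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
print(np.round(fa.components_, 2))  # recovered loadings group items 1-3 and 4-6
```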
Enhanced networked server management with random remote backups
NASA Astrophysics Data System (ADS)
Kim, Song-Kyoo
2003-08-01
In this paper, the model is focused on available server management in network environments. The (remote) backup servers are hooked up by VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) is a way to use a public network infrastructure and hooks up long-distance servers within a single network infrastructure. The servers can be represented as "machines", and the system then deals with unreliable main machines and random auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of the auxiliary machines changes with each activation in this enhanced model. Analytically tractable results are obtained by using several mathematical techniques, and the results are demonstrated in the framework of optimized networked server allocation problems.
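As a toy illustration of the availability question this model addresses, the Monte Carlo sketch below (in Python) estimates the probability that service stays up when an unreliable main server is backed by remote spares whose availability is redrawn at each activation. All rates are invented; the paper's analytical model is considerably more detailed.

```python
import numpy as np

rng = np.random.default_rng(9)

def availability(p_main_up=0.95, p_backup_avail=0.8, n_backups=2, trials=100_000):
    """Service is up if the main server is up, or if at least one remote
    backup happens to be available when the main fails. Backup availability
    is redrawn each trial, mimicking per-activation variability."""
    main_up = rng.random(trials) < p_main_up
    backups = rng.random((trials, n_backups)) < p_backup_avail
    return np.mean(main_up | backups.any(axis=1))

print(round(availability(), 4))  # ~0.998 for the assumed rates
```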
Formation of Tidal Captures and Gravitational Wave Inspirals in Binary-single Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samsing, Johan; MacLeod, Morgan; Ramirez-Ruiz, Enrico
We perform the first systematic study of how dynamical stellar tides and general relativistic (GR) effects affect the dynamics and outcomes of binary-single interactions. For this, we have constructed an N-body code that includes tides in the affine approximation, where stars are modeled as self-similar ellipsoidal polytropes, and GR corrections using the commonly used post-Newtonian formalism. Using this numerical formalism, we are able to resolve the leading effect from tides and GR across several orders of magnitude in both stellar radius and initial target binary separation. We find that the main effect from tides is the formation of two-body tidal captures that form during the chaotic and resonant evolution of the triple system. The two stars undergoing the capture spiral in and merge. The inclusion of tides can thus lead to an increase in the stellar coalescence rate. We also develop an analytical framework for calculating the cross section of tidal inspirals between any pair of objects with similar mass. From our analytical and numerical estimates, we find that the rate of tidal inspirals relative to collisions increases as the initial semimajor axis of the target binary increases and the radius of the interacting tidal objects decreases. The largest effect is therefore found for triple systems hosting white dwarfs and neutron stars (NSs). In this case, we find the rate of highly eccentric white dwarf-NS mergers to likely be dominated by tidal inspirals. While tidal inspirals occur rarely, we note that they can give rise to a plethora of thermonuclear transients, such as Ca-rich transients.
Single-Cell Detection of Secreted Aβ and sAPPα from Human IPSC-Derived Neurons and Astrocytes.
Liao, Mei-Chen; Muratore, Christina R; Gierahn, Todd M; Sullivan, Sarah E; Srikanth, Priya; De Jager, Philip L; Love, J Christopher; Young-Pearse, Tracy L
2016-02-03
Secreted factors play a central role in normal and pathological processes in every tissue in the body. The brain is composed of a highly complex milieu of different cell types and few methods exist that can identify which individual cells in a complex mixture are secreting specific analytes. By identifying which cells are responsible, we can better understand neural physiology and pathophysiology, more readily identify the underlying pathways responsible for analyte production, and ultimately use this information to guide the development of novel therapeutic strategies that target the cell types of relevance. We present here a method for detecting analytes secreted from single human induced pluripotent stem cell (iPSC)-derived neural cells and have applied the method to measure amyloid β (Aβ) and soluble amyloid precursor protein-alpha (sAPPα), analytes central to Alzheimer's disease pathogenesis. Through these studies, we have uncovered the dynamic range of secretion profiles of these analytes from single iPSC-derived neuronal and glial cells and have molecularly characterized subpopulations of these cells through immunostaining and gene expression analyses. In examining Aβ and sAPPα secretion from single cells, we were able to identify previously unappreciated complexities in the biology of APP cleavage that could not otherwise have been found by studying averaged responses over pools of cells. This technique can be readily adapted to the detection of other analytes secreted by neural cells, which would have the potential to open new perspectives into human CNS development and dysfunction. We have established a technology that, for the first time, detects secreted analytes from single human neurons and astrocytes. We examine secretion of the Alzheimer's disease-relevant factors amyloid β (Aβ) and soluble amyloid precursor protein-alpha (sAPPα) and present novel findings that could not have been observed without a single-cell analytical platform. First, we identify a previously unappreciated subpopulation that secretes high levels of Aβ in the absence of detectable sAPPα. Further, we show that multiple cell types secrete high levels of Aβ and sAPPα, but cells expressing GABAergic neuronal markers are overrepresented. Finally, we show that astrocytes are competent to secrete high levels of Aβ and therefore may be a significant contributor to Aβ accumulation in the brain. Copyright © 2016 the authors.
Self-contained image mapping of placental vasculature in 3D ultrasound-guided fetoscopy.
Yang, Liangjing; Wang, Junchen; Ando, Takehiro; Kubota, Akihiro; Yamashita, Hiromasa; Sakuma, Ichiro; Chiba, Toshio; Kobayashi, Etsuko
2016-09-01
Surgical navigation technology directed at fetoscopic procedures is relatively underdeveloped compared with other forms of endoscopy. The narrow fetoscopic field of view and the vast vascular network on the placenta make examination and photocoagulation treatment of twin-to-twin transfusion syndrome challenging. Though ultrasonography is used for intraoperative guidance, its navigational ability is not fully exploited. This work aims to integrate 3D ultrasound imaging and endoscopic vision seamlessly for placental vasculature mapping through a self-contained framework without external navigational devices. This is achieved through the development, integration, and experimental evaluation of novel navigational modules. Firstly, a framework design that addresses the current limitations based on identified gaps is conceptualized. Secondly, integration of navigational modules including (1) ultrasound-based localization, (2) image alignment, and (3) vision-based tracking to update the scene texture map is implemented. This updated texture map is projected onto an ultrasound-constructed 3D model for photorealistic texturing of the 3D scene, creating a panoramic view from the moving fetoscope. In addition, a collaborative scheme for the integration of the modular workflow system is proposed to schedule updates in a systematic fashion. Finally, experiments are carried out to evaluate each modular variation and an integrated collaborative scheme of the framework. The modules and the collaborative scheme are evaluated through a series of phantom experiments with controlled trajectories for repeatability. The collaborative framework demonstrated the best accuracy (5.2 % RMS error) compared with all three single-module variations during the experiment. Validation on an ex vivo monkey placenta shows visual continuity of the freehand fetoscopic panorama. The proposed collaborative framework and the evaluation study of its variations provide analytical insights into the effective integration of ultrasonography and endoscopy. This contributes to the development of navigation techniques in fetoscopic procedures and can potentially be extended to other applications in intraoperative imaging.
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.
2011-01-01
The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.
Moat, K A; Abelson, J
2011-12-01
During the 2001 election campaign, President Yoweri Museveni announced he was abolishing user fees for health services in Uganda. No analysis has been carried out to explain how he was able to initiate such an important policy decision without encountering any immediate barriers. To explain this outcome through in-depth policy analysis driven by the application of key analytical frameworks. An explanatory case study informed by analytical frameworks from the institutionalism literature was undertaken. Multiple data sources were used including: academic literature, key government documents, grey literature, and a variety of print media. According to the analytical frameworks employed, several formal institutional constraints existed that would have reduced the prospects for the abolition of user fees. However, prevalent informal institutions such as "Big Man" presidentialism and clientelism that were both 'competing' and 'complementary' can be used to explain the policy outcome. The analysis suggests that these factors trumped the impact of more formal institutional structures in the Ugandan context. Consideration should be given to the interactions between formal and informal institutions in the analysis of health policy processes in Uganda, as they provide a more nuanced understanding of how each set of factors influence policy outcomes.
Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique
2010-01-01
Background The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. Purpose PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. Methods A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Results Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Conclusion Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France. PMID:20689740
High-Performance Data Analytics Beyond the Relational and Graph Data Models with GEMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castellana, Vito G.; Minutoli, Marco; Bhatt, Shreyansh
Graphs represent an increasingly popular data model for data analytics, since they can naturally represent relationships and interactions between entities. Relational databases and their pure table-based data model are not well suited to store and process sparse data. Consequently, graph databases have gained interest in the last few years, and the Resource Description Framework (RDF) became the standard data model for graph data. Nevertheless, while RDF is well suited to analyzing the relationships between entities, it is not efficient in representing their attributes and properties. In this work we propose the adoption of a new hybrid data model, based on attributed graphs, that aims at overcoming the limitations of the pure relational and graph data models. We present how we have re-designed the GEMS data-analytics framework to fully take advantage of the proposed hybrid data model. To improve analysts' productivity, in addition to a C++ API for application development, we adopt GraQL as the input query language. We validate our approach by implementing a set of queries on net-flow data, and we compare our framework's performance against Neo4j. Experimental results show significant performance improvement over Neo4j, up to several orders of magnitude when increasing the size of the input data.
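As a loose illustration of the attributed-graph idea (this is not the GEMS C++ API or GraQL, and all field names and values are invented), the following Python/networkx sketch stores net-flow properties directly on edges, so a query can mix topology and attributes without the triple reification that pure RDF would require:

```python
import networkx as nx

# Hypothetical net-flow records: (src, dst, attributes).
flows = [
    ("10.0.0.1", "10.0.0.2", {"bytes": 1520, "proto": "TCP", "dport": 443}),
    ("10.0.0.2", "10.0.0.3", {"bytes": 87340, "proto": "TCP", "dport": 22}),
    ("10.0.0.1", "10.0.0.3", {"bytes": 410, "proto": "UDP", "dport": 53}),
]

# Attributed multigraph: hosts are nodes, each flow is an edge carrying
# its properties directly rather than as extra triples.
G = nx.MultiDiGraph()
for src, dst, attrs in flows:
    G.add_edge(src, dst, **attrs)

# Query mixing topology (who talks to whom) with attributes (large TCP flows).
big_tcp = [(u, v, d) for u, v, d in G.edges(data=True)
           if d["proto"] == "TCP" and d["bytes"] > 10000]
print(big_tcp)
```

The design point is exactly the one the abstract makes: keeping attributes on the edge avoids both the wide sparse tables of the relational model and the property-as-extra-node encoding of RDF.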
Exploring the Moral Complexity of School Choice: Philosophical Frameworks and Contributions
ERIC Educational Resources Information Center
Wilson, Terri S.
2015-01-01
In this essay, I describe some of the methodological dimensions of my ongoing research into how parents choose schools. I particularly focus on how philosophical frameworks and analytical strategies have shaped the empirical portion of my research. My goal, in this essay, is to trace and explore the ways in which philosophy of education--as a…
ERIC Educational Resources Information Center
Edwards, D. Brent, Jr.
2013-01-01
This article uses multiple perspectives to frame international processes of education policy formation and then applies the framework to El Salvador's Plan 2021 between 2003 and 2005. These perspectives are policy attraction, policy negotiation, policy imposition, and policy hybridization. Research reveals that the formation of Plan 2021 was the…
Behavioral assessment of personality disorders.
Nelson-Gray, R O; Farmer, R F
1999-04-01
This article examines the definition of personality disorders (PDs) from a functional analytical framework and discusses the potential utility of such a framework to account for behavioral tendencies associated with PD pathology. Also reviewed are specific behavioral assessment methods that can be employed in the assessment of PDs, and how information derived from these assessments may be linked to specific intervention strategies.
An Empirical Investigation of Entrepreneurship Intensity in Iranian State Universities
ERIC Educational Resources Information Center
Mazdeh, Mohammad Mahdavi; Razavi, Seyed-Mostafa; Hesamamiri, Roozbeh; Zahedi, Mohammad-Reza; Elahi, Behin
2013-01-01
The purpose of this study is to propose a framework to evaluate the entrepreneurship intensity (EI) of Iranian state universities. In order to determine EI, a hybrid multi-method framework consisting of Delphi, Analytic Network Process (ANP), and VIKOR is proposed. The Delphi method is used to localize and reduce the number of criteria extracted…
ERIC Educational Resources Information Center
Clarke, Lane Whitney; Bartholomew, Audrey
2014-01-01
The purpose of this study was to investigate instructor participation in asynchronous discussions through an in-depth content analysis of instructors' postings and comments through the Community of Inquiry (COI) framework (Garrison et. al, 2001). We developed an analytical tool based on this framework in order to better understand what instructors…
ERIC Educational Resources Information Center
Hser, Yih-Ing; Longshore, Douglas; Anglin, M. Douglas
2007-01-01
This article discusses the life course perspective on drug use, including conceptual and analytic issues involved in developing the life course framework to explain how drug use trajectories develop during an individual's lifetime and how this knowledge can guide new research and approaches to management of drug dependence. Central concepts…
ERIC Educational Resources Information Center
McKinley, Jim
2015-01-01
This article makes the argument that we need to situate student's academic writing as socially constructed pieces of writing that embody a writer's cultural identity and critical argument. In support, I present and describe a comprehensive model of an original English as a Foreign Language (EFL) writing analytical framework. This article explains…
ERIC Educational Resources Information Center
Jaipal, Kamini
2010-01-01
The teaching of science is a complex process, involving the use of multiple modalities. This paper illustrates the potential of a multimodal semiotics discourse analysis framework to illuminate meaning-making possibilities during the teaching of a science concept. A multimodal semiotics analytical framework is developed and used to (1) analyze the…
Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert
2013-01-01
Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).
A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis
ERIC Educational Resources Information Center
Schiazza, Daniela Marie
2013-01-01
The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…
ERIC Educational Resources Information Center
Blackman, Stacey
2007-01-01
The cognitions of Caribbean students with dyslexia are explored as part of an embedded multiple case study approach to teaching and learning at two secondary schools on the island of Barbados. This exploration employed "low tech" approaches to analyse what pupils had said in interviews using a Miles and Huberman (1994) framework.…
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community about what approaches are technically and scientifically feasible.
Aminbakhsh, Saman; Gunduz, Murat; Sonmez, Rifat
2013-09-01
The inherent and unique risks on construction projects quite often present key challenges to contractors. Health and safety risks are among the most significant risks in construction projects since the construction industry is characterized by a relatively high injury and death rate compared to other industries. In construction project management, safety risk assessment is an important step toward identifying potential hazards and evaluating the risks associated with the hazards. Adequate prioritization of safety risks during risk assessment is crucial for planning, budgeting, and management of safety related risks. In this paper, a safety risk assessment framework is presented based on the cost of safety (COS) model and the analytic hierarchy process (AHP). The main contribution of the proposed framework is that it presents a robust method for prioritization of safety risks in construction projects to create a rational budget and to set realistic goals without compromising safety. The framework provides a decision tool for the decision makers to determine the adequate accident/injury prevention investments while considering the funding limits. The proposed safety risk framework is illustrated using a real-life construction project, and the advantages and limitations of the framework are discussed.
NASA Astrophysics Data System (ADS)
Li, Chuan-Yao; Huang, Hai-Jun; Tang, Tie-Qiao
2017-03-01
This paper investigates the traffic flow dynamics under the social optimum (SO) principle in a single-entry traffic corridor with staggered shifts from the analytical and numerical perspectives. The LWR (Lighthill-Whitham and Richards) model and the Greenshields velocity-density function are utilized to describe the dynamic properties of traffic flow. The closed-form SO solution is analytically derived and some numerical examples are used to further verify the analytical solution. The optimum proportion of the numbers of commuters with different desired arrival times is further discussed, where the analytical and numerical results both indicate that the cumulative outflow curve under the SO principle is piecewise smooth.
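For readers who want the model equations, the LWR conservation law with the Greenshields linear speed-density relation reads (standard forms, restated here rather than taken from the paper; v_f is the free-flow speed and ρ_j the jam density):

\[
\frac{\partial \rho}{\partial t} + \frac{\partial (\rho v)}{\partial x} = 0,
\qquad
v(\rho) = v_f \left( 1 - \frac{\rho}{\rho_j} \right),
\]

so the flow q(ρ) = ρ v(ρ) is a parabola with capacity v_f ρ_j / 4 at ρ = ρ_j / 2, which is what makes closed-form manipulation of the SO solution tractable.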
ERIC Educational Resources Information Center
Villavicencio, Adriana; Klevan, Sarah; Guidry, Brandon; Wulach, Suzanne
2014-01-01
This appendix describes the data collection and analytic processes used to develop the findings in the report "Promising Opportunities for Black and Latino Young Men." A central challenge was creating an analytic framework that could be uniformly applied to all schools, despite the individualized nature of their Expanded Success…
Sensemaking during the Use of Learning Analytics in the Context of a Large College System
ERIC Educational Resources Information Center
Morse, Robert Kenneth
2017-01-01
This research took place as a cognitive exploration of sensemaking of learning analytics at Ivy Tech Community College of Indiana. For the courses with the largest online enrollment, quality standards in the course design are maintained by creating sections from a course design framework. This means all sections have the same starting content and…
ERIC Educational Resources Information Center
Fisher, James E.; Sealey, Ronald W.
The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…
ERIC Educational Resources Information Center
Ingrey, Jennifer C.
2012-01-01
This paper derives from a larger study, looking at how students in one secondary school in Ontario problematised and understood gender expression. This study applies a Foucaultian analytic framework of disciplinary space to the problem of the bathroom in public schools. It focuses specifically on the surveillance and regulation of gendered bodies…
Learning Analytics in Small-Scale Teacher-Led Innovations: Ethical and Data Privacy Issues
ERIC Educational Resources Information Center
Rodríguez-Triana, María Jesús; Martínez-Monés, Alejandra; Villagrá-Sobrino, Sara
2016-01-01
As a further step towards maturity, the field of learning analytics (LA) is working on the definition of frameworks that structure the legal and ethical issues that scholars and practitioners must take into account when planning and applying LA solutions to their learning contexts. However, current efforts in this direction tend to be focused on…
Bajoub, Aadil; Bendini, Alessandra; Fernández-Gutiérrez, Alberto; Carrasco-Pancorbo, Alegría
2018-03-24
Over the last decades, olive oil quality and authenticity control has become an issue of great importance to consumers, suppliers, retailers, and regulators in both traditional and emerging olive oil producing countries, mainly due to the increasing worldwide popularity and the trade globalization of this product. Thus, in order to ensure olive oil authentication, various national and international laws and regulations have been adopted, although some of them are actually causing an enormous debate about the risk that they can represent for the harmonization of international olive oil trade standards. Within this context, this review was designed to provide a critical overview and comparative analysis of selected regulatory frameworks for olive oil authentication, with special emphasis on the quality and purity criteria considered by these regulation systems, their thresholds and the analytical methods employed for monitoring them. To complete the general overview, recent analytical advances to overcome drawbacks and limitations of the official methods to evaluate olive oil quality and to determine possible adulterations were reviewed. Furthermore, the latest trends on analytical approaches to assess the olive oil geographical and varietal origin traceability were also examined.
A novel analytical description of periodic volume coil geometries in MRI
NASA Astrophysics Data System (ADS)
Koh, D.; Felder, J.; Shah, N. J.
2018-03-01
MRI volume coils can be represented by equivalent lumped-element circuits, and for a variety of these circuit configurations analytical design equations have been presented. The unification of several volume coil topologies results in a two-dimensional gridded equivalent lumped-element circuit, which comprises the birdcage resonator and its multiple-endring derivatives, but also novel structures such as the capacitively coupled ring resonator. The theory section analyzes a general two-dimensional circuit by noting that its current distribution can be decomposed into a longitudinal and an azimuthal dependency. This can be exploited to compare the current distribution with the transfer function of a filter circuit along one direction. The resonances of the transfer function coincide with the resonances of the volume resonator, and the simple analytical solution can be used as a design equation. The proposed framework is verified experimentally against a novel capacitively coupled ring structure which was derived from the general circuit formulation and is proven to exhibit a dominant homogeneous mode. In conclusion, a unified analytical framework is presented that allows determining the resonance frequency of any volume resonator that can be represented by a two-dimensional meshed equivalent circuit.
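A schematic version of the filter-circuit analogy the abstract invokes (not the paper's exact derivation): if each of the N identical sections of the azimuthal ladder contributes a frequency-dependent phase shift φ(ω), closure of the current path around the ring forces

\[
N\,\varphi(\omega_m) = 2\pi m, \qquad m = 0, 1, \ldots, \lfloor N/2 \rfloor,
\]

so the mode frequencies ω_m are read off from the dispersion φ(ω) of a single section. The design equations amount to solving this condition for whatever lumped-element section the two-dimensional gridded circuit prescribes.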
A genetic algorithm-based job scheduling model for big data analytics.
Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei
Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. The existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes excessive energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
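A minimal sketch of GA-based job ordering, under stated assumptions: the toy `predicted_makespan` below stands in for the paper's cluster-performance estimation module, and the job costs, fitness, and GA parameters are all invented for illustration.

```python
import random

JOB_COSTS = [7, 3, 9, 2, 5]          # illustrative per-job runtimes

def predicted_makespan(order):
    # Toy stand-in for the estimation module: weighted completion time,
    # which penalizes placing long jobs late (purely illustrative).
    return sum(cost * (pos + 1) for pos, cost in
               enumerate(JOB_COSTS[j] for j in order))

def crossover(a, b):
    # Order crossover (OX): keeps each job exactly once in the child.
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [x for x in b if x not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(order, rate=0.2):
    # Occasionally swap two positions to maintain diversity.
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

# Evolve permutations of the job set toward low predicted makespan.
population = [random.sample(range(5), 5) for _ in range(20)]
for _ in range(50):
    population.sort(key=predicted_makespan)
    parents = population[:10]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

print(min(population, key=predicted_makespan))
```

In the paper's setting the fitness would instead be the estimator's prediction for a Hadoop cluster executing the candidate job sequence; everything else about the GA loop carries over unchanged.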
USDA-ARS?s Scientific Manuscript database
Enzyme-linked immunosorbent assays (ELISAs) usually focus on the detection of a single analyte or a single group of analytes, e.g., fluoroquinolones or sulfonamides. However, it is often necessary to simultaneously monitor the two classes of antimicrobial residues in different food matrices. In th...
Texas two-step: a framework for optimal multi-input single-output deconvolution.
Neelamani, Ramesh; Deffenbaugh, Max; Baraniuk, Richard G
2007-11-01
Multi-input single-output deconvolution (MISO-D) aims to extract a deblurred estimate of a target signal from several blurred and noisy observations. This paper develops a new two-step framework--Texas Two-Step--to solve MISO-D problems with known blurs. Texas Two-Step first reduces the MISO-D problem to a related single-input single-output deconvolution (SISO-D) problem by invoking the concept of sufficient statistics (SSs) and then solves the simpler SISO-D problem using an appropriate technique. The two-step framework enables new MISO-D techniques (both optimal and suboptimal) based on the rich suite of existing SISO-D techniques. In fact, the properties of SSs imply that a MISO-D algorithm is mean-squared-error optimal if and only if it can be rearranged to conform to the Texas Two-Step framework. Using this insight, we construct new wavelet- and curvelet-based MISO-D algorithms with asymptotically optimal performance. Simulated and real data experiments verify that the framework is indeed effective.
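A compact way to see the reduction, in the textbook special case of i.i.d. white Gaussian noise (not necessarily the paper's most general setting): with observations y_i = h_i * x + n_i, the matched-filtered sum is a sufficient statistic for x,

\[
S(\omega) \;=\; \sum_i H_i^*(\omega)\, Y_i(\omega)
\;=\; \underbrace{\Big(\sum_i |H_i(\omega)|^2\Big)}_{G(\omega)} X(\omega) + \tilde N(\omega),
\]

which is exactly a SISO-D problem with the single equivalent blur G(ω) and Gaussian noise of spectral density proportional to G(ω); any SISO deconvolution technique (wavelet- or curvelet-based, in the paper) can then be applied to S.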
Google Analytics: Single Page Traffic Reports
These are pages that live outside of Google Analytics (GA) but allow you to view GA data for any individual page on either the public EPA web or EPA intranet. You do need to log in to Google Analytics to view them.
Metadata Design in the New PDS4 Standards - Something for Everybody
NASA Astrophysics Data System (ADS)
Raugh, Anne C.; Hughes, John S.
2015-11-01
The Planetary Data System (PDS) archives, supports, and distributes data of diverse targets, from diverse sources, to diverse users. One of the core problems addressed by the PDS4 data standard redesign was that of metadata - how to accommodate the increasingly sophisticated demands of search interfaces, analytical software, and observational documentation into label standards without imposing limits and constraints that would impinge on the quality or quantity of metadata that any particular observer or team could supply. And yet, as an archive, PDS must have detailed documentation for the metadata in the labels it supports, or the institutional knowledge encoded into those attributes will be lost - putting the data at risk. The PDS4 metadata solution is based on a three-step approach. First, it is built on two key ISO standards: ISO 11179 "Information Technology - Metadata Registries", which provides a common framework and vocabulary for defining metadata attributes; and ISO 14721 "Space Data and Information Transfer Systems - Open Archival Information System (OAIS) Reference Model", which provides the framework for the information architecture that enforces the object-oriented paradigm for metadata modeling. Second, PDS has defined a hierarchical system that allows it to divide its metadata universe into namespaces ("data dictionaries", conceptually), and more importantly to delegate stewardship for a single namespace to a local authority. This means that a mission can develop its own data model with a high degree of autonomy and effectively extend the PDS model to accommodate its own metadata needs within the common ISO 11179 framework. Finally, within a single namespace - even the core PDS namespace - existing metadata structures can be extended and new structures added to the model as new needs are identified. This poster illustrates the PDS4 approach to metadata management and highlights the expected return on the development investment for PDS, users, and data preparers.
Evaluation Framework for NASA's Educational Outreach Programs
NASA Technical Reports Server (NTRS)
Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie
1999-01-01
The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to do a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.
Kim, Hyehyun; Oh, Minhak; Kim, Dongwook; Park, Jeongin; Seong, Junmo; Kwak, Sang Kyu; Lah, Myoung Soo
2015-02-28
Single crystalline hollow metal-organic frameworks (MOFs) with cavity dimensions on the order of several micrometers and hundreds of micrometers were prepared using a metal-organic polyhedron single crystal as a sacrificial hard template. The hollow nature of the MOF crystal was confirmed by scanning electron microscopy of the crystal sliced using a focused ion beam.
Adaptive scaling model of the main pycnocline and the associated overturning circulation
NASA Astrophysics Data System (ADS)
Fuckar, Neven-Stjepan
This thesis examines a number of crucial factors and processes that control the structure of the main pycnocline and the associated overturning circulation that maintains the ocean stratification. We construct an adaptive scaling model: a semi-empirical low-order theory based on the total transformation balance that linearly superimposes parameterized transformation rate terms of various mechanisms that participate in the water-mass conversion between the warm water sphere and the cold water sphere. The depth of the main pycnocline separates the light-water domain from the dense-water domain beneath the surface, hence we introduce a new definition in an integral form that is dynamically based on the large-scale potential vorticity (i.e., vertical density gradient is selected for the kernel function of the normalized vertical integral). We exclude the abyssal pycnocline from our consideration and limit our domain of interest to the top 2 km of water column. The goal is to understand the controlling mechanisms, and analytically predict and describe a wide spectrum of ocean steady states in terms of key large-scale indices relevant for understanding the ocean's role in climate. A devised polynomial equation uses the average depth of the main pycnocline as a single unknown (the key vertical scale of the upper ocean stratification) and gives us an estimate for the northern hemisphere deep water production and export across the equator from the parts of this equation. The adaptive scaling model aims to elucidate the roles of a limited number of dominant processes that determine some key upper ocean circulation and stratification properties. Additionally, we use a general circulation model in a series of simplified single-basin ocean configurations and surface forcing fields to confirm the usefulness of our analytical model and further clarify several aspects of the upper ocean structure. An idealized numerical setup, containing all the relevant physical and dynamical properties, is key to obtaining a clear understanding, uncomplicated by the effect of the real world geometry or intricacy of realistic surface radiative and turbulent fluxes. We show that wind-driven transformation processes can be decomposed into two terms separately driven by the mid-latitude westerlies and the low-latitude easterlies. Our analytical model smoothly connects all the classical limits describing different ocean regimes in a single-basin single-hemisphere geometry. The adjective "adaptive" refers to a simple and quantitatively successful adjustment to the description of a single-basin two-hemisphere ocean, with and without a circumpolar channel under the hemispherically symmetric surface buoyancy. For example, our water-mass conversion framework, unifying wind-driven and thermohaline processes, provides us with further insight into the "Drake Passage effect without Drake Passage". The modification of different transformation pathways in the Southern Hemisphere results in the equivalent net conversion changes. The introduction of hemispheric asymmetry in the surface density can lead to significant hemispheric differences in the main pycnocline structure. This demonstrates the limitations of our analytical model based on only one key vertical scale. Also, we show a strong influence of the northern hemisphere surface density change in high latitudes on the southern hemisphere stratification and circumpolar transport.
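One way to write the kind of integral definition described, as a sketch consistent with the abstract's wording only (z is the vertical coordinate, negative downward, and H the 2 km depth limit of the domain considered):

\[
D \;=\; \frac{\displaystyle\int_{-H}^{0} z\, \frac{\partial \rho}{\partial z}\, dz}{\displaystyle\int_{-H}^{0} \frac{\partial \rho}{\partial z}\, dz},
\]

i.e., the main pycnocline depth is a gradient-weighted mean depth, with the vertical density gradient acting as the kernel of the normalized vertical integral so that the weight concentrates where stratification is strongest.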
Ordered macro-microporous metal-organic framework single crystals
NASA Astrophysics Data System (ADS)
Shen, Kui; Zhang, Lei; Chen, Xiaodong; Liu, Lingmei; Zhang, Daliang; Han, Yu; Chen, Junying; Long, Jilan; Luque, Rafael; Li, Yingwei; Chen, Banglin
2018-01-01
We constructed highly oriented and ordered macropores within metal-organic framework (MOF) single crystals, opening up the area of three-dimensional–ordered macro-microporous materials (that is, materials containing both macro- and micropores) in single-crystalline form. Our methodology relies on the strong shaping effects of a polystyrene nanosphere monolith template and a double-solvent–induced heterogeneous nucleation approach. This process synergistically enabled the in situ growth of MOFs within ordered voids, rendering a single crystal with oriented and ordered macro-microporous structure. The improved mass diffusion properties of such hierarchical frameworks, together with their robust single-crystalline nature, endow them with superior catalytic activity and recyclability for bulky-molecule reactions, as compared with conventional, polycrystalline hollow, and disordered macroporous ZIF-8.
Total analysis systems with Thermochromic Etching Discs technology.
Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel
2014-12-16
A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independence, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the TED analytical implementation is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are herein addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is herein demonstrated, describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point-of-care framework.
A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.
Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie
2017-11-01
The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions.
Glenn, Catherine R; Kleiman, Evan M; Cha, Christine B; Deming, Charlene A; Franklin, Joseph C; Nock, Matthew K
2018-01-01
The field is in need of novel and transdiagnostic risk factors for suicide. The National Institute of Mental Health's Research Domain Criteria (RDoC) provides a framework that may help advance research on suicidal behavior. We conducted a meta-analytic review of existing prospective risk and protective factors for suicidal thoughts and behaviors (ideation, attempts, and deaths) that fall within one of the five RDoC domains or relate to a prominent suicide theory. Predictors were selected from a database of 4,082 prospective risk and protective factors for suicide outcomes. A total of 460 predictors met inclusion criteria for this meta-analytic review and most examined risk (vs. protective) factors for suicidal thoughts and behaviors. The overall effect of risk factors was statistically significant, but relatively small, in predicting suicide ideation (weighted mean odds ratio: wOR = 1.72; 95% CI: 1.59-1.87), suicide attempt (wOR = 1.66 [1.57-1.76]), and suicide death (wOR = 1.41 [1.24-1.60]). Across all suicide outcomes, most risk factors related to the Negative Valence Systems domain, although effect sizes were of similar magnitude across RDoC domains. This study demonstrated that the RDoC framework provides a novel and promising approach to suicide research; however, relatively few studies of suicidal behavior fit within this framework. Future studies must go beyond the "usual suspects" of suicide risk factors (e.g., mental disorders, sociodemographics) to understand the processes that combine to lead to this deadly outcome.
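For reference, a weighted mean odds ratio of the kind reported is conventionally pooled on the log scale with inverse-variance weights (shown here in generic form; the review's exact weighting scheme is not restated):

\[
\ln \mathrm{wOR} \;=\; \frac{\sum_k w_k \ln \mathrm{OR}_k}{\sum_k w_k},
\qquad
w_k = \frac{1}{\mathrm{SE}_k^2 + \tau^2},
\]

where τ² is the between-study variance; setting τ² = 0 recovers the fixed-effect estimate.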
Analytical Energy Gradients for Excited-State Coupled-Cluster Methods
NASA Astrophysics Data System (ADS)
Wladyslawski, Mark; Nooijen, Marcel
The equation-of-motion coupled-cluster (EOM-CC) and similarity transformed equation-of-motion coupled-cluster (STEOM-CC) methods have been firmly established as accurate and routinely applicable extensions of single-reference coupled-cluster theory to describe electronically excited states. An overview of these methods is provided, with emphasis on the many-body similarity transform concept that is the key to a rationalization of their accuracy. The main topic of the paper is the derivation of analytical energy gradients for such non-variational electronic structure approaches, with an ultimate focus on obtaining their detailed algebraic working equations. A general theoretical framework using Lagrange's method of undetermined multipliers is presented, and the method is applied to formulate the EOM-CC and STEOM-CC gradients in abstract operator terms, following the previous work in [P.G. Szalay, Int. J. Quantum Chem. 55 (1995) 151] and [S.R. Gwaltney, R.J. Bartlett, M. Nooijen, J. Chem. Phys. 111 (1999) 58]. Moreover, the systematics of the Lagrange multiplier approach is suitable for automation by computer, enabling the derivation of the detailed derivative equations through a standardized and direct procedure. To this end, we have developed the SMART (Symbolic Manipulation and Regrouping of Tensors) package of automated symbolic algebra routines, written in the Mathematica programming language. The SMART toolkit provides the means to expand, differentiate, and simplify equations by manipulation of the detailed algebraic tensor expressions directly. The Lagrangian multiplier formulation establishes a uniform strategy to perform the automated derivation in a standardized manner: A Lagrange multiplier functional is constructed from the explicit algebraic equations that define the energy in the electronic method; the energy functional is then made fully variational with respect to all of its parameters, and the symbolic differentiations directly yield the explicit equations for the wavefunction amplitudes, the Lagrange multipliers, and the analytical gradient via the perturbation-independent generalized Hellmann-Feynman effective density matrix. This systematic automated derivation procedure is applied to obtain the detailed gradient equations for the excitation energy (EE-), double ionization potential (DIP-), and double electron affinity (DEA-) similarity transformed equation-of-motion coupled-cluster singles-and-doubles (STEOM-CCSD) methods. In addition, the derivatives of the closed-shell-reference excitation energy (EE-), ionization potential (IP-), and electron affinity (EA-) equation-of-motion coupled-cluster singles-and-doubles (EOM-CCSD) methods are derived. Furthermore, the perturbative EOM-PT and STEOM-PT gradients are obtained. The algebraic derivative expressions for these dozen methods are all derived here uniformly through the automated Lagrange multiplier process and are expressed compactly in a chain-rule/intermediate-density formulation, which facilitates a unified modular implementation of analytic energy gradients for CCSD/PT-based electronic methods. The working equations for these analytical gradients are presented in full detail, and their factorization and implementation into an efficient computer code are discussed.
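The Lagrange-multiplier strategy described above can be summarized schematically (generic notation, not the paper's detailed working equations): with amplitudes t fixed by residual equations Ω_μ(t; χ) = 0 at perturbation strength χ, one forms

\[
\mathcal{L}(\mathbf{t}, \boldsymbol{\lambda}; \chi)
= E(\mathbf{t}; \chi) + \sum_\mu \lambda_\mu\, \Omega_\mu(\mathbf{t}; \chi),
\qquad
\frac{\partial \mathcal{L}}{\partial t_\nu} = 0 \;\Rightarrow\; \boldsymbol{\lambda},
\qquad
\frac{dE}{d\chi} = \frac{\partial \mathcal{L}}{\partial \chi}.
\]

Because the multiplier equations are perturbation-independent, a single λ solve replaces the amplitude response to every perturbation, and the gradient is assembled from an effective (generalized Hellmann-Feynman) density matrix, which is precisely the structure the automated SMART derivations exploit.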
2017-01-01
In this study, we present a theoretical framework combining experimental characterizations and analytical calculus to capture the firing rate input-output properties of single neurons in the fluctuation-driven regime. Our framework consists of a two-step procedure to treat independently how the dendritic input translates into somatic fluctuation variables, and how the latter determine action potential firing. We use this framework to investigate the functional impact of the heterogeneity in firing responses found experimentally in young mice layer V pyramidal cells. We first design and calibrate in vitro a simplified morphological model of layer V pyramidal neurons with a dendritic tree following Rall's branching rule. Then, we propose an analytical derivation for the membrane potential fluctuations at the soma as a function of the properties of the synaptic input in dendrites. This mathematical description allows us to easily emulate various forms of synaptic input: either balanced, unbalanced, synchronized, purely proximal or purely distal synaptic activity. We find that those different forms of dendritic input activity lead to various impact on the somatic membrane potential fluctuations properties, thus raising the possibility that individual neurons will differentially couple to specific forms of activity as a result of their different firing response. We indeed found such a heterogeneous coupling between synaptic input and firing response for all types of presynaptic activity. This heterogeneity can be explained by different levels of cellular excitability in the case of the balanced, unbalanced, synchronized and purely distal activity. A notable exception appears for proximal dendritic inputs: increasing the input level can either promote firing response in some cells, or suppress it in some other cells whatever their individual excitability. This behavior can be explained by different sensitivities to the speed of the fluctuations, which was previously associated to different levels of sodium channel inactivation and density. Because local network connectivity rather targets proximal dendrites, our results suggest that this aspect of biophysical heterogeneity might be relevant to neocortical processing by controlling how individual neurons couple to local network activity. PMID:28410418
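As a concrete, schematic example of the "somatic fluctuation variables" step (standard conductance-based notation, not the paper's full derivation): the mean membrane potential at the soma under mean excitatory and inhibitory synaptic conductances μ_Ge, μ_Gi is

\[
\mu_V \;=\; \frac{g_L E_L + \mu_{G_e} E_e + \mu_{G_i} E_i}{g_L + \mu_{G_e} + \mu_{G_i}},
\]

where g_L and E_L are the leak conductance and reversal, and E_e, E_i the synaptic reversal potentials. The framework derives such moments (together with the fluctuation amplitude σ_V and the autocorrelation time of V) from the dendritic input properties, and the firing response is then expressed as a function of those somatic variables.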
Human mobility and time spent at destination: impact on spatial epidemic spreading.
Poletto, Chiara; Tizzoni, Michele; Colizza, Vittoria
2013-12-07
Host mobility plays a fundamental role in the spatial spread of infectious diseases. Previous theoretical works based on the integration of network theory into the metapopulation framework have shown that the heterogeneities that characterize real mobility networks favor the propagation of epidemics. Nevertheless, the studies conducted so far assumed the mobility process to be either Markovian (in which the memory of the origin of each traveler is lost) or non-Markovian with a fixed traveling time scale (in which individuals travel to a destination and come back at a constant rate). Available statistics however show that the time spent by travelers at destination is characterized by wide fluctuations, ranging from a single day up to several months. Such varying length of stay crucially affects the chance and duration of mixing events among hosts and may therefore have a strong impact on the spread of an emerging disease. Here, we present an analytical and a computational study of epidemic processes on a complex subpopulation network where travelers have memory of their origin and spend a heterogeneously distributed time interval at their destination. Through analytical calculations and numerical simulations we show that the heterogeneity of the length of stay alters the expression of the threshold between local outbreak and global invasion, and, moreover, it changes the epidemic behavior of the system in case of a global outbreak. Additionally, our theoretical framework allows us to study the effect of changes in the traveling behavior in response to the infection, by considering a scenario in which sick individuals do not leave their home location. Finally, we compare the results of our non-Markovian framework with those obtained with a classic Markovian approach and find relevant differences between the two, in the estimate of the epidemic invasion potential, as well as of the timing and the pattern of its spatial spread. These results highlight the importance of properly accounting for host trip duration in epidemic models and open the path to the inclusion of such an additional layer of complexity to the existing modeling approaches.
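Schematically, origin-destination mobility with memory tracks subpopulation occupancies N_ij (residents of i currently located in j) via rate equations of the form (generic notation, not the paper's)

\[
\frac{dN_{ij}}{dt} \;=\; \sigma_{ij}\, N_{ii} \;-\; \tau_{ij}^{-1}\, N_{ij},
\]

where σ_ij are traveling rates and τ_ij the mean lengths of stay. The paper's point is that replacing a single fixed τ with a broad, heterogeneous distribution of stay lengths changes both the global-invasion threshold and the timing and pattern of spatial spread.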
NASA Astrophysics Data System (ADS)
Bhatnagar, Shashank; Alemu, Lmenew
2018-02-01
In this work we calculate the mass spectra of charmonium for the 1P,…,4P states of 0++ and 1++, the 1S,…,5S states of 0-+, and the 1S,…,4D states of 1--, along with the two-photon decay widths of the ground and first excited states of the 0++ quarkonia for the process 0++ → γγ, in the framework of a QCD-motivated Bethe-Salpeter equation (BSE). In this 4×4 BSE framework, the coupled Salpeter equations are first shown to decouple for the confining part of the interaction (under the heavy-quark approximation) and are analytically solved, and later the one-gluon-exchange interaction is perturbatively incorporated, leading to mass spectral equations for various quarkonia. The analytic forms of the wave functions obtained are used for the calculation of the two-photon decay widths of χc0. Our results are in reasonable agreement with data (where available) and other models.
Co-governing decentralised water systems: an analytical framework.
Yu, C; Brown, R; Morison, P
2012-01-01
Current discourses in urban water management emphasise a diversity of water sources and scales of infrastructure for resilience and adaptability. During the last two decades, in particular, various small-scale systems have emerged and developed, so that the debate has largely moved from centralised versus decentralised water systems toward governing integrated and networked systems of provision and consumption in which small-scale technologies are embedded in large-scale centralised infrastructures. However, while centralised systems have established boundaries of ownership and management, decentralised water systems (such as stormwater harvesting technologies at the street or allotment/house scale) do not; therefore, the viability of adoption and/or continued use of decentralised water systems is challenged. This paper brings together insights from the literature on public sector governance, co-production, and social practices models to develop an analytical framework for co-governing such systems. The framework provides urban water practitioners with guidance when designing co-governance arrangements for decentralised water systems so that these systems continue to exist, and become widely adopted, within the established urban water regime.
Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.
El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher
2018-01-01
Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
Panos, Joseph A.; Hoffman, Joshua T.; Wordeman, Samuel C.; Hewett, Timothy E.
2016-01-01
Background Correction of neuromuscular impairments after anterior cruciate ligament injury is vital to successful return to sport. Frontal plane knee control during landing is a common measure of lower-extremity neuromuscular control and asymmetries in neuromuscular control of the knee can predispose injured athletes to additional injury and associated morbidities. Therefore, this study investigated the effects of anterior cruciate ligament injury on knee biomechanics during landing. Methods Two-dimensional frontal plane video of single leg drop, cross over drop, and drop vertical jump dynamic movement trials was analyzed for twenty injured and reconstructed athletes. The position of the knee joint center was tracked in ImageJ software for 500 milliseconds after landing to calculate medio-lateral knee motion velocities and determine normal fluency, the number of times per second knee velocity changed direction. The inverse of this calculation, analytical fluency, was used to associate larger numerical values with fluent movement. Findings Analytical fluency was decreased in involved limbs for single leg drop trials (P=0.0018). Importantly, analytical fluency for single leg drop differed compared to cross over drop trials for involved (P<0.001), but not uninvolved limbs (P=0.5029). For involved limbs, analytical fluency values exhibited a stepwise trend in relative magnitudes. Interpretation Decreased analytical fluency in involved limbs is consistent with previous studies. Fluency asymmetries observed during single leg drop tasks may be indicative of aberrant landing strategies in the involved limb. Analytical fluency differences in unilateral tasks for injured limbs may represent neuromuscular impairment as a result of injury. PMID:26895446
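The fluency metrics are simple enough to state in code. A minimal sketch, assuming only the definitions given in the abstract (per-frame medio-lateral knee-joint-center positions and a known frame rate; the sample data are made up):

```python
import numpy as np

def fluency_metrics(x, fps):
    """Normal fluency: direction changes of medio-lateral knee velocity
    per second; analytical fluency is its inverse (larger = more fluent)."""
    v = np.diff(x) * fps                           # frame-to-frame velocity
    signs = np.sign(v)
    changes = np.sum(signs[:-1] * signs[1:] < 0)   # direction reversals
    duration = len(v) / fps                        # seconds analyzed
    normal = changes / duration
    analytical = 1.0 / normal if normal > 0 else float("inf")
    return normal, analytical

# Illustrative 0.5 s of 60-fps tracking data (values are random, not real).
x = np.cumsum(np.random.randn(30) * 0.1)
print(fluency_metrics(x, fps=60))
```

With 500 ms of tracking, as in the study, the duration term simply normalizes the reversal count to a per-second rate.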
How Much Can We Learn from a Single Chromatographic Experiment? A Bayesian Perspective.
Wiczling, Paweł; Kaliszan, Roman
2016-01-05
In this work, we proposed and investigated a Bayesian inference procedure to find the desired chromatographic conditions based on known analyte properties (lipophilicity, pKa, and polar surface area) using one preliminary experiment. A previously developed nonlinear mixed-effect model was used to specify the prior information about a new analyte with known physicochemical properties. Further, the prior (no preliminary data) and posterior predictive distributions (prior + one experiment) were determined sequentially to search for the desired separation. The following isocratic high-performance reversed-phase liquid chromatographic conditions were sought: (1) retention time of a single analyte within the range of 4-6 min and (2) baseline separation of two analytes with retention times within the range of 4-10 min. The empirical posterior Bayesian distribution of parameters was estimated using the "slice sampling" Markov chain Monte Carlo (MCMC) algorithm implemented in Matlab. Simulations with artificial analytes and experimental data for ketoprofen and papaverine were used to test the proposed methodology. The simulation experiment showed that for a single analyte and for two randomly selected analytes, there is a 97% and 74% probability, respectively, of obtaining a successful chromatogram using none or one preliminary experiment. The desired separation for ketoprofen and papaverine was established based on a single experiment. It was confirmed that the search for a desired separation rarely requires a large number of chromatographic analyses, at least for a simple optimization problem. The proposed Bayesian-based optimization scheme is a powerful method of finding a desired chromatographic separation based on a small number of preliminary experiments.
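A minimal sketch of the prior-plus-one-experiment update, under loud assumptions: the linear solvent-strength retention model, all prior means and variances, and the measurement noise below are invented, and plain Metropolis sampling stands in for the paper's slice sampler.

```python
import numpy as np

t0 = 1.0                                  # column dead time [min] (assumed)
phi_obs, tR_obs, sigma = 0.5, 6.2, 0.2    # one preliminary experiment (made up)

def tR(theta, phi):
    # Linear solvent-strength model: log10 k = log10 kw - S * phi.
    log_kw, S = theta
    return t0 * (1.0 + 10.0 ** (log_kw - S * phi))

def log_post(theta):
    # Gaussian priors stand in for predictions from analyte properties.
    lp = -0.5 * ((theta[0] - 2.0) / 0.5) ** 2
    lp += -0.5 * ((theta[1] - 3.0) / 0.8) ** 2
    # Likelihood of the single preliminary measurement.
    lp += -0.5 * ((tR(theta, phi_obs) - tR_obs) / sigma) ** 2
    return lp

rng = np.random.default_rng(1)
theta, samples = np.array([2.0, 3.0]), []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.05, 2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[1000:])        # drop burn-in

# Posterior predictive retention times at a candidate organic fraction,
# to check against the 4-6 min target window.
pred = np.array([tR(s, 0.55) for s in samples])
print(pred.mean(), np.quantile(pred, [0.05, 0.95]))
```

The design logic mirrors the abstract: the prior alone gives a predictive distribution to plan the first run, and conditioning on that single run tightens the prediction enough to choose conditions meeting the retention target.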
Nine-analyte detection using an array-based biosensor
NASA Technical Reports Server (NTRS)
Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, Mark J.; Ligler, Frances S.
2002-01-01
A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.
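The combinatorial idea can be illustrated in a few lines: if each row carries a different capture-antibody mixture and each column a different tracer mixture, a positive spot at (row, column) identifies one of nine agents. The assignment grid below is hypothetical, not the paper's actual patterning, and the real design additionally optimizes mixtures against cross-reactivity.

```python
# Hypothetical 3x3 assignment of agents to (capture row, tracer column).
AGENT = [
    ["SEB",          "ricin",         "cholera toxin"],
    ["B. anthracis", "B. globigii",   "F. tularensis LVS"],
    ["Y. pestis F1", "MS2 coliphage", "S. typhimurium"],
]

def decode(signal, threshold=2.0):
    """Return the agents whose spot exceeds a fluorescence threshold."""
    return [AGENT[r][c]
            for r in range(3) for c in range(3)
            if signal[r][c] > threshold]

# Illustrative fluorescence readout (arbitrary units).
readout = [[0.3, 5.1, 0.2],
           [0.4, 0.3, 0.1],
           [0.2, 0.3, 7.8]]
print(decode(readout))   # ['ricin', 'S. typhimurium']
```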
Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.
Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L
2013-01-01
Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.
ERIC Educational Resources Information Center
Field, Christopher Ryan
2009-01-01
Developments in analytical chemistry were made using acoustically levitated small volumes of liquid to study enzyme reaction kinetics and by detecting volatile organic compounds in the gas phase using single-walled carbon nanotubes. Experience gained in engineering, electronics, automation, and software development from the design and…
ERIC Educational Resources Information Center
Waite, Sue; Bølling, Mads; Bentsen, Peter
2016-01-01
Using a conceptual model focused on purposes, aims, content, pedagogy, outcomes, and barriers, we review and interpret literature on two forms of outdoor learning: Forest Schools in England and "udeskole" in Denmark. We examine pedagogical principles within a comparative analytical framework and consider how adopted pedagogies reflect…
ERIC Educational Resources Information Center
Kim, Rae Young
2009-01-01
This study is an initial analytic attempt to iteratively develop a conceptual framework informed by both theoretical and practical perspectives that may be used to analyze non-textual elements in mathematics textbooks. Despite the importance of visual representations in teaching and learning, little effort has been made to specify in any…
ERIC Educational Resources Information Center
Markley, O. W.
The primary objective of this study is to develop a systems-oriented analytical framework with which to better understand how formal policies serve as regulatory influences on knowledge production and utilization (KPU) in education. When completed, the framework being developed should be able to organize information about the KPU system and its…
ERIC Educational Resources Information Center
Danielsson, Anna T.; Berge, Maria; Lidar, Malena
2018-01-01
The purpose of this paper is to develop and illustrate an analytical framework for exploring how relations between knowledge and power are constituted in science and technology classrooms. In addition, the empirical purpose of this paper is to explore how disciplinary knowledge and knowledge-making are constituted in teacher-student interactions.…
David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert
2011-01-01
In this paper we review progress towards the implementation of a risk management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...
What is Informal Learning and What are its Antecedents? An Integrative and Meta-Analytic Review
2014-07-01
formal training. Unfortunately, theory and research surrounding informal learning remains fragmented. Given that there has been little systematic...future-oriented. Applying this framework, the construct domain of informal learning in organizations is articulated. Second, an interactionist theory ...theoretical framework and outline an agenda for future theory development, research, and application of informal learning principles in organizations
ERIC Educational Resources Information Center
Holman, Gareth; Kohlenberg, Robert J.; Tsai, Mavis; Haworth, Kevin; Jacobson, Emily; Liu, Sarah
2012-01-01
Depression and cigarette smoking are recurrent, interacting problems that co-occur at high rates and--especially when depression is chronic--are difficult to treat and associated with costly health consequences. In this paper we present an integrative therapeutic framework for concurrent treatment of these problems based on evidence-based…
Network control principles predict neuron function in the Caenorhabditis elegans connectome
Chew, Yee Lian; Walker, Denise S.; Schafer, William R.; Barabási, Albert-László
2017-01-01
Recent studies on the controllability of complex systems offer a powerful mathematical framework to systematically explore the structure-function relationship in biological, social and technological networks. Despite theoretical advances, we lack direct experimental proof of the validity of these widely used control principles. Here we fill this gap by applying a control framework to the connectome of the nematode C. elegans, allowing us to predict the involvement of each C. elegans neuron in locomotor behaviours. We predict that control of the muscles or motor neurons requires twelve neuronal classes, which include neuronal groups previously implicated in locomotion by laser ablation, as well as one previously uncharacterised neuron, PDB. We validate this prediction experimentally, finding that the ablation of PDB leads to a significant loss of dorsoventral polarity in large body bends. Importantly, control principles also allow us to investigate the involvement of individual neurons within each neuronal class. For example, we predict that, within the class of DD motor neurons, only three (DD04, DD05, or DD06) should affect locomotion when ablated individually. This prediction is also confirmed, with single-cell ablations of DD04 or DD05, but not DD02 or DD03, specifically affecting posterior body movements. Our predictions are robust to deletions of weak connections, missing connections, and rewired connections in the current connectome, indicating the potential applicability of this analytical framework to larger and less well-characterised connectomes. PMID:29045391
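The driver-node computation that underlies this class of control framework reduces to a maximum matching on a bipartite representation of the directed network (Liu, Slotine and Barabási, 2011). A minimal sketch, assuming networkx and a toy edge list rather than the actual C. elegans wiring:

```python
import networkx as nx

def n_driver_nodes(edges, nodes):
    """Minimum number of driver nodes of a directed network, via the
    maximum-matching result of structural controllability (Liu et al., 2011)."""
    B = nx.Graph()
    out_side = [("out", u) for u in nodes]
    B.add_nodes_from(out_side, bipartite=0)
    B.add_nodes_from([("in", v) for v in nodes], bipartite=1)
    B.add_edges_from((("out", u), ("in", v)) for u, v in edges)
    matching = nx.bipartite.hopcroft_karp_matching(B, top_nodes=out_side)
    matched_edges = len(matching) // 2   # the dict lists both directions
    return max(len(nodes) - matched_edges, 1)

# A directed chain is controllable from its head node alone.
print(n_driver_nodes([(1, 2), (2, 3), (3, 4)], nodes=[1, 2, 3, 4]))  # -> 1
```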
Langton, Julia M; Wong, Sabrina T; Johnston, Sharon; Abelson, Julia; Ammi, Mehdi; Burge, Fred; Campbell, John; Haggerty, Jeannie; Hogg, William; Wodchis, Walter P; McGrail, Kimberlyn
2016-11-01
Primary care services form the foundation of modern healthcare systems, yet the breadth and complexity of services and diversity of patient populations may present challenges for creating comprehensive primary care information systems. Our objective is to develop regional-level information on the performance of primary care in Canada. A scoping review was conducted to identify existing initiatives in primary care performance measurement and reporting across 11 countries. The results of this review were used by our international team of primary care researchers and clinicians to propose an approach for regional-level primary care reporting. We found a gap between conceptual primary care performance measurement frameworks in the peer-reviewed literature and real-world primary care performance measurement and reporting activities. We did not find a conceptual framework or analytic approach that could readily form the foundation of a regional-level primary care information system. Therefore, we propose an approach to reporting comprehensive and actionable performance information according to widely accepted core domains of primary care as well as different patient population groups. An approach that bridges the gap between conceptual frameworks and real-world performance measurement and reporting initiatives could address some of the potential pitfalls of existing ways of presenting performance information (i.e., by single diseases or by age). This approach could produce meaningful and actionable information on the quality of primary care services. Copyright © 2016 Longwoods Publishing.
NASA Astrophysics Data System (ADS)
Jeuland, Marc; Whittington, Dale
2014-03-01
This article presents a methodology for planning new water resources infrastructure investments and operating strategies in a world of climate change uncertainty. It combines a real options (e.g., options to defer, expand, contract, abandon, switch use, or otherwise alter a capital investment) approach with principles drawn from robust decision-making (RDM). RDM comprises a class of methods that are used to identify investment strategies that perform relatively well, compared to the alternatives, across a wide range of plausible future scenarios. Our proposed framework relies on a simulation model that includes linkages between climate change and system hydrology, combined with sensitivity analyses that explore how economic outcomes of investments in new dams vary with forecasts of changing runoff and other uncertainties. To demonstrate the framework, we consider the case of new multipurpose dams along the Blue Nile in Ethiopia. We model flexibility in design and operating decisions—the selection, sizing, and sequencing of new dams, and reservoir operating rules. Results show that there is no single investment plan that performs best across a range of plausible future runoff conditions. The decision-analytic framework is then used to identify dam configurations that are both robust to poor outcomes and sufficiently flexible to capture high upside benefits if favorable future climate and hydrological conditions should arise. The approach could be extended to explore design and operating features of development and adaptation projects other than dams.
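The robustness comparison at the core of RDM can be illustrated with a regret calculation over simulated scenarios. The sketch below uses a hypothetical net-benefit table for four dam configurations; the design names and distributions are invented for illustration, not taken from the Blue Nile case study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical net benefits: rows = dam configurations, cols = runoff scenarios.
designs = ["small", "medium", "large", "staged"]
net_benefit = rng.normal(loc=[[2.0], [3.0], [3.5], [3.2]],
                         scale=[[0.3], [0.8], [1.6], [0.7]],
                         size=(4, 1000))

best_per_scenario = net_benefit.max(axis=0)
regret = best_per_scenario - net_benefit   # shortfall vs. the best choice per scenario

for name, r, b in zip(designs, regret, net_benefit):
    # Robustness: 90th-percentile regret; upside: 90th-percentile benefit.
    print(f"{name:7s} p90 regret = {np.percentile(r, 90):.2f}   "
          f"p90 benefit = {np.percentile(b, 90):.2f}")
```

A design like "staged" can dominate on the regret criterion while a "large" design retains the highest upside, which is exactly the robustness-versus-flexibility trade-off the article explores.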
A framework for analyzing contagion in assortative banking networks
Hurd, Thomas R.; Gleeson, James P.; Melnik, Sergey
2017-01-01
We introduce a probabilistic framework that represents stylized banking networks with the aim of predicting the size of contagion events. Most previous work on random financial networks assumes independent connections between banks, whereas our framework explicitly allows for (dis)assortative edge probabilities (i.e., a tendency for small banks to link to large banks). We analyze default cascades triggered by shocking the network and find that the cascade can be understood as an explicit iterated mapping on a set of edge probabilities that converges to a fixed point. We derive a cascade condition, analogous to the basic reproduction number R0 in epidemic modelling, that characterizes whether or not a single initially defaulted bank can trigger a cascade that extends to a finite fraction of the infinite network. This cascade condition is an easily computed measure of the systemic risk inherent in a given banking network topology. We use percolation theory for random networks to derive a formula for the frequency of global cascades. These analytical results are shown to provide limited quantitative agreement with Monte Carlo simulation studies of finite-sized networks. We show that edge-assortativity, the propensity of nodes to connect to similar nodes, can have a strong effect on the level of systemic risk as measured by the cascade condition. However, the effect of assortativity on systemic risk is subtle, and we propose a simple graph theoretic quantity, which we call the graph-assortativity coefficient, that can be used to assess systemic risk. PMID:28231324
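The fixed-point character of such cascades can be reproduced in a few lines. The sketch below iterates a mean-field default probability on a random network with Poisson degrees; it is a simplified analogue of the authors' iterated edge-probability mapping, not their exact model, and all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import binom, poisson

def cascade_fixed_point(z=5.0, theta=0.3, seed=0.01, kmax=40, tol=1e-10):
    """Mean-field fixed point for the expected default fraction in a random
    network with Poisson(z) degrees, where a bank defaults once a fraction
    >= theta of its counterparties has defaulted."""
    k = np.arange(kmax + 1)
    pk = poisson.pmf(k, z)
    thresh = np.ceil(theta * k)
    p = seed
    while True:
        fail_k = binom.sf(thresh - 1, k, p)   # P(Bin(k, p) >= thresh)
        fail_k[0] = 0.0                       # isolated banks never default
        p_new = seed + (1.0 - seed) * np.sum(pk * fail_k)
        if abs(p_new - p) < tol:
            return p_new
        p = p_new

# A small seed dies out at a high threshold but triggers a near-global
# cascade at a low one (values illustrative).
print(cascade_fixed_point(theta=0.5), cascade_fixed_point(theta=0.15))
```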
Network control principles predict neuron function in the Caenorhabditis elegans connectome
NASA Astrophysics Data System (ADS)
Yan, Gang; Vértes, Petra E.; Towlson, Emma K.; Chew, Yee Lian; Walker, Denise S.; Schafer, William R.; Barabási, Albert-László
2017-10-01
Recent studies on the controllability of complex systems offer a powerful mathematical framework to systematically explore the structure-function relationship in biological, social, and technological networks. Despite theoretical advances, we lack direct experimental proof of the validity of these widely used control principles. Here we fill this gap by applying a control framework to the connectome of the nematode Caenorhabditis elegans, allowing us to predict the involvement of each C. elegans neuron in locomotor behaviours. We predict that control of the muscles or motor neurons requires 12 neuronal classes, which include neuronal groups previously implicated in locomotion by laser ablation, as well as one previously uncharacterized neuron, PDB. We validate this prediction experimentally, finding that the ablation of PDB leads to a significant loss of dorsoventral polarity in large body bends. Importantly, control principles also allow us to investigate the involvement of individual neurons within each neuronal class. For example, we predict that, within the class of DD motor neurons, only three (DD04, DD05, or DD06) should affect locomotion when ablated individually. This prediction is also confirmed; single cell ablations of DD04 or DD05 specifically affect posterior body movements, whereas ablations of DD02 or DD03 do not. Our predictions are robust to deletions of weak connections, missing connections, and rewired connections in the current connectome, indicating the potential applicability of this analytical framework to larger and less well-characterized connectomes.
Langton, Julia M.; Wong, Sabrina T.; Johnston, Sharon; Abelson, Julia; Ammi, Mehdi; Burge, Fred; Campbell, John; Haggerty, Jeannie; Hogg, William; Wodchis, Walter P.
2016-01-01
Objective: Primary care services form the foundation of modern healthcare systems, yet the breadth and complexity of services and diversity of patient populations may present challenges for creating comprehensive primary care information systems. Our objective is to develop regional-level information on the performance of primary care in Canada. Methods: A scoping review was conducted to identify existing initiatives in primary care performance measurement and reporting across 11 countries. The results of this review were used by our international team of primary care researchers and clinicians to propose an approach for regional-level primary care reporting. Results: We found a gap between conceptual primary care performance measurement frameworks in the peer-reviewed literature and real-world primary care performance measurement and reporting activities. We did not find a conceptual framework or analytic approach that could readily form the foundation of a regional-level primary care information system. Therefore, we propose an approach to reporting comprehensive and actionable performance information according to widely accepted core domains of primary care as well as different patient population groups. Conclusions: An approach that bridges the gap between conceptual frameworks and real-world performance measurement and reporting initiatives could address some of the potential pitfalls of existing ways of presenting performance information (i.e., by single diseases or by age). This approach could produce meaningful and actionable information on the quality of primary care services. PMID:28032823
A framework for analyzing contagion in assortative banking networks.
Hurd, Thomas R; Gleeson, James P; Melnik, Sergey
2017-01-01
We introduce a probabilistic framework that represents stylized banking networks with the aim of predicting the size of contagion events. Most previous work on random financial networks assumes independent connections between banks, whereas our framework explicitly allows for (dis)assortative edge probabilities (i.e., a tendency for small banks to link to large banks). We analyze default cascades triggered by shocking the network and find that the cascade can be understood as an explicit iterated mapping on a set of edge probabilities that converges to a fixed point. We derive a cascade condition, analogous to the basic reproduction number R0 in epidemic modelling, that characterizes whether or not a single initially defaulted bank can trigger a cascade that extends to a finite fraction of the infinite network. This cascade condition is an easily computed measure of the systemic risk inherent in a given banking network topology. We use percolation theory for random networks to derive a formula for the frequency of global cascades. These analytical results are shown to provide limited quantitative agreement with Monte Carlo simulation studies of finite-sized networks. We show that edge-assortativity, the propensity of nodes to connect to similar nodes, can have a strong effect on the level of systemic risk as measured by the cascade condition. However, the effect of assortativity on systemic risk is subtle, and we propose a simple graph theoretic quantity, which we call the graph-assortativity coefficient, that can be used to assess systemic risk.
Network control principles predict neuron function in the Caenorhabditis elegans connectome.
Yan, Gang; Vértes, Petra E; Towlson, Emma K; Chew, Yee Lian; Walker, Denise S; Schafer, William R; Barabási, Albert-László
2017-10-26
Recent studies on the controllability of complex systems offer a powerful mathematical framework to systematically explore the structure-function relationship in biological, social, and technological networks. Despite theoretical advances, we lack direct experimental proof of the validity of these widely used control principles. Here we fill this gap by applying a control framework to the connectome of the nematode Caenorhabditis elegans, allowing us to predict the involvement of each C. elegans neuron in locomotor behaviours. We predict that control of the muscles or motor neurons requires 12 neuronal classes, which include neuronal groups previously implicated in locomotion by laser ablation, as well as one previously uncharacterized neuron, PDB. We validate this prediction experimentally, finding that the ablation of PDB leads to a significant loss of dorsoventral polarity in large body bends. Importantly, control principles also allow us to investigate the involvement of individual neurons within each neuronal class. For example, we predict that, within the class of DD motor neurons, only three (DD04, DD05, or DD06) should affect locomotion when ablated individually. This prediction is also confirmed; single cell ablations of DD04 or DD05 specifically affect posterior body movements, whereas ablations of DD02 or DD03 do not. Our predictions are robust to deletions of weak connections, missing connections, and rewired connections in the current connectome, indicating the potential applicability of this analytical framework to larger and less well-characterized connectomes.
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
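The combined analytic/numerical integration idea can be sketched as follows: model the log hazard as a curve that becomes linear in log time beyond the boundary knots (the defining property of restricted cubic splines), integrate numerically inside the knots, and use the closed-form power-law integral in the tail. The coefficients and knots below are illustrative, not from the paper or its Stata software.

```python
import numpy as np
from scipy.integrate import quad

# Toy model: log h(t) = b0 + b1*x + b2*x^2 with x = log t between the knots,
# continued linearly (in x) beyond them. All values are illustrative.
b0, b1, b2 = -1.0, 0.5, 0.15
x_lo, x_hi = np.log(0.5), np.log(5.0)

def log_hazard(t):
    x = np.log(t)
    if x < x_lo:
        return b0 + b1*x_lo + b2*x_lo**2 + (b1 + 2*b2*x_lo) * (x - x_lo)
    if x > x_hi:
        return b0 + b1*x_hi + b2*x_hi**2 + (b1 + 2*b2*x_hi) * (x - x_hi)
    return b0 + b1*x + b2*x**2

def cum_hazard(t):
    """Numeric quadrature up to the upper knot; analytic power-law tail beyond,
    where h(t) = exp(a) * t**b integrates in closed form."""
    H, _ = quad(lambda s: np.exp(log_hazard(s)), 0.0,
                min(t, np.exp(x_hi)), limit=200)
    if t > np.exp(x_hi):
        b = b1 + 2*b2*x_hi                       # log-log slope of the tail
        a = b0 + b1*x_hi + b2*x_hi**2 - b*x_hi   # intercept of the linear tail
        t_hi = np.exp(x_hi)
        H += np.exp(a) * (t**(b + 1) - t_hi**(b + 1)) / (b + 1)
    return H

print("S(8.0) =", np.exp(-cum_hazard(8.0)))
```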
Argumentation in Science Education: A Model-based Framework
NASA Astrophysics Data System (ADS)
Böttcher, Florian; Meisert, Anke
2011-02-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second more complex argumentative sequence will also be analysed according to the proposed analytical scheme to give a broader impression of its potential in practical use.
2010-10-22
Enhanced Systemic Understanding of the Information Environment in Complex Crisis Management: Analytical Concept, Version 1.0
...multinational crisis management and the security sector about the significance and characteristics of the information environment. The framework is...
ERIC Educational Resources Information Center
Lamb, Theodore A.; Chin, Keric B. O.
This paper proposes a conceptual framework based on different levels of analysis using the metaphor of the layers of an onion to help organize and structure thinking on research issues concerning training. It discusses the core of the "analytic onion," the biological level, and seven levels of analysis that surround that core: the individual, the…
ERIC Educational Resources Information Center
Monroy, Carlos; Rangel, Virginia Snodgrass; Whitaker, Reid
2014-01-01
In this paper, we discuss a scalable approach for integrating learning analytics into an online K-12 science curriculum. A description of the curriculum and the underlying pedagogical framework is followed by a discussion of the challenges to be tackled as part of this integration. We include examples of data visualization based on teacher usage…
ERIC Educational Resources Information Center
Teplovs, Chris
2015-01-01
This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Min, Jonghwan; Pua, Rizza; Cho, Seungryong
Purpose: A beam-blocker composed of multiple strips is a useful gadget for scatter correction and/or for dose reduction in cone-beam CT (CBCT). However, the use of such a beam-blocker would yield cone-beam data that can be challenging for accurate image reconstruction from a single scan in the filtered-backprojection framework. The focus of the work was to develop an analytic image reconstruction method for CBCT that can be directly applied to partially blocked cone-beam data in conjunction with the scatter correction. Methods: The authors developed a rebinned backprojection-filtration (BPF) algorithm for reconstructing images from the partially blocked cone-beam data in a circular scan. The authors also proposed a beam-blocking geometry considering data redundancy such that an efficient scatter estimate can be acquired and sufficient data for BPF image reconstruction can be secured at the same time from a single scan without using any blocker motion. Additionally, a scatter correction method and a noise reduction scheme were developed. The authors have performed both simulation and experimental studies to validate the rebinned BPF algorithm for image reconstruction from partially blocked cone-beam data. Quantitative evaluations of the reconstructed image quality were performed in the experimental studies. Results: The simulation study revealed that the developed reconstruction algorithm successfully reconstructs images from the partial cone-beam data. In the experimental study, the proposed method effectively corrected for the scatter in each projection and reconstructed scatter-corrected images from a single scan. A reduction of cupping artifacts and an enhancement of the image contrast have been demonstrated: the image contrast increased by a factor of about 2, and the image accuracy in terms of root-mean-square error with respect to the fan-beam CT image improved by more than 30%. Conclusions: The authors have successfully demonstrated that the proposed scanning method and image reconstruction algorithm can effectively estimate the scatter in cone-beam projections and produce tomographic images of nearly scatter-free quality. The authors believe that the proposed method would provide a fast and efficient CBCT scanning option for various applications, particularly including head-and-neck scans.
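The scatter-estimation idea (the signal measured behind the blocker strips is mostly scatter, which can be interpolated across the open detector regions and subtracted) can be sketched in a few lines. This 1-D toy omits the rebinned BPF reconstruction itself, and the strip layout is invented.

```python
import numpy as np

def scatter_correct(proj, blocked_mask):
    """Estimate scatter in each projection row from the signal behind the
    beam-blocker strips and subtract it (1-D interpolation along the row)."""
    cols = np.arange(proj.shape[-1])
    scatter = np.empty_like(proj)
    for i, row in enumerate(proj):
        # Behind an ideal blocker the primary beam is removed, so the
        # measured signal there approximates the scatter floor.
        scatter[i] = np.interp(cols, cols[blocked_mask], row[blocked_mask])
    return np.clip(proj - scatter, 0.0, None)

proj = np.random.default_rng(0).uniform(0.5, 1.0, size=(4, 64))
mask = np.zeros(64, dtype=bool)
mask[::8] = True                     # hypothetical blocker strip positions
corrected = scatter_correct(proj, mask)
```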
Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S
2015-01-01
Objective: To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). Materials and Methods: In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. Results: A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. Conclusions: The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. PMID:25324556
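The allocation rules described (unit labor costs spread over patient-hours, medications at acquisition cost) map naturally onto a grouped join. A minimal sketch with invented numbers, not the VDO schema:

```python
import pandas as pd

# Hypothetical encounter-level cost allocation in the spirit of VDO.
stays = pd.DataFrame({"encounter": [1, 2, 3],
                      "unit": ["ICU", "ICU", "Ward"],
                      "hours": [10.0, 30.0, 24.0]})
unit_cost_per_hour = {"ICU": 120.0, "Ward": 35.0}
meds = pd.DataFrame({"encounter": [1, 3], "acq_cost": [250.0, 40.0]})

# Labor: hours in the unit times the unit's hourly cost.
stays["labor_cost"] = stays["hours"] * stays["unit"].map(unit_cost_per_hour)

# Total encounter cost = allocated labor + medication acquisition cost.
total = (stays.groupby("encounter")["labor_cost"].sum()
              .add(meds.set_index("encounter")["acq_cost"], fill_value=0.0))
print(total)
```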
Surface Analysis of Nerve Agent Degradation Products by ...
This sampling and analytical procedure was developed and applied by a single laboratory to investigate nerve agent degradation products, which may persist at a contaminated site, via surface wiping followed by analytical characterization. The performance data presented demonstrate the fitness-for-purpose regarding surface analysis in that single laboratory. Surfaces (laminate, glass, galvanized steel, vinyl tile, painted drywall and treated wood) were wiped with cotton gauze wipes, sonicated, extracted with distilled water, and filtered. Samples were analyzed with direct injection electrospray ionization liquid chromatography tandem mass spectrometry (ESI-LC/MS/MS) without derivatization. Detection limit data were generated for all analytes of interest on a laminate surface. Accuracy and precision data were generated from each surface fortified with these analytes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Jesse D.M.
In the United States overall electrical generation capacity is expected to increase by 10-25 gigawatts (GW) per year to meet increases in demand. Wind energy is a key component of state and federal renewable energy standards, and central to the Department of Energy’s 20% by 2030 wind production goals. Increased wind energy development may present increased resource conflict with avian wildlife, and environmental permitting has been identified as a potential obstacle to expansion in the sector. ICF developed an analytical framework to help applicants and agencies examine potential impacts in support of facility siting and permitting. A key objective of our work was to develop a framework that is scalable from the local to the national level, and one that is generalizable across the different scales at which biological communities operate – from local influences to meta-populations. The intent was to allow natural resource managers to estimate the cumulative impacts of turbine strikes and habitat changes on long-term population performance in the context of a species' demography, genetic potential, and life history. We developed three types of models based on our literature review and participation in the scientific review processes. First, the conceptual model was developed as a general description of the analytical framework. Second, we developed the analytical framework based on the relationships between concepts, and the functions presented in the scientific literature. Third, we constructed an application of the model by parameterizing the framework using data from and relevant to the Altamont Pass Wind Resource Area (APWRA), and an existing golden eagle population model. We developed managed source code, database create statements, and written documentation to allow for the reproduction of each phase of the analysis. ICF identified a potential template adaptive management system in the form of the US Fish & Wildlife Service (USFWS) Adaptive Harvest Management (AHM) program, and developed recommendations for the structure and function of a similar wind-facility related program. We provided a straw-man implementation of the analytical framework based on assumptions for APWRA-wide golden eagle fatalities, and presented a statistical examination of the model performance. APWRA-wide fatality rates appear substantial at all scales examined from the local APWRA population to the Bird Conservation Region. Documented fatality rates significantly influenced population performance in terms of non-territorial non-breeding birds. Breeder, Juvenile, Subadult, and Adult abundance were mostly unaffected by Baseline APWRA-wide fatality rates. However, increased variability in fatality rates would likely have impacts on long-term population performance, and would result in a substantially larger loss of resources. We developed four recommendations for future study. First, we recommend establishment of concept experts through the existing system of non-profits, regulatory agencies, academia, and industry in the wind energy sector. Second, we recommend the development of a central or distributed shared data repository, and the establishment of guidelines for data sharing and transparency. Third, we recommend the development of a forum and process for model selection at the local and national level. Last, we recommend experimental implementation of the prescribed system at broader scales, and refinement of the expectations for modeling and adaptive management.
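Frameworks of this kind typically couple fatality rates to a stage-structured projection matrix. A minimal sketch of that coupling, with invented demographic rates rather than the APWRA golden eagle parameterization:

```python
import numpy as np

# Stage-structured (juvenile, subadult, adult) projection with an added
# turbine-strike mortality term; all rates are illustrative.
survival = np.array([0.70, 0.80, 0.90])      # baseline annual survival
fecundity = np.array([0.00, 0.10, 0.45])     # female offspring per female
strike_rate = np.array([0.02, 0.02, 0.04])   # added turbine-strike mortality

def project(n0, years):
    A = np.zeros((3, 3))
    A[0, :] = fecundity
    s = survival * (1.0 - strike_rate)
    A[1, 0] = s[0]          # juveniles surviving to subadult
    A[2, 1] = s[1]          # subadults surviving to adult
    A[2, 2] = s[2]          # adult survival
    n = np.array(n0, dtype=float)
    for _ in range(years):
        n = A @ n
    # Dominant eigenvalue approximates the long-term growth rate lambda.
    return n, np.max(np.abs(np.linalg.eigvals(A)))

n, lam = project([50, 30, 120], years=20)
print(n.round(1), "lambda =", round(lam, 3))
```

Running such a projection across a distribution of strike rates is one way to reproduce the finding that increased variability in fatality rates, not just their mean, degrades long-term population performance.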
A cyber-event correlation framework and metrics
NASA Astrophysics Data System (ADS)
Kang, Myong H.; Mayfield, Terry
2003-08-01
In this paper, we propose a cyber-event fusion, correlation, and situation assessment framework that, when instantiated, will allow cyber defenders to better understand the local, regional, and global cyber-situation. This framework, with associated metrics, can be used to guide assessment of our existing cyber-defense capabilities, and to help evaluate the state of cyber-event correlation research and where we must focus our future cyber-event correlation research. The framework, based on the cyber-event gathering activities and analysis functions, consists of five operational steps, each of which provides a richer set of contextual information to support greater situational understanding. The first three steps are categorically depicted as increasingly richer and broader-scoped contexts achieved through correlation activity, while in the final two steps, these richer contexts are achieved through analytical activities (situation assessment, and threat analysis & prediction). Category 1 Correlation focuses on the detection of suspicious activities and the correlation of events from a single cyber-event source. Category 2 Correlation clusters the same or similar events from multiple detectors that are located at close proximity and prioritizes them. Finally, the events from different time periods and event sources at different location/regions are correlated at Category 3 to recognize the relationship among different events. This is the category that focuses on the detection of large-scale and coordinated attacks. The situation assessment step (Category 4) focuses on the assessment of cyber asset damage and the analysis of the impact on missions. The threat analysis and prediction step (Category 5) analyzes attacks based on attack traces and predicts the next steps. Metrics that can distinguish correlation and cyber-situation assessment tools for each category are also proposed.
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
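The validation strategy of checking analytic derivatives against finite-difference approximations looks like the following on a toy function (not the CEA thermodynamic equations):

```python
import numpy as np

# Compare an analytic derivative with central finite differences,
# mirroring the paper's validation strategy on a simple function.
def f(x):
    return np.exp(x) * np.sin(x)

def dfdx(x):
    return np.exp(x) * (np.sin(x) + np.cos(x))   # analytic derivative

x = 1.3
for h in (1e-2, 1e-4, 1e-6):
    fd = (f(x + h) - f(x - h)) / (2 * h)         # central difference
    print(f"h={h:.0e}  FD error = {abs(fd - dfdx(x)):.2e}")
```

The shrinking error with step size confirms the analytic expression; in an optimization loop the analytic route avoids the repeated function evaluations that make finite differencing slow, which is the speed advantage the paper reports.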
Transcritical flow of a stratified fluid over topography: analysis of the forced Gardner equation
NASA Astrophysics Data System (ADS)
Kamchatnov, A. M.; Kuo, Y.-H.; Lin, T.-C.; Horng, T.-L.; Gou, S.-C.; Clift, R.; El, G. A.; Grimshaw, R. H. J.
2013-12-01
Transcritical flow of a stratified fluid past a broad localised topographic obstacle is studied analytically in the framework of the forced extended Korteweg–de Vries (eKdV), or Gardner, equation. We consider both possible signs for the cubic nonlinear term in the Gardner equation corresponding to different fluid density stratification profiles. We identify the range of the input parameters: the oncoming flow speed (the Froude number) and the topographic amplitude, for which the obstacle supports a stationary localised hydraulic transition from the subcritical flow upstream to the supercritical flow downstream. Such a localised transcritical flow is resolved back into the equilibrium flow state away from the obstacle with the aid of unsteady coherent nonlinear wave structures propagating upstream and downstream. Along with the regular, cnoidal undular bores occurring in the analogous problem for the single-layer flow modeled by the forced KdV equation, the transcritical internal wave flows support a diverse family of upstream and downstream wave structures, including solibores, rarefaction waves, reversed and trigonometric undular bores, which we describe using the recent development of the nonlinear modulation theory for the (unforced) Gardner equation. The predictions of the developed analytic construction are confirmed by direct numerical simulations of the forced Gardner equation for a broad range of input parameters.
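For reference, the forced Gardner (extended KdV) equation studied in this setting has the schematic form below; the exact normalization of the coefficients varies between papers, so this is indicative rather than the authors' precise equation.

```latex
% Schematic forced Gardner equation: u(x,t) is the interface displacement,
% \Delta the detuning of the flow speed from criticality (Froude-number
% detuning), f(x) the localized topography, and the sign of the cubic term
% depends on the stratification profile.
\[
  u_t + \Delta\, u_x + 6\, u u_x \pm 6\, u^2 u_x + u_{xxx} = f_x(x)
\]
```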
High resolution x-ray CMT: Reconstruction methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, J.K.
This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
Semi-analytic valuation of stock loans with finite maturity
NASA Astrophysics Data System (ADS)
Lu, Xiaoping; Putri, Endah R. M.
2015-10-01
In this paper we study stock loans of finite maturity with different dividend distributions semi-analytically using the analytical approximation method in Zhu (2006). Stock loan partial differential equations (PDEs) are established under the Black-Scholes framework. The Laplace transform method is used to solve the PDEs. The optimal exit price and stock loan value are obtained in Laplace space, and values in the original time space are recovered by numerical Laplace inversion. To demonstrate the efficiency and accuracy of our semi-analytic method, several examples are presented and the results are compared with those calculated using existing methods. We also present a calculation of the fair service fee charged by the lender for different loan parameters.
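The final step, numerical Laplace inversion, can be reproduced with standard tooling. The sketch below uses mpmath's Stehfest-type inverter on a transform with a known inverse, rather than the stock-loan value function itself:

```python
import mpmath as mp

# Known pair for validation: F(s) = 1/(s + a)  <-->  f(t) = exp(-a*t).
a = 0.5
F = lambda s: 1 / (s + a)

for t in (0.5, 1.0, 2.0):
    approx = mp.invertlaplace(F, t, method='stehfest')
    exact = mp.exp(-a * t)
    print(t, approx, exact)
```

Once validated on known pairs, the same inverter is applied pointwise in t to the Laplace-space value function to recover prices in the original time domain.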
Fiber-reinforced materials: finite elements for the treatment of the inextensibility constraint
NASA Astrophysics Data System (ADS)
Auricchio, Ferdinando; Scalet, Giulia; Wriggers, Peter
2017-12-01
The present paper proposes a numerical framework for the analysis of problems involving fiber-reinforced anisotropic materials. Specifically, isotropic linear elastic solids, reinforced by a single family of inextensible fibers, are considered. The kinematic constraint equation of inextensibility in the fiber direction leads to the presence of an undetermined fiber stress in the constitutive equations. To avoid locking phenomena in the numerical solution due to the presence of the constraint, mixed finite elements based on the Lagrange multiplier, perturbed Lagrangian, and penalty method are proposed. Several boundary-value problems under plane strain conditions are solved and numerical results are compared to analytical solutions, whenever the derivation is possible. The simulations performed allow us to assess the performance of the proposed finite elements and to discuss several features of the developed formulations concerning the effective approximation of the displacement and fiber stress fields, mesh convergence, and sensitivity to penalty parameters.
Fretter, Christoph; Lesne, Annick; Hilgetag, Claus C.; Hütt, Marc-Thorsten
2017-01-01
Simple models of excitable dynamics on graphs are an efficient framework for studying the interplay between network topology and dynamics. This topic is of practical relevance to diverse fields, ranging from neuroscience to engineering. Here we analyze how a single excitation propagates through a random network as a function of the excitation threshold, that is, the relative amount of activity in the neighborhood required for the excitation of a node. We observe that two sharp transitions delineate a region of sustained activity. Using analytical considerations and numerical simulation, we show that these transitions originate from the presence of barriers to propagation and the excitation of topological cycles, respectively, and can be predicted from the network topology. Our findings are interpreted in the context of network reverberations and self-sustained activity in neural systems, which is a question of long-standing interest in computational neuroscience. PMID:28186182
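The relative-threshold dynamics studied here are easy to reproduce: a susceptible node fires when the excited fraction of its neighbourhood exceeds the threshold, then passes through a refractory step. A minimal simulation on an Erdős–Rényi graph, with illustrative parameters:

```python
import numpy as np
import networkx as nx

def simulate(G, f, steps=50, seed_node=0):
    """Relative-threshold excitable dynamics: a susceptible node (state 0)
    fires next step if the excited fraction of its neighbourhood exceeds f;
    excited (1) becomes refractory (2), then susceptible again. Returns the
    number of excited nodes at each step."""
    A = nx.to_numpy_array(G)
    deg = np.maximum(A.sum(axis=1), 1.0)
    state = np.zeros(G.number_of_nodes(), dtype=int)
    state[seed_node] = 1
    activity = []
    for _ in range(steps):
        excited = (state == 1).astype(float)
        frac = A @ excited / deg
        nxt = np.where((state == 0) & (frac > f), 1, 0)
        nxt[state == 1] = 2          # excited -> refractory
        state = nxt                  # refractory -> susceptible by default
        activity.append(int(excited.sum()))
    return activity

G = nx.erdos_renyi_graph(200, 0.05, seed=1)
print(simulate(G, f=0.10)[:15])   # low threshold: activity can be sustained
print(simulate(G, f=0.60)[:15])   # high threshold: the excitation dies out
```

Sweeping f between these extremes exposes the two sharp transitions that bound the region of sustained activity described in the abstract.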
Spatio-temporal propagation of cascading overload failures in spatially embedded networks
NASA Astrophysics Data System (ADS)
Zhao, Jichang; Li, Daqing; Sanhedrai, Hillel; Cohen, Reuven; Havlin, Shlomo
2016-01-01
Unlike direct contact in epidemic spreading, overload failures propagate through hidden functional dependencies. Many studies have focused on the critical conditions and catastrophic consequences of cascading failures. However, to understand the network vulnerability and mitigate the cascading overload failures, the knowledge of how the failures propagate in time and space is essential but still missing. Here we study the spatio-temporal propagation behaviour of cascading overload failures analytically and numerically on spatially embedded networks. The cascading overload failures are found to spread radially from the centre of the initial failure with an approximately constant velocity. The propagation velocity decreases with increasing tolerance, and can be well predicted by our theoretical framework with one single correction for all the tolerance values. This propagation velocity is found similar in various model networks and real network structures. Our findings may help to predict the dynamics of cascading overload failures in realistic systems.
On the Accuracy of Double Scattering Approximation for Atmospheric Polarization Computations
NASA Technical Reports Server (NTRS)
Korkin, Sergey V.; Lyapustin, Alexei I.; Marshak, Alexander L.
2011-01-01
Interpretation of multi-angle spectro-polarimetric data in remote sensing of atmospheric aerosols requires fast and accurate methods of solving the vector radiative transfer equation (VRTE). The single and double scattering approximations could provide an analytical framework for the inversion algorithms and are relatively fast; however, accuracy assessments of these approximations for aerosol atmospheres in the atmospheric window channels have been missing. This paper provides such an analysis for a vertically homogeneous aerosol atmosphere with weak and strong asymmetry of scattering. In both cases, the double scattering approximation gives a high-accuracy result (relative error approximately 0.2%) only for a low optical path of about 10^-2. As the error rapidly grows with optical thickness, a full VRTE solution is required for practical remote sensing analysis. It is shown that the scattering anisotropy is not important at low optical thicknesses, neither for reflected nor for transmitted polarization components of radiation.
Firing patterns in the adaptive exponential integrate-and-fire model.
Naud, Richard; Marcille, Nicolas; Clopath, Claudia; Gerstner, Wulfram
2008-11-01
For simulations of large spiking neuron networks, an accurate, simple and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting and two types of tonic spiking. Also, we report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to real experiments of cortical neurons under step current stimulation. The results provide support for the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
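The two model equations integrate in a few lines; the sketch below uses forward Euler with the published tonic-spiking parameter set of Brette and Gerstner (2005). Other firing patterns follow by changing a, b, tau_w and the reset voltage.

```python
import numpy as np

def adex(I, T=0.4, dt=1e-4):
    """Forward-Euler integration of the adaptive exponential
    integrate-and-fire model (Brette & Gerstner, 2005)."""
    C, gL, EL = 281e-12, 30e-9, -70.6e-3     # capacitance, leak, rest
    VT, DT = -50.4e-3, 2e-3                  # threshold, slope factor
    a, tau_w, b, Vr = 4e-9, 144e-3, 0.0805e-9, -70.6e-3
    V, w, spikes = EL, 0.0, []
    for i in range(int(T / dt)):
        dV = (-gL*(V - EL) + gL*DT*np.exp((V - VT)/DT) - w + I) / C
        dw = (a*(V - EL) - w) / tau_w
        V, w = V + dt*dV, w + dt*dw
        if V > 0.0:                          # spike: reset and adapt
            V, w = Vr, w + b
            spikes.append(i * dt)
    return spikes

print(len(adex(I=0.5e-9)), "spikes in 0.4 s")
```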
Fretter, Christoph; Lesne, Annick; Hilgetag, Claus C; Hütt, Marc-Thorsten
2017-02-10
Simple models of excitable dynamics on graphs are an efficient framework for studying the interplay between network topology and dynamics. This topic is of practical relevance to diverse fields, ranging from neuroscience to engineering. Here we analyze how a single excitation propagates through a random network as a function of the excitation threshold, that is, the relative amount of activity in the neighborhood required for the excitation of a node. We observe that two sharp transitions delineate a region of sustained activity. Using analytical considerations and numerical simulation, we show that these transitions originate from the presence of barriers to propagation and the excitation of topological cycles, respectively, and can be predicted from the network topology. Our findings are interpreted in the context of network reverberations and self-sustained activity in neural systems, which is a question of long-standing interest in computational neuroscience.
An Epidemiological Model for Examining Marijuana Use over the Life Course
Paddock, Susan M.; Kilmer, Beau; Caulkins, Jonathan P.; Booth, Marika J.; Pacula, Rosalie L.
2012-01-01
Trajectories of drug use are usually studied empirically by following over time persons sampled from either the general population (most often youth and young adults) or from heavy or problematic users (e.g., arrestees or those in treatment). The former, population-based samples, describe early career development, but miss the years of use that generate the greatest social costs. The latter, selected populations, help to summarize the most problematic use, but cannot easily explain how people become problem users nor are they representative of the population as a whole. This paper shows how microsimulation can synthesize both sorts of data within a single analytical framework, while retaining heterogeneous influences that can impact drug use decisions over the life course. The RAND Marijuana Microsimulation Model is constructed for marijuana use, validated, and then used to demonstrate how such models can be used to evaluate alternative policy options aimed at reducing use over the life course. PMID:23236590
Interaction of two walkers: wave-mediated energy and force.
Borghesi, Christian; Moukhtar, Julien; Labousse, Matthieu; Eddi, Antonin; Fort, Emmanuel; Couder, Yves
2014-12-01
A bouncing droplet, self-propelled by its interaction with the waves it generates, forms a classical wave-particle association called a "walker." Previous works have demonstrated that the dynamics of a single walker is driven by its global surface wave field that retains information on its past trajectory. Here we investigate the energy stored in this wave field for two coupled walkers and how it conveys an interaction between them. For this purpose, we characterize experimentally the "promenade modes" where two walkers are bound and propagate together. Their possible binding distances take discrete values, and the velocity of the pair depends on their mutual binding. The mean parallel motion can be either rectilinear or oscillating. The experimental results are recovered analytically with a simple theoretical framework. A relation between the kinetic energy of the droplets and the total energy of the standing waves is established.
Design and Analysis Tool for External-Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.
2012-01-01
A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids useable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies and CFD analyses were performed to verify some of the analysis results.
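A building block of such low-fidelity aerodynamic analysis is the closed-form stagnation-pressure ratio across the terminal normal shock; SUPIN's actual bookkeeping across a multi-shock external-compression system is more involved than this single-shock sketch.

```python
def normal_shock_recovery(M1, gamma=1.4):
    """Stagnation-pressure ratio p02/p01 across a normal shock, a standard
    compressible-flow relation used in estimating inlet total pressure
    recovery (terminal shock of a multi-shock system)."""
    if M1 <= 1.0:
        return 1.0
    t1 = ((gamma + 1) * M1**2 / ((gamma - 1) * M1**2 + 2)) ** (gamma / (gamma - 1))
    t2 = ((gamma + 1) / (2 * gamma * M1**2 - (gamma - 1))) ** (1 / (gamma - 1))
    return t1 * t2

for M in (1.3, 1.6, 2.0):
    print(f"M1 = {M}:  p02/p01 = {normal_shock_recovery(M):.4f}")
```

At Mach 2 this gives about 0.72, which is why external-compression inlets use oblique shocks to decelerate the flow before the terminal normal shock.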
A theoretical basis for the analysis of multiversion software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
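The effect the intensity function captures can be demonstrated by Monte Carlo: correlate version failures through a shared per-input difficulty factor and watch the majority-vote failure probability pull away from the independent baseline. The correlation model below is a crude stand-in, not the paper's formulation.

```python
import numpy as np

def majority_vote_failure(p=0.01, rho=0.0, n_trials=200_000, rng=1):
    """Monte Carlo failure probability of a 3-version majority-vote system.
    Each input draws a common 'difficulty' that scales every version's
    failure probability; rho = 0 recovers independent failures."""
    g = np.random.default_rng(rng)
    difficulty = 1.0 + rho * (g.exponential(1.0, n_trials) - 1.0)  # mean 1
    p_eff = np.clip(p * difficulty, 0.0, 1.0)
    fails = g.random((3, n_trials)) < p_eff    # per-version failures
    return np.mean(fails.sum(axis=0) >= 2)     # majority vote fails

for rho in (0.0, 0.5, 1.0):
    print(f"rho={rho}:  P(majority fails) = {majority_vote_failure(rho=rho):.2e}")

# Independent analytic baseline: 3 p^2 (1-p) + p^3.
print("independent:", 3 * 0.01**2 * 0.99 + 0.01**3)
```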
Spatio-temporal propagation of cascading overload failures in spatially embedded networks
Zhao, Jichang; Li, Daqing; Sanhedrai, Hillel; Cohen, Reuven; Havlin, Shlomo
2016-01-01
Unlike direct contact in epidemic spreading, overload failures propagate through hidden functional dependencies. Many studies have focused on the critical conditions and catastrophic consequences of cascading failures. However, to understand the network vulnerability and mitigate the cascading overload failures, the knowledge of how the failures propagate in time and space is essential but still missing. Here we study the spatio-temporal propagation behaviour of cascading overload failures analytically and numerically on spatially embedded networks. The cascading overload failures are found to spread radially from the centre of the initial failure with an approximately constant velocity. The propagation velocity decreases with increasing tolerance, and can be well predicted by our theoretical framework with one single correction for all the tolerance values. This propagation velocity is found similar in various model networks and real network structures. Our findings may help to predict the dynamics of cascading overload failures in realistic systems. PMID:26754065
Crystal structure optimisation using an auxiliary equation of state
NASA Astrophysics Data System (ADS)
Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron
2015-11-01
Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
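The procedure (fit a known equation of state to a handful of energy-volume points and read off the equilibrium volume) is a short curve fit. The sketch below uses the third-order Birch-Murnaghan form with synthetic data standing in for single-point DFT energies; all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, Bp):
    """Third-order Birch-Murnaghan equation of state E(V)."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9 * V0 * B0 / 16 * ((eta - 1) ** 3 * Bp
                                    + (eta - 1) ** 2 * (6 - 4 * eta))

# Synthetic energy-volume points (arbitrary units) with small noise.
V = np.linspace(35, 55, 9)
E = (birch_murnaghan(V, -10.0, 44.0, 0.6, 4.5)
     + np.random.default_rng(0).normal(0, 1e-4, V.size))

popt, _ = curve_fit(birch_murnaghan, V, E,
                    p0=(E.min(), V[np.argmin(E)], 0.5, 4.0))
print("equilibrium volume V0 =", round(popt[1], 3))
```

In the workflow described above, the fitted V0 from a cheap functional is used to place a single expensive calculation, refined with further single points only if needed.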
NASA Astrophysics Data System (ADS)
Fretter, Christoph; Lesne, Annick; Hilgetag, Claus C.; Hütt, Marc-Thorsten
2017-02-01
Simple models of excitable dynamics on graphs are an efficient framework for studying the interplay between network topology and dynamics. This topic is of practical relevance to diverse fields, ranging from neuroscience to engineering. Here we analyze how a single excitation propagates through a random network as a function of the excitation threshold, that is, the relative amount of activity in the neighborhood required for the excitation of a node. We observe that two sharp transitions delineate a region of sustained activity. Using analytical considerations and numerical simulation, we show that these transitions originate from the presence of barriers to propagation and the excitation of topological cycles, respectively, and can be predicted from the network topology. Our findings are interpreted in the context of network reverberations and self-sustained activity in neural systems, which is a question of long-standing interest in computational neuroscience.
De Neve, Jan-Walter; Boudreaux, Chantelle; Gill, Roopan; Geldsetzer, Pascal; Vaikath, Maria; Bärnighausen, Till; Bossert, Thomas J
2017-07-03
Many countries have created community-based health worker (CHW) programs for HIV. In most of these countries, several national and non-governmental initiatives have been implemented, raising questions of how well these different approaches address the health problems and use health resources in a compatible way. While these questions have led to a general policy initiative to promote harmonization across programs, there is a need for countries to develop a more coherent and organized approach to CHW programs and to generate evidence about the most efficient and effective strategies to ensure their optimal, sustained performance. We conducted a narrative review of the existing published and gray literature on the harmonization of CHW programs. We searched for and noted evidence on definitions, models, and/or frameworks of harmonization; theoretical arguments or hypotheses about the effects of CHW program fragmentation; and empirical evidence. Based on this evidence, we defined harmonization, introduced three priority areas for harmonization, and identified a conceptual framework for analyzing harmonization of CHW programs that can be used to support their expanding role in HIV service delivery. We identified and described the major issues and relationships surrounding the harmonization of CHW programs, including key characteristics, facilitators, and barriers for each of the priority areas of harmonization, and used our analytic framework to map overarching findings. We apply this approach to CHW programs supporting HIV services across four countries in Southern Africa in a separate article. There is a large number and immense diversity of CHW programs for HIV. This includes integration of HIV components into countries' existing national programs along with the development of multiple, stand-alone CHW programs. We defined (i) coordination among stakeholders, (ii) integration into the broader health system, and (iii) assurance of a CHW program's sustainability to be priority areas of harmonization. While harmonization is likely a complex political process, with, in many cases, incremental steps toward improvement, a wide range of facilitators is available to decision-makers. These can be categorized using an analytic framework assessing the (i) health issue, (ii) intervention itself, (iii) stakeholders, (iv) health system, and (v) broad context. There is a need to address fragmentation of CHW programs to advance and sustain CHW roles and responsibilities for HIV. This study provides a narrative review and analytic framework to understand the process by which harmonization of CHW programs might be achieved and to test the assumption that harmonization is needed to improve CHW performance.
Marty, Michael T.; Kuhnline Sloan, Courtney D.; Bailey, Ryan C.; Sligar, Stephen G.
2012-01-01
Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics. PMID:22686186
Marty, Michael T; Sloan, Courtney D Kuhnline; Bailey, Ryan C; Sligar, Stephen G
2012-07-03
Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes, and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics.
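To make the single-step idea concrete, the sketch below assumes a 1:1 Langmuir interaction, dR/dt = ka·C(t)·(Rmax − R) − kd·R, driven by a hypothetical quadratic concentration gradient C(t), and recovers ka and kd from one noisy curve; the gradient shape, noise level, and all values are our assumptions, not the authors' protocol.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

R_MAX = 100.0                                   # saturation response, arbitrary units

def conc(t):
    return 1e-9 * (t / 600.0) ** 2              # hypothetical nonlinear (quadratic) gradient, molar

def binding_curve(t, ka, kd):
    # 1:1 Langmuir model: dR/dt = ka*C(t)*(R_MAX - R) - kd*R
    rhs = lambda tt, r: ka * conc(tt) * (R_MAX - r) - kd * r
    sol = solve_ivp(rhs, (t[0], t[-1]), [0.0], t_eval=t, rtol=1e-8)
    return sol.y[0]

t = np.linspace(0.0, 600.0, 200)
rng = np.random.default_rng(0)
data = binding_curve(t, ka=1e6, kd=1e-3) + rng.normal(0.0, 0.2, t.size)

(ka_fit, kd_fit), _ = curve_fit(binding_curve, t, data, p0=(5e5, 5e-3))
print(f"ka = {ka_fit:.3g} 1/(M s), kd = {kd_fit:.3g} 1/s, KD = {kd_fit / ka_fit:.3g} M")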
Uludağ, Yildiz; Piletsky, Sergey A; Turner, Anthony P F; Cooper, Matthew A
2007-11-01
Biomimetic recognition elements employed for the detection of analytes are commonly based on proteinaceous affibodies, immunoglobulins, single-chain and single-domain antibody fragments or aptamers. The alternative supra-molecular approach using a molecularly imprinted polymer now has proven utility in numerous applications ranging from liquid chromatography to bioassays. Despite inherent advantages compared with biochemical/biological recognition (which include robustness, storage endurance, and lower costs), there are few contributions that describe quantitative analytical applications of molecularly imprinted polymers for relevant small molecular mass compounds in real-world samples. There is, however, significant literature describing the use of low-power, portable piezoelectric transducers to detect analytes in environmental monitoring and other application areas. Here we review the combination of molecularly imprinted polymers as recognition elements with piezoelectric biosensors for quantitative detection of small molecules. Analytes are classified by type and sample matrix presentation, and various molecularly imprinted polymer synthetic fabrication strategies are also reviewed.
Analytic theory of alternate multilayer gratings operating in single-order regime.
Yang, Xiaowei; Kozhevnikov, Igor V; Huang, Qiushi; Wang, Hongchang; Hand, Matthew; Sawhney, Kawal; Wang, Zhanshan
2017-07-10
Using the coupled wave approach (CWA), we introduce the analytical theory for the alternate multilayer grating (AMG) operating in the single-order regime, in which only one diffraction order is excited. Differing from previous studies that analogize the AMG to a crystal, we conclude that a symmetrical structure, i.e., equal thickness of the two multilayer materials, is not the optimal design for an AMG and may result in a significant reduction in diffraction efficiency. The peculiarities of the AMG compared with other multilayer gratings are analyzed, and the influence of the multilayer materials on diffraction efficiency is considered. The validity conditions of the analytical theory are also discussed.
Li, Qike; Schissler, A Grant; Gardeux, Vincent; Achour, Ikbel; Kenost, Colleen; Berghout, Joanne; Li, Haiquan; Zhang, Hao Helen; Lussier, Yves A
2017-05-24
Transcriptome analytic tools are commonly used across patient cohorts to develop drugs and predict clinical outcomes. However, as precision medicine pursues more accurate and individualized treatment decisions, these methods are not designed to address single-patient transcriptome analyses. We previously developed and validated the N-of-1-pathways framework using two methods, Wilcoxon and Mahalanobis Distance (MD), for personal transcriptome analysis derived from a pair of samples of a single patient. Although both methods uncover concordantly dysregulated pathways, they are not designed to detect dysregulated pathways with both up- and down-regulated genes (bidirectional dysregulation), which are ubiquitous in biological systems. We developed N-of-1-pathways MixEnrich, a mixture model followed by a gene set enrichment test, to uncover bidirectionally and concordantly dysregulated pathways one patient at a time. We assess its accuracy in a comprehensive simulation study and in an RNA-Seq data analysis of head and neck squamous cell carcinomas (HNSCCs). In the presence of bidirectionally dysregulated genes in the pathway or of high background noise, MixEnrich substantially outperforms previous single-subject transcriptome analysis methods, both in the simulation study and in the HNSCC data analysis (better ROC curves; higher true positive rates; lower false positive rates). Bidirectionally and concordantly dysregulated pathways uncovered by MixEnrich in each patient largely overlapped with the quasi-gold standard compared to other single-subject and cohort-based transcriptome analyses. The greater performance of MixEnrich presents an advantage over previous methods toward the promise of providing accurate personal transcriptome analysis to support precision medicine at the point of care.
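The core idea can be illustrated with a short sketch (a simplification under our own assumptions, not the published implementation): fit a two-component mixture to the absolute log fold-changes between the patient's paired samples, call the high-magnitude component dysregulated regardless of direction, and test each gene set for enrichment.

import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
# Simulated |log fold change|: 100 dysregulated genes (first block), 900 background.
abs_logfc = np.abs(np.concatenate([rng.normal(2.0, 0.5, 100),
                                   rng.normal(0.0, 0.2, 900)]))

gmm = GaussianMixture(n_components=2, random_state=0).fit(abs_logfc.reshape(-1, 1))
hi = int(np.argmax(gmm.means_.ravel()))                  # high-magnitude component
dysreg = gmm.predict(abs_logfc.reshape(-1, 1)) == hi     # direction-agnostic call

# Enrichment of a hypothetical 50-gene pathway (here, the first 50 genes).
in_path = np.zeros(abs_logfc.size, dtype=bool)
in_path[:50] = True
table = [[int(np.sum(in_path & dysreg)), int(np.sum(in_path & ~dysreg))],
         [int(np.sum(~in_path & dysreg)), int(np.sum(~in_path & ~dysreg))]]
print(fisher_exact(table, alternative="greater"))        # (odds ratio, p-value)

Because the mixture is fit on absolute values, genes moving in opposite directions contribute to the same dysregulated component, which is what captures bidirectional pathways.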
Opportunities for bead-based multiplex assays in veterinary diagnostic laboratories
USDA-ARS?s Scientific Manuscript database
Bead-based multiplex assays (BBMA), also referred to as Luminex, MultiAnalyte Profiling, or cytometric bead array (CBA) assays, are applicable for high-throughput, simultaneous detection of multiple analytes in solution (from several, up to 50-500 analytes within a single, small sample volume). Curren...
Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin
2016-01-01
Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516
Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin
2016-01-01
The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.
Dubský, Pavel; Müllerová, Ludmila; Dvořák, Martin; Gaš, Bohuslav
2015-03-06
The model of electromigration of a multivalent weak acidic/basic/amphoteric analyte that undergoes complexation with a mixture of selectors is introduced. The model provides an extension of the series of models starting with the single-selector model without dissociation by Wren and Rowe in 1992, continuing with the monovalent weak analyte/single-selector model by Rawjee, Williams and Vigh in 1993 and that by Lelièvre in 1994, and ending with the multi-selector overall model without dissociation developed by our group in 2008. The new multivalent analyte multi-selector model shows that the effective mobility of the analyte obeys the original Wren and Rowe's formula. The overall complexation constant, mobility of the free analyte and mobility of complex can be measured and used in a standard way. The mathematical expressions for the overall parameters are provided. We further demonstrate mathematically that the pH dependent parameters for weak analytes can be simply used as an input into the multi-selector overall model and, in reverse, the multi-selector overall parameters can serve as an input into the pH-dependent models for the weak analytes. These findings can greatly simplify rational method development in analytical electrophoresis, specifically enantioseparations. Copyright © 2015 Elsevier B.V. All rights reserved.
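For context, the single-selector expression referred to above is commonly written as follows (a standard statement of the Wren and Rowe model; the symbols here are generic rather than the paper's notation):

\mu_{\mathrm{eff}} = \frac{\mu_{A} + \mu_{AS}\,K\,[S]}{1 + K\,[S]}

where \mu_{A} is the mobility of the free analyte, \mu_{AS} the mobility of the analyte-selector complex, K the complexation constant, and [S] the selector concentration. The result summarized in the abstract is that this same functional form survives in the multivalent, multi-selector case, with overall (mixture- and pH-averaged) parameters taking the place of the single-equilibrium ones.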
Quo vadis, analytical chemistry?
Valcárcel, Miguel
2016-01-01
This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of past and present overall conceptions of our discipline that are not completely accurate, and thus to be avoided, is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms are identified, along with more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.
Initialization and Simulation of Three-Dimensional Aircraft Wake Vortices
NASA Technical Reports Server (NTRS)
Ash, Robert L.; Zheng, Z. C.
1997-01-01
This paper studies the effects of axial velocity profiles on vortex decay in order to properly initialize and simulate three-dimensional wake vortex flow. Analytical relationships are obtained based on a single-vortex model, and computational simulations are performed for a practical vortex wake; the results show that the single-vortex analytical relations can still be applicable at certain streamwise sections of three-dimensional wake vortices.
Cuadros-Rodríguez, Luis; Ruiz-Samblás, Cristina; Valverde-Som, Lucia; Pérez-Castaño, Estefanía; González-Casado, Antonio
2016-02-25
Fingerprinting methods describe a variety of analytical methods that provide analytical signals related to the composition of foodstuffs in a non-selective way, such as by collecting a spectrum or a chromatogram. Mathematical processing of the information in such fingerprints may allow the characterisation and/or authentication of foodstuffs. In this context, the particular meaning of 'fingerprinting', in conjunction with 'profiling', is different from the original meanings used in metabolomics. This fact has produced some confusion with the use of these terms in analytical papers. Researchers coming from the metabolomic field could use 'profiling' or 'fingerprinting' in a different way from researchers who are devoted to food science. The arrival of an eclectic discipline named 'foodomics' has not been enough to allay this terminological problem, since authors keep on using the terms with both meanings. Thus, a first goal of this tutorial is to clarify the difference between both terms. In addition, the chemical approaches for food authentication, i.e., chemical markers, component profiling and instrumental fingerprinting, have been described. A new term, designated as 'food identitation', has been introduced in order to complete the life cycle of the chemical-based food authentication process. Chromatographic fingerprinting has been explained in detail, and some strategies which could be applied have been clarified and discussed. Particularly, the strategies for chromatographic signal acquisition and chromatographic data handling are unified in a single framework. Finally, an overview of the applications of chromatographic (GC and LC) fingerprints in food authentication using different chemometric techniques has been included. Copyright © 2016 Elsevier B.V. All rights reserved.
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
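As a purely illustrative sketch of the coupled, multi-domain idea (all distributions and numbers below are invented for demonstration, not taken from the report), a Monte Carlo model can let maturity uncertainty propagate into performance and then into profitability through shared parameters:

import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def loss_probability(maturity_mean, capture_nominal, cost_nominal):
    # Maturity domain: TRL-like score in (0, 1).
    maturity = rng.beta(10.0 * maturity_mean, 10.0 * (1.0 - maturity_mean), N)
    # Performance domain: capture fraction improves with maturity (shared parameter).
    capture = capture_nominal * (0.8 + 0.2 * maturity) + rng.normal(0.0, 0.02, N)
    # Profitability domain: cost per tonne falls as the technology matures.
    cost = cost_nominal * (1.3 - 0.3 * maturity) + rng.normal(0.0, 2.0, N)
    profit = 70.0 * capture - cost        # toy revenue of $70 per tonne captured
    return float(np.mean(profit < 0.0))

print("solid sorbent :", loss_probability(0.5, 0.90, 45.0))
print("liquid solvent:", loss_probability(0.7, 0.90, 55.0))

Because maturity enters every downstream domain, its uncertainty is shared across the modules rather than sampled independently, which is the essence of the coupled analytical capability described above.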
High precision time calibration of the Permo-Triassic boundary mass extinction by U-Pb geochronology
NASA Astrophysics Data System (ADS)
Baresel, Björn; Bucher, Hugo; Brosse, Morgane; Schaltegger, Urs
2014-05-01
U-Pb dating using Chemical Abrasion, Isotope Dilution Thermal Ionization Mass Spectrometry (CA-ID-TIMS) is the analytical method of choice for geochronologists seeking the highest temporal resolution and a high degree of accuracy for single grains of zircon. The use of double-isotope tracer solutions, cross-calibrated and assessed in different EARTHTIME labs, coinciding with the reassessment of the uranium decay constants and further improvements in ion counting technology, has led to unprecedented precision: better than 0.1% for single-grain ages and 0.05% for population ages. These analytical innovations now allow calibrating magmatic and biological timescales at a resolution adequate for both groups of processes. To construct a revised and high-resolution calibrated time scale for the Permian-Triassic boundary (PTB), we use (i) high-precision U-Pb zircon age determinations of a unique succession of volcanic ash beds interbedded with shallow to deep water fossiliferous sediments in the Nanpanjiang Basin (South China), combined with (ii) accurate quantitative biochronology based on ammonoids and conodonts, and (iii) carbon isotope excursions across the PTB. Using these alignments allows (i) positioning the PTB in different depositional environments and (ii) solving age/stratigraphic contradictions generated by the water depth-controlled index conodont Hindeodus parvus, whose diachronous first occurrences are arbitrarily used for placing the base of the Triassic. This new age framework provides the basis for a combined calibration of chemostratigraphic records with high-resolution biochronozones of the Late Permian and Early Triassic. Besides the general improvement of the radio-isotopic calibration of the PTB at the ±100 ka level, this will also lead to a better understanding of cause and effect relations involved in this mass extinction.
Meta-analytic framework for liquid association.
Wang, Lin; Liu, Silvia; Ding, Ying; Yuan, Shin-Sheng; Ho, Yen-Yi; Tseng, George C
2017-07-15
Although coexpression analysis via pair-wise expression correlation is popularly used to elucidate gene-gene interactions at the whole-genome scale, many complicated multi-gene regulations require more advanced detection methods. Liquid association (LA) is a powerful tool to detect the dynamic correlation of two gene variables depending on the expression level of a third variable (LA scouting gene). LA detection from single transcriptomic study, however, is often unstable and not generalizable due to cohort bias, biological variation and limited sample size. With the rapid development of microarray and NGS technology, LA analysis combining multiple gene expression studies can provide more accurate and stable results. In this article, we proposed two meta-analytic approaches for LA analysis (MetaLA and MetaMLA) to combine multiple transcriptomic studies. To compensate demanding computing, we also proposed a two-step fast screening algorithm for more efficient genome-wide screening: bootstrap filtering and sign filtering. We applied the methods to five Saccharomyces cerevisiae datasets related to environmental changes. The fast screening algorithm reduced 98% of running time. When compared with single study analysis, MetaLA and MetaMLA provided stronger detection signal and more consistent and stable results. The top triplets are highly enriched in fundamental biological processes related to environmental changes. Our method can help biologists understand underlying regulatory mechanisms under different environmental exposure or disease states. A MetaLA R package, data and code for this article are available at http://tsenglab.biostat.pitt.edu/software.htm. ctseng@pitt.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
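For readers unfamiliar with the statistic, a minimal sketch of a single-study LA score and a bootstrap sign filter follows; this is our illustration of the general idea (the package linked above implements the actual MetaLA/MetaMLA procedures). It uses Li's definition: after normal-score transformation of each gene, LA(X, Y | Z) is the average of the triple product.

import numpy as np
from scipy.stats import norm, rankdata

def normal_scores(v):
    return norm.ppf(rankdata(v) / (len(v) + 1.0))

def la_score(x, y, z):
    return float(np.mean(normal_scores(x) * normal_scores(y) * normal_scores(z)))

rng = np.random.default_rng(0)
n = 200
z = rng.normal(size=n)
shared = rng.normal(size=n)
x = np.sign(z) * shared + 0.5 * rng.normal(size=n)   # corr(x, y) flips sign with z
y = shared + 0.5 * rng.normal(size=n)

obs = la_score(x, y, z)
idx = (rng.integers(0, n, n) for _ in range(200))    # resample subjects jointly
signs = np.array([np.sign(la_score(x[i], y[i], z[i])) for i in idx])
print(f"LA = {obs:.3f}, sign stability across bootstraps = {np.mean(signs == np.sign(obs)):.2f}")

Keeping only triplets whose LA sign is stable under resampling mirrors (loosely) the bootstrap-filtering step of the fast screening algorithm described in the abstract.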
Analytical Considerations about the Cosmological Constant and Dark Energy
NASA Astrophysics Data System (ADS)
Abreu, Everton M. C.; de Assis, Leonardo P. G.; Dos Reis, Carlos M. L.
The accelerated expansion of the universe has now been confirmed by several independent observations, including those of high-redshift type Ia supernovae and the cosmic microwave background combined with the large-scale structure of the universe. Another way of presenting this kinematic property of the universe is to postulate the existence of a new and exotic entity with negative pressure, the dark energy (DE). Despite being observationally well established, no single theoretical model provides an entirely compelling framework within which cosmic acceleration or DE can be understood. At present, all existing observational data are in agreement with the simplest possibility, that the cosmological constant is a candidate for DE. This case is internally self-consistent and noncontradictory. The extreme smallness of the cosmological constant expressed in either Planck or even atomic units means only that its origin is not related to strong, electromagnetic, and weak interactions. Although in this case DE reduces to only a single fundamental constant, we still have no derivation from any underlying quantum field theory for its small value. From the principles of quantum cosmologies, for example, it is possible to obtain the reason for an inverse-square law for the cosmological constant with no conflict with observations. Despite the fact that this general expression is well known, in this work we introduce families of analytical solutions for the scale factor that differ from those in the current literature. Knowledge of the scale factor's behavior might shed some light on the questions mentioned above, since the entire evolution of a homogeneous isotropic universe is contained in the scale factor. We use different parameters for these solutions, and with these parameters we establish a connection with the equation of state for different DE scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Renyu; Demory, Brice-Olivier; Seager, Sara
2015-03-20
Kepler has detected numerous exoplanet transits by measuring stellar light in a single visible-wavelength band. In addition to detection, the precise photometry provides phase curves of exoplanets, which can be used to study the dynamic processes on these planets. However, the interpretation of these observations can be complicated by the fact that visible-wavelength phase curves can represent both thermal emission and scattering from the planets. Here we present a semi-analytical model framework that can be applied to study Kepler and future visible-wavelength phase curve observations of exoplanets. The model efficiently computes reflection and thermal emission components for both rocky and gaseous planets, considering both homogeneous and inhomogeneous surfaces or atmospheres. We analyze the phase curves of the gaseous planet Kepler-7b and the rocky planet Kepler-10b using the model. In general, we find that a hot exoplanet's visible-wavelength phase curve having a significant phase offset can usually be explained by two classes of solutions: one class requires a thermal hot spot shifted to one side of the substellar point, and the other class requires reflective clouds concentrated on the same side of the substellar point. Particularly for Kepler-7b, reflective clouds located on the west side of the substellar point can best explain its phase curve. The reflectivity of the clear part of the atmosphere should be less than 7% and that of the cloudy part should be greater than 80%, and the cloud boundary should be located at 11° ± 3° to the west of the substellar point. We suggest single-band photometry surveys could yield valuable information on exoplanet atmospheres and surfaces.
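A toy, single-band phase curve helps to see why the two solution classes are degenerate: both a shifted thermal hot spot and offset reflective clouds move the flux peak away from secondary eclipse. The model below (Lambertian reflection with an assumed longitude offset plus a crude symmetric thermal term) is our illustration, not the authors' semi-analytical framework; every parameter value is invented.

import numpy as np

def lambert_phase(alpha):
    # Phase function of a Lambertian sphere; alpha = phase angle in radians.
    return (np.sin(alpha) + (np.pi - alpha) * np.cos(alpha)) / np.pi

def phase_curve(phi, ag=0.3, rp_over_a=0.012, offset_deg=40.0, thermal_amp=10e-6):
    # phi: orbital phase in [0, 1); phi = 0.5 corresponds to secondary eclipse.
    alpha = np.abs(2.0 * np.pi * (phi - 0.5))                         # edge-on orbit
    alpha_cloud = np.minimum(np.abs(2.0 * np.pi * (phi - 0.5)
                                    - np.radians(offset_deg)), np.pi)
    reflected = ag * rp_over_a ** 2 * lambert_phase(alpha_cloud)      # offset clouds
    thermal = thermal_amp * np.cos(alpha / 2.0) ** 2                  # symmetric dayside glow
    return reflected + thermal

phi = np.linspace(0.0, 1.0, 2000)
fp = phase_curve(phi)
peak_deg = np.degrees(2.0 * np.pi * (phi[np.argmax(fp)] - 0.5))
print(f"flux peak sits {peak_deg:.1f} degrees from secondary eclipse")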
Intellectual Biography in Higher Education: The Public Career of Earl J. McGrath as a Case Study.
ERIC Educational Resources Information Center
Reid, John Y.
The method of writing an intellectual biography and the public career of Earl J. McGrath in the post-World War I cultural milieu are analyzed. One analytical framework is adapted from cultural anthropology and is used to describe the relationship of educational systems to other social systems and to culture as a whole. The second analytic frame,…
2006-07-27
The goal of this project was to develop analytical and computational tools to make vision a viable sensor for … . We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects are jointly processed to extract geometry.
Yang, Cheng-Xiong; Liu, Chang; Cao, Yi-Meng; Yan, Xiu-Ping
2015-08-07
A simple and facile room-temperature solution-phase synthesis was developed to fabricate a spherical covalent organic framework with large surface area, good solvent stability and high thermostability for high-resolution chromatographic separation of diverse important industrial analytes including alkanes, cyclohexane and benzene, α-pinene and β-pinene, and alcohols with high column efficiency and good precision.
ERIC Educational Resources Information Center
Bull, Susan; Kay, Judy
2016-01-01
The SMILI☺ (Student Models that Invite the Learner In) Open Learner Model Framework was created to provide a coherent picture of the many and diverse forms of Open Learner Models (OLMs). The aim was for SMILI☺ to provide researchers with a systematic way to describe, compare and critique OLMs. We expected it to highlight those areas where there…
The "A" Factor: Coming to Terms with the Question of Legacy in South African Education
ERIC Educational Resources Information Center
Soudien, Crain
2007-01-01
This paper attempts to offer an alternative framework for assessing education delivery in South Africa. Its purpose is to develop an analytic approach for understanding education delivery in South Africa in the last 11 years and to use this framework to pose a set of strategic questions about how policy might be framed to deal with delivery. The…
Cahill, J. F.; Fei, H.; Cohen, S. M.; ...
2015-01-05
Materials with core-shell structures have distinct properties that lend themselves to a variety of potential applications. Characterization of small particle core-shell materials presents a unique analytical challenge. Herein, single particles of solid-state materials with core-shell structures were measured using on-line aerosol time-of-flight mass spectrometry (ATOFMS). Laser 'depth profiling' experiments verified the core-shell nature of two known core-shell particle configurations (< 2 µm diameter) that possessed inverted, complementary core-shell compositions (ZrO2@SiO2 versus SiO2@ZrO2). The average peak area ratios of Si and Zr ions were calculated to definitively show their core-shell composition. These ratio curves acted as a calibrant for an uncharacterized sample - a metal-organic framework (MOF) material surrounded by silica (UiO-66(Zr)@SiO2; UiO = University of Oslo). ATOFMS depth profiling was used to show that these particles did indeed exhibit a core-shell architecture. The results presented here show that ATOFMS can provide unique insights into core-shell solid-state materials with particle diameters between 0.2-3 µm.
Making sense of snapshot data: ergodic principle for clonal cell populations
2017-01-01
Population growth is often ignored when quantifying gene expression levels across clonal cell populations. We develop a framework for obtaining the molecule number distributions in an exponentially growing cell population taking into account its age structure. In the presence of generation time variability, the average acquired across a population snapshot does not obey the average of a dividing cell over time, apparently contradicting ergodicity between single cells and the population. Instead, we show that the variation observed across snapshots with known cell age is captured by cell histories, a single-cell measure obtained from tracking an arbitrary cell of the population back to the ancestor from which it originated. The correspondence between cells of known age in a population with their histories represents an ergodic principle that provides a new interpretation of population snapshot data. We illustrate the principle using analytical solutions of stochastic gene expression models in cell populations with arbitrary generation time distributions. We further elucidate that the principle breaks down for biochemical reactions that are under selection, such as the expression of genes conveying antibiotic resistance, which gives rise to an experimental criterion with which to probe selection on gene expression fluctuations. PMID:29187636
Making sense of snapshot data: ergodic principle for clonal cell populations.
Thomas, Philipp
2017-11-01
Population growth is often ignored when quantifying gene expression levels across clonal cell populations. We develop a framework for obtaining the molecule number distributions in an exponentially growing cell population taking into account its age structure. In the presence of generation time variability, the average acquired across a population snapshot does not obey the average of a dividing cell over time, apparently contradicting ergodicity between single cells and the population. Instead, we show that the variation observed across snapshots with known cell age is captured by cell histories, a single-cell measure obtained from tracking an arbitrary cell of the population back to the ancestor from which it originated. The correspondence between cells of known age in a population with their histories represents an ergodic principle that provides a new interpretation of population snapshot data. We illustrate the principle using analytical solutions of stochastic gene expression models in cell populations with arbitrary generation time distributions. We further elucidate that the principle breaks down for biochemical reactions that are under selection, such as the expression of genes conveying antibiotic resistance, which gives rise to an experimental criterion with which to probe selection on gene expression fluctuations. © 2017 The Author(s).
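A small simulation makes the snapshot bias concrete (our toy setup with gamma-distributed generation times, not the paper's code): in an exponentially growing population, a snapshot over-represents young cells, so the mean snapshot age falls below the value of half a mean generation time that one would obtain by time-averaging the age of a single cell over its cycle.

import heapq
import numpy as np

rng = np.random.default_rng(1)
def gen_time():
    return rng.gamma(4.0, 0.25)        # generation times: mean 1.0, CV 0.5

T = 12.0                               # snapshot time (roughly 2**12 cells by then)
alive_births = []
heap = [(gen_time(), 0.0)]             # (division time, birth time) of each cell
while heap:
    t_div, t_birth = heapq.heappop(heap)
    if t_div > T:
        alive_births.append(t_birth)   # cell is still alive at the snapshot
        continue
    for _ in range(2):                 # two daughters born at division
        heapq.heappush(heap, (t_div + gen_time(), t_div))

ages = T - np.array(alive_births)
print("cells in snapshot  :", ages.size)
print("mean snapshot age  :", round(float(ages.mean()), 3))  # below 0.5: young cells dominate
print("single-cell average:", 0.5)                           # tau/2 for mean generation time 1.0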
Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Rallabhandi, Sriram K.
2010-01-01
A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.
Moral panic, moral regulation, and the civilizing process.
Hier, Sean
2016-09-01
This article compares two analytical frameworks ostensibly formulated to widen the focus of moral panic studies. The comparative analysis suggests that attempts to conceptualize moral panics in terms of decivilizing processes have neither substantively supplemented the explanatory gains made by conceptualizing moral panic as a form of moral regulation nor provided a viable alternative framework that better explains the dynamics of contemporary moral panics. The article concludes that Elias's meta-theory of the civilizing process potentially provides explanatory resources to investigate a possible historical-structural shift towards the so-called age of (a)moral panic; the analytical demands of such a project, however, require a sufficiently different line of inquiry than the one encouraged by both the regulatory and decivilizing perspectives on moral panic. © London School of Economics and Political Science 2016.
NASA Astrophysics Data System (ADS)
Margitus, Michael R.; Tagliaferri, William A., Jr.; Sudit, Moises; LaMonica, Peter M.
2012-06-01
Understanding the structure and dynamics of networks are of vital importance to winning the global war on terror. To fully comprehend the network environment, analysts must be able to investigate interconnected relationships of many diverse network types simultaneously as they evolve both spatially and temporally. To remove the burden from the analyst of making mental correlations of observations and conclusions from multiple domains, we introduce the Dynamic Graph Analytic Framework (DYGRAF). DYGRAF provides the infrastructure which facilitates a layered multi-modal network analysis (LMMNA) approach that enables analysts to assemble previously disconnected, yet related, networks in a common battle space picture. In doing so, DYGRAF provides the analyst with timely situation awareness, understanding and anticipation of threats, and support for effective decision-making in diverse environments.
Critical Medical Anthropology in Midwifery Research: A Framework for Ethnographic Analysis.
Newnham, Elizabeth C; Pincombe, Jan I; McKellar, Lois V
2016-01-01
In this article, we discuss the use of critical medical anthropology (CMA) as a theoretical framework for research in the maternity care setting. With reference to the doctoral research of the first author, we argue for the relevance of using CMA for research into the maternity care setting, particularly as it relates to midwifery. We then give an overview of an existing analytic model within CMA that we adapted for looking specifically at childbirth practices and which was then used in both analyzing the data and structuring the thesis. There is often no clear guide to the analysis or writing up of data in ethnographic research; we therefore offer this Critical analytic model of childbirth practices for other researchers conducting ethnographic research into childbirth or maternity care.
Towards an analytical framework for tailoring supercontinuum generation.
Castelló-Lurbe, David; Vermeulen, Nathalie; Silvestre, Enrique
2016-11-14
A fully analytical toolbox for supercontinuum generation relying on scenarios without pulse splitting is presented. Furthermore, starting from the new insights provided by this formalism about the physical nature of direct and cascaded dispersive wave emission, a unified description of this radiation in both normal and anomalous dispersion regimes is derived. Previously unidentified physics of broadband spectra reported in earlier works is successfully explained on this basis. Finally, a foundry-compatible few-millimeters-long silicon waveguide allowing octave-spanning supercontinuum generation pumped at telecom wavelengths in the normal dispersion regime is designed, hence showcasing the potential of this new analytical approach.
Service line analytics in the new era.
Spence, Jay; Seargeant, Dan
2015-08-01
To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: updated service line definitions; the ability to analyze and trend service line net patient revenues by payment source; access to accurate service line cost information across multiple dimensions, with drill-through capabilities; the ability to redesign key reports based on changing requirements; and clear assignment of accountability.
Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Samatova, Nagiza; Wu, Kesheng
This project developed a generic and optimized set of core data analytics functions. These functions organically consolidate a broad constellation of high-performance analytical pipelines. As the architectures of emerging HPC systems become inherently heterogeneous, there is a need to design algorithms for data analysis kernels accelerated on hybrid multi-node, multi-core HPC architectures comprising a mix of CPUs, GPUs, and SSDs. Furthermore, the power-aware trend drives advances in our performance-energy tradeoff analysis framework, which enables our data analysis kernel algorithms and software to be parameterized so that users can choose the right power-performance optimizations.
Energy thresholds of discrete breathers in thermal equilibrium and relaxation processes.
Ming, Yi; Ling, Dong-Bo; Li, Hui-Min; Ding, Ze-Jun
2017-06-01
So far, only the energy thresholds of single discrete breathers in nonlinear Hamiltonian systems have been analytically obtained. In this work, the energy thresholds of discrete breathers in thermal equilibrium and the energy thresholds of long-lived discrete breathers which can remain after a long-time relaxation are analytically estimated for nonlinear chains. These energy thresholds are size dependent. The energy thresholds of discrete breathers in thermal equilibrium are the same as the previous analytical results for single discrete breathers. The energy thresholds of long-lived discrete breathers in relaxation processes are different from the previous results for single discrete breathers but agree well with the published numerical results known to us. Because real systems are either in thermal equilibrium or in relaxation processes, the obtained results could be important for experimental detection of discrete breathers.
Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plume Constituents
NASA Technical Reports Server (NTRS)
Goswami, Kisholoy
2011-01-01
A multi-analyte sensor was developed that enables simultaneous detection of rocket engine combustion-product molecules in a launch-vehicle ground test stand. The sensor was developed using a pin-printing method by incorporating multiple sensor elements on a single chip. It demonstrated accurate and sensitive detection of analytes such as carbon dioxide, carbon monoxide, kerosene, isopropanol, and ethylene from a single measurement. The use of pin-printing technology enables high-volume fabrication of the sensor chip, which will ultimately eliminate the need for individual sensor calibration since many identical sensors are made in one batch. Tests were performed using a single-sensor chip attached to a fiber-optic bundle. The use of a fiber bundle allows placement of the opto-electronic readout device at a place remote from the test stand. The sensors are rugged for operation in harsh environments.
A Framework for Understanding Physics Students' Computational Modeling Practices
NASA Astrophysics Data System (ADS)
Lunk, Brandon Robert
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature on programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative or rote solution approaches, expectations which can be influenced by the problem representation. Third, a set of solution approaches---many of which were identified in this study---describe what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices.
Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, Qiao; Liang, WanZhen, E-mail: liangwz@xmu.edu.cn; Liu, Jie
2014-05-14
This work extends our previous works [J. Liu and W. Z. Liang, J. Chem. Phys. 135, 014113 (2011); J. Liu and W. Z. Liang, J. Chem. Phys. 135, 184111 (2011)] on analytical excited-state energy Hessian within the framework of time-dependent density functional theory (TDDFT) to couple with molecular mechanics (MM). The formalism, implementation, and applications of analytical first and second energy derivatives of the TDDFT/MM excited state with respect to the nuclear and electric perturbations are presented. Their performances are demonstrated by the calculations of adiabatic excitation energies, excited-state geometries, harmonic vibrational frequencies, and infrared intensities for a number of benchmark systems. The consistent results with the full quantum mechanical method and other hybrid theoretical methods indicate the reliability of the current numerical implementation of the developed algorithms. The computational accuracy and efficiency of the current analytical approach are also checked, and computationally efficient strategies are suggested to speed up the calculations of complex systems with many MM degrees of freedom. Finally, we apply the current analytical approach in TDDFT/MM to a realistic system, a red fluorescent protein chromophore together with part of its nearby protein matrix. The calculated results indicate that the rearrangement of the hydrogen bond interactions between the chromophore and the protein matrix is responsible for the large Stokes shift.
Keller, Brad M; Oustimov, Andrew; Wang, Yan; Chen, Jinbo; Acciavatti, Raymond J; Zheng, Yuanjie; Ray, Shonket; Gee, James C; Maidment, Andrew D A; Kontos, Despina
2015-04-01
An analytical framework is presented for evaluating the equivalence of parenchymal texture features across different full-field digital mammography (FFDM) systems using a physical breast phantom. Phantom images (FOR PROCESSING) are acquired from three FFDM systems using their automated exposure control setting. A panel of texture features, including gray-level histogram, co-occurrence, run length, and structural descriptors, are extracted. To identify features that are robust across imaging systems, a series of equivalence tests are performed on the feature distributions, in which the extent of their intersystem variation is compared to their intrasystem variation via the Hodges-Lehmann test statistic. Overall, histogram and structural features tend to be most robust across all systems, and certain features, such as edge enhancement, tend to be more robust to intergenerational differences between detectors of a single vendor than to intervendor differences. Texture features extracted from larger regions of interest (i.e., [Formula: see text]) and with a larger offset length (i.e., [Formula: see text]), when applicable, also appear to be more robust across imaging systems. This framework and observations from our experiments may benefit applications utilizing mammographic texture analysis on images acquired in multivendor settings, such as in multicenter studies of computer-aided detection and breast cancer risk assessment.
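To illustrate the comparison at the heart of the method (a sketch under our own simplifying assumptions, not the published pipeline), the Hodges-Lehmann estimate of the shift between a feature's distributions on two systems can be checked against a margin derived from intra-system spread:

import numpy as np

def hodges_lehmann_shift(a, b):
    # Median of all pairwise differences between the two samples.
    return float(np.median(np.subtract.outer(a, b)))

rng = np.random.default_rng(0)
feat_sys1 = rng.normal(10.0, 1.0, 60)   # one texture feature over ROIs, system 1
feat_sys2 = rng.normal(10.3, 1.0, 60)   # same feature measured on system 2

shift = hodges_lehmann_shift(feat_sys1, feat_sys2)
margin = np.std(np.concatenate([feat_sys1 - feat_sys1.mean(),
                                feat_sys2 - feat_sys2.mean()]))  # intra-system spread
print(f"HL shift = {shift:.2f}; within intra-system margin: {abs(shift) < margin}")

A feature is then deemed robust across systems when its inter-system shift stays inside the band set by its intra-system variation, which is the spirit of the equivalence tests described above.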
Evolution and dynamics of a matter creation model
NASA Astrophysics Data System (ADS)
Pan, S.; de Haro, J.; Paliathanasis, A.; Slagter, R. J.
2016-08-01
In a flat Friedmann-Lemaître-Robertson-Walker (FLRW) geometry, we consider the expansion of the universe powered by gravitationally induced 'adiabatic' matter creation. To demonstrate how matter creation works with the expanding universe, we have considered a general creation rate and analysed this rate in the framework of dynamical analysis. The dynamical analysis hints at the presence of a non-singular universe (without the big bang singularity) with two successive accelerated phases, one at the very early phase of the universe (i.e., inflation), and the other describing the current accelerating universe, where these early and late accelerated phases are associated with an unstable fixed point (i.e., a repeller) and a stable fixed point (an attractor), respectively. We have described this phenomenon by analytic solutions of the Hubble function and the scale factor of the FLRW universe. Using the Jacobi last multiplier method, we have found a Lagrangian for this matter creation rate describing this scenario of the universe. To connect with our early-universe results, we introduce an equivalent dynamics driven by a single scalar field, discuss the associated observable parameters, and compare them with the latest Planck data sets. Finally, introducing teleparallel modified gravity, we have established an equivalent gravitational theory in the framework of matter creation.
Programming chemistry in DNA-addressable bioreactors
Fellermann, Harold; Cardelli, Luca
2014-01-01
We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis. PMID:25121647
Dual-wavelength pump-probe microscopy analysis of melanin composition
NASA Astrophysics Data System (ADS)
Thompson, Andrew; Robles, Francisco E.; Wilson, Jesse W.; Deb, Sanghamitra; Calderbank, Robert; Warren, Warren S.
2016-11-01
Pump-probe microscopy is an emerging technique that provides detailed chemical information of absorbers with sub-micrometer spatial resolution. Recent work has shown that the pump-probe signals from melanin in human skin cancers correlate well with clinical concern, but it has been difficult to infer the molecular origins of these differences. Here we develop a mathematical framework to describe the pump-probe dynamics of melanin in human pigmented tissue samples, which treats the ensemble of individual chromophores that make up melanin as Gaussian absorbers with bandwidth related via Frenkel excitons. Thus, observed signals result from an interplay between the spectral bandwidths of the individual underlying chromophores and spectral proximity of the pump and probe wavelengths. The model is tested using a dual-wavelength pump-probe approach and a novel signal processing method based on gnomonic projections. Results show signals can be described by a single linear transition path with different rates of progress for different individual pump-probe wavelength pairs. Moreover, the combined dual-wavelength data shows a nonlinear transition that supports our mathematical framework and the excitonic model to describe the optical properties of melanin. The novel gnomonic projection analysis can also be an attractive generic tool for analyzing mixing paths in biomolecular and analytical chemistry.
Dual-wavelength pump-probe microscopy analysis of melanin composition
Thompson, Andrew; Robles, Francisco E.; Wilson, Jesse W.; Deb, Sanghamitra; Calderbank, Robert; Warren, Warren S.
2016-01-01
Pump-probe microscopy is an emerging technique that provides detailed chemical information of absorbers with sub-micrometer spatial resolution. Recent work has shown that the pump-probe signals from melanin in human skin cancers correlate well with clinical concern, but it has been difficult to infer the molecular origins of these differences. Here we develop a mathematical framework to describe the pump-probe dynamics of melanin in human pigmented tissue samples, which treats the ensemble of individual chromophores that make up melanin as Gaussian absorbers with bandwidth related via Frenkel excitons. Thus, observed signals result from an interplay between the spectral bandwidths of the individual underlying chromophores and spectral proximity of the pump and probe wavelengths. The model is tested using a dual-wavelength pump-probe approach and a novel signal processing method based on gnomonic projections. Results show signals can be described by a single linear transition path with different rates of progress for different individual pump-probe wavelength pairs. Moreover, the combined dual-wavelength data shows a nonlinear transition that supports our mathematical framework and the excitonic model to describe the optical properties of melanin. The novel gnomonic projection analysis can also be an attractive generic tool for analyzing mixing paths in biomolecular and analytical chemistry. PMID:27833147
NASA Astrophysics Data System (ADS)
Saengow, C.; Giacomin, A. J.
2017-12-01
The Oldroyd 8-constant framework for continuum constitutive theory contains a rich diversity of popular special cases for polymeric liquids. In this paper, we use part of our exact solution for shear stress to arrive at unique exact analytical solutions for the normal stress difference responses to large-amplitude oscillatory shear (LAOS) flow. The nonlinearity of the polymeric liquids, triggered by LAOS, causes these responses at even multiples of the test frequency. We call responses at a frequency higher than twice the test frequency higher harmonics. We find the new exact analytical solutions to be compact and intrinsically beautiful. These solutions reduce to those of our previous work on the special case of the corotational Maxwell fluid. Our solutions also agree with our new truncated Goddard integral expansion for the special case of the corotational Jeffreys fluid. The limiting behaviors of these exact solutions also yield new explicit expressions. Finally, we use our exact solutions to see how η∞ affects the normal stress differences in LAOS.
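The structure of these responses can be summarized as follows (a generic even-harmonic expansion consistent with the abstract, not the paper's closed-form result): for an imposed strain \gamma(t) = \gamma_0 \sin \omega t, the normal stress differences take the form

N_i(t) = N_i^{(0)}(\gamma_0, \omega) + \sum_{n = 2, 4, 6, \ldots} \left[ a_n^{(i)}(\gamma_0, \omega) \cos n\omega t + b_n^{(i)}(\gamma_0, \omega) \sin n\omega t \right], \qquad i = 1, 2,

where the terms with n \ge 4 are the higher harmonics mentioned above, and the exact solutions give the coefficients a_n^{(i)} and b_n^{(i)} in closed form for the Oldroyd 8-constant framework.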
Visualization techniques for computer network defense
NASA Astrophysics Data System (ADS)
Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew
2011-06-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
Temperature dependence of single-event burnout in n-channel power MOSFET's
NASA Astrophysics Data System (ADS)
Johnson, G. H.; Schrimpf, R. D.; Galloway, K. F.; Koga, R.
1994-03-01
The temperature dependence of single-event burnout (SEB) in n-channel power metal-oxide-semiconductor field effect transistors (MOSFET's) is investigated experimentally and analytically. Experimental data are presented which indicate that the SEB susceptibility of the power MOSFET decreases with increasing temperature. A previously reported analytical model that describes the SEB mechanism is updated to include temperature variations. This model is shown to agree with the experimental trends.
Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S
2015-01-01
To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.
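The allocation rules quoted in the abstract (labor by unit hours, medications by actual acquisition cost, radiology by study minutes) amount to simple activity-based costing. A minimal sketch is shown below; the Encounter fields, rate constants, and function name are hypothetical illustrations, not the VDO schema.

```python
from dataclasses import dataclass

@dataclass
class Encounter:
    encounter_id: str
    unit_hours: float         # hours the patient spent in a hospital unit
    med_costs: float          # actual medication acquisition cost
    radiology_minutes: float  # minutes of imaging study time

HOURLY_LABOR_RATE = 85.0      # assumed unit labor cost per patient-hour
RADIOLOGY_RATE_PER_MIN = 4.2  # assumed radiology cost per minute

def allocate_costs(enc: Encounter) -> dict:
    """Allocate clinical care costs to a single encounter, mirroring the
    activity-based rules described in the abstract: labor by unit hours,
    medications by actual utilization, radiology by study minutes."""
    costs = {
        "labor": enc.unit_hours * HOURLY_LABOR_RATE,
        "medication": enc.med_costs,
        "radiology": enc.radiology_minutes * RADIOLOGY_RATE_PER_MIN,
    }
    costs["total"] = sum(costs.values())
    return costs

print(allocate_costs(Encounter("E001", unit_hours=36.0, med_costs=412.5, radiology_minutes=25.0)))
```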
Yan, Yifei; Zhang, Lisong; Yan, Xiangzhen
2016-01-01
In this paper, a single-slope tunnel pipeline was analysed considering the effects of vertical earth pressure, horizontal soil pressure, inner pressure, thermal expansion force and pipeline-soil friction. The concept of a stagnation point for the pipeline was proposed. Considering the deformation compatibility condition of the pipeline elbow, the push force on the anchor blocks of a single-slope tunnel pipeline was derived based on an energy method, yielding a theoretical formula for this force. Using the analytical equation, the push force of the anchor block of an X80 large-diameter pipeline from the West-East Gas Transmission Project was determined. Meanwhile, to verify the results of the analytical method against the finite element method, four finite element codes were used to calculate the push force: CAESAR II, ANSYS, AutoPIPE and ALGOR. The results show that the analytical results agree well with the numerical results, with a maximum relative error of only 4.1%. Therefore, the results obtained with the analytical method can satisfy engineering requirements. PMID:26963097
NASA Astrophysics Data System (ADS)
Wasser, L. A.; Gold, A. U.
2017-12-01
There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The earth analytics education program, a core component of Earth Lab at the University of Colorado Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate / degree program. All programs share the goal of preparing a STEM workforce for successful earth analytics driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate / graduate level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, which includes synchronous in-person teaching and active hands-on classroom learning combined with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.
NASA Astrophysics Data System (ADS)
Tarasenko, Alexander
2018-01-01
Diffusion of particles adsorbed on a homogeneous one-dimensional lattice is investigated using a theoretical approach and MC simulations. The analytical dependencies calculated in the framework of the approach are tested against the numerical data. The perfect coincidence of the data obtained by these different methods demonstrates the correctness of the approach, which is based on the theory of the non-equilibrium statistical operator.
ERIC Educational Resources Information Center
Wang, Li
2012-01-01
The aim of this paper is to build a capability-based framework, drawing upon the strengths of other approaches, which is applicable to the complexity of the urban-rural divide in education in China. It starts with a brief introduction to the capability approach. This is followed by a discussion of how the rights-based approach and resource-based…
Using connectivity to identify climatic drivers of local adaptation: a response to Macdonald et al.
Prunier, Jérôme G; Blanchet, Simon
2018-04-30
Macdonald et al. (Ecol. Lett., 21, 2018, 207-216) proposed an analytical framework for identifying evolutionary processes underlying trait-environment relationships observed in natural populations. Here, we propose an expanded and refined framework based on simulations and bootstrap-based approaches, and we elaborate on an important statistical caveat common to most datasets. © 2018 John Wiley & Sons Ltd/CNRS.
NASA Astrophysics Data System (ADS)
Ganapathy, Vinay; Ramachandran, Ramesh
2017-10-01
The response of a quadrupolar nucleus (nuclear spin with I > 1/2) to an oscillating radio-frequency pulse/field is delicately dependent on the ratio of the quadrupolar coupling constant to the amplitude of the pulse in addition to its duration and oscillating frequency. Consequently, analytic description of the excitation process in the density operator formalism has remained less transparent within existing theoretical frameworks. As an alternative, the utility of the "concept of effective Floquet Hamiltonians" is explored in the present study to explicate the nuances of the excitation process in multilevel systems. Employing spin I = 3/2 as a case study, a unified theoretical framework for describing the excitation of multiple-quantum transitions in static isotropic and anisotropic solids is proposed within the framework of perturbation theory. The challenges resulting from the anisotropic nature of the quadrupolar interactions are addressed within the effective Hamiltonian framework. The possible role of the various interaction frames on the convergence of the perturbation corrections is discussed along with a proposal for a "hybrid method" for describing the excitation process in anisotropic solids. Employing suitable model systems, the validity of the proposed hybrid method is substantiated through a rigorous comparison between simulations emerging from exact numerical and analytic methods.
Defining an additivity framework for mixture research in inducible whole-cell biosensors
NASA Astrophysics Data System (ADS)
Martin-Betancor, K.; Ritz, C.; Fernández-Piñas, F.; Leganés, F.; Rodea-Palomares, I.
2015-11-01
A novel additivity framework for mixture effect modelling in the context of whole-cell inducible biosensors has been mathematically developed and implemented in R. The proposed method is a multivariate extension of the effective dose (EDp) concept. Specifically, the extension accounts for differential maximal effects among analytes and response inhibition beyond the maximum permissive concentrations. This allows a multivariate extension of Loewe additivity, enabling direct application in a biphasic dose-response framework. The proposed additivity definition was validated, and its applicability illustrated by studying the response of the cyanobacterial biosensor Synechococcus elongatus PCC 7942 pBG2120 to binary mixtures of Zn, Cu, Cd, Ag, Co and Hg. The novel method allowed, for the first time, complete dose-response profiles of an inducible whole-cell biosensor to mixtures to be modelled. In addition, the approach also allowed identification and quantification of departures from additivity (interactions) among analytes. The biosensor was found to respond in a near-additive way to heavy metal mixtures except when Hg, Co and Ag were present, in which case strong interactions occurred. The method is a useful contribution for the whole-cell biosensor discipline and related areas, allowing appropriate assessment of mixture effects in non-monotonic dose-response frameworks.
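For reference, the classical Loewe additivity that the framework extends states that a mixture of k analytes at doses d_i is additive at effect level p when the dose fractions sum to one:

```latex
\sum_{i=1}^{k} \frac{d_i}{ED_{p,i}} \;=\; 1 ,
```

where ED_{p,i} is the dose of analyte i alone that produces effect level p. The multivariate EDp extension proposed here generalizes this relation to analytes with different maximal effects and to the biphasic (induction followed by inhibition) responses typical of inducible whole-cell biosensors.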
de Barros, F P J; Fiori, A; Boso, F; Bellin, A
2015-01-01
Spatial heterogeneity of the hydraulic properties of geological porous formations leads to erratically shaped solute clouds, thus increasing the edge area of the solute body and augmenting the dilution rate. In this study, we provide a theoretical framework to quantify dilution of a non-reactive solute within a steady state flow as affected by the spatial variability of the hydraulic conductivity. Embracing the Lagrangian concentration framework, we obtain explicit semi-analytical expressions for the dilution index as a function of the structural parameters of the random hydraulic conductivity field, under the assumptions of uniform-in-the-average flow, small injection source and weak-to-mild heterogeneity. Results show how the dilution enhancement of the solute cloud is strongly dependent on both the statistical anisotropy ratio and the heterogeneity level of the porous medium. The explicit semi-analytical solution also captures the temporal evolution of the dilution rate; for the early- and late-time limits, the proposed solution recovers previous results from the literature, while at intermediate times it reflects the increasing interplay between large-scale advection and local-scale dispersion. The performance of the theoretical framework is verified with high resolution numerical results and successfully tested against the Cape Cod field data. Copyright © 2015 Elsevier B.V. All rights reserved.
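For context, the dilution index quantified here is, in the usual formulation of Kitanidis (1994), the exponential of the entropy of the normalized concentration distribution; this standard definition (not restated in the abstract) reads:

```latex
p(\mathbf{x},t) \;=\; \frac{c(\mathbf{x},t)}{\int_V c(\mathbf{x},t)\,\mathrm{d}\mathbf{x}},
\qquad
E(t) \;=\; \exp\!\left[-\int_V p(\mathbf{x},t)\,\ln p(\mathbf{x},t)\,\mathrm{d}\mathbf{x}\right],
```

so E(t) grows as the solute occupies its plume volume more uniformly, and the dilution rate follows from its time derivative.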
A theoretical framework for analyzing the effect of external change on tidal dynamics in estuaries
NASA Astrophysics Data System (ADS)
CAI, H.; Savenije, H.; Toffolon, M.
2013-12-01
The most densely populated areas of the world are usually located in coastal areas near estuaries. As a result, estuaries are often subject to intense human interventions, such as dredging for navigation, dam construction and fresh water withdrawal, which in some areas have led to serious deterioration of invaluable ecosystems. Hence it is important to understand the influence of such interventions on tidal dynamics in these areas. In this study, we present a consistent theoretical framework for tidal hydrodynamics, which can be used as a rapid assessment technique that assists policy makers and managers in making considered decisions for the protection and management of the estuarine environment when assessing the effect of human interventions in estuaries. Analytical solutions to the one-dimensional St. Venant equations for the tidal hydrodynamics in convergent unbounded estuaries with negligible river discharge can be cast in the form of a set of four implicit dimensionless equations for phase lag, velocity amplitude, damping, and wave celerity, as a function of two localized parameters describing friction and convergence. This method allows for the comparison of different analytical approaches by rewriting the different solutions in the same format. In this study, classical and more recent formulations are compared, showing the differences and similarities associated with their specific simplifications. The envelope method, which is based on consideration of the dynamics at high water and low water, can be used to derive damping equations that use different friction approximations. This results in as many analytical solutions, and thereby allows one to build a consistent theoretical framework. Analysis of the asymptotic behaviour of the equations shows that an equilibrium tidal amplitude exists, reflecting the balance between friction and channel convergence. The framework is subsequently extended to take into account the effect of river discharge. Hence, the analytical solutions are applicable even in the upstream part of an estuary, where the influence of river discharge is considerable. The proposed analytical solutions are transparent and practical, allowing a quantitative and qualitative assessment of human interventions (e.g., dredging, flow reduction) on tidal dynamics. Moreover, they are rapid assessment techniques that enable users to set up a simple model and to understand the functioning of the system with a minimum of required information. The analytical model is illustrated in three large-scale estuaries significantly influenced by human activities: the Scheldt estuary in the Netherlands, and the Modaomen and Yangtze estuaries in China. In these estuaries, the correspondence with observations is good, which suggests that the proposed model is a useful, realistic and reliable instrument for quick detection of the effect of human interventions on tidal dynamics and subsequent environmental issues, such as salt intrusion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cermelli, Paolo; Jabbour, Michel E.; Department of Mathematics, University of Kentucky, Lexington, Kentucky 40506-0027
A thermodynamically consistent continuum theory for single-species, step-flow epitaxy that extends the classical Burton-Cabrera-Frank (BCF) framework is derived from basic considerations. In particular, an expression for the step chemical potential is obtained that contains two energetic contributions: one from the adjacent terraces, in the form of the jump in the adatom grand canonical potential, and the other from the monolayer of crystallized adatoms that underlies the upper terrace, in the form of the nominal bulk chemical potential, thus generalizing the classical Gibbs-Thomson relation to the dynamic, dissipative setting of step-flow growth. The linear stability analysis of the resulting quasistatic free-boundary problem for an infinite train of equidistant rectilinear steps yields explicit, i.e., analytical, criteria for the onset of step bunching in terms of the basic physical and geometric parameters of the theory. It is found that, in contrast with the predictions of the classical BCF model, both in the absence as well as in the presence of desorption, a growth regime exists for which step bunching occurs, except possibly in the dilute limit where the train is always stable to step bunching. In the present framework, the onset of one-dimensional instabilities is directly attributed to the energetic influence on the migrating steps of the adjacent terraces. Hence the theory provides a "minimalist" alternative to existing theories of step bunching and should be relevant to, e.g., molecular beam epitaxy of GaAs where the equilibrium adatom density is shown by Tersoff, Johnson, and Orr [Phys. Rev. B 78, 282 (1997)] to be extremely high.
Ragan, Eric D; Endert, Alex; Sanyal, Jibonananda; Chen, Jian
2016-01-01
While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.
Optimizing cosmological surveys in a crowded market
NASA Astrophysics Data System (ADS)
Bassett, Bruce A.
2005-04-01
Optimizing the major next-generation cosmological surveys (such as SNAP, KAOS, etc.) is a key problem given our ignorance of the physics underlying cosmic acceleration and the plethora of surveys planned. We propose a Bayesian design framework which (1) maximizes the discrimination power of a survey without assuming any underlying dark-energy model, (2) finds the best niche survey geometry given current data and future competing experiments, (3) maximizes the cross section for serendipitous discoveries and (4) can be adapted to answer specific questions (such as "is dark energy dynamical?"). Integrated parameter-space optimization (IPSO) is a design framework that integrates projected parameter errors over an entire dark energy parameter space and then extremizes a figure of merit (such as the Shannon entropy gain, which we show is stable to off-diagonal covariance matrix perturbations) as a function of survey parameters using analytical, grid or MCMC techniques. We discuss examples where the optimization can be performed analytically. IPSO is thus a general, model-independent and scalable framework that allows us to appropriately use prior information to design the best possible surveys.
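The Shannon entropy gain used as a figure of merit can be made concrete in the Gaussian (Fisher-matrix) limit: if C_0 and C are the parameter covariance matrices before and after the survey, the information gain is

```latex
\Delta H \;=\; \tfrac{1}{2}\,\ln \frac{\det C_0}{\det C}
```

(in nats), which depends on the covariances only through the ratio of their determinants. This is one standard form consistent with the abstract's description; the paper's exact definition may differ in normalization.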
Zeitoun, Mark; Eid-Sabbagh, Karim; Loveless, Jeremy
2014-01-01
This paper develops an analytical framework to investigate the relationship between water and armed conflict, and applies it to the 'Summer War' of 2006 between Israel and Lebanon (Hezbollah). The framework broadens and deepens existing classifications by assessing the impact of acts of war as indiscriminate or targeted, and evaluating them in terms of international norms and law, in particular International Humanitarian Law (IHL). In the case at hand, the relationship is characterised by extensive damage in Lebanon to drinking water infrastructure and resources. This is seen as a clear violation of the letter and the spirit of IHL, while the partial destruction of more than 50 public water towers compromises water rights and national development goals. The absence of pre-war environmental baselines makes it difficult to gauge the impact on water resources, suggesting a role for those with first-hand knowledge of the hostilities to develop a more effective response before, during, and after armed conflict. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.
NASA Astrophysics Data System (ADS)
Viswanathan, Balakrishnan; Gea-Banacloche, Julio
2017-04-01
We analyze a recent scheme proposed by Xia et al. to induce a conditional phase shift between two single-photon pulses by having them propagate at different speeds through a nonlinear medium with a nonlocal response. We have obtained an analytical solution for the case they considered, which supports their claim that a π phase shift with unit fidelity is possible in principle. We discuss the conditions that have to be met and the challenges and opportunities that this might present to the realization of a single-photon conditional phase gate.
Controlling the angular radiation of single emitters using dielectric patch nanoantennas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Yuanqing; Li, Qiang; Qiu, Min, E-mail: minqiu@zju.edu.cn
2015-07-20
Dielectric nanoantennas have generated much interest in recent years owing to their low loss and optically induced electric and magnetic resonances. In this paper, we investigate the coupling between a single emitter and dielectric patch nanoantennas. For the coupled system involving non-spherical structures, analytical Mie theory is no longer applicable. A semi-analytical model is proposed instead to interpret the coupling mechanism and the radiation characteristics of the system. Based on the presented model, we demonstrate that the angular emission of the single emitter can be not only enhanced but also rotated using the dielectric patch nanoantennas.
Transportation systems health : concepts, applications & significance.
DOT National Transportation Integrated Search
2015-12-01
This report offers conceptual and analytical frameworks and application examples to address the question: how can broader statewide (or national) objectives be achieved while formally taking into consideration different regional priorities and constr...
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
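The AHP half of the approach reduces pairwise expert judgments to quantitative weights via the principal eigenvector of a comparison matrix. A brief sketch under assumed criteria is shown below (the criteria, the matrix entries and the variable names are hypothetical; the QFD step that maps requirements to solutions is not shown):

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three framework criteria
# (e.g., integration, usability, extensibility), using Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # AHP priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 => acceptable judgments
```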
Jin, Hui; Gui, Rijun; Yu, Jianbo; Lv, Wei; Wang, Zonghua
2017-05-15
Previously developed electrochemical biosensors with a single electric signal output can be affected by intrinsic and extrinsic factors. In contrast, ratiometric electrochemical biosensors (RECBSs) with dual electric signal outputs have an intrinsic built-in correction for the effects of system or background electric signals, and therefore exhibit a significant potential to improve the accuracy and sensitivity of electrochemical sensing applications. In this review, we systematically summarize the fabrication strategies, sensing modes and analytical applications of RECBSs. First, the different fabrication strategies of RECBSs were introduced, referring to the analyte-induced single- and dual-dependent electrochemical signal strategies for RECBSs. Second, the different sensing modes of RECBSs were illustrated, such as differential pulse voltammetry, square wave voltammetry, cyclic voltammetry, alternating current voltammetry, electrochemiluminescence, and so forth. Third, the analytical applications of RECBSs were discussed based on the types of target analytes. Finally, the forthcoming development and future prospects in the research field of RECBSs were also highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.
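The built-in correction works because the readout is a ratio of two electric signals measured under the same conditions: schematically, a multiplicative common-mode disturbance ξ (from the system or background) enters both channels and cancels,

```latex
R \;=\; \frac{I_{\mathrm{sig}}}{I_{\mathrm{ref}}},
\qquad
\frac{\xi\, I_{\mathrm{sig}}}{\xi\, I_{\mathrm{ref}}} \;=\; R ,
```

which is the generic principle behind ratiometric designs rather than any single fabrication strategy reviewed here.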
Flexible single-layer ionic organic-inorganic frameworks towards precise nano-size separation
NASA Astrophysics Data System (ADS)
Yue, Liang; Wang, Shan; Zhou, Ding; Zhang, Hao; Li, Bao; Wu, Lixin
2016-02-01
Consecutive two-dimensional frameworks comprised of molecular or cluster building blocks in large area represent ideal candidates for membranes sieving molecules and nano-objects, but challenges still remain in methodology and practical preparation. Here we exploit a new strategy to build soft single-layer ionic organic-inorganic frameworks via electrostatic interaction without preferential binding direction in water. Upon consideration of steric effect and additional interaction, polyanionic clusters as connection nodes and cationic pseudorotaxanes acting as bridging monomers connect with each other to form a single-layer ionic self-assembled framework with 1.4 nm layer thickness. Such soft supramolecular polymer frameworks possess uniform and adjustable ortho-tetragonal nanoporous structure in pore size of 3.4-4.1 nm and exhibit greatly convenient solution processability. The stable membranes maintaining uniform porous structure demonstrate precisely size-selective separation of semiconductor quantum dots within 0.1 nm of accuracy and may hold promise for practical applications in selective transport, molecular separation and dialysis systems.
Causal Analysis of Self-tracked Time Series Data Using a Counterfactual Framework for N-of-1 Trials.
Daza, Eric J
2018-02-01
Many of an individual's historically recorded personal measurements vary over time, thereby forming a time series (e.g., wearable-device data, self-tracked fitness or nutrition measurements, regularly monitored clinical events or chronic conditions). Statistical analyses of such n-of-1 (i.e., single-subject) observational studies (N1OSs) can be used to discover possible cause-effect relationships to then self-test in an n-of-1 randomized trial (N1RT). However, a principled way of determining how and when to interpret an N1OS association as a causal effect (e.g., as if randomization had occurred) is needed. Our goal in this paper is to help bridge the methodological gap between risk-factor discovery and N1RT testing by introducing a basic counterfactual framework for N1OS design and personalized causal analysis. We introduce and characterize what we call the average period treatment effect (APTE), i.e., the estimand of interest in an N1RT, and build an analytical framework around it that can accommodate autocorrelation and time trends in the outcome, effect carryover from previous treatment periods, and slow onset or decay of the effect. The APTE is loosely defined as a contrast (e.g., difference, ratio) of averages of potential outcomes the individual can theoretically experience under different treatment levels during a given treatment period. To illustrate the utility of our framework for APTE discovery and estimation, two common causal inference methods are specified within the N1OS context. We then apply the framework and methods to search for estimable and interpretable APTEs using six years of the author's self-tracked weight and exercise data, and report both the preliminary findings and the challenges we faced in conducting N1OS causal discovery. Causal analysis of an individual's time series data can be facilitated by an N1RT counterfactual framework. However, for inference to be valid, the veracity of certain key assumptions must be assessed critically, and the hypothesized causal models must be interpretable and meaningful. Schattauer GmbH.
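Schematically (using a difference as the contrast), if Y_t(a) denotes the potential outcome at time t under treatment level a and treatment period j spans times t ∈ T_j, the APTE can be written as

```latex
\mathrm{APTE}_j \;=\;
\mathrm{E}\!\left[\frac{1}{|T_j|}\sum_{t\in T_j} Y_t(1)\right]
\;-\;
\mathrm{E}\!\left[\frac{1}{|T_j|}\sum_{t\in T_j} Y_t(0)\right],
```

with a ratio contrast defined analogously. The notation here is illustrative; the paper's formal definition additionally handles carryover, autocorrelation, and slow effect onset or decay.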
NASA Astrophysics Data System (ADS)
Heudorfer, Benedikt; Haaf, Ezra; Barthel, Roland; Stahl, Kerstin
2017-04-01
A new framework for quantification of groundwater dynamics has been proposed in a companion study (Haaf et al., 2017). In this framework, a number of conceptual aspects of dynamics, such as seasonality, regularity, flashiness or inter-annual forcing, are described, which are then linked to quantitative metrics. Hereby, a large number of possible metrics are readily available from the literature, such as Pardé Coefficients, Colwell's Predictability Indices or Base Flow Index. In the present work, we focus on finding multicollinearity, and in consequence redundancy, among the metrics representing different patterns of dynamics found in groundwater hydrographs. This also serves to verify the categories of dynamics aspects suggested by Haaf et al., 2017. To determine the optimal set of metrics we need to balance the desired minimum number of metrics and the desired maximum descriptive property of the metrics. To do this, a substantial number of candidate metrics are applied to a diverse set of groundwater hydrographs from France, Germany and Austria within the northern alpine and peri-alpine region. By applying Principal Component Analysis (PCA) to the correlation matrix of the metrics, we determine a limited number of relevant metrics that describe the majority of variation in the dataset. The resulting reduced set of metrics comprises an optimized set that can be used to describe the aspects of dynamics that were identified within the groundwater dynamics framework. For some aspects of dynamics a single significant metric could be attributed. Other aspects have a more fuzzy quality that can only be described by an ensemble of metrics and are re-evaluated. The PCA is furthermore applied to groups of groundwater hydrographs containing regimes of similar behaviour in order to explore transferability when applying the metric-based characterization framework to groups of hydrographs from diverse groundwater systems. In conclusion, we identify an optimal number of metrics, which are readily available for usage in studies on groundwater dynamics, intended to help overcome analytical limitations that exist due to the complexity of groundwater dynamics. Haaf, E., Heudorfer, B., Stahl, K., Barthel, R., 2017. A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics. EGU General Assembly 2017, Vienna, Austria.
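A compact sketch of the redundancy analysis described here, PCA on the metric correlation matrix followed by picking one representative metric per retained component, is given below; the 90% variance target and the max-loading selection rule are illustrative simplifications of the study's procedure.

```python
import numpy as np

def reduce_metrics(M, var_target=0.9):
    """Identify a reduced metric set via PCA on the metric correlation matrix.

    M : (n_hydrographs, n_metrics) array of dynamics metrics
        (e.g., Parde coefficients, Colwell's indices, Base Flow Index).
    Returns the number of components needed to reach `var_target`
    explained variance and, for each retained component, the index of
    the metric with the largest absolute loading as its representative.
    """
    Z = (M - M.mean(0)) / M.std(0)           # standardize -> PCA on correlations
    corr = np.corrcoef(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)  # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # reorder to descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(explained, var_target) + 1)
    representatives = np.abs(eigvecs[:, :k]).argmax(axis=0)
    return k, representatives
```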
NASA Astrophysics Data System (ADS)
Jougnot, D.; Guarracino, L.
2016-12-01
The self-potential (SP) method is considered by most researchers to be the only geophysical method that is directly sensitive to groundwater flow. One source of SP signals, the so-called streaming potential, results from the presence of an electrical double layer at the mineral-pore water interface. When water flows through the pore space, it gives rise to a streaming current and a resulting measurable electrical voltage. Different approaches have been proposed to predict streaming potentials in porous media. One approach is based on the excess charge which is effectively dragged in the medium by the water flow. Following a recent theoretical framework, we developed a physically-based analytical model to predict the effective excess charge in saturated porous media. In this study, the porous medium is described by a bundle of capillary tubes with a fractal pore-size distribution. First, an analytical relationship is derived to determine the effective excess charge for a single capillary tube as a function of the pore water salinity. Then, this relationship is used to obtain both exact and approximated expressions for the effective excess charge at the Representative Elementary Volume (REV) scale. The resulting analytical relationship allows the determination of the effective excess charge as a function of pore water salinity, fractal dimension and hydraulic parameters like porosity and permeability, which are also obtained at the REV scale. This new model has been successfully tested against data of different sources from the literature. One of the main findings of this study is that it provides a mechanistic explanation for the empirical dependence between the effective excess charge and the permeability that has been found by various researchers. The proposed petrophysical relationship also contributes to understanding the role of porosity and water salinity on effective excess charge and will help push further the use of streaming potential to monitor groundwater flow.
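In the excess-charge framework this model builds on, the streaming current density is the product of the Darcy velocity u and the effective excess charge density Q̂_v, from which the streaming potential coupling coefficient follows. The form below is the commonly used one (e.g., in the work of Revil and co-workers) and is quoted here for orientation rather than from the paper itself:

```latex
\mathbf{j}_s \;=\; \hat{Q}_v\,\mathbf{u},
\qquad
C \;\equiv\; \left.\frac{\partial V}{\partial P}\right|_{\mathbf{j}=0}
\;=\; -\,\frac{\hat{Q}_v\,k}{\eta\,\sigma},
```

where k is the permeability, η the dynamic viscosity and σ the bulk electrical conductivity; the fractal capillary-bundle model then supplies Q̂_v as a closed-form function of salinity, porosity and permeability at the REV scale.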
Effect of Additional Incentives for Aviation Biofuels: Results from the Biomass Scenario Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vimmerstedt, Laura J; Newes, Emily K
2017-12-05
The National Renewable Energy Laboratory supported the Department of Energy, Bioenergy Technologies Office, with analysis of alternative jet fuels in collaboration with the U.S. Department of Transportation, Federal Aviation Administration. Airlines for America requested additional exploratory scenarios within the same analytic framework, the Biomass Scenario Model. The results were presented at a public working meeting of the California Air Resources Board on including alternative jet fuel in the Low Carbon Fuel Standard on March 17, 2017 (https://www.arb.ca.gov/fuels/lcfs/lcfs_meetings/lcfs_meetings.htm). This presentation clarifies and annotates the slides from the public working meeting, and provides a link to the full data set. NREL does not advocate for or against the policies analyzed in this study.
A Crowdsensing Based Analytical Framework for Perceptional Degradation of OTT Web Browsing.
Li, Ke; Wang, Hai; Xu, Xiaolong; Du, Yu; Liu, Yuansheng; Ahmad, M Omair
2018-05-15
Service perception analysis is crucial for understanding both user experience and network quality, as well as for maintaining and optimizing mobile networks. Given the rapid development of the mobile Internet and over-the-top (OTT) services, the conventional network-centric mode of network operation and maintenance is no longer effective. Therefore, developing an approach to evaluate and optimize users' service perception has become increasingly important. Meanwhile, the development of a new sensing paradigm, mobile crowdsensing (MCS), makes it possible to evaluate and analyze users' OTT service perception from the end-user's point of view rather than from the network side. In this paper, the key factors that impact users' end-to-end OTT web browsing service perception are analyzed by monitoring crowdsourced user perceptions. The intrinsic relationships among the key factors and the interactions between key quality indicators (KQIs) are evaluated from several perspectives. Moreover, an analytical framework of perceptional degradation and a detailed algorithm are proposed, whose goal is to identify the major factors that impact the perceptional degradation of the web browsing service, as well as the significance of their contributions. Finally, a case study is presented to show the effectiveness of the proposed method using a dataset crowdsensed from a large number of smartphone users in a real mobile network. The proposed analytical framework forms a valuable solution for mobile network maintenance and optimization and can help improve web browsing service perception and network quality.
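A deliberately simplified stand-in for the proposed degradation-attribution algorithm (which the abstract does not spell out) is a standardized regression of a crowdsensed perception score on candidate KQIs, ranking factors by coefficient magnitude; all names below are hypothetical:

```python
import numpy as np

def rank_degradation_factors(kqi, mos, names):
    """Rank candidate KQIs by their contribution to perception degradation.

    kqi   : (n_samples, n_kqis) array of crowdsensed indicators
            (e.g., page load time, first-paint delay, throughput)
    mos   : (n_samples,) perception scores from the same users
    names : list of n_kqis KQI labels

    Regresses the standardized score on standardized KQIs and ranks
    factors by the magnitude of their standardized coefficients.
    """
    X = (kqi - kqi.mean(0)) / kqi.std(0)
    y = (mos - mos.mean()) / mos.std()
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    order = np.argsort(-np.abs(beta))
    return [(names[i], float(beta[i])) for i in order]
```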