Sample records for point process framework

  1. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.

  2. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining

    PubMed Central

    Truccolo, Wilson

    2017-01-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous-time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete-time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics (“order parameters”) inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. PMID:28336305
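
    For orientation, the multivariate nonlinear Hawkes processes reviewed here are typically specified through their conditional intensity; a generic form (illustrative notation, not necessarily the author's own) is

      \lambda_i(t \mid H_t) = f_i\Big( \mu_i + \sum_{j=1}^{N} \int_{0}^{t} k_{ij}(t-s)\, dN_j(s) + \eta_i(t) \Big),

    where N_j(t) counts the spikes of neuron j, k_{ij} are spike-history (coupling) kernels, \eta_i(t) is an exogenous input, and f_i is a nonnegative nonlinearity; the PP-GLM mentioned above estimates a discrete-time version of this intensity from spike-train data.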

  3. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining.

    PubMed

    Truccolo, Wilson

    2016-11-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous-time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete-time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. Published by Elsevier Ltd.

  4. Hierarchical species distribution models

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.

    2016-01-01

    Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.
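
    To make the link to the point process distribution concrete, species distribution models of this kind usually start from an inhomogeneous Poisson process whose intensity is driven by spatial covariates; in generic notation (not the authors' exact specification),

      \log \lambda(s) = x(s)^{\top} \beta, \qquad
      [\, s_1, \dots, s_n \mid \beta \,] \propto \exp\Big( -\int_{\mathcal{A}} \lambda(s)\, ds \Big) \prod_{i=1}^{n} \lambda(s_i),

    where s_1, ..., s_n are observed locations, x(s) are spatial covariates, and \mathcal{A} is the study area; count and presence-absence likelihoods arise by aggregating or thresholding this process over sampling units, which is what allows the hierarchical framework to treat these data types in a unified way.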

  5. Quality Management Framework for Total Diet Study centres in Europe.

    PubMed

    Pité, Marina; Pinchen, Hannah; Castanheira, Isabel; Oliveira, Luisa; Roe, Mark; Ruprich, Jiri; Rehurkova, Irena; Sirot, Veronique; Papadopoulos, Alexandra; Gunnlaugsdóttir, Helga; Reykdal, Ólafur; Lindtner, Oliver; Ritvanen, Tiina; Finglas, Paul

    2018-02-01

    A Quality Management Framework to improve quality and harmonization of Total Diet Study practices in Europe was developed within the TDS-Exposure Project. Seventeen processes were identified and hazards, Critical Control Points and associated preventive and corrective measures described. The Total Diet Study process was summarized in a flowchart divided into planning and practical (sample collection, preparation and analysis; risk assessment analysis and publication) phases. Standard Operating Procedures were developed and implemented in pilot studies in five organizations. The flowchart was used to develop a quality framework for Total Diet Studies that could be included in formal quality management systems. Pilot studies operated by four project partners were visited by project assessors who reviewed implementation of the proposed framework and identified areas that could be improved. The quality framework developed can be the starting point for any Total Diet Study centre and can be used within existing formal quality management approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Unsupervised Detection of Planetary Craters by a Marked Point Process

    NASA Technical Reports Server (NTRS)

    Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.

    2011-01-01

    With the launch of several planetary missions in the last decade, a large amount of planetary imagery is being acquired. Because of the huge volume of acquired data, automatic and robust processing techniques are preferable for data analysis. Here, the aim is to achieve a robust and general methodology for crater detection. A novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered as a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy and a likelihood term. The global minimum of this function is estimated by using reversible-jump Markov chain Monte Carlo (RJMCMC) dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: marked point processes represent a very promising current approach in stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with excellent robustness to noise. The proposed method for crater detection has several feasible applications. One such application area is image registration by matching the extracted features.
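
    In outline, the marked point process formulation treats the contour image as a configuration x of ellipses (points marked with centre, axis lengths and orientation) drawn from a Gibbs density; schematically (illustrative notation),

      p(x) \propto \exp\{ -U(x) \}, \qquad U(x) = U_{\mathrm{prior}}(x) + U_{\mathrm{data}}(x),

    where U_prior penalises, for example, strongly overlapping ellipses and U_data rewards agreement between each ellipse and the extracted contours; the minimiser of U is approximated with RJMCMC moves (birth, death, perturbation) inside a simulated annealing schedule.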

  7. Conceptualising and managing trade-offs in sustainability assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au; School of Environmental Science, Murdoch University; Pope, Jenny

    One of the defining characteristics of sustainability assessment as a form of impact assessment is that it provides a forum for the explicit consideration of the trade-offs that are inherent in complex decision-making processes. Few sustainability assessments have achieved this goal though, and none has considered trade-offs in a holistic fashion throughout the process. Recent contributions such as the Gibson trade-off rules have significantly progressed thinking in this area by suggesting appropriate acceptability criteria for evaluating substantive trade-offs arising from proposed development, as well as process rules for how evaluations of acceptability should occur. However, there has been negligible uptake of these rules in practice. Overall, we argue that there is inadequate consideration of trade-offs, both process and substantive, throughout the sustainability assessment process, and insufficient considerations of how process decisions and compromises influence substantive outcomes. This paper presents a framework for understanding and managing both process and substantive trade-offs within each step of a typical sustainability assessment process. The framework draws together previously published literature and offers case studies that illustrate aspects of the practical application of the framework. The framing and design of sustainability assessment are vitally important, as process compromises or trade-offs can have substantive consequences in terms of sustainability outcomes delivered, with the choice of alternatives considered being a particularly significant determinant of substantive outcomes. The demarcation of acceptable from unacceptable impacts is a key aspect of managing trade-offs. Offsets can be considered as a form of trade-off within a category of sustainability that are utilised to enhance preferred alternatives once conditions of impact acceptability have been met. In this way they may enable net gains to be delivered; another imperative for progress to sustainability. Understanding the nature and implications of trade-offs within sustainability assessment is essential to improving practice. - Highlights: • A framework for understanding trade-offs in sustainability assessment is presented. • Trade-offs should be considered as early as possible in any sustainability assessment process. • Demarcation of acceptable from unacceptable impacts is needed for effective trade-off management. • Offsets in place, time or kind can ensure and attain a net benefit outcome overall. • Gibson's trade-off rules provide useful acceptability criteria and process guidance.

  8. Public involvement in research: making sense of the diversity.

    PubMed

    Oliver, Sandy; Liabo, Kristin; Stewart, Ruth; Rees, Rebecca

    2015-01-01

    This paper presents a coherent framework for designing and evaluating public involvement in research by drawing on an extensive literature and the authors' experience. The framework consists of three key interrelated dimensions: the drivers for involvement; the processes for involvement and the impact of involvement. The pivotal point in this framework is the opportunity for researchers and others to exchange ideas. This opportunity results from the processes which bring them together and which support their debates and decisions. It is also the point at which research that is in the public interest is open to public influence and the point at which the interaction can also influence anyone directly involved. Judicious choice of methods for bringing people together, and supporting their debate and decisions, depends upon the drivers of those involved; these vary with their characteristics, particularly their degree of enthusiasm and experience, and their motivation. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  9. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    PubMed

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  10. Recent updates in developing a statistical pseudo-dynamic source-modeling framework to capture the variability of earthquake rupture scenarios

    NASA Astrophysics Data System (ADS)

    Song, Seok Goo; Kwak, Sangmin; Lee, Kyungbook; Park, Donghee

    2017-04-01

    Predicting the intensity and variability of strong ground motions is a critical element of seismic hazard assessment. The characteristics and variability of the earthquake rupture process may be a dominant factor in determining the intensity and variability of near-source strong ground motions. Song et al. (2014) demonstrated that the variability of earthquake rupture scenarios could be effectively quantified in the framework of 1-point and 2-point statistics of earthquake source parameters, constrained by rupture dynamics and past events. The developed pseudo-dynamic source modeling schemes were also validated against recorded ground motion data from past events and empirical ground motion prediction equations (GMPEs) on the broadband platform (BBP) developed by the Southern California Earthquake Center (SCEC). Recently we improved the computational efficiency of the developed pseudo-dynamic source-modeling scheme by adopting the nonparametric co-regionalization algorithm, originally introduced and applied in geostatistics. We also investigated the effect of the earthquake rupture process on near-source ground motion characteristics in the framework of 1-point and 2-point statistics, particularly focusing on the forward directivity region. Finally, we will discuss whether pseudo-dynamic source modeling can reproduce the variability (standard deviation) of empirical GMPEs, and the efficiency of 1-point and 2-point statistics in addressing the variability of ground motions.

  11. Growth Points in Linking Representations of Function: A Research-Based Framework

    ERIC Educational Resources Information Center

    Ronda, Erlina

    2015-01-01

    This paper describes five growth points in linking representations of function developed from a study of secondary school learners. Framed within the cognitivist perspective and process-object conception of function, the growth points were identified and described based on linear and quadratic function tasks learners can do and their strategies…

  12. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    NASA Astrophysics Data System (ADS)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
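
    As a rough illustration of the map-based ingestion idea (a sketch, not the authors' code), a Spark job can parallelise over file paths and let each executor run a single-threaded point cloud reader locally; the example below assumes PySpark with the laspy library available on every worker node and uses hypothetical file paths:

      # Minimal sketch: distribute LAS file reading across a Spark cluster.
      from pyspark.sql import SparkSession
      import laspy

      def read_las(path):
          """Read one LAS tile on a worker and emit (x, y, z) tuples."""
          las = laspy.read(path)  # single-threaded reader, one call per file
          return list(zip(las.x, las.y, las.z))

      if __name__ == "__main__":
          spark = SparkSession.builder.appName("las-ingestion").getOrCreate()
          sc = spark.sparkContext

          # Hypothetical input: one entry per LAS tile on shared or HDFS-mounted storage.
          las_paths = ["/data/tiles/tile_0001.las", "/data/tiles/tile_0002.las"]

          # One map task per file; flatMap flattens the per-file point lists into one RDD of points.
          points = sc.parallelize(las_paths, numSlices=len(las_paths)).flatMap(read_las)
          print("ingested points:", points.count())
          spark.stop()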

  13. A Conceptual Framework for Indoor Mapping by Using Grammars

    NASA Astrophysics Data System (ADS)

    Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.

    2017-09-01

    Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework to represent the layout principles of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of the generated maps and by dramatically reducing the volume of sensor data required by traditional reconstruction approaches. In addition, we present details of several core modules of the framework. An example using the proposed framework is given to show the generation process of a semantic map. This framework is part of ongoing research toward an approach for reconstructing semantic maps.

  14. Abstracted Workflow Framework with a Structure from Motion Application

    NASA Astrophysics Data System (ADS)

    Rossi, Adam J.

    In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural mindset initially is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In a majority of the cases, it turns out the software persists for many years, and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiencies and eliminate implementation redundancies in scientific fields. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features from the images are extracted and correspondences are established. This provides a sufficient amount of information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are established via triangulation algorithms. The parameters for the camera models for all views / images are solved through bundle adjustment, establishing a highly consistent point cloud. The initial sparse point cloud and camera matrices are used to generate a dense point cloud through patch based techniques or densification algorithms such as Semi-Global Matching (SGM). The point cloud can be visualized or exploited by both humans and automated techniques. In some cases the point cloud is "draped" with original imagery in order to enhance the 3D model for a human viewer. The SfM workflow can be implemented in the abstracted framework, making it easily leverageable and extensible by multiple users. Like many processes in scientific and engineering domains, the workflow described for SfM is complex and requires many disparate components to form a functional system, often utilizing algorithms implemented by many users in different languages / environments and without knowledge of how the component fits into the larger system. In practice, this generally leads to issues interfacing the components, building the software for desired platforms, understanding its concept of operations, and how it can be manipulated in order to fit the desired function for a particular application. In addition, other scientists and engineers instinctively wish to analyze the performance of the system, establish new algorithms, optimize existing processes, and establish new functionality based on current research. This requires a framework whereby new components can be easily plugged in without affecting the current implemented functionality. The need for a universal programming environment establishes the motivation for the development of the abstracted workflow framework. 
This software implementation, named Catena, provides base classes from which new components must derive in order to operate within the framework. The derivation mandates requirements be satisfied in order to provide a complete implementation. Additionally, the developer must provide documentation of the component in terms of its overall function and inputs. The interface input and output values corresponding to the component must be defined in terms of their respective data types, and the implementation uses mechanisms within the framework to retrieve and send the values. This process requires the developer to componentize their algorithm rather than implement it monolithically. Although the requirements of the developer are slightly greater, the benefits realized from using Catena far outweigh the overhead, and results in extensible software. This thesis provides a basis for the abstracted workflow framework concept and the Catena software implementation. The benefits are also illustrated using a detailed examination of the SfM process as an example application.
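
    As a compact illustration of the front end of the SfM workflow described above (feature extraction, correspondence, relative pose, triangulation), the sketch below uses OpenCV; it is a generic two-view example rather than Catena code, and the camera intrinsic matrix K is assumed to be known:

      # Two-view structure-from-motion sketch (illustrative; not Catena itself).
      import cv2
      import numpy as np

      def two_view_points(img1_path, img2_path, K):
          img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
          img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)

          # 1) Feature extraction and matching (correspondences between the two views).
          sift = cv2.SIFT_create()
          kp1, des1 = sift.detectAndCompute(img1, None)
          kp2, des2 = sift.detectAndCompute(img2, None)
          matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
          pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
          pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

          # 2) Relative pose from the essential matrix (RANSAC rejects bad matches).
          E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
          _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

          # 3) Triangulate inlier correspondences into a sparse 3D point cloud.
          P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
          P2 = K @ np.hstack([R, t])
          inliers = mask.ravel() > 0
          X = cv2.triangulatePoints(P1, P2, pts1[inliers].T, pts2[inliers].T)
          return (X[:3] / X[3]).T  # n x 3 points; bundle adjustment and densification would follow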

  15. Strategy making and power in environmental assessments. Lessons from the establishment of an out-of-town shopping centre in Vaesteras, Sweden

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isaksson, Karolina, E-mail: karolina.isaksson@vti.se; Storbjoerk, Sofie, E-mail: sofie.storbjork@liu.se

    This paper seeks to provide deeper insights into how EA ineffectiveness is produced in land use planning practice. This is explored in a study of local development planning in the city of Vaesteras, Sweden. The case in question is the development of a large out-of-town shopping centre, propelled by the establishment of a new IKEA furniture store. The Healey (2007) framework of planning as strategy making is applied as an analytical framework, together with a focus on power-knowledge relations. In the analysis, we identify a range of mechanisms that produced ineffectiveness by limiting the role of environmental knowledge throughout the planning process. The specific mechanisms we identified were related to the overall consensus perspective in local development strategies and plans, a lack of concretisation and integration of various policies and strategies, a range of exclusion mechanisms and an overall focus on mitigation and benefits of the process in question. In practice, these mechanisms were closely intertwined. Our main conclusion is, consequently, that increased effectiveness of EA would require fundamental transformation of the norms, frameworks and routines that implicitly and explicitly guide land use planning in practice. - Highlights: • We analyse how EA-ineffectiveness is produced in land use planning practice. • Several mechanisms produce EA-ineffectiveness throughout the whole planning process. • These mechanisms are often closely intertwined and mutually reinforcing each other. • Enhancing EA-effectiveness requires a fundamental shift of the norms, frameworks and routines shaping planning practice.

  16. Framework for managing mycotoxin risks in the food industry.

    PubMed

    Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie

    2014-12-01

    We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.

  17. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-06-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3-25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension.
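
    For context, point-process heartbeat models of this family assign each R-R interval a history-dependent inverse-Gaussian density; in generic notation (illustrative rather than the authors' exact parameterisation),

      f(t \mid H_{u_k}, \theta) = \sqrt{ \frac{\theta_0}{2\pi (t-u_k)^3} } \exp\left\{ - \frac{ \theta_0 \left[ (t-u_k) - \mu(H_{u_k},\theta) \right]^2 }{ 2\, \mu(H_{u_k},\theta)^2\, (t-u_k) } \right\}, \qquad t > u_k,

    where u_k is the time of the previous heartbeat, \theta_0 > 0 is a shape parameter, and the mean \mu(H_{u_k},\theta) is expanded over past interbeat intervals (here through Laguerre-expanded Wiener-Volterra terms), from which the instantaneous spectral, bispectral and complexity measures are derived.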

  18. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs

    PubMed Central

    Truccolo, Wilson

    2017-01-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a stability framework for data-driven PP-GLMs and shed new light on the stochastic dynamics of state-of-the-art statistical models of neuronal spiking activity. PMID:28234899
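
    The stability question can be stated compactly: under a mean-field quasi-renewal approximation, a stationary rate must solve a self-consistency equation of roughly the form (schematic notation, simplified relative to the paper's expressions)

      \lambda^{*} = F(\lambda^{*}), \qquad F(\lambda) = f\Big( b + \lambda \int_{0}^{\infty} k(s)\, ds \Big),

    for a nonlinear Hawkes PP-GLM with nonlinearity f, baseline b and spike-history kernel k; intersections of F with the identity line give the fixed points, and their number together with the slope of F at each intersection indicates whether the fitted model behaves stably, diverges, or is metastable ("fragile").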

  19. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs.

    PubMed

    Gerhard, Felipe; Deger, Moritz; Truccolo, Wilson

    2017-02-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a stability framework for data-driven PP-GLMs and shed new light on the stochastic dynamics of state-of-the-art statistical models of neuronal spiking activity.

  20. [On-line processing mechanisms in text comprehension: a theoretical review on constructing situation models].

    PubMed

    Iseki, Ryuta

    2004-12-01

    This article reviews research on the construction of situation models during reading. To position the variety of research appropriately within the overall comprehension process, a unitary framework was devised in terms of three theories of on-line processing: the resonance process, the event-indexing model, and constructionist theory. The resonance process was treated as the basic activation mechanism in the framework. The event-indexing model was regarded as a screening system that selects and encodes activated information into situation models along situational dimensions. Constructionist theory was considered to have a supervisory role based on coherence and explanation. From the viewpoint of this unitary framework, some problems concerning each theory were examined and possible interpretations were given. Finally, it is pointed out that there has been little theoretical argument on associative processing at the global level or on the encoding of text and inference information into long-term memory.

  1. Jupiter Europa Orbiter Architecture Definition Process

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Shishko, Robert

    2011-01-01

    The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.

  2. Dynamic optimization of chemical processes using ant colony framework.

    PubMed

    Rajesh, J; Gupta, K; Kusumakar, H S; Jayaraman, V K; Kulkarni, B D

    2001-11-01

    The ant colony framework is illustrated by considering the dynamic optimization of six important benchmark examples. This new computational tool is simple to implement and can tackle problems with state as well as terminal constraints in a straightforward fashion. It requires fewer grid points to reach the global optimum at relatively low computational effort. The examples analyzed here, with varying degrees of complexity, illustrate its potential for solving a large class of process optimization problems in chemical engineering.
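
    To make the ant colony idea concrete, the sketch below applies a simplified ant-colony-style search to a toy discretized control problem (illustrative only; neither the authors' algorithm nor their benchmarks): the horizon is split into stages, each ant picks one control level per stage with probabilities given by pheromone levels, and pheromone is reinforced along the best trajectory found so far.

      # Simplified ant-colony-style search for a discretized dynamic optimization toy problem.
      import numpy as np

      rng = np.random.default_rng(0)
      N_STAGES, N_LEVELS, N_ANTS, N_ITERS = 10, 11, 30, 60
      levels = np.linspace(0.0, 1.0, N_LEVELS)   # candidate control values per stage
      tau = np.ones((N_STAGES, N_LEVELS))        # pheromone trails
      dt = 1.0 / N_STAGES

      def objective(u_profile):
          """Integrate toy dynamics x1' = -u*x1, x2' = u*x1 - 0.1*x2; return x2(T)."""
          x1, x2 = 1.0, 0.0
          for u in u_profile:
              for _ in range(10):                # simple Euler sub-steps
                  dx1 = -u * x1
                  dx2 = u * x1 - 0.1 * x2
                  x1 += dt / 10 * dx1
                  x2 += dt / 10 * dx2
          return x2

      best_u, best_idx, best_J = None, None, -np.inf
      for _ in range(N_ITERS):
          for _ in range(N_ANTS):
              probs = tau / tau.sum(axis=1, keepdims=True)
              idx = [rng.choice(N_LEVELS, p=probs[s]) for s in range(N_STAGES)]
              u = levels[idx]
              J = objective(u)
              if J > best_J:
                  best_J, best_u, best_idx = J, u, idx
          tau *= 0.9                                    # evaporation
          tau[np.arange(N_STAGES), best_idx] += best_J  # reinforce the best trajectory

      print("best objective:", round(best_J, 4))
      print("best control profile:", np.round(best_u, 2))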

  3. A Unified Point Process Probabilistic Framework to Assess Heartbeat Dynamics and Autonomic Cardiovascular Control

    PubMed Central

    Chen, Zhe; Purdon, Patrick L.; Brown, Emery N.; Barbieri, Riccardo

    2012-01-01

    In recent years, time-varying inhomogeneous point process models have been introduced for assessment of instantaneous heartbeat dynamics as well as specific cardiovascular control mechanisms and hemodynamics. Assessment of the model’s statistics is established through the Wiener-Volterra theory and a multivariate autoregressive (AR) structure. A variety of instantaneous cardiovascular metrics, such as heart rate (HR), heart rate variability (HRV), respiratory sinus arrhythmia (RSA), and baroreceptor-cardiac reflex (baroreflex) sensitivity (BRS), are derived within a parametric framework and instantaneously updated with adaptive and local maximum likelihood estimation algorithms. Inclusion of second-order non-linearities, with subsequent bispectral quantification in the frequency domain, further allows for definition of instantaneous metrics of non-linearity. We here present a comprehensive review of the devised methods as applied to experimental recordings from healthy subjects during propofol anesthesia. Collective results reveal interesting dynamic trends across the different pharmacological interventions operated within each anesthesia session, confirming the ability of the algorithm to track important changes in cardiorespiratory elicited interactions, and pointing at our mathematical approach as a promising monitoring tool for an accurate, non-invasive assessment in clinical practice. We also discuss the limitations and other alternative modeling strategies of our point process approach. PMID:22375120

  4. Virtual reality for spherical images

    NASA Astrophysics Data System (ADS)

    Pilarczyk, Rafal; Skarbek, Władysław

    2017-08-01

    This paper presents a virtual reality application framework and application concept for mobile devices. The framework uses the Google Cardboard library for the Android operating system and allows a virtual reality 360 video player to be created using standard OpenGL ES rendering methods. It provides network methods for connecting to a web server that acts as the application resource provider; resources are delivered as JSON responses to HTTP requests. The web server also uses the Socket.IO library for synchronous communication between the application and the server. The framework implements methods to create an event-driven process for rendering additional content based on the video timestamp and the virtual reality head point of view.

  5. Process-Product Research: A Cornerstone in Educational Effectiveness Research

    ERIC Educational Resources Information Center

    Creemers, Bert; Kyriakides, Leonidas

    2015-01-01

    This article links the contribution of process-product studies in developing the theoretical framework of educational effectiveness by pointing out the importance of teacher behavior in the classroom. The role that Jere Brophy played in this evolving research is described within the various phases of teacher effectiveness research. Process-product…

  6. Towards Culturally Relevant Classroom Science: A Theoretical Framework Focusing on Traditional Plant Healing

    ERIC Educational Resources Information Center

    Mpofu, Vongai; Otulaja, Femi S.; Mushayikwa, Emmanuel

    2014-01-01

    A theoretical framework is an important component of a research study. It grounds the study and guides the methodological design. It also forms a reference point for the interpretation of the research findings. This paper conceptually examines the process of constructing a multi-focal theoretical lens for guiding studies that aim to accommodate…

  7. Evaluating the substantive effectiveness of SEA: Towards a better understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doren, D. van; Driessen, P.P.J., E-mail: p.driessen@uu.nl; Schijf, B.

    Evaluating the substantive effectiveness of strategic environmental assessment (SEA) is vital in order to know to what extent the tool fulfills its purposes and produces expected results. However, the studies that have evaluated the substantive effectiveness of SEA produce varying outcomes as regards the tool's contribution to decision-making and have used a variety of approaches to appraise its effectiveness. The aim of this article is to discuss the theoretical concept of SEA substantive effectiveness and to present a new approach that can be applied for evaluation studies. The SEA effectiveness evaluation framework that will be presented is composed of concepts of, and approaches to, SEA effectiveness derived from SEA literature and planning theory. Lessons for evaluation can be learned from planning theory in particular, given its long history of analyzing and understanding how sources of information and decisions affect (subsequent) decision-making. Key concepts of this new approach are 'conformance' and 'performance'. In addition, this article presents a systematic overview of process and context factors that can explain SEA effectiveness, derived from SEA literature. To illustrate the practical value of our framework for the assessment and understanding of substantive effectiveness of SEA, three Dutch SEA case studies are examined. The case studies have confirmed the usefulness of the SEA effectiveness assessment framework. The framework proved helpful in order to describe the cumulative influence of the three SEAs on decision-making and the ultimate plan. - Highlights: • A new framework to evaluate the substantive effectiveness of SEA is presented. • The framework is based on two key concepts: 'conformance' and 'performance.' • The practical applicability of the framework is demonstrated by three Dutch cases. • The framework allows for a more systematic understanding of SEA effectiveness. • Finally, this paper presents explanations for SEA effectiveness.

  8. Modeling treatment of ischemic heart disease with partially observable Markov decision processes.

    PubMed

    Hauskrecht, M; Fraser, H

    1998-01-01

    Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead they are very often dependent and interleaved over time, mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of Partially observable Markov decision processes (POMDPs) developed and used in operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In the paper, we show how the POMDP framework could be used to model and solve the problem of the management of patients with ischemic heart disease, and point out modeling advantages of the framework over standard decision formalisms.
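
    For readers unfamiliar with the formalism, a POMDP can be written as a tuple (S, A, O, T, Z, R, \gamma), and planning is done over beliefs b(s), i.e. probability distributions over the hidden state; after taking action a and observing o, the belief updates as (standard definition, generic notation)

      b'(s') = \frac{ Z(o \mid s', a) \sum_{s \in S} T(s' \mid s, a)\, b(s) }{ \Pr(o \mid b, a) }, \qquad \Pr(o \mid b, a) = \sum_{s'} Z(o \mid s', a) \sum_{s} T(s' \mid s, a)\, b(s),

    and a policy maps beliefs to actions so as to maximise the expected discounted reward. In a model of this kind the hidden states roughly correspond to the unobserved disease status, observations to the findings from investigative procedures, and actions to the available diagnostic and treatment choices.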

  9. Supercritical processing as a route to high internal surface areas and permanent microporosity in metal-organic framework materials.

    PubMed

    Nelson, Andrew P; Farha, Omar K; Mulfort, Karen L; Hupp, Joseph T

    2009-01-21

    Careful processing of four representative metal-organic framework (MOF) materials with liquid and supercritical carbon dioxide (ScD) leads to substantial, or in some cases spectacular (up to 1200%), increases in gas-accessible surface area. Maximization of surface area is key to the optimization of MOFs for many potential applications. Preliminary evidence points to inhibition of mesopore collapse, and therefore micropore accessibility, as the basis for the extraordinarily efficacious outcome of ScD-based activation.

  10. A framework for evaluating electronic health record vendor user-centered design and usability testing processes.

    PubMed

    Ratwani, Raj M; Zachary Hettinger, A; Kosydar, Allison; Fairbanks, Rollin J; Hodgkins, Michael L

    2017-04-01

    Currently, there are few resources for electronic health record (EHR) purchasers and end users to understand the usability processes employed by EHR vendors during product design and development. We developed a framework, based on human factors literature and industry standards, to systematically evaluate the user-centered design processes and usability testing methods used by EHR vendors. We reviewed current usability certification requirements and the human factors literature to develop a 15-point framework for evaluating EHR products. The framework is based on 3 dimensions: user-centered design process, summative testing methodology, and summative testing results. Two vendor usability reports were retrieved from the Office of the National Coordinator's Certified Health IT Product List and were evaluated using the framework. One vendor scored low on the framework (5 pts) while the other vendor scored high on the framework (15 pts). The 2 scored vendor reports demonstrate the framework's ability to discriminate between the variabilities in vendor processes and to determine which vendors are meeting best practices. The framework provides a method to more easily comprehend EHR vendors' usability processes and serves to highlight where EHR vendors may be falling short in terms of best practices. The framework provides a greater level of transparency for both purchasers and end users of EHRs. The framework highlights the need for clearer certification requirements and suggests that the authorized certification bodies that examine vendor usability reports may need to be provided with clearer guidance. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  11. Active point out-of-plane ultrasound calibration

    NASA Astrophysics Data System (ADS)

    Cheng, Alexis; Guo, Xiaoyu; Zhang, Haichong K.; Kang, Hyunjae; Etienne-Cummings, Ralph; Boctor, Emad M.

    2015-03-01

    Image-guided surgery systems are often used to provide surgeons with informational support. Due to several unique advantages such as ease of use, real-time image acquisition, and no ionizing radiation, ultrasound is a common intraoperative medical imaging modality used in image-guided surgery systems. To perform advanced forms of guidance with ultrasound, such as virtual image overlays or automated robotic actuation, an ultrasound calibration process must be performed. This process recovers the rigid body transformation between a tracked marker attached to the transducer and the ultrasound image. Point-based phantoms are considered to be accurate, but their calibration framework assumes that the point is in the image plane. In this work, we present the use of an active point phantom and a calibration framework that accounts for the elevational uncertainty of the point. Given the lateral and axial position of the point in the ultrasound image, we approximate a circle in the axial-elevational plane with a radius equal to the axial position. The standard approach transforms all of the imaged points to be a single physical point. In our approach, we minimize the distances between the circular subsets of each image, which ideally intersect at a single point. We ran simulations for noiseless and noisy cases, presenting results on out-of-plane estimation errors, calibration estimation errors, and point reconstruction precision. We also performed an experiment using a robot arm as the tracker, resulting in a point reconstruction precision of 0.64 mm.
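
    The elevational ambiguity handled here can be written down directly: a point detected at lateral position l_i and axial depth a_i in image i is known only up to a circle of radius a_i in the axial-elevational plane, so one possible formulation consistent with the description above maps each circle into the tracker frame and minimises the spread between circles,

      p_i(\phi) = T_i\, X \begin{bmatrix} l_i \\ a_i \cos\phi \\ a_i \sin\phi \\ 1 \end{bmatrix}, \qquad \hat{X} = \arg\min_{X} \sum_{i<j} \min_{\phi,\psi} \left\| p_i(\phi) - p_j(\psi) \right\|^2,

    where T_i is the tracked marker pose for image i and X is the unknown image-to-marker calibration transformation being solved for.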

  12. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    PubMed Central

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-01-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3–25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension. PMID:27357966

  13. PHOTOCHEMICAL SIMULATIONS OF POINT SOURCE EMISSIONS WITH THE MODELS-3 CMAQ PLUME-IN-GRID APPROACH

    EPA Science Inventory

    A plume-in-grid (PinG) approach has been designed to provide a realistic treatment for the simulation of the dynamic and chemical processes impacting pollutant species in major point-source plumes during a subgrid-scale phase within an Eulerian grid modeling framework. The PinG sci...

  14. Highway extraction from high resolution aerial photography using a geometric active contour model

    NASA Astrophysics Data System (ADS)

    Niu, Xutong

    Highway extraction and vehicle detection are two of the most important steps in traffic-flow analysis from multi-frame aerial photographs. The traditional method of deriving traffic flow trajectories relies on manual vehicle counting from a sequence of aerial photographs, which is tedious and time-consuming. This research presents a new framework for semi-automatic highway extraction. The basis of the new framework is an improved geometric active contour (GAC) model. This novel model seeks to minimize an objective function that transforms a problem of propagation of regular curves into an optimization problem. The implementation of curve propagation is based on level set theory. By using an implicit representation of a two-dimensional curve, a level set approach can be used to deal with topological changes naturally, and the output is unaffected by different initial positions of the curve. However, the original GAC model, on which the new model is based, only incorporates boundary information into the curve propagation process. An error-producing phenomenon called leakage is inevitable wherever there is an uncertain weak edge. In this research, region-based information is added as a constraint into the original GAC model, thereby giving the proposed method the ability to integrate both boundary and region-based information during curve propagation. Adding the region-based constraint eliminates the leakage problem. This dissertation applies the proposed augmented GAC model to the problem of highway extraction from high-resolution aerial photography. First, an optimized stopping criterion is designed and used in the implementation of the GAC model. It effectively saves processing time and computations. Second, a seed point propagation framework is designed and implemented. This framework incorporates highway extraction, tracking, and linking into one procedure. A seed point is usually placed at an end node of highway segments close to the boundary of the image or at a position where possible blocking may occur, such as at an overpass bridge or near vehicle crowds. These seed points can be automatically propagated throughout the entire highway network. During the process, road center points are also extracted, which introduces a search direction for solving possible blocking problems. This new framework has been successfully applied to highway network extraction from a large orthophoto mosaic. In the process, vehicles on the extracted highway network were detected with an 83% success rate.
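
    For reference, the curve propagation behind the geometric active contour model used here is typically implemented as a level-set evolution; with a region-based constraint added to the original edge-driven flow, a representative form is (generic notation; the dissertation's exact energy may differ)

      \frac{\partial \phi}{\partial t} = g(|\nabla I|)\, |\nabla \phi| \left( \operatorname{div}\frac{\nabla \phi}{|\nabla \phi|} + \nu \right) + \nabla g \cdot \nabla \phi + \alpha\, r(I)\, |\nabla \phi|,

    where \phi is the level-set function whose zero level set is the evolving curve, g(|\nabla I|) is an edge-stopping function that is small at strong image edges, \nu is a balloon force, and r(I) is a region term (for instance, contrasting inside and outside intensity statistics) whose role is to keep the curve from leaking through weak edges.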

  15. Modeling fixation locations using spatial point processes.

    PubMed

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they fixated. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular, we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.

  16. General methodology for nonlinear modeling of neural systems with Poisson point-process inputs.

    PubMed

    Marmarelis, V Z; Berger, T W

    2005-07-01

    This paper presents a general methodological framework for the practical modeling of neural systems with point-process inputs (sequences of action potentials or, more broadly, identical events) based on the Volterra and Wiener theories of functional expansions and system identification. The paper clarifies the distinctions between Volterra and Wiener kernels obtained from Poisson point-process inputs. It shows that only the Wiener kernels can be estimated via cross-correlation, but must be defined as zero along the diagonals. The Volterra kernels can be estimated far more accurately (and from shorter data-records) by use of the Laguerre expansion technique adapted to point-process inputs, and they are independent of the mean rate of stimulation (unlike their P-W counterparts that depend on it). The Volterra kernels can also be estimated for broadband point-process inputs that are not Poisson. Useful applications of this modeling approach include cases where we seek to determine (model) the transfer characteristics between one neuronal axon (a point-process 'input') and another axon (a point-process 'output') or some other measure of neuronal activity (a continuous 'output', such as population activity) with which a causal link exists.
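
    In outline, this modeling approach expands the output around the point-process input; a second-order discrete-time Volterra-type form is (generic notation)

      y(t) = k_0 + \sum_{\tau} k_1(\tau)\, x(t-\tau) + \sum_{\tau_1} \sum_{\tau_2} k_2(\tau_1, \tau_2)\, x(t-\tau_1)\, x(t-\tau_2),

    where x(t) is the 0/1 spike train (the point-process input), y(t) is the continuous or point-process output, and the kernels k_1, k_2 are estimated by expanding them on Laguerre basis functions and fitting the expansion coefficients, which is what allows accurate estimates from relatively short records; the zero-diagonal convention mentioned above applies to the Wiener kernels obtained by cross-correlation.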

  17. A Framework for Applying Point Clouds Grabbed by Multi-Beam LIDAR in Perceiving the Driving Environment

    PubMed Central

    Liu, Jian; Liang, Huawei; Wang, Zhiling; Chen, Xiangcheng

    2015-01-01

    The quick and accurate understanding of the ambient environment, which is composed of road curbs, vehicles, pedestrians, etc., is critical for developing intelligent vehicles. The road elements included in this work are road curbs and dynamic road obstacles that directly affect the drivable area. A framework for the online modeling of the driving environment using a multi-beam LIDAR, i.e., a Velodyne HDL-64E LIDAR, which describes the 3D environment in the form of a point cloud, is reported in this article. First, ground segmentation is performed via multi-feature extraction of the raw data grabbed by the Velodyne LIDAR to satisfy the requirement of online environment modeling. Curbs and dynamic road obstacles are detected and tracked in different manners. Curves are fitted for curb points, and points are clustered into bundles whose form and kinematics parameters are calculated. The Kalman filter is used to track dynamic obstacles, whereas the snake model is employed for curbs. Results indicate that the proposed framework is robust under various environments and satisfies the requirements for online processing. PMID:26404290
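
    As a small illustration of the obstacle-tracking step mentioned above (a generic constant-velocity Kalman filter, not the authors' implementation), the centroid of each clustered obstacle can be tracked as follows:

      # Constant-velocity Kalman filter for a clustered obstacle centroid (illustrative sketch).
      import numpy as np

      class CentroidTracker:
          def __init__(self, x0, y0, dt=0.1):
              # State: [x, y, vx, vy]; measurement: [x, y] centroid of the point cluster.
              self.x = np.array([x0, y0, 0.0, 0.0])
              self.P = np.eye(4)
              self.F = np.array([[1, 0, dt, 0],
                                 [0, 1, 0, dt],
                                 [0, 0, 1,  0],
                                 [0, 0, 0,  1]], dtype=float)
              self.H = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=float)
              self.Q = np.eye(4) * 0.01   # process noise (tuning assumption)
              self.R = np.eye(2) * 0.05   # measurement noise (tuning assumption)

          def predict(self):
              self.x = self.F @ self.x
              self.P = self.F @ self.P @ self.F.T + self.Q

          def update(self, z):
              y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
              S = self.H @ self.P @ self.H.T + self.R
              K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
              self.x = self.x + K @ y
              self.P = (np.eye(4) - K @ self.H) @ self.P
              return self.x

      # Usage: feed one centroid measurement per LIDAR frame.
      tracker = CentroidTracker(5.0, 2.0)
      for z in [(5.1, 2.1), (5.3, 2.2), (5.6, 2.3)]:
          tracker.predict()
          state = tracker.update(z)
      print("estimated position and velocity:", np.round(state, 2))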

  18. 3D Reconstruction of Space Objects from Multi-Views by a Visible Sensor

    PubMed Central

    Zhang, Haopeng; Wei, Quanmao; Jiang, Zhiguo

    2017-01-01

    In this paper, a novel 3D reconstruction framework is proposed to recover the 3D structural model of a space object from its multi-view images captured by a visible sensor. Given an image sequence, this framework first estimates the relative camera poses and recovers the depths of the surface points by the structure from motion (SFM) method, and then the patch-based multi-view stereo (PMVS) algorithm is utilized to generate a dense 3D point cloud. To resolve the wrong matches arising from the symmetric structure and repeated textures of space objects, a new strategy is introduced, in which images are added to SFM in imaging order. Meanwhile, a refining process exploiting the structural prior knowledge that most sub-components of artificial space objects are composed of basic geometric shapes is proposed and applied to the recovered point cloud. The proposed reconstruction framework is tested on both simulated image datasets and real image datasets. Experimental results illustrate that the recovered point cloud models of space objects are accurate and have a complete coverage of the surface. Moreover, outliers and points with severe noise are effectively filtered out by the refinement, resulting in a distinct improvement in the structure and visualization of the recovered points. PMID:28737675

  19. Development of Chemical Process Design and Control for ...

    EPA Pesticide Factsheets

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy. The implemented control strategy combines a biologically inspired method with optimal control concepts for finding more sustainable operating trajectories. The sustainability assessment of process operating points is carried out by using the U.S. E.P.A.’s Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator (GREENSCOPE) tool that provides scores for the selected indicators in the economic, material efficiency, environmental and energy areas. The indicator scores describe process performance on a sustainability measurement scale, effectively determining which operating point is more sustainable when multiple steady states exist for the manufacture of a specific product. Through comparisons between a representative benchmark and the optimal steady-states obtained through implementation of the proposed controller, a systematic decision can be made in terms of whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous fermentation process for fuel production, whose materi

  20. The Consolidated Framework for Implementation Research (CFIR): a useful theoretical framework for guiding and evaluating a guideline implementation process in a hospital-based nursing practice.

    PubMed

    Breimaier, Helga E; Heckemann, Birgit; Halfens, Ruud J G; Lohrmann, Christa

    2015-01-01

    Implementing clinical practice guidelines (CPGs) in healthcare settings is a complex intervention involving both independent and interdependent components. Although the Consolidated Framework for Implementation Research (CFIR) has never been evaluated in a practical context, it appeared to be a suitable theoretical framework to guide an implementation process. The aim of this study was to evaluate the comprehensiveness, applicability and usefulness of the CFIR in the implementation of a fall-prevention CPG in nursing practice to improve patient care in an Austrian university teaching hospital setting. The evaluation of the CFIR was based on (1) team-meeting minutes, (2) the main investigator's research diary, containing a record of a before-and-after, mixed-methods study design embedded in a participatory action research (PAR) approach for guideline implementation, and (3) an analysis of qualitative and quantitative data collected from graduate and assistant nurses in two Austrian university teaching hospital departments. The CFIR was used to organise data per and across time point(s) and assess their influence on the implementation process, resulting in implementation and service outcomes. Overall, the CFIR could be demonstrated to be a comprehensive framework for the implementation of a guideline into a hospital-based nursing practice. However, the CFIR did not account for some crucial factors during the planning phase of an implementation process, such as consideration of stakeholder aims and wishes/needs when implementing an innovation, pre-established measures related to the intended innovation and pre-established strategies for implementing an innovation. For the CFIR constructs reflecting & evaluating and engaging, a more specific definition is recommended. The framework and its supplements could easily be used by researchers, and their scope was appropriate for the complexity of a prospective CPG-implementation project. The CFIR facilitated qualitative data analysis and provided a structure that allowed project results to be organised and viewed in a broader context to explain the main findings. The CFIR was a valuable and helpful framework for (1) the assessment of the baseline, process and final state of the implementation process and influential factors, (2) the content analysis of qualitative data collected throughout the implementation process, and (3) explaining the main findings.

  1. Differentiated Technical Assistance for Sustainable Transformation. Technical Assistance Brief #2

    ERIC Educational Resources Information Center

    McCart, Amy; McSheehan, Michael; Sailor, Wayne

    2015-01-01

    Schoolwide Integrated Framework for Transformation (SWIFT) Center's technical assistance process supports states, districts, and schools as they become excellent and equitable teaching and learning environments for "all" students. Each school, with support from its district, begins this process from its own starting point and travels its…

  2. In Search of a Unified Model of Language Contact

    ERIC Educational Resources Information Center

    Winford, Donald

    2013-01-01

    Much previous research has pointed to the need for a unified framework for language contact phenomena -- one that would include social factors and motivations, structural factors and linguistic constraints, and psycholinguistic factors involved in processes of language processing and production. While Contact Linguistics has devoted a great deal…

  3. Safe driving and executive functions in healthy middle-aged drivers.

    PubMed

    León-Domínguez, Umberto; Solís-Marcos, Ignacio; Barrio-Álvarez, Elena; Barroso Y Martín, Juan Manuel; León-Carrión, José

    2017-01-01

    The introduction of the point system driver's license in several European countries could offer a valid framework for evaluating driving skills. This is the first study to use this framework to assess the functional integrity of executive functions in middle-aged drivers with full points, partial points or no points on their driver's license (N = 270). The purpose of this study is to find differences in executive functions that could be determinants in safe driving. Cognitive tests were used to assess attention processes, processing speed, planning, cognitive flexibility, and inhibitory control. Analyses of covariance (ANCOVAs) were used for group comparisons while adjusting for education level. The Bonferroni method was used for correcting for multiple comparisons. Overall, drivers with full points on their license showed better scores than the other two groups. In particular, significant differences were found in reaction times on Simple and Conditioned Attention tasks (both p-values < 0.001) and in the number of type-III errors on the Tower of Hanoi task (p = 0.026). Differences in reaction time on attention tasks could serve as neuropsychological markers for safe driving. Further analysis should be conducted in order to determine the behavioral impact of impaired executive functioning on driving ability.

  4. Building a Semantic Framework for eScience

    NASA Astrophysics Data System (ADS)

    Movva, S.; Ramachandran, R.; Maskey, M.; Li, X.

    2009-12-01

    The e-Science vision focuses on the use of advanced computing technologies to support scientists. Recent research efforts in this area have focused primarily on “enabling” use of infrastructure resources for both data and computational access, especially in Geosciences. One of the gaps in existing e-Science efforts has been the failure to incorporate stable semantic technologies within the design process itself. In this presentation, we describe our effort in designing a framework for e-Science built using Service Oriented Architecture. Our framework provides users with capabilities to create science workflows and mine distributed data. Our e-Science framework is being designed around a mass market tool to promote reusability across many projects. Semantics is an integral part of this framework and our design goal is to leverage the latest stable semantic technologies. The use of these stable semantic technologies will provide users of our framework with useful features such as: allowing search engines to find their content with RDFa tags; creating an RDF triple store for their content; creating RDF endpoints to share with others; and semantically mashing their content with other online content available as RDF endpoints.

  5. Update on CERN Search based on SharePoint 2013

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Fernandez, S.; Lossent, A.; Posada, I.; Silva, B.; Wagner, A.

    2017-10-01

    CERN’s enterprise Search solution “CERN Search” provides a central search solution for users and CERN service providers. A total of about 20 million public and protected documents from a wide range of document collections are indexed, including Indico, TWiki, Drupal, SharePoint, JACOW, E-group archives, EDMS, and CERN Web pages. In spring 2015, CERN Search was migrated to a new infrastructure based on SharePoint 2013. In the context of this upgrade, the document pre-processing and indexing process was redesigned and generalised. The new data feeding framework makes it possible to profit from new functionality and facilitates the long-term maintenance of the system.

  6. The role of advanced nursing in lung cancer: A framework based development.

    PubMed

    Serena, A; Castellani, P; Fucina, N; Griesser, A-C; Jeanmonod, J; Peters, S; Eicher, M

    2015-12-01

    Advanced Practice Lung Cancer Nurses (APLCN) are well-established in several countries but their role has yet to be established in Switzerland. Developing an innovative nursing role requires a structured approach to guide successful implementation and to meet the overarching goal of improved nursing sensitive patient outcomes. The "Participatory, Evidence-based, Patient-focused process, for guiding the development, implementation, and evaluation of advanced practice nursing" (PEPPA framework) is one approach that was developed in the context of the Canadian health system. The purpose of this article is to describe the development of an APLCN model at a Swiss Academic Medical Center as part of a specialized Thoracic Cancer Center and to evaluate the applicability of the PEPPA framework in this process. In order to develop and implement the APLCN role, we applied the first seven phases of the PEPPA framework. This article illustrates the applicability of the PEPPA framework for APLCN development. This framework allowed us to i) identify key components of an APLCN model responsive to lung cancer patients' health needs, ii) identify role facilitators and barriers, iii) implement the APLCN role and iv) design a feasibility study of this new role. The PEPPA framework provides a structured process for implementing novel Advanced Practice Nursing roles in a local context, particularly where such roles are in their infancy. Two key points in the process include assessing patients' health needs and involving key stakeholders. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. An Interpretation of the "Science--A Process Approach" Objectives in Terms of Existing Psychological Theory and Experimentation.

    ERIC Educational Resources Information Center

    Cole, Henry P.

    This paper examines the sequence and hierarchy of objectives in the American Association for the Advancement of Science (AAAS) "Science--A Process Approach" curriculum. The work of Piaget and Bruner forms a framework from which the learning objectives and tasks in the AAAS science curriculum are examined. The points of correspondence…

  8. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building-blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and interpreter's intuition pertaining to fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods that address specific sources of fault network uncertainty and complexities of fault modeling exist, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, specifically a marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov Chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone similar tectonics compared to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed methodology generates realistic fault network models conditioned to data and a conceptual model of the underlying tectonics.
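
    For intuition about the point process prior, a birth/death Metropolis-Hastings sampler for a plain (unmarked) Strauss process on the unit square is sketched below; the fault-geometry marks, the level-set representation of abutting relations, and the conditioning on observed fault surfaces are not reproduced, and all parameter values are hypothetical.

        import numpy as np

        def strauss_birth_death(n_iter, beta, gamma, r, rng=None):
            # Samples a Strauss point process with density proportional to
            # beta**n(x) * gamma**s(x), where s(x) counts point pairs closer
            # than r, on the unit square, via birth/death Metropolis-Hastings.
            rng = rng or np.random.default_rng(0)
            pts, area = [], 1.0
            def n_close(p, others):
                return sum(np.hypot(p[0] - q[0], p[1] - q[1]) < r for q in others)
            for _ in range(n_iter):
                if rng.random() < 0.5 or not pts:        # propose a birth
                    p = (rng.random(), rng.random())
                    ratio = beta * gamma ** n_close(p, pts) * area / (len(pts) + 1)
                    if rng.random() < min(1.0, ratio):
                        pts.append(p)
                else:                                    # propose a death
                    i = rng.integers(len(pts))
                    rest = pts[:i] + pts[i + 1:]
                    ratio = len(pts) / (area * beta * gamma ** n_close(pts[i], rest))
                    if rng.random() < min(1.0, ratio):
                        pts = rest
            return np.array(pts)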

  9. What is This Thing Called Sensemaking?: A Theoretical Framework for How Physics Students Resolve Inconsistencies in Understanding

    NASA Astrophysics Data System (ADS)

    Odden, Tor Ole B.

    Students often emerge from introductory physics courses with a feeling that the concepts they have learned do not make sense. In recent years, science education researchers have begun to attend to this type of problem by studying the ways in which students make sense of science concepts. However, although many researchers agree intuitively on what sensemaking looks like, the literature on sensemaking is both theoretically fragmented and provides few guidelines for how to encourage and support the process. In this dissertation, I address this challenge by proposing a theoretical framework to describe students' sensemaking processes. I base this framework both on the science education research literature on sensemaking and on a series of video-recorded cognitive, clinical interviews conducted with introductory physics students enrolled in a course on electricity and magnetism. Using the science education research literature on sensemaking as well as a cognitivist, dynamic network model of mind as a theoretical lens, I first propose a coherent definition of sensemaking. Then, using this definition I analyze the sensemaking processes of these introductory physics students during episodes when they work to articulate and resolve gaps or inconsistencies in their understanding. Based on the students' framing, gestures, and dialogue I argue that the process of sensemaking unfolds in a distinct way, which we can describe as an epistemic game in which students first build a framework of knowledge, then identify a gap or inconsistency in that framework, iteratively build an explanation to resolve the gap or inconsistency, and (sometimes) successfully resolve it. I further argue that their entry into the sensemaking frame is facilitated by a specific question, which is in turn motivated by a gap or inconsistency in knowledge that I call the vexation point. I also investigate the results of sensemaking, arguing that students may use the technique of conceptual blending to both "defragment" their knowledge and resolve their vexation points.

  10. Pole-Like Road Furniture Detection in Sparse and Unevenly Distributed Mobile Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Li, F.; Lehtomäki, M.; Oude Elberink, S.; Vosselman, G.; Puttonen, E.; Kukko, A.; Hyyppä, J.

    2018-05-01

    Pole-like road furniture detection has received much attention in recent years due to its traffic functionality. In this paper, we develop a framework to detect pole-like road furniture from sparse mobile laser scanning data. The framework is carried out in four steps. The unorganised point cloud is first partitioned. Then above-ground points are clustered and roughly classified after removing ground points. A slicing check in combination with cylinder masking is proposed to extract pole-like road furniture candidates. Pole-like road furniture is obtained after occlusion analysis in the last stage. The average completeness and correctness of pole-like road furniture in sparse and unevenly distributed mobile laser scanning data were above 0.83. This is comparable to the state of the art in the field of pole-like road furniture detection in mobile laser scanning data of good quality and is potentially of practical use in the processing of point clouds collected by autonomous driving platforms.

  11. The Fundamentals of Care Framework as a Point-of-Care Nursing Theory.

    PubMed

    Kitson, Alison L

    Nursing theories have attempted to shape the everyday practice of clinical nurses and patient care. However, many theories, because of their level of abstraction and distance from everyday caring activity, have failed to help nurses undertake the routine practical aspects of nursing care in a theoretically informed way. The purpose of the paper is to present a point-of-care theoretical framework, called the fundamentals of care (FOC) framework, which explains, guides, and potentially predicts the quality of care nurses provide to patients, their carers, and family members. The theoretical framework is presented: person-centered fundamental care (PCFC), the outcome for the patient and the nurse and the goal of the FOC framework, is achieved through the active management of the practice process, which involves the nurse and the patient working together to integrate three core dimensions: establishing the nurse-patient relationship, integrating the FOC into the patient's care plan, and ensuring that the setting or context where care is transacted and coordinated is conducive to achieving PCFC outcomes. Each dimension has multiple elements and subelements, which require unique assessment for each nurse-patient encounter. The FOC framework is presented along with two scenarios to demonstrate its usefulness. The dimensions, elements, and subelements are described, and next steps in the development are articulated.

  12. A Proposed Theoretical Model Using the Work of Thomas Kuhn, David Ausubel, and Mauritz Johnson as a Basis for Curriculum and Instruction Decisions in Science Education.

    ERIC Educational Resources Information Center

    Bowen, Barbara Lynn

    This study presents a holistic framework which can be used as a basis for decision-making at various points in the curriculum-instruction development process as described by Johnson in a work published in 1967. The proposed framework has conceptual bases in the work of Thomas S. Kuhn and David P. Ausubel and utilizes the work of several perceptual…

  13. Point defect reduction in MOCVD (Al)GaN by chemical potential control and a comprehensive model of C incorporation in GaN

    NASA Astrophysics Data System (ADS)

    Reddy, Pramod; Washiyama, Shun; Kaess, Felix; Kirste, Ronny; Mita, Seiji; Collazo, Ramon; Sitar, Zlatko

    2017-12-01

    A theoretical framework that provides a quantitative relationship between point defect formation energies and growth process parameters is presented. It enables systematic point defect reduction by chemical potential control in metalorganic chemical vapor deposition (MOCVD) of III-nitrides. Experimental corroboration is provided by a case study of C incorporation in GaN. The theoretical model is shown to be successful in providing quantitative predictions of CN defect incorporation in GaN as a function of growth parameters and provides valuable insights into boundary phases and other impurity chemical reactions. The metal supersaturation is found to be the primary factor in determining the chemical potential of III/N and consequently incorporation or formation of point defects which involves exchange of III or N atoms with the reservoir. The framework is general and may be extended to other defect systems in (Al)GaN. The utility of equilibrium formalism typically employed in density functional theory in predicting defect incorporation in non-equilibrium and high temperature MOCVD growth is confirmed. Furthermore, the proposed theoretical framework may be used to determine optimal growth conditions to achieve minimum compensation within any given constraints such as growth rate, crystal quality, and other practical system limitations.
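
    The chemical-potential dependence at the heart of such a framework enters through the standard first-principles expression for the formation energy of a point defect; the notation below follows the usual DFT defect formalism and is not necessarily the symbol set used in the paper.

        % Formation energy of defect X in charge state q: the chemical
        % potentials \mu_i are set by the growth conditions (e.g., metal
        % supersaturation), E_F is the Fermi level referenced to the
        % valence-band maximum E_VBM, and E_corr is a finite-size correction.
        E^{f}[X^{q}] = E_{\mathrm{tot}}[X^{q}] - E_{\mathrm{tot}}[\mathrm{bulk}]
                       - \sum_{i} n_{i}\,\mu_{i}
                       + q\,(E_{\mathrm{F}} + E_{\mathrm{VBM}})
                       + E_{\mathrm{corr}}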

  14. Two frameworks for integrating knowledge in induction

    NASA Technical Reports Server (NTRS)

    Rosenbloom, Paul S.; Hirsh, Haym; Cohen, William W.; Smith, Benjamin D.

    1994-01-01

    The use of knowledge in inductive learning is critical for improving the quality of the concept definitions generated, reducing the number of examples required in order to learn effective concept definitions, and reducing the computation needed to find good concept definitions. Relevant knowledge may come in many forms (such as examples, descriptions, advice, and constraints) and from many sources (such as books, teachers, databases, and scientific instruments). How to extract the relevant knowledge from this plethora of possibilities, and then to integrate it so as to appropriately affect the induction process, is perhaps the key issue at this point in inductive learning. Here the focus is on the integration part of this problem; that is, how induction algorithms can, and do, utilize a range of extracted knowledge. Preliminary work on a transformational framework for defining knowledge-intensive inductive algorithms out of relatively knowledge-free algorithms is described, as is a more tentative problem-space framework that attempts to cover all induction algorithms within a single general approach. These frameworks help to organize what is known about current knowledge-intensive induction algorithms, and to point towards new algorithms.

  15. Adsorption in a Fixed-Bed Column and Stability of the Antibiotic Oxytetracycline Supported on Zn(II)-[2-Methylimidazolate] Frameworks in Aqueous Media

    PubMed Central

    Anceski Bataglion, Giovana; Nogueira Eberlin, Marcos; Machado Ronconi, Célia

    2015-01-01

    A metal-organic framework, the Zn(II)-[2-methylimidazolate] framework (ZIF-8), was used as adsorbent material to remove different concentrations of the antibiotic oxytetracycline (OTC) in a fixed-bed column. The OTC was studied at concentrations of 10, 25 and 40 mg L⁻¹. At 40 mg L⁻¹, the breakthrough point was reached after approximately 10 minutes, while at 10 and 25 mg L⁻¹ this point was reached in about 30 minutes. The highest removal rate of 60% for the 10 mg L⁻¹ concentration was reached after 200 minutes. The highest adsorption capacity (28.3 mg g⁻¹) was attained for 25 mg L⁻¹ of OTC. After the adsorption process, a band shift was observed in the UV-Vis spectrum of the eluate. Additional studies were carried out to determine the cause of this band shift, involving a mass spectrometry (MS) analysis of the supernatant liquid during the process. This investigation revealed that the main route of adsorption consisted of the coordination of OTC with the metallic zinc centers of ZIF-8. The materials were characterized by thermal analysis (TA), scanning electron microscopy (SEM), powder X-ray diffraction (XRD), and infrared spectroscopy (IR) before and after adsorption, confirming the presence of OTC in the ZIF-8 and the latter’s structural stability after the adsorption process. PMID:26057121

  16. Real-time blind image deconvolution based on coordinated framework of FPGA and DSP

    NASA Astrophysics Data System (ADS)

    Wang, Ze; Li, Hang; Zhou, Hua; Liu, Hongjun

    2015-10-01

    Image restoration plays a crucial role in several important application domains. As computational requirements grow and the algorithms become much more complex, there has been a significant rise in the need for accelerated implementations. In this paper, we focus on an efficient real-time image processing system for a blind iterative deconvolution method based on the Richardson-Lucy (R-L) algorithm. We study the characteristics of the algorithm, and an image restoration processing system based on the coordinated framework of FPGA and DSP (CoFD) is presented. Single-precision floating-point processing units with a small-scale cascade and special FFT/IFFT processing modules are adopted to guarantee the accuracy of the processing. Finally, comparative experiments are carried out. The system could process a blurred image of 128×128 pixels within 32 milliseconds, and is up to three or four times faster than traditional multi-DSP systems.
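
    A plain NumPy sketch of the (non-blind) Richardson-Lucy update that sits at the core of such a pipeline is shown below; the FFT-based circular convolution, the known PSF padded to the image size, and the iteration count are simplifying assumptions rather than the paper's FPGA/DSP implementation.

        import numpy as np

        def fft_convolve(a, b):
            # Circular 2-D convolution via FFT; a and b must have equal shapes.
            return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

        def richardson_lucy(blurred, psf, n_iter=30, eps=1e-12):
            # Iteratively refines the estimate so that its blur matches the data.
            est = np.full(blurred.shape, blurred.mean(), dtype=float)
            psf_mirror = psf[::-1, ::-1]
            for _ in range(n_iter):
                ratio = blurred / (fft_convolve(est, psf) + eps)
                est = est * fft_convolve(ratio, psf_mirror)
            return est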

  17. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    PubMed

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process' non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
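
    Under a piecewise-constant rate template, the generalized likelihood ratio statistic reduces to a comparison of per-cell and pooled event rates; a minimal sketch is given below (the cell edges stand in for a hypothetical template, and the paper's multiple-testing scheme and dynamic-programming search over templates are not reproduced).

        import numpy as np

        def glr_piecewise_rate(event_times, edges):
            # Log generalized likelihood ratio of a piecewise-constant Poisson
            # rate (one rate per cell of `edges`) against a single constant rate.
            edges = np.asarray(edges, dtype=float)
            counts, _ = np.histogram(np.asarray(event_times), bins=edges)
            durations = np.diff(edges)
            total_n = counts.sum()
            total_t = edges[-1] - edges[0]
            cell_ll = np.where(counts > 0,
                               counts * np.log(np.maximum(counts, 1) / durations),
                               0.0)
            null_ll = total_n * np.log(total_n / total_t) if total_n > 0 else 0.0
            return cell_ll.sum() - null_ll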

  18. An information extraction framework for cohort identification using electronic health records.

    PubMed

    Liu, Hongfang; Bielinski, Suzette J; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B; Jonnalagadda, Siddhartha R; Ravikumar, K E; Wu, Stephen T; Kullo, Iftikhar J; Chute, Christopher G

    2013-01-01

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report an IE framework for cohort identification using EHRs that is a knowledge-driven framework developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework.

  19. A cognitive information processing framework for distributed sensor networks

    NASA Astrophysics Data System (ADS)

    Wang, Feiyi; Qi, Hairong

    2004-09-01

    In this paper, we present a cognitive agent framework (CAF) based on swarm intelligence and self-organization principles, and demonstrate it through collaborative processing for target classification in sensor networks. The framework involves integrated designs to provide both cognitive behavior at the organization level to conquer complexity and reactive behavior at the individual agent level to retain simplicity. The design tackles various problems in current information processing systems, including excessive complexity, maintenance difficulties, increasing vulnerability to attack, lack of capability to tolerate faults, and inability to identify and cope with low-frequency patterns. An important point that distinguishes the presented work from classical AI research is that the acquired intelligence does not pertain to distinct individuals but to groups. It also deviates from multi-agent systems (MAS) due to the sheer quantity of extremely simple agents we are able to accommodate, to the degree that the loss of some coordination messages and the behavior of faulty/compromised agents will not affect the collective decision made by the group.

  20. Interpersonal emotion regulation.

    PubMed

    Zaki, Jamil; Williams, W Craig

    2013-10-01

    Contemporary emotion regulation research emphasizes intrapersonal processes such as cognitive reappraisal and expressive suppression, but people experiencing affect commonly choose not to go it alone. Instead, individuals often turn to others for help in shaping their affective lives. How and under what circumstances does such interpersonal regulation modulate emotional experience? Although scientists have examined allied phenomena such as social sharing, empathy, social support, and prosocial behavior for decades, there have been surprisingly few attempts to integrate these data into a single conceptual framework of interpersonal regulation. Here we propose such a framework. We first map a "space" differentiating classes of interpersonal regulation according to whether an individual uses an interpersonal regulatory episode to alter their own or another person's emotion. We then identify 2 types of processes--response-dependent and response-independent--that could support interpersonal regulation. This framework classifies an array of processes through which interpersonal contact fulfills regulatory goals. More broadly, it organizes diffuse, heretofore independent data on "pieces" of interpersonal regulation, and identifies growth points for this young and exciting research domain.

  1. Climbing the ladder: capability maturity model integration level 3

    NASA Astrophysics Data System (ADS)

    Day, Bryce; Lutteroth, Christof

    2011-02-01

    This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a capability maturity model integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.

  2. Stressors and Turning Points in High School and Dropout: A Stress Process, Life Course Framework

    ERIC Educational Resources Information Center

    Dupéré, Véronique; Leventhal, Tama; Dion, Eric; Crosnoe, Robert; Archambault, Isabelle; Janosz, Michel

    2015-01-01

    High school dropout is commonly seen as the result of a long-term process of failure and disengagement. As useful as it is, this view has obscured the heterogeneity of pathways leading to dropout. Research suggests, for instance, that some students leave school not as a result of protracted difficulties but in response to situations that emerge…

  3. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    NASA Astrophysics Data System (ADS)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest neighbour algorithm for a hemispherical and a triangular wave point cloud.
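
    A minimal Morton-encoding sketch reproducing the worked example from the abstract (the bit count per dimension is a parameter; the actual Spark partitioner and the k-nearest-neighbour experiments are not reproduced):

        def morton_code(x, y, bits):
            # Interleave the binary digits of the grid coordinates, placing the
            # y bits in the higher positions, to form the Z-order / Morton code.
            code = 0
            for i in range(bits):
                code |= ((x >> i) & 1) << (2 * i)
                code |= ((y >> i) & 1) << (2 * i + 1)
            return code

        # Example from the abstract: x = 1 (binary 01) and y = 3 (binary 11)
        # with 2 bits per dimension interleave to binary 1011, i.e. code 11.
        assert morton_code(1, 3, bits=2) == 11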

  4. Heat-Passing Framework for Robust Interpretation of Data in Networks

    PubMed Central

    Fang, Yi; Sun, Mengtian; Ramani, Karthik

    2015-01-01

    Researchers are regularly interested in interpreting the multipartite structure of data entities according to their functional relationships. Data is often heterogeneous with intricately hidden inner structure. With limited prior knowledge, researchers are likely to confront the problem of transforming this data into knowledge. We develop a new framework, called heat-passing, which exploits intrinsic similarity relationships within noisy and incomplete raw data, and constructs a meaningful map of the data. The proposed framework is able to rank, cluster, and visualize the data all at once. The novelty of this framework is derived from an analogy between the process of data interpretation and that of heat transfer, in which all data points contribute simultaneously and globally to reveal intrinsic similarities between regions of data, meaningful coordinates for embedding the data, and exemplar data points that lie at optimal positions for heat transfer. We demonstrate the effectiveness of the heat-passing framework for robustly partitioning the complex networks, analyzing the globin family of proteins and determining conformational states of macromolecules in the presence of high levels of noise. The results indicate that the methodology is able to reveal functionally consistent relationships in a robust fashion with no reference to prior knowledge. The heat-passing framework is very general and has the potential for applications to a broad range of research fields, for example, biological networks, social networks and semantic analysis of documents. PMID:25668316

  5. Modeling and Advanced Control for Sustainable Process ...

    EPA Pesticide Factsheets

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  6. Equivalent formulations of “the equation of life”

    NASA Astrophysics Data System (ADS)

    Ao, Ping

    2014-07-01

    Motivated by progress in theoretical biology a recent proposal on a general and quantitative dynamical framework for nonequilibrium processes and dynamics of complex systems is briefly reviewed. It is nothing but the evolutionary process discovered by Charles Darwin and Alfred Wallace. Such general and structured dynamics may be tentatively named “the equation of life”. Three equivalent formulations are discussed, and it is also pointed out that such a quantitative dynamical framework leads naturally to the powerful Boltzmann-Gibbs distribution and the second law in physics. In this way, the equation of life provides a logically consistent foundation for thermodynamics. This view clarifies a particular outstanding problem and further suggests a unifying principle for physics and biology.

  7. The community ecology of pathogens: coinfection, coexistence and community composition.

    PubMed

    Seabloom, Eric W; Borer, Elizabeth T; Gross, Kevin; Kendig, Amy E; Lacroix, Christelle; Mitchell, Charles E; Mordecai, Erin A; Power, Alison G

    2015-04-01

    Disease and community ecology share conceptual and theoretical lineages, and there has been a resurgence of interest in strengthening links between these fields. Building on recent syntheses focused on the effects of host community composition on single pathogen systems, we examine pathogen (microparasite) communities using a stochastic metacommunity model as a starting point to bridge community and disease ecology perspectives. Such models incorporate the effects of core community processes, such as ecological drift, selection and dispersal, but have not been extended to incorporate host-pathogen interactions, such as immunosuppression or synergistic mortality, that are central to disease ecology. We use a two-pathogen susceptible-infected (SI) model to fill these gaps in the metacommunity approach; however, SI models can be intractable for examining species-diverse, spatially structured systems. By placing disease into a framework developed for community ecology, our synthesis highlights areas ripe for progress, including a theoretical framework that incorporates host dynamics, spatial structuring and evolutionary processes, as well as the data needed to test the predictions of such a model. Our synthesis points the way for this framework and demonstrates that a deeper understanding of pathogen community dynamics will emerge from approaches working at the interface of disease and community ecology. © 2015 John Wiley & Sons Ltd/CNRS.

  8. Creating an outcomes framework.

    PubMed

    Doerge, J B

    2000-01-01

    Four constructs used to build a framework for outcomes management for a large midwestern tertiary hospital are described in this article. A system framework outlining a model of clinical integration and population management based on Steven Shortell's work is discussed. This framework includes key definitions of high-risk patients, target groups, populations and community. Roles for each level of population management and how they were implemented in the health care system are described. A point of service framework centered on seven dimensions of care is the next construct applied on each nursing unit. The third construct outlines the framework for role development. Three roles for nursing were created to implement strategies for target groups that are strategic disease categories; two of those roles are described in depth. The philosophy of nursing practice is centered on caring and existential advocacy. The final construct is the modification of the Dartmouth model as a common framework for outcomes. System applications of the scorecard and lessons learned in the 2-year process of implementation are shared.

  9. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
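
    A bare-bones Gaussian process regressor of the kind underlying such motion synthesis is sketched below; the RBF kernel, its hyper-parameters, and the single output dimension are illustrative assumptions and not the model used in the paper.

        import numpy as np

        def gp_predict(X_train, y_train, X_test, length=1.0, sigma_f=1.0, sigma_n=0.1):
            # Gaussian process regression with an RBF kernel: returns the posterior
            # mean and covariance of y at X_test, given training pairs (X_train, y_train).
            # X_train and X_test are 2-D arrays of shape (n_samples, n_features).
            def rbf(A, B):
                d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
                return sigma_f ** 2 * np.exp(-0.5 * d2 / length ** 2)
            K = rbf(X_train, X_train) + sigma_n ** 2 * np.eye(len(X_train))
            K_s = rbf(X_test, X_train)
            mean = K_s @ np.linalg.solve(K, y_train)
            cov = rbf(X_test, X_test) - K_s @ np.linalg.solve(K, K_s.T)
            return mean, cov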

  10. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184

  11. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations which restrict future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradient and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo. PMID:24501592
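
    The container-based composition of PDE terms can be illustrated with a short sketch (Python here, purely illustrative; the actual ITK v4 implementation is in C++ and far richer): the level-set update is simply the weighted sum of whatever term objects are currently registered, and terms can be added or removed at any point in the evolution.

        import numpy as np

        class LevelSetEquation:
            # A generic level-set PDE assembled as a sum of interchangeable terms.
            def __init__(self):
                self.terms = {}
            def add_term(self, name, weight, fn):
                self.terms[name] = (weight, fn)   # fn(phi) -> update contribution
            def remove_term(self, name):
                self.terms.pop(name, None)
            def update(self, phi):
                return sum(w * fn(phi) for w, fn in self.terms.values())

        # Usage sketch: a constant propagation term plus a Laplacian smoothing term.
        eq = LevelSetEquation()
        eq.add_term("propagation", 1.0, lambda phi: np.ones_like(phi))
        eq.add_term("smoothing", 0.2,
                    lambda phi: np.gradient(np.gradient(phi, axis=0), axis=0)
                              + np.gradient(np.gradient(phi, axis=1), axis=1))
        phi = np.random.rand(64, 64)
        phi_next = phi + 0.1 * eq.update(phi)     # one explicit Euler step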

  12. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs.

    PubMed

    Mosaliganti, Kishore R; Gelas, Arnaud; Megason, Sean G

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations which restrict future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradient and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo.

  13. Self-organizing change? On drivers, causes and global environmental change

    NASA Astrophysics Data System (ADS)

    von Elverfeldt, Kirsten; Embleton-Hamann, Christine; Slaymaker, Olav

    2016-01-01

    Within global environmental change research, certain external drivers generally are assumed to cause the environmental system to change. The most commonly considered drivers are relief, sea level, hydroclimate, and/or people. However, complexity theory and self-organizing systems provide a very different framework and means of explanation. Self-organization - understood as the aggregate processes internal to an environmental system that lead to a distinctive spatial, temporal, or other organization - reduces the possibility of implicating a specific process as being causal. The principle of equifinality, whereby two or more different drivers can generate the same form, has long been recognized within a process-response framework, as well as the concept of divergence, which states that similar causes or processes result in different effects. Both ideas differ from self-organization in that they (i) deal with drivers external to the system and (ii) imply concrete cause-and-effect relations that might be difficult to discern. The assumption is, however, that careful study will eventually lead to the true causes and processes. Studies of self-organization deal with the ways in which internal processes interact and may drive a system toward an instability threshold, the so-called bifurcation point. At this point, the system develops by chance and no single external or internal cause for the change can be defined. For research into environmental change this is a crucial theory for two reasons:

  14. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial scale Stateflow models.

  15. Alternatives Assessment Frameworks: Research Needs for the Informed Substitution of Hazardous Chemicals

    PubMed Central

    Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally

    2015-01-01

    Background Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision–analysis components. Methods for addressing data gaps remain an issue. Discussion Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778

  16. An Information Extraction Framework for Cohort Identification Using Electronic Health Records

    PubMed Central

    Liu, Hongfang; Bielinski, Suzette J.; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B.; Jonnalagadda, Siddhartha R.; Ravikumar, K.E.; Wu, Stephen T.; Kullo, Iftikhar J.; Chute, Christopher G

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report an IE framework for cohort identification using EHRs that is a knowledge-driven framework developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework. PMID:24303255

  17. Strengthening ecological mindfulness through hybrid learning in vital coalitions

    NASA Astrophysics Data System (ADS)

    Sol, Jifke; Wals, Arjen E. J.

    2015-03-01

    In this contribution, a key policy 'tool' used in the Dutch Environmental Education and Learning for Sustainability Policy framework is introduced as a means to develop a sense of place and associated ecological mindfulness. The key elements of this tool, called the vital coalition, are described, and an example of its use in practice is analysed using a form of reflexive monitoring and evaluation. The example focuses on a multi-stakeholder learning process around the transformation of a somewhat sterile pre-school playground into an intergenerational green place suitable for play, discovery and engagement. Our analysis of the policy framework and the case leads us to point out the importance of critical interventions at so-called tipping points within the transformation process, and to discuss the potential of hybrid learning in vital coalitions for strengthening ecological mindfulness. This paper does not focus on establishing an evidence base for the causality between this type of learning and a change in behavior or mindfulness among participants as a result of contributing to a vital coalition, but rather focuses on the conditions, processes and interventions that allow such learning to take place in the first place.

  18. Exploiting spatio-temporal characteristics of human vision for mobile video applications

    NASA Astrophysics Data System (ADS)

    Jillani, Rashad; Kalva, Hari

    2008-08-01

    Video applications on handheld devices such as smart phones pose a significant challenge to achieving a high-quality user experience. Recent advances in processor and wireless networking technology are producing a new class of multimedia applications (e.g. video streaming) for mobile handheld devices. These devices are lightweight and modestly sized, and therefore have very limited resources - lower processing power, smaller display resolution, less memory, and shorter battery life compared to desktop and laptop systems. Multimedia applications, on the other hand, have extensive processing requirements, which makes mobile devices extremely resource hungry. In addition, device-specific properties (e.g. the display screen) significantly influence the human perception of multimedia quality. In this paper we propose a saliency-based framework that exploits the structure in content creation as well as the human vision system to find the salient points in the incoming bitstream and adapt it to the target device, thus improving the quality of the adapted area around salient points. Our experimental results indicate that an adaptation process that is cognizant of video content and user preferences can produce video of better perceptual quality for mobile devices. Furthermore, we demonstrate how such a framework can affect user experience on a handheld device.

  19. [Formulation of technical specification for national survey of Chinese materia medica resources].

    PubMed

    Guo, Lan-Ping; Lu, Jian-Wei; Zhang, Xiao-Bo; Zhao, Run-Huai; Zhang, Ben-Gang; Sun, Li-Ying; Huang, Lu-Qi

    2013-04-01

    Based on the design process of the technical specification (TS) for the fourth national survey of Chinese materia medica resources (CMMR), we analyzed the assignment and objectives of the national survey and pointed out that differences in CMMR management across China, the distribution of CMMR and their habitats, regional economic and technological levels, and even the enthusiasm and initiative of the staff are the most difficult points for TS design. We therefore adopted the principle of combining mandatory requirements with flexibility in the TS design. We first fixed the key points that would affect the quality of the national survey, then proposed a TS framework comprising 3 parts on organization and 11 parts on the survey technique itself. The framework will guide the preparation of the TS, which will not only provide an operational standard for the national survey but will also have a profound influence on the popularization and application of CMMR survey technology.

  20. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as on the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific object features that are accessible to numerical models. We present an approach that brings the human expert's knowledge about the scene, the objects inside it, their representation in the data, and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand the possibilities and limitations of algorithms and to take these into account within the processing chain. This not only assists researchers in defining optimal processing steps, but also provides suggestions when changes or new details emerge from the point cloud. Our approach benefits from advances in knowledge technologies within the Semantic Web framework, which have provided a strong base for knowledge-management applications. In the article we present the knowledge technologies used in our approach: the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D processing and topological built-ins, which together combine geometrical analysis of 3D point clouds with specialists' knowledge of the scene and of algorithmic processing.
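
    A minimal sketch of the knowledge-driven idea, with hypothetical rule and feature names (the paper's actual OWL/SWRL vocabulary is not reproduced here): geometric descriptors extracted from point-cloud segments are tested against expert rules that propose object labels.

      # Hypothetical illustration of knowledge-driven labeling of point-cloud segments.
      # Each segment carries geometric descriptors produced by earlier processing steps.
      from dataclasses import dataclass

      @dataclass
      class Segment:
          planarity: float      # 0..1, how planar the fitted surface is
          normal_z: float       # |z-component| of the mean surface normal
          area_m2: float        # surface area of the segment
          height_m: float       # mean height above the local ground estimate

      def expert_rules(seg: Segment) -> str:
          """Toy stand-in for SWRL-style rules encoding scene knowledge."""
          if seg.planarity > 0.9 and seg.normal_z > 0.95 and seg.height_m < 0.2:
              return "floor"
          if seg.planarity > 0.9 and seg.normal_z < 0.2 and seg.area_m2 > 2.0:
              return "wall"
          return "unknown"   # trigger further algorithms or human review

      print(expert_rules(Segment(planarity=0.97, normal_z=0.99, area_m2=12.0, height_m=0.05)))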

  1. Midwife-physician collaboration: a conceptual framework for interprofessional collaborative practice.

    PubMed

    Smith, Denise Colter

    2015-01-01

    Since the passage of the Affordable Care Act, collaborative practice has been cited as one method of increasing access to care, decreasing costs, and improving efficiency. How and under what conditions might these goals be achieved? Midwives and physicians have built effective collaborative practice models over a period of 30 years. Empirical study of interprofessional collaboration between midwives and physicians could be useful in guiding professional education, regulation, and health policy in women's health and maternity care. Construction of a conceptual framework for interprofessional collaboration between midwives and physicians was guided by a review of the literature. A theory derivation strategy was used to define dimensions, concepts, and statements of the framework. Midwife-physician interprofessional collaboration can be defined by 4 dimensions (organizational, procedural, relational, and contextual) and 12 concepts (trust, shared power, synergy, commitment, and respect, among others). The constructed framework provides the foundation for further empirical study of the interprofessional collaborative process. The experiences of midwife-physician collaborations provide solid support for a conceptual framework of the collaborative process. A conceptual framework provides a point from which further research can increase knowledge and understanding about how successful outcomes are achieved in collaborative health care practices. Construction of a measurement scale and validation of the model are important next steps. © 2014 by the American College of Nurse-Midwives.

  2. A Framework to Guide the Assessment of Human-Machine Systems.

    PubMed

    Stowers, Kimberly; Oglesby, James; Sonesh, Shirley; Leyva, Kevin; Iwig, Chelsea; Salas, Eduardo

    2017-03-01

    We have developed a framework for guiding measurement in human-machine systems. The assessment of safety and performance in human-machine systems often relies on direct measurement, such as tracking reaction time and accidents. However, safety and performance emerge from the combination of several variables. The assessment of precursors to safety and performance are thus an important part of predicting and improving outcomes in human-machine systems. As part of an in-depth literature analysis involving peer-reviewed, empirical articles, we located and classified variables important to human-machine systems, giving a snapshot of the state of science on human-machine system safety and performance. Using this information, we created a framework of safety and performance in human-machine systems. This framework details several inputs and processes that collectively influence safety and performance. Inputs are divided according to human, machine, and environmental inputs. Processes are divided into attitudes, behaviors, and cognitive variables. Each class of inputs influences the processes and, subsequently, outcomes that emerge in human-machine systems. This framework offers a useful starting point for understanding the current state of the science and measuring many of the complex variables relating to safety and performance in human-machine systems. This framework can be applied to the design, development, and implementation of automated machines in spaceflight, military, and health care settings. We present a hypothetical example in our write-up of how it can be used to aid in project success.

  3. Large-scale urban point cloud labeling and reconstruction

    NASA Astrophysics Data System (ADS)

    Zhang, Liqiang; Li, Zhuqiang; Li, Anjian; Liu, Fangyu

    2018-04-01

    The large number of object categories and the many overlapping or closely neighboring objects in large-scale urban scenes pose great challenges for point cloud classification. In this paper, a novel framework is proposed for the classification and reconstruction of airborne laser scanning point cloud data. To label point clouds, we present a neural network named ReLu-NN in which rectified linear units (ReLu) are used as the activation function instead of the traditional sigmoid in order to speed up convergence. Since the point cloud features are sparse, we reduce the number of active neurons with dropout to avoid over-fitting during training. The set of feature descriptors for each 3D point is encoded through self-taught learning and forms a discriminative feature representation, which is taken as the input of the ReLu-NN. The segmented building points are consolidated through an edge-aware point set resampling algorithm, and then reconstructed into 3D lightweight models using the 2.5D contouring method (Zhou and Neumann, 2010). Compared with deep learning approaches, the introduced ReLu-NN can easily classify unorganized point clouds without rasterizing the data, and it does not need a large number of training samples. Most of the parameters in the network are learned, and thus the intensive parameter tuning cost is significantly reduced. Experimental results on various datasets demonstrate that the proposed framework achieves better performance than other related algorithms in terms of classification accuracy and reconstruction quality.
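
    A minimal sketch, on hypothetical per-point feature vectors and labels, of the kind of small ReLU network with dropout described above (a generic stand-in, not the authors' ReLu-NN implementation):

      # Generic illustration: a small ReLU network with dropout for per-point classification.
      # X (n_points x n_features) and y (class ids) stand in for self-taught-learning features.
      import numpy as np

      rng = np.random.default_rng(0)
      n, d, h, c = 1000, 32, 64, 5                      # points, features, hidden units, classes
      X = rng.normal(size=(n, d)); y = rng.integers(0, c, size=n)

      W1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)
      W2 = rng.normal(scale=0.1, size=(h, c)); b2 = np.zeros(c)
      lr, p_drop = 0.1, 0.5

      for _ in range(200):                              # plain gradient descent on the (toy) batch
          a = np.maximum(X @ W1 + b1, 0.0)              # ReLU activation
          mask = (rng.random(a.shape) > p_drop) / (1 - p_drop)
          a_drop = a * mask                             # dropout on hidden units
          logits = a_drop @ W2 + b2
          probs = np.exp(logits - logits.max(1, keepdims=True))
          probs /= probs.sum(1, keepdims=True)
          grad_logits = probs.copy(); grad_logits[np.arange(n), y] -= 1; grad_logits /= n
          gW2 = a_drop.T @ grad_logits; gb2 = grad_logits.sum(0)
          ga = (grad_logits @ W2.T) * mask * (a > 0)
          gW1 = X.T @ ga; gb1 = ga.sum(0)
          W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1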

  4. Modular and Adaptive Control of Sound Processing

    NASA Astrophysics Data System (ADS)

    van Nort, Douglas

    This dissertation presents research into the creation of systems for the control of sound synthesis and processing. The focus differs from much of the work related to digital musical instrument design, which has rightly concentrated on the physicality of the instrument and interface: sensor design, choice of controller, feedback to performer and so on. Often times a particular choice of sound processing is made, and the resultant parameters from the physical interface are conditioned and mapped to the available sound parameters in an exploratory fashion. The main goal of the work presented here is to demonstrate the importance of the space that lies between physical interface design and the choice of sound manipulation algorithm, and to present a new framework for instrument design that strongly considers this essential part of the design process. In particular, this research takes the viewpoint that instrument designs should be considered in a musical control context, and that both control and sound dynamics must be considered in tandem. In order to achieve this holistic approach, the work presented in this dissertation assumes complementary points of view. Instrument design is first seen as a function of musical context, focusing on electroacoustic music and leading to a view on gesture that relates perceived musical intent to the dynamics of an instrumental system. The important design concept of mapping is then discussed from a theoretical and conceptual point of view, relating perceptual, systems and mathematically-oriented ways of examining the subject. This theoretical framework gives rise to a mapping design space, functional analysis of pertinent existing literature, implementations of mapping tools, instrumental control designs and several perceptual studies that explore the influence of mapping structure. Each of these reflect a high-level approach in which control structures are imposed on top of a high-dimensional space of control and sound synthesis parameters. In this view, desired gestural dynamics and sonic response are achieved through modular construction of mapping layers that are themselves subject to parametric control. Complementing this view of the design process, the work concludes with an approach in which the creation of gestural control/sound dynamics are considered in the low-level of the underlying sound model. The result is an adaptive system that is specialized to noise-based transformations that are particularly relevant in an electroacoustic music context. Taken together, these different approaches to design and evaluation result in a unified framework for creation of an instrumental system. The key point is that this framework addresses the influence that mapping structure and control dynamics have on the perceived feel of the instrument. Each of the results illustrate this using either top-down or bottom-up approaches that consider musical control context, thereby pointing to the greater potential for refined sonic articulation that can be had by combining them in the design process.

  5. Integrated catchment modelling within a strategic planning and decision making process: Werra case study

    NASA Astrophysics Data System (ADS)

    Dietrich, Jörg; Funke, Markus

    Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.

  6. Weighted regularized statistical shape space projection for breast 3D model reconstruction.

    PubMed

    Ruiz, Guillermo; Ramon, Eduard; García, Jaime; Sukno, Federico M; Ballester, Miguel A González

    2018-07-01

    The use of 3D imaging has increased as a practical and useful tool for plastic and aesthetic surgery planning. Specifically, the possibility of representing the patient's breast anatomy as a 3D shape and simulating aesthetic or plastic procedures is a great tool for communication between surgeon and patient during surgery planning. For the purpose of obtaining the specific 3D model of a patient's breast, model-based reconstruction methods can be used. In particular, 3D morphable models (3DMM) are a robust and widely used method to perform 3D reconstruction. Moreover, if additional prior information (i.e., known landmarks) is combined with the 3DMM statistical model, shape constraints can be imposed to improve the 3DMM fitting accuracy. In this paper, we present a framework to fit a 3DMM of the breast to two possible inputs: 2D photos and 3D point clouds (scans). Our method consists of a Weighted Regularized (WR) projection into the shape space. The contribution of each point to the 3DMM shape is weighted, allowing us to assign more relevance to the points that we want to impose as constraints. Our method is applied at multiple stages of the 3D reconstruction process. Firstly, it can be used to obtain a 3DMM initialization from a sparse set of 3D points. Additionally, we embed our method in the 3DMM fitting process, in which more reliable or already known 3D points or regions of points can be weighted in order to preserve their shape information. The proposed method has been tested in two different input settings, scans and 2D pictures, assessing both reconstruction frameworks with very positive results. Copyright © 2018 Elsevier B.V. All rights reserved.
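
    A minimal numpy sketch of a weighted, regularized projection into a statistical shape space, under assumed names (Phi for the shape basis, mean_shape for the model mean, weights for per-point constraint weights); the authors' exact formulation may differ.

      # Weighted, Tikhonov-regularized projection of an observed shape onto a linear shape basis.
      import numpy as np

      def wr_project(x_obs, mean_shape, Phi, weights, lam=1.0):
          """Solve argmin_a || W^(1/2) (mean + Phi a - x_obs) ||^2 + lam * ||a||^2."""
          W = np.diag(weights)                         # per-coordinate weights (constraints get large values)
          A = Phi.T @ W @ Phi + lam * np.eye(Phi.shape[1])
          b = Phi.T @ W @ (x_obs - mean_shape)
          coeffs = np.linalg.solve(A, b)               # shape-space coefficients
          return mean_shape + Phi @ coeffs             # reconstructed shape

      # Toy usage with random data standing in for a breast 3DMM.
      rng = np.random.default_rng(1)
      n_coords, n_modes = 300, 10                      # 100 3D points flattened, 10 shape modes
      Phi = rng.normal(size=(n_coords, n_modes))
      mean_shape = rng.normal(size=n_coords)
      x_obs = mean_shape + Phi @ rng.normal(size=n_modes)
      weights = np.ones(n_coords); weights[:30] = 100.0   # e.g., known landmark coordinates
      recon = wr_project(x_obs, mean_shape, Phi, weights)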

  7. Development of a hospital-based patient-reported outcome framework for lung cancer patients: a study protocol.

    PubMed

    Moloczij, Natasha; Gough, Karla; Solomon, Benjamin; Ball, David; Mileshkin, Linda; Duffy, Mary; Krishnasamy, Mei

    2018-01-11

    Patient-reported outcome (PRO) data is central to the delivery of quality health care. Establishing sustainable, reliable and cost-efficient methods for routine collection and integration of PRO data into health information systems is challenging. This protocol paper describes the design and structure of a study to develop and pilot test a PRO framework to systematically and longitudinally collect PRO data from a cohort of lung cancer patients at a comprehensive cancer centre in Australia. Best-practice guidelines for developing registries aimed at collecting PROs informed the development of this PRO framework. Framework components included: achieving consensus on the purpose of the framework, the PRO measures to be included, and the data collection time points and methods (electronic and paper); and establishing processes to safeguard the quality of the data collected and to link the PRO framework to an existing hospital-based lung cancer clinical registry. Lung cancer patients will be invited to give feedback on the PRO measures (PROMs) chosen and the data collection time points and methods. Implementation of the framework will be piloted for 12 months. A mixed-methods approach will then be used to explore patient and multidisciplinary perspectives on the feasibility of implementing the framework and linking it to the lung cancer clinical registry, its clinical utility, perceptions of data collection burden, and a preliminary assessment of resource costs to integrate, implement and sustain the PRO framework. The PRO data set will include a quality of life questionnaire (EORTC QLQ-C30) and the EORTC lung cancer-specific module (QLQ-LC13). These will be collected pre-treatment (baseline) and 2, 6 and 12 months post-baseline. Also, four social isolation questions (PROMIS) will be collected at baseline. Identifying and deciding on the overall purpose, the clinical utility of the data and which PROs to collect from patients requires careful consideration. Our study will explore how PRO data collection processes that link to a clinical data set can be developed and integrated, and how PRO systems that are easy for patients to complete and for professionals to use in practice can be achieved, and will provide indicative costs of developing and integrating a longitudinal PRO framework into routine hospital data collection systems. This study is not a clinical trial and is therefore not registered in any trial registry. However, it has received human research ethics approval (LNR/16/PMCC/45).

  8. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    NASA Astrophysics Data System (ADS)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing the spatiotemporal distribution patterns and dynamics of different industries can help us learn the macro-level development trends of those industries and in turn provides references for industrial spatial planning. However, the analysis is a challenging task that requires an easy-to-understand information presentation mechanism and powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to meet such a visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and the shifting route of gravity centers to show the spatial distribution and yearly development trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset of Mainland China from 1960 to 2015, which contains fine-grained location information (i.e., the coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experimental results show that the developed visual analytics method is helpful for understanding the multi-level patterns and development trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime and disease.
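
    A minimal sketch, with hypothetical coordinate arrays, of the two summary statistics used above: the gravity (mean) center and a standard deviational ellipse, here derived from the eigen-decomposition of the coordinate covariance matrix (one common formulation; the paper's exact definition may differ).

      # Gravity center and standard deviational ellipse for a set of enterprise locations.
      import numpy as np

      def gravity_center(xy):
          return xy.mean(axis=0)                        # (mean_x, mean_y)

      def standard_deviational_ellipse(xy):
          center = gravity_center(xy)
          cov = np.cov((xy - center).T)                 # 2x2 covariance of centered coordinates
          eigvals, eigvecs = np.linalg.eigh(cov)        # principal axes of dispersion
          angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))
          semi_axes = np.sqrt(eigvals)                  # std. dev. along minor/major axes
          return center, semi_axes, angle

      # Toy usage: random points standing in for geocoded enterprises of one industry in one year.
      rng = np.random.default_rng(2)
      xy = rng.normal(loc=[114.3, 30.6], scale=[0.8, 0.3], size=(500, 2))
      print(gravity_center(xy), standard_deviational_ellipse(xy)[1:])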

  9. Pilot study of a point-of-use decision support tool for cancer clinical trials eligibility.

    PubMed

    Breitfeld, P P; Weisburd, M; Overhage, J M; Sledge, G; Tierney, W M

    1999-01-01

    Many adults with cancer are not enrolled in clinical trials because caregivers do not have the time to match the patient's clinical findings with varying eligibility criteria associated with multiple trials for which the patient might be eligible. The authors developed a point-of-use portable decision support tool (DS-TRIEL) to automate this matching process. The support tool consists of a hand-held computer with a programmable relational database. A two-level hierarchic decision framework was used for the identification of eligible subjects for two open breast cancer clinical trials. The hand-held computer also provides protocol consent forms and schemas to further help the busy oncologist. This decision support tool and the decision framework on which it is based could be used for multiple trials and different cancer sites.
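
    A minimal sketch of the two-level matching idea, with entirely hypothetical criteria and field names (the DS-TRIEL rules themselves are not reproduced here): a coarse first-level filter narrows the trial list before detailed eligibility criteria are checked.

      # Hypothetical two-level eligibility screening for clinical trials.
      patient = {"site": "breast", "stage": 2, "age": 54, "prior_chemo": False}

      trials = [
          {"id": "TRIAL-A", "site": "breast",
           "criteria": [lambda p: 1 <= p["stage"] <= 3, lambda p: p["age"] >= 18]},
          {"id": "TRIAL-B", "site": "breast",
           "criteria": [lambda p: p["stage"] >= 3, lambda p: not p["prior_chemo"]]},
          {"id": "TRIAL-C", "site": "lung",
           "criteria": [lambda p: p["age"] >= 18]},
      ]

      # Level 1: match on cancer site; Level 2: evaluate each trial's detailed criteria.
      level1 = [t for t in trials if t["site"] == patient["site"]]
      eligible = [t["id"] for t in level1 if all(rule(patient) for rule in t["criteria"])]
      print(eligible)   # -> ['TRIAL-A']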

  10. Pilot Study of a Point-of-use Decision Support Tool for Cancer Clinical Trials Eligibility

    PubMed Central

    Breitfeld, Philip P.; Weisburd, Marina; Overhage, J. Marc; Sledge, George; Tierney, William M.

    1999-01-01

    Many adults with cancer are not enrolled in clinical trials because caregivers do not have the time to match the patient's clinical findings with varying eligibility criteria associated with multiple trials for which the patient might be eligible. The authors developed a point-of-use portable decision support tool (DS-TRIEL) to automate this matching process. The support tool consists of a hand-held computer with a programmable relational database. A two-level hierarchic decision framework was used for the identification of eligible subjects for two open breast cancer clinical trials. The hand-held computer also provides protocol consent forms and schemas to further help the busy oncologist. This decision support tool and the decision framework on which it is based could be used for multiple trials and different cancer sites. PMID:10579605

  11. [The development of European Union common research and development policy and programs with special regard to life sciences].

    PubMed

    Pörzse, Gábor

    2009-08-09

    Research and development (R&D) has been playing a leading role in the European Community's history since the very beginning of European integration. Its importance has grown in recent years, after the launch of the Lisbon strategy. Framework programs have always played a considerable part in community research. The aim of their introduction was to fine tune national R&D activities, and to successfully divide research tasks between the Community and the member states. The Community, from the very outset, has acknowledged the importance of life sciences. It is no coincidence that life sciences have become the second biggest priority in the last two framework programs. This study provides a historical, and at the same time analytical and evaluative review of community R&D policy and activity from the starting point of its development until the present day. It examines in detail how the changes in structure, conditional system, regulations and priorities of the framework programs have followed the formation of social and economic needs. The paper puts special emphasis on the analysis of the development of life science research, presenting how they have met the challenges of the age, and how they have been built into the framework programs. Another research area of the present study is to elaborate how successfully Hungarian researchers have been joining the community research, especially the framework programs in the field of life sciences. To answer these questions, it was essential to survey, process and analyze the data available in the national and European public and closed databases. Contrary to the previous documents, this analysis doesn't concentrate on the political and scientific background. It outlines which role community research has played in sustainable social and economic development and competitiveness, how it has supported common policies and how the processes of integration have been deepening. Besides, the present paper offers a complete review of the given field, from its foundation up until the present day, by elaborating the newest initiatives and ideas for the future. This work is also novel from the point of view of the given professional field, the life sciences in the framework programs, and processing and evaluating of data of Hungarian participation in the 5th and 6th framework programs in the field of life sciences.

  12. GIS-assisted spatial analysis for urban regulatory detailed planning: designer's dimension in the Chinese code system

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Zeng, Zheng

    2009-10-01

    By discussing the causes behind the high amendment ratio in the implementation of urban regulatory detailed plans in China despite their law-ensured status, the study aims to reconcile the conflict between the legal authority of regulatory detailed planning and the insufficient scientific support in its decision-making and compilation. It does so by introducing spatial analysis based on GIS technology and 3D modeling into the process, thus presenting a more scientific and flexible approach to regulatory detailed planning in China. The study first points out that the current compilation process of urban regulatory detailed plans in China relies mainly on an empirical approach, which leaves the plans constantly subject to amendments. It then discusses the need for, and current utilization of, GIS in the Chinese system and proposes the framework of a GIS-assisted 3D spatial analysis process from the designer's perspective, which can be regarded as an alternating process between descriptive codes and physical design in the compilation of regulatory detailed planning. With a case study of the processes and results from the application of the framework, the paper concludes that the proposed framework can be an effective instrument that brings more rationality, flexibility and thus more efficiency to the compilation and decision-making process of urban regulatory detailed plans in China.

  13. Precision and Accuracy of a Digital Impression Scanner in Full-Arch Implant Rehabilitation.

    PubMed

    Pesce, Paolo; Pera, Francesco; Setti, Paolo; Menini, Maria

    To evaluate the accuracy and precision of a digital scanner used to scan four implants positioned according to an immediate loading implant protocol and to assess the accuracy of an aluminum framework fabricated from a digital impression. Five master casts reproducing different edentulous maxillae with four tilted implants were used. Four scan bodies were screwed onto the low-profile abutments, and a digital intraoral scanner was used to perform five digital impressions of each master cast. To assess trueness, a metal framework of the best digital impression was produced with computer-aided design/computer-assisted manufacture (CAD/CAM) technology and passive fit was assessed with the Sheffield test. Gaps between the frameworks and the implant analogs were measured with a stereomicroscope. To assess precision, three-dimensional (3D) point cloud processing software was used to measure the deviations between the five digital impressions of each cast by producing a color map. The deviation values were grouped in three classes, and differences were assessed between class 2 (representing lower discrepancies) and the assembled classes 1 and 3 (representing the higher negative and positive discrepancies, respectively). The frameworks showed a mean gap of < 30 μm (range: 2 to 47 μm). A statistically significant difference was found between the two groups by the 3D point cloud software, with higher frequencies of points in class 2 than in grouped classes 1 and 3 (P < .001). Within the limits of this in vitro study, it appears that a digital impression may represent a reliable method for fabricating full-arch implant frameworks with good passive fit when tilted implants are present.

  14. Location of MTBE and toluene in the channel system of the zeolite mordenite: Adsorption and host-guest interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arletti, Rossella, E-mail: rossella.arletti@unito.it; Martucci, Annalisa; Alberti, Alberto

    This paper reports a study of the location of Methyl Tertiary Butyl Ether (MTBE) and toluene molecules adsorbed in the pores of the organophilic zeolite mordenite from an aqueous solution. The presence of these organic molecules in the zeolite channels was revealed by structure refinement performed by the Rietveld method. About 3 molecules of MTBE and 3.6 molecules of toluene per unit cell were incorporated into the cavities of mordenite, representing 75% and 80% of the total adsorption capacity of this zeolite. In both cases a water molecule was localized inside the side pocket of mordenite. The saturation capacity determined by the adsorption isotherms, obtained by batch experiments, and the weight loss given by thermogravimetric (TG) analyses were in very good agreement with these values. The interatomic distances obtained after the structural refinements suggest that MTBE could be connected to the framework through a water molecule, while toluene could be bonded directly to framework oxygen atoms. The rapid and high adsorption of these hydrocarbons into the organophilic zeolite mordenite makes this cheap and environmentally friendly material a suitable candidate for the removal of these pollutants from water. Graphical abstract: Location of MTBE (a) and toluene (b) in mordenite channels (projection along the [001] direction). Highlights: We investigated the MTBE and toluene adsorption process in the organophilic zeolite mordenite. The presence of MTBE and toluene in mordenite was determined by X-ray diffraction studies. About 3 molecules of MTBE and 3.6 molecules of toluene per unit cell were incorporated into the zeolite cavities. MTBE is connected to the framework through a water molecule. Toluene is directly bonded to framework oxygen atoms.

  15. Scientific Literacy and the South African School Curriculum

    ERIC Educational Resources Information Center

    Lelliott, Anthony

    2014-01-01

    The notion of scientific literacy is contested terrain, particularly when the term is used in school curricula. Using a scientific literacy framework of Vision I (covers science products and processes) and Vision II (based on science-related situations as a starting point for discussion), the article analyses the Natural Science (grades 7-9)…

  16. Developing a Framework of Facilitator Competencies: Lessons from the Field

    ERIC Educational Resources Information Center

    Kolb, Judith A.; Jin, Sungmi; Song, Ji Hoon

    2008-01-01

    People in organizations are increasingly called upon to serve as small group facilitators or to assist in this role. This article uses data collected from practicing facilitators at three points of time and a building block process of collection, analysis, further collection, and consolidation to develop and refine a list of competencies. A…

  17. A Framework for Mobile Apps in Colleges and Universities: Data Mining Perspective

    ERIC Educational Resources Information Center

    Singh, Archana; Ranjan, Jayanthi

    2016-01-01

    The Enterprise mobility communication technology provides easy and quick accessibility to data and information integrated into one single touch point device. This device incorporates or integrates all the processes into small applications or App and thus increases the workforce capability of knowledge workers. "App" which is a small set…

  18. Insights into mortality patterns and causes of death through a process point of view model

    PubMed Central

    Anderson, James J.; Li, Ting; Sharrow, David J.

    2016-01-01

    Process point of view models of mortality, such as the Strehler-Mildvan and stochastic vitality models, represent death in terms of the loss of survival capacity through challenges and dissipation. Drawing on hallmarks of aging, we link these concepts to candidate biological mechanisms through a framework that defines death as challenges to vitality, in which distal factors define the age-evolution of vitality and proximal factors define the probability distribution of challenges. To illustrate the process point of view, we hypothesize that the immune system is a mortality nexus, characterized by two vitality streams: increasing vitality representing immune system development and immunosenescence representing vitality dissipation. Proximal challenges define three mortality partitions: juvenile and adult extrinsic mortalities and intrinsic adult mortality. Model parameters, generated from Swedish mortality data (1751-2010), exhibit biologically meaningful correspondences to economic, health and cause-of-death patterns. The model characterizes the 20th century epidemiological transition mainly as a reduction in extrinsic mortality resulting from a shift from high-magnitude disease challenges on individuals at all vitality levels to low-magnitude stress challenges on low-vitality individuals. Of secondary importance, intrinsic mortality was described by a gradual reduction in the rate of loss of vitality, presumably resulting from a reduction in the rate of immunosenescence. Extensions and limitations of a distal/proximal framework for characterizing more explicit causes of death, e.g., the young adult mortality hump or cancer in old age, are discussed. PMID:27885527
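
    As a sketch of the challenge-to-vitality idea (the notation below is ours, not necessarily the authors'): if proximal challenges arrive as a Poisson process with rate \lambda and have exponentially distributed magnitudes with mean \beta, while distal factors set a declining vitality trajectory v(x), then the hazard of death at age x is the challenge rate times the probability that a challenge exceeds the remaining vitality,

      h(x) = \lambda \, \Pr(Z > v(x)) = \lambda \, e^{-v(x)/\beta}, \qquad v(x) = v_0 - kx \;\Rightarrow\; h(x) = \lambda e^{-v_0/\beta}\, e^{(k/\beta) x},

    so a linearly declining vitality yields a Gompertz-like, exponentially increasing adult hazard, which is the essence of the Strehler-Mildvan construction referenced above.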

  19. Software Issues in High-Performance Computing and a Framework for the Development of HPC Applications

    DTIC Science & Technology

    1995-01-01

    possible to determine communication points. For this version, a C program spawning Posix threads and using semaphores to synchronize would have to...performance such as the time required for network communication and synchronization as well as issues of asynchrony and memory hierarchy. For example...enhances reusability. Process (or task) parallel computations can also be succinctly expressed with a small set of process creation and synchronization

  20. Biometric Border Security Evaluation Framework (Biometrique Cadre D’evaluation de la Securite des Frontieres)

    DTIC Science & Technology

    2011-10-01

    those least likely to change significantly over time: upper ridges of the eye sockets, areas around the cheekbones, sides of the mouth , nose shape, and...conduct self-process using their electronic passports. ACS processes roughly 22 million visitors annually in airport environments. The program was...the electronic passport. A successful match permits a traveler to clear through the customs control point, whereas an unsuccessful match alerts the

  1. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review many existing models for many aspects of gene regulation; the pros and cons of each model are discussed. In addition, network inference algorithms are also surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions and the connections and differences among the algorithms are provided. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885

  2. [The ethic dimension of daily tasks in the formation process of nurses].

    PubMed

    Fernandes, Josicélia Dumêt; Rosa, Darci de Oliveira Santa; Vieira, Therezinha Teixeira; Sadigursky, Dora

    2008-06-01

    The object of study of this theoretical article was the ethical dimension of the formation process of nurses, taking into consideration the National Curricular Directives for Nursing Courses. It was based on the presuppositions of ethics and their relationship with the implementation of changes in the formation process of nurses, using elements of ethical behavior in that formation as a reference and attempting to bring the reflection up to the present day, thereby contributing to defining a direction for nursing education. It was concluded that the ethical dimension in the formation of nurses involves values that permeate the relations between the subjects of this process and nature itself. The study points out the need to transform the practices of students and teachers and to change the current curriculum framework, highlighting elements which indicate that the concern with ethics in developing the curriculum framework is not limited to how a discipline is taught but extends to the practices that take place in the education process.

  3. Developing evidence that is fit for purpose: a framework for payer and research dialogue.

    PubMed

    Sabharwal, Rajeev K; Graff, Jennifer S; Holve, Erin; Dubois, Robert W

    2015-09-01

    Matching the supply and demand of evidence requires an understanding of when more evidence is needed, as well as the type of evidence that will meet this need. This article describes efforts to develop and refine a decision-making framework that considers payers' perspectives on the utility of evidence generated by different types of research methods, including real-world evidence. Conceptual framework development with subsequent testing during a roundtable dialogue. The framework development process included a literature scan to identify existing frameworks and relevant articles on payer decision making. The framework was refined during a stand-alone roundtable in December 2013 hosted by the research team, which included representatives from public and private payers, pharmacy benefit management, the life sciences industry, and researchers. The roundtable discussion also included an application of the framework to 3 case studies. Application of the framework to the clinical scenarios and the resulting discussion provided key insights into when new evidence is needed to inform payer decision making and what questions should be addressed. Payers are not necessarily seeking more evidence about treatment efficacy; rather, they are seeking more evidence for relevant end points that illustrate the differences between treatment alternatives that can justify the resources required to change practice. In addition, payers are interested in obtaining new evidence that goes beyond efficacy, with an emphasis on effectiveness, longer-term safety, and delivery system impact. We believe that our decision-making framework is a useful tool to increase dialogue between evidence generators and payers, while also allowing for greater efficiency in the research process.

  4. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820
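
    A minimal sketch, under assumed variable names, of the spike-event-based ingredient described above: a point process (Poisson GLM) observation model with conditional intensity lambda = exp(b + w.x), whose parameters are nudged at every time bin from the binned spike observation (a generic stochastic-gradient update, not the authors' full CLDA architecture).

      # Spike-event-based parameter adaptation for a point process (log-linear) encoding model.
      import numpy as np

      rng = np.random.default_rng(3)
      dt, lr, n_bins, d = 0.005, 0.02, 50000, 3        # 5 ms bins, learning rate, bins, covariates
      w_true = np.array([1.0, -0.5, 0.3]); b_true = np.log(20.0)   # "true" tuning (20 Hz baseline)
      w_hat, b_hat = np.zeros(d), np.log(5.0)          # initial decoder-side estimates

      for _ in range(n_bins):
          x = rng.normal(size=d)                       # e.g., intended kinematics in this bin
          lam_true = np.exp(b_true + w_true @ x)       # conditional intensity (spikes/s)
          y = rng.poisson(lam_true * dt)               # observed spike count in the bin
          lam_hat = np.exp(b_hat + w_hat @ x)
          err = y - lam_hat * dt                       # gradient of the Poisson log-likelihood
          w_hat += lr * err * x                        # update on every bin, i.e. at spike-event time-scale
          b_hat += lr * err

      print(np.round(w_hat, 2), round(float(b_hat), 2))   # should drift toward w_true and b_true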

  5. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of a patient's clinical time series is critical for understanding the patient's condition and its dynamics, and for optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that, at any point in time, selects the most promising time series model out of a pool of candidate models and consequently combines the advantages of population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach for supporting personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
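
    A minimal sketch of the switching idea under assumed interfaces (hypothetical model objects exposed as callables): at each time step, the forecaster ranks candidate models by their recent error on the patient's own data and uses the current best one for the next prediction.

      # Adaptive model switching: pick, at each step, the model with the lowest recent error.
      from collections import deque

      class SwitchingForecaster:
          def __init__(self, models, window=10):
              self.models = models                                  # e.g., population / patient / short-term models
              self.errors = {name: deque(maxlen=window) for name in models}

          def predict(self, history):
              best = min(self.models,
                         key=lambda n: sum(self.errors[n]) / max(len(self.errors[n]), 1))
              return best, self.models[best](history)

          def update(self, history, observed):
              for name, model in self.models.items():              # score every candidate on the new value
                  self.errors[name].append(abs(model(history) - observed))

      # Toy usage with two stand-in "models": a global constant and a last-value carry-forward.
      models = {"population_mean": lambda h: 7.0, "last_value": lambda h: h[-1]}
      f = SwitchingForecaster(models)
      series = [7.1, 7.3, 8.0, 9.2, 9.1, 9.4]
      for t in range(1, len(series)):
          name, yhat = f.predict(series[:t])
          f.update(series[:t], series[t])
      print(name, yhat)     # after a few steps the switcher favors the better local model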

  6. A cognitive perspective on health systems integration: results of a Canadian Delphi study.

    PubMed

    Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan

    2014-05-19

    Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs - that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada's National Symposium on Integrated Care. Respondents were asked to rate the framework's clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskal-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. The re-named "Integration Mindsets Framework" consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas. The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives.

  7. SWIFT Intensive Technical Assistance Process. Technical Assistance Brief #1

    ERIC Educational Resources Information Center

    Sailor, Wayne; McCart, Amy; McSheehan, Michael; Mitchiner, Melinda; Quirk, Carol

    2014-01-01

    The national center on Schoolwide Integrated Framework for Transformation (SWIFT Center) is now approaching the halfway point in its first full year of providing intensive technical assistance (TA) to 68 schools in 20 local educational agencies across five states. The purpose of this brief is to provide a thumbnail sketch of how this TA process…

  8. A Framework for Segmentation Using Physical Models of Image Formation

    DTIC Science & Technology

    1993-12-10

    light incoming to the point (x, y, z) from direction (θx, θy) of wavelength λ and Stokes parameter s at time t. This function is similar to the plenoptic ... "The Plenoptic Function and the Elements of Early Vision," in Computational Models of Visual Processing, ed. M. S. Landy and J. A. Movshon, Cambridge, MIT

  9. No Special K! A Signal Detection Framework for the Strategic Regulation of Memory Accuracy

    ERIC Educational Resources Information Center

    Higham, Philip A.

    2007-01-01

    Two experiments investigated criterion setting and metacognitive processes underlying the strategic regulation of accuracy on the Scholastic Aptitude Test (SAT) using Type-2 signal detection theory (SDT). In Experiment 1, report bias was manipulated by penalizing participants either 0.25 (low incentive) or 4 (high incentive) points for each error.…

  10. Transport phenomena in helical edge state interferometers: A Green's function approach

    NASA Astrophysics Data System (ADS)

    Rizzo, Bruno; Arrachea, Liliana; Moskalets, Michael

    2013-10-01

    We analyze the current and the shot noise of an electron interferometer made of the helical edge states of a two-dimensional topological insulator within the framework of the nonequilibrium Green's function formalism. We study, in detail, setups with a single and with two quantum point contacts inducing scattering between the different edge states. We consider processes preserving the spin as well as the effect of spin-flip scattering. In the case of a single quantum point contact, a simple test based on the shot-noise measurement is proposed to quantify the strength of the spin-flip scattering. In the case of two quantum point contacts with the additional ingredient of gate voltages applied within a finite-size region at the top and bottom edges of the sample, we identify two types of interference processes in the behavior of the currents and the noise. One such process is analogous to that taking place in a Fabry-Pérot interferometer, while the second one corresponds to a configuration similar to a Mach-Zehnder interferometer. In the helical interferometer, these two processes compete.

  11. MR connectomics: a conceptual framework for studying the developing brain

    PubMed Central

    Hagmann, Patric; Grant, Patricia E.; Fair, Damien A.

    2012-01-01

    The combination of advanced neuroimaging techniques and major developments in complex network science, have given birth to a new framework for studying the brain: “connectomics.” This framework provides the ability to describe and study the brain as a dynamic network and to explore how the coordination and integration of information processing may occur. In recent years this framework has been used to investigate the developing brain and has shed light on many dynamic changes occurring from infancy through adulthood. The aim of this article is to review this work and to discuss what we have learned from it. We will also use this body of work to highlight key technical aspects that are necessary in general for successful connectome analysis using today's advanced neuroimaging techniques. We look to identify current limitations of such approaches, what can be improved, and how these points generalize to other topics in connectome research. PMID:22707934

  12. NONFUEL MINERAL RESOURCES OF THE PACIFIC EXCLUSIVE ECONOMIC ZONE.

    USGS Publications Warehouse

    Clague, David; Bischoff, James; Howell, David

    1984-01-01

    The Pacific Exclusive Economic Zone contains a variety of hard mineral resources. Sand and gravel and their associated placer deposits of heavy minerals are the most likely to be developed in the near future, but offshore and deep water deposits of phosphorite, abyssal manganese nodules, ferromanganese crusts enriched in cobalt, and massive sulfide deposits all represent future resources. The distribution, extent, and formation of these deposits are poorly understood and will be clarified only with additional exploration, framework geologic mapping, and study of the processes by which these resources form. It is pointed out that the initial discovery of most hard-mineral resources in the EEZ was made during routine scientific marine-geologic surveys aimed at understanding the framework geology and geologic processes of an offshore region.

  13. A delta-rule model of numerical and non-numerical order processing.

    PubMed

    Verguts, Tom; Van Opstal, Filip

    2014-06-01

    Numerical and non-numerical order processing share empirical characteristics (distance effect and semantic congruity), but there are also important differences (in size effect and end effect). At the same time, models and theories of numerical and non-numerical order processing developed largely separately. Currently, we combine insights from 2 earlier models to integrate them in a common framework. We argue that the same learning principle underlies numerical and non-numerical orders, but that environmental features determine the empirical differences. Implications for current theories on order processing are pointed out. PsycINFO Database Record (c) 2014 APA, all rights reserved.
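
    A minimal sketch of a generic delta-rule learner (our simplification, not the authors' specific model): item codes are associated with ordinal-position targets, and weights are updated in proportion to the prediction error, which is the shared learning principle the article argues underlies both numerical and non-numerical orders.

      # Generic delta-rule learning of ordinal positions from (hypothetical) item codes.
      import numpy as np

      rng = np.random.default_rng(4)
      n_items, d = 9, 16
      items = rng.normal(size=(n_items, d))            # distributed codes for items 1..9 (or A..I)
      targets = np.linspace(-1.0, 1.0, n_items)        # ordinal position expressed on a single axis
      w, eta = np.zeros(d), 0.05

      for _ in range(2000):                            # repeated presentations in random order
          i = rng.integers(n_items)
          y = w @ items[i]                             # current output for this item
          w += eta * (targets[i] - y) * items[i]       # delta rule: error times input

      order_estimate = items @ w
      print(np.argsort(order_estimate))                # recovered order of the items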

  14. Statistical characterization of spatial patterns of rainfall cells in extratropical cyclones

    NASA Astrophysics Data System (ADS)

    Bacchi, Baldassare; Ranzi, Roberto; Borga, Marco

    1996-11-01

    The assumption of a particular type of distribution of rainfall cells in space is needed for the formulation of several space-time rainfall models. In this study, weather radar-derived rain rate maps are employed to evaluate different types of spatial organization of rainfall cells in storms through the use of distance functions and second-moment measures. In particular the spatial point patterns of the local maxima of rainfall intensity are compared to a completely spatially random (CSR) point process by applying an objective distance measure. For all the analyzed radar maps the CSR assumption is rejected, indicating that at the resolution of the observation considered, rainfall cells are clustered. Therefore a theoretical framework for evaluating and fitting alternative models to the CSR is needed. This paper shows how the "reduced second-moment measure" of the point pattern can be employed to estimate the parameters of a Neyman-Scott model and to evaluate the degree of adequacy to the experimental data. Some limitations of this theoretical framework, and also its effectiveness, in comparison to the use of scaling functions, are discussed.
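
    A minimal sketch, with synthetic coordinates, of the kind of second-moment analysis described above: a naive estimate of Ripley's K (the reduced second-moment measure) compared with its value pi*r^2 under complete spatial randomness; values above pi*r^2 indicate clustering of the rainfall-cell maxima (edge corrections and the Neyman-Scott fitting step are omitted).

      # Naive Ripley's K estimate for a 2-D point pattern, compared with the CSR expectation pi*r^2.
      import numpy as np

      def ripley_k(points, r_values, area):
          n = len(points)
          lam = n / area                                     # intensity (points per unit area)
          d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
          np.fill_diagonal(d, np.inf)                        # exclude self-pairs
          return np.array([(d < r).sum() / (n * lam) for r in r_values])

      # Synthetic clustered pattern (parent-daughter, Neyman-Scott-like) on a unit square.
      rng = np.random.default_rng(5)
      parents = rng.random((20, 2))
      points = np.concatenate([p + 0.02 * rng.normal(size=(10, 2)) for p in parents])
      r = np.array([0.02, 0.05, 0.1])
      print(ripley_k(points, r, area=1.0))                   # empirical K(r)
      print(np.pi * r**2)                                    # CSR benchmark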

  15. Framework GRASP: routine library for optimize processing of aerosol remote sensing observation

    NASA Astrophysics Data System (ADS)

    Fuertes, David; Torres, Benjamin; Dubovik, Oleg; Litvinov, Pavel; Lapyonok, Tatyana; Ducos, Fabrice; Aspetsberger, Michael; Federspiel, Christian

    We present the development of a framework for the Generalized Retrieval of Aerosol and Surface Properties (GRASP) developed by Dubovik et al. (2011). The framework is a source code project that strengthens the value of the GRASP inversion algorithm by transforming it into a library that can then be used by a group of customized application modules. The functions of the independent modules include managing the configuration of the code execution as well as preparing the input and output. The framework provides a number of advantages in the utilization of the code. First, it loads data into the core of the scientific code directly from memory, without passing through intermediary files on disk. Second, the framework allows consecutive use of the inversion code without re-initializing the core routine when new input is received. These features are essential for optimizing data-production performance when processing large observation sets, such as satellite images, with GRASP. Furthermore, the framework is a very convenient tool for further development, because this open-source platform is easily extended with new features; for example, it could accommodate loading raw data directly into the inversion code from a specific instrument not included in the default settings of the software. Finally, we demonstrate that, from the user's point of view, the framework provides a flexible, powerful and informative configuration system.

  16. Beyond the SCS-CN method: A theoretical framework for spatially lumped rainfall-runoff response

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-06-01

    Since its introduction in 1954, the Soil Conservation Service curve number (SCS-CN) method has become the standard tool, in practice, for estimating an event-based rainfall-runoff response. However, because of its empirical origins, the SCS-CN method is restricted to certain geographic regions and land use types. Moreover, it does not describe the spatial variability of runoff. To move beyond these limitations, we present a new theoretical framework for spatially lumped, event-based rainfall-runoff modeling. In this framework, we describe the spatially lumped runoff model as a point description of runoff that is upscaled to a watershed area based on probability distributions that are representative of watershed heterogeneities. The framework accommodates different runoff concepts and distributions of heterogeneities, and in doing so, it provides an implicit spatial description of runoff variability. Heterogeneity in storage capacity and soil moisture are the basis for upscaling a point runoff response and linking ecohydrological processes to runoff modeling. For the framework, we consider two different runoff responses for fractions of the watershed area: "prethreshold" and "threshold-excess" runoff. These occur before and after infiltration exceeds a storage capacity threshold. Our application of the framework results in a new model (called SCS-CNx) that extends the SCS-CN method with the prethreshold and threshold-excess runoff mechanisms and an implicit spatial description of runoff. We show proof of concept in four forested watersheds and further that the resulting model may better represent geographic regions and site types that previously have been beyond the scope of the traditional SCS-CN method.
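
    For reference, a minimal sketch of the traditional SCS-CN relation that the SCS-CNx framework generalizes, using the standard curve-number formulas (depths in mm, with the usual initial-abstraction ratio of 0.2):

      # Traditional (event-based, spatially lumped) SCS-CN runoff estimate.
      def scs_cn_runoff(rain_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
          s = 25400.0 / cn - 254.0          # potential maximum retention S (mm) from the curve number
          ia = ia_ratio * s                 # initial abstraction before runoff begins
          if rain_mm <= ia:
              return 0.0
          return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

      print(round(scs_cn_runoff(rain_mm=80.0, cn=75.0), 1))   # runoff depth in mm for an 80 mm event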

  17. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    NASA Technical Reports Server (NTRS)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of the system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.

  18. Examiner perceptions of a portfolio assessment process.

    PubMed

    Davis, Margery H; Ponnamperuma, Gominda G

    2010-01-01

    The portfolio assessment process is important for assessing learner achievement. To study examiner perceptions of Dundee Medical School's portfolio assessment process, in years 4 and 5 of the 5-year curriculum, in relation to: outcomes as a framework for the portfolio assessment process; portfolio content; portfolio assessment process; end points of the portfolio assessment process; appropriateness of the two part final exam format and examiner training. A questionnaire containing statements and open questions was used to obtain examiner feedback. Responses to each statement were compared over 3 years: 1999, 2000 and 2003. Response rates were 100%, 88% and 61% in 1999, 2002 and 2003, respectively. Examiners were positive about the ability of institutionally set learning outcomes (Dundee 12 exit learning outcomes) to provide a framework for the portfolio assessment process. They found difficulties, however, with the volume of portfolio content and the time allocated to assess it. Agreeing a grade for each learning outcome for the candidate with their co-examiner did not present difficulties. The comprehensive, holistic picture of the candidate provided by the portfolio assessment process was perceived to be one of its strengths. Examiners were supportive of the final examination format, and were satisfied with their briefing about the process. The 12 exit learning outcomes of Dundee curriculum provide an appropriate framework for the portfolio assessment process, but the content of the portfolio requires fine-tuning particularly with regard to quantity. Time allocated to examiners for the portfolio assessment process needs to be balanced against practicability. The holistic picture of the candidate provided by the process was one of its strengths.

  19. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    PubMed

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  20. An expanded conceptual framework for solution-focused management of chemical pollution in European waters.

    PubMed

    Munthe, John; Brorström-Lundén, Eva; Rahmberg, Magnus; Posthuma, Leo; Altenburger, Rolf; Brack, Werner; Bunke, Dirk; Engelen, Guy; Gawlik, Bernd Manfred; van Gils, Jos; Herráez, David López; Rydberg, Tomas; Slobodnik, Jaroslav; van Wezel, Annemarie

    2017-01-01

    This paper describes a conceptual framework for solutions-focused management of chemical contaminants built on novel and systematic approaches for identifying, quantifying and reducing risks of these substances. The conceptual framework was developed in interaction with stakeholders representing relevant authorities and organisations responsible for managing environmental quality of water bodies. Stakeholder needs were compiled via a survey and dialogue. The content of the conceptual framework was thereafter developed with inputs from relevant scientific disciplines. The conceptual framework consists of four access points: Chemicals, Environment, Abatement and Society, representing different aspects and approaches to engaging in the issue of chemical contamination of surface waters. It widens the scope for assessment and management of chemicals in comparison to traditional (mostly) per-chemical risk assessment approaches by including abatement and societal approaches as optional solutions. The solution-focused approach implies an identification of abatement and policy options up front in the risk assessment process. The conceptual framework was designed for use in current and future chemical pollution assessments for the aquatic environment, including the specific challenges encountered in prioritising individual chemicals and mixtures, and is applicable for the development of approaches for safe chemical management in a broader sense. The four access points of the conceptual framework are interlinked by four key topics representing the main scientific challenges that need to be addressed, i.e.: identifying and prioritising hazardous chemicals at different scales; selecting relevant and efficient abatement options; providing regulatory support for chemicals management; and predicting and prioritising future chemical risks. The conceptual framework aligns with current challenges in the safe production and use of chemicals. The current state of knowledge and implementation of these challenges is described. The use of the conceptual framework, and addressing the challenges, is intended to support: (1) forwarding sustainable use of chemicals, (2) identification of pollutants of priority concern for cost-effective management, (3) the selection of optimal abatement options and (4) the development and use of optimised legal and policy instruments.

  1. Facilitating admissions of diverse students: A six-point, evidence-informed framework for pipeline and program development.

    PubMed

    Young, Meredith E; Thomas, Aliki; Varpio, Lara; Razack, Saleem I; Hanson, Mark D; Slade, Steve; Dayem, Katharine L; McKnight, David J

    2017-04-01

    Several national level calls have encouraged reconsideration of diversity issues in medical education. Particular interest has been placed on admissions, as decisions made here shape the nature of the future physician workforce. Critical analysis of current practices paired with evidence-informed policies may counter some of the barriers impeding access for underrepresented groups. We present a framework for diversity-related program development and evaluation grounded within a knowledge translation framework, and supported by the initiation of longitudinal collection of diversity-related data. We provide an illustrative case study for each component of the framework. Descriptive analyses are presented of pre/post intervention diversity metrics if applicable and available. The framework's focal points are: 1) data-driven identification of underrepresented groups, 2) pipeline development and targeted recruitment, 3) ensuring an inclusive process, 4) ensuring inclusive assessment, 5) ensuring inclusive selection, and 6) iterative use of diversity-related data. Case studies ranged from wording changes on admissions websites to the establishment of educational and administrative offices addressing needs of underrepresented populations. We propose that diversity-related data must be collected on a variety of markers, developed in partnership with stakeholders who are most likely to facilitate implementation of best practices and new policies. These data can facilitate the design, implementation, and evaluation of evidence-informed diversity initiatives and provide a structure for continued investigation into 'interventions' supporting diversity-related initiatives.

  2. Theory of molecular rate processes in the presence of intense laser radiation

    NASA Technical Reports Server (NTRS)

    George, T. F.; Zimmerman, I. H.; Devries, P. L.; Yuan, J.-M.; Lam, K.-S.; Bellum, J. C.; Lee, H.-W.; Slutsky, M. S.; Lin, J.-T.

    1979-01-01

    The present paper deals with the influence of intense laser radiation on gas-phase molecular rate processes. Representations of the radiation field, the particle system, and the interaction involving these two entities are discussed from a general rather than abstract point of view. The theoretical methods applied are outlined, and the formalism employed is illustrated by application to a variety of specific processes. Quantum mechanical and semiclassical treatments of representative atom-atom and atom-diatom collision processes in the presence of a field are examined, and examples of bound-continuum processes and heterogeneous catalysis are discussed within the framework of both quantum-mechanical and semiclassical theories.

  3. Validating a new methodology for strain estimation from cardiac cine MRI

    NASA Astrophysics Data System (ADS)

    Elnakib, Ahmed; Beache, Garth M.; Gimel'farb, Georgy; Inanc, Tamer; El-Baz, Ayman

    2013-10-01

    This paper focuses on validating a novel framework for estimating the functional strain from cine cardiac magnetic resonance imaging (CMRI). The framework consists of three processing steps. First, the left ventricle (LV) wall borders are segmented using a level-set based deformable model. Second, the points on the wall borders are tracked during the cardiac cycle based on solving the Laplace equation between the LV edges. Finally, the circumferential and radial strains are estimated at the inner, mid-wall, and outer borders of the LV wall. The proposed framework is validated using synthetic phantoms of the material strains that account for the physiological features and the LV response during the cardiac cycle. Experimental results on simulated phantom images confirm the accuracy and robustness of our method.
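
    As a rough illustration of the final step of such a pipeline, circumferential strain can be computed from the length of a tracked border contour relative to its reference (end-diastolic) length, using the standard Lagrangian definition (L - L0) / L0. The sketch below applies that definition to synthetic points and is not the paper's implementation.

    ```python
    import numpy as np

    # Minimal sketch: Lagrangian circumferential strain from tracked contour points.

    def contour_length(points: np.ndarray) -> float:
        """Perimeter of a closed contour given as an (N, 2) array of tracked points."""
        diffs = np.diff(np.vstack([points, points[:1]]), axis=0)
        return float(np.linalg.norm(diffs, axis=1).sum())

    def circumferential_strain(ref_points: np.ndarray, cur_points: np.ndarray) -> float:
        L0 = contour_length(ref_points)
        return (contour_length(cur_points) - L0) / L0

    # Toy example: a circle contracting from radius 30 mm to 25 mm gives about -16.7% strain
    theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
    ref = 30.0 * np.column_stack([np.cos(theta), np.sin(theta)])
    cur = 25.0 * np.column_stack([np.cos(theta), np.sin(theta)])
    print(f"{circumferential_strain(ref, cur):.3f}")
    ```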

  4. Classification framework for partially observed dynamical systems

    NASA Astrophysics Data System (ADS)

    Shen, Yuan; Tino, Peter; Tsaneva-Atanasova, Krasimira

    2017-04-01

    We present a general framework for classifying partially observed dynamical systems based on the idea of learning in the model space. In contrast to the existing approaches using point estimates of model parameters to represent individual data items, we employ posterior distributions over model parameters, thus taking into account in a principled manner the uncertainty due to both the generative (observational and/or dynamic noise) and observation (sampling in time) processes. We evaluate the framework on two test beds: a biological pathway model and a stochastic double-well system. Crucially, we show that the classification performance is not impaired when the model structure used for inferring posterior distributions is much simpler than the observation-generating model structure, provided the reduced-complexity inferential model structure captures the essential characteristics needed for the given classification task.

  5. A Framework for Re-thinking Learning in Science from Recent Cognitive Science Perspectives

    NASA Astrophysics Data System (ADS)

    Tytler, Russell; Prain, Vaughan

    2010-10-01

    Recent accounts by cognitive scientists of factors affecting cognition imply the need to reconsider current dominant conceptual theories about science learning. These new accounts emphasize the role of context, embodied practices, and narrative-based representation rather than learners' cognitive constructs. In this paper we analyse data from a longitudinal study of primary school children's learning to outline a framework based on these contemporary accounts and to delineate key points of difference from conceptual change perspectives. The findings suggest this framework provides strong theoretical and practical insights into how children learn and the key role of representational negotiation in this learning. We argue that the nature and process of conceptual change can be re-interpreted in terms of the development of students' representational resources.

  6. Exploring the complementarity of THz pulse imaging and DCE-MRIs: Toward a unified multi-channel classification and a deep learning framework.

    PubMed

    Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S

    2016-12-01

    We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlying commonalities in both data structures so that a common multi-channel data fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly taking into consideration advances in multi-resolution analysis and model based fractional order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided and the importance of preserving textural information is highlighted. Feature extraction and classification methods taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions are presented. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward the direction of developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Pervasive randomness in physics: an introduction to its modelling and spectral characterisation

    NASA Astrophysics Data System (ADS)

    Howard, Roy

    2017-10-01

    An introduction to the modelling and spectral characterisation of random phenomena is detailed at a level consistent with a first exposure to the subject at an undergraduate level. A signal framework for defining a random process is provided and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
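
    For example, the Poisson point process mentioned above can be simulated by accumulating independent exponential inter-arrival times. The short sketch below (not from the paper) illustrates this standard construction with an illustrative rate and observation window.

    ```python
    import numpy as np

    # Minimal sketch: sample a homogeneous Poisson point process of rate lam on [0, T]
    # by accumulating i.i.d. exponential inter-arrival times.

    def poisson_point_process(lam: float, T: float, rng=None) -> np.ndarray:
        rng = np.random.default_rng() if rng is None else rng
        times = []
        t = rng.exponential(1.0 / lam)
        while t < T:
            times.append(t)
            t += rng.exponential(1.0 / lam)
        return np.array(times)

    events = poisson_point_process(lam=5.0, T=10.0)
    print(len(events))  # on average about lam * T = 50 events
    ```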

  8. Systems Reliability Framework for Surface Water Sustainability and Risk Management

    NASA Astrophysics Data System (ADS)

    Myers, J. R.; Yeghiazarian, L.

    2016-12-01

    With microbial contamination posing a serious threat to the availability of clean water across the world, it is necessary to develop a framework that evaluates the safety and sustainability of water systems in respect to non-point source fecal microbial contamination. The concept of water safety is closely related to the concept of failure in reliability theory. In water quality problems, the event of failure can be defined as the concentration of microbial contamination exceeding a certain standard for usability of water. It is pertinent in watershed management to know the likelihood of such an event of failure occurring at a particular point in space and time. Microbial fate and transport are driven by environmental processes taking place in complex, multi-component, interdependent environmental systems that are dynamic and spatially heterogeneous, which means these processes and therefore their influences upon microbial transport must be considered stochastic and variable through space and time. A physics-based stochastic model of microbial dynamics is presented that propagates uncertainty using a unique sampling method based on artificial neural networks to produce a correlation between watershed characteristics and spatial-temporal probabilistic patterns of microbial contamination. These results are used to address the question of water safety through several sustainability metrics: reliability, vulnerability, resilience and a composite sustainability index. System reliability is described uniquely through the temporal evolution of risk along watershed points or pathways. Probabilistic resilience describes how long the system is above a certain probability of failure, and the vulnerability metric describes how the temporal evolution of risk changes throughout a hierarchy of failure levels. Additionally, our approach allows for the identification of contributions in microbial contamination and uncertainty from specific pathways and sources. We expect that this framework will significantly improve the efficiency and precision of sustainable watershed management strategies by providing a better understanding of how watershed characteristics and environmental parameters affect surface water quality and sustainability.
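
    As a toy illustration of the reliability framing (not the authors' physics-based model), the probability of failure can be estimated by Monte Carlo sampling of a contamination distribution and comparing each draw against a usability standard. The distribution family, the standard, and the parameters below are hypothetical stand-ins.

    ```python
    import numpy as np

    # Illustrative Monte Carlo sketch: P(failure) = probability that microbial concentration
    # at a watershed point exceeds a usability standard; reliability = 1 - P(failure).

    rng = np.random.default_rng(42)
    standard = 235.0            # hypothetical criterion (e.g. CFU / 100 mL), assumed
    mu, sigma = 4.5, 1.2        # hypothetical log-scale parameters standing in for fate-and-transport drivers

    samples = rng.lognormal(mean=mu, sigma=sigma, size=100_000)
    p_failure = np.mean(samples > standard)
    print(f"P(failure) = {p_failure:.3f}, reliability = {1 - p_failure:.3f}")
    ```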

  9. Coming to See Objects of Knowledge: Guiding Student Conceptualization through Teacher Embodied Instruction in a Robotics Programming Class

    ERIC Educational Resources Information Center

    Kwah, Helen

    2013-01-01

    This thesis explores the questions of how a teacher guides students to see concepts, and the role of gesture and gesture viewpoints in mediating the process of guidance. To examine these questions, two sociocultural theoretical frameworks--Radford's cultural-semiotic theory of knowledge objectification (e.g., 2003), and Goldman's Points of Viewing…

  10. Characteristics in Restructuring High School Students' Frameworks of Gaseous Kinetics in Korea: A Psychological Point of View.

    ERIC Educational Resources Information Center

    Cho, In-Young; Park, Hyun-Ju; Choi, Byung-Soon

    This study was conducted to describe in detail Korean students' conceptual change learning processes in the study of kinetic theory of gases. The study was interpretive, using multiple data sources to achieve a triangulation of data. Three students from a public high school for boys served as representative cases. How epistemological aspect and…

  11. PCC Framework for Program-Generators

    NASA Technical Reports Server (NTRS)

    Kong, Soonho; Choi, Wontae; Yi, Kwangkeun

    2009-01-01

    In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached to the program-generator on the code producer side. The consumer receives the code with a fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.
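
    A minimal sketch of the single-pass validation idea, under the assumption that the analysis transfer function is monotone: the consumer re-applies the function once to the shipped fixed point and checks that no new facts appear. The `analysis_step` function and the fact tuples below are hypothetical stand-ins, not the abstract-parsing analysis itself.

    ```python
    # Consumer-side certificate check: a candidate set of facts is a fixed point
    # iff one more application of the analysis step adds nothing new.

    def is_valid_certificate(analysis_step, candidate: frozenset) -> bool:
        return analysis_step(candidate) <= candidate

    # Hypothetical transfer function: closes a set of grammar facts under one rule.
    def analysis_step(facts: frozenset) -> frozenset:
        derived = {("expr", "valid")} if ("term", "valid") in facts else set()
        return frozenset(facts | derived)

    certificate = frozenset({("term", "valid"), ("expr", "valid")})
    print(is_valid_certificate(analysis_step, certificate))   # True: single-pass validation
    ```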

  12. A model-based approach to wildland fire reconstruction using sediment charcoal records

    USGS Publications Warehouse

    Itter, Malcolm S.; Finley, Andrew O.; Hooten, Mevin B.; Higuera, Philip E.; Marlon, Jennifer R.; Kelly, Ryan; McLachlan, Jason S.

    2017-01-01

    Lake sediment charcoal records are used in paleoecological analyses to reconstruct fire history, including the identification of past wildland fires. One challenge of applying sediment charcoal records to infer fire history is the separation of charcoal associated with local fire occurrence and charcoal originating from regional fire activity. Despite a variety of methods to identify local fires from sediment charcoal records, an integrated statistical framework for fire reconstruction is lacking. We develop a Bayesian point process model to estimate the probability of fire associated with charcoal counts from individual-lake sediments and estimate mean fire return intervals. A multivariate extension of the model combines records from multiple lakes to reduce uncertainty in local fire identification and estimate a regional mean fire return interval. The univariate and multivariate models are applied to 13 lakes in the Yukon Flats region of Alaska. Both models resulted in similar mean fire return intervals (100–350 years) with reduced uncertainty under the multivariate model due to improved estimation of regional charcoal deposition. The point process model offers an integrated statistical framework for paleofire reconstruction and extends existing methods to infer regional fire history from multiple lake records with uncertainty following directly from posterior distributions.
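
    A crude back-of-the-envelope analogue of one quantity the model estimates: given per-interval posterior probabilities of local fire, a point estimate of the mean fire return interval is the record length divided by the expected number of fires. The sketch below uses hypothetical interval lengths and probabilities purely for illustration and is unrelated to the Yukon Flats data or the paper's Bayesian model.

    ```python
    import numpy as np

    # Illustrative sketch: mean fire return interval from per-interval fire probabilities.
    rng = np.random.default_rng(7)
    interval_years = np.full(200, 15.0)                  # hypothetical deposition intervals
    p_fire = rng.beta(0.5, 8.0, size=200)                # hypothetical per-interval fire probabilities

    expected_fires = p_fire.sum()
    mfri = interval_years.sum() / expected_fires
    print(f"expected fires: {expected_fires:.1f}, mean fire return interval: {mfri:.0f} years")
    ```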

  13. Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data

    NASA Astrophysics Data System (ADS)

    Deng, Xinyi

    2016-08-01

    A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within such a physical system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn about the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical systems are driven by the dynamics of some stochastic state variables and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize the rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients with the goal of optimizing placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decisions in real time (for example, to stimulate the neurons or not) based on various sources of information present in population spiking data. Lastly, we proposed a general three-step paradigm that allows us to relate behavioral outcomes of various tasks to simultaneously recorded neural activity across multiple brain areas, which is a step towards closed-loop therapies for psychological diseases using real-time neural stimulation. These methods are suitable for real-time implementation for content-based feedback experiments.
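
    The generative idea behind state-space models with point process observations can be sketched in a few lines: a latent stochastic state drives the conditional intensity of the observed spikes. The discretized simulation below is a generic illustration with made-up parameters, not one of the thesis's models.

    ```python
    import numpy as np

    # Minimal sketch: a latent AR(1) state x_t drives the conditional intensity of a
    # spiking process observed in small time bins (Bernoulli approximation).

    rng = np.random.default_rng(0)
    T, dt = 2000, 0.001          # number of bins, bin width in seconds
    a, q = 0.995, 0.01           # AR(1) coefficient and state noise variance
    mu = np.log(20.0)            # baseline log-rate (20 spikes/s)

    x = np.zeros(T)
    spikes = np.zeros(T, dtype=int)
    for t in range(1, T):
        x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
        lam = np.exp(mu + x[t])                  # conditional intensity (spikes/s)
        spikes[t] = rng.random() < lam * dt      # at most one spike per small bin

    print(spikes.sum(), "spikes in", T * dt, "seconds")
    ```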

  14. Creating a process for incorporating epidemiological modelling into outbreak management decisions.

    PubMed

    Akselrod, Hana; Mercon, Monica; Kirkeby Risoe, Petter; Schlegelmilch, Jeffrey; McGovern, Joanne; Bogucki, Sandy

    2012-01-01

    Modern computational models of infectious diseases greatly enhance our ability to understand new infectious threats and assess the effects of different interventions. The recently-released CDC Framework for Preventing Infectious Diseases calls for increased use of predictive modelling of epidemic emergence for public health preparedness. Currently, the utility of these technologies in preparedness and response to outbreaks is limited by gaps between modelling output and information requirements for incident management. The authors propose an operational structure that will facilitate integration of modelling capabilities into action planning for outbreak management, using the Incident Command System (ICS) and Synchronization Matrix framework. It is designed to be adaptable and scalable for use by state and local planners under the National Response Framework (NRF) and Emergency Support Function #8 (ESF-8). Specific epidemiological modelling requirements are described, and integrated with the core processes for public health emergency decision support. These methods can be used in checklist format to align prospective or real-time modelling output with anticipated decision points, and guide strategic situational assessments at the community level. It is anticipated that formalising these processes will facilitate translation of the CDC's policy guidance from theory to practice during public health emergencies involving infectious outbreaks.

  15. CALIBRATED ULTRA FAST IMAGE SIMULATIONS FOR THE DARK ENERGY SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruderer, Claudio; Chang, Chihway; Refregier, Alexandre

    2016-01-20

    Image simulations are becoming increasingly important in understanding the measurement process of the shapes of galaxies for weak lensing and the associated systematic effects. For this purpose we present the first implementation of the Monte Carlo Control Loops (MCCL), a coherent framework for studying systematic effects in weak lensing. It allows us to model and calibrate the shear measurement process using image simulations from the Ultra Fast Image Generator (UFig) and the image analysis software SExtractor. We apply this framework to a subset of the data taken during the Science Verification period (SV) of the Dark Energy Survey (DES). We calibrate the UFig simulations to be statistically consistent with one of the SV images, which covers ∼0.5 square degrees. We then perform tolerance analyses by perturbing six simulation parameters and study their impact on the shear measurement at the one-point level. This allows us to determine the relative importance of different parameters. For spatially constant systematic errors and point-spread function, the calibration of the simulation reaches the weak lensing precision needed for the DES SV survey area. Furthermore, we find a sensitivity of the shear measurement to the intrinsic ellipticity distribution, and an interplay between the magnitude-size and the pixel value diagnostics in constraining the noise model. This work is the first application of the MCCL framework to data and shows how it can be used to methodically study the impact of systematics on the cosmic shear measurement.

  16. Ethics and mediation.

    PubMed

    Patthoff, D E

    1993-12-01

    Ethics dialogue in this case is first used as a framework to initiate reflection on which forms of conflict resolution are appropriate in specific situations. This helps in planning and strategies, but does not guarantee what the outcome will actually be. Ethics dialogue, however, can also be used as a form of conflict resolution. For example, when the patient in the story wants to avoid revealing the names of her past dentists, an ethical framework could be presented that would respect her autonomy (an ethical term) and her right to privacy (a legal term), while still addressing your need to determine if the primary problem is of an ethical or dental nature, and if your role is to be that of a healing mediator or a healing dentist. This same form of conflict resolution could also be applied elsewhere in the story. For example, ethics dialogue would have been appropriate during the consultation between you and the endodontist, or between you and the patient, prior to the lawyer's formal request for the patient's records. It is difficult, however, for you to reduce conflict through an ethical dialogue once the lawyer requests information from you because, at that point, the adjudication process has already begun. The ethical reflection exercise will, however, help you negotiate through the adjudication process by providing a solid ethical reference point concerning conflict resolution. The February issue's ethics column will provide a framework for evaluating the forms of power available in conflict resolution in terms of justice.

  17. Integrating social networks and human social motives to achieve social influence at scale

    PubMed Central

    Contractor, Noshir S.; DeChurch, Leslie A.

    2014-01-01

    The innovations of science often point to ideas and behaviors that must spread and take root in communities to have impact. Ideas, practices, and behaviors need to go from accepted truths on the part of a few scientists to commonplace beliefs and norms in the minds of the many. Moving from scientific discoveries to public good requires social influence. We introduce a structured influence process (SIP) framework to explain how social networks (i.e., the structure of social influence) and human social motives (i.e., the process of social influence wherein one person’s attitudes and behaviors affect another’s) are used collectively to enact social influence within a community. The SIP framework advances the science of scientific communication by positing social influence events that consider both the “who” and the “how” of social influence. This framework synthesizes core ideas from two bodies of research on social influence. The first is network research on social influence structures, which identifies who are the opinion leaders and who among their network of peers shapes their attitudes and behaviors. The second is research on social influence processes in psychology, which explores how human social motives such as the need for accuracy or the need for affiliation stimulate behavior change. We illustrate the practical implications of the SIP framework by applying it to the case of reducing neonatal mortality in India. PMID:25225373

  18. Integrating social networks and human social motives to achieve social influence at scale.

    PubMed

    Contractor, Noshir S; DeChurch, Leslie A

    2014-09-16

    The innovations of science often point to ideas and behaviors that must spread and take root in communities to have impact. Ideas, practices, and behaviors need to go from accepted truths on the part of a few scientists to commonplace beliefs and norms in the minds of the many. Moving from scientific discoveries to public good requires social influence. We introduce a structured influence process (SIP) framework to explain how social networks (i.e., the structure of social influence) and human social motives (i.e., the process of social influence wherein one person's attitudes and behaviors affect another's) are used collectively to enact social influence within a community. The SIP framework advances the science of scientific communication by positing social influence events that consider both the "who" and the "how" of social influence. This framework synthesizes core ideas from two bodies of research on social influence. The first is network research on social influence structures, which identifies who are the opinion leaders and who among their network of peers shapes their attitudes and behaviors. The second is research on social influence processes in psychology, which explores how human social motives such as the need for accuracy or the need for affiliation stimulate behavior change. We illustrate the practical implications of the SIP framework by applying it to the case of reducing neonatal mortality in India.

  19. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The Geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the framework of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery.
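
    A minimal sketch of the tiling step in such a divide-and-conquer workflow (not IQLib code): a large raster is cut into fixed-size tiles that can be dispatched to processing nodes and later stitched back together. The tile size and the input array are illustrative.

    ```python
    import numpy as np

    # Split a raster into fixed-size tiles, yielding the tile origin and the tile data.
    def tile_raster(raster: np.ndarray, tile: int = 256):
        rows, cols = raster.shape
        for r in range(0, rows, tile):
            for c in range(0, cols, tile):
                yield (r, c), raster[r:r + tile, c:c + tile]

    raster = np.random.rand(1000, 1200)                 # stand-in for country-wide imagery
    tiles = list(tile_raster(raster, tile=256))
    print(len(tiles), "tiles; first tile shape:", tiles[0][1].shape)
    ```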

  20. Critical Events in the Lives of Interns

    PubMed Central

    Graham, Mark; Schmidt, Hilary; Stern, David T.; Miller, Steven Z.

    2008-01-01

    BACKGROUND Early residency is a crucial time in the professional development of physicians. As interns assume primary care for their patients, they take on new responsibilities. The events they find memorable during this time could provide us with insight into their developing professional identities. OBJECTIVE To evaluate the most critical events in the lives of interns. PARTICIPANTS Forty-one internal medicine residents at one program participated in a two-day retreat in the fall of their first year. Each resident provided a written description of a recent high point, low point, and patient conflict. MEASUREMENTS We used a variant of grounded theory to analyze these critical incidents and determine the underlying themes of early internship. Independent inter-rater agreement of >90% was achieved for the coding of excerpts. MAIN RESULTS The 123 critical incidents were clustered into 23 categories. The categories were further organized into six themes: confidence, life balance, connections, emotional responses, managing expectations, and facilitating teamwork. High points were primarily in the themes of confidence and connections. Low points were dispersed more generally throughout the conceptual framework. Conflicts with patients were about negotiating the expectations inherent in the physician–patient relationship. CONCLUSION The high points, low points, and conflicts reported by early residents provide us with a glimpse into the lives of interns. The themes we have identified reflect critical challenges interns face in the development of their professional identity. Program directors could use this process and conceptual framework to guide the development and promotion of residents’ emerging professional identities. PMID:18972091

  1. High-Dimensional Bayesian Geostatistics

    PubMed Central

    Banerjee, Sudipto

    2017-01-01

    With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations with complexity increasing in cubic order for the number of spatial locations and temporal points. This renders such models unfeasible for large data sets. This article offers a focused review of two methods for constructing well-defined highly scalable spatiotemporal stochastic processes. Both these processes can be used as “priors” for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity has ~ n floating point operations (flops), where n is the number of spatial locations (per iteration). We compare these methods and provide some insight into their methodological underpinnings. PMID:29391920
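
    The scalability of the NNGP comes from each location, after ordering, conditioning only on a small set of its nearest previously-ordered neighbors, which induces a sparse precision matrix. The sketch below only builds those neighbor sets with a brute-force search; it is a conceptual illustration under assumed settings, not a full NNGP implementation.

    ```python
    import numpy as np

    # Minimal sketch of the NNGP neighbor structure: each ordered location conditions
    # on at most m nearest predecessors, giving O(n * m) dependencies instead of O(n^2).

    def nngp_neighbor_sets(coords: np.ndarray, m: int = 5):
        n = coords.shape[0]
        neighbors = [np.array([], dtype=int)]            # first location has no parents
        for i in range(1, n):
            d = np.linalg.norm(coords[:i] - coords[i], axis=1)
            k = min(m, i)
            neighbors.append(np.argsort(d)[:k])          # indices of nearest predecessors
        return neighbors

    rng = np.random.default_rng(1)
    coords = rng.uniform(size=(1000, 2))                 # 1000 random spatial locations
    sets = nngp_neighbor_sets(coords, m=10)
    print(sum(len(s) for s in sets))                      # number of nonzero dependencies
    ```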

  2. High-Dimensional Bayesian Geostatistics.

    PubMed

    Banerjee, Sudipto

    2017-06-01

    With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations with complexity increasing in cubic order for the number of spatial locations and temporal points. This renders such models unfeasible for large data sets. This article offers a focused review of two methods for constructing well-defined highly scalable spatiotemporal stochastic processes. Both these processes can be used as "priors" for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity has ~ n floating point operations (flops), where n is the number of spatial locations (per iteration). We compare these methods and provide some insight into their methodological underpinnings.

  3. Process of e⁺e⁻ → ππX (3823) in the soft pion

    DOE PAGES

    Voloshin, M. B.

    2015-06-23

    The production of the resonance X(3823), identified as the charmonium ³D₂ state, in the process e⁺e⁻ → ππX(3823) has been recently reported by BESIII. Here it is pointed out that this process is fully described, up to one overall coupling constant, in the soft pion limit. An interpretation of the available and possible future data within the discussed theoretical framework may reveal new features of the charmoniumlike states. In particular, the observed relative yield for this process at different energies strongly suggests a very significant enhancement of the amplitude at the charmoniumlike peak near 4.36 GeV.

  4. The business process management software for successful quality management and organization: A case study from the University of Split School of Medicine.

    PubMed

    Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko

    2016-05-01

    Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school founded on the business process analysis (BPA) software tool. The BPA software tool was used as the core element for description of all working processes in our medical school, and subsequently the system served as the comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear and inadequate points in these processes, and subsequently the respective improvements, an increase of the QM level and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for development of a common QM framework allowing continuous quality control, i.e. the adjustments and adaptation to contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  5. Stability of Mixed-Strategy-Based Iterative Logit Quantal Response Dynamics in Game Theory

    PubMed Central

    Zhuang, Qian; Di, Zengru; Wu, Jinshan

    2014-01-01

    Using the Logit quantal response form as the response function in each step, the original definition of static quantal response equilibrium (QRE) is extended into an iterative evolution process. QREs remain as the fixed points of the dynamic process. However, depending on whether such fixed points are the long-term solutions of the dynamic process, they can be classified into stable (SQREs) and unstable (USQREs) equilibria. This extension resembles the extension from static Nash equilibria (NEs) to evolutionarily stable solutions in the framework of evolutionary game theory. The relation between SQREs and other solution concepts of games, including NEs and QREs, is discussed. Using experimental data from other published papers, we perform a preliminary comparison between SQREs, NEs, QREs and the observed behavioral outcomes of those experiments. For certain games, we determine that SQREs have better predictive power than QREs and NEs. PMID:25157502
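
    A minimal sketch of the iterative logit response dynamics on a 2x2 game (the payoff matrices and the rationality parameter lambda are illustrative, not from the paper): each step maps the current mixed-strategy profile to the players' logit (softmax) responses, and a profile that the iteration converges to is a candidate stable QRE.

    ```python
    import numpy as np

    # Iterative logit quantal response dynamics on an illustrative 2x2 coordination game.
    A = np.array([[3.0, 0.0], [0.0, 2.0]])   # player 1 payoffs (row chooser)
    B = np.array([[3.0, 0.0], [0.0, 2.0]])   # player 2 payoffs (column chooser)
    lam = 2.0                                # rationality parameter

    def softmax(u):
        e = np.exp(lam * (u - u.max()))
        return e / e.sum()

    p = np.array([0.9, 0.1])                 # initial mixed strategy of player 1
    q = np.array([0.2, 0.8])                 # initial mixed strategy of player 2
    for _ in range(200):
        p_new = softmax(A @ q)               # logit response of player 1 to q
        q_new = softmax(B.T @ p)             # logit response of player 2 to p
        p, q = p_new, q_new

    print("candidate SQRE:", p.round(3), q.round(3))
    ```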

  6. A patient-specific segmentation framework for longitudinal MR images of traumatic brain injury

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Prastawa, Marcel; Irimia, Andrei; Chambers, Micah C.; Vespa, Paul M.; Van Horn, John D.; Gerig, Guido

    2012-02-01

    Traumatic brain injury (TBI) is a major cause of death and disability worldwide. Robust, reproducible segmentations of MR images with TBI are crucial for quantitative analysis of recovery and treatment efficacy. However, this is a significant challenge due to severe anatomy changes caused by edema (swelling), bleeding, tissue deformation, skull fracture, and other effects related to head injury. In this paper, we introduce a multi-modal image segmentation framework for longitudinal TBI images. The framework is initialized through manual input of primary lesion sites at each time point, which are then refined by a joint approach composed of Bayesian segmentation and construction of a personalized atlas. The personalized atlas construction estimates the average of the posteriors of the Bayesian segmentation at each time point and warps the average back to each time point to provide the updated priors for Bayesian segmentation. The difference between our approach and segmenting longitudinal images independently is that we use the information from all time points to improve the segmentations. Given a manual initialization, our framework automatically segments healthy structures (white matter, grey matter, cerebrospinal fluid) as well as different lesions such as hemorrhagic lesions and edema. Our framework can handle different sets of modalities at each time point, which provides flexibility in analyzing clinical scans. We show results on three subjects with acute baseline scans and chronic follow-up scans. The results demonstrate that joint analysis of all the time points yields improved segmentation compared to independent analysis of the individual time points.

  7. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    PubMed

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling, from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in its regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  8. Creation of a diagnostic wait times measurement framework based on evidence and consensus.

    PubMed

    Gilbert, Julie E; Dobrow, Mark J; Kaan, Melissa; Dobranowski, Julian; Srigley, John R; Jusko Friedman, Audrey; Irish, Jonathan C

    2014-09-01

    Public reporting of wait times worldwide has to date focused largely on treatment wait times and is limited in its ability to capture earlier parts of the patient journey. The interval between suspicion and diagnosis or ruling out of cancer is a complex phase of the cancer journey. Diagnostic delays and inefficient use of diagnostic imaging procedures can result in poor patient outcomes, both physical and psychosocial. This study was designed to develop a framework that could be adopted for multiple disease sites across different jurisdictions to enable the measurement of diagnostic wait times and diagnostic delay. Diagnostic benchmarks and targets in cancer systems were explored through a targeted literature review and jurisdictional scan. Cancer system leaders and clinicians were interviewed to validate the information found in the jurisdictional scan. An expert panel was assembled to review and, through a modified Delphi consensus process, provide feedback on a diagnostic wait times framework. The consensus process resulted in agreement on a measurement framework that identified suspicion, referral, diagnosis, and treatment as the main time points for measuring this critical phase of the patient journey. This work will help guide initiatives designed to improve patient access to health services by developing an evidence-based approach to standardization of the various waypoints during the diagnostic pathway. The diagnostic wait times measurement framework provides a yardstick to measure the performance of programs that are designed to manage and expedite care processes between referral and diagnosis or ruling out of cancer. Copyright © 2014 by American Society of Clinical Oncology.

  9. A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments.

    PubMed

    Langer, Astrid

    2012-08-16

    Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies inherent in existing HEE quality appraisal instruments. These shortcomings of existing HEE quality appraisal instruments are illustrated by the pilot test.

  10. A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments

    PubMed Central

    2012-01-01

    Background Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. Methods To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. Results The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. Conclusions The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies inherent in existing HEE quality appraisal instruments. These shortcomings of existing HEE quality appraisal instruments are illustrated by the pilot test. PMID:22894708

  11. Sex, gender, and health biotechnology: points to consider

    PubMed Central

    2009-01-01

    Background Reproductive technologies have been extensively debated in the literature. As well, feminist economists, environmentalists, and agriculturalists have generated substantial debate and literature on gender. However, the implications for women of health biotechnologies have received relatively less attention. Surprisingly, while gender based frameworks have been proposed in the context of public health policy, practice, health research, and epidemiological research, we could identify no systematic framework for gender analysis of health biotechnology in the developing world. Discussion We propose sex and gender considerations at five critical stages of health biotechnology research and development: priority setting; technology design; clinical trials; commercialization, and health services delivery. Summary Applying a systematic sex and gender framework to five key process stages of health biotechnology research and development could be a first step towards unlocking the opportunities of this promising science for women in the developing world. PMID:19622163

  12. A physics-based crystallographic modeling framework for describing the thermal creep behavior of Fe-Cr alloys

    DOE PAGES

    Wen, Wei; Capolungo, Laurent; Patra, Anirban; ...

    2017-02-23

    In this work, a physics-based thermal creep model is developed based on an understanding of the microstructure in Fe-Cr alloys. The model is built on a transition-state-theory framework that considers the distribution of internal stresses at the sub-material-point level. Thermally activated dislocation glide and climb mechanisms are coupled in the obstacle-bypass processes for both dislocation- and precipitate-type barriers. A kinetic law is proposed to track the evolution of dislocation densities in the subgrain interior and in the cell wall. The predicted results show that this model, embedded in the visco-plastic self-consistent (VPSC) framework, captures the primary and steady-state creep behavior well under various loading conditions. We also discuss the roles of the mechanisms involved.
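    The abstract does not give the form of the proposed kinetic law, so the following Kocks-Mecking-type evolution equation is only an illustrative sketch of the kind of dislocation-density law such models typically use, with a storage term, a dynamic-recovery term, and the shear rate tied to the mobile density through the Orowan relation; all symbols here are assumptions for illustration, not the paper's notation:

        \frac{d\rho}{d\gamma} = k_1 \sqrt{\rho} - k_2(\dot{\gamma}, T)\,\rho,
        \qquad
        \dot{\gamma} = \rho_m\, b\, \bar{v},

    where \rho is the dislocation density in a given region (subgrain interior or cell wall), k_1 and k_2 are storage and recovery coefficients, \rho_m is the mobile dislocation density, b the Burgers vector, and \bar{v} the mean dislocation velocity.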

  13. A View of the Tip of the Iceberg: Revisiting Conceptual Continuities and Their Implications for Science Learning

    ERIC Educational Resources Information Center

    Brown, Bryan A.; Kloser, Matt

    2009-01-01

    We respond to Hwang and Kim and Yeo's critiques of the conceptual continuity framework in science education. First, we address the criticism that their analysis fails to recognize the situated perspective of learning by denying the dichotomy of the formal and informal knowledge as a starting point in the learning process. Second, we address the…

  14. Modeling Menstrual Cycle Length and Variability at the Approach of Menopause Using Hierarchical Change Point Models

    PubMed Central

    Huang, Xiaobi; Elliott, Michael R.; Harlow, Siobán D.

    2013-01-01

    SUMMARY As women approach menopause, the patterns of their menstrual cycle lengths change. To study these changes, we need to jointly model both the mean and variability of cycle length. Our proposed model incorporates separate mean and variance change points for each woman and a hierarchical model to link them together, along with regression components to include predictors of menopausal onset such as age at menarche and parity. Additional complexity arises from the fact that the calendar data have substantial missingness due to hormone use, surgery, and failure to report. We integrate multiple imputation and time-to-event modeling in a Bayesian estimation framework to deal with different forms of missingness. Posterior predictive model checks are applied to evaluate the model fit. Our method successfully models patterns of women’s menstrual cycle trajectories throughout their late reproductive life and identifies change points for mean and variability of segment length, providing insight into the menopausal process. More generally, our model points the way toward increasing use of joint mean-variance models to predict health outcomes and better understand disease processes. PMID:24729638
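    As an illustration of what a hierarchical mean-and-variance change point model of this kind can look like (the exact parameterization used in the paper may differ), one can write, for woman i at time t:

        y_i(t) \sim \mathcal{N}\!\big(\mu_i(t),\, \sigma_i^2(t)\big), \qquad
        \mu_i(t) = \alpha_{0i} + \alpha_{1i}\,(t - \tau_i^{\mu})_{+}, \qquad
        \log \sigma_i^2(t) = \beta_{0i} + \beta_{1i}\,(t - \tau_i^{\sigma})_{+},

    where (x)_{+} = max(x, 0), \tau_i^{\mu} and \tau_i^{\sigma} are the woman-specific mean and variance change points, and the woman-level parameters are drawn from population distributions whose means depend on covariates such as age at menarche and parity. All symbols are illustrative assumptions.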

  15. Toward a framework for levels of robot autonomy in human-robot interaction.

    PubMed

    Beer, Jenay M; Fisk, Arthur D; Rogers, Wendy A

    2014-07-01

    A critical construct related to human-robot interaction (HRI) is autonomy, which varies widely across robot platforms. Levels of robot autonomy (LORA), ranging from teleoperation to fully autonomous systems, influence the way in which humans and robots may interact with one another. Thus, there is a need to understand HRI by identifying variables that influence - and are influenced by - robot autonomy. Our overarching goal is to develop a framework for levels of robot autonomy in HRI. To reach this goal, the framework draws links between HRI and human-automation interaction, a field with a long history of studying and understanding human-related variables. The construct of autonomy is reviewed and redefined within the context of HRI. Additionally, the framework proposes a process for determining a robot's autonomy level, by categorizing autonomy along a 10-point taxonomy. The framework is intended to be treated as guidelines to determine autonomy, categorize the LORA along a qualitative taxonomy, and consider which HRI variables (e.g., acceptance, situation awareness, reliability) may be influenced by the LORA.

  16. Toward a framework for levels of robot autonomy in human-robot interaction

    PubMed Central

    Beer, Jenay M.; Fisk, Arthur D.; Rogers, Wendy A.

    2017-01-01

    A critical construct related to human-robot interaction (HRI) is autonomy, which varies widely across robot platforms. Levels of robot autonomy (LORA), ranging from teleoperation to fully autonomous systems, influence the way in which humans and robots may interact with one another. Thus, there is a need to understand HRI by identifying variables that influence – and are influenced by – robot autonomy. Our overarching goal is to develop a framework for levels of robot autonomy in HRI. To reach this goal, the framework draws links between HRI and human-automation interaction, a field with a long history of studying and understanding human-related variables. The construct of autonomy is reviewed and redefined within the context of HRI. Additionally, the framework proposes a process for determining a robot’s autonomy level, by categorizing autonomy along a 10-point taxonomy. The framework is intended to be treated as guidelines to determine autonomy, categorize the LORA along a qualitative taxonomy, and consider which HRI variables (e.g., acceptance, situation awareness, reliability) may be influenced by the LORA. PMID:29082107

  17. Concentration-driven models revisited: towards a unified framework to model settling tanks in water resource recovery facilities.

    PubMed

    Torfs, Elena; Martí, M Carmen; Locatelli, Florent; Balemans, Sophie; Bürger, Raimund; Diehl, Stefan; Laurent, Julien; Vanrolleghem, Peter A; François, Pierre; Nopens, Ingmar

    2017-02-01

    A new perspective on the modelling of settling behaviour in water resource recovery facilities is introduced. The ultimate goal is to describe in a unified way the processes taking place both in primary settling tanks (PSTs) and secondary settling tanks (SSTs) for a more detailed operation and control. First, experimental evidence is provided, pointing out distributed particle properties (such as size, shape, density, porosity, and flocculation state) as an important common source of distributed settling behaviour in different settling unit processes and throughout different settling regimes (discrete, hindered and compression settling). Subsequently, a unified model framework that considers several particle classes is proposed in order to describe distributions in settling behaviour as well as the effect of variations in particle properties on the settling process. The result is a set of partial differential equations (PDEs) that are valid from dilute concentrations, where they correspond to discrete settling, to concentrated suspensions, where they correspond to compression settling. Consequently, these PDEs model both PSTs and SSTs.

  18. SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio

    2016-08-01

    SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework, including Earth Observation-specific tools. SenSyF is both a development and validation platform for data-intensive applications using Earth Observation data. With SenSyF, scientific, institutional or commercial institutions developing EO-based applications and services can take advantage of distributed computational and storage resources, tailored for applications dependent on big Earth Observation data, and without resorting to deep infrastructure and technological investments. This paper describes the integration process and the experience gathered from different EO Service providers during the project.

  19. Design and Implementation of an Architectural Framework for Web Portals in a Ubiquitous Pervasive Environment

    PubMed Central

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal’s gateway for the information flow that causes network traffic over the Internet. Moreover, it provides real time/dynamic access to the stored information, but not access to the real time information. This inherent functionality of web portals limits their role for resource constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than having to be routed over the Internet. Moreover, our framework enables one-to-one device communication for real time information flow. To provide an in-depth analysis, we first provide an analytical model for query processing at the servers for our framework-oriented web portal. Finally, we deployed a testbed, one of the world’s largest IP-based wireless sensor network testbeds, and the real-time measurements observed confirm the efficacy and workability of the proposed framework. PMID:22346693

  20. Design and implementation of an architectural framework for web portals in a ubiquitous pervasive environment.

    PubMed

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal's gateway for the information flow that causes network traffic over the Internet. Moreover, it provides real time/dynamic access to the stored information, but not access to the real time information. This inherent functionality of web portals limits their role for resource constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than having to be routed over the Internet. Moreover, our framework enables one-to-one device communication for real time information flow. To provide an in-depth analysis, we first provide an analytical model for query processing at the servers for our framework-oriented web portal. Finally, we deployed a testbed, one of the world's largest IP-based wireless sensor network testbeds, and the real-time measurements observed confirm the efficacy and workability of the proposed framework.

  1. Interactive Classification of Construction Materials: Feedback Driven Framework for Annotation and Analysis of 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Hess, M. R.; Petrovic, V.; Kuester, F.

    2017-08-01

    Digital documentation of cultural heritage structures is increasingly common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
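    The user-defined classification functions are described only at a high level, so the sketch below is a minimal, hypothetical illustration of rule-based per-point labeling from color, intensity and normal direction; the class names and thresholds are invented for the example and are not those of the framework.

        import numpy as np

        def classify_points(rgb, intensity, normals):
            """Toy rule-based material labels for a point cloud.

            rgb: (N, 3) colors in [0, 1]; intensity: (N,) laser intensity in [0, 1];
            normals: (N, 3) unit normals. Thresholds and classes are hypothetical.
            """
            labels = np.full(len(rgb), "unknown", dtype=object)
            brightness = rgb.mean(axis=1)
            verticality = np.abs(normals[:, 2])      # close to 1.0 for horizontal surfaces
            labels[(brightness > 0.6) & (intensity > 0.5)] = "plaster"
            labels[(brightness < 0.4) & (verticality < 0.3)] = "brick_wall"
            labels[verticality > 0.9] = "floor_or_ceiling"
            return labels

        rng = np.random.default_rng(0)
        n = 5
        print(classify_points(rng.random((n, 3)), rng.random(n), rng.random((n, 3))))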

  2. A framework of quality improvement interventions to implement evidence-based practices for pressure ulcer prevention.

    PubMed

    Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Valuck, Robert J

    2014-06-01

    To enhance the learner's competence with knowledge about a framework of quality improvement (QI) interventions to implement evidence-based practices for pressure ulcer (PrU) prevention. This continuing education activity is intended for physicians and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Summarize the process of creating and initiating the best-practice framework of QI for PrU prevention. 2. Identify the domains and QI interventions for the best-practice framework of QI for PrU prevention. Pressure ulcer (PrU) prevention is a priority issue in US hospitals. The National Pressure Ulcer Advisory Panel endorses an evidence-based practice (EBP) protocol to help prevent PrUs. Effective implementation of EBPs requires systematic change of existing care units. Quality improvement interventions offer a mechanism of change to existing structures in order to effectively implement EBPs for PrU prevention. The best-practice framework developed by Nelson et al is a useful model of quality improvement interventions that targets process improvement in 4 domains: leadership, staff, information and information technology, and performance and improvement. At 2 academic medical centers, the best-practice framework was shown to physicians, nurses, and health services researchers. Their insight was used to modify the best-practice framework as a reference tool for quality improvement interventions in PrU prevention. The revised framework includes 25 elements across 4 domains. Many of these elements support EBPs for PrU prevention, such as updates in PrU staging and risk assessment. The best-practice framework offers a reference point to initiating a bundle of quality improvement interventions in support of EBPs. Hospitals and clinicians tasked with quality improvement efforts can use this framework to problem-solve PrU prevention and other critical issues.

  3. FPGA implemented testbed in 8-by-8 and 2-by-2 OFDM-MIMO channel estimation and design of baseband transceiver.

    PubMed

    Ramesh, S; Seshasayanan, R

    2016-01-01

    In this study, a baseband OFDM-MIMO system with channel estimation and timing synchronization is designed and implemented using FPGA technology. The system is prototyped according to the IEEE 802.11a standard, with signals transmitted and received over a 20 MHz bandwidth. With QPSK modulation, the system achieves a throughput of 24 Mbps. Furthermore, the least-squares (LS) algorithm is implemented and the estimation of a frequency-selective fading channel is demonstrated. For coarse timing estimation, the MNC scheme is examined and implemented. First, the entire system is modeled in MATLAB and a floating-point model is established. A fixed-point model is then built with the help of Simulink and Xilinx's System Generator for DSP. The system is subsequently synthesized and implemented within Xilinx's ISE tools and targeted to a Xilinx Virtex 5 board. In addition, hardware co-simulation is used to reduce processing time when computing the BER of the fixed-point model. This work constitutes a first step towards the design of novel channel estimation strategies for fourth-generation (4G) mobile communication systems.
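    For readers unfamiliar with the least-squares (LS) channel estimator mentioned above, the sketch below shows the standard per-subcarrier LS estimate H[k] = Y[k]/X[k] on known pilots; the pilot count and modulation are illustrative and do not reproduce the 802.11a configuration of the paper.

        import numpy as np

        def ls_channel_estimate(tx_pilots, rx_pilots):
            # Per-subcarrier least-squares estimate: H_ls[k] = Y[k] / X[k]
            return rx_pilots / tx_pilots

        rng = np.random.default_rng(0)
        n_pilots = 16
        tx = (2 * rng.integers(0, 2, n_pilots) - 1).astype(complex)                    # BPSK pilots
        h_true = (rng.normal(size=n_pilots) + 1j * rng.normal(size=n_pilots)) / np.sqrt(2)
        noise = 0.05 * (rng.normal(size=n_pilots) + 1j * rng.normal(size=n_pilots))
        rx = h_true * tx + noise
        print(np.max(np.abs(ls_channel_estimate(tx, rx) - h_true)))                    # small residual error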

  4. Message survival and decision dynamics in a class of reactive complex systems subject to external fields

    NASA Astrophysics Data System (ADS)

    Rodriguez Lucatero, C.; Schaum, A.; Alarcon Ramos, L.; Bernal-Jaquez, R.

    2014-07-01

    In this study, the dynamics of decisions in complex networks subject to external fields are studied within a Markov process framework using nonlinear dynamical systems theory. A mathematical discrete-time model is derived using a set of basic assumptions regarding the convincement mechanisms associated with two competing opinions. The model is analyzed with respect to the multiplicity of critical points and the stability of extinction states. Sufficient conditions for extinction are derived in terms of the convincement probabilities and the maximum eigenvalues of the associated connectivity matrices. The influences of exogenous (e.g., mass media-based) effects on decision behavior are analyzed qualitatively. The current analysis predicts: (i) the presence of fixed-point multiplicity (with a maximum number of four different fixed points), multi-stability, and sensitivity with respect to the process parameters; and (ii) the bounded but significant impact of exogenous perturbations on the decision behavior. These predictions were verified using a set of numerical simulations based on a scale-free network topology.
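    The sufficient extinction conditions are stated in terms of convincement probabilities and the maximum eigenvalues of the connectivity matrices; the exact inequalities are not reproduced in the abstract, so the snippet below only illustrates the general type of spectral test involved, with an assumed condition of the form p * lambda_max(A) < 1.

        import numpy as np

        def opinion_dies_out(adjacency, p_convince):
            # Assumed illustrative test: extinction if p * lambda_max(A) < 1.
            lam_max = np.max(np.abs(np.linalg.eigvals(adjacency)))
            return p_convince * lam_max < 1.0

        A = np.array([[0.0, 1.0, 1.0],
                      [1.0, 0.0, 1.0],
                      [1.0, 1.0, 0.0]])          # complete graph on 3 nodes, lambda_max = 2
        print(opinion_dies_out(A, 0.3))          # True: 0.3 * 2 = 0.6 < 1
        print(opinion_dies_out(A, 0.7))          # False: 0.7 * 2 = 1.4 >= 1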

  5. Development of Quadratic Programming Algorithm Based on Interior Point Method with Estimation Mechanism of Active Constraints

    NASA Astrophysics Data System (ADS)

    Hashimoto, Hiroyuki; Takaguchi, Yusuke; Nakamura, Shizuka

    Instability of the calculation process and growth in calculation time with increasing problem size remain the major obstacles to applying continuous optimization to practical industrial systems. This paper proposes an enhanced quadratic programming algorithm based on the interior point method, aimed mainly at improving calculation stability. The proposed method has a dynamic estimation mechanism for active constraints on variables, which fixes variables that approach their upper/lower limits and later releases the fixed ones as needed during the optimization process. It can be regarded as an algorithm-level integration of the solution strategy of the active-set method into the interior point method framework. We describe numerical results on the commonly used “CUTEr” benchmark problems to show the effectiveness of the proposed method. Furthermore, test results on a large-sized ELD problem (Economic Load Dispatching in electric power supply scheduling) are also described as a practical industrial application.

  6. Voxel-Based Neighborhood for Spatial Shape Pattern Classification of Lidar Point Clouds with Supervised Learning.

    PubMed

    Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso

    2017-03-15

    Improving the effectiveness of spatial shape features classification from 3D lidar data is very relevant because it is largely used as a fundamental step towards higher level scene understanding challenges of autonomous vehicles and terrestrial robots. In this sense, computing neighborhood for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation where points in each non-overlapping voxel in a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of voxel-based neighborhood.
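    The principal-component-based feature vectors are described above only in general terms; the following sketch shows one common way to derive linearity/planarity/scatter descriptors from the covariance eigenvalues of the points in a single voxel. The exact feature vector definitions used in the paper may differ.

        import numpy as np

        def voxel_shape_features(pts):
            """pts: (N, 3) points falling in one voxel, N >= 3."""
            evals = np.sort(np.linalg.eigvalsh(np.cov(pts.T)))[::-1]   # l1 >= l2 >= l3
            l1, l2, l3 = evals / evals.sum()
            return {"linearity": (l1 - l2) / l1,    # high for tubular shapes
                    "planarity": (l2 - l3) / l1,    # high for planar patches
                    "scatter":   l3 / l1}           # high for volumetric clutter

        rng = np.random.default_rng(1)
        plane = rng.normal(size=(200, 3)) * np.array([1.0, 1.0, 0.01])
        print(voxel_shape_features(plane))          # planarity close to 1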

  7. Exploring the implication of climate process uncertainties within the Earth System Framework

    NASA Astrophysics Data System (ADS)

    Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.

    2011-12-01

    Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with General Circulation Models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. The suggestion from this work is that carbon cycle processes contribute as much to uncertainty in future climate projections as the more conventionally explored atmospheric feedbacks. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).

  8. Comprehensive process model of clinical information interaction in primary care: results of a "best-fit" framework synthesis.

    PubMed

    Veinot, Tiffany C; Senteio, Charles R; Hanauer, David; Lowery, Julie C

    2018-06-01

    To describe a new, comprehensive process model of clinical information interaction in primary care (Clinical Information Interaction Model, or CIIM) based on a systematic synthesis of published research. We used the "best fit" framework synthesis approach. Searches were performed in PubMed, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, Library and Information Science Abstracts, Library, Information Science and Technology Abstracts, and Engineering Village. Two authors reviewed articles according to inclusion and exclusion criteria. Data abstraction and content analysis of 443 published papers were used to create a model in which every element was supported by empirical research. The CIIM documents how primary care clinicians interact with information as they make point-of-care clinical decisions. The model highlights 3 major process components: (1) context, (2) activity (usual and contingent), and (3) influence. Usual activities include information processing, source-user interaction, information evaluation, selection of information, information use, clinical reasoning, and clinical decisions. Clinician characteristics, patient behaviors, and other professionals influence the process. The CIIM depicts the complete process of information interaction, enabling a grasp of relationships previously difficult to discern. The CIIM suggests potentially helpful functionality for clinical decision support systems (CDSSs) to support primary care, including a greater focus on information processing and use. The CIIM also documents the role of influence in clinical information interaction; influencers may affect the success of CDSS implementations. The CIIM offers a new framework for achieving CDSS workflow integration and new directions for CDSS design that can support the work of diverse primary care clinicians.

  9. Applying Standard Independent Verification and Validation (IVV) Techniques Within an Agile Framework: Is There a Compatibility Issue?

    NASA Technical Reports Server (NTRS)

    Dabney, James B.; Arthur, James Douglas

    2017-01-01

    Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the necessary process rigor and coordination needs of these projects. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices. This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large-scale, mission-critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early lifecycle IVV techniques that are fully compatible with the hybrid lifecycles, (2) IVV techniques that focus on tracing requirements, test objectives, etc., which are somewhat incompatible but can be tailored with modest effort, and (3) IVV techniques involving an assessment that requires artifact completeness, which are simply not compatible with hybrid Agile processes, e.g., those that assume complete requirement specification early in the development lifecycle.

  10. SU-D-BRA-04: Computerized Framework for Marker-Less Localization of Anatomical Feature Points in Range Images Based On Differential Geometry Features for Image-Guided Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soufi, M; Arimura, H; Toyofuku, F

    Purpose: To propose a computerized framework for localization of anatomical feature points on the patient surface in infrared-ray based range images by using differential geometry (curvature) features. Methods: The general concept was to reconstruct the patient surface by using a mathematical modeling technique for the computation of differential geometry features that characterize the local shapes of the patient surfaces. A region of interest (ROI) was firstly extracted based on a template matching technique applied on amplitude (grayscale) images. The extracted ROI was preprocessed for reducing temporal and spatial noises by using Kalman and bilateral filters, respectively. Next, a smooth patient surface was reconstructed by using a non-uniform rational basis spline (NURBS) model. Finally, differential geometry features, i.e. the shape index and curvedness features were computed for localizing the anatomical feature points. The proposed framework was trained for optimizing shape index and curvedness thresholds and tested on range images of an anthropomorphic head phantom. The range images were acquired by an infrared ray-based time-of-flight (TOF) camera. The localization accuracy was evaluated by measuring the mean of minimum Euclidean distances (MMED) between reference (ground truth) points and the feature points localized by the proposed framework. The evaluation was performed for points localized on convex regions (e.g. apex of nose) and concave regions (e.g. nasofacial sulcus). Results: The proposed framework has localized anatomical feature points on convex and concave anatomical landmarks with MMEDs of 1.91±0.50 mm and 3.70±0.92 mm, respectively. A statistically significant difference was obtained between the feature points on the convex and concave regions (P<0.001). Conclusion: Our study has shown the feasibility of differential geometry features for localization of anatomical feature points on the patient surface in range images. The proposed framework might be useful for tasks involving feature-based image registration in range-image guided radiation therapy.
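    The two differential-geometry features named above have standard definitions in terms of the principal curvatures k1 >= k2 of the fitted surface; the snippet below computes them, leaving out the NURBS fitting and curvature estimation steps.

        import numpy as np

        def shape_index(k1, k2):
            # Koenderink shape index in [-1, 1]; arctan2 handles the umbilic case k1 == k2.
            return (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)

        def curvedness(k1, k2):
            return np.sqrt((k1 ** 2 + k2 ** 2) / 2.0)

        # A spherical cap (k1 == k2 > 0) gives shape index +1; a symmetric saddle (k1 == -k2) gives 0.
        print(shape_index(1.0, 1.0), curvedness(1.0, 1.0))
        print(shape_index(1.0, -1.0), curvedness(1.0, -1.0))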

  11. Theory of Turing Patterns on Time Varying Networks.

    PubMed

    Petit, Julien; Lauwens, Ben; Fanelli, Duccio; Carletti, Timoteo

    2017-10-06

    The process of pattern formation for a multispecies model anchored on a time varying network is studied. A nonhomogeneous perturbation superposed on a homogeneous stable fixed point can be amplified following the Turing mechanism of instability, solely instigated by the network dynamics. By properly tuning the frequency of the imposed network evolution, one can make the examined system behave as its averaged counterpart, over a finite time window. This is the key observation to derive a closed analytical prediction for the onset of the instability in the time dependent framework. Continuously and piecewise-constant periodic time-varying networks are analyzed, setting the framework for the proposed approach. The extension to nonperiodic settings is also discussed.

  12. [Construction of educational software about personality disorders].

    PubMed

    Botti, Nadja Cristiane Lappann; Carneiro, Ana Luíza Marques; Almeida, Camila Souza; Pereira, Cíntia Braga Silva

    2011-01-01

    The study describes the experience of building educational software in the area of mental health. The software was developed to enable nursing students to identify personality disorders. In this process, we applied the pedagogical framework of Vygotsky and the theoretical framework of the diagnostic criteria defined by the DSM-IV. Based on these references, characters with personality disorders were identified in stories and/or children's movies. The software database was built with multimedia graphics, sound, and explanatory material. The software was developed as an educational game with questions of increasing difficulty, using Microsoft Office PowerPoint 2007. This strategy is believed to be valid for teaching and learning in the area of mental health nursing.

  13. Usability Guidelines for Product Recommenders Based on Example Critiquing Research

    NASA Astrophysics Data System (ADS)

    Pu, Pearl; Faltings, Boi; Chen, Li; Zhang, Jiyong; Viappiani, Paolo

    Over the past decade, our group has developed a suite of decision tools based on example critiquing to help users find their preferred products in e-commerce environments. In this chapter, we survey important usability research work related to example critiquing and summarize the major results by deriving a set of usability guidelines. Our survey is focused on three key interaction activities between the user and the system: the initial preference elicitation process, the preference revision process, and the presentation of the system's recommendation results. To provide a basis for the derivation of the guidelines, we developed a multi-objective framework of three interacting criteria: accuracy, confidence, and effort (ACE). We use this framework to analyze our past work and provide a specific context for each guideline: when the system should maximize its ability to increase users' decision accuracy, when to increase user confidence, and when to minimize the interaction effort for the users. Due to the general nature of this multi-criteria model, the set of guidelines that we propose can be used to ease the usability engineering process of other recommender systems, especially those used in e-commerce environments. The ACE framework presented here is also the first in the field to evaluate the performance of preference-based recommenders from a user-centric point of view.

  14. Layout compliance for triple patterning lithography: an iterative approach

    NASA Astrophysics Data System (ADS)

    Yu, Bei; Garreton, Gilda; Pan, David Z.

    2014-10-01

    As the semiconductor process further scales down, the industry encounters many lithography-related issues. In the 14nm logic node and beyond, triple patterning lithography (TPL) is one of the most promising techniques for Metal1 layer and possibly Via0 layer. As one of the most challenging problems in TPL, recently layout decomposition efforts have received more attention from both industry and academia. Ideally the decomposer should point out locations in the layout that are not triple patterning decomposable and therefore manual intervention by designers is required. A traditional decomposition flow would be an iterative process, where each iteration consists of an automatic layout decomposition step and manual layout modification task. However, due to the NP-hardness of triple patterning layout decomposition, automatic full chip level layout decomposition requires long computational time and therefore design closure issues continue to linger around in the traditional flow. Challenged by this issue, we present a novel incremental layout decomposition framework to facilitate accelerated iterative decomposition. In the first iteration, our decomposer not only points out all conflicts, but also provides the suggestions to fix them. After the layout modification, instead of solving the full chip problem from scratch, our decomposer can provide a quick solution for a selected portion of layout. We believe this framework is efficient, in terms of performance and designer friendly.
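    At its core, TPL decomposability is a three-coloring question on the conflict graph of layout features; the toy backtracking check below illustrates that question only, not the paper's incremental decomposition framework.

        # Toy illustration of the core combinatorial problem behind TPL layout decomposition:
        # assign one of three masks (colors) to layout features so that no two conflicting
        # features share a mask. Returns a coloring, or None if the subproblem is not
        # triple-patterning decomposable.
        def three_color(conflicts, n_features, colors=None, feat=0):
            if colors is None:
                colors = [None] * n_features
            if feat == n_features:
                return colors
            for c in range(3):
                if all(colors[other] != c for other in conflicts.get(feat, [])):
                    colors[feat] = c
                    result = three_color(conflicts, n_features, colors, feat + 1)
                    if result is not None:
                        return result
                    colors[feat] = None
            return None

        # Four mutually conflicting features (a K4) cannot be decomposed with 3 masks.
        k4 = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
        print(three_color(k4, 4))   # None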

  15. IT-adoption and the interaction of task, technology and individuals: a fit framework and a case study

    PubMed Central

    Ammenwerth, Elske; Iller, Carola; Mahler, Cornelia

    2006-01-01

    Background Factors of IT adoption have largely been discussed in the literature. However, existing frameworks (such as TAM or TTF) fail to include one important aspect, the interaction between user and task. Method Based on a literature study and a case study, we developed the FITT framework to help analyse the socio-organisational-technical factors that influence IT adoption in a health care setting. Results Our FITT framework ("Fit between Individuals, Task and Technology") is based on the idea that IT adoption in a clinical environment depends on the fit between the attributes of the individual users (e.g. computer anxiety, motivation), attributes of the technology (e.g. usability, functionality, performance), and attributes of the clinical tasks and processes (e.g. organisation, task complexity). We used this framework in the retrospective analysis of a three-year case study, describing the adoption of a nursing documentation system in various departments in a German University Hospital. We show how the FITT framework helped analyse the process of IT adoption during an IT implementation: every IT adoption problem we found could be described with regard to the three fit dimensions, and any intervention on the fit can be described with regard to the three objects of the FITT framework (individual, task, technology). We also derive facilitators and barriers to IT adoption of clinical information systems. Conclusion This work should support a better understanding of the reasons for IT adoption failures and therefore enable better prepared and more successful IT introduction projects. We discuss, however, that from a more epistemological point of view, it may be difficult or even impossible to analyse the complex and interacting factors that predict success or failure of IT projects in a socio-technical environment. PMID:16401336

  16. How patients with gout become engaged in disease management: a constructivist grounded theory study.

    PubMed

    Howren, Alyssa; Cox, Susan M; Shojania, Kam; Rai, Sharan K; Choi, Hyon K; De Vera, Mary A

    2018-06-01

    Prior qualitative research on gout has focused primarily on barriers to disease management. Our objective was to use patients' perspectives to construct an explanatory framework to understand how patients become engaged in the management of their gout. We recruited a sample of individuals with gout who were participating in a proof-of-concept study of an eHealth-supported collaborative care model for gout involving rheumatology, pharmacy, and dietetics. Semistructured interviews were used. We analyzed transcripts using principles of constructivist grounded theory involving initial coding, focused coding and categorizing, and theoretical coding. Twelve participants with gout (ten males, two females; mean age, 66.5 ± 13.3 years) were interviewed. The analysis resulted in the construction of three themes as well as a framework describing the dynamically linked themes on (1) processing the diagnosis and management of gout, (2) supporting management of gout, and (3) interfering with management of gout. In this framework, patients with gout transition between each theme in the process of becoming engaged in the management of their gout and may represent potential opportunities for healthcare intervention. Findings derived from this study show that becoming engaged in gout management is a dynamic process whereby patients with gout experience factors that interfere with gout management, process their disease and its management, and develop the practical and perceptual skills necessary to manage their gout. By understanding this process, healthcare providers can identify points to adapt care delivery and thereby improve health outcomes.

  17. Adoption of high technology medical imaging and hospital quality and efficiency: Towards a conceptual framework.

    PubMed

    Sandoval, Guillermo A; Brown, Adalsteinn D; Wodchis, Walter P; Anderson, Geoffrey M

    2018-05-17

    Measuring the value of medical imaging is challenging, in part, due to the lack of conceptual frameworks underlying potential mechanisms where value may be assessed. To address this gap, this article proposes a framework that builds on the large body of literature on quality of hospital care and the classic structure-process-outcome paradigm. The framework was also informed by the literature on adoption of technological innovations and introduces 2 distinct though related aspects of imaging technology not previously addressed specifically in the literature on quality of hospital care: adoption (a structural hospital characteristic) and use (an attribute of the process of care). The framework hypothesizes a 2-part causality where adoption is proposed to be a central, linking factor between hospital structural characteristics, market factors, and hospital outcomes (ie, quality and efficiency). The first part indicates that hospital structural characteristics and market factors influence or facilitate the adoption of high technology medical imaging within an institution. The presence of this technology, in turn, is hypothesized to improve the ability of the hospital to deliver high quality and efficient care. The second part describes this ability through 3 main mechanisms, pointing to the importance of imaging use on patients, to the presence of staff and qualified care providers, and to some elements of organizational capacity capturing an enhanced clinical environment. The framework has the potential to assist empirical investigations of the value of adoption and use of medical imaging, and to advance understanding of the mechanisms that produce quality and efficiency in hospitals. Copyright © 2018 John Wiley & Sons, Ltd.

  18. A framework for correcting brain retraction based on an eXtended Finite Element Method using a laser range scanner.

    PubMed

    Li, Ping; Wang, Weiwei; Song, Zhijian; An, Yong; Zhang, Chenxi

    2014-07-01

    Brain retraction causes great distortion that limits the accuracy of an image-guided neurosurgery system that uses preoperative images. Therefore, brain retraction correction is an important intraoperative clinical application. We used a linear elastic biomechanical model, which deforms based on the eXtended Finite Element Method (XFEM) within a framework for brain retraction correction. In particular, a laser range scanner was introduced to obtain a surface point cloud of the exposed surgical field including retractors inserted into the brain. A brain retraction surface tracking algorithm converted these point clouds into boundary conditions applied to XFEM modeling that drive brain deformation. To test the framework, we performed a brain phantom experiment involving the retraction of tissue. Pairs of the modified Hausdorff distance between Canny edges extracted from model-updated images, pre-retraction, and post-retraction CT images were compared to evaluate the morphological alignment of our framework. Furthermore, the measured displacements of beads embedded in the brain phantom and the predicted ones were compared to evaluate numerical performance. The modified Hausdorff distance of 19 pairs of images decreased from 1.10 to 0.76 mm. The forecast error of 23 stainless steel beads in the phantom was between 0 and 1.73 mm (mean 1.19 mm). The correction accuracy varied between 52.8 and 100 % (mean 81.4 %). The results demonstrate that the brain retraction compensation can be incorporated intraoperatively into the model-updating process in image-guided neurosurgery systems.
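    The modified Hausdorff distance used for the morphological evaluation is a standard point-set metric (mean of minimum distances, symmetrized by taking the maximum); a minimal sketch follows, with the Canny edge extraction step omitted.

        import numpy as np

        def modified_hausdorff(a, b):
            """Modified Hausdorff distance (Dubuisson & Jain) between point sets a (N, d) and b (M, d)."""
            d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)   # (N, M) pairwise distances
            return max(d.min(axis=1).mean(), d.min(axis=0).mean())

        a = np.array([[0.0, 0.0], [1.0, 0.0]])
        b = np.array([[0.0, 0.5], [1.0, 0.5], [2.0, 0.5]])
        print(modified_hausdorff(a, b))   # about 0.71 for this toy example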

  19. Conceptual framework for outcomes research studies of hepatitis C: an analytical review

    PubMed Central

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473

  20. Surface growth kinematics via local curve evolution.

    PubMed

    Moulton, Derek E; Goriely, Alain

    2014-01-01

    A mathematical framework is developed to model the kinematics of surface growth for objects that can be generated by evolving a curve in space, such as seashells and horns. Growth is dictated by a growth velocity vector field defined at every point on a generating curve. A local orthonormal basis is attached to each point of the generating curve and the velocity field is given in terms of the local coordinate directions, leading to a fully local and elegant mathematical structure. Several examples of increasing complexity are provided, and we demonstrate how biologically relevant structures such as logarithmic shells and horns emerge as analytical solutions of the kinematics equations with a small number of parameters that can be linked to the underlying growth process. Direct access to cell tracks and local orientation enables connections to be made to the underlying growth process.
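    In symbols (notation chosen here for illustration; the paper's may differ), the kinematics amount to prescribing the growth velocity of each point of the generating curve in its local orthonormal frame:

        \frac{\partial \mathbf{r}}{\partial t}(s,t)
        = q_1(s,t)\,\mathbf{d}_1(s,t) + q_2(s,t)\,\mathbf{d}_2(s,t) + q_3(s,t)\,\mathbf{d}_3(s,t),

    where \mathbf{r}(s,t) is the position of the curve point with material coordinate s at time t, (\mathbf{d}_1, \mathbf{d}_2, \mathbf{d}_3) is the local orthonormal basis attached to that point, and the q_i are the prescribed growth velocity components; keeping the ratios of the q_i fixed while scaling them in time yields self-similar, logarithmic shell growth.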

  1. Final Report Collaborative Project. Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability, (used in objective 1) within POP, the ocean component of CESM.

  2. First-principles investigation of point defect and atomic diffusion in Al2Ca

    NASA Astrophysics Data System (ADS)

    Tian, Xiao; Wang, Jia-Ning; Wang, Ya-Ping; Shi, Xue-Feng; Tang, Bi-Yu

    2017-04-01

    Point defects and atomic diffusion in Al2Ca have been studied with first-principles calculations within the density functional framework. After the formation energies and relative stability of point defects are investigated, several predominant diffusion processes in Al2Ca are studied, including the sublattice one-step mechanism, 3-jump vacancy cycles, and the antistructure sublattice mechanism. The associated energy profiles are calculated with the climbing image nudged elastic band (CI-NEB) method, and the saddle points and activation barriers for atomic diffusion are then determined. The resulting activation barriers show that both Al and Ca diffuse mainly via neighboring vacancies on their own sublattices. The 3-jump cycle mechanism mediated by the Ca vacancy (VCa) may make some contribution to the overall Al diffusion, and the antistructure (AS) sublattice mechanism can also play an important role in Ca diffusion owing to its moderate activation barrier.

  3. SPY: A new scission point model based on microscopic ingredients to predict fission fragments properties

    NASA Astrophysics Data System (ADS)

    Lemaître, J.-F.; Dubray, N.; Hilaire, S.; Panebianco, S.; Sida, J.-L.

    2013-12-01

    Our purpose is to determine fission fragment characteristics within the framework of a scission point model named SPY, for Scission Point Yields. This approach can be considered a theoretical laboratory for studying the fission mechanism, since it gives access to the correlation between fragment properties and their nuclear structure, such as shell corrections, pairing, collective degrees of freedom, and odd-even effects. Which ones are dominant in the final state? What is the impact of the compound nucleus structure? The SPY model consists of a statistical description of the fission process at the scission point, where fragments are completely formed and well separated with fixed properties. The most important ingredient of the model is the nuclear structure of the fragments, which is derived from full quantum microscopic calculations. This approach allows computing the fission final state of extremely exotic nuclei, which are inaccessible to most of the fission models currently available.

  4. Spatially based management of agricultural phosphorus pollution from diffuse sources: the SCIMAP risk based approach

    NASA Astrophysics Data System (ADS)

    Reaney, S. M.; Heathwaite, L.; Lane, S. N.; Buckley, C.

    2007-12-01

    Pollution of rivers from agricultural phosphorus is recognised as a significant global problem and is a major management challenge as it involves processes that are small in magnitude, distributed over large areas, operating at fine spatial scales and associated with certain land use types when they are well connected to the receiving waters. Whilst some of these processes have been addressed in terms of water quality forecasting models and field measurements, we lack effective tools to prioritise where action should be taken to remediate the diffuse pollution problem. From a management perspective, the required information is on 'what to do where' rather than absolute values. This change in focus opens up the problem to be considered in a probabilistic / relative framework rather than concentrating on absolute values. The SCIMAP risk management framework is based on the critical source area concept whereby a risk and a connection are required to generate a problem. Treatments of both surface and subsurface hydrological connectivity have been developed. The approach is based on the philosophy that for a point to be considered connected there needs to be a continuous flow path to the receiving water. This information is calculated by simulating the possible flow paths from the source cell to the receiving water and recording the required catchment wetness to allow flow along that route. This algorithm gives information on the ease with which each point in the landscape can export risk along surface and subsurface pathways to the receiving waters. To understand the annual dynamics of the locational diffuse P risk, a temporal risk framework has been developed. This risk framework accounts for land management activities within the agricultural calendar. These events include the application of fertiliser, the P additions from livestock and the offtake of P in crops. Changes to these risks can be made to investigate management options. The SCIMAP risk mapping framework has been applied to 12 catchments in England as part of the DEFRA / Environment Agency's Catchment Sensitive Farming programme. Results from these catchments will be presented.

  5. Theoretical and Empirical Comparison of Big Data Image Processing with Apache Hadoop and Sun Grid Engine.

    PubMed

    Bao, Shunxing; Weitendorf, Frederick D; Plassard, Andrew J; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A

    2017-02-11

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and non-relevant for medical imaging.
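    The theoretical wall-clock and resource-time models themselves are not reproduced in the abstract; the toy comparison below only illustrates the qualitative point that a shared-NFS cluster pays a per-job data-transfer cost that a co-located (Hadoop-style) setup largely avoids. The linear cost form and all parameter values are assumptions made for this sketch, not the validated models from the paper.

        def wall_clock_nfs(n_jobs, n_cores, compute_s, data_gb, bandwidth_gbps=1.0):
            # Assumes all transfers share one NFS link, so they effectively serialize.
            transfer_s = data_gb * 8.0 / bandwidth_gbps
            waves = -(-n_jobs // n_cores)                 # ceiling division
            return waves * compute_s + n_jobs * transfer_s

        def wall_clock_colocated(n_jobs, n_cores, compute_s):
            waves = -(-n_jobs // n_cores)
            return waves * compute_s

        # Short jobs over a large dataset: transfer dominates the NFS configuration.
        print(wall_clock_nfs(1000, 100, compute_s=60, data_gb=0.5))      # 4600.0 s
        print(wall_clock_colocated(1000, 100, compute_s=60))             # 600 s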

  6. Theoretical and empirical comparison of big data image processing with Apache Hadoop and Sun Grid Engine

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Weitendorf, Frederick D.; Plassard, Andrew J.; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A.

    2017-03-01

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and nonrelevant for medical imaging.

  7. Adaptation, expertise, and giftedness: towards an understanding of cortical, subcortical, and cerebellar network contributions.

    PubMed

    Koziol, Leonard F; Budding, Deborah Ely; Chidekel, Dana

    2010-12-01

    Current cortico-centric models of cognition lack a cohesive neuroanatomic framework that sufficiently considers overlapping levels of function, from "pathological" through "normal" to "gifted" or exceptional ability. While most cognitive theories presume an evolutionary context, few actively consider the process of adaptation, including concepts of neurodevelopment. Further, the frequent co-occurrence of "gifted" and "pathological" function is difficult to explain from a cortico-centric point of view. This comprehensive review paper proposes a framework that includes the brain's vertical organization and considers "giftedness" from an evolutionary and neurodevelopmental vantage point. We begin by discussing the current cortico-centric model of cognition and its relationship to intelligence. We then review an integrated, dual-tiered model of cognition that better explains the process of adaptation by simultaneously allowing for both stimulus-based processing and higher-order cognitive control. We consider the role of the basal ganglia within this model, particularly in relation to reward circuitry and instrumental learning. We review the important role of white matter tracts in relation to speed of adaptation and development of behavioral mastery. We examine the cerebellum's critical role in behavioral refinement and in cognitive and behavioral automation, particularly in relation to expertise and giftedness. We conclude this integrated model of brain function by considering the savant syndrome, which we believe is best understood within the context of a dual-tiered model of cognition that allows for automaticity in adaptation as well as higher-order executive control.

  8. Insights into mortality patterns and causes of death through a process point of view model.

    PubMed

    Anderson, James J; Li, Ting; Sharrow, David J

    2017-02-01

    Process point of view (POV) models of mortality, such as the Strehler-Mildvan and stochastic vitality models, represent death in terms of the loss of survival capacity through challenges and dissipation. Drawing on hallmarks of aging, we link these concepts to candidate biological mechanisms through a framework that defines death as challenges to vitality, where distal factors define the age-evolution of vitality and proximal factors define the probability distribution of challenges. To illustrate the process POV, we hypothesize that the immune system is a mortality nexus, characterized by two vitality streams: increasing vitality representing immune system development and immunosenescence representing vitality dissipation. Proximal challenges define three mortality partitions: juvenile and adult extrinsic mortalities and intrinsic adult mortality. Model parameters, generated from Swedish mortality data (1751-2010), exhibit biologically meaningful correspondences to economic, health and cause-of-death patterns. The model characterizes the twentieth century epidemiological transition mainly as a reduction in extrinsic mortality resulting from a shift from high-magnitude disease challenges on individuals at all vitality levels to low-magnitude stress challenges on low-vitality individuals. Of secondary importance, intrinsic mortality was described by a gradual reduction in the rate of loss of vitality, presumably resulting from a reduction in the rate of immunosenescence. Extensions and limitations of a distal/proximal framework for characterizing more explicit causes of death, e.g., the young adult mortality hump or cancer in old age, are discussed.
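
    As a rough illustration of the process point of view (a toy Monte Carlo with made-up parameters, not the fitted Swedish model), vitality can be dissipated deterministically with age (the distal factor) while random challenges arrive as a Poisson process (the proximal factor); death occurs when a challenge exceeds the remaining vitality.

```python
import numpy as np

rng = np.random.default_rng(0)

def age_at_death(loss_rate=0.011, challenge_rate=0.3, challenge_mean=0.05, max_age=110.0):
    """Toy process-POV mortality model: vitality starts at 1.0 and declines
    linearly with age; exponentially distributed challenges arrive at rate
    `challenge_rate` per year and kill when they exceed remaining vitality."""
    age = 0.0
    while age < max_age:
        age += rng.exponential(1.0 / challenge_rate)    # waiting time to next challenge
        vitality = max(0.0, 1.0 - loss_rate * age)      # distal: dissipation of vitality
        if rng.exponential(challenge_mean) > vitality:  # proximal: challenge magnitude
            return min(age, max_age)
    return max_age

ages = [age_at_death() for _ in range(10_000)]
print("median simulated age at death:", round(float(np.median(ages)), 1))
```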

  9. Idaho National Laboratory Test Area North: Application of Endpoints to Guide Adaptive Remediation at a Complex Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, M. Hope; Truex, Mike; Freshley, Mark

    Complex sites are defined as those with difficult subsurface access, deep and/or thick zones of contamination, large areal extent, subsurface heterogeneities that limit the effectiveness of remediation, or where long-term remedies are needed to address contamination (e.g., because of long-term sources or large extent). The Test Area North at the Idaho National Laboratory, developed for nuclear fuel operations and heavy metal manufacturing, is used as a case study. Liquid wastes and sludge from experimental facilities were disposed in an injection well, which contaminated the subsurface aquifer located deep within fractured basalt. The wastes included organic, inorganic, and low-level radioactive constituents, with the focus of this case study on trichloroethylene. The site is used as an example of a systems-based framework that provides a structured approach to regulatory processes established for remediation under existing regulations. The framework is intended to facilitate remedy decisions and implementation at complex sites where restoration may be uncertain, require long timeframes, or involve use of adaptive management approaches. The framework facilitates site, regulator, and stakeholder interactions during the remedial planning and implementation process by using a conceptual model description as a technical foundation for decisions, identifying endpoints, which are interim remediation targets or intermediate decision points on the path to an ultimate end, and maintaining protectiveness during the remediation process. At the Test Area North, using a structured approach to implementing concepts in the endpoint framework, a three-component remedy is largely functioning as intended and is projected to meet remedial action objectives by 2095 as required. The remedy approach is being adjusted as new data become available. The framework provides a structured process for evaluating and adjusting the remediation approach, allowing site owners, regulators, and stakeholders to manage contamination at complex sites where adaptive remedies are needed.

  10. Measuring temperature and field profiles in heat assisted magnetic recording

    NASA Astrophysics Data System (ADS)

    Hohlfeld, J.; Zheng, X.; Benakli, M.

    2015-08-01

    We introduce a theoretical and experimental framework that enables quantitative measurements of the temperature and magnetic field profiles governing the thermo-magnetic write process in heat assisted magnetic recording. Since our approach allows the identification of the correct temperature dependence of the magneto-crystalline anisotropy field in the vicinity of the Curie point as well, it provides an unprecedented experimental foundation to assess our understanding of heat assisted magnetic recording.

  11. Registration of 4D time-series of cardiac images with multichannel Diffeomorphic Demons.

    PubMed

    Peyrat, Jean-Marc; Delingette, Hervé; Sermesant, Maxime; Pennec, Xavier; Xu, Chenyang; Ayache, Nicholas

    2008-01-01

    In this paper, we propose a generic framework for intersubject non-linear registration of 4D time-series images. In this framework, spatio-temporal registration is defined by mapping trajectories of physical points as opposed to spatial registration that solely aims at mapping homologous points. First, we determine the trajectories we want to register in each sequence using a motion tracking algorithm based on the Diffeomorphic Demons algorithm. Then, we perform simultaneously pairwise registrations of corresponding time-points with the constraint to map the same physical points over time. We show this trajectory registration can be formulated as a multichannel registration of 3D images. We solve it using the Diffeomorphic Demons algorithm extended to vector-valued 3D images. This framework is applied to the inter-subject non-linear registration of 4D cardiac CT sequences.
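
    A highly simplified sketch of the core update is given below: the classic additive demons force computed per channel and summed over channels. This illustrates the multichannel idea only; the actual method uses the Diffeomorphic Demons scheme with exponential-map updates and regularization, which is not reproduced here, and the two-channel toy data are invented.

```python
import numpy as np

def multichannel_demons_force(fixed, moving, eps=1e-8):
    """One demons-style update force for vector-valued (multichannel) 2D images:
    per-channel Thirion forces accumulated across channels. Simplified additive
    sketch, not the full Diffeomorphic Demons algorithm."""
    force = np.zeros(fixed.shape[1:] + (2,))
    for f, m in zip(fixed, moving):                  # iterate over channels
        gy, gx = np.gradient(m)                      # spatial gradients of the moving image
        diff = f - m
        denom = gx ** 2 + gy ** 2 + diff ** 2 + eps
        force[..., 0] += diff * gx / denom
        force[..., 1] += diff * gy / denom
    return force

# Two-channel toy example (e.g., two corresponding time-points used as channels).
rng = np.random.default_rng(7)
F, M = rng.random((2, 64, 64)), rng.random((2, 64, 64))
print(multichannel_demons_force(F, M).shape)         # (64, 64, 2) update field
```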

  12. Interpersonal Emotion Regulation Model of Mood and Anxiety Disorders.

    PubMed

    Hofmann, Stefan G

    2014-10-01

    Although social factors are of critical importance in the development and maintenance of emotional disorders, the contemporary view of emotion regulation has been primarily limited to intrapersonal processes. Based on diverse perspectives pointing to the communicative function of emotions, the social processes in self-regulation, and the role of social support, this article presents an interpersonal model of emotion regulation of mood and anxiety disorders. This model provides a theoretical framework to understand and explain how mood and anxiety disorders are regulated and maintained through others. The literature, which provides support for the model, is reviewed and the clinical implications are discussed.

  13. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model that tests whether the level of framing differs before and after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.
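
    A minimal stand-in for the intervention test is sketched below (assuming, purely for illustration, a daily framing-score series and a simple mean-shift comparison rather than the specific intervention models used in the paper):

```python
import numpy as np
from scipy import stats

def mean_shift_intervention(series, event_index):
    """Compare the framing level before vs. after a candidate event date using a
    Welch two-sample t-test (a simple stand-in for an intervention model)."""
    before, after = series[:event_index], series[event_index:]
    t, p = stats.ttest_ind(before, after, equal_var=False)
    return {"mean_before": float(np.mean(before)), "mean_after": float(np.mean(after)),
            "t": float(t), "p": float(p)}

# Synthetic example: the framing level jumps after day 200.
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0.20, 0.05, 200), rng.normal(0.35, 0.05, 150)])
print(mean_shift_intervention(y, event_index=200))   # a small p-value cues the analyst
```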

  14. Modeling of prepregs during automated draping sequences

    NASA Astrophysics Data System (ADS)

    Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny

    2017-10-01

    The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, thereby assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently being approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the draping process and for guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.

  15. Growth Points in Students' Developing Understanding of Function in Equation Form

    ERIC Educational Resources Information Center

    Ronda, Erlina R.

    2009-01-01

    This paper presents a research-based framework for analyzing and monitoring students' understanding of functions in equation form. The framework consists of "growth points" which describe "big ideas" of students' understanding of the concept. The data were collected from Grades 8, 9, and 10 students using a set of tasks…

  16. A point process approach to identifying and tracking transitions in neural spiking dynamics in the subthalamic nucleus of Parkinson's patients

    NASA Astrophysics Data System (ADS)

    Deng, Xinyi; Eskandar, Emad N.; Eden, Uri T.

    2013-12-01

    Understanding the role of rhythmic dynamics in normal and diseased brain function is an important area of research in neural electrophysiology. Identifying and tracking changes in rhythms associated with spike trains present an additional challenge, because standard approaches for continuous-valued neural recordings—such as local field potential, magnetoencephalography, and electroencephalography data—require assumptions that do not typically hold for point process data. Additionally, subtle changes in the history-dependent structure of a spike train have been shown to lead to robust changes in rhythmic firing patterns. Here, we propose a point process modeling framework to characterize the rhythmic spiking dynamics in spike trains, test for statistically significant changes to those dynamics, and track the temporal evolution of such changes. We first construct a two-state point process model incorporating spiking history and develop a likelihood ratio test to detect changes in the firing structure. We then apply adaptive state-space filters and smoothers to track these changes through time. We illustrate our approach with a simulation study as well as with experimental data recorded in the subthalamic nucleus of Parkinson's patients performing an arm movement task. Our analyses show that during the arm movement task, neurons underwent a complex pattern of modulation of spiking intensity characterized initially by a release of inhibitory control at 20-40 ms after a spike, followed by a decrease in excitatory influence at 40-60 ms after a spike.
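
    The flavor of the approach can be sketched with a discrete-time, history-dependent spiking model and a likelihood ratio test between epochs (a logistic-regression stand-in for the paper's two-state point process formulation; the lag count, split point, and simulated spike train are illustrative assumptions):

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

def history_design(spikes, lags):
    """Design matrix of lagged spike-history indicators for a discrete-time PP-GLM."""
    X = np.column_stack([np.roll(spikes, k) for k in range(1, lags + 1)])
    X[:lags, :] = 0.0                      # discard wrapped-around history
    return sm.add_constant(X)

def lr_test_two_epochs(spikes, split, lags=20):
    """Likelihood ratio test: one history model for the whole train vs. separate
    models before/after `split` (detects a change in the firing structure)."""
    X = history_design(spikes, lags)
    ll_full = sm.Logit(spikes, X).fit(disp=0).llf
    ll_a = sm.Logit(spikes[:split], X[:split]).fit(disp=0).llf
    ll_b = sm.Logit(spikes[split:], X[split:]).fit(disp=0).llf
    lr = 2.0 * (ll_a + ll_b - ll_full)
    return lr, chi2.sf(lr, df=lags + 1)    # extra parameters in the two-epoch model

rng = np.random.default_rng(2)
spk = (rng.random(4000) < 0.05).astype(float)   # 1 ms bins, ~50 Hz surrogate spike train
print(lr_test_two_epochs(spk, split=2000))
```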

  17. Voxel-Based Neighborhood for Spatial Shape Pattern Classification of Lidar Point Clouds with Supervised Learning

    PubMed Central

    Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso

    2017-01-01

    Improving the effectiveness of spatial shape feature classification from 3D lidar data is highly relevant because such classification is widely used as a fundamental step towards higher-level scene understanding tasks in autonomous vehicles and terrestrial robots. In this sense, computing neighborhoods for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation where points in each non-overlapping voxel in a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of voxel-based neighborhoods. PMID:28294963
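
    The eigenvalue-based shape features referred to above can be sketched per voxel as follows (an illustrative implementation of the general idea; the voxel size, grouping scheme, and exact feature definitions are assumptions rather than the authors' specifications):

```python
import numpy as np

def voxel_shape_features(points, voxel_size=0.5):
    """Per-voxel covariance eigenvalue features for scatter/tubular/planar shapes
    (voxel-based neighborhood sketch; not the paper's exact feature vectors)."""
    keys = np.floor(points / voxel_size).astype(int)
    order = np.lexsort(keys.T)                         # group points by voxel index
    keys, points = keys[order], points[order]
    splits = np.flatnonzero(np.any(np.diff(keys, axis=0), axis=1)) + 1
    feats = {}
    for vox in np.split(points, splits):
        if len(vox) < 3:
            continue                                   # too few points for a covariance
        l1, l2, l3 = np.maximum(np.sort(np.linalg.eigvalsh(np.cov(vox.T)))[::-1], 1e-12)
        feats[tuple(np.floor(vox[0] / voxel_size).astype(int))] = {
            "linearity": (l1 - l2) / l1,               # high for tubular structures
            "planarity": (l2 - l3) / l1,               # high for planar structures
            "sphericity": l3 / l1,                     # high for scattered structures
        }
    return feats

cloud = np.random.default_rng(3).random((5000, 3)) * 10.0
print(len(voxel_shape_features(cloud)), "voxels with shape features")
```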

  18. Biologically Inspired Model for Visual Cognition Achieving Unsupervised Episodic and Semantic Feature Learning.

    PubMed

    Qiao, Hong; Li, Yinlin; Li, Fengfu; Xi, Xuanyang; Wu, Wei

    2016-10-01

    Recently, many biologically inspired visual computational models have been proposed. The design of these models follows the related biological mechanisms and structures, and these models provide new solutions for visual recognition tasks. In this paper, based on recent biological evidence, we propose a framework to mimic the active and dynamic learning and recognition process of the primate visual cortex. In terms of principles, the main contribution is that the framework can achieve unsupervised learning of episodic features (including key components and their spatial relations) and semantic features (semantic descriptions of the key components), which support higher-level cognition of an object. In terms of performance, the advantages of the framework are as follows: 1) learning episodic features without supervision: for a class of objects without prior knowledge, the key components, their spatial relations and cover regions can be learned automatically through a deep neural network (DNN); 2) learning semantic features based on episodic features: within the cover regions of the key components, the semantic geometrical values of these components can be computed based on contour detection; 3) forming the general knowledge of a class of objects: the general knowledge of a class of objects can be formed, mainly including the key components, their spatial relations and average semantic values, which is a concise description of the class; and 4) achieving higher-level cognition and dynamic updating: for a test image, the model can achieve classification and subclass semantic descriptions. The test samples with high confidence are then selected to dynamically update the whole model. Experiments are conducted on face images, and a good performance is achieved in each layer of the DNN and the semantic description learning process. Furthermore, the model can be generalized to recognition tasks of other objects with learning ability.

  19. A novel numerical framework for self-similarity in plasticity: Wedge indentation in single crystals

    NASA Astrophysics Data System (ADS)

    Juul, K. J.; Niordson, C. F.; Nielsen, K. L.; Kysar, J. W.

    2018-03-01

    A novel numerical framework for analyzing self-similar problems in plasticity is developed and demonstrated. Self-similar problems of this kind include processes such as stationary cracks, void growth, indentation etc. The proposed technique offers a simple and efficient method for handling this class of complex problems by avoiding issues related to traditional Lagrangian procedures. Moreover, the proposed technique allows for focusing the mesh in the region of interest. In the present paper, the technique is exploited to analyze the well-known wedge indentation problem of an elastic-viscoplastic single crystal. However, the framework may be readily adapted to any constitutive law of interest. The main focus herein is the development of the self-similar framework, while the indentation study serves primarily as verification of the technique by comparing to existing numerical and analytical studies. In this study, the three most common metal crystal structures will be investigated, namely the face-centered cubic (FCC), body-centered cubic (BCC), and hexagonal close packed (HCP) crystal structures, where the stress and slip rate fields around the moving contact point singularity are presented.

  20. Geometry Of Discrete Sets With Applications To Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Sinha, Divyendu

    1990-03-01

    In this paper we present a new framework for discrete black and white images that employs only integer arithmetic. This framework is shown to retain the essential characteristics of the framework for Euclidean images. We propose two norms and, based on them, define the permissible geometric operations on images. The basic invariants of our geometry are line images, structure of image and the corresponding local property of strong attachment of pixels. The permissible operations also preserve the 3x3 neighborhoods, area, and perpendicularity. The structure, patterns, and the inter-pattern gaps in a discrete image are shown to be conserved by the magnification and contraction process. Our notions of approximate congruence, similarity and symmetry are similar, in character, to the corresponding notions for Euclidean images [1]. We mention two discrete pattern recognition algorithms that work purely with integers, and which fit into our framework. Their performance has been shown to be on par with the performance of traditional geometric schemes. Also, the undesired effects of finite-length registers in fixed-point arithmetic that plague traditional algorithms are non-existent in this family of algorithms.

  1. Development of a competency framework for optometrists with a specialist interest in glaucoma.

    PubMed

    Myint, J; Edgar, D F; Kotecha, A; Crabb, D P; Lawrenson, J G

    2010-09-01

    To develop a competency framework, using a modified Delphi methodology, for optometrists with a specialist interest in glaucoma, which would provide a basis for training and accreditation. A modified iterative Delphi technique was employed, using a 16-member panel consisting almost exclusively of sub-specialist optometrists and ophthalmologists. The first round involved scoring the relevance of a draft series of competencies using a 9-point Likert scale with a free-text option to modify any competency or suggest additional competencies. The revised framework was subjected to a second round of scoring and free-text comment. The Delphi process was followed by a face-to-face structured workshop to debate and agree the final framework. The version of the framework agreed at the workshop was sent out for a 4-month period of external stakeholder validation. There was a 100% response to round 1 and a 94% response to round 2. All panel members attended the workshop. The final version of the competency framework was validated by a subsequent stakeholder consultation and contained 19 competencies for the diagnosis of glaucoma and 7 further competencies for monitoring and treatment. Application of a consensus methodology consisting of a modified Delphi technique allowed the development of a competency framework for glaucoma specialisation by optometrists. This will help to shape the development of a speciality curriculum and potentially could be adapted for other healthcare professionals.

  2. Technology Infusion Challenges from a Decision Support Perspective

    NASA Technical Reports Server (NTRS)

    Adumitroaie, V.; Weisbin, C. R.

    2009-01-01

    In a restricted science budget environment with increasingly numerous required technology developments, technology investment decisions within NASA are increasingly difficult to make in a way that satisfies the technical objectives and all the organizational constraints. Under these conditions it is rationally desirable to build an investment portfolio that has the highest possible technology infusion rate. Arguably the path to infusion is subject to many influencing factors, but here only the challenges associated with the very initial stages are addressed: defining the needs and the subsequent investment decision-support process. It is conceivable that decision consistency, and possibly its quality, suffer when the decision-making process has limited or no traceability. This paper presents a structured decision-support framework aiming to provide traceable, auditable, infusion-driven recommendations towards a selection process in which these recommendations are used as reference points in further discussions among stakeholders. In this framework, which addresses well-defined requirements, different measures of success can be defined based on traceability to specific selection criteria. As a direct result, even by using simplified decision models the likelihood of infusion can be probed and consequently improved.

  3. Scalets, wavelets and (complex) turning point quantization

    NASA Astrophysics Data System (ADS)

    Handy, C. R.; Brooks, H. A.

    2001-05-01

    Despite the many successes of wavelet analysis in image and signal processing, the incorporation of continuous wavelet transform theory within quantum mechanics has lacked a compelling, first principles, motivating analytical framework, until now. For arbitrary one-dimensional rational fraction Hamiltonians, we develop a simple, unified formalism, which clearly underscores the complementary, and mutually interdependent, role played by moment quantization theory (i.e. via scalets, as defined herein) and wavelets. This analysis involves no approximation of the Hamiltonian within the (equivalent) wavelet space, and emphasizes the importance of (complex) multiple turning point contributions in the quantization process. We apply the method to three illustrative examples. These include the (double-well) quartic anharmonic oscillator potential problem, V(x) = Z²x² + gx⁴, the quartic potential, V(x) = x⁴, and the very interesting and significant non-Hermitian potential V(x) = -(ix)³, recently studied by Bender and Boettcher.

  4. Use of strategic environmental assessment in the site selection process for a radioactive waste disposal facility in Slovenia.

    PubMed

    Dermol, Urška; Kontić, Branko

    2011-01-01

    The benefits of strategic environmental considerations in the process of siting a repository for low- and intermediate-level radioactive waste (LILW) are presented. The benefits have been explored by analyzing differences between the two site selection processes. One is a so-called official site selection process, which is implemented by the Agency for radwaste management (ARAO); the other is an optimization process suggested by experts working in the area of environmental impact assessment (EIA) and land-use (spatial) planning. The criteria on which the comparison of the results of the two site selection processes has been based are spatial organization, environmental impact, safety in terms of potential exposure of the population to radioactivity released from the repository, and feasibility of the repository from the technical, financial/economic and social point of view (the latter relates to consent by the local community for siting the repository). The site selection processes have been compared with the support of the decision expert system named DEX. The results of the comparison indicate that the sites selected by ARAO meet fewer suitability criteria than those identified by applying strategic environmental considerations in the framework of the optimization process. This result stands when taking into account spatial, environmental, safety and technical feasibility points of view. Acceptability of a site by a local community could not have been tested, since the formal site selection process has not yet been concluded; this remains as an uncertain and open point of the comparison. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capability of VSP is demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  6. X-ray Characterization and Defect Control of III-Nitrides

    NASA Astrophysics Data System (ADS)

    Tweedie, James

    A process for controlling point defects in a semiconductor using excess charge carriers was developed in theory and practice. A theoretical framework based on first principles was developed to model the effect of excess charge carriers on the formation energy and concentration of charged point defects in a semiconductor. The framework was validated for the completely general case of a generic carrier source and a generic point defect in a generic semiconductor, and then refined for the more specific case of a generic carrier source applied during the growth of a doped semiconductor crystal. It was theoretically demonstrated that the process as defined will always reduce the degree of compensation in the semiconductor. The established theoretical framework was applied to the case of above-bandgap illumination on both the MOCVD growth and the post-growth annealing of Mg-doped GaN thin films. It was theoretically demonstrated that UV light will lower the concentration of compensating defects during growth and will facilitate complete activation of the Mg acceptor at lower annealing temperatures. Annealing experiments demonstrated that UV illumination of GaN:Mg thin films during annealing lowers the resistivity of the film at any given temperature below the 650 °C threshold at which complete activation is achieved without illumination. Broad-spectrum analysis of the photoluminescence (PL) spectra together with a correlation between the acceptor-bound exciton transition and room temperature resistivity demonstrated that UV light only acts to enhance the activation of Mg. Surface chemistry and interface chemistry of AlN and high Al mole fraction AlGaN films were studied using x-ray photoelectron spectroscopy (XPS). It was seen that the surfaces readily form stable oxides. The Schottky barrier height (SBH) of various metals contacted to these surfaces was measured using XPS. Finally, an x-ray diffraction (XRD) method was developed to quantify strain and composition of alloy films in the context of a processing environment. Reciprocal space mapping revealed intensity limitations on the accuracy of the method. The method was used to demonstrate a bimodal strain distribution across the composition spectrum for 200 nm AlGaN thin films grown on GaN. A weak, linear strain dependence on composition was observed for Al mole fractions below 30%. Above this threshold the films were observed to be completely relaxed by cracking.
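
    The central quantity in this kind of defect analysis is the formation energy of a charged defect as a function of the Fermi level, from which equilibrium concentrations and compensation follow. A generic sketch of the standard supercell expression is given below (all numerical values are placeholders, not results from this work):

```python
import numpy as np

def formation_energy(e_defect, e_bulk, dn_mu, q, e_vbm, e_fermi, e_corr=0.0):
    """E_f(D,q; E_F) = E_tot(D,q) - E_tot(bulk) - sum_i n_i*mu_i
                       + q*(E_VBM + E_F) + E_corr   (standard supercell formalism)."""
    return e_defect - e_bulk - dn_mu + q * (e_vbm + e_fermi) + e_corr

# Hypothetical acceptor with charge states 0 and -1 (energies in eV, made up).
e_f = np.linspace(0.0, 3.4, 341)                       # Fermi level across the band gap
ef_q0 = formation_energy(-852.1, -853.9, -1.6, 0, 0.0, e_f)
ef_qm1 = formation_energy(-851.3, -853.9, -1.6, -1, 0.0, e_f, e_corr=0.12)
stable = np.minimum(ef_q0, ef_qm1)                     # thermodynamically stable branch
transition = e_f[np.argmin(np.abs(ef_q0 - ef_qm1))]    # (0/-1) charge transition level
print(f"(0/-1) transition level ~ {transition:.2f} eV above the VBM")
```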

  7. PyCDT: A Python toolkit for modeling point defects in semiconductors and insulators

    DOE PAGES

    Broberg, Danny; Medasani, Bharat; Zimmermann, Nils E. R.; ...

    2018-02-13

    Point defects have a strong impact on the performance of semiconductor and insulator materials used in technological applications, spanning microelectronics to energy conversion and storage. The nature of the dominant defect types, how they vary with processing conditions, and their impact on materials properties are central aspects that determine the performance of a material in a certain application. This information is, however, difficult to access directly from experimental measurements. Consequently, computational methods, based on electronic density functional theory (DFT), have found widespread use in the calculation of point-defect properties. Here we have developed the Python Charged Defect Toolkit (PyCDT) to expedite the setup and post-processing of defect calculations with widely used DFT software. PyCDT has a user-friendly command-line interface and provides a direct interface with the Materials Project database. This allows for setting up many charged defect calculations for any material of interest, as well as post-processing and applying state-of-the-art electrostatic correction terms. Our paper serves as a documentation for PyCDT, and demonstrates its use in an application to the well-studied GaAs compound semiconductor. As a result, we anticipate that the PyCDT code will be useful as a framework for undertaking readily reproducible calculations of charged point-defect properties, and that it will provide a foundation for automated, high-throughput calculations.

  8. A graph signal filtering-based approach for detection of different edge types on airborne lidar data

    NASA Astrophysics Data System (ADS)

    Bayram, Eda; Vural, Elif; Alatan, Aydin

    2017-10-01

    Airborne Laser Scanning is a well-known remote sensing technology, which provides a dense and highly accurate, yet unorganized, point cloud of the earth's surface. During the last decade, extracting information from the data generated by airborne LiDAR systems has been addressed by many studies in geo-spatial analysis and urban monitoring applications. However, the processing of LiDAR point clouds is challenging due to their irregular structure and 3D geometry. In this study, we propose a novel framework for the detection of the boundaries of an object or scene captured by LiDAR. Our approach is motivated by edge detection techniques in vision research and it is established on graph signal filtering, which is an exciting and promising field of signal processing for irregular data types. Due to the convenient applicability of graph signal processing tools on unstructured point clouds, we achieve the detection of the edge points directly on 3D data by using a graph representation that is constructed exclusively to answer the requirements of the application. Moreover, considering the elevation data as the (graph) signal, we leverage the aerial characteristic of the airborne LiDAR data. The proposed method can be employed both for discovering the jump edges in a segmentation problem and for exploring the crease edges of a LiDAR object in a reconstruction/modeling problem, by only adjusting the filter characteristics.
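
    A minimal sketch of this style of processing is shown below (assuming a k-nearest-neighbor graph over the horizontal coordinates, elevation as the graph signal, and a plain Laplacian high-pass response; the paper's graph construction and filter designs differ in detail):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix, diags

def graph_highpass_edges(xyz, k=8, keep_fraction=0.05):
    """Flag candidate edge points by high-pass filtering the elevation signal
    on a k-NN graph (illustrative sketch of graph signal filtering on lidar)."""
    xy, z = xyz[:, :2], xyz[:, 2]
    dist, idx = cKDTree(xy).query(xy, k=k + 1)          # neighbor 0 is the point itself
    rows = np.repeat(np.arange(len(xy)), k)
    cols = idx[:, 1:].ravel()
    w = np.exp(-dist[:, 1:].ravel() ** 2)               # Gaussian edge weights
    W = coo_matrix((w, (rows, cols)), shape=(len(xy), len(xy)))
    W = 0.5 * (W + W.T)                                 # symmetrize the adjacency
    L = diags(np.asarray(W.sum(axis=1)).ravel()) - W    # combinatorial graph Laplacian
    response = np.abs(L @ z)                            # high-pass filtered elevation
    threshold = np.quantile(response, 1.0 - keep_fraction)
    return np.flatnonzero(response >= threshold)        # indices of candidate edge points

pts = np.random.default_rng(4).random((2000, 3)) * np.array([50.0, 50.0, 5.0])
print(len(graph_highpass_edges(pts)), "candidate edge points")
```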

  9. PyCDT: A Python toolkit for modeling point defects in semiconductors and insulators

    NASA Astrophysics Data System (ADS)

    Broberg, Danny; Medasani, Bharat; Zimmermann, Nils E. R.; Yu, Guodong; Canning, Andrew; Haranczyk, Maciej; Asta, Mark; Hautier, Geoffroy

    2018-05-01

    Point defects have a strong impact on the performance of semiconductor and insulator materials used in technological applications, spanning microelectronics to energy conversion and storage. The nature of the dominant defect types, how they vary with processing conditions, and their impact on materials properties are central aspects that determine the performance of a material in a certain application. This information is, however, difficult to access directly from experimental measurements. Consequently, computational methods, based on electronic density functional theory (DFT), have found widespread use in the calculation of point-defect properties. Here we have developed the Python Charged Defect Toolkit (PyCDT) to expedite the setup and post-processing of defect calculations with widely used DFT software. PyCDT has a user-friendly command-line interface and provides a direct interface with the Materials Project database. This allows for setting up many charged defect calculations for any material of interest, as well as post-processing and applying state-of-the-art electrostatic correction terms. Our paper serves as a documentation for PyCDT, and demonstrates its use in an application to the well-studied GaAs compound semiconductor. We anticipate that the PyCDT code will be useful as a framework for undertaking readily reproducible calculations of charged point-defect properties, and that it will provide a foundation for automated, high-throughput calculations.

  10. PyCDT: A Python toolkit for modeling point defects in semiconductors and insulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broberg, Danny; Medasani, Bharat; Zimmermann, Nils E. R.

    Point defects have a strong impact on the performance of semiconductor and insulator materials used in technological applications, spanning microelectronics to energy conversion and storage. The nature of the dominant defect types, how they vary with processing conditions, and their impact on materials properties are central aspects that determine the performance of a material in a certain application. This information is, however, difficult to access directly from experimental measurements. Consequently, computational methods, based on electronic density functional theory (DFT), have found widespread use in the calculation of point-defect properties. Here we have developed the Python Charged Defect Toolkit (PyCDT) to expedite the setup and post-processing of defect calculations with widely used DFT software. PyCDT has a user-friendly command-line interface and provides a direct interface with the Materials Project database. This allows for setting up many charged defect calculations for any material of interest, as well as post-processing and applying state-of-the-art electrostatic correction terms. Our paper serves as a documentation for PyCDT, and demonstrates its use in an application to the well-studied GaAs compound semiconductor. We anticipate that the PyCDT code will be useful as a framework for undertaking readily reproducible calculations of charged point-defect properties, and that it will provide a foundation for automated, high-throughput calculations.

  11. PyCDT: A Python toolkit for modeling point defects in semiconductors and insulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broberg, Danny; Medasani, Bharat; Zimmermann, Nils E. R.

    Point defects have a strong impact on the performance of semiconductor and insulator materials used in technological applications, spanning microelectronics to energy conversion and storage. The nature of the dominant defect types, how they vary with processing conditions, and their impact on materials properties are central aspects that determine the performance of a material in a certain application. This information is, however, difficult to access directly from experimental measurements. Consequently, computational methods, based on electronic density functional theory (DFT), have found widespread use in the calculation of point-defect properties. Here we have developed the Python Charged Defect Toolkit (PyCDT) to expedite the setup and post-processing of defect calculations with widely used DFT software. PyCDT has a user-friendly command-line interface and provides a direct interface with the Materials Project database. This allows for setting up many charged defect calculations for any material of interest, as well as post-processing and applying state-of-the-art electrostatic correction terms. Our paper serves as a documentation for PyCDT, and demonstrates its use in an application to the well-studied GaAs compound semiconductor. As a result, we anticipate that the PyCDT code will be useful as a framework for undertaking readily reproducible calculations of charged point-defect properties, and that it will provide a foundation for automated, high-throughput calculations.

  12. Interactive Reference Point Procedure Based on the Conic Scalarizing Function

    PubMed Central

    2014-01-01

    In multiobjective optimization methods, multiple conflicting objectives are typically converted into a single objective optimization problem with the help of scalarizing functions. The conic scalarizing function is a general characterization of Benson proper efficient solutions of non-convex multiobjective problems in terms of saddle points of scalar Lagrangian functions. This approach preserves convexity. The conic scalarizing function, as a part of a posteriori or a priori methods, has successfully been applied to several real-life problems. In this paper, we propose a conic scalarizing function based interactive reference point procedure where the decision maker actively takes part in the solution process and directs the search according to her or his preferences. An algorithmic framework for the interactive solution of multiple objective optimization problems is presented and is utilized for solving some illustrative examples. PMID:24723795
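
    The general idea of reference-point scalarization can be sketched as follows; note that this uses a generic augmented weighted Chebyshev scalarization for brevity, not the conic scalarizing function itself, and the toy objectives and reference point are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def objectives(x):
    """Toy bi-objective problem with conflicting goals."""
    return np.array([x[0] ** 2 + x[1] ** 2,
                     (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2])

def scalarized(x, ref, weights, rho=1e-3):
    # Augmented weighted Chebyshev distance to the decision maker's reference point.
    d = weights * (objectives(x) - ref)
    return np.max(d) + rho * np.sum(d)

# In an interactive procedure the decision maker would revise `ref` between
# iterations to steer the search toward preferred trade-offs.
ref, w = np.array([0.5, 0.5]), np.array([1.0, 1.0])
res = minimize(scalarized, x0=np.zeros(2), args=(ref, w), method="Nelder-Mead")
print("x* =", res.x, " objectives =", objectives(res.x))
```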

  13. The extended evolutionary synthesis: its structure, assumptions and predictions

    PubMed Central

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the ‘extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  14. A Framework for Modeling Competitive and Cooperative Computation in Retinal Processing

    NASA Astrophysics Data System (ADS)

    Moreno-Díaz, Roberto; de Blasio, Gabriel; Moreno-Díaz, Arminda

    2008-07-01

    The structure of the retina suggests that it should be treated (at least from the computational point of view) as a layered computer. Different retinal cells contribute to the coding of the signals down to ganglion cells. Also, because of the nature of the specialization of some ganglion cells, the structure suggests that all these specialization processes should take place at the inner plexiform layer and should be of a local character, prior to a global integration and frequency-spike coding by the ganglion cells. The framework we propose consists of a layered computational structure, where outer layers essentially provide band-pass space-time filtered signals which are progressively delayed, at least for their formal treatment. Specialization is supposed to take place at the inner plexiform layer by the action of spatio-temporal microkernels (acting very locally) having a center-periphery space-time structure. The resulting signals are then integrated by the ganglion cells through macrokernel structures. Practically all types of specialization found in different vertebrate retinas, as well as the quasilinear behavior in some higher vertebrates, can be modeled and simulated within this framework. Finally, possible feedback from central structures is considered. Though its relevance to retinal processing is not definitive, it is included here for the sake of completeness, since it is a formal requisite for recursiveness.

  15. Design optimization for active twist rotor blades

    NASA Astrophysics Data System (ADS)

    Mok, Ji Won

    This dissertation introduces the process of optimizing active twist rotor blades in the presence of embedded anisotropic piezo-composite actuators. Optimum design of active twist blades is a complex task, since it involves a rich design space with tightly coupled design variables. The study presents the development of an optimization framework for active helicopter rotor blade cross-sectional design. This optimization framework allows for exploring a rich and highly nonlinear design space in order to optimize the active twist rotor blades. Different analytical components are combined in the framework: cross-sectional analysis (UM/VABS), an automated mesh generator, a beam solver (DYMORE), a three-dimensional local strain recovery module, and a gradient-based optimizer within MATLAB. Through the mathematical optimization problem, the static twist actuation performance of a blade is maximized while satisfying a series of blade constraints. These constraints are associated with locations of the center of gravity and elastic axis, blade mass per unit span, fundamental rotating blade frequencies, and the blade strength based on local three-dimensional strain fields under worst-case loading conditions. Through pre-processing, limitations of the proposed process have been studied. When limitations were detected, resolution strategies were proposed. These include mesh overlapping, element distortion, trailing edge tab modeling, electrode modeling and foam implementation in the mesh generator, and the initial-point sensitivity of the current optimization scheme. Examples demonstrate the effectiveness of this process. Optimization studies were performed on the NASA/Army/MIT ATR blade case. Even though that design was built and showed significant impact on vibration reduction, the proposed optimization process showed that the design could be improved significantly. The second example, based on a model scale of the AH-64D Apache blade, emphasized the capability of this framework to explore the nonlinear design space of a complex planform. Especially for this case, detailed design is carried out to make the actual blade manufacturable. The proposed optimization framework is shown to be an effective tool for designing high-authority active twist blades to reduce vibration in future helicopter rotor blades.
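
    The structure of such a constrained design problem can be illustrated with a toy stand-in (the surrogate functions, bounds, and limits below are hypothetical and do not represent UM/VABS or DYMORE outputs): maximize twist actuation subject to a mass cap and a frequency floor.

```python
import numpy as np
from scipy.optimize import minimize

def twist_actuation(x):            # hypothetical surrogate: larger is better
    return x[0] * np.sqrt(x[1])

def mass_per_span(x):              # hypothetical mass model
    return 2.0 * x[0] + 0.5 * x[1]

def first_frequency(x):            # hypothetical fundamental rotating frequency
    return 4.0 + x[1] - 0.3 * x[0]

constraints = [
    {"type": "ineq", "fun": lambda x: 6.0 - mass_per_span(x)},    # mass <= 6
    {"type": "ineq", "fun": lambda x: first_frequency(x) - 4.5},  # frequency >= 4.5
]
res = minimize(lambda x: -twist_actuation(x), x0=[1.0, 1.0], method="SLSQP",
               bounds=[(0.1, 5.0), (0.1, 5.0)], constraints=constraints)
print("design variables:", res.x, " twist actuation:", -res.fun)
```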

  16. Segmentation of radiographic images under topological constraints: application to the femur.

    PubMed

    Gamage, Pavan; Xie, Sheng Quan; Delmas, Patrice; Xu, Wei Liang

    2010-09-01

    A framework for radiographic image segmentation under topological control based on two-dimensional (2D) image analysis was developed. The system is intended for use in common radiological tasks including fracture treatment analysis, osteoarthritis diagnostics and osteotomy management planning. The segmentation framework utilizes a generic three-dimensional (3D) model of the bone of interest to define the anatomical topology. Non-rigid registration is performed between the projected contours of the generic 3D model and extracted edges of the X-ray image to achieve the segmentation. For fractured bones, the segmentation requires an additional step where a region-based active contours curve evolution is performed with a level set Mumford-Shah method to obtain the fracture surface edge. The application of the segmentation framework to analysis of human femur radiographs was evaluated. The proposed system has two major innovations. First, definition of the topological constraints does not require a statistical learning process, so the method is generally applicable to a variety of bony anatomy segmentation problems. Second, the methodology is able to handle both intact and fractured bone segmentation. Testing on clinical X-ray images yielded an average root mean squared distance (between the automatically segmented femur contour and the manual segmented ground truth) of 1.10 mm with a standard deviation of 0.13 mm. The proposed point correspondence estimation algorithm was benchmarked against three state-of-the-art point matching algorithms, demonstrating successful non-rigid registration for the cases of interest. A topologically constrained automatic bone contour segmentation framework was developed and tested, providing robustness to noise, outliers, deformations and occlusions.

  17. Framework and components for effective discharge planning system: a delphi methodology

    PubMed Central

    2012-01-01

    Background To reduce avoidable hospital readmissions, effective discharge planning and appropriate post-discharge support care are key requirements. This study is a 3-stage process to develop, pretest and pilot a framework for an effective discharge planning system in Hong Kong. This paper reports on the methodology of the Delphi approach and the findings of the second stage, in which the developed framework was pre-tested to validate its applicability and practicability and consensus was sought on the key components of discharge planning. Methods A Delphi methodology was adopted to engage a group of experienced healthcare professionals to rate and discuss the framework and components of effective discharge planning. The framework consisted of 36 statements under 5 major themes: initial screening, discharge planning process, coordination of discharge, implementation of discharge, and post-discharge follow-up. Each statement was rated independently on 3 aspects, clarity, validity and applicability, using a 5-point Likert scale. Statements for which 75% or more of participants scored 4–5 on all 3 aspects were included in the discharge planning framework. Statements not reaching 75% consensus on any one aspect were revised or discarded following group discussion and re-rated in another round. Results A total of 24 participants took part in the consensus-building process. In the round one rating, consensus was achieved for 25 out of 36 statements. Among the 11 statements not reaching consensus, the major concern was the "applicability" of the statements. The participants expressed a lack of manpower, skills and time, particularly during weekends and long holidays, for carrying out assessments and care plans within 24 h after admission. There were also timeliness and availability issues in providing transportation and necessary equipment to the patients. To make the statements more applicable, the wording of some statements was revised to provide greater flexibility. Because the framework lacked a statement clarifying the role of the members of the healthcare professional team, one additional statement on the roles and responsibilities of the multidisciplinary team members was added. The first theme, "initial screening", was further revised to "initial screening and assessment" to better reflect the first stage of the discharge planning process. After two rounds of rating, all 36 statements and the newly added statement reached consensus. Conclusions A structured, systematic and coordinated system of hospital discharge is required to facilitate the discharge process, ensure a smooth patient transition from the hospital to the community, and improve patient health outcomes in both clinical and social aspects. The findings of this paper provide a reference framework to help policymakers and hospital managers facilitate the development of a coherent and systematized discharge planning process. Adopting a Delphi approach also demonstrates the value of the method as a pre-test (before the clinical run) of the components and requirements of a discharge planning system, taking into account the local context and system constraints, which leads to improvements in its applicability and practicability.
To confirm the applicability and practicability of this consensus framework for discharge planning system, the third stage of process of development of the discharge planning framework is to apply and pilot the framework in a hospital setting to evaluate its feasibility, applicability and impact in hospital including satisfaction from both the perspectives of staff and patients. PMID:23151173
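
    The 75% consensus rule described above is mechanical enough to sketch in a few lines (the ratings below are synthetic; the rule itself is simply the one stated in the Methods):

```python
import numpy as np

def reaches_consensus(ratings, threshold=0.75):
    """ratings: (n_panelists, 3) array of 1-5 Likert scores for clarity, validity
    and applicability of one statement. Consensus requires >= 75% of panelists
    scoring 4 or 5 on every one of the three aspects."""
    agree = (np.asarray(ratings) >= 4).mean(axis=0)   # fraction scoring 4-5 per aspect
    return bool(np.all(agree >= threshold))

panel = np.random.default_rng(5).integers(3, 6, size=(24, 3))   # 24 synthetic panelists
print(reaches_consensus(panel))
```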

  18. The estimation of branching curves in the presence of subject-specific random effects.

    PubMed

    Elmi, Angelo; Ratcliffe, Sarah J; Guo, Wensheng

    2014-12-20

    Branching curves are a technique for modeling curves that change trajectory at a change (branching) point. Currently, the estimation framework is limited to independent data, and smoothing splines are used for estimation. This article aims to extend the branching curve framework to the longitudinal data setting where the branching point varies by subject. If the branching point is modeled as a random effect, then the longitudinal branching curve framework is a semiparametric nonlinear mixed effects model. Given existing issues with using random effects within a smoothing spline, we express the model as a B-spline based semiparametric nonlinear mixed effects model. Simple, clever smoothness constraints are enforced on the B-splines at the change point. The method is applied to Women's Health data where we model the shape of the labor curve (cervical dilation measured longitudinally) before and after treatment with oxytocin (a labor stimulant). Copyright © 2014 John Wiley & Sons, Ltd.
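
    The branching idea can be illustrated with a much simpler stand-in than the full B-spline mixed-effects model: a piecewise-linear curve that changes slope at the branching point while remaining continuous there (the hinge basis plays the role of the constrained spline; the data and change point are synthetic):

```python
import numpy as np

def fit_branching_line(t, y, tau):
    """Least-squares fit of y ~ b0 + b1*t + b2*max(0, t - tau): continuous at the
    branching point `tau`, with a different slope after it (toy stand-in for the
    constrained B-spline branching curve)."""
    X = np.column_stack([np.ones_like(t), t, np.maximum(0.0, t - tau)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, X @ beta

# Synthetic labor-curve-like data: slow dilation, then faster after treatment at t = 6.
rng = np.random.default_rng(6)
t = np.linspace(0.0, 12.0, 60)
y = 1.0 + 0.3 * t + 0.8 * np.maximum(0.0, t - 6.0) + rng.normal(0.0, 0.2, t.size)
beta, fitted = fit_branching_line(t, y, tau=6.0)
print("pre-change slope %.2f, post-change slope %.2f" % (beta[1], beta[1] + beta[2]))
```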

  19. The European Qualifications Framework: A Technical Critique

    ERIC Educational Resources Information Center

    Lester, Stan

    2015-01-01

    The European Qualifications Framework (EQF) was introduced in 2008 as a "meta-framework" or common reference point for national qualifications frameworks in Europe, a function for which, with some caveats, it has been pragmatically successful. It has also been used with variable success to support the development or referencing of…

  20. DNA as information: at the crossroads between biology, mathematics, physics and chemistry

    PubMed Central

    2016-01-01

    On the one hand, biology, chemistry and also physics tell us how the process of translating the genetic information into life could possibly work, but we are still very far from a complete understanding of this process. On the other hand, mathematics and statistics give us methods to describe such natural systems—or parts of them—within a theoretical framework. Also, they provide us with hints and predictions that can be tested at the experimental level. Furthermore, there are peculiar aspects of the management of genetic information that are intimately related to information theory and communication theory. This theme issue is aimed at fostering the discussion on the problem of genetic coding and information through the presentation of different innovative points of view. The aim of the editors is to stimulate discussions and scientific exchange that will lead to new research on why and how life can exist from the point of view of the coding and decoding of genetic information. The present introduction represents the point of view of the editors on the main aspects that could be the subject of future scientific debate. PMID:26857674

  1. Strong field QED in lepton colliders and electron/laser interactions

    NASA Astrophysics Data System (ADS)

    Hartin, Anthony

    2018-05-01

    The studies of strong field particle physics processes in electron/laser interactions and lepton collider interaction points (IPs) are reviewed. These processes are defined by the high intensity of the electromagnetic fields involved and the need to take them into account as fully as possible. Thus, the main theoretical framework considered is the Furry interaction picture within intense field quantum field theory. In this framework, the influence of a background electromagnetic field in the Lagrangian is calculated nonperturbatively, involving exact solutions for quantized charged particles in the background field. These “dressed” particles go on to interact perturbatively with other particles, enabling the background field to play both macroscopic and microscopic roles. Macroscopically, the background field starts to polarize the vacuum, in effect rendering it a dispersive medium. Particles encountering this dispersive vacuum obtain a lifetime, either radiating or decaying into pair particles at a rate dependent on the intensity of the background field. In fact, the intensity of the background field enters into the coupling constant of the strong field quantum electrodynamic Lagrangian, influencing all particle processes. A number of new phenomena occur. Particles gain an intensity-dependent rest mass shift that accounts for their presence in the dispersive vacuum. Multi-photon events involving more than one external field photon occur at each vertex. Higher order processes which exchange a virtual strong field particle resonate via the lifetimes of the unstable strong field states. Two main arenas of strong field physics are reviewed: those occurring in relativistic electron interactions with intense laser beams, and those occurring in the beam-beam physics at the interaction point of colliders. This review outlines the theory, describes its significant novel phenomenology, and details the experimental schemes required to detect strong field effects and the simulation programs required to model them.
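
    For orientation, the intensity-dependent rest mass shift mentioned above is commonly quoted in the following form (a standard textbook expression; the exact numerical factors depend on the polarization and on how the intensity parameter is averaged, so treat it as indicative):

```latex
\xi = \frac{e\,E_{\mathrm{rms}}}{m_e\,\omega\,c}, \qquad
m_{*} = m_e\sqrt{1+\xi^{2}} .
```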

  2. Vehicle security encryption based on unlicensed encryption

    NASA Astrophysics Data System (ADS)

    Huang, Haomin; Song, Jing; Xu, Zhijia; Ding, Xiaoke; Deng, Wei

    2018-03-01

    The current vehicle key is easily damaged or destroyed, so the use of an elliptic curve encryption algorithm is proposed to improve the reliability of the vehicle security system. Based on the encryption rules of elliptic curves, the chip's framework and hardware structure are designed, and the chip's calculation process is then simulated and analyzed in software. The simulation achieved the expected target. Finally, some issues in the data calculation concerning the chip's storage control and other modules are pointed out.
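
    For reference, a minimal software sketch of elliptic curve signing and verification is given below, using the Python `cryptography` package as a stand-in; the chip-level design described in the paper and any vehicle-specific protocol details are not reproduced here, and the challenge string is invented:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Key pair: the private key would live in the key fob, the public key in the vehicle.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

challenge = b"vehicle-unlock-challenge-0001"            # nonce sent by the vehicle
signature = private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# The vehicle verifies the fob's response; verify() raises InvalidSignature on failure.
public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
print("signature verified")
```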

  3. Consideration of reference points for the management of renewable resources under an adaptive management paradigm

    USGS Publications Warehouse

    Irwin, Brian J.; Conroy, Michael J.

    2013-01-01

    The success of natural resource management depends on monitoring, assessment and enforcement. In support of these efforts, reference points (RPs) are often viewed as critical values of management-relevant indicators. This paper considers RPs from the standpoint of objective-driven decision making in dynamic resource systems, guided by principles of structured decision making (SDM) and adaptive resource management (AM). During the development of natural resource policy, RPs have been variously treated as either ‘targets’ or ‘triggers’. Under a SDM/AM paradigm, target RPs correspond approximately to value-based objectives, which may in turn be either of fundamental interest to stakeholders or intermediaries to other central objectives. By contrast, trigger RPs correspond to decision rules that are presumed to lead to desirable outcomes (such as the programme targets). Casting RPs as triggers or targets within a SDM framework is helpful towards clarifying why (or whether) a particular metric is appropriate. Further, the benefits of a SDM/AM process include elucidation of underlying untested assumptions that may reveal alternative metrics for use as RPs. Likewise, a structured decision-analytic framework may also reveal that failure to achieve management goals is not because the metrics are wrong, but because the decision-making process in which they are embedded is insufficiently robust to uncertainty, is not efficiently directed at producing a resource objective, or is incapable of adaptation to new knowledge.

  4. The AskIT Service Desk: A Model for Improving Productivity and Reducing Costs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashcraft, Phillip Lynn; Fogle, Blythe G.; Cummings, Susan M.

    This was prepared for the business process improvement presentation to the Department of Energy. Los Alamos National Laboratory provides a single point of contact, the AskIT Service Desk, to address issues that impact customer productivity. At the most basic level, what customers want is for their calls to be received, to get a response from a knowledgeable analyst, and to have their issues resolved and their requests fulfilled. Providing a centralized, single point of contact service desk makes initiating technical or business support simple for the customer and improves the odds of immediately resolving the issue or correctly escalating the request to the next support level when necessary. Fulfilling customer requests through automated workflow also improves customer productivity and reduces costs. Finally, customers should be provided the option to solve their own problems through easy access to self-help resources such as frequently asked questions (FAQs) and how-to guides. To accomplish this, everyone who provides and supports services must understand how these processes and functions work together. Service providers and those who support services must “speak the same language” and share common objectives. The Associate Directorate for Business Innovation (ADBI) began the journey to improve services by selecting a known service delivery framework (Information Technology Infrastructure Library, or ITIL). From this framework, components that contribute significant business value were selected.

  5. REDD+ and climate smart agriculture in landscapes: A case study in Vietnam using companion modelling.

    PubMed

    Salvini, G; Ligtenberg, A; van Paassen, A; Bregt, A K; Avitabile, V; Herold, M

    2016-05-01

    Finding land use strategies that merge land-based climate change mitigation measures and adaptation strategies is still an open issue in climate discourse. This article explores synergies and trade-offs between REDD+, a scheme that focuses mainly on mitigation through forest conservation, with "Climate Smart Agriculture", an approach that emphasizes adaptive agriculture. We introduce a framework for ex-ante assessment of the impact of land management policies and interventions and for quantifying their impacts on land-based mitigation and adaptation goals. The framework includes a companion modelling (ComMod) process informed by interviews with policymakers, local experts and local farmers. The ComMod process consists of a Role-Playing Game with local farmers and an Agent Based Model. The game provided a participatory means to develop policy and climate change scenarios. These scenarios were then used as inputs to the Agent Based Model, a spatially explicit model to simulate landscape dynamics and the associated carbon emissions over decades. We applied the framework using as case study a community in central Vietnam, characterized by deforestation for subsistence agriculture and cultivation of acacias as a cash crop. The main findings show that the framework is useful in guiding consideration of local stakeholders' goals, needs and constraints. Additionally the framework provided beneficial information to policymakers, pointing to ways that policies might be re-designed to make them better tailored to local circumstances and therefore more effective in addressing synergistically climate change mitigation and adaptation objectives. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. A screening-level modeling approach to estimate nitrogen ...

    EPA Pesticide Factsheets

    This paper presents a screening-level modeling approach that can be used to rapidly estimate nutrient loading and assess numerical nutrient standard exceedance risk of surface waters leading to potential classification as impaired for designated use. It can also be used to explore best management practice (BMP) implementation to reduce loading. The modeling framework uses a hybrid statistical and process based approach to estimate source of pollutants, their transport and decay in the terrestrial and aquatic parts of watersheds. The framework is developed in the ArcGIS environment and is based on the total maximum daily load (TMDL) balance model. Nitrogen (N) is currently addressed in the framework, referred to as WQM-TMDL-N. Loading for each catchment includes non-point sources (NPS) and point sources (PS). NPS loading is estimated using export coefficient or event mean concentration methods depending on the temporal scales, i.e., annual or daily. Loading from atmospheric deposition is also included. The probability of a nutrient load to exceed a target load is evaluated using probabilistic risk assessment, by including the uncertainty associated with export coefficients of various land uses. The computed risk data can be visualized as spatial maps which show the load exceedance probability for all stream segments. In an application of this modeling approach to the Tippecanoe River watershed in Indiana, USA, total nitrogen (TN) loading and risk of standard exce
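
    The export-coefficient loading calculation and the exceedance-probability step described above can be illustrated with a minimal sketch. The land-use classes, coefficient means and standard deviations, point-source load and target load below are hypothetical placeholders rather than WQM-TMDL-N inputs; the sketch simply draws uncertain export coefficients in a Monte Carlo loop and reports the probability that the total load exceeds the target.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical catchment: area (ha) per land-use class and (mean, sd) of the
      # nitrogen export coefficient (kg N / ha / yr); values are illustrative only.
      land_use_area = {"cropland": 1200.0, "pasture": 800.0, "urban": 150.0}
      export_coeff = {"cropland": (14.0, 3.0), "pasture": (5.0, 1.5), "urban": (9.0, 2.5)}

      point_source_load = 4000.0   # kg N / yr, assumed known
      target_load = 30000.0        # kg N / yr, assumed numerical standard

      n_draws = 10000
      total_loads = np.empty(n_draws)
      for i in range(n_draws):
          nps = sum(area * rng.normal(*export_coeff[lu])
                    for lu, area in land_use_area.items())
          total_loads[i] = nps + point_source_load

      exceedance_prob = np.mean(total_loads > target_load)
      print(f"mean annual TN load: {total_loads.mean():.0f} kg N/yr")
      print(f"P(load > target): {exceedance_prob:.2f}")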

  7. Theoretical and Empirical Comparison of Big Data Image Processing with Apache Hadoop and Sun Grid Engine

    PubMed Central

    Bao, Shunxing; Weitendorf, Frederick D.; Plassard, Andrew J.; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A.

    2016-01-01

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., “short” processing times and/or “large” datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply “large scale” processing transitions into “big data” and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and non-relevant for medical imaging. PMID:28736473

  8. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis makes it possible to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
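
    A minimal sketch of the coincidence-counting step is given below. It assumes a simple precursor-type definition (the fraction of events in one series that are preceded, within a tolerance window and lag, by at least one event of the other series); the window length, lag and the toy flood/outbreak dates are illustrative only, and the significance tests against Poisson or more general point-process null models described in the abstract are not reproduced.

      import numpy as np

      def coincidence_rate(events_a, events_b, delta_t=7.0, lag=0.0):
          """Fraction of events in events_b preceded, within the window
          [t_b - lag - delta_t, t_b - lag], by at least one event in events_a.
          Times are assumed to share the same unit (e.g. days)."""
          events_a = np.asarray(events_a, dtype=float)
          hits = 0
          for t_b in events_b:
              lo, hi = t_b - lag - delta_t, t_b - lag
              if np.any((events_a >= lo) & (events_a <= hi)):
                  hits += 1
          return hits / len(events_b)

      # Toy example with hypothetical flood and outbreak dates (day numbers).
      floods = [10, 55, 120, 200, 310]
      outbreaks = [14, 130, 205, 400]
      print(coincidence_rate(floods, outbreaks, delta_t=14.0))  # -> 0.75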

  9. An operational information systems architecture for assessing sustainable transportation planning: principles and design.

    PubMed

    Borzacchiello, Maria Teresa; Torrieri, Vincenzo; Nijkamp, Peter

    2009-11-01

    This paper offers the description of an integrated information system framework for the assessment of transportation planning and management. After an introductory exposition, in the first part of the paper, a broad overview of international experiences regarding information systems on transportation is given, focusing in particular on the relationship between transportation system's performance monitoring and the decision-making process, and on the importance of this connection in the evaluation and planning process, in Italian and European cases. Next, the methodological design of an information system to support efficient and sustainable transportation planning and management aiming to integrate inputs from several different data sources is presented. The resulting framework deploys modular and integrated databases which include data stemming from different national or regional data banks and which integrate information belonging to different transportation fields. For this reason, it allows public administrations to account for many strategic elements that influence their decisions regarding transportation, both from a systemic and infrastructural point of view.

  10. Time-domain damping models in structural acoustics using digital filtering

    NASA Astrophysics Data System (ADS)

    Parret-Fréaud, Augustin; Cotté, Benjamin; Chaigne, Antoine

    2016-02-01

    This paper describes a new approach to formulating well-posed time-domain damping models able to represent various frequency-domain profiles of damping properties. The novelty of this approach is to represent the behavior law of a given material directly in a discrete-time framework as a digital filter, which is synthesized for each material from a discrete set of frequency-domain data, such as the complex modulus, through an optimization process. A key point is the addition of specific constraints to this process in order to guarantee stability, causality and verification of the second law of thermodynamics when transposing the resulting discrete-time behavior law into the time domain. Thus, this method offers a framework which is particularly suitable for time-domain simulations in structural dynamics and acoustics for a wide range of materials (polymers, wood, foam, etc.), making it possible to control and even reduce the distortion effects induced by time-discretization schemes on the frequency response of continuous-time behavior laws.

  11. Gaussian processes with optimal kernel construction for neuro-degenerative clinical onset prediction

    NASA Astrophysics Data System (ADS)

    Canas, Liane S.; Yvernault, Benjamin; Cash, David M.; Molteni, Erika; Veale, Tom; Benzinger, Tammie; Ourselin, Sébastien; Mead, Simon; Modat, Marc

    2018-02-01

    Gaussian Processes (GP) are a powerful tool to capture the complex time-variations of a dataset. In the context of medical imaging analysis, they allow robust modelling even in the case of highly uncertain or incomplete datasets. Predictions from GP are dependent on the covariance kernel function selected to explain the data variance. To overcome this limitation, we propose a framework to identify the optimal covariance kernel function to model the data. The optimal kernel is defined as a composition of base kernel functions used to identify correlation patterns between data points. Our approach includes a modified version of the Compositional Kernel Learning (CKL) algorithm, in which we score the kernel families using a new energy function that depends on both the Bayesian Information Criterion (BIC) and the explained variance score. We applied the proposed framework to model the progression of neurodegenerative diseases over time, in particular the progression of autosomal dominantly-inherited Alzheimer's disease, and use it to predict the time to clinical onset of subjects carrying the genetic mutation.
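
    A rough sketch of the kernel-composition idea, using scikit-learn Gaussian process kernels, is shown below. The greedy one-level search and the score (BIC minus a weighted explained-variance term) are stand-ins for the modified CKL algorithm and energy function of the paper, and the toy biomarker data, the weight parameter and the base-kernel set are assumptions chosen for illustration.

      import numpy as np
      from itertools import product
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import (RBF, RationalQuadratic,
                                                    DotProduct, WhiteKernel)

      def bic(gp, n):
          # Number of optimized hyperparameters used as a proxy for model complexity.
          k = gp.kernel_.theta.size
          return k * np.log(n) - 2.0 * gp.log_marginal_likelihood_value_

      def search_kernel(X, y, base_kernels, weight=1.0):
          """Greedy one-level composition: try each base kernel and each pairwise
          sum/product, score by BIC minus a weighted explained variance (R^2)."""
          candidates = list(base_kernels)
          for k1, k2 in product(base_kernels, repeat=2):
              candidates += [k1 + k2, k1 * k2]
          best = None
          for kern in candidates:
              gp = GaussianProcessRegressor(kernel=kern + WhiteKernel(), normalize_y=True)
              gp.fit(X, y)
              score = bic(gp, len(y)) - weight * gp.score(X, y)
              if best is None or score < best[0]:
                  best = (score, gp)
          return best[1]

      # Toy longitudinal data: one scalar biomarker versus time.
      t = np.linspace(0, 10, 40)[:, None]
      y = np.sin(t).ravel() + 0.1 * t.ravel() + 0.05 * np.random.randn(40)
      model = search_kernel(t, y, [RBF(), RationalQuadratic(), DotProduct()])
      print(model.kernel_)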

  12. Fast algorithm for probabilistic bone edge detection (FAPBED)

    NASA Astrophysics Data System (ADS)

    Scepanovic, Danilo; Kirshtein, Joshua; Jain, Ameet K.; Taylor, Russell H.

    2005-04-01

    The registration of preoperative CT to intra-operative reality systems is a crucial step in Computer Assisted Orthopedic Surgery (CAOS). The intra-operative sensors include 3D digitizers, fiducials, X-rays and Ultrasound (US). FAPBED is designed to process CT volumes for registration to tracked US data. Tracked US is advantageous because it is real time, noninvasive, and non-ionizing, but it is also known to have inherent inaccuracies which create the need to develop a framework that is robust to various uncertainties, and can be useful in US-CT registration. Furthermore, conventional registration methods depend on accurate and absolute segmentation. Our proposed probabilistic framework addresses the segmentation-registration duality, wherein exact segmentation is not a prerequisite to achieve accurate registration. In this paper, we develop a method for fast and automatic probabilistic bone surface (edge) detection in CT images. Various features that influence the likelihood of the surface at each spatial coordinate are combined using a simple probabilistic framework, which strikes a fair balance between a high-level understanding of features in an image and the low-level number crunching of standard image processing techniques. The algorithm evaluates different features for detecting the probability of a bone surface at each voxel, and compounds the results of these methods to yield a final, low-noise, probability map of bone surfaces in the volume. Such a probability map can then be used in conjunction with a similar map from tracked intra-operative US to achieve accurate registration. Eight sample pelvic CT scans were used to extract feature parameters and validate the final probability maps. An un-optimized fully automatic Matlab code runs in five minutes per CT volume on average, and was validated by comparison against hand-segmented gold standards. The mean probability assigned to nonzero surface points was 0.8, while nonzero non-surface points had a mean value of 0.38 indicating clear identification of surface points on average. The segmentation was also sufficiently crisp, with a full width at half maximum (FWHM) value of 1.51 voxels.

  13. Revisiting the diffusion approximation to estimate evolutionary rates of gene family diversification.

    PubMed

    Gjini, Erida; Haydon, Daniel T; David Barry, J; Cobbold, Christina A

    2014-01-21

    Genetic diversity in multigene families is shaped by multiple processes, including gene conversion and point mutation. Because multi-gene families are involved in crucial traits of organisms, quantifying the rates of their genetic diversification is important. With increasing availability of genomic data, there is a growing need for quantitative approaches that integrate the molecular evolution of gene families with their higher-scale function. In this study, we integrate a stochastic simulation framework with population genetics theory, namely the diffusion approximation, to investigate the dynamics of genetic diversification in a gene family. Duplicated genes can diverge and encode new functions as a result of point mutation, and become more similar through gene conversion. To model the evolution of pairwise identity in a multigene family, we first consider all conversion and mutation events in a discrete manner, keeping track of their details and times of occurrence; second we consider only the infinitesimal effect of these processes on pairwise identity accounting for random sampling of genes and positions. The purely stochastic approach is closer to biological reality and is based on many explicit parameters, such as conversion tract length and family size, but is more challenging analytically. The population genetics approach is an approximation accounting implicitly for point mutation and gene conversion, only in terms of per-site average probabilities. Comparison of these two approaches across a range of parameter combinations reveals that they are not entirely equivalent, but that for certain relevant regimes they do match. As an application of this modelling framework, we consider the distribution of nucleotide identity among VSG genes of African trypanosomes, representing the most prominent example of a multi-gene family mediating parasite antigenic variation and within-host immune evasion. © 2013 Published by Elsevier Ltd. All rights reserved.

  14. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently, the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.

  15. Multibeam 3D Underwater SLAM with Probabilistic Registration.

    PubMed

    Palomer, Albert; Ridao, Pere; Ribas, David

    2016-04-20

    This paper describes a pose-based underwater 3D Simultaneous Localization and Mapping (SLAM) using a multibeam echosounder to produce high consistency underwater maps. The proposed algorithm compounds swath profiles of the seafloor with dead reckoning localization to build surface patches (i.e., point clouds). An Iterative Closest Point (ICP) with a probabilistic implementation is then used to register the point clouds, taking into account their uncertainties. The registration process is divided into two steps: (1) point-to-point association for coarse registration and (2) point-to-plane association for fine registration. The point clouds of the surfaces to be registered are sub-sampled in order to decrease both the computation time and the potential of falling into local minima during the registration. In addition, a heuristic is used to decrease the complexity of the association step of the ICP from O(n²) to O(n). The performance of the SLAM framework is tested using two real world datasets: First, a 2.5D bathymetric dataset obtained with the usual down-looking multibeam sonar configuration, and second, a full 3D underwater dataset acquired with a multibeam sonar mounted on a pan and tilt unit.
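
    The coarse, point-to-point stage of the registration can be sketched as a standard ICP loop with an SVD-based rigid alignment, as below. The probabilistic, uncertainty-aware association and the point-to-plane refinement of the paper are not reproduced; this is only the textbook variant of the first step.

      import numpy as np
      from scipy.spatial import cKDTree

      def icp_point_to_point(source, target, n_iter=20):
          """Coarse point-to-point ICP on Nx3 arrays: associate each source point
          with its nearest target point and solve for the rigid transform via SVD."""
          src = source.copy()
          tree = cKDTree(target)
          R_total, t_total = np.eye(3), np.zeros(3)
          for _ in range(n_iter):
              _, idx = tree.query(src)
              matched = target[idx]
              mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
              H = (src - mu_s).T @ (matched - mu_t)
              U, _, Vt = np.linalg.svd(H)
              R = Vt.T @ U.T
              if np.linalg.det(R) < 0:       # avoid reflections
                  Vt[-1] *= -1
                  R = Vt.T @ U.T
              t = mu_t - R @ mu_s
              src = src @ R.T + t
              R_total, t_total = R @ R_total, R @ t_total + t
          return R_total, t_total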

  16. Distribution majorization of corner points by reinforcement learning for moving object detection

    NASA Astrophysics Data System (ADS)

    Wu, Hao; Yu, Hao; Zhou, Dongxiang; Cheng, Yongqiang

    2018-04-01

    Corner points play an important role in moving object detection, especially in the case of a free-moving camera. Corner points provide more accurate information than other pixels and reduce unnecessary computation. Previous works use only intensity information to locate corner points; however, the information provided by preceding and subsequent frames can also be used. We utilize this information to focus on more valuable areas and ignore less valuable ones. The proposed algorithm is based on reinforcement learning, which regards the detection of corner points as a Markov process. In the Markov model, the video to be detected is regarded as the environment, the selections of blocks for one corner point are regarded as actions, and the performance of detection is regarded as the state. Corner points are assigned to blocks which are separated from the original whole image. Experimentally, we select a conventional method that uses matching and the Random Sample Consensus algorithm to obtain objects as the main framework, and utilize our algorithm to improve the result. A comparison between the conventional method and the same method augmented with our algorithm shows that our algorithm reduces false detections by 70%.

  17. Modeling Sea-Level Change using Errors-in-Variables Integrated Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    We perform Bayesian inference on historical and late Holocene (last 2000 years) rates of sea-level change. The data that form the input to our model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. To accurately estimate rates of sea-level change and reliably compare tide-gauge compilations with proxy reconstructions it is necessary to account for the uncertainties that characterize each dataset. Many previous studies used simple linear regression models (most commonly polynomial regression) resulting in overly precise rate estimates. The model we propose uses an integrated Gaussian process approach, where a Gaussian process prior is placed on the rate of sea-level change and the data itself is modeled as the integral of this rate process. The non-parametric Gaussian process model is known to be well suited to modeling time series data. The advantage of using an integrated Gaussian process is that it allows for the direct estimation of the derivative of a one dimensional curve. The derivative at a particular time point will be representative of the rate of sea level change at that time point. The tide gauge and proxy data are complicated by multiple sources of uncertainty, some of which arise as part of the data collection exercise. Most notably, the proxy reconstructions include temporal uncertainty from dating of the sediment core using techniques such as radiocarbon. As a result of this, the integrated Gaussian process model is set in an errors-in-variables (EIV) framework so as to take account of this temporal uncertainty. The data must be corrected for land-level change known as glacio-isostatic adjustment (GIA) as it is important to isolate the climate-related sea-level signal. The correction for GIA introduces covariance between individual age and sea level observations into the model. The proposed integrated Gaussian process model allows for the estimation of instantaneous rates of sea-level change and accounts for all available sources of uncertainty in tide-gauge and proxy-reconstruction data. Our response variable is sea level after correction for GIA. By embedding the integrated process in an errors-in-variables (EIV) framework, and removing the estimate of GIA, we can quantify rates with better estimates of uncertainty than previously possible. The model provides a flexible fit and enables us to estimate rates of change at any given time point, thus observing how rates have been evolving from the past to present day.

  18. Whole systems shared governance: a model for the integrated health system.

    PubMed

    Evan, K; Aubry, K; Hawkins, M; Curley, T A; Porter-O'Grady, T

    1995-05-01

    The healthcare system is under renovation and renewal. In the process, roles and structures are shifting to support a subscriber-based continuum of care. Alliances and partnerships are emerging as the models of integration for the future. But how do we structure to support these emerging integrated partnerships? As the nurse executive expands the role and assumes increasing responsibility for creating new frameworks for care, a structure that sustains the point-of-care innovations and interdisciplinary relationships must be built. Whole systems models of organization, such as shared governance, are expanding as demand grows for a sustainable structure for horizontal and partnered systems of healthcare delivery. The executive will have to apply these newer frameworks to the delivery of care to provide adequate support for the clinically integrated environment.

  19. Study on Full Supply Chain Quality and Safety Traceability Systems for Cereal and Oil Products

    NASA Astrophysics Data System (ADS)

    Liu, Shihong; Zheng, Huoguo; Meng, Hong; Hu, Haiyan; Wu, Jiangshou; Li, Chunhua

    The global food industry and governments in many countries are putting increasing emphasis on the establishment of food traceability systems. Food traceability has become an effective way in food safety management. Aimed at the major quality problems of cereal and oil products existing in the production, processing, warehousing, distribution and other links of the supply chain, this paper first proposes a new traceability framework that combines the information flow with critical control points and quality indicators. It then introduces the traceability database design and data access mode used to realize the framework. In practice, code design for tracing goods is a challenging task, so this paper puts forward a code system based on the UCC/EAN-128 standard. Middleware and electronic terminal design are also briefly introduced to complete the traceability system for cereal and oil products.

  20. Cardea: Dynamic Access Control in Distributed Systems

    NASA Technical Reports Server (NTRS)

    Lepro, Rebekah

    2004-01-01

    Modern authorization systems span domains of administration, rely on many different authentication sources, and manage complex attributes as part of the authorization process. This paper presents Cardea, a distributed system that facilitates dynamic access control, as a valuable piece of an inter-operable authorization framework. First, the authorization model employed in Cardea and its functionality goals are examined. Next, critical features of the system architecture and its handling of the authorization process are examined. Then the SAML and XACML standards, as incorporated into the system, are analyzed. Finally, the future directions of this project are outlined and connection points with general components of an authorization system are highlighted.

  1. Interpersonal Emotion Regulation Model of Mood and Anxiety Disorders

    PubMed Central

    Hofmann, Stefan G.

    2014-01-01

    Although social factors are of critical importance in the development and maintenance of emotional disorders, the contemporary view of emotion regulation has been primarily limited to intrapersonal processes. Based on diverse perspectives pointing to the communicative function of emotions, the social processes in self-regulation, and the role of social support, this article presents an interpersonal model of emotion regulation of mood and anxiety disorders. This model provides a theoretical framework to understand and explain how mood and anxiety disorders are regulated and maintained through others. The literature, which provides support for the model, is reviewed and the clinical implications are discussed. PMID:25267867

  2. A Computational Framework for Automation of Point Defect Calculations

    NASA Astrophysics Data System (ADS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration

    A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials by design community to assess the impact of point defects on materials performance. National Renewable Energy Laboratory, Golden, Colorado 80401.
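
    The correction terms listed above enter the usual charged-defect formation-energy bookkeeping. The sketch below assumes the standard expression E_f = E_tot(defect, q) - E_tot(host) - sum_i n_i mu_i + q(E_VBM + E_F + dV_align) + E_image + E_bandfill; the numerical values in the example are illustrative placeholders, not DFT results, and the corrections are taken as inputs rather than computed.

      def defect_formation_energy(e_defect, e_host, dn_mu, charge, e_vbm, e_fermi,
                                  dv_align=0.0, e_image=0.0, e_bandfill=0.0):
          """Standard charged-defect formation energy (eV).

          dn_mu: list of (n_added, chemical_potential) pairs, where n_added is the
          number of atoms of a species added (+) or removed (-) to create the defect.
          dv_align, e_image and e_bandfill are the potential-alignment, image-charge
          and band-filling corrections mentioned in the abstract (supplied as inputs
          here, not computed)."""
          elemental = sum(n * mu for n, mu in dn_mu)
          return (e_defect - e_host - elemental
                  + charge * (e_vbm + e_fermi + dv_align)
                  + e_image + e_bandfill)

      # Illustrative numbers only (not real DFT output): a +2 vacancy defect.
      print(defect_formation_energy(
          e_defect=-1021.3, e_host=-1025.8, dn_mu=[(-1, -4.5)],
          charge=+2, e_vbm=3.2, e_fermi=0.5, dv_align=0.05, e_image=0.21))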

  3. A hierarchical model combining distance sampling and time removal to estimate detection probability during avian point counts

    USGS Publications Warehouse

    Amundson, Courtney L.; Royle, J. Andrew; Handel, Colleen M.

    2014-01-01

    Imperfect detection during animal surveys biases estimates of abundance and can lead to improper conclusions regarding distribution and population trends. Farnsworth et al. (2005) developed a combined distance-sampling and time-removal model for point-transect surveys that addresses both availability (the probability that an animal is available for detection; e.g., that a bird sings) and perceptibility (the probability that an observer detects an animal, given that it is available for detection). We developed a hierarchical extension of the combined model that provides an integrated analysis framework for a collection of survey points at which both distance from the observer and time of initial detection are recorded. Implemented in a Bayesian framework, this extension facilitates evaluating covariates on abundance and detection probability, incorporating excess zero counts (i.e. zero-inflation), accounting for spatial autocorrelation, and estimating population density. Species-specific characteristics, such as behavioral displays and territorial dispersion, may lead to different patterns of availability and perceptibility, which may, in turn, influence the performance of such hierarchical models. Therefore, we first test our proposed model using simulated data under different scenarios of availability and perceptibility. We then illustrate its performance with empirical point-transect data for a songbird that consistently produces loud, frequent, primarily auditory signals, the Golden-crowned Sparrow (Zonotrichia atricapilla); and for 2 ptarmigan species (Lagopus spp.) that produce more intermittent, subtle, and primarily visual cues. Data were collected by multiple observers along point transects across a broad landscape in southwest Alaska, so we evaluated point-level covariates on perceptibility (observer and habitat), availability (date within season and time of day), and abundance (habitat, elevation, and slope), and included a nested point-within-transect and park-level effect. Our results suggest that this model can provide insight into the detection process during avian surveys and reduce bias in estimates of relative abundance but is best applied to surveys of species with greater availability (e.g., breeding songbirds).
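
    The two detection components can be sketched as follows, assuming a half-normal distance detection function for perceptibility and the usual time-removal expression for availability. The detection scale, count radius, per-interval availability and number of intervals are hypothetical values; the hierarchical Bayesian machinery of the actual model (covariates, zero-inflation, spatial autocorrelation) is not reproduced.

      import numpy as np

      def perceptibility_half_normal(sigma, max_radius):
          """Average probability of detecting an available bird within max_radius,
          assuming a half-normal detection function g(r) = exp(-r^2 / (2 sigma^2))
          and uniform bird density in the plane (hence the r weighting)."""
          r = np.linspace(0.0, max_radius, 2001)
          dr = r[1] - r[0]
          g = np.exp(-r**2 / (2.0 * sigma**2))
          return float(np.sum(g * 2.0 * r) * dr) / max_radius**2

      def availability_time_removal(p_per_interval, n_intervals):
          """Probability a bird gives at least one cue during the count, given a
          per-interval availability probability (time-removal formulation)."""
          return 1.0 - (1.0 - p_per_interval) ** n_intervals

      # Illustrative values: 60 m detection scale, 100 m radius, 3 time intervals.
      p_perc = perceptibility_half_normal(sigma=60.0, max_radius=100.0)
      p_avail = availability_time_removal(p_per_interval=0.4, n_intervals=3)
      print(f"overall detection probability ~ {p_perc * p_avail:.2f}")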

  4. EMG prediction from Motor Cortical Recordings via a Non-Negative Point Process Filter

    PubMed Central

    Nazarpour, Kianoush; Ethier, Christian; Paninski, Liam; Rebesco, James M.; Miall, R. Chris; Miller, Lee E.

    2012-01-01

    A constrained point process filtering mechanism for prediction of electromyogram (EMG) signals from multi-channel neural spike recordings is proposed here. Filters from the Kalman family are inherently sub-optimal in dealing with non-Gaussian observations, or a state evolution that deviates from the Gaussianity assumption. To address these limitations, we modeled the non-Gaussian neural spike train observations by using a generalized linear model (GLM) that encapsulates covariates of neural activity, including the neurons’ own spiking history, concurrent ensemble activity, and extrinsic covariates (EMG signals). In order to predict the envelopes of EMGs, we reformulated the Kalman filter (KF) in an optimization framework and utilized a non-negativity constraint. This structure characterizes the non-linear correspondence between neural activity and EMG signals reasonably well. The EMGs were recorded from twelve forearm and hand muscles of a behaving monkey during a grip-force task. For the case of limited training data, the constrained point process filter improved the prediction accuracy when compared to a conventional Wiener cascade filter (a linear causal filter followed by a static non-linearity) for different bin sizes and delays between input spikes and EMG output. For longer training data sets, results of the proposed filter and those of the Wiener cascade filter were comparable. PMID:21659018

  5. Stochastic investigation of temperature process for climatic variability identification

    NASA Astrophysics Data System (ADS)

    Lerias, Eleutherios; Kalamioti, Anna; Dimitriadis, Panayiotis; Markonis, Yannis; Iliopoulou, Theano; Koutsoyiannis, Demetris

    2016-04-01

    The temperature process is considered as the most characteristic hydrometeorological process and has been thoroughly examined in the climate-change framework. We use a dataset comprising hourly temperature and dew point records to identify statistical variability with emphasis on the last period. Specifically, we investigate the occurrence of mean, maximum and minimum values and we estimate statistical properties such as marginal probability distribution function and the type of decay of the climacogram (i.e., mean process variance vs. scale) for various time periods. Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.
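
    The climacogram mentioned above (variance of the time-averaged process versus averaging scale) can be computed in a few lines; the sketch below uses a synthetic hourly series with a daily cycle as a stand-in for the temperature and dew point records.

      import numpy as np

      def climacogram(x, scales):
          """Variance of the scale-averaged process versus averaging scale k
          (mean process variance vs. scale, as described in the abstract)."""
          x = np.asarray(x, dtype=float)
          gamma = []
          for k in scales:
              n = len(x) // k
              block_means = x[:n * k].reshape(n, k).mean(axis=1)
              gamma.append(block_means.var(ddof=1))
          return np.array(gamma)

      # Toy hourly temperature-like series (white noise plus a daily cycle).
      t = np.arange(24 * 365)
      series = 10 + 5 * np.sin(2 * np.pi * t / 24) + np.random.randn(t.size)
      scales = [1, 2, 6, 12, 24, 24 * 7, 24 * 30]
      print(dict(zip(scales, np.round(climacogram(series, scales), 3))))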

  6. FRAMEWORK FOR ASSESSING RISKS OF ...

    EPA Pesticide Factsheets

    The Framework for Children's Health Risk Assessment report can serve as a resource on children's health risk assessment and it addresses the need to provide a comprehensive and consistent framework for considering children in risk assessments at EPA. This framework lays out the process, points to existing published sources for more detailed information on life stage-specific considerations, and includes web links to specific online publications and relevant Agency science policy papers, guidelines and guidance. The document emphasizes the need to take into account the potential exposures to environmental agents during preconception and all stages of development and focuses on the relevant adverse health outcomes that may occur as a result of such exposures. This framework is not an Agency guideline, but rather describes the overall structure and the components considered important for children's health risk assessment. The document describes an approach that includes problem formulation, analysis, and risk characterization, and also builds on Agency experience assessing risk to susceptible populations. The problem formulation step focuses on the life stage-specific nature of the analysis to include scoping and screening level questions for hazard characterization, dose response and exposure assessment. The risk characterization step recognizes the need to consider life stage-specific risks and explicitly describes the uncertainties and variability in the d

  7. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371

  8. Design and Implementation of a Basic Cross-Compiler and Virtual Memory Management System for the TI-59 Programmable Calculator.

    DTIC Science & Technology

    1983-06-01

    previously stated requirements to construct the framework for a software solution. It is during this phase of design that many of the most critical...the linker would have to be deferred until the compiler was formalized and in the implementation phase of design. The second problem involved...memory limit was encountered. At this point a segmentation occurred. The memory limits were reset and the combining process continued until another

  9. Local dynamics in decision making: The evolution of preference within and across decisions

    NASA Astrophysics Data System (ADS)

    O'Hora, Denis; Dale, Rick; Piiroinen, Petri T.; Connolly, Fionnuala

    2013-07-01

    Within decisions, perceived alternatives compete until one is preferred. Across decisions, the playing field on which these alternatives compete evolves to favor certain alternatives. Mouse cursor trajectories provide rich continuous information related to such cognitive processes during decision making. In three experiments, participants learned to choose symbols to earn points in a discrimination learning paradigm and the cursor trajectories of their responses were recorded. Decisions between two choices that earned equally high-point rewards exhibited far less competition than decisions between choices that earned equally low-point rewards. Using positional coordinates in the trajectories, it was possible to infer a potential field in which the choice locations occupied areas of minimal potential. These decision spaces evolved through the experiments, as participants learned which options to choose. This visualisation approach provides a potential framework for the analysis of local dynamics in decision-making that could help mitigate both theoretical disputes and disparate empirical results.

  10. Function representation with circle inversion map systems

    NASA Astrophysics Data System (ADS)

    Boreland, Bryson; Kunze, Herb

    2017-01-01

    The fractals literature develops the now well-known concept of local iterated function systems (using affine maps) with grey-level maps (LIFSM) as an approach to function representation in terms of the associated fixed point of the so-called fractal transform. While originally explored as a method to achieve signal (and 2-D image) compression, more recent work has explored various aspects of signal and image processing using this machinery. In this paper, we develop a similar framework for function representation using circle inversion map systems. Given a circle C with centre õ and radius r, inversion with respect to C transforms the point p̃ to the point p̃′, such that p̃ and p̃′ lie on the same radial half-line from õ and d(õ, p̃)·d(õ, p̃′) = r², where d is Euclidean distance. We demonstrate the results with an example.
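
    The inversion map itself is easy to state in code: p′ = õ + r²(p − õ)/‖p − õ‖², which satisfies d(õ, p)·d(õ, p′) = r². The sketch below is a direct transcription of that formula; the fractal-transform machinery built on top of it is not reproduced.

      import numpy as np

      def circle_inversion(p, centre, radius):
          """Invert point p with respect to the circle with the given centre and
          radius: p' lies on the same radial half-line and d(o, p) * d(o, p') = r^2."""
          p, centre = np.asarray(p, float), np.asarray(centre, float)
          v = p - centre
          return centre + (radius ** 2 / np.dot(v, v)) * v

      p = np.array([3.0, 4.0])       # distance 5 from the origin
      print(circle_inversion(p, centre=[0.0, 0.0], radius=2.0))
      # -> [0.48 0.64], at distance 0.8, and 5 * 0.8 = 4 = r^2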

  11. FPGA-Based High-Performance Embedded Systems for Adaptive Edge Computing in Cyber-Physical Systems: The ARTICo³ Framework.

    PubMed

    Rodríguez, Alfonso; Valverde, Juan; Portilla, Jorge; Otero, Andrés; Riesgo, Teresa; de la Torre, Eduardo

    2018-06-08

    Cyber-Physical Systems are experiencing a paradigm shift in which processing has been relocated to the distributed sensing layer and is no longer performed in a centralized manner. This approach, usually referred to as Edge Computing, demands the use of hardware platforms that are able to manage the steadily increasing requirements in computing performance, while keeping energy efficiency and the adaptability imposed by the interaction with the physical world. In this context, SRAM-based FPGAs and their inherent run-time reconfigurability, when coupled with smart power management strategies, are a suitable solution. However, they usually fail in user accessibility and ease of development. In this paper, an integrated framework to develop FPGA-based high-performance embedded systems for Edge Computing in Cyber-Physical Systems is presented. This framework provides a hardware-based processing architecture, an automated toolchain, and a runtime to transparently generate and manage reconfigurable systems from high-level system descriptions without additional user intervention. Moreover, it provides users with support for dynamically adapting the available computing resources to switch the working point of the architecture in a solution space defined by computing performance, energy consumption and fault tolerance. Results show that it is indeed possible to explore this solution space at run time and prove that the proposed framework is a competitive alternative to software-based edge computing platforms, being able to provide not only faster solutions, but also higher energy efficiency for computing-intensive algorithms with significant levels of data-level parallelism.

  12. Vulnerability Assessments and Resilience Planning at Federal Facilities. Preliminary Synthesis of Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moss, R. H.; Blohm, A. J.; Delgado, A.

    2015-08-15

    U.S. government agencies are now directed to assess the vulnerability of their operations and facilities to climate change and to develop adaptation plans to increase their resilience. Specific guidance on methods is still evolving based on the many different available frameworks. Agencies have been experimenting with these frameworks and approaches. This technical paper synthesizes lessons and insights from a series of research case studies conducted by the investigators at facilities of the U.S. Department of Energy and the Department of Defense. The purpose of the paper is to solicit comments and feedback from interested program managers and analysts before final conclusions are published. The paper describes the characteristics of a systematic process for prioritizing needs for adaptation planning at individual facilities and examines requirements and methods needed. It then suggests a framework of steps for vulnerability assessments at Federal facilities and elaborates on three sets of methods required for assessments, regardless of the detailed framework used. In a concluding section, the paper suggests a roadmap to further develop methods to support agencies in preparing for climate change. The case studies point to several preliminary conclusions: (1) Vulnerability assessments are needed to translate potential changes in climate exposure to estimates of impacts and evaluation of their significance for operations and mission attainment, in other words into information that is related to and useful in ongoing planning, management, and decision-making processes; (2) To increase the relevance and utility of vulnerability assessments to site personnel, the assessment process needs to emphasize the characteristics of the site infrastructure, not just climate change; (3) A multi-tiered framework that includes screening, vulnerability assessments at the most vulnerable installations, and adaptation design will efficiently target high-risk sites and infrastructure; (4) Vulnerability assessments can be connected to efforts to improve facility resilience to motivate participation; and (5) Efficient, scalable methods for vulnerability assessment can be developed, but additional case studies and evaluation are required.

  13. A cognitive perspective on health systems integration: results of a Canadian Delphi study

    PubMed Central

    2014-01-01

    Background Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs – that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. Methods We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada’s National Symposium on Integrated Care. Respondents were asked to rate the framework’s clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskall-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. Results In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. The re-named “Integration Mindsets Framework” consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas. Conclusions The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives. PMID:24885659

  14. Framework to trade optimality for local processing in large-scale wavefront reconstruction problems.

    PubMed

    Haber, Aleksandar; Verhaegen, Michel

    2016-11-15

    We show that the minimum variance wavefront estimation problems permit localized approximate solutions, in the sense that the wavefront value at a point (excluding unobservable modes, such as the piston mode) can be approximated by a linear combination of the wavefront slope measurements in the point's neighborhood. This enables us to efficiently compute a wavefront estimate by performing a single sparse matrix-vector multiplication. Moreover, our results open the possibility for the development of wavefront estimators that can be easily implemented in a decentralized/distributed manner, and in which the estimate optimality can be easily traded for computational efficiency. We numerically validate our approach on Hudgin wavefront sensor geometries, and the results can be easily generalized to Fried geometries.
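
    The practical consequence described above, a wavefront estimate obtained by one sparse matrix-vector product, can be illustrated as follows. The neighbourhood size and the uniform weights in the sketch are placeholders; the paper derives the actual (approximately minimum-variance) weights and the sensor geometry, which are not reproduced here.

      import numpy as np
      from scipy.sparse import csr_matrix

      # Hypothetical localized reconstructor: each wavefront point is a linear
      # combination of the slope measurements in a small neighbourhood, so the
      # reconstruction matrix R is sparse and the estimate is a single sparse
      # matrix-vector product (the weights here are placeholders, not the
      # minimum-variance weights derived in the paper).
      n_points, n_slopes, stencil = 1000, 2000, 9

      rows, cols, vals = [], [], []
      rng = np.random.default_rng(0)
      for i in range(n_points):
          neighbours = rng.choice(n_slopes, size=stencil, replace=False)
          rows.extend([i] * stencil)
          cols.extend(neighbours.tolist())
          vals.extend((np.ones(stencil) / stencil).tolist())

      R = csr_matrix((vals, (rows, cols)), shape=(n_points, n_slopes))
      slopes = rng.standard_normal(n_slopes)   # stacked x/y slope measurements
      wavefront_estimate = R @ slopes          # one sparse mat-vec
      print(wavefront_estimate.shape, f"nnz fraction = {R.nnz / (n_points * n_slopes):.4f}")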

  15. Constraint-based Data Mining

    NASA Astrophysics Data System (ADS)

    Boulicaut, Jean-Francois; Jeudy, Baptiste

    Knowledge Discovery in Databases (KDD) is a complex interactive process. The promising theoretical framework of inductive databases considers this to be essentially a querying process. It is enabled by a query language that can deal either with raw data or with the patterns which hold in the data. Mining patterns then becomes the so-called inductive query evaluation process, for which constraint-based Data Mining techniques have to be designed. An inductive query declaratively specifies the desired constraints, and algorithms are used to compute the patterns satisfying those constraints in the data. We survey important results of this active research domain. This chapter emphasizes a real breakthrough for hard problems concerning local pattern mining under various constraints, and it also points out current directions of research.

  16. Characterization of intermittency in renewal processes: Application to earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akimoto, Takuma; Hasumi, Tomohiro; Aizawa, Yoji

    2010-03-15

    We construct a one-dimensional piecewise linear intermittent map from the interevent time distribution for a given renewal process. Then, we characterize intermittency by the asymptotic behavior near the indifferent fixed point in the piecewise linear intermittent map. Thus, we provide a framework to understand a unified characterization of intermittency and also present the Lyapunov exponent for renewal processes. This method is applied to the occurrence of earthquakes using the Japan Meteorological Agency and the National Earthquake Information Center catalog. By analyzing the return map of interevent times, we find that interevent times are not independent and identically distributed random variables but that the conditional probability distribution functions in the tail obey the Weibull distribution.
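
    Two of the diagnostic steps mentioned, the return map of interevent times and the Weibull behaviour of the tail, can be sketched as below. The interevent times are synthetic (drawn from a Weibull distribution) rather than taken from the JMA or NEIC catalogues, and the threshold choice and tail-fitting procedure are simple illustrative stand-ins.

      import numpy as np
      from scipy import stats

      # Hypothetical interevent times (days); with a real catalogue these would be
      # the differences of successive earthquake occurrence times.
      rng = np.random.default_rng(1)
      interevent = stats.weibull_min.rvs(0.7, scale=30.0, size=5000, random_state=rng)

      # Return map of interevent times: tau_{k+1} versus tau_k. Independence would
      # make the two coordinates uncorrelated.
      tau_k, tau_k1 = interevent[:-1], interevent[1:]
      rho = np.corrcoef(np.log(tau_k), np.log(tau_k1))[0, 1]

      # Fit a Weibull distribution to the tail (interevent times above a threshold).
      threshold = np.quantile(interevent, 0.5)
      shape, loc, scale = stats.weibull_min.fit(interevent[interevent > threshold], floc=0.0)
      print(f"return-map correlation: {rho:.3f}, tail Weibull shape: {shape:.2f}")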

  17. A computational framework for automation of point defect calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.

  18. A computational framework for automation of point defect calculations

    DOE PAGES

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; ...

    2017-01-13

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.

  19. Monitoring of services with non-relational databases and map-reduce framework

    NASA Astrophysics Data System (ADS)

    Babik, M.; Souto, F.

    2012-12-01

    Service Availability Monitoring (SAM) is a well-established monitoring framework that performs regular measurements of the core site services and reports the corresponding availability and reliability of the Worldwide LHC Computing Grid (WLCG) infrastructure. One of the existing extensions of SAM is Site Wide Area Testing (SWAT), which gathers monitoring information from the worker nodes via instrumented jobs. This generates quite a lot of monitoring data to process, as there are several data points for every job and several million jobs are executed every day. The recent uptake of non-relational databases opens a new paradigm in the large-scale storage and distributed processing of systems with heavy read-write workloads. For SAM this brings new possibilities to improve its model, from performing aggregation of measurements to storing raw data and subsequent re-processing. Both SAM and SWAT are currently tuned to run at top performance, reaching some of the limits in storage and processing power of their existing Oracle relational database. We investigated the usability and performance of non-relational storage together with its distributed data processing capabilities. For this, several popular systems have been compared. In this contribution we describe our investigation of the existing non-relational databases suited for monitoring systems covering Cassandra, HBase and MongoDB. Further, we present our experiences in data modeling and prototyping map-reduce algorithms focusing on the extension of the already existing availability and reliability computations. Finally, possible future directions in this area are discussed, analyzing the current deficiencies of the existing Grid monitoring systems and proposing solutions to leverage the benefits of the non-relational databases to get more scalable and flexible frameworks.
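
    The aggregation pattern described, turning large numbers of raw test results into per-site availability, can be illustrated with a toy map-reduce written in plain Python. The record format, site names and statuses are hypothetical; in the real system the map and reduce phases would run inside the non-relational store over millions of SAM/SWAT results.

      from collections import defaultdict
      from itertools import chain

      # Hypothetical raw monitoring records: (site, service, status), with status
      # either "OK" or "CRITICAL".
      records = [
          ("CERN-PROD", "SRM", "OK"), ("CERN-PROD", "CE", "OK"),
          ("CERN-PROD", "SRM", "CRITICAL"), ("FZK-LCG2", "CE", "OK"),
          ("FZK-LCG2", "SRM", "OK"),
      ]

      def map_phase(record):
          site, _service, status = record
          yield site, (1 if status == "OK" else 0, 1)   # (ok_count, total_count)

      def reduce_phase(pairs):
          acc = defaultdict(lambda: [0, 0])
          for site, (ok, total) in pairs:
              acc[site][0] += ok
              acc[site][1] += total
          return {site: ok / total for site, (ok, total) in acc.items()}

      availability = reduce_phase(chain.from_iterable(map(map_phase, records)))
      print(availability)   # CERN-PROD ~ 0.67, FZK-LCG2 = 1.0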

  20. Spatially distributed modelling of pesticide leaching at European scale with the PyCatch modelling framework

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; van der Perk, Marcel; Karssenberg, Derek; Häring, Tim; Jene, Bernhard

    2017-04-01

    The modelling of pesticide transport through the soil and the estimation of its leaching to groundwater are essential for an appropriate environmental risk assessment. Pesticide leaching models commonly used in regulatory processes often lack the capability of providing a comprehensive spatial view, as they are implemented as non-spatial point models or only use a few combinations of representative soils to simulate specific plots. Furthermore, their handling of spatial input and output data and interaction with available Geographical Information Systems tools is limited. Therefore, executing several scenarios to simulate and assess potential leaching at national or continental scale at high resolution is rather inefficient and prohibits the straightforward identification of areas prone to leaching. We present a new pesticide leaching model component of the PyCatch framework developed in PCRaster Python, an environmental modelling framework tailored to the development of spatio-temporal models (http://www.pcraster.eu). To ensure a feasible computational runtime of large-scale models, we implemented an elementary field capacity approach to model soil water. Currently implemented processes are evapotranspiration, advection, dispersion, sorption, degradation and metabolite transformation. Relevant additional processes that are not yet implemented, such as surface runoff, snowmelt, erosion or other lateral flows, can be integrated with components already implemented in PyCatch. A preliminary version of the model executes a 20-year simulation of soil water processes for Germany (20 soil layers, 1 km2 spatial resolution, and daily timestep) within half a day using a single CPU. A comparison of the soil moisture and outflow obtained from the PCRaster implementation and PELMO, a commonly used pesticide leaching model, resulted in an R2 of 0.98 for the FOCUS Hamburg scenario. We will further discuss the validation of the pesticide transport processes and show case studies applied to European countries.
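
    As an illustration of what an elementary field capacity approach to soil water can look like, the sketch below implements a single-cell daily bucket: storage is filled by precipitation, depleted by evapotranspiration, and any excess above field capacity drains downward. The function name, parameter values and units are assumptions made for this example and are not taken from the PyCatch/PCRaster implementation.

      # Minimal single-cell field-capacity bucket for one daily timestep (illustrative only).
      def daily_soil_water(storage, precip, et_pot, field_capacity):
          """Return (new_storage, actual_et, drainage) in mm for one day."""
          storage = storage + precip
          actual_et = min(et_pot, storage)               # ET limited by available water
          storage -= actual_et
          drainage = max(0.0, storage - field_capacity)  # excess above field capacity percolates
          storage -= drainage
          return storage, actual_et, drainage

      s = 150.0  # mm of stored soil water
      for precip, et in [(5.0, 2.5), (0.0, 3.0), (20.0, 1.0)]:
          s, aet, drain = daily_soil_water(s, precip, et, field_capacity=160.0)
          print(round(s, 1), round(aet, 1), round(drain, 1))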

  1. Within-subject template estimation for unbiased longitudinal image analysis.

    PubMed

    Reuter, Martin; Schmansky, Nicholas J; Rosas, H Diana; Fischl, Bruce

    2012-07-16

    Longitudinal image analysis has become increasingly important in clinical studies of normal aging and neurodegenerative disorders. Furthermore, there is a growing appreciation of the potential utility of longitudinally acquired structural images and reliable image processing to evaluate disease modifying therapies. Challenges have been related to the variability that is inherent in the available cross-sectional processing tools, to the introduction of bias in longitudinal processing and to potential over-regularization. In this paper we introduce a novel longitudinal image processing framework, based on unbiased, robust, within-subject template creation, for automatic surface reconstruction and segmentation of brain MRI of arbitrarily many time points. We demonstrate that it is essential to treat all input images exactly the same as removing only interpolation asymmetries is not sufficient to remove processing bias. We successfully reduce variability and avoid over-regularization by initializing the processing in each time point with common information from the subject template. The presented results show a significant increase in precision and discrimination power while preserving the ability to detect large anatomical deviations; as such they hold great potential in clinical applications, e.g. allowing for smaller sample sizes or shorter trials to establish disease specific biomarkers or to quantify drug effects. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Integrated framework for developing search and discrimination metrics

    NASA Astrophysics Data System (ADS)

    Copeland, Anthony C.; Trivedi, Mohan M.

    1997-06-01

    This paper presents an experimental framework for evaluating target signature metrics as models of human visual search and discrimination. This framework is based on a prototype eye tracking testbed, the Integrated Testbed for Eye Movement Studies (ITEMS). ITEMS determines an observer's visual fixation point while he studies a displayed image scene, by processing video of the observer's eye. The utility of this framework is illustrated with an experiment using gray-scale images of outdoor scenes that contain randomly placed targets. Each target is a square region of a specific size containing pixel values from another image of an outdoor scene. The real-world analogy of this experiment is that of a military observer looking upon the sensed image of a static scene to find camouflaged enemy targets that are reported to be in the area. ITEMS provides the data necessary to compute various statistics for each target to describe how easily the observers located it, including the likelihood the target was fixated or identified and the time required to do so. The computed values of several target signature metrics are compared to these statistics, and a second-order metric based on a model of image texture was found to be the most highly correlated.

  3. Machine Learning-based discovery of closures for reduced models of dynamical systems

    NASA Astrophysics Data System (ADS)

    Pan, Shaowu; Duraisamy, Karthik

    2017-11-01

    Despite the successful application of machine learning (ML) in fields such as image processing and speech recognition, only a few attempts have been made to employ ML to represent the dynamics of complex physical systems. Previous attempts have mostly focused on parameter calibration or data-driven augmentation of existing models. In this work we present an ML framework to discover closure terms in reduced models of dynamical systems and provide insights into potential problems associated with data-driven modeling. Based on exact closure models for linear systems, we propose a general linear closure framework from the viewpoint of optimization. The framework is based on a trapezoidal approximation of the convolution term. Hyperparameters that need to be determined include the temporal length of the memory effect, the number of sampling points, and the dimensions of the hidden states. To circumvent the explicit specification of the memory effect, a general framework inspired by neural networks is also proposed. We conduct both a priori and a posteriori evaluations of the resulting model on a number of non-linear dynamical systems. This work was supported in part by AFOSR under the project ``LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
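
    A rough sketch of the kind of discretized memory term implied by a trapezoidal approximation of the convolution is given below: the closure at time t is a weighted sum of kernel samples against the resolved-variable history, with the half weights of the trapezoid rule at the end points. The kernel shape, time step and memory length are invented hyperparameters for the example, not values from the paper.

      # Trapezoidal-rule discretization of a convolution (memory) closure term; illustrative only.
      import numpy as np

      def closure_term(x_hist, kernel, dt):
          """Approximate integral_0^T K(s) x(t - s) ds with the trapezoid rule.

          x_hist : array of resolved-variable history [x(t), x(t-dt), ..., x(t-T)]
          kernel : array of kernel samples [K(0), K(dt), ..., K(T)] (same length)
          """
          w = np.ones_like(kernel)
          w[0] = w[-1] = 0.5                    # trapezoidal end-point weights
          return dt * np.sum(w * kernel * x_hist)

      dt, M = 0.01, 50                          # memory length T = M*dt (a hyperparameter)
      s = np.arange(M + 1) * dt
      K = np.exp(-5.0 * s)                      # assumed exponentially decaying kernel
      x_hist = np.cos(2.0 * np.pi * (1.0 - s))  # history of the resolved variable
      print(closure_term(x_hist, K, dt))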

  4. Translational research in addiction: toward a framework for the development of novel therapeutics.

    PubMed

    Paterson, Neil E

    2011-06-15

    The development of novel substance use disorder (SUD) therapeutics is insufficient to meet the medical needs of a growing SUD patient population. The identification of translatable SUD models and tests is a crucial step in establishing a framework for SUD therapeutic development programs. The present review begins by identifying the clinical features of SUDs and highlights the narrow regulatory end-point required for approval of a novel SUD therapeutic. A conceptual overview of dependence is provided, followed by identification of potential intervention targets in the addiction cycle. The main components of the addiction cycle provide the framework for a discussion of preclinical models and their clinical analogs, all of which are focused on isolated behavioral end-points thought to be relevant to the persistence of compulsive drug use. Thus, the greatest obstacle to successful development is the gap between the multiplicity of preclinical and early clinical end-points and the regulatory end-point of sustained abstinence. This review proposes two pathways to bridging this gap: further development and validation of the preclinical extended access self-administration model; inclusion of secondary end-points comprising all of the measures highlighted in the present discussion in Phase 3 trials. Further, completion of the postdictive validation of analogous preclinical and clinical assays is of high priority. Ultimately, demonstration of the relevance and validity of a variety of end-points to the ultimate goal of abstinence will allow researchers to identify truly relevant therapeutic mechanisms and intervention targets, and establish a framework for SUD therapeutic development that allows optimal decision-making and resource allocation. 2011 Elsevier Inc. All rights reserved.

  5. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. Point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue of any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by the HIS-DF and supported in HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.

  6. [Relational Frame Theory--A Theoretical Framework for Contextual Behavioral Science].

    PubMed

    Kensche, M; Schweiger, U

    2015-07-01

    Therapists have to deal with verbal systems and often work with verbal exchange. Therefore, a psychological theory is required, which teaches the therapist how to accomplish this task. The BRT is a theory of human language and cognition that explains how people use their verbal behavior as stimuli in their interrelations and how they act and react, based on the resulting relationships. This behavior is learned very early in the course of language acquisition and functions as a generalized operant. A prerequisite for this is the ability of people to undergo mental simulation. This enables them to construct diverse relational frameworks between individual stimuli. Without relational frameworks, people cannot function. The ability to establish a relational framework is a prerequisite for the formation of rule-governed behavior. Rule-governed behavior economizes complex decision processes, creates interpersonal security and enables dealing with events before they take place. On the other hand, the same properties that enable people to solve problems effectively can also contribute to rigid adherence to rules and experience avoidance. Relational frameworks, once established, outweigh other sources of behavioral regulation. Thus, it can become the basis of psychopathology. Poor contextual control makes it difficult for people to devote flexible, focused and voluntary attention to the present and align their actions with the immediate present. Contextual psychotherapy methods that are based on the BRT start precisely at this point: Targeted establishment of new contingencies in the therapeutic interaction through systematic strengthening of metacognitive mode and through the establishment of new rules that make possible a change in the rule-governed behavior enable undermining of dysfunctional rule-governed behavior and build up desirable behavior. This allows any therapeutic process to be more effective--regardless of the patient's expressed symptoms. © Georg Thieme Verlag KG Stuttgart · New York.

  7. [Relational frame theory - a theoretical framework for contextual behavioral science].

    PubMed

    Kensche, M; Schweiger, U

    2015-05-01

    Therapists have to deal with verbal systems and often work with verbal exchange. Therefore, a psychological theory is required, which teaches the therapist how to accomplish this task. The BRT is a theory of human language and cognition that explains how people use their verbal behavior as stimuli in their interrelations and how they act and react, based on the resulting relationships. This behavior is learned very early in the course of language acquisition and functions as a generalized operant. A prerequisite for this is the ability of people to undergo mental simulation. This enables them to construct diverse relational frameworks between individual stimuli. Without relational frameworks, people cannot function. The ability to establish a relational framework is a prerequisite for the formation of rule-governed behavior. Rule-governed behavior economizes complex decision processes, creates interpersonal security and enables dealing with events before they take place. On the other hand, the same properties that enable people to solve problems effectively can also contribute to rigid adherence to rules and experience avoidance. Relational frameworks, once established, outweigh other sources of behavioral regulation. Thus, it can become the basis of psychopathology. Poor contextual control makes it difficult for people to devote flexible, focused and voluntary attention to the present and align their actions with the immediate present. Contextual psychotherapy methods that are based on the BRT start precisely at this point: Targeted establishment of new contingencies in the therapeutic interaction through systematic strengthening of metacognitive mode and through the establishment of new rules that make possible a change in the rule-governed behavior enable undermining of dysfunctional rule-governed behavior and build up desirable behavior. This allows any therapeutic process to be more effective - regardless of the patient's expressed symptoms. © Georg Thieme Verlag KG Stuttgart · New York.

  8. Temporal cognition: Connecting subjective time to perception, attention, and memory.

    PubMed

    Matthews, William J; Meck, Warren H

    2016-08-01

    Time is a universal psychological dimension, but time perception has often been studied and discussed in relative isolation. Increasingly, researchers are searching for unifying principles and integrated models that link time perception to other domains. In this review, we survey the links between temporal cognition and other psychological processes. Specifically, we describe how subjective duration is affected by nontemporal stimulus properties (perception), the allocation of processing resources (attention), and past experience with the stimulus (memory). We show that many of these connections instantiate a "processing principle," according to which perceived time is positively related to perceptual vividity and the ease of extracting information from the stimulus. This empirical generalization generates testable predictions and provides a starting-point for integrated theoretical frameworks. By outlining some of the links between temporal cognition and other domains, and by providing a unifying principle for understanding these effects, we hope to encourage time-perception researchers to situate their work within broader theoretical frameworks, and that researchers from other fields will be inspired to apply their insights, techniques, and theorizing to improve our understanding of the representation and judgment of time. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elia, Valerio; Gnoni, Maria Grazia, E-mail: mariagrazia.gnoni@unisalento.it; Tornese, Fabiana

    Highlights: • Pay-As-You-Throw (PAYT) schemes are becoming widespread in several countries. • Economic, organizational and technological issues have to be integrated in an efficient PAYT model design. • Efficiency refers to a PAYT system which supports high citizen participation rates as well as economic sustainability. • Different steps and constraints have to be evaluated from collection services to type technologies. • A holistic approach is discussed to support PAYT systems diffusion. - Abstract: Pay-As-You-Throw (PAYT) strategies are becoming widely applied in solid waste management systems; the main purpose is to support a more sustainable – from economic, environmental and social points of view – management of waste flows. Adopting PAYT charging models increases the complexity level of the waste management service as new organizational issues have to be evaluated compared to flat charging models. In addition, innovative technological solutions could also be adopted to increase the overall efficiency of the service. Unit pricing, user identification and waste measurement represent the three most important processes to be defined in a PAYT system. The paper proposes a holistic framework to support an effective design and management process. The framework defines the most critical processes and effective organizational and technological solutions for supporting waste managers as well as researchers.

  10. Understanding Science: Frameworks for using stories to facilitate systems thinking

    NASA Astrophysics Data System (ADS)

    ElShafie, S. J.; Bean, J. R.

    2017-12-01

    Studies indicate that using a narrative structure for teaching and learning helps audiences to process and recall new information. Stories also help audiences retain specific information, such as character names or plot points, in the context of a broader narrative. Stories can therefore facilitate high-context systems learning in addition to low-context declarative learning. Here we incorporate a framework for science storytelling, which we use in communication workshops, with the Understanding Science framework developed by the UC Museum of Paleontology (UCMP) to explore the application of storytelling to systems thinking. We translate portions of the Understanding Science flowchart into narrative terms. Placed side by side, the two charts illustrate the parallels between the scientific process and the story development process. They offer a roadmap for developing stories about scientific studies and concepts. We also created a series of worksheets for use with the flowcharts. These new tools can generate stories from any perspective, including a scientist conducting a study; a character that plays a role in a larger system (e.g., foraminifera or a carbon atom); an entire system that interacts with other systems (e.g., the carbon cycle). We will discuss exemplar stories about climate change from each of these perspectives, which we are developing for workshops using content and storyboard models from the new UCMP website Understanding Global Change. This conceptual framework and toolkit will help instructors to develop stories about scientific concepts for use in a classroom setting. It will also help students to analyze stories presented in class, and to create their own stories about new concepts. This approach facilitates student metacognition of the learning process, and can also be used as a form of evaluation. We are testing this flowchart and its use in systems teaching with focus groups, in preparation for use in teacher professional development workshops.

  11. The development of a conceptually based nursing curriculum: an international experiment.

    PubMed

    Meleis, A I

    1979-11-01

    Nursing programmes in the United States of America are based on a conceptual framework. Not only do faculty and students subscribe to the necessity of such programmes but the national accreditation agency also provides its accreditation approval for the institution only after all criteria are met, including the requirement of a well-defined, operationalized and implemented framework. Can a nursing programme be developed in other nations utilizing the esoteric, American-based idea of the necessity for a conceptually based curriculum? The author answers this question. The manuscript presents the process utilized in selecting a conceptual framework for a new junior college programme in Kuwait and discusses the selected framework. The idea of a conceptual framework to guide the curriculum was as foreign in Kuwait as it was to nursing curricula in the United States 15 years ago. Though initially rejected by the faculty in Kuwait, the idea of a conceptual framework was reintroduced after much faculty discussion and questions related to nursing knowledge vis-a-vis medical knowledge, and what should be included in and excluded from the programme. By the end of the second year, a definite framework had been operationalized into courses and content. The selection of the framework evolved from faculty participation in the operationalization of the framework. This point is quite significant particularly in an international assignment, as it is the faculty who are left with the monumental task of supporting and continuing the work which has been done. Strategies used to develop and implement a conceptual framework included confrontation of faculty with the existing situation, lectures, seminars, workshops, and the identification of a critical review board.

  12. Validation of a clinical leadership qualities framework for managers in aged care: a Delphi study.

    PubMed

    Jeon, Yun-Hee; Conway, Jane; Chenoweth, Lynn; Weise, Janelle; Thomas, Tamsin Ht; Williams, Anna

    2015-04-01

    To establish the validity of a clinical leadership framework for aged care middle managers (The Aged care Clinical Leadership Qualities Framework). Middle managers in aged care have responsibility not only for organisational governance and operational management but also for quality service delivery. There is a need to better define clinical leadership abilities in aged care middle managers, in order to optimise their positional authority to lead others to achieve quality outcomes. A Delphi method. Sixty-nine experts in aged care were recruited, representing rural, remote and metropolitan community and residential aged care settings. Panellists were asked to rate the proposed framework in terms of the relevance and importance of each leadership quality using four-point Likert scales, and to provide comments. Three rounds of consultation were conducted. The number and corresponding percentage of the relevance and importance ratings for each quality were calculated for each consultation round, as well as mean scores. Consensus was determined to be reached when a percentage score reached 70% or greater. Twenty-three panellists completed all three rounds of consultation. Following the three rounds of consultation, the acceptability and face validity of the framework were confirmed. The study confirmed the framework as useful in identifying leadership requirements for middle managers in Australian aged care settings. The framework is the first validated framework of clinical leadership attributes for middle managers in aged care and offers an initial step forward in clarifying the aged care middle manager role. The framework provides clarity in the breadth of role expectations for the middle managers and can be used to inform the development of aged care-specific leadership programs, individuals' and organisations' performance and development processes, and policy and guidelines about the types of activities required of middle managers in aged care. © 2014 John Wiley & Sons Ltd.

  13. Role of Sink Density in Nonequilibrium Chemical Redistribution in Alloys

    DOE PAGES

    Martinez, Enrique Saez; Senninger, Oriane; Caro, Alfredo; ...

    2018-03-08

    Nonequilibrium chemical redistribution in open systems submitted to external forces, such as particle irradiation, leads to changes in the structural properties of the material, potentially driving the system to failure. Such redistribution is controlled by the complex interplay between the production of point defects, atomic transport rates, and the sink character of the microstructure. In this work, we analyze this interplay by means of a kinetic Monte Carlo (KMC) framework with an underlying atomistic model for the Fe-Cr model alloy to study the effect of ideal defect sinks on Cr concentration profiles, with a particular focus on the role of interface density. We observe that the amount of segregation decreases linearly with decreasing interface spacing. Within the framework of the thermodynamics of irreversible processes, a general analytical model is derived and assessed against the KMC simulations to elucidate the structure-property relationship of this system. Interestingly, in the kinetic regime where elimination of point defects at sinks is dominant over bulk recombination, the solute segregation does not directly depend on the dose rate but only on the density of sinks. Furthermore, this model provides new insight into the design of microstructures that mitigate chemical redistribution and improve radiation tolerance.

  14. Role of Sink Density in Nonequilibrium Chemical Redistribution in Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez, Enrique Saez; Senninger, Oriane; Caro, Alfredo

    Nonequilibrium chemical redistribution in open systems submitted to external forces, such as particle irradiation, leads to changes in the structural properties of the material, potentially driving the system to failure. Such redistribution is controlled by the complex interplay between the production of point defects, atomic transport rates, and the sink character of the microstructure. In this work, we analyze this interplay by means of a kinetic Monte Carlo (KMC) framework with an underlying atomistic model for the Fe-Cr model alloy to study the effect of ideal defect sinks on Cr concentration profiles, with a particular focus on the role of interface density. We observe that the amount of segregation decreases linearly with decreasing interface spacing. Within the framework of the thermodynamics of irreversible processes, a general analytical model is derived and assessed against the KMC simulations to elucidate the structure-property relationship of this system. Interestingly, in the kinetic regime where elimination of point defects at sinks is dominant over bulk recombination, the solute segregation does not directly depend on the dose rate but only on the density of sinks. Furthermore, this model provides new insight into the design of microstructures that mitigate chemical redistribution and improve radiation tolerance.

  15. Role of Sink Density in Nonequilibrium Chemical Redistribution in Alloys

    NASA Astrophysics Data System (ADS)

    Martínez, Enrique; Senninger, Oriane; Caro, Alfredo; Soisson, Frédéric; Nastar, Maylise; Uberuaga, Blas P.

    2018-03-01

    Nonequilibrium chemical redistribution in open systems submitted to external forces, such as particle irradiation, leads to changes in the structural properties of the material, potentially driving the system to failure. Such redistribution is controlled by the complex interplay between the production of point defects, atomic transport rates, and the sink character of the microstructure. In this work, we analyze this interplay by means of a kinetic Monte Carlo (KMC) framework with an underlying atomistic model for the Fe-Cr model alloy to study the effect of ideal defect sinks on Cr concentration profiles, with a particular focus on the role of interface density. We observe that the amount of segregation decreases linearly with decreasing interface spacing. Within the framework of the thermodynamics of irreversible processes, a general analytical model is derived and assessed against the KMC simulations to elucidate the structure-property relationship of this system. Interestingly, in the kinetic regime where elimination of point defects at sinks is dominant over bulk recombination, the solute segregation does not directly depend on the dose rate but only on the density of sinks. This model provides new insight into the design of microstructures that mitigate chemical redistribution and improve radiation tolerance.
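
    For readers unfamiliar with the simulation machinery, the sketch below shows a generic residence-time (BKL-style) kinetic Monte Carlo step of the sort such frameworks build on: an event is drawn with probability proportional to its rate and the clock advances by an exponentially distributed waiting time. The event names and rates are placeholders, not parameters of the Fe-Cr model used in the paper.

      # Generic residence-time (BKL) kinetic Monte Carlo step; event names and rates are hypothetical.
      import random, math

      def kmc_step(rates):
          """rates: dict event -> rate (1/s). Returns (chosen_event, time_increment)."""
          total = sum(rates.values())
          r = random.random() * total
          acc = 0.0
          for event, rate in rates.items():
              acc += rate
              if r <= acc:
                  chosen = event
                  break
          dt = -math.log(random.random()) / total   # exponentially distributed waiting time
          return chosen, dt

      rates = {"vacancy_hop": 1.0e6, "interstitial_hop": 5.0e7, "absorb_at_sink": 2.0e5}
      event, dt = kmc_step(rates)
      print(event, dt)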

  16. A cooperative game-theoretic framework for negotiating marine spatial allocation agreements among heterogeneous players.

    PubMed

    Kyriazi, Zacharoula; Lejano, Raul; Maes, Frank; Degraer, Steven

    2017-02-01

    Marine spatial allocation has become, in recent decades, a political flashpoint, fuelled by political power struggles, as well as the continuously increasing demand for marine space by both traditional and emerging marine uses. To effectively address this issue, we develop a decision-making procedure that facilitates the distribution of disputed areas of specific size among heterogeneous players in a transparent and ethical way, while considering coalitional formations through coexistence. To do this, we model players' alternative strategies and payoffs within a cooperative game-theoretic framework. Depending on whether transferable utility (TU) or non-transferable utility (NTU) is the more appropriate assumption, we illustrate the use of the TU Shapley value and Lejano's fixed point NTU Shapley value to solve for the ideal allocations. The applicability and effectiveness of the process have been tested in a case study area, the Dogger Bank Special Area of Conservation in the North Sea, which involves three totally or partially conflicting activities, i.e. fishing, nature conservation and wind farm development. The findings demonstrate that the process is capable of providing a unique, fair and equitable division of space. Finally, among the two solution concepts proposed, the fixed point NTU Shapley value manages to better address the heterogeneity of the players and thus to provide a more socially acceptable allocation that favours the weaker player, while demonstrating the importance of the monetary valuation attributed by each use to the area. Copyright © 2016 Elsevier Ltd. All rights reserved.
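
    As a reminder of the TU solution concept used here, the sketch below computes the Shapley value as the average marginal contribution of each player over all join orders, for three hypothetical players standing in for fishing, conservation and wind farming. The characteristic-function values are invented for illustration and are not taken from the Dogger Bank case study.

      # TU Shapley value via average marginal contributions; coalition worths are invented.
      from itertools import permutations

      def shapley(players, v):
          """v maps frozensets of players to coalition worth."""
          phi = {p: 0.0 for p in players}
          perms = list(permutations(players))
          for order in perms:
              coalition = frozenset()
              for p in order:
                  phi[p] += v[coalition | {p}] - v[coalition]
                  coalition = coalition | {p}
          return {p: phi[p] / len(perms) for p in players}

      players = ["fishing", "conservation", "wind"]
      v = {frozenset(): 0, frozenset({"fishing"}): 4, frozenset({"conservation"}): 3,
           frozenset({"wind"}): 5, frozenset({"fishing", "conservation"}): 8,
           frozenset({"fishing", "wind"}): 9, frozenset({"conservation", "wind"}): 9,
           frozenset({"fishing", "conservation", "wind"}): 14}
      print(shapley(players, v))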

  17. DNA as information: at the crossroads between biology, mathematics, physics and chemistry.

    PubMed

    Cartwright, Julyan H E; Giannerini, Simone; González, Diego L

    2016-03-13

    On the one hand, biology, chemistry and also physics tell us how the process of translating the genetic information into life could possibly work, but we are still very far from a complete understanding of this process. On the other hand, mathematics and statistics give us methods to describe such natural systems-or parts of them-within a theoretical framework. Also, they provide us with hints and predictions that can be tested at the experimental level. Furthermore, there are peculiar aspects of the management of genetic information that are intimately related to information theory and communication theory. This theme issue is aimed at fostering the discussion on the problem of genetic coding and information through the presentation of different innovative points of view. The aim of the editors is to stimulate discussions and scientific exchange that will lead to new research on why and how life can exist from the point of view of the coding and decoding of genetic information. The present introduction represents the point of view of the editors on the main aspects that could be the subject of future scientific debate. © 2016 The Author(s).

  18. An Experience-Based Learning Framework: Activities for the Initial Development of Sustainability Competencies

    ERIC Educational Resources Information Center

    Caniglia, Guido; John, Beatrice; Kohler, Martin; Bellina, Leonie; Wiek, Arnim; Rojas, Christopher; Laubichler, Manfred D.; Lang, Daniel

    2016-01-01

    Purpose: This paper aims to present an experience-based learning framework that provides a bottom-up, student-centered entrance point for the development of systems thinking, normative and collaborative competencies in sustainability. Design/methodology/approach: The framework combines mental mapping with exploratory walking. It interweaves…

  19. IWRM: What should we teach? A report on curriculum development at the University of the Western Cape, South Africa

    NASA Astrophysics Data System (ADS)

    Jonker, Lewis

    In South Africa, the national government has taken deliberate steps to ensure that tertiary education programmes help to meet societal and economic needs. This article reports on the process of developing a programme in Integrated Water Resources Management at the University of the Western Cape (UWC) that speaks directly to current government policy. It describes two different approaches to curriculum development an eclectic approach that takes as its starting point courses already on offer, and a framework development approach that takes as its starting point identification of particular needs and deliberately builds a curriculum around these needs. The article illustrates how seemingly unrelated policy processes in education and water management could impact on curriculum development. The article suggests that, while curriculum development is a first key step, challenges remain in fine-tuning the IWRM M.Sc. programme so that graduates are equipped with skills that may contribute to equitable and sustainable development in the evolving context of 21st century South Africa.

  20. A marked point process approach for identifying neural correlates of tics in Tourette Syndrome.

    PubMed

    Loza, Carlos A; Shute, Jonathan B; Principe, Jose C; Okun, Michael S; Gunduz, Aysegul

    2017-07-01

    We propose a novel interpretation of local field potentials (LFP) based on a marked point process (MPP) framework that models relevant neuromodulations as shifted, weighted versions of prototypical temporal patterns. Particularly, the MPP samples are categorized according to the well-known oscillatory rhythms of the brain in an effort to elucidate spectrally specific behavioral correlates. The result is a transient model for LFP. We exploit data-driven techniques to fully estimate the model parameters with the added feature of exceptional temporal resolution of the resulting events. We utilize the learned features in the alpha and beta bands to assess correlations to tic events in patients with Tourette Syndrome (TS). The final results show stronger coupling between LFP recorded from the centromedian-parafascicular complex of the thalamus and the tic marks, in comparison to electrocorticogram (ECoG) recordings from the hand area of the primary motor cortex (M1), in terms of the area under the receiver operating characteristic (ROC) curve (AUC).
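
    The transient model can be pictured as reconstructing the recording from a list of marked events, each specifying an onset, a weight (mark) and a prototype index. The toy sketch below builds such a signal from synthetic prototypes; the sampling rate, patterns and event list are invented and are not the estimated model of the paper.

      # Toy reconstruction of a signal as shifted, weighted copies of prototypical patterns.
      import numpy as np

      fs = 1000                                   # Hz (assumed)
      t = np.arange(200) / fs
      prototypes = [np.hanning(40) * np.sin(2 * np.pi * 10 * np.arange(40) / fs),   # "alpha-like"
                    np.hanning(25) * np.sin(2 * np.pi * 20 * np.arange(25) / fs)]   # "beta-like"

      # events: (onset sample, weight/mark, prototype index)
      events = [(20, 1.5, 0), (90, 0.8, 1), (140, 1.2, 0)]

      lfp = np.zeros_like(t)
      for onset, weight, k in events:
          patt = prototypes[k]
          lfp[onset:onset + len(patt)] += weight * patt

      print(lfp.shape, lfp.max())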

  1. Planetary Crater Detection and Registration Using Marked Point Processes, Multiple Birth and Death Algorithms, and Region-Based Analysis

    NASA Technical Reports Server (NTRS)

    Solarna, David; Moser, Gabriele; Le Moigne-Stewart, Jacqueline; Serpico, Sebastiano B.

    2017-01-01

    Because of the large variety of sensors and spacecraft collecting data, planetary science needs to integrate various multi-sensor and multi-temporal images. These multiple data represent a precious asset, as they allow the study of targets spectral responses and of changes in the surface structure; because of their variety, they also require accurate and robust registration. A new crater detection algorithm, used to extract features that will be integrated in an image registration framework, is presented. A marked point process-based method has been developed to model the spatial distribution of elliptical objects (i.e. the craters) and a birth-death Markov chain Monte Carlo method, coupled with a region-based scheme aiming at computational efficiency, is used to find the optimal configuration fitting the image. The extracted features are exploited, together with a newly defined fitness function based on a modified Hausdorff distance, by an image registration algorithm whose architecture has been designed to minimize the computational time.
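
    The registration step relies on a fitness function built on a modified Hausdorff distance between feature sets; a minimal version of that distance (in the sense of Dubuisson and Jain) for two sets of detected crater centres is sketched below. The coordinates are invented, and the fitness function of the paper adds modifications not shown here.

      # Modified Hausdorff distance between two 2-D point sets; illustrative only.
      import numpy as np

      def modified_hausdorff(A, B):
          """A, B: arrays of shape (n, 2) and (m, 2)."""
          d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
          d_ab = d.min(axis=1).mean()   # mean over A of the distance to the nearest point in B
          d_ba = d.min(axis=0).mean()   # mean over B of the distance to the nearest point in A
          return max(d_ab, d_ba)

      craters_ref = np.array([[10.0, 12.0], [40.0, 55.0], [70.0, 20.0]])
      craters_new = np.array([[11.0, 13.0], [42.0, 53.0]])
      print(modified_hausdorff(craters_ref, craters_new))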

  2. Robust cubature Kalman filter for GNSS/INS with missing observations and colored measurement noise.

    PubMed

    Cui, Bingbo; Chen, Xiyuan; Tang, Xihua; Huang, Haoqian; Liu, Xiao

    2018-01-01

    In order to improve the accuracy of GNSS/INS working in a GNSS-denied environment, a robust cubature Kalman filter (RCKF) is developed by considering colored measurement noise and missing observations. First, an improved cubature Kalman filter (CKF) is derived by considering colored measurement noise, where the time-differencing approach is applied to yield new observations. Then, after analyzing the disadvantages of existing methods, the measurement augmentation used to process colored noise is translated into processing the uncertainties of the CKF, and a new sigma point update framework is utilized to account for the bounded model uncertainties. By reusing the diffused sigma points and the approximation residual in the prediction stage of the CKF, the RCKF is developed and its error performance is analyzed theoretically. Results of numerical experiments and a field test reveal that the RCKF is more robust than the CKF and the extended Kalman filter (EKF), and compared with the EKF, the heading error of the land vehicle is reduced by about 72.4%. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
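
    For context, the prediction step of a standard cubature Kalman filter propagates 2n equally weighted cubature points placed at plus/minus sqrt(n) along the principal axes of the covariance; a minimal version is sketched below with a toy process model and with process noise omitted. It is not the RCKF of the paper, which additionally handles colored noise and bounded uncertainties.

      # Standard CKF prediction step with 2n cubature points; toy dynamics, process noise omitted.
      import numpy as np

      def ckf_predict(x, P, f):
          n = len(x)
          S = np.linalg.cholesky(P)
          xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])    # unit cubature points
          sigma = x[:, None] + S @ xi                              # 2n cubature points
          prop = np.apply_along_axis(f, 0, sigma)                  # propagate each point through f
          x_pred = prop.mean(axis=1)
          P_pred = (prop - x_pred[:, None]) @ (prop - x_pred[:, None]).T / (2 * n)  # + Q in practice
          return x_pred, P_pred

      f = lambda x: np.array([x[0] + 0.1 * x[1], 0.95 * x[1]])     # toy process model
      x0, P0 = np.array([1.0, 0.5]), np.diag([0.2, 0.1])
      print(ckf_predict(x0, P0, f))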

  3. In-camera automation of photographic composition rules.

    PubMed

    Banerjee, Serene; Evans, Brian L

    2007-07-01

    At the time of image acquisition, professional photographers apply many rules of thumb to improve the composition of their photographs. This paper develops a joint optical-digital processing framework for automating composition rules during image acquisition for photographs with one main subject. Within the framework, we automate three photographic composition rules: repositioning the main subject, making the main subject more prominent, and making objects that merge with the main subject less prominent. The idea is to provide to the user alternate pictures obtained by applying photographic composition rules in addition to the original picture taken by the user. The proposed algorithms do not depend on prior knowledge of the indoor/outdoor setting or scene content. The proposed algorithms are also designed to be amenable to software implementation on fixed-point programmable digital signal processors available in digital still cameras.
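
    As a toy illustration of the first rule (repositioning the main subject), the sketch below computes the crop shift that moves a detected subject centre onto the nearest rule-of-thirds intersection. The function and numbers are hypothetical; the paper's joint optical-digital pipeline also handles subject prominence and merging objects, which are not shown.

      # Shift that places the subject on the nearest rule-of-thirds power point; illustrative only.
      def thirds_offset(subject_xy, width, height):
          """Return (dx, dy) shift that moves the subject onto the nearest power point."""
          sx, sy = subject_xy
          points = [(width * i / 3, height * j / 3) for i in (1, 2) for j in (1, 2)]
          tx, ty = min(points, key=lambda p: (p[0] - sx) ** 2 + (p[1] - sy) ** 2)
          return tx - sx, ty - sy

      print(thirds_offset((400, 300), width=640, height=480))  # shift toward (426.7, 320)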

  4. The influences of accelerated aging on mechanical properties of veneering ceramics used for zirconia restorations.

    PubMed

    Luo, Huinan; Tang, Xuehua; Dong, Zhen; Tang, Hui; Nakamura, Takashi; Yatani, Hirofumi

    2016-01-01

    This study evaluated the influences of accelerated aging on the mechanical properties of veneering ceramics used for zirconia frameworks. Five different veneering ceramics for zirconia frameworks were used. Twenty specimens were fabricated for each veneering ceramic. All specimens were divided into two groups. One was subjected to accelerated aging and the other was used as a control. Accelerated aging was performed in distilled water for 5 h at 200 °C and 2 atm. The density, open porosity, surface roughness, three-point flexural strength, and Vickers hardness were measured. The results showed that the density, open porosity, and surface roughness of all examined veneering ceramics were changed by the accelerated aging process. Accelerated aging was also found to have a positive effect on strength and a negative effect on the hardness.

  5. Run-to-Run Optimization Control Within Exact Inverse Framework for Scan Tracking.

    PubMed

    Yeoh, Ivan L; Reinhall, Per G; Berg, Martin C; Chizeck, Howard J; Seibel, Eric J

    2017-09-01

    A run-to-run optimization controller uses a reduced set of measurement parameters, in comparison to more general feedback controllers, to converge to the best control point for a repetitive process. A new run-to-run optimization controller is presented for the scanning fiber device used for image acquisition and display. This controller utilizes very sparse measurements to estimate a system energy measure and updates the input parameterizations iteratively within a feedforward with exact-inversion framework. Analysis, simulation, and experimental investigations on the scanning fiber device demonstrate improved scan accuracy over previous methods and automatic controller adaptation to changing operating temperature. A specific application example and quantitative error analyses are provided of a scanning fiber endoscope that maintains high image quality continuously across a 20 °C temperature rise without interruption of the 56 Hz video.
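
    The run-to-run idea can be caricatured as follows: once per repetition a single scalar energy measure is taken, a small perturbation of the input parameterization is kept only if that measure improves, and the parameters converge over many runs. The sketch below is this caricature only; it is not the exact-inverse feedforward controller of the paper, and the toy energy function and step sizes are assumptions.

      # Keep-if-better run-to-run parameter updates from one scalar measurement per run.
      import random

      def run_to_run(theta, measure_energy, step=0.1, runs=60, seed=0):
          random.seed(seed)
          best = measure_energy(theta)
          for k in range(runs):
              i = k % len(theta)                      # cycle through the input parameters
              trial = list(theta)
              trial[i] += random.choice((-step, step))
              e = measure_energy(trial)               # the single sparse measurement of this run
              if e < best:
                  theta, best = trial, e
          return theta, best

      energy = lambda th: (th[0] - 1.0) ** 2 + (th[1] + 0.5) ** 2   # toy scan-error measure
      print(run_to_run([0.0, 0.0], energy))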

  6. Using OpenEHR in SICTI an electronic health record system for critical medicine

    NASA Astrophysics Data System (ADS)

    Filgueira, R.; Odriazola, A.; Simini, F.

    2007-11-01

    SICTI is a software tool for registering health records in critical medicine environments. Version 1.0 has been in use since 2003. The Biomedical Engineering Group (Núcleo de Ingeniería Biomédica), with support from the Technological Development Programme (Programa de Desarrollo Tecnológico), decided to develop a new version to support more critical medicine processes, based on a framework that would make the application adaptable to changes in the domain. The team analyzed three alternatives: to develop an original product based on new research, to base the development on the OpenEHR framework, or to use the HL7 RIM as the reference model for SICTI. The team opted for OpenEHR. This work describes the use of OpenEHR, its strong and weak points, and outlines perspectives for future work.

  7. VIC-CropSyst-v2: A regional-scale modeling platform to simulate the nexus of climate, hydrology, cropping systems, and human decisions

    NASA Astrophysics Data System (ADS)

    Malek, Keyvan; Stöckle, Claudio; Chinnayakanahalli, Kiran; Nelson, Roger; Liu, Mingliang; Rajagopalan, Kirti; Barik, Muhammad; Adam, Jennifer C.

    2017-08-01

    Food supply is affected by a complex nexus of land, atmosphere, and human processes, including short- and long-term stressors (e.g., drought and climate change, respectively). A simulation platform that captures these complex elements can be used to inform policy and best management practices to promote sustainable agriculture. We have developed a tightly coupled framework using the macroscale variable infiltration capacity (VIC) hydrologic model and the CropSyst agricultural model. A mechanistic irrigation module was also developed for inclusion in this framework. Because VIC-CropSyst combines two widely used and mechanistic models (for crop phenology, growth, management, and macroscale hydrology), it can provide realistic and hydrologically consistent simulations of water availability, crop water requirements for irrigation, and agricultural productivity for both irrigated and dryland systems. This allows VIC-CropSyst to provide managers and decision makers with reliable information on regional water stresses and their impacts on food production. Additionally, VIC-CropSyst is being used in conjunction with socioeconomic models, river system models, and atmospheric models to simulate feedback processes between regional water availability, agricultural water management decisions, and land-atmosphere interactions. The performance of VIC-CropSyst was evaluated on both regional (over the US Pacific Northwest) and point scales. Point-scale evaluation involved using two flux tower sites located in agricultural fields in the US (Nebraska and Illinois). The agreement between recorded and simulated evapotranspiration (ET), applied irrigation water, soil moisture, leaf area index (LAI), and yield indicated that, although the model is intended to work on regional scales, it also captures field-scale processes in agricultural areas.

  8. Intergenerational Transmission of Self-Regulation: A Multidisciplinary Review and Integrative Conceptual Framework

    PubMed Central

    Bridgett, David J.; Burt, Nicole M.; Edwards, Erin S.; Deater-Deckard, Kirby

    2014-01-01

    This review examines mechanisms contributing to the intergenerational transmission of self-regulation. To provide an integrated account of how self-regulation is transmitted across generations, we draw from over 75 years of accumulated evidence, spanning case studies to experimental approaches, in literatures covering developmental, social, and clinical psychology, and criminology, physiology, genetics, and human and animal neuroscience (among others). First, we present a taxonomy of what self-regulation is and then examine how it develops – overviews that guide the main foci of the review. Next, studies supporting an association between parent and child self-regulation are reviewed. Subsequently, literature that considers potential social mechanisms of transmission, specifically parenting behavior, inter-parental (i.e., marital) relationship behaviors, and broader rearing influences (e.g., household chaos) are considered. Finally, literature providing evidence that prenatal programming may be the starting point of the intergenerational transmission of self-regulation is covered, along with key findings from the behavioral and molecular genetics literatures. To integrate these literatures, we introduce the Self-Regulation Intergenerational Transmission Model, a framework that brings together prenatal, social, and neurobiological mechanisms (spanning endocrine, neural, and genetic levels, including gene-environment interplay and epigenetic processes) to explain the intergenerational transmission of self-regulation. This model also incorporates potential transactional processes between generations (e.g., children’s self-regulation and parent-child interaction dynamics that may affect parents’ self-regulation) that further influence intergenerational processes. In pointing the way forward, we note key future directions and ways to address limitations in existing work throughout the review and in closing. We also conclude by noting several implications for intervention work. PMID:25938878

  9. Minimizing human error in radiopharmaceutical preparation and administration via a bar code-enhanced nuclear pharmacy management system.

    PubMed

    Hakala, John L; Hung, Joseph C; Mosman, Elton A

    2012-09-01

    The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.

  10. Multi-scale Modeling of Arctic Clouds

    NASA Astrophysics Data System (ADS)

    Hillman, B. R.; Roesler, E. L.; Dexheimer, D.

    2017-12-01

    The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect in the scales at which these models are formulated and the scale of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud system resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded into each gridcell of a traditional GCM to replace the cloud and convective parameterizations to explicitly simulate more of these important processes. This approach is attractive in that it allows for more explicit simulation of small-scale processes while also allowing for interaction between the small and large-scale processes. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore the limitations of such a framework using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.

  11. Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods

    PubMed Central

    Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.

    2017-01-01

    The Log-Gaussian Cox Process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly-stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
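
    The doubly-stochastic structure is easy to see in simulation: draw a Gaussian-process log-intensity over a grid, exponentiate it, and generate Poisson counts per cell. The sketch below does this in one dimension with an assumed squared-exponential covariance; the kernel, mean and grid are illustrative choices, not settings from the simulation studies in the paper.

      # Simulate a log-Gaussian Cox process on a 1-D grid (GP log-intensity, then Poisson counts).
      import numpy as np

      rng = np.random.default_rng(42)
      grid = np.linspace(0.0, 10.0, 100)
      cell = grid[1] - grid[0]

      # Squared-exponential covariance for the latent Gaussian process
      d = grid[:, None] - grid[None, :]
      K = 1.0 * np.exp(-0.5 * (d / 1.5) ** 2) + 1e-8 * np.eye(len(grid))

      log_lambda = rng.multivariate_normal(np.full(len(grid), 1.0), K)  # second level: GP
      counts = rng.poisson(np.exp(log_lambda) * cell)                   # first level: Poisson
      print(counts.sum(), "points simulated")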

  12. Acculturation and reacculturation influence: multilayer contexts in therapy.

    PubMed

    Abu Baker, K

    1999-12-01

    Clients who live within a minority culture while being influenced by a dominant culture usually bring to therapy the impact of their multilayered cultural experience. The migration literature points to separation and marginalization processes during acculturation as the main cause of relocators' psychosocial problems. In contrast to other studies that appreciate assimilation and integration within the dominant culture, this study shows that these processes often lead to disharmony and disintegration within the home culture, especially among those who remigrate home or those who continue to live simultaneously within the sending culture and the receiving culture. Additionally, this study emphasizes that acculturation often happens as a multilinear and multidimensional process within the host culture and the sending culture. Therapists may help clients when they become aware of the complexity of the multidirectional process of acculturation and its various levels, such as the interfamilial, the intrafamilial, and the social. Three case studies illustrate the theoretical framework.

  13. Modules, theories, or islands of expertise? Domain specificity in socialization.

    PubMed

    Gelman, Susan A

    2010-01-01

    The domain-specific approach to socialization processes presented by J. E. Grusec and M. Davidov (this issue) provides a compelling framework for integrating and interpreting a large and disparate body of research findings, and it generates a wealth of testable new hypotheses. At the same time, it introduces core theoretical questions regarding the nature of social interactions, from the perspective of both children and their caregivers. This commentary draws on the literature regarding domain specificity in cognitive development, applauds what is innovative and exciting about applying a domain-specific approach to socialization processes, and points to questions for future research. Foremost among these is what is meant by "domain specificity."

  14. Stochastic optimal control of ultradiffusion processes with application to dynamic portfolio management

    NASA Astrophysics Data System (ADS)

    Marcozzi, Michael D.

    2008-12-01

    We consider theoretical and approximation aspects of the stochastic optimal control of ultradiffusion processes in the context of a prototype model for the selling price of a European call option. Within a continuous-time framework, the dynamic management of a portfolio of assets is effected through continuous or point control, activation costs, and phase delay. The performance index is derived from the unique weak variational solution to the ultraparabolic Hamilton-Jacobi equation; the value function is the optimal realization of the performance index relative to all feasible portfolios. An approximation procedure based upon a temporal box scheme/finite element method is analyzed; numerical examples are presented in order to demonstrate the viability of the approach.

  15. A proposed framework for evaluating and comparing efficacy estimates in clinical trials of new rotavirus vaccines.

    PubMed

    Neuzil, Kathleen M; Zaman, K; Victor, John C

    2014-08-11

    Oral rotavirus vaccines have yielded different point estimates of efficacy when tested in different populations. While population and environmental factors may account for these differences, study design characteristics should also be considered. We review the study design elements of rotavirus vaccine trials that may affect point estimates of efficacy, and propose a framework for evaluating new rotavirus vaccines. Copyright © 2014. Published by Elsevier Ltd.
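
    The point estimate being compared across such trials is typically the vaccine efficacy VE = 1 - RR, the complement of the ratio of attack rates in the vaccinated and placebo arms, so design choices that change either attack rate (case definitions, follow-up duration, per-protocol versus intention-to-treat populations) move the estimate. The numbers in the sketch below are invented purely to show the arithmetic.

      # Standard vaccine-efficacy point estimate from attack rates; the counts are invented.
      def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
          rr = (cases_vax / n_vax) / (cases_placebo / n_placebo)
          return 1.0 - rr

      print(vaccine_efficacy(15, 1000, 60, 1000))   # 0.75 -> 75% efficacy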

  16. Implementation of a school-based social and emotional learning intervention: understanding diffusion processes within complex systems.

    PubMed

    Evans, Rhiannon; Murphy, Simon; Scourfield, Jonathan

    2015-07-01

    Sporadic and inconsistent implementation remains a significant challenge for social and emotional learning (SEL) interventions. This may be partly explained by the dearth of flexible, causative models that capture the multifarious determinants of implementation practices within complex systems. This paper draws upon Rogers' (2003) Diffusion of Innovations Theory to explain the adoption, implementation and discontinuance of a SEL intervention. A pragmatic, formative process evaluation was conducted in alignment with phase 1 of the UK Medical Research Council's framework for Developing and Evaluating Complex Interventions. Employing case-study methodology, qualitative data were generated with four socio-economically and academically contrasting secondary schools in Wales implementing the Student Assistance Programme. Semi-structured interviews were conducted with 15 programme stakeholders. Data suggested that variation in implementation activity could be largely attributed to four key intervention reinvention points, which contributed to the transformation of the programme as it interacted with contextual features and individual needs. These reinvention points comprise the following: intervention training, which captures the process through which adopters acquire knowledge about a programme and delivery expertise; intervention assessment, which reflects adopters' evaluation of an intervention in relation to contextual needs; intervention clarification, which comprises the cascading of knowledge through an organisation in order to secure support in delivery; and intervention responsibility, which refers to the process of assigning accountability for sustainable delivery. Taken together, these points identify opportunities to predict and intervene with potential implementation problems. Further research would benefit from exploring additional reinvention activity.

  17. How to evaluate population management? Transforming the Care Continuum Alliance population health guide toward a broadly applicable analytical framework.

    PubMed

    Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A

    2015-04-01

    Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to stimulate population health is increasing. A possible successful strategy is population management (PM). PM strives to address the health needs of the at-risk population and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The population health guide of the Care Continuum Alliance (CCA), an organisation that recently changed its name to the Population Health Alliance (PHA), provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed for PM specifically and describes the core elements of the PM concept on the basis of six successive, interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. We refined the quantitative methods and operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. The Impact of Mutation and Gene Conversion on the Local Diversification of Antigen Genes in African Trypanosomes

    PubMed Central

    Gjini, Erida; Haydon, Daniel T.; Barry, J. David; Cobbold, Christina A.

    2012-01-01

    Patterns of genetic diversity in parasite antigen gene families hold important information about their potential to generate antigenic variation within and between hosts. The evolution of such gene families is typically driven by gene duplication, followed by point mutation and gene conversion. There is great interest in estimating the rates of these processes from molecular sequences for understanding the evolution of the pathogen and its significance for infection processes. In this study, a series of models are constructed to investigate hypotheses about the nucleotide diversity patterns between closely related gene sequences from the antigen gene archive of the African trypanosome, the protozoan parasite causative of human sleeping sickness in Equatorial Africa. We use a hidden Markov model approach to identify two scales of diversification: clustering of sequence mismatches, a putative indicator of gene conversion events with other lower-identity donor genes in the archive, and at a sparser scale, isolated mismatches, likely arising from independent point mutations. In addition to quantifying the respective probabilities of occurrence of these two processes, our approach yields estimates for the gene conversion tract length distribution and the average diversity contributed locally by conversion events. Model fitting is conducted using a Bayesian framework. We find that diversifying gene conversion events with lower-identity partners occur at least five times less frequently than point mutations on variant surface glycoprotein (VSG) pairs, and the average imported conversion tract is between 14 and 25 nucleotides long. However, because of the high diversity introduced by gene conversion, the two processes have almost equal impact on the per-nucleotide rate of sequence diversification between VSG subfamily members. We are able to disentangle the most likely locations of point mutations and conversions on each aligned gene pair. PMID:22735079
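
    The two-scale segmentation described above can be illustrated with a minimal two-state hidden Markov sketch: state 0 stands for background (isolated point mutations), state 1 for a putative conversion tract, and Viterbi decoding labels each aligned position. All probabilities below are illustrative placeholders, not estimates from the paper, and the full Bayesian machinery is omitted.

```python
import numpy as np

def viterbi_two_state(mismatch, p_emit=(0.02, 0.6), p_stay=(0.995, 0.93)):
    """Most likely state path for a binary mismatch sequence.

    State 0 ~ background (isolated point mutations), state 1 ~ putative
    conversion tract.  Probabilities are illustrative placeholders.
    """
    mismatch = np.asarray(mismatch, dtype=int)
    n, k = len(mismatch), 2
    # log emission probabilities of observing a match (0) or a mismatch (1)
    log_e = np.log([[1 - p_emit[0], p_emit[0]],
                    [1 - p_emit[1], p_emit[1]]])
    log_t = np.log([[p_stay[0], 1 - p_stay[0]],
                    [1 - p_stay[1], p_stay[1]]])
    delta = np.zeros((n, k))
    psi = np.zeros((n, k), dtype=int)
    delta[0] = np.log([0.5, 0.5]) + log_e[:, mismatch[0]]
    for t in range(1, n):
        for j in range(k):
            scores = delta[t - 1] + log_t[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] + log_e[j, mismatch[t]]
    path = np.zeros(n, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(n - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path  # runs of 1s mark candidate conversion tracts

seq = [0, 0, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 0, 0]
print(viterbi_two_state(seq))
```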

  19. The Offline Software Framework of the NA61/SHINE Experiment

    NASA Astrophysics Data System (ADS)

    Sipos, Roland; Laszlo, Andras; Marcinek, Antoni; Paul, Tom; Szuba, Marek; Unger, Michael; Veberic, Darko; Wyszynski, Oskar

    2012-12-01

    NA61/SHINE (SHINE = SPS Heavy Ion and Neutrino Experiment) is an experiment at the CERN SPS using the upgraded NA49 hadron spectrometer. Among its physics goals are precise hadron production measurements for improving calculations of the neutrino beam flux in the T2K neutrino oscillation experiment as well as for more reliable simulations of cosmic-ray air showers. Moreover, p+p, p+Pb and nucleus+nucleus collisions will be studied extensively to allow a study of the properties of the onset of deconfinement and a search for the critical point of strongly interacting matter. Currently NA61/SHINE uses the old NA49 software framework for reconstruction, simulation and data analysis. The core of this legacy framework was developed in the early 1990s. It is written in different programming and scripting languages (C, pgi-Fortran, shell) and provides several concurrent data formats for the event data model, which also includes obsolete parts. In this contribution we will introduce the new software framework, called Shine, which is written in C++ and designed to comprise three principal parts: a collection of processing modules which can be assembled and sequenced by the user via XML files, an event data model which contains all simulation and reconstruction information based on STL and ROOT streaming, and a detector description which provides data on the configuration and state of the experiment. To assure a quick migration to the Shine framework, wrappers were introduced that allow legacy code parts to run as modules in the new framework, and we will present first results on the cross-validation of the two frameworks.

  20. Algorithmic framework for group analysis of differential equations and its application to generalized Zakharov-Kuznetsov equations

    NASA Astrophysics Data System (ADS)

    Huang, Ding-jiang; Ivanova, Nataliya M.

    2016-02-01

    In this paper, we explain in more detail the modern treatment of the problem of group classification of (systems of) partial differential equations (PDEs) from the algorithmic point of view. More precisely, we revise the classical Lie algorithm for constructing symmetries of differential equations, describe the group classification algorithm and discuss the process of reduction of (systems of) PDEs to (systems of) equations with a smaller number of independent variables in order to construct invariant solutions. The group classification algorithm and reduction process are illustrated by the example of the generalized Zakharov-Kuznetsov (GZK) equations of the form $u_t + (F(u))_{xxx} + (G(u))_{xyy} + (H(u))_x = 0$. As a result, a complete group classification of the GZK equations is performed and a number of new interesting nonlinear invariant models which have non-trivial invariance algebras are obtained. Lie symmetry reductions and exact solutions for two important invariant models, i.e., the classical and modified Zakharov-Kuznetsov equations, are constructed. The algorithmic framework for group analysis of differential equations presented in this paper can also be applied to other nonlinear PDEs.

  1. Statistical physics of language dynamics

    NASA Astrophysics Data System (ADS)

    Loreto, Vittorio; Baronchelli, Andrea; Mukherjee, Animesh; Puglisi, Andrea; Tria, Francesca

    2011-04-01

    Language dynamics is a rapidly growing field that focuses on all processes related to the emergence, evolution, change and extinction of languages. Recently, the study of self-organization and evolution of language and meaning has led to the idea that a community of language users can be seen as a complex dynamical system, which collectively solves the problem of developing a shared communication framework through the back-and-forth signaling between individuals. We shall review some of the progress made in the past few years and highlight potential future directions of research in this area. In particular, the emergence of a common lexicon and of a shared set of linguistic categories will be discussed, as examples corresponding to the early stages of a language. The extent to which synthetic modeling is nowadays contributing to the ongoing debate in cognitive science will be pointed out. In addition, the rapid growth of the web is providing new experimental frameworks. It makes available a huge amount of resources, both as novel tools and as data to be analyzed, allowing quantitative and large-scale analysis of the processes underlying the emergence of collective information and language dynamics.
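
    A canonical model of the kind reviewed here, for the emergence of a common lexicon, is the Naming Game. The sketch below is a minimal, illustrative implementation on a fully connected population (agent count and stopping rule are assumptions, not taken from the review); agents negotiate a single shared word through repeated pairwise interactions.

```python
import random

def naming_game(n_agents=50, max_steps=200_000, seed=0):
    """Minimal Naming Game: agents negotiate a shared name for one object."""
    rng = random.Random(seed)
    vocab = [set() for _ in range(n_agents)]   # each agent's inventory of names
    for step in range(max_steps):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not vocab[speaker]:                 # invent a brand-new word
            vocab[speaker].add(f"w{step}")
        word = rng.choice(sorted(vocab[speaker]))
        if word in vocab[hearer]:              # success: both collapse to this word
            vocab[speaker] = {word}
            vocab[hearer] = {word}
        else:                                  # failure: hearer learns the word
            vocab[hearer].add(word)
        if all(len(v) == 1 for v in vocab) and len(set.union(*vocab)) == 1:
            return step                        # global consensus reached
    return None

print("consensus after", naming_game(), "interactions")
```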

  2. A hybrid absorption–adsorption method to efficiently capture carbon

    PubMed Central

    Liu, Huang; Liu, Bei; Lin, Li-Chiang; Chen, Guangjin; Wu, Yuqing; Wang, Jin; Gao, Xueteng; Lv, Yining; Pan, Yong; Zhang, Xiaoxin; Zhang, Xianren; Yang, Lanying; Sun, Changyu; Smit, Berend; Wang, Wenchuan

    2014-01-01

    Removal of carbon dioxide is an essential step in many energy-related processes. Here we report a novel slurry concept that combines specific advantages of metal-organic frameworks, ionic liquids, amines and membranes by suspending zeolitic imidazolate framework-8 in glycol-2-methylimidazole solution. We show that this approach may provide a more efficient technology for capturing carbon dioxide than conventional technologies. The carbon dioxide sorption capacity of our slurry reaches 1.25 mol l−1 at 1 bar, and the selectivity of carbon dioxide/hydrogen, carbon dioxide/nitrogen and carbon dioxide/methane reaches 951, 394 and 144, respectively. We demonstrate through breakthrough experiments that the slurry can efficiently remove carbon dioxide from gas mixtures at normal pressure/temperature. Most importantly, the sorption enthalpy is only −29 kJ mol−1, indicating that significantly less energy is required for sorbent regeneration. In addition, from a technological point of view, unlike solid adsorbents, slurries can flow and be pumped. This allows us to use a continuous separation process with heat integration. PMID:25296559

  3. New statistical scission-point model to predict fission fragment observables

    NASA Astrophysics Data System (ADS)

    Lemaître, Jean-François; Panebianco, Stefano; Sida, Jean-Luc; Hilaire, Stéphane; Heinrich, Sophie

    2015-09-01

    The development of high performance computing facilities makes possible a massive production of nuclear data in a full microscopic framework. Taking advantage of the individual potential calculations of more than 7000 nuclei, a new statistical scission-point model, called SPY, has been developed. It gives access to the absolute available energy at the scission point, which allows the use of a parameter-free microcanonical statistical description to calculate the distributions and the mean values of all fission observables. SPY uses the richness of microscopy in a rather simple theoretical framework, without any parameter except the scission-point definition, to draw clear answers based on perfect knowledge of the ingredients involved in the model, with very limited computing cost.

  4. Mechanical properties of sol–gel derived SiO2 nanotubes

    PubMed Central

    Antsov, Mikk; Vlassov, Sergei; Dorogin, Leonid M; Vahtrus, Mikk; Zabels, Roberts; Lange, Sven; Lõhmus, Rünno

    2014-01-01

    The mechanical properties of thick-walled SiO2 nanotubes (NTs) prepared by a sol–gel method using Ag nanowires (NWs) as templates were measured using several different methods. In situ scanning electron microscopy (SEM) cantilever beam bending tests were carried out using a nanomanipulator equipped with a force sensor in order to investigate the plasticity and flexural response of the NTs. Nanoindentation and three-point bending tests of NTs were performed by atomic force microscopy (AFM) under ambient conditions. Half-suspended and three-point bending tests were processed within the framework of linear elasticity theory. Finite element method simulations were used to extract Young’s modulus values from the nanoindentation data. Finally, the Young’s moduli of SiO2 NTs measured by different methods were compared and discussed. PMID:25383292
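
    For reference, the textbook linear-elasticity relation behind a three-point bending test of a simply supported tube is sketched below (F is the applied force, L the suspended length, δ the midpoint deflection, I the second moment of area of the tube cross-section). This is the standard expression, not necessarily the exact formulation used by the authors.

```latex
% Standard three-point bending relation for a simply supported tube:
E = \frac{F L^{3}}{48\,\delta\, I},
\qquad
I = \frac{\pi}{64}\left(d_{\mathrm{out}}^{4} - d_{\mathrm{in}}^{4}\right)
```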

  5. Framework for developing a spatial walkability index (SWI) for the light-rail transit (LRT) stations in Kuala Lumpur city centre using analytical network process (ANP) and GIS

    NASA Astrophysics Data System (ADS)

    Naharudin, Nabilah; Ahamad, Mohd Sanusi S.; Sadullah, Ahmad Farhan Mohd

    2017-10-01

    In support of the nation's goal of developing a liveable city, the Malaysian government aims to improve mobility in Kuala Lumpur by providing good-quality transit services across the city. However, the public has started to demand more than just connectivity between two points: they want their transit journey to be comfortable and pleasant from the very first mile. The key here is the first and last mile (FLM) of the transit service, which defines the journey to access the station itself. The question is, does the FLM of existing transit services satisfy the public's needs? Many studies have therefore emerged that attempt to assess pedestrian-friendliness. While most of them were based on pedestrians' perceptions, other studies spatially measured connectivity and accessibility to various land uses and points of interest. Both can be good methods, but their integration could produce a better assessment; to date, however, only a few studies have attempted to do so. This paper proposes a framework to develop a Spatial Walkability Index (SWI) by integrating a multicriteria evaluation technique, the Analytical Network Process (ANP), with network analysis on a geographical information system (GIS) platform. First, ANP aggregates the degree of importance of each walkability criterion based on pedestrians' perceptions. Then, the network analysis uses the weighted criteria as attributes to find the walkable routes within a half-mile radius of each station. The index is calculated as the ratio of the total length of walkable routes to the available footpath. The final outcome is a percentage of walkable FLM transit routes for each station, which is named the SWI. It is expected that the developed framework can be applied in other cities across the globe and adapted to suit local demands and purposes.
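
    The final scoring step reduces to a weighted criteria test per footpath segment followed by a length ratio. The sketch below is only an illustration of that arithmetic; the criteria names, ANP weights and threshold are hypothetical placeholders, and the GIS network analysis itself is not shown.

```python
def segment_is_walkable(scores, weights, threshold=0.6):
    """Weighted walkability score for one footpath segment.

    `scores`  maps criterion name -> normalized score in [0, 1];
    `weights` are ANP-derived importance weights (illustrative values only).
    """
    total = sum(weights[c] * scores[c] for c in weights)
    return total >= threshold

def spatial_walkability_index(segments):
    """SWI for one station: walkable length as a percentage of available footpath.

    `segments` is a list of (length_m, criteria_scores) within the half-mile catchment.
    """
    weights = {"footpath_quality": 0.4, "safety": 0.35, "comfort": 0.25}  # hypothetical
    walkable = sum(L for L, s in segments if segment_is_walkable(s, weights))
    total = sum(L for L, _ in segments)
    return 100.0 * walkable / total if total else 0.0

demo = [(320.0, {"footpath_quality": 0.9, "safety": 0.8, "comfort": 0.7}),
        (400.0, {"footpath_quality": 0.3, "safety": 0.5, "comfort": 0.4})]
print(round(spatial_walkability_index(demo), 1))
```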

  6. Activating clinical trials: a process improvement approach.

    PubMed

    Martinez, Diego A; Tsalatsanis, Athanasios; Yalcin, Ali; Zayas-Castro, José L; Djulbegovic, Benjamin

    2016-02-24

    The administrative process associated with clinical trial activation has been criticized as costly, complex, and time-consuming. Prior research has concentrated on identifying administrative barriers and proposing various solutions to reduce activation time, and consequently associated costs. Here, we expand on previous research by incorporating social network analysis and discrete-event simulation to support process improvement decision-making. We searched for all operational data associated with the administrative process of activating industry-sponsored clinical trials at the Office of Clinical Research of the University of South Florida in Tampa, Florida. We limited the search to those trials initiated and activated between July 2011 and June 2012. We described the process using value stream mapping, studied the interactions of the various process participants using social network analysis, and modeled potential process modifications using discrete-event simulation. The administrative process comprised 5 sub-processes, 30 activities, 11 decision points, 5 loops, and 8 participants. The mean activation time was 76.6 days. Rate-limiting sub-processes were those of contract and budget development. Key participants during contract and budget development were the Office of Clinical Research, sponsors, and the principal investigator. Simulation results indicate that slight increases in the number of trials arriving at the Office of Clinical Research would increase activation time by 11 %. Also, increasing the efficiency of contract and budget development would reduce the activation time by 28 %. Finally, better synchronization between contract and budget development would reduce time spent on batching documentation; however, no improvements would be attained in total activation time. The presented process improvement analytic framework not only identifies administrative barriers, but also helps to devise and evaluate potential improvement scenarios. The strength of our framework lies in its system analysis approach that recognizes the stochastic duration of the activation process and the interdependence between process activities and entities.
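
    To make the discrete-event simulation idea concrete, here is a minimal SimPy sketch of a single rate-limiting step (contract/budget development) treated as a queued resource. The arrival rate, service time and analyst capacity are made-up placeholders, not the study's data, and the real model would chain all five sub-processes.

```python
import random
import simpy

RNG = random.Random(1)

def trial(env, name, contracts_office, log):
    """One industry-sponsored trial passing through contract/budget development."""
    arrival = env.now
    with contracts_office.request() as req:           # wait for a contract analyst
        yield req
        yield env.timeout(RNG.expovariate(1 / 30.0))  # ~30-day service time (assumed)
    log.append(env.now - arrival)                     # time spent in this step

def run(n_trials=100, n_analysts=2, interarrival_days=7.0):
    env = simpy.Environment()
    office = simpy.Resource(env, capacity=n_analysts)
    log = []

    def source():
        for i in range(n_trials):
            env.process(trial(env, f"trial-{i}", office, log))
            yield env.timeout(RNG.expovariate(1 / interarrival_days))

    env.process(source())
    env.run()
    return sum(log) / len(log)

print("mean time in contract/budget step:", round(run(), 1), "days")
```

    Raising `n_analysts` or shortening the assumed service time in this toy model plays the same role as the "increasing the efficiency of contract and budget development" scenario evaluated in the study.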

  7. Assessment of wildland fire impacts on watershed annual water yield: Analytical framework and case studies in the United States

    DOE PAGES

    Hallema, Dennis W.; Sun, Ge; Caldwell, Peter V.; ...

    2016-11-29

    More than 50% of water supplies in the conterminous United States originate on forestland or rangeland and are potentially under increasing stress as a result of larger and more severe wildfires. Little is known, however, about the long-term impacts of fire on annual water yield and the role of climate variability within this context. We here propose a framework for evaluating wildland fire impacts on streamflow that combines double-mass analysis with new methods (change point analysis, climate elasticity modeling, and process-based modeling) to distinguish between multiyear fire and climate impacts. The framework captures a wide range of fire types, watershed characteristics, and climate conditions using streamflow data, as opposed to other approaches requiring paired watersheds. The process is illustrated with three case studies. A watershed in Arizona experienced a +266% increase in annual water yield in the 5 years after a wildfire, where +219% was attributed to wildfire and +24% to precipitation trends. In contrast, a California watershed had a lower (–64%) post-fire net water yield, comprising an enhanced flow (+38%) attributed to wildfire, offset (–102%) by lower precipitation in the post-fire period. Changes in streamflow within a watershed in South Carolina had no apparent link to periods of prescribed burning but matched a very wet winter and reports of storm damage. As a result, the presented framework is unique in its ability to detect and quantify fire or other disturbances, even if the date or nature of the disturbance event is uncertain, and regardless of precipitation trends.
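
    The change-point step can be illustrated with a toy two-segment break search on the double-mass curve (cumulative precipitation vs cumulative streamflow): fit two straight lines around every candidate break and keep the break that minimises the total squared error. This is a simple stand-in, not the authors' exact procedure, and the synthetic data below are purely illustrative.

```python
import numpy as np

def double_mass_change_point(precip, flow):
    """Locate a single break in the cumulative-precipitation vs cumulative-flow curve."""
    P, Q = np.cumsum(precip), np.cumsum(flow)
    n = len(P)
    best_idx, best_sse = None, np.inf
    for k in range(3, n - 3):                      # keep a few points in each segment
        sse = 0.0
        for sl in (slice(0, k), slice(k, n)):
            A = np.vstack([P[sl], np.ones(sl.stop - sl.start)]).T
            coef, res, *_ = np.linalg.lstsq(A, Q[sl], rcond=None)
            sse += res[0] if res.size else np.sum((A @ coef - Q[sl]) ** 2)
        if sse < best_sse:
            best_idx, best_sse = k, sse
    return best_idx

rng = np.random.default_rng(0)
precip = rng.gamma(4.0, 250.0, size=30)                  # 30 years of annual precipitation (mm)
flow = np.where(np.arange(30) < 18, 0.3, 0.5) * precip   # runoff ratio shifts after a "fire"
print("estimated break after year", double_mass_change_point(precip, flow))
```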

  8. New phenomena in non-equilibrium quantum physics

    NASA Astrophysics Data System (ADS)

    Kitagawa, Takuya

    From its beginning in the early 20th century, quantum theory has become progressively more important, especially through its contributions to the development of technologies. Quantum mechanics is crucial for current technology such as semiconductors, and also holds promise for future technologies such as superconductors and quantum computing. Despite the success of quantum theory, its applications have been mostly limited to equilibrium or static systems, because of (1) a lack of experimental controllability of non-equilibrium quantum systems and (2) a lack of theoretical frameworks for understanding non-equilibrium dynamics. Consequently, physicists had discovered few interesting phenomena in non-equilibrium quantum systems from either a theoretical or an experimental point of view, and non-equilibrium quantum physics attracted little attention. The situation has recently changed due to the rapid development of experimental techniques in condensed matter as well as cold atom systems, which now enable better control of non-equilibrium quantum systems. Motivated by this experimental progress, we constructed theoretical frameworks to study three different non-equilibrium regimes: transient dynamics, steady states, and periodic driving. These frameworks provide new perspectives on dynamical quantum processes and help to discover new phenomena in these systems. In this thesis, we describe these frameworks through explicit examples and demonstrate their versatility. Some of these theoretical proposals have been realized in experiments, confirming the applicability of the theories to realistic experimental situations. These studies have led not only to an improved fundamental understanding of non-equilibrium processes in quantum systems, but have also suggested entirely new venues for developing quantum technologies.

  9. Assessment of wildland fire impacts on watershed annual water yield: Analytical framework and case studies in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallema, Dennis W.; Sun, Ge; Caldwell, Peter V.

    More than 50% of water supplies in the conterminous United States originate on forestland or rangeland and are potentially under increasing stress as a result of larger and more severe wildfires. Little is known, however, about the long-term impacts of fire on annual water yield and the role of climate variability within this context. We here propose a framework for evaluating wildland fire impacts on streamflow that combines double-mass analysis with new methods (change point analysis, climate elasticity modeling, and process-based modeling) to distinguish between multiyear fire and climate impacts. The framework captures a wide range of fire types, watershed characteristics, and climate conditions using streamflow data, as opposed to other approaches requiring paired watersheds. The process is illustrated with three case studies. A watershed in Arizona experienced a +266% increase in annual water yield in the 5 years after a wildfire, where +219% was attributed to wildfire and +24% to precipitation trends. In contrast, a California watershed had a lower (–64%) post-fire net water yield, comprising an enhanced flow (+38%) attributed to wildfire, offset (–102%) by lower precipitation in the post-fire period. Changes in streamflow within a watershed in South Carolina had no apparent link to periods of prescribed burning but matched a very wet winter and reports of storm damage. As a result, the presented framework is unique in its ability to detect and quantify fire or other disturbances, even if the date or nature of the disturbance event is uncertain, and regardless of precipitation trends.

  10. A Unified Estimation Framework for State-Related Changes in Effective Brain Connectivity.

    PubMed

    Samdin, S Balqis; Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain

    2017-04-01

    This paper addresses the critical problem of estimating time-evolving effective brain connectivity. Current approaches based on sliding window analysis or time-varying coefficient models do not simultaneously capture both slow and abrupt changes in causal interactions between different brain regions. To overcome these limitations, we develop a unified framework based on a switching vector autoregressive (SVAR) model. Here, the dynamic connectivity regimes are uniquely characterized by distinct vector autoregressive (VAR) processes and allowed to switch between quasi-stationary brain states. The state evolution and the associated directed dependencies are defined by a Markov process and the SVAR parameters. We develop a three-stage estimation algorithm for the SVAR model: 1) feature extraction using time-varying VAR (TV-VAR) coefficients, 2) preliminary regime identification via clustering of the TV-VAR coefficients, 3) refined regime segmentation by Kalman smoothing and parameter estimation via expectation-maximization algorithm under a state-space formulation, using initial estimates from the previous two stages. The proposed framework is adaptive to state-related changes and gives reliable estimates of effective connectivity. Simulation results show that our method provides accurate regime change-point detection and connectivity estimates. In real applications to brain signals, the approach was able to capture directed connectivity state changes in functional magnetic resonance imaging data linked with changes in stimulus conditions, and in epileptic electroencephalograms, differentiating ictal from nonictal periods. The proposed framework accurately identifies state-dependent changes in brain network and provides estimates of connectivity strength and directionality. The proposed approach is useful in neuroscience studies that investigate the dynamics of underlying brain states.
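
    A minimal sketch of the first two stages only (sliding-window VAR features, then clustering into candidate regimes) is given below. Ordinary least-squares VAR(1) fits and k-means stand in for the TV-VAR estimator and the full Kalman/EM refinement; window sizes and the toy data are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def var1_coeffs(x):
    """OLS fit of a VAR(1) model x_t = A x_{t-1} + e_t; returns A flattened."""
    X, Y = x[:-1], x[1:]
    A, *_ = np.linalg.lstsq(X, Y, rcond=None)      # A.T is the usual coefficient matrix
    return A.T.ravel()

def regime_labels(signals, win=100, step=25, n_states=2, seed=0):
    """Stages 1-2 sketch: sliding-window VAR features, then k-means regime labels."""
    feats = [var1_coeffs(signals[s:s + win])
             for s in range(0, len(signals) - win, step)]
    km = KMeans(n_clusters=n_states, n_init=10, random_state=seed)
    return km.fit_predict(np.array(feats))

# toy data: two channels whose directed coupling flips halfway through
rng = np.random.default_rng(0)
x = np.zeros((2000, 2))
for t in range(1, 2000):
    A = np.array([[0.5, 0.4], [0.0, 0.5]]) if t < 1000 else np.array([[0.5, 0.0], [0.4, 0.5]])
    x[t] = A @ x[t - 1] + 0.1 * rng.standard_normal(2)
print(regime_labels(x))
```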

  11. Awareness, persuasion, and adoption: Enriching the Bass model

    NASA Astrophysics Data System (ADS)

    Colapinto, Cinzia; Sartori, Elena; Tolotti, Marco

    2014-02-01

    In the context of diffusion of innovations, we propose a probabilistic model based on interacting populations connected through new communication channels. The potential adopters are heterogeneous in their connectivity levels and in their taste for innovation. The proposed framework can model the different stages of the adoption dynamics. In particular, the adoption curve is the result of a micro-founded decision process following the awareness phase. Finally, we recover stylized facts pointed out by the extant literature in the field, such as delayed adoptions and non-monotonic adoption curves.
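
    For context, the classic Bass adoption curve that such enriched models generalize can be written in a few lines; the coefficients of innovation and imitation below are standard illustrative values, not parameters from this paper.

```python
import numpy as np

def bass_adoptions(p=0.03, q=0.38, market_size=1_000_000, periods=20):
    """Discrete-time Bass diffusion: new adopters per period.

    p = coefficient of innovation (external influence),
    q = coefficient of imitation (word of mouth); values are illustrative.
    """
    cumulative, new = 0.0, []
    for _ in range(periods):
        share = cumulative / market_size
        adopters = (p + q * share) * (market_size - cumulative)
        new.append(adopters)
        cumulative += adopters
    return np.array(new)

curve = bass_adoptions()
print("peak adoption in period", int(curve.argmax()) + 1)
```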

  12. Software development for teleroentgenogram analysis

    NASA Astrophysics Data System (ADS)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram using an original method developed in this department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software; this will make the process of calculating teleroentgenograms easier because methodological points will be placed automatically.

  13. Code orange: Towards transformational leadership of emergency management systems.

    PubMed

    Caro, Denis H J

    2015-09-01

    The 21st century calls upon health leaders to recognize and respond to emerging threats and systemic emergency management challenges through transformative processes inherent in the LEADS in a caring environment framework. Using a grounded theory approach, this qualitative study explores key informant perspectives of leaders in emergency management across Canada on pressing needs for relevant systemic transformation. The emerging model points to eight specific attributes of transformational leadership central to emergency management and suggests that contextualization of health leadership is of particular import. © 2015 The Canadian College of Health Leaders.

  14. Formation Flying With Decentralized Control in Libration Point Orbits

    NASA Technical Reports Server (NTRS)

    Folta, David; Carpenter, J. Russell; Wagner, Christoph

    2000-01-01

    A decentralized control framework is investigated for applicability of formation flying control in libration orbits. The decentralized approach, being non-hierarchical, processes only direct measurement data, in parallel with the other spacecraft. Control is accomplished via linearization about a reference libration orbit with standard control using a Linear Quadratic Regulator (LQR) or the GSFC control algorithm. Both are linearized about the current state estimate as with the extended Kalman filter. Based on this preliminary work, the decentralized approach appears to be feasible for upcoming libration missions using distributed spacecraft.
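
    The LQR step mentioned above amounts to solving a Riccati equation for the feedback gain about the linearized reference. The sketch below uses generic double-integrator placeholder dynamics rather than the libration-point (CRTBP) model, and illustrative weight matrices.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Continuous-time LQR: solve the algebraic Riccati equation, return K = R^-1 B^T P."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# placeholder linearized dynamics (double integrator), not the libration-orbit model
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
K = lqr_gain(A, B, Q=np.eye(2), R=np.array([[1.0]]))
print("LQR gain:", K)   # feedback u = -K (x - x_ref), applied about the reference orbit
```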

  15. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes.

    PubMed

    Ragan, Eric D; Endert, Alex; Sanyal, Jibonananda; Chen, Jian

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  16. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric; Alex, Endert; Sanyal, Jibonananda

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  17. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE PAGES

    Ragan, Eric; Alex, Endert; Sanyal, Jibonananda; ...

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  18. Laser welding of removable partial denture frameworks.

    PubMed

    Brudvik, James S; Lee, Seungbum; Croshaw, Steve N; Reimers, Donald L; Reimers, Dave L

    2008-01-01

    To identify and measure distortions inherent in the casting process of a Class III mandibular cobalt-chromium (Co-Cr) framework to illustrate the problems faced by the laboratory technician and the clinician and to measure the changes that occur during the correction of the fit discrepancy using laser welding. Five identical castings of a Co-Cr alloy partial denture casting were made and measured between 3 widely separated points using the x, y, and z adjustments of a Nikon Measurescope. The same measurements were made after each of the following clinical and laboratory procedures: sprue removal, sectioning of the casting into 3 parts through the posterior meshwork, fitting the segments to the master cast, picking up the segments using resin, and laser welding of the 3 segments. Measurements of all 5 castings showed a cross-arch decrease after sprue removal, an increase after fitting the segments to the master cast, and a slight decrease after resin pickup and laser welding. Within the limitations of this study, the findings suggest that precise tooth-frame relations can be established by resin pickup and laser welding of segments of Co-Cr removable partial denture frameworks.

  19. No special K! A signal detection framework for the strategic regulation of memory accuracy.

    PubMed

    Higham, Philip A

    2007-02-01

    Two experiments investigated criterion setting and metacognitive processes underlying the strategic regulation of accuracy on the Scholastic Aptitude Test (SAT) using Type-2 signal detection theory (SDT). In Experiment 1, report bias was manipulated by penalizing participants either 0.25 (low incentive) or 4 (high incentive) points for each error. Best guesses to unanswered items were obtained so that Type-2 signal detection indices of discrimination and bias could be calculated. The same incentive manipulation was used in Experiment 2, only the test was computerized, confidence ratings were taken so that receiver operating characteristic (ROC) curves could be generated, and feedback was manipulated. The results of both experiments demonstrated that SDT provides a viable alternative to A. Koriat and M. Goldsmith's (1996c) framework of monitoring and control and reveals information about the regulation of accuracy that their framework does not. For example, ROC analysis indicated that the threshold model implied by formula scoring is inadequate. Instead, performance on the SAT should be modeled with an equal-variance Gaussian, Type-2 signal detection model. ((c) 2007 APA, all rights reserved).
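
    The Type-2 indices referred to above can be computed from report decisions under the standard equal-variance Gaussian model; the sketch below is a generic SDT calculation (with the usual 0.5 correction), not the authors' analysis code, and the counts in the example are invented.

```python
from scipy.stats import norm

def type2_sdt(hits, misses, false_alarms, correct_rejections):
    """Equal-variance Gaussian Type-2 SDT indices from report decisions.

    'Signal' = the candidate answer is correct; 'report' = the participant volunteers it.
    A 0.5 correction is applied to avoid infinite z-scores at rates of 0 or 1.
    """
    h = (hits + 0.5) / (hits + misses + 1.0)                              # hit rate
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)  # false-alarm rate
    d_prime = norm.ppf(h) - norm.ppf(f)              # Type-2 discrimination
    criterion = -0.5 * (norm.ppf(h) + norm.ppf(f))   # Type-2 report bias
    return d_prime, criterion

print(type2_sdt(hits=60, misses=15, false_alarms=10, correct_rejections=40))
```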

  20. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.

    PubMed

    Elgendi, Mohamed

    2016-11-02

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks in order to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to for the first time as two event-related moving averages ("TERMA"), detects events in biomedical signals using event-related moving averages. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. Results recommend that the window sizes of the two moving averages (W1 and W2) follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
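
    The core two-moving-average idea can be sketched as follows: a short event-scale average and a longer cycle-scale average are compared, and samples where the short average exceeds the long one form candidate event blocks. Window sizes below respect the stated inequality but are otherwise arbitrary; the offsets and thresholds of the published TERMA method are omitted, so this is an illustration, not the published algorithm.

```python
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def two_ma_event_blocks(signal, w1=11, w2=33):
    """Candidate event blocks where the short (event-scale) moving average
    exceeds the long (cycle-scale) one; here 2*w1 <= w2 <= 8*w1."""
    ma_event = moving_average(np.abs(signal), w1)
    ma_cycle = moving_average(np.abs(signal), w2)
    blocks = ma_event > ma_cycle                 # boolean "blocks of interest"
    edges = np.diff(blocks.astype(int))          # locate block boundaries
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if blocks[0]:
        starts = np.r_[0, starts]
    if blocks[-1]:
        ends = np.r_[ends, len(blocks)]
    return list(zip(starts, ends))

t = np.linspace(0, 1, 1000)
sig = np.sin(2 * np.pi * 1.2 * t) ** 20 + 0.05 * np.random.default_rng(0).standard_normal(1000)
print(two_ma_event_blocks(sig)[:3])
```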

  1. A view of the tip of the iceberg: revisiting conceptual continuities and their implications for science learning

    NASA Astrophysics Data System (ADS)

    Brown, Bryan A.; Kloser, Matt

    2009-12-01

    We respond to Hwang and Kim and Yeo's critiques of the conceptual continuity framework in science education. First, we address the criticism that the analysis fails to recognize the situated perspective of learning by denying the dichotomy of formal and informal knowledge as a starting point in the learning process. Second, we address the critique that students' descriptions fail to meet the "gold standard" of science education—alignment with an authoritative source and generalizability—by highlighting some student-expert congruence that could serve as the foundation for future learning. Third, we address the critique that a conceptual continuity framework could lead to less rigorous science education goals by arguing that the ultimate goals do not change; rather, if the pathways that lead to achieving those goals recognize existing lexical continuities, science teaching may become more efficient. In sum, we argue that a conceptual continuities framework provides an asset-based, not deficit-based, lexical perspective from which science teacher educators and science educators can begin to address and build complete science understandings.

  2. Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis

    NASA Astrophysics Data System (ADS)

    Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles

    2018-03-01

    We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained i) from the Gaussian density stemming from the asymptotic normality theorem of the maximum likelihood and ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database, covering a large region of 100,000 km² in southern France with contrasting rainfall regimes, in order to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated in the range 3 h-120 h. We show that i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, ii) the posterior and the bootstrap densities are able to better adjust uncertainty estimation to the data than the Gaussian density, and iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Therefore our recommendation goes towards the use of the Bayesian framework to compute uncertainty.
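
    A sketch of the frequentist bootstrap branch alone is shown below: fit a GEV to annual (or block) maxima and bootstrap a percentile confidence interval for a return level. The simple-scaling structure across durations and the Bayesian branch are omitted, and the data are synthetic.

```python
import numpy as np
from scipy.stats import genextreme

def return_level(sample, period_years):
    """GEV return level for a given return period, fitted to a sample of maxima."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1.0 - 1.0 / period_years, c, loc=loc, scale=scale)

def bootstrap_ci(sample, period_years=100, n_boot=500, alpha=0.1, seed=0):
    """Percentile bootstrap confidence interval for the return level."""
    rng = np.random.default_rng(seed)
    levels = [return_level(rng.choice(sample, size=len(sample), replace=True),
                           period_years)
              for _ in range(n_boot)]
    return np.quantile(levels, [alpha / 2, 1 - alpha / 2])

maxima = genextreme.rvs(-0.1, loc=30.0, scale=8.0, size=40, random_state=1)  # synthetic maxima
print("100-year level:", round(return_level(maxima, 100), 1),
      "90% CI:", np.round(bootstrap_ci(maxima), 1))
```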

  3. Large Scale Textured Mesh Reconstruction from Mobile Mapping Images and LIDAR Scans

    NASA Astrophysics Data System (ADS)

    Boussaha, M.; Vallet, B.; Rives, P.

    2018-05-01

    The representation of 3D geometric and photometric information of the real world is one of the most challenging and extensively studied research topics in the photogrammetry and robotics communities. In this paper, we present a fully automatic framework for 3D high-quality, large-scale urban texture mapping using oriented images and LiDAR scans acquired by a terrestrial Mobile Mapping System (MMS). First, the acquired points and images are sliced into temporal chunks, ensuring a reasonable size and time consistency between geometry (points) and photometry (images). Then, a simple, fast and scalable 3D surface reconstruction relying on the sensor space topology is performed on each chunk after an isotropic sampling of the point cloud obtained from the raw LiDAR scans. Finally, the algorithm proposed in (Waechter et al., 2014) is adapted to texture the reconstructed surface with the images acquired simultaneously, ensuring a high-quality texture with no seams and global color adjustment. We evaluate our full pipeline on a dataset of 17 km of acquisition in Rouen, France, resulting in nearly 2 billion points and 40,000 full-HD images. We are able to reconstruct and texture the whole acquisition in less than 30 computing hours; the entire process is highly parallel, as each chunk can be processed independently in a separate thread or computer.

  4. A NEW METHOD FOR DERIVING THE STELLAR BIRTH FUNCTION OF RESOLVED STELLAR POPULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gennaro, M.; Brown, T. M.; Gordon, K. D.

    We present a new method for deriving the stellar birth function (SBF) of resolved stellar populations. The SBF (stars born per unit mass, time, and metallicity) is the combination of the initial mass function (IMF), the star formation history (SFH), and the metallicity distribution function (MDF). The framework of our analysis is that of Poisson Point Processes (PPPs), a class of statistical models suitable when dealing with points (stars) in a multidimensional space (the measurement space of multiple photometric bands). The theory of PPPs easily accommodates the modeling of measurement errors as well as that of incompleteness. Our method avoids binning stars in the color–magnitude diagram and uses the whole likelihood function for each data point; combining the individual likelihoods allows the computation of the posterior probability for the population's SBF. Within the proposed framework it is possible to include nuisance parameters, such as distance and extinction, by specifying their prior distributions and marginalizing over them. The aim of this paper is to assess the validity of this new approach under a range of assumptions, using only simulated data. Forthcoming work will show applications to real data. Although it has a broad scope of possible applications, we have developed this method to study multi-band Hubble Space Telescope observations of the Milky Way Bulge. Therefore we will focus on simulations with characteristics similar to those of the Galactic Bulge.
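
    For reference, the unbinned likelihood that underlies such a framework is the standard inhomogeneous Poisson point process likelihood; the notation below is generic to the PPP literature and not necessarily the paper's own symbols.

```latex
% Likelihood of observed stars x_1,...,x_N in measurement space M under an
% inhomogeneous Poisson point process with intensity \lambda_\theta(x),
% where \theta parameterizes the stellar birth function (IMF x SFH x MDF):
\mathcal{L}(\theta) \;=\;
\exp\!\left(-\int_{M} \lambda_{\theta}(x)\, \mathrm{d}x\right)\,
\prod_{i=1}^{N} \lambda_{\theta}(x_i)
```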

  5. Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains.

    PubMed

    Pillow, Jonathan W; Ahmadian, Yashar; Paninski, Liam

    2011-01-01

    One of the central problems in systems neuroscience is to understand how neural spike trains convey sensory information. Decoding methods, which provide an explicit means for reading out the information contained in neural spike responses, offer a powerful set of tools for studying the neural coding problem. Here we develop several decoding methods based on point-process neural encoding models, or forward models that predict spike responses to stimuli. These models have concave log-likelihood functions, which allow efficient maximum-likelihood model fitting and stimulus decoding. We present several applications of the encoding model framework to the problem of decoding stimulus information from population spike responses: (1) a tractable algorithm for computing the maximum a posteriori (MAP) estimate of the stimulus, the most probable stimulus to have generated an observed single- or multiple-neuron spike train response, given some prior distribution over the stimulus; (2) a gaussian approximation to the posterior stimulus distribution that can be used to quantify the fidelity with which various stimulus features are encoded; (3) an efficient method for estimating the mutual information between the stimulus and the spike trains emitted by a neural population; and (4) a framework for the detection of change-point times (the time at which the stimulus undergoes a change in mean or variance) by marginalizing over the posterior stimulus distribution. We provide several examples illustrating the performance of these estimators with simulated and real neural data.
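
    A minimal sketch of item (1), the MAP decoder, is given below under assumed ingredients: an exponential-nonlinearity Poisson encoding model (which keeps the log-likelihood concave) and a Gaussian prior over the stimulus. The weights and data are synthetic placeholders, not a fitted model from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def map_decode(spike_counts, weights, prior_var=1.0, dt=0.01):
    """MAP estimate of a stimulus vector s from population spike counts.

    Assumed encoding model: y_i ~ Poisson(dt * exp(w_i . s)), prior s ~ N(0, prior_var I).
    The negative log posterior is convex, so a generic optimizer suffices.
    """
    def neg_log_post(s):
        rates = dt * np.exp(weights @ s)
        loglik = np.sum(spike_counts * np.log(rates) - rates)
        logprior = -0.5 * np.dot(s, s) / prior_var
        return -(loglik + logprior)
    s0 = np.zeros(weights.shape[1])
    return minimize(neg_log_post, s0, method="BFGS").x

rng = np.random.default_rng(0)
W = rng.standard_normal((50, 5))          # 50 neurons, 5-dimensional stimulus
s_true = rng.standard_normal(5)
counts = rng.poisson(0.01 * np.exp(W @ s_true))
print("true:", np.round(s_true, 2), "MAP:", np.round(map_decode(counts, W), 2))
```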

  6. nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab

    PubMed Central

    Cajigas, I.; Malik, W.Q.; Brown, E.N.

    2012-01-01

    Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point process - Generalized Linear Model (PPGLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms together with problem-specific modifications required for their use, limit wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT – an open source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and for decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
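
    One of the basic operations such a toolbox provides, the peri-stimulus time histogram, can be sketched with plain NumPy; this is not nSTAT's own class interface, just the underlying computation on hypothetical spike-time arrays.

```python
import numpy as np

def psth(spike_times_per_trial, t_start, t_stop, bin_width):
    """Peri-stimulus time histogram: mean firing rate per bin across trials (Hz)."""
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for spikes in spike_times_per_trial:
        counts += np.histogram(spikes, bins=edges)[0]
    rate = counts / (len(spike_times_per_trial) * bin_width)
    return edges[:-1], rate

# ten simulated trials of spike times in a 1-second window
trials = [np.sort(np.random.default_rng(i).uniform(0, 1, 20)) for i in range(10)]
bin_left, rate = psth(trials, t_start=0.0, t_stop=1.0, bin_width=0.05)
print(rate)
```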

  7. A new statistical time-dependent model of earthquake occurrence: failure processes driven by a self-correcting model

    NASA Astrophysics Data System (ADS)

    Rotondi, Renata; Varini, Elisa

    2016-04-01

    The long-term recurrence of strong earthquakes is often modelled by the stationary Poisson process for the sake of simplicity, although renewal and self-correcting point processes (with non-decreasing hazard functions) are more appropriate. Short-term models mainly fit earthquake clusters due to the tendency of an earthquake to trigger other earthquakes; in this case, self-exciting point processes with non-increasing hazard are especially suitable. In order to provide a unified framework for analyzing earthquake catalogs, Schoenberg and Bolt proposed the SELC (Short-term Exciting Long-term Correcting) model (BSSA, 2000) and Varini employed a state-space model for estimating the different phases of a seismic cycle (PhD Thesis, 2005). Both attempts are combinations of long- and short-term models, but results are not completely satisfactory, due to the different scales at which these models appear to operate. In this study, we split a seismic sequence into two groups: the leader events, whose magnitude exceeds a threshold magnitude, and the remaining ones, considered as subordinate events. The leader events are assumed to follow a well-known self-correcting point process named the stress release model (Vere-Jones, J. Phys. Earth, 1978; Bebbington & Harte, GJI, 2003; Varini & Rotondi, Env. Ecol. Stat., 2015). In the interval between two subsequent leader events, subordinate events are expected to cluster at the beginning (aftershocks) and at the end (foreshocks) of that interval; hence, they are modeled by failure processes that allow a bathtub-shaped hazard function. In particular, we have examined the generalized Weibull distributions, a large family that contains distributions with different bathtub-shaped hazards as well as the standard Weibull distribution (Lai, Springer, 2014). The model is fitted to a dataset of Italian historical earthquakes and the results of Bayesian inference are shown.
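
    For orientation, the stress release model used for the leader events has the standard form below, taken from the cited literature: stress builds linearly between events and drops with each release, and the conditional intensity is exponential in the current stress level. The symbols are the conventional ones, not necessarily those of the authors.

```latex
% Stress release model: X(t) is the current stress level, \rho the loading rate,
% S(t) the cumulative stress released by past leader events.
X(t) = X(0) + \rho\, t - S(t),
\qquad
\lambda(t) = \exp\{\alpha + \beta\, X(t)\}
```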

  8. Cortical Surface Registration for Image-Guided Neurosurgery Using Laser-Range Scanning

    PubMed Central

    Sinha, Tuhin K.; Cash, David M.; Galloway, Robert L.; Weil, Robert J.

    2013-01-01

    In this paper, a method of acquiring intraoperative data using a laser range scanner (LRS) is presented within the context of model-updated image-guided surgery. Registering textured point clouds generated by the LRS to tomographic data is explored using established point-based and surface techniques as well as a novel method that incorporates geometry and intensity information via mutual information (SurfaceMI). Phantom registration studies were performed to examine accuracy and robustness for each framework. In addition, an in vivo registration is performed to demonstrate feasibility of the data acquisition system in the operating room. Results indicate that SurfaceMI performed better in many cases than point-based (PBR) and iterative closest point (ICP) methods for registration of textured point clouds. Mean target registration error (TRE) for simulated deep tissue targets in a phantom was 1.0 ± 0.2, 2.0 ± 0.3, and 1.2 ± 0.3 mm for PBR, ICP, and SurfaceMI, respectively. With regard to in vivo registration, the mean TRE of vessel contour points for each framework was 1.9 ± 1.0, 0.9 ± 0.6, and 1.3 ± 0.5 for PBR, ICP, and SurfaceMI, respectively. The methods discussed in this paper, in conjunction with the quantitative data, provide impetus for using LRS technology within the model-updated image-guided surgery framework. PMID:12906252
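
    As a point of reference for the ICP baseline compared above, a bare-bones rigid point-to-point ICP iteration (nearest neighbours plus SVD alignment) is sketched below. It ignores the intensity term of SurfaceMI and is not the authors' implementation; the demo data are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    U, _, Vt = np.linalg.svd((src - mu_s).T @ (dst - mu_d))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_d - R @ mu_s

def icp(source, target, n_iter=30):
    """Minimal point-to-point ICP: returns the transformed source cloud."""
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(n_iter):
        _, idx = tree.query(src)             # closest target point for each source point
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    return src

rng = np.random.default_rng(0)
target = rng.uniform(size=(500, 3))
source = target[:400] + np.array([0.05, -0.02, 0.01])   # shifted subset of the target
print(np.abs(icp(source, target) - target[:400]).max())  # residual should shrink towards 0
```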

  9. Overgrazing- How far are we from passing the tipping point of turning our rangelands into desert?

    USDA-ARS?s Scientific Manuscript database

    Ecological science, particularly with regard to Mongolian rangelands, is not able to estimate when a tipping point will be passed. Nonetheless, it does provide a framework for responding to the threat of desertification tipping points. ...

  10. Using a Malcolm Baldrige framework to understand high-performing clinical microsystems.

    PubMed

    Foster, Tina C; Johnson, Julie K; Nelson, Eugene C; Batalden, Paul B

    2007-10-01

    BACKGROUND, OBJECTIVES AND METHOD: The Malcolm Baldrige National Quality Award (MBNQA) provides a set of criteria for organisational quality assessment and improvement that has been used by thousands of business, healthcare and educational organisations for more than a decade. The criteria can be used as a tool for self-evaluation, and are widely recognised as a robust framework for design and evaluation of healthcare systems. The clinical microsystem, as an organisational construct, is a systems approach for providing clinical care based on theories from organisational development, leadership and improvement. This study compared the MBNQA criteria for healthcare and the success factors of high-performing clinical microsystems to (1) determine whether microsystem success characteristics cover the same range of issues addressed by the Baldrige criteria and (2) examine whether this comparison might better inform our understanding of either framework. Both Baldrige criteria and microsystem success characteristics cover a wide range of areas crucial to high performance. Those particularly called out by this analysis are organisational leadership, work systems and service processes from a Baldrige standpoint, and leadership, performance results, process improvement, and information and information technology from the microsystem success characteristics view. Although in many cases the relationship between Baldrige criteria and microsystem success characteristics are obvious, in others the analysis points to ways in which the Baldrige criteria might be better understood and worked with by a microsystem through the design of work systems and a deep understanding of processes. Several tools are available for those who wish to engage in self-assessment based on MBNQA criteria and microsystem characteristics.

  11. Using a Malcolm Baldrige framework to understand high‐performing clinical microsystems

    PubMed Central

    Foster, Tina C; Johnson, Julie K; Nelson, Eugene C; Batalden, Paul B

    2007-01-01

    Background, objectives and method The Malcolm Baldrige National Quality Award (MBNQA) provides a set of criteria for organisational quality assessment and improvement that has been used by thousands of business, healthcare and educational organisations for more than a decade. The criteria can be used as a tool for self‐evaluation, and are widely recognised as a robust framework for design and evaluation of healthcare systems. The clinical microsystem, as an organisational construct, is a systems approach for providing clinical care based on theories from organisational development, leadership and improvement. This study compared the MBNQA criteria for healthcare and the success factors of high‐performing clinical microsystems to (1) determine whether microsystem success characteristics cover the same range of issues addressed by the Baldrige criteria and (2) examine whether this comparison might better inform our understanding of either framework. Results and conclusions Both Baldrige criteria and microsystem success characteristics cover a wide range of areas crucial to high performance. Those particularly called out by this analysis are organisational leadership, work systems and service processes from a Baldrige standpoint, and leadership, performance results, process improvement, and information and information technology from the microsystem success characteristics view. Although in many cases the relationship between Baldrige criteria and microsystem success characteristics are obvious, in others the analysis points to ways in which the Baldrige criteria might be better understood and worked with by a microsystem through the design of work systems and a deep understanding of processes. Several tools are available for those who wish to engage in self‐assessment based on MBNQA criteria and microsystem characteristics. PMID:17913773

  12. Marginal and internal fit of curved anterior CAD/CAM-milled zirconia fixed dental prostheses: an in-vitro study.

    PubMed

    Büchi, Dominik L; Ebler, Sabine; Hämmerle, Christoph H F; Sailer, Irena

    2014-01-01

    To test whether or not different types of CAD/CAM systems, processing zirconia in the densely sintered or in the pre-sintered stage, lead to differences in the accuracy of 4-unit anterior fixed dental prosthesis (FDP) frameworks, and to evaluate their efficiency. Forty curved anterior 4-unit FDP frameworks were manufactured with four different CAD/CAM systems: DCS Precident (DCS) (control group), Cercon (DeguDent) (test group 1), Cerec InLab (Sirona) (test group 2), Kavo Everest (Kavo) (test group 3). The DCS system was chosen as the control group because its zirconia frameworks are processed in the densely sintered stage and there is no shrinkage of the zirconia during the manufacturing process. The initial fit of the frameworks was checked and adjusted to a subjectively similar level of accuracy by one dental technician, and the time taken for this was recorded. After cementation, the frameworks were embedded into resin and the abutment teeth were cut in mesiodistal and orobuccal directions in four specimens. The thickness of the cement gap was measured at 50× (internal adaptation) and 200× (marginal adaptation) magnification. The measurement of the accuracy was performed at four sites. Site 1: marginal adaptation, the marginal opening at the point of closest perpendicular approximation between the die and framework margin. Site 2: internal adaptation at the chamfer. Site 3: internal adaptation at the axial wall. Site 4: internal adaptation in the occlusal area. The data were analyzed descriptively using the ANOVA and Bonferroni/Dunn tests. The mean marginal adaptation (site 1) of the control group was 107 ± 26 μm; test group 1, 140 ± 26 μm; test group 2, 104 ± 40 μm; and test group 3, 95 ± 31 μm. Test group 1 showed a tendency to exhibit larger marginal gaps than the other groups; however, this difference was only significant when test groups 1 and 3 were compared (P = .0022; Bonferroni/Dunn test). Significantly more time was needed for the adjustment of the frameworks of test group 1 compared to the other test groups and the control group (21.1 min vs 3.8 min) (P < .0001; Bonferroni/Dunn test). For the adjustment of the frameworks of test groups 2 and 3, the same time was needed as for the frameworks of the control group. No differences in framework accuracy resulting from the different CAM and CAD/CAM procedures were found; however, this held only after adjustment of the fit by an experienced dental technician. Hence, the influence of a manual correction of the fit was crucial, and the effort required differed for the tested systems. The CAM system led to lower initial accuracy of the frameworks than the CAD/CAM systems, which may be crucial for the dental laboratory. The stage of the zirconia materials used for the different CAD/CAM procedures, i.e., pre-sintered or densely sintered, exhibited no influence.

  13. Steps in creating a methodology for interpreting a geodiversity element -integrating a geodiversity element in the popular knowledge

    NASA Astrophysics Data System (ADS)

    Toma, Cristina; Andrasanu, Alexandru

    2017-04-01

    Conserving geodiversity, and especially geological heritage, is not as well integrated into general knowledge as biodiversity is, for example. Keeping that in mind, through this research we are trying to find a better way of conveying a geological process to the general public. The means to integrate a geodiversity element into popular knowledge is interpretation. Interpretation "translates" the scientific information into a common language built on facts that are well known to the general public. The purpose of this paper is to create a framework for a methodology for interpreting a geodiversity element - salt - in Buzau Land Geopark. We approach the salt subject through a scheme in order to have a general view of the process and to better understand and explain it to the general public. We look at the subject from three scientific points of view: GEODIVERSITY, ANTHROPOLOGY, and the SOCIO-ECONOMIC aspect. Each of these points of view, or domains, will be divided into themes. GEODIVERSITY will have the following themes: Formation, Accumulation, Diapirism process, Chemical formula, Landscape (here we also include the specific biodiversity with the halophile plants), Landforms, and Hazard. ANTHROPOLOGY will contain themes of tangible and intangible heritage such as: Salt symbolism, Stories and ritual usage, Recipes, and How the knowledge is transmitted. The SOCIO-ECONOMIC aspect will be reflected through themes such as: Extractive methods, Usage, Interdictions, Taxes, and Commercial exchanges. Each theme will have a set of keywords that will be described, and each one will be at the base of the elements that together will form the interpretation of the geodiversity element - the salt. The next step is to clearly set the scope of the interpretation, i.e. to which field of expertise the interpretation process is addressed: Education (undergraduate or post-graduate students), Science, Geotourism, or Entrepreneurship. After putting together the elements derived from the keywords and establishing the purpose of the interpretation, the following step is finding the message to be sent through interpretation. The last step of the framework is finding the proper means to transmit the interpretive message: panels, installations, geo-routes, visitor centres, land art, and virtual/augmented reality. This framework represents a methodology to be followed when interpreting scientific knowledge about a geological process. Thus, this approach - the geodiversity element reflected through its anthropological and socio-economic aspects - should be a successful method for showing the general public how a geological element has influenced their lives, drawing them closer to Earth Sciences.

  14. Studying the clinical encounter with the Adaptive Leadership framework.

    PubMed

    Bailey, Donald E; Docherty, Sharron L; Adams, Judith A; Carthron, Dana L; Corazzini, Kirsten; Day, Jennifer R; Neglia, Elizabeth; Thygeson, Marcus; Anderson, Ruth A

    2012-08-01

    In this paper we discuss the concept of leadership as a personal capability, not contingent on one's position in a hierarchy. This type of leadership allows us to reframe both the care-giving and organizational roles of nurses and other front-line clinical staff. Little research has been done to explore what leadership means at the point of care, particularly in reference to the relationship between health care practitioners and patients and their family caregivers. The Adaptive Leadership framework, based on complexity science theory, provides a useful lens to explore practitioners' leadership behaviors at the point of care. This framework proposes that there are two broad categories of challenges that patients face: technical and adaptive. Whereas technical challenges are addressed with technical solutions that are delivered by practitioners, adaptive challenges require the patient (or family member) to adjust to a new situation and to do the work of adapting, learning, and behavior change. Adaptive leadership is the work that practitioners do to mobilize and support patients to do the adaptive work. The purpose of this paper is to describe this framework and demonstrate its application to nursing research. We demonstrate the framework's utility with five exemplars of nursing research problems that range from the individual to the system levels. The framework has the potential to guide researchers to ask new questions and to gain new insights into how practitioners interact with patients at the point of care to increase the patient's ability to tackle challenging problems and improve their own health care outcomes. It is a potentially powerful framework for developing and testing a new generation of interventions to address complex issues by harnessing and learning about the adaptive capabilities of patients within their life contexts.

  15. A Framework for Conducting ESL/EFL Construct Validation Studies.

    ERIC Educational Resources Information Center

    Mouw, John T.; Perkins, Kyle

    The purpose for which a test is used and the examinees' stage of learning are two anchor points that are incorporated into a suggested framework for conducting construct validation studies for tests of students with English as a second language (ESL) or English as a foreign language (EFL). The framework includes the use of generalizability theory,…

  16. Anderson Acceleration for Fixed-Point Iterations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Homer F.

    The purpose of this grant was to support research on acceleration methods for fixed-point iterations, with applications to computational frameworks and simulation problems that are of interest to DOE.
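
    The record does not include an implementation, so the following is a minimal NumPy sketch of the classic Anderson acceleration scheme for a fixed-point iteration x = g(x); the window size m, the tolerance and the test map (np.cos) are illustrative assumptions, not taken from the report.

    ```python
    import numpy as np

    def anderson_fixed_point(g, x0, m=5, tol=1e-10, max_iter=100):
        """Anderson acceleration for the fixed-point iteration x = g(x).

        g  : callable mapping an ndarray to an ndarray of the same shape
        x0 : initial guess
        m  : window size (number of previous iterates/residuals kept)
        """
        x = np.atleast_1d(np.asarray(x0, dtype=float))
        X, F = [], []                      # histories of iterates and residuals
        for _ in range(max_iter):
            gx = np.atleast_1d(g(x))
            f = gx - x                     # residual of the fixed-point map
            if np.linalg.norm(f) < tol:
                return x
            X.append(x); F.append(f)
            X, F = X[-(m + 1):], F[-(m + 1):]
            if len(F) == 1:
                x = gx                     # plain Picard step on the first pass
                continue
            # Constrained least-squares mixing: minimise ||f_k - sum_j a_j (f_k - f_j)||
            dF = np.column_stack([F[-1] - Fj for Fj in F[:-1]])
            gamma, *_ = np.linalg.lstsq(dF, F[-1], rcond=None)
            dG = np.column_stack([(X[-1] + F[-1]) - (Xj + Fj)
                                  for Xj, Fj in zip(X[:-1], F[:-1])])
            x = (X[-1] + F[-1]) - dG @ gamma
        return x

    # Example: accelerate the contraction x = cos(x) toward its fixed point (~0.739)
    print(anderson_fixed_point(np.cos, x0=[1.0]))
    ```

    Compared with the plain Picard iteration x_{k+1} = g(x_k), the least-squares combination of previous residuals typically cuts the iteration count substantially for contractive maps.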

  17. Multilevel Optimization Framework for Hierarchical Stiffened Shells Accelerated by Adaptive Equivalent Strategy

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Tian, Kuo; Zhao, Haixin; Hao, Peng; Zhu, Tianyu; Zhang, Ke; Ma, Yunlong

    2017-06-01

    In order to improve the post-buckling optimization efficiency of hierarchical stiffened shells, a multilevel optimization framework accelerated by an adaptive equivalent strategy is presented in this paper. Firstly, the Numerical-based Smeared Stiffener Method (NSSM) for hierarchical stiffened shells is derived by means of the numerical implementation of the asymptotic homogenization (NIAH) method. Based on the NSSM, a reasonable adaptive equivalent strategy for hierarchical stiffened shells is developed from the concept of hierarchy reduction. Its core idea is to decide self-adaptively which hierarchy of the structure should be made equivalent, according to the critical buckling mode rapidly predicted by the NSSM. Compared with the detailed model, the high prediction accuracy and efficiency of the proposed model are highlighted. On the basis of this adaptive equivalent model, a multilevel optimization framework is then established by decomposing the complex entire optimization process into major-stiffener-level and minor-stiffener-level sub-optimizations, during which Fixed Point Iteration (FPI) is employed to accelerate convergence. Finally, illustrative examples of the multilevel framework are presented to demonstrate its efficiency and effectiveness in searching for the global optimum, in contrast with the single-level optimization method. Remarkably, the high efficiency and flexibility of the adaptive equivalent strategy are demonstrated by comparison with the single equivalent strategy.

  18. Noise-aware dictionary-learning-based sparse representation framework for detection and removal of single and combined noises from ECG signal

    PubMed Central

    Ramkumar, Barathram; Sabarimalai Manikandan, M.

    2017-01-01

    Automatic electrocardiogram (ECG) signal enhancement has become a crucial pre-processing step in most ECG signal analysis applications. In this Letter, the authors propose an automated noise-aware dictionary learning-based generalised ECG signal enhancement framework which can automatically learn the dictionaries based on the ECG noise type for effective representation of the ECG signal and noises, and can reduce the computational load of a sparse representation-based ECG enhancement system. The proposed framework consists of noise detection and identification, noise-aware dictionary learning, sparse signal decomposition and reconstruction. The noise detection and identification is performed based on the moving average filter, first-order difference, and temporal features such as number of turning points, maximum absolute amplitude, zero-crossings, and autocorrelation features. The representation dictionary is learned based on the type of noise identified in the previous stage. The proposed framework is evaluated using noise-free and noisy ECG signals. Results demonstrate that the proposed method can significantly reduce computational load as compared with conventional dictionary learning-based ECG denoising approaches. Further, comparative results show that the method outperforms existing methods in automatically removing noises such as baseline wander, power-line interference, muscle artefacts and their combinations without distorting the morphological content of local waves of the ECG signal. PMID:28529758
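
    As a rough illustration of the temporal features named above (turning points, maximum absolute amplitude, zero crossings and autocorrelation), the following NumPy sketch computes them on a synthetic segment; the feature definitions, thresholds and test signal are assumptions made for illustration, not the authors' implementation.

    ```python
    import numpy as np

    def temporal_features(x):
        """Compute simple temporal features of a (detrended) signal segment."""
        x = np.asarray(x, dtype=float)
        d = np.diff(x)
        # Turning points: samples where the first difference changes sign
        turning_points = int(np.sum(np.diff(np.sign(d)) != 0))
        max_abs_amplitude = float(np.max(np.abs(x)))
        # Zero crossings: sign changes of the signal itself
        zero_crossings = int(np.sum(np.diff(np.sign(x)) != 0))
        # Lag-1 autocorrelation coefficient
        xc = x - x.mean()
        autocorr_lag1 = float(np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc))
        return {"turning_points": turning_points,
                "max_abs_amplitude": max_abs_amplitude,
                "zero_crossings": zero_crossings,
                "autocorr_lag1": autocorr_lag1}

    # Illustration on a synthetic segment: a clean sinusoid vs. one with added white noise
    t = np.linspace(0, 1, 360)
    clean = np.sin(2 * np.pi * 1.7 * t)
    noisy = clean + 0.4 * np.random.default_rng(0).normal(size=t.size)
    print(temporal_features(clean))
    print(temporal_features(noisy))
    ```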

  19. A theoretical framework for holistic hospital management in the Japanese healthcare context.

    PubMed

    Liu, Hu-Chen

    2013-11-01

    This paper develops a conceptual framework for performance measurement as a pilot study on holistic hospital management in the Japanese healthcare context. We primarily used two data sources, as well as expert statements obtained through interviews: a systematic review of the literature and a questionnaire survey of healthcare experts. The systematic review searched PubMed and PubMed Central, and 24 relevant papers were retrieved. The expert questionnaire asked respondents to rate the degree of "usefulness" of each of 66 indicators on a three-point scale. Applying the theoretical framework, a minimum set of performance indicators was selected for holistic hospital management that fits the healthcare context in Japan well. This indicator set comprised 35 individual indicators and several factors measured through questionnaire surveys. The indicators were confirmed by expert judgments from the viewpoints of face, content and construct validity as well as their usefulness. A theoretical framework of performance measurement was established from primary healthcare stakeholders' perspectives. Performance indicators were largely divided into healthcare outcomes and performance shaping factors. Indicators in the former category may be applied for the detection of operational problems, while their latent causes can be effectively addressed by the latter category in terms of process, structure and culture/climate within the organization. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  20. Noise-aware dictionary-learning-based sparse representation framework for detection and removal of single and combined noises from ECG signal.

    PubMed

    Satija, Udit; Ramkumar, Barathram; Sabarimalai Manikandan, M

    2017-02-01

    Automatic electrocardiogram (ECG) signal enhancement has become a crucial pre-processing step in most ECG signal analysis applications. In this Letter, the authors propose an automated noise-aware dictionary learning-based generalised ECG signal enhancement framework which can automatically learn the dictionaries based on the ECG noise type for effective representation of the ECG signal and noises, and can reduce the computational load of a sparse representation-based ECG enhancement system. The proposed framework consists of noise detection and identification, noise-aware dictionary learning, sparse signal decomposition and reconstruction. The noise detection and identification is performed based on the moving average filter, first-order difference, and temporal features such as number of turning points, maximum absolute amplitude, zero-crossings, and autocorrelation features. The representation dictionary is learned based on the type of noise identified in the previous stage. The proposed framework is evaluated using noise-free and noisy ECG signals. Results demonstrate that the proposed method can significantly reduce computational load as compared with conventional dictionary learning-based ECG denoising approaches. Further, comparative results show that the method outperforms existing methods in automatically removing noises such as baseline wander, power-line interference, muscle artefacts and their combinations without distorting the morphological content of local waves of the ECG signal.

  1. VIP: Vortex Image Processing Package for High-contrast Direct Imaging

    NASA Astrophysics Data System (ADS)

    Gomez Gonzalez, Carlos Alberto; Wertz, Olivier; Absil, Olivier; Christiaens, Valentin; Defrère, Denis; Mawet, Dimitri; Milli, Julien; Absil, Pierre-Antoine; Van Droogenbroeck, Marc; Cantalloube, Faustine; Hinz, Philip M.; Skemer, Andrew J.; Karlsson, Mikael; Surdej, Jean

    2017-07-01

    We present the Vortex Image Processing (VIP) library, a python package dedicated to astronomical high-contrast imaging. Our package relies on the extensive python stack of scientific libraries and aims to provide a flexible framework for high-contrast data and image processing. In this paper, we describe the capabilities of VIP related to processing image sequences acquired using the angular differential imaging (ADI) observing technique. VIP implements functionalities for building high-contrast data processing pipelines, encompassing pre- and post-processing algorithms, potential source position and flux estimation, and sensitivity curve generation. Among the reference point-spread function subtraction techniques for ADI post-processing, VIP includes several flavors of principal component analysis (PCA) based algorithms, such as annular PCA and incremental PCA algorithms capable of processing big datacubes (of several gigabytes) on a computer with limited memory. Also, we present a novel ADI algorithm based on non-negative matrix factorization, which comes from the same family of low-rank matrix approximations as PCA and provides fairly similar results. We showcase the ADI capabilities of the VIP library using a deep sequence on HR 8799 taken with the LBTI/LMIRCam and its recently commissioned L-band vortex coronagraph. Using VIP, we investigated the presence of additional companions around HR 8799 and did not find any significant additional point source beyond the four known planets. VIP is available at http://github.com/vortex-exoplanet/VIP and is accompanied by Jupyter notebook tutorials illustrating the main functionalities of the library.
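
    To illustrate the core idea behind the PCA-based PSF subtraction used in ADI post-processing, here is a bare-bones NumPy sketch; this is not the VIP API, and the function name, toy cube and number of components are illustrative assumptions.

    ```python
    import numpy as np

    def pca_psf_subtract(cube, n_comp=5):
        """Subtract a low-rank PCA model of the stellar PSF from each frame.

        cube   : ndarray of shape (n_frames, ny, nx), an ADI image sequence
        n_comp : number of principal components used for the PSF model
        Returns the residual cube (same shape as the input).
        """
        n_frames, ny, nx = cube.shape
        X = cube.reshape(n_frames, -1)
        X = X - X.mean(axis=0)              # remove the mean frame
        # Principal components of the frame stack via SVD
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        basis = Vt[:n_comp]                 # (n_comp, ny*nx)
        model = (X @ basis.T) @ basis       # projection onto the PC subspace
        residuals = X - model
        return residuals.reshape(n_frames, ny, nx)

    # Toy example: random frames standing in for an ADI cube
    cube = np.random.default_rng(1).normal(size=(20, 64, 64))
    print(pca_psf_subtract(cube, n_comp=5).shape)
    ```

    In a real ADI reduction the residual frames would subsequently be derotated to a common sky orientation and combined, which is where the pipeline functionality of a package like VIP comes in.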

  2. On Looking into the Black Box: Prospects and Limits in the Search for Mental Models

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.; Morris, N. M.

    1984-01-01

    To place the arguments advanced in this paper in context, alternative points of view with regard to mental models are reviewed. Use of the construct in areas such as neural information processing, manual control, decision making, problem solving, and cognitive science is discussed. Also reviewed are several taxonomies of mental models. The available empirical evidence for answering questions concerning the nature and usage of mental models is then discussed. A variety of studies are reviewed where the type and form of humans' knowledge have been manipulated. Also considered are numerous transfer-of-training studies whose results provide indirect evidence of the nature of mental models. The alternative perspectives considered and the spectrum of empirical evidence are combined to suggest a framework within which research on mental models can be viewed. By considering interactions of dimensions of this framework, the most salient unanswered questions can be identified.

  3. Sonority as variation: A study about the conceptualization of physical notions in university students

    NASA Astrophysics Data System (ADS)

    Escudero, Consuelo; Jaime, Eduardo A.

    2007-11-01

    Results of research on the conceptions and specific competencies of university students regarding acoustic waves and their conceptualization are put forward in this paper. The starting point is a theoretical scheme developed previously [4] [5] that allows the linking and interconnection of theoretical contributions related to cognitive psychology, developmental psychology, problem solving, the linguistic and symbolic representation of concepts, and their relation to didactics. The corpus is made up mainly of answers to written assignments, which have allowed us to analyze the implicit conceptions of students, especially those ignored or misunderstood by them. This is qualitative research, in which data are grouped into categories that are not given in advance by the theoretical framework. Conclusions show the potential of the theoretical framework for interpreting processes of meaning building around the notion of sonority as variation, and for the design and improvement of instructional proposals aimed at achieving critical meaningful learning.

  4. Comprehensive preclinical evaluation of a multi-physics model of liver tumor radiofrequency ablation.

    PubMed

    Audigier, Chloé; Mansi, Tommaso; Delingette, Hervé; Rapaka, Saikiran; Passerini, Tiziano; Mihalef, Viorel; Jolly, Marie-Pierre; Pop, Raoul; Diana, Michele; Soler, Luc; Kamen, Ali; Comaniciu, Dorin; Ayache, Nicholas

    2017-09-01

    We aim at developing a framework for the validation of a subject-specific multi-physics model of liver tumor radiofrequency ablation (RFA). The RFA computation becomes subject specific after several levels of personalization: geometrical and biophysical (hemodynamics, heat transfer and an extended cellular necrosis model). We present a comprehensive experimental setup combining multimodal, pre- and postoperative anatomical and functional images, as well as the interventional monitoring of intra-operative signals: the temperature and delivered power. To exploit this dataset, an efficient processing pipeline is introduced, which copes with image noise, variable resolution and anisotropy. The validation study includes twelve ablations from five healthy pig livers: a mean point-to-mesh error between predicted and actual ablation extent of 5.3 ± 3.6 mm is achieved. This enables an end-to-end preclinical validation framework that considers the available dataset.
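
    The reported validation metric is a point-to-mesh error between predicted and actual ablation extents. A simple proxy, sketched below, approximates it as the distance from each predicted surface point to the nearest vertex of the reference mesh using a k-d tree; this is an illustrative simplification (a true point-to-mesh distance would use point-to-triangle projections), and the toy data are synthetic.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def point_to_vertex_error(predicted_points, reference_vertices):
        """Approximate point-to-mesh error by the distance from each predicted
        boundary point to the nearest vertex of the reference (ground-truth) mesh."""
        tree = cKDTree(reference_vertices)
        d, _ = tree.query(predicted_points)
        return d.mean(), d.std()

    # Toy illustration with random point sets standing in for ablation extents
    rng = np.random.default_rng(0)
    reference = rng.normal(size=(5000, 3))
    predicted = reference[:800] + rng.normal(scale=0.5, size=(800, 3))
    mean_err, std_err = point_to_vertex_error(predicted, reference)
    print(f"{mean_err:.2f} ± {std_err:.2f} (same units as the input coordinates)")
    ```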

  5. A Conceptual framework of Strategy, Structure and Innovative Behaviour for the Development of a Dynamic Simulation Model

    NASA Astrophysics Data System (ADS)

    Konstantopoulos, Nikolaos; Trivellas, Panagiotis; Reklitis, Panagiotis

    2007-12-01

    According to many researchers of organizational theory, a great number of problems encountered by the manufacturing firms are due to their failure to foster innovative behaviour by aligning business strategy and structure. From this point of view, the fit between strategy and structure is essential in order to facilitate firms' innovative behaviour. In the present paper, we adopt Porter's typology to operationalise business strategy (cost leadership, innovative and marketing differentiation, and focus). Organizational structure is built on four dimensions (centralization, formalization, complexity and employees' initiatives to implement new ideas). Innovativeness is measured as product innovation, process and technological innovation. This study provides the necessary theoretical framework for the development of a dynamic simulation method, although the simulation of social events is a quite difficult task, considering that there are so many alternatives (not all well understood).

  6. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction.

    PubMed

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfactory. PCA has been extended into kernel PCA in order to capture the higher-order statistics. However, thus far no researchers have explicitly proposed a kernel FKT (KFKT) or studied its detection performance. For accurately detecting potential small targets from infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and that the proposed framework is capable of automatically detecting and tracking infrared point targets.
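
    The tracking half of the proposed framework relies on Kalman prediction to anticipate where the point target will appear in the next frame. The sketch below shows a generic constant-velocity Kalman predict/update cycle on image coordinates; the state model, noise covariances and simulated detections are illustrative assumptions, not the authors' settings.

    ```python
    import numpy as np

    dt = 1.0
    # Constant-velocity state [x, y, vx, vy]; measurements are the (x, y) position
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = 0.01 * np.eye(4)      # process noise (illustrative)
    R = 1.0 * np.eye(2)       # measurement noise (illustrative)

    x = np.array([0.0, 0.0, 1.0, 0.5])   # initial state
    P = np.eye(4)

    def predict(x, P):
        """Predict the next state; the predicted position can seed the detector's search window."""
        return F @ x, F @ P @ F.T + Q

    def update(x, P, z):
        """Fuse a detected target position z = (x, y) into the state estimate."""
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        return x + K @ y, (np.eye(4) - K @ H) @ P

    for frame in range(5):
        x, P = predict(x, P)
        z = np.array([frame + 1.0, 0.5 * (frame + 1.0)]) + 0.1 * np.random.randn(2)
        x, P = update(x, P, z)
    print(x)
    ```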

  7. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    NASA Astrophysics Data System (ADS)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfactory. PCA has been extended into kernel PCA in order to capture the higher-order statistics. However, thus far no researchers have explicitly proposed a kernel FKT (KFKT) or studied its detection performance. For accurately detecting potential small targets from infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and that the proposed framework is capable of automatically detecting and tracking infrared point targets.

  8. Unraveling dynamics of human physical activity patterns in chronic pain conditions

    NASA Astrophysics Data System (ADS)

    Paraschiv-Ionescu, Anisoara; Buchser, Eric; Aminian, Kamiar

    2013-06-01

    Chronic pain is a complex disabling experience that negatively affects the cognitive, affective and physical functions as well as behavior. Although the interaction between chronic pain and physical functioning is a well-accepted paradigm in clinical research, understanding how pain affects individuals' daily-life behavior remains a challenging task. Here we develop a methodological framework that allows disruptive pain-related interference with real-life physical activity to be documented objectively. The results reveal that meaningful information is contained in the temporal dynamics of activity patterns and that an analytical model based on the theory of bivariate point processes can be used to describe physical activity behavior. The model parameters capture the dynamic interdependence between periods and events and determine a 'signature' of the activity pattern. The study is likely to contribute to the clinical understanding of complex pain/disease-related behaviors and to establish a unified mathematical framework for quantifying the complex dynamics of various human activities.
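
    As a toy illustration of describing activity behaviour with point processes, the snippet below simulates two event streams (activity-bout onsets and within-bout events) as homogeneous Poisson processes and summarises their inter-event intervals; the rates and the independence of the two streams are illustrative assumptions, whereas the paper models their interdependence with a bivariate point process.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def poisson_process(rate, t_max):
        """Simulate event times of a homogeneous Poisson process on [0, t_max]."""
        times, t = [], 0.0
        while True:
            t += rng.exponential(1.0 / rate)
            if t > t_max:
                return np.array(times)
            times.append(t)

    # Two streams standing in for a bivariate process:
    # onsets of activity periods and discrete activity events over a 16-hour day
    onsets = poisson_process(rate=2.0, t_max=16.0)     # ~2 activity bouts per hour
    events = poisson_process(rate=20.0, t_max=16.0)    # ~20 step-like events per hour

    # Inter-event intervals summarise the temporal dynamics of each stream
    for name, s in (("onsets", onsets), ("events", events)):
        iei = np.diff(s)
        print(name, "n =", s.size, "mean IEI =", round(iei.mean(), 3),
              "CV =", round(iei.std() / iei.mean(), 3))
    ```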

  9. An Organizational Learning Framework for Patient Safety.

    PubMed

    Edwards, Marc T

    Despite concerted effort to improve quality and safety, high reliability remains a distant goal. Although this likely reflects the challenge of organizational change, persistent controversy over basic issues suggests that weaknesses in conceptual models may contribute. The essence of operational improvement is organizational learning. This article presents a framework for identifying leverage points for improvement based on organizational learning theory and applies it to an analysis of current practice and controversy. Organizations learn from others, from defects, from measurement, and from mindfulness. These learning modes correspond with contemporary themes of collaboration, no blame for human error, accountability for performance, and managing the unexpected. The collaborative model has dominated improvement efforts. Greater attention to the underdeveloped modes of organizational learning may foster more rapid progress in patient safety by increasing organizational capabilities, strengthening a culture of safety, and fixing more of the process problems that contribute to patient harm.

  10. Implications of climate change mitigation for sustainable development

    NASA Astrophysics Data System (ADS)

    Jakob, Michael; Steckel, Jan Christoph

    2016-10-01

    Evaluating the trade-offs between the risks related to climate change, climate change mitigation as well as co-benefits requires an integrated scenarios approach to sustainable development. We outline a conceptual multi-objective framework to assess climate policies that takes into account climate impacts, mitigation costs, water and food availability, technological risks of nuclear energy and carbon capture and sequestration as well as co-benefits of reducing local air pollution and increasing energy security. This framework is then employed as an example to different climate change mitigation scenarios generated with integrated assessment models. Even though some scenarios encompass considerable challenges for sustainability, no scenario performs better or worse than others in all dimensions, pointing to trade-offs between different dimensions of sustainable development. For this reason, we argue that these trade-offs need to be evaluated in a process of public deliberation that includes all relevant social actors.

  11. Application of System Operational Effectiveness Methodology to Space Launch Vehicle Development and Operations

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Kelley, Gary W.

    2012-01-01

    The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.

  12. Engineering Change Management Method Framework in Mechanical Engineering

    NASA Astrophysics Data System (ADS)

    Stekolschik, Alexander

    2016-11-01

    Engineering changes have an impact on different process chains in and outside the company, and account for most error costs and time shifts. In fact, 30 to 50 per cent of development costs result from technical changes. Controlling engineering change processes can help us to avoid errors and risks, and contributes to cost optimization and a shorter time to market. This paper presents a method framework for controlling engineering changes at mechanical engineering companies. The developed classification of engineering changes and the corresponding process requirements form the basis for the method framework. The developed method framework comprises two main areas: special data objects managed in different engineering IT tools, and a process framework. Objects from both areas are building blocks that can be selected into the overall business process based on the engineering process type and change classification. The process framework contains steps for the creation of change objects (both for the overall change and for parts), change implementation, and release. Companies can select single process building blocks from the framework, depending on the product development process and change impact. The developed change framework has been implemented at a division (10,000 employees) of a large German mechanical engineering company.

  13. Moles: Tool-Assisted Environment Isolation with Closures

    NASA Astrophysics Data System (ADS)

    de Halleux, Jonathan; Tillmann, Nikolai

    Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.

  14. Smart Point Cloud: Definition and Remaining Challenges

    NASA Astrophysics Data System (ADS)

    Poux, F.; Hallot, P.; Neuville, R.; Billen, R.

    2016-10-01

    Dealing with coloured point clouds acquired from terrestrial laser scanners, this paper identifies remaining challenges for a new data structure: the smart point cloud. This concept arises from the observation that massive and discretized spatial information from active remote sensing technology is often underused due to data mining limitations. The generalisation of point cloud data, together with the heterogeneity and temporality of such datasets, is the main issue regarding structure, segmentation, classification, and interaction for an immediate understanding. We propose to use both point cloud properties and human knowledge, through machine learning, to rapidly extract pertinent information, using user-centered information (smart data) rather than raw data. Feature detection, machine learning frameworks and database systems indexed both for mining queries and data visualisation are reviewed. Based on existing approaches, we propose a new flexible three-block framework around device expertise, analytic expertise and domain-based reflection. This contribution serves as the first step towards the realisation of a comprehensive smart point cloud data structure.
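
    One building block of such knowledge-driven point cloud processing is per-point geometric feature extraction that can feed machine learning classifiers. The sketch below computes standard covariance-eigenvalue descriptors (linearity, planarity, sphericity) over k-nearest-neighbour patches; it is a generic illustration, not part of the proposed smart point cloud framework, and the neighbourhood size and toy cloud are assumptions.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def local_eigen_features(points, k=20):
        """Per-point covariance eigenvalues of the k-nearest-neighbour patch,
        from which linearity/planarity/sphericity descriptors are derived."""
        tree = cKDTree(points)
        _, idx = tree.query(points, k=k)
        feats = np.empty((len(points), 3))
        for i, nb in enumerate(idx):
            patch = points[nb] - points[nb].mean(axis=0)
            # eigenvalues of the 3x3 covariance, sorted descending: l1 >= l2 >= l3
            l = np.linalg.eigvalsh(patch.T @ patch / k)[::-1]
            l = np.maximum(l, 1e-12)
            linearity = (l[0] - l[1]) / l[0]
            planarity = (l[1] - l[2]) / l[0]
            sphericity = l[2] / l[0]
            feats[i] = (linearity, planarity, sphericity)
        return feats

    # Toy cloud: a noisy plane, whose points should come out as strongly planar
    rng = np.random.default_rng(3)
    xy = rng.uniform(-1, 1, size=(2000, 2))
    cloud = np.column_stack([xy, 0.01 * rng.normal(size=2000)])
    print(local_eigen_features(cloud, k=20).mean(axis=0))
    ```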

  15. Photogrammetric Processing of Planetary Linear Pushbroom Images Based on Approximate Orthophotos

    NASA Astrophysics Data System (ADS)

    Geng, X.; Xu, Q.; Xing, S.; Hou, Y. F.; Lan, C. Z.; Zhang, J. J.

    2018-04-01

    It is still a highly challenging task to efficiently produce planetary mapping products from orbital remote sensing images. There are many difficulties in photogrammetric processing of planetary stereo images, such as the lack of ground control information and of informative features. Among these, image matching is the most difficult job in planetary photogrammetry. This paper designs a photogrammetric processing framework for planetary remote sensing images based on approximate orthophotos. Both tie point extraction for bundle adjustment and dense image matching for generating digital terrain models (DTMs) are performed on approximate orthophotos. Since most planetary remote sensing images are acquired by linear scanner cameras, we mainly deal with linear pushbroom images. In order to improve the computational efficiency of orthophoto generation and coordinate transformation, a fast back-projection algorithm for linear pushbroom images is introduced. Moreover, an iteratively refined DTM and orthophoto scheme is adopted in the DTM generation process, which helps to reduce the search space of image matching and improve the matching accuracy of conjugate points. With the advantages of approximate orthophotos, the matching results of planetary remote sensing images can be greatly improved. We tested the proposed approach with Mars Express (MEX) High Resolution Stereo Camera (HRSC) and Lunar Reconnaissance Orbiter (LRO) Narrow Angle Camera (NAC) images. The preliminary experimental results demonstrate the feasibility of the proposed approach.

  16. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 1: Frequency analysis

    NASA Astrophysics Data System (ADS)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    We develop a general framework for the frequency analysis of irregularly sampled time series. It is based on the Lomb-Scargle periodogram, but extended to algebraic operators accounting for the presence of a polynomial trend in the model for the data, in addition to a periodic component and a background noise. Special care is devoted to the correlation between the trend and the periodic component. This new periodogram is then cast into the Welch overlapping segment averaging (WOSA) method in order to reduce its variance. We also design a test of significance for the WOSA periodogram, against the background noise. The model for the background noise is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, more general than the classical Gaussian white or red noise processes. CARMA parameters are estimated following a Bayesian framework. We provide algorithms that compute the confidence levels for the WOSA periodogram and fully take into account the uncertainty in the CARMA noise parameters. Alternatively, a theory using point estimates of CARMA parameters provides analytical confidence levels for the WOSA periodogram, which are more accurate than Markov chain Monte Carlo (MCMC) confidence levels and, below some threshold for the number of data points, less costly in computing time. We then estimate the amplitude of the periodic component with least-squares methods, and derive an approximate proportionality between the squared amplitude and the periodogram. This proportionality leads to a new extension for the periodogram: the weighted WOSA periodogram, which we recommend for most frequency analyses with irregularly sampled data. The estimated signal amplitude also permits filtering in a frequency band. Our results generalise and unify methods developed in the fields of geosciences, engineering, astronomy and astrophysics. They also constitute the starting point for an extension to the continuous wavelet transform developed in a companion article (Lenoir and Crucifix, 2018). All the methods presented in this paper are available to the reader in the Python package WAVEPAL.
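
    The starting point of the framework, the Lomb-Scargle periodogram for irregularly sampled series, can be reproduced in a few lines with SciPy; the WOSA averaging, trend handling and CARMA-based significance levels described above are implemented in the WAVEPAL package and are not shown here. The sampling, test period and noise level in this sketch are illustrative.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(7)

    # Irregularly sampled record: one periodic component plus white noise
    t = np.sort(rng.uniform(0, 800, size=300))          # e.g. ages in kyr
    period = 41.0
    y = np.sin(2 * np.pi * t / period) + 0.5 * rng.normal(size=t.size)
    y = y - y.mean()

    # Lomb-Scargle periodogram evaluated on a grid of angular frequencies
    test_periods = np.linspace(10, 200, 2000)
    omega = 2 * np.pi / test_periods
    pgram = lombscargle(t, y, omega)

    print("peak at period ≈", round(test_periods[np.argmax(pgram)], 1), "kyr")
    ```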

  17. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling the sensor observations in near real time and obtaining valuable information from them are challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality is to use real-time, up-to-date air quality information gathered by spatially distributed sensors in mega cities, employing Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of geospatial information systems as a platform for analysing, processing and visualising data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. It is possible to overcome interoperability challenges by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture that shows how to integrate air quality sensor data streams into a geospatial data infrastructure in order to present an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualise sensor observations through the interoperable framework. The system provides capabilities to retrieve SOS observations using WPS in a cascaded service chaining pattern for monitoring trends in timely sensor observations.
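
    The emergency-notification step hinges on converting a pollutant concentration into an Air Quality Index by piecewise-linear interpolation between breakpoints. The sketch below does this for an 8-hour CO concentration using a US EPA-style breakpoint table; the breakpoints and the warning threshold are illustrative assumptions and may differ from the index used in the Tehran system.

    ```python
    # Piecewise-linear AQI for an 8-hour CO concentration (ppm).
    # Breakpoints follow the common US EPA-style table and are illustrative only.
    CO_BREAKPOINTS = [
        (0.0,   4.4,   0,  50),
        (4.5,   9.4,  51, 100),
        (9.5,  12.4, 101, 150),
        (12.5, 15.4, 151, 200),
        (15.5, 30.4, 201, 300),
        (30.5, 40.4, 301, 400),
        (40.5, 50.4, 401, 500),
    ]

    def co_aqi(conc_ppm):
        """Linear interpolation within the breakpoint interval containing conc_ppm."""
        for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
            if c_lo <= conc_ppm <= c_hi:
                return round((i_hi - i_lo) / (c_hi - c_lo) * (conc_ppm - c_lo) + i_lo)
        raise ValueError("concentration outside the breakpoint table")

    def should_warn(conc_ppm, threshold=100):
        """Trigger a warning e-mail when the index exceeds an 'unhealthy' threshold."""
        return co_aqi(conc_ppm) > threshold

    print(co_aqi(7.2), should_warn(7.2), co_aqi(13.0), should_warn(13.0))
    ```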

  18. This is my kidney, I should be able to do with it what I want: towards a legal framework for organ transplants in South Africa.

    PubMed

    Slabbert, Magda

    2012-12-01

    In 2010 illegal kidney transplants performed in South African hospitals were exposed. Living donors (actually sellers) from Brazil and Romania were flown into South Africa, where a kidney was harvested from each and transplanted into Israeli patients. The media reports that followed indicated an outcry against the sale of human kidneys. But by analysing the whole transplantation process from the point of view of each person involved in the transplantation - namely the recipient, the donor, the doctor and the black market in the background - the impression is created that a process of payment for a kidney would be fairer than the current way of procuring organs, either legally or illegally.

  19. Common Approach to Geoprocessing of Uav Data across Application Domains

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Reichardt, M.; Taylor, T.

    2015-08-01

    UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable. But the diversity of UAVs as platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source and open standards.

  20. Setting the stage for habitable planets.

    PubMed

    Gonzalez, Guillermo

    2014-02-21

    Our understanding of the processes that are relevant to the formation and maintenance of habitable planetary systems is advancing at a rapid pace, both from observation and theory. The present review focuses on recent research that bears on this topic and includes discussions of processes occurring in astrophysical, geophysical and climatic contexts, as well as the temporal evolution of planetary habitability. Special attention is given to recent observations of exoplanets and their host stars and the theories proposed to explain the observed trends. Recent theories about the early evolution of the Solar System and how they relate to its habitability are also summarized. Unresolved issues requiring additional research are pointed out, and a framework is provided for estimating the number of habitable planets in the Universe.

  1. Setting the Stage for Habitable Planets

    PubMed Central

    Gonzalez, Guillermo

    2014-01-01

    Our understanding of the processes that are relevant to the formation and maintenance of habitable planetary systems is advancing at a rapid pace, both from observation and theory. The present review focuses on recent research that bears on this topic and includes discussions of processes occurring in astrophysical, geophysical and climatic contexts, as well as the temporal evolution of planetary habitability. Special attention is given to recent observations of exoplanets and their host stars and the theories proposed to explain the observed trends. Recent theories about the early evolution of the Solar System and how they relate to its habitability are also summarized. Unresolved issues requiring additional research are pointed out, and a framework is provided for estimating the number of habitable planets in the Universe. PMID:25370028

  2. Improved monitoring framework for local planning in the water, sanitation and hygiene sector: From data to decision-making.

    PubMed

    Garriga, Ricard Giné; de Palencia, Alejandro Jiménez Fdez; Foguet, Agustí Pérez

    2015-09-01

    Today, a vast proportion of people still lack a simple pit latrine and a source of safe drinking water. To help end this appalling state of affairs, there is a pressing need to provide policymakers with evidence that can be the basis of effective planning, targeting and prioritization. Two major challenges often hinder this process: i) lack of reliable data to identify which areas are most in need; and ii) inadequate instruments for decision-making support. To tackle these shortcomings, this paper proposes a monitoring framework to compile, analyze, interpret and disseminate water, sanitation and hygiene information. In an era of decentralization, where decision-making moves to local governments, we apply this framework at the local level. The ultimate goal is to develop appropriate tools to support decentralized planning. To this end, the study first implements a methodology for primary data collection which combines the household and the waterpoint as information sources. In doing so, we provide a complete picture of the context in which domestic WASH services are delivered. Second, the collected data are analyzed to underline the emerging development challenges. The use of simple planning indicators serves as the basis to i) reveal which areas require policy attention, and ii) identify the neediest. Third, a classification process is proposed to prioritize among various populations. Three case studies from East and Southern African countries are presented. Results indicate that accurate and comprehensive data, if adequately exploited through simple instruments, may be the basis of effective targeting and prioritization, which are central to sector planning. The application of the proposed framework in the real world, however, remains to a certain extent elusive; we conclude by pointing out two specific challenges that remain unaddressed, namely the upgrade of existing decision-making processes to enhance transparency and inclusiveness, and the development of data updating mechanisms. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    PubMed

    Kuchinke, Wolfgang; Ohmann, Christian; Verheij, Robert A; van Veen, Evert-Ben; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C

    2014-12-01

    To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data privacy frameworks for research done with patient data. Based on an exploration of EU legal requirements for data protection and privacy, data access policies, and the existing privacy frameworks of research projects, basic concepts and common processes were extracted, described and incorporated into a model with a formal graphical representation and a standardised notation. The Unified Modelling Language (UML) notation was enriched with workflow symbols and custom symbols to enable the representation of extended data flow requirements, data privacy and data security requirements, and privacy enhancing techniques (PET), and to allow privacy threat analysis for research scenarios. Our model is built upon the concept of three privacy zones (Care Zone, Non-care Zone and Research Zone) containing databases and data transformation operators, such as data linkers and privacy filters. Using these model components, a risk gradient for moving data from a zone of high risk for patient identification to a zone of low risk can be described. The model was applied to the analysis of data flows in several general clinical research use cases and two research scenarios from the TRANSFoRm project (e.g., finding patients for clinical research and linkage of databases). The model was validated by representing research done with the NIVEL Primary Care Database in the Netherlands. The model allows analysis of data privacy and confidentiality issues for research with patient data in a structured way and provides a framework to specify a privacy-compliant data flow, to communicate privacy requirements and to identify weak points for an adequate implementation of data privacy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Enzyme clustering accelerates processing of intermediates through metabolic channeling

    PubMed Central

    Castellana, Michele; Wilson, Maxwell Z.; Xu, Yifan; Joshi, Preeti; Cristea, Ileana M.; Rabinowitz, Joshua D.; Gitai, Zemer; Wingreen, Ned S.

    2015-01-01

    We present a quantitative model to demonstrate that coclustering multiple enzymes into compact agglomerates accelerates the processing of intermediates, yielding the same efficiency benefits as direct channeling, a well-known mechanism in which intermediates are funneled between enzyme active sites through a physical tunnel. The model predicts the separation and size of coclusters that maximize metabolic efficiency, and this prediction is in agreement with previously reported spacings between coclusters in mammalian cells. For direct validation, we study a metabolic branch point in Escherichia coli and experimentally confirm the model prediction that enzyme agglomerates can accelerate the processing of a shared intermediate by one branch, and thus regulate steady-state flux division. Our studies establish a quantitative framework to understand coclustering-mediated metabolic channeling and its application to both efficiency improvement and metabolic regulation. PMID:25262299

  5. A non-equilibrium neutral model for analysing cultural change.

    PubMed

    Kandler, Anne; Shennan, Stephen

    2013-08-07

    Neutral evolution is a frequently used model to analyse changes in the frequencies of cultural variants over time. Variants are chosen to be copied according to their relative frequency, and new variants are introduced by a process of random mutation. Here we present a non-equilibrium neutral model which accounts for temporally varying population sizes and mutation rates and makes it possible to analyse the cultural system under consideration at any point in time. This framework gives an indication of whether observed changes in the frequency distributions of a set of cultural variants between two time points are consistent with the random copying hypothesis. We find that the likelihood of the existence of the observed assemblage at the end of the considered time period (expressed by the probability of the observed number of cultural variants present in the population during the whole period under neutral evolution) is a powerful indicator of departures from neutrality. Further, we study the effects of frequency-dependent selection on the evolutionary trajectories and present a case study of change in the decoration of pottery in early Neolithic Central Europe. Based on the framework developed, we show that neutral evolution is not an adequate description of the observed changes in frequency. Copyright © 2013 Elsevier Ltd. All rights reserved.
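
    The random-copying process underlying the neutral model is straightforward to simulate. The sketch below implements a basic Wright-Fisher-style copying model with innovation and a time-varying population size and tracks the number of variants per generation; the population trajectory and mutation rate are illustrative, and the likelihood-based test described in the paper is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    def neutral_copying(pop_sizes, mu):
        """Random-copying (neutral) model: each individual in generation t+1 copies a
        variant from a random individual in generation t, or innovates with prob. mu.
        pop_sizes gives the (possibly time-varying) population size per generation."""
        pop = np.zeros(pop_sizes[0], dtype=int)       # everyone starts with variant 0
        next_label = 1
        richness = []
        for n_next in pop_sizes[1:]:
            copied = rng.choice(pop, size=n_next)     # copying proportional to frequency
            innovate = rng.random(n_next) < mu
            n_new = int(innovate.sum())
            copied[innovate] = np.arange(next_label, next_label + n_new)
            next_label += n_new
            pop = copied
            richness.append(len(np.unique(pop)))      # number of variants per generation
        return pop, richness

    # Growing population with a constant innovation rate (both illustrative)
    sizes = np.linspace(200, 600, 50).astype(int)
    final_pop, richness = neutral_copying(sizes, mu=0.01)
    print("variants present at the end:", len(np.unique(final_pop)))
    print("richness over time:", richness[:5], "...", richness[-5:])
    ```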

  6. Generalized estimators of avian abundance from count survey data

    USGS Publications Warehouse

    Royle, J. Andrew

    2004-01-01

    I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture-recapture, multiple observer, removal sampling and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data-generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be included, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, or unsampled locations. Two brief examples are given, the first involving simple point counts, and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
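
    The hierarchical structure described above (a metapopulation model for local abundance combined with a protocol-specific observation model) can be made concrete with a small simulation and the marginal likelihood for simple repeated point counts, the so-called N-mixture formulation; the parameter values, the Poisson abundance model and the truncation bound in this sketch are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import poisson, binom

    rng = np.random.default_rng(5)

    # Simulate spatially replicated point counts under the hierarchical model:
    # local abundance N_i ~ Poisson(lam); repeated counts y_ij ~ Binomial(N_i, p)
    n_sites, n_visits = 80, 3
    lam_true, p_true = 4.0, 0.5
    N = rng.poisson(lam_true, size=n_sites)
    y = rng.binomial(N[:, None], p_true, size=(n_sites, n_visits))

    def nmix_loglik(lam, p, y, n_max=60):
        """Marginal log-likelihood of the counts, summing the latent abundance out."""
        Ns = np.arange(n_max + 1)
        prior = poisson.pmf(Ns, lam)                       # P(N = n)
        ll = 0.0
        for counts in y:
            like_n = prior * np.prod(binom.pmf(counts[:, None], Ns, p), axis=0)
            ll += np.log(like_n.sum())
        return ll

    # Crude grid evaluation around the generating values
    for lam in (3.0, 4.0, 5.0):
        for p in (0.4, 0.5, 0.6):
            print(lam, p, round(nmix_loglik(lam, p, y), 1))
    ```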

  7. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 2: Framework process description

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships between the UOB's (Unit of Behavior) of the model. The model is quite large and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.

  8. Challenges and opportunities of power systems from smart homes to super-grids.

    PubMed

    Kuhn, Philipp; Huber, Matthias; Dorfner, Johannes; Hamacher, Thomas

    2016-01-01

    The world's power systems are facing a structural change including liberalization of markets and integration of renewable energy sources. This paper describes the challenges that lie ahead in this process and points out avenues for overcoming different problems at different scopes, ranging from individual homes to international super-grids. We apply energy system models at those different scopes and find a trade-off between technical and social complexity. Small-scale systems would require technological breakthroughs, especially for storage, but individual agents can and do already start to build and operate such systems. In contrast, large-scale systems could potentially be more efficient from a techno-economic point of view. However, new political frameworks are required that enable long-term cooperation among sovereign entities through mutual trust. Which scope first achieves its breakthrough is not clear yet.

  9. TAMU: Blueprint for A New Space Mission Operations System Paradigm

    NASA Technical Reports Server (NTRS)

    Ruszkowski, James T.; Meshkat, Leila; Haensly, Jean; Pennington, Al; Hogle, Charles

    2011-01-01

    The Transferable, Adaptable, Modular and Upgradeable (TAMU) Flight Production Process (FPP) is a System of Systems (SoS) framework which cuts across multiple organizations and their associated facilities, which are, in the most general case, in geographically dispersed locations, to develop the architecture and associated workflow processes of products for a broad range of flight projects. Further, TAMU FPP provides for the automatic execution and re-planning of the workflow processes as they become operational. This paper provides the blueprint for the TAMU FPP paradigm. This blueprint presents a complete, coherent technique, process and tool set that results in an infrastructure that can be used for full lifecycle design and decision making during the flight production process. Building on many years of experience with the Space Shuttle Program (SSP) and the International Space Station (ISS), and taking the currently cancelled Constellation Program, which aimed at returning humans to the Moon, as a starting point, a modern model-based Systems Engineering infrastructure has been built to re-engineer the FPP. This infrastructure uses a structured modeling and architecture development approach to optimize the system design, thereby reducing sustaining costs and increasing system efficiency, reliability, robustness and maintainability. With the advent of the new vision for human space exploration, it is now necessary to further generalize this framework to take into consideration a broad range of missions and the participation of multiple organizations outside of the MOD; hence the Transferable, Adaptable, Modular and Upgradeable (TAMU) concept.

  10. Outcomes-focused knowledge translation: a framework for knowledge translation and patient outcomes improvement.

    PubMed

    Doran, Diane M; Sidani, Souraya

    2007-01-01

    Regularly accessing information that is current and reliable continues to be a challenge for front-line staff nurses. Reconceptualizing how nurses access information and designing appropriate decision support systems to facilitate timely access to information may be important for increasing research utilization. An outcomes-focused knowledge translation framework was developed to guide the continuous improvement of patient care through the uptake of research evidence and feedback data about patient outcomes. The framework operationalizes the three elements of the PARIHS framework at the point of care. Outcomes-focused knowledge translation involves four components: (a) patient outcomes measurement and real-time feedback about outcomes achievement; (b) best-practice guidelines, embedded in decision support tools that deliver key messages in response to patient assessment data; (c) clarification of patients' preferences for care; and (d) facilitation by advanced practice nurses and practice leaders. In this paper the framework is described and evidence is provided to support theorized relationships among the concepts in the framework. The framework guided the design of a knowledge translation intervention aimed at continuous improvement of patient care and evidence-based practice, which are fostered through real-time feedback data about patient outcomes, electronic access to evidence-based resources at the point of care, and facilitation by advanced practice nurses. The propositions in the framework need to be empirically tested through future research.

  11. A framework for automatic feature extraction from airborne light detection and ranging data

    NASA Astrophysics Data System (ADS)

    Yan, Jianhua

    Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings and trees. In the past decade, LIDAR has attracted growing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for automated extraction of geometrical information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to automatically extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR data. These objects are essential to numerous applications such as flood modeling, landslide prediction and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region growing algorithm based on a plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust the 2D topology. The 2D snake algorithm consists of newly defined energy functions for topology adjusting and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrate that the proposed framework achieves very good performance.
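
    The ground-filtering step lends itself to a compact illustration. The sketch below is a minimal, simplified version of a progressive morphological filter (gridded minimum-elevation surface, morphological opening with growing windows, elevation-difference thresholds that grow with the window size); the cell size, window sequence and thresholds are illustrative assumptions, not the parameters used in the dissertation.

```python
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(points, cell=1.0, windows=(3, 5, 9, 17),
                                     dh_base=0.3, slope=0.15):
    """points: (N, 3) array of x, y, z. Returns a boolean ground mask (illustrative)."""
    x, y, z = points.T
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)

    # Minimum-elevation raster; empty cells are filled with the overall minimum.
    surf = np.full((ix.max() + 1, iy.max() + 1), np.inf)
    np.minimum.at(surf, (ix, iy), z)
    surf[np.isinf(surf)] = z.min()

    filtered = surf.copy()
    for w in windows:
        opened = grey_opening(filtered, size=(w, w))
        dh = dh_base + slope * (w - 1) * cell        # threshold grows with window size
        # Cells whose elevation drops by more than dh under opening contain
        # non-ground objects (vehicles, vegetation, buildings): replace them.
        filtered = np.where(filtered - opened > dh, opened, filtered)

    # A point is ground if it lies close to the filtered surface of its cell.
    return z - filtered[ix, iy] <= dh_base

# Tiny synthetic example: gently sloping terrain with one flat-roofed "building".
rng = np.random.default_rng(0)
xy = rng.uniform(0, 50, size=(2000, 2))
zs = 0.05 * xy[:, 0] + rng.normal(0, 0.05, 2000)
zs[(xy[:, 0] > 20) & (xy[:, 0] < 30)] += 8.0
mask = progressive_morphological_filter(np.column_stack([xy, zs]))
print(f"{int(mask.sum())} of {len(mask)} points classified as ground")
```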

  12. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework.

    PubMed

    Yang, Guoxiang; Best, Elly P H

    2015-09-15

    Best management practices (BMPs) can be used effectively to reduce nutrient loads transported from non-point sources to receiving water bodies. However, methodologies of BMP selection and placement in a cost-effective way are needed to assist watershed management planners and stakeholders. We developed a novel modeling-optimization framework that can be used to find cost-effective solutions of BMP placement to attain nutrient load reduction targets. This was accomplished by integrating a GIS-based BMP siting method, a WQM-TMDL-N modeling approach to estimate total nitrogen (TN) loading, and a multi-objective optimization algorithm. Wetland restoration and buffer strip implementation were the two BMP categories used to explore the performance of this framework, both differing greatly in complexity of spatial analysis for site identification. Minimizing TN load and BMP cost were the two objective functions for the optimization process. The performance of this framework was demonstrated in the Tippecanoe River watershed, Indiana, USA. Optimized scenario-based load reduction indicated that the wetland subset selected by the minimum scenario had the greatest N removal efficiency. Buffer strips were more effective for load removal than wetlands. The optimized solutions provided a range of trade-offs between the two objective functions for both BMPs. This framework can be expanded conveniently to a regional scale because the NHDPlus catchment serves as its spatial computational unit. The present study demonstrated the potential of this framework to find cost-effective solutions to meet a water quality target, such as a 20% TN load reduction, under different conditions. Copyright © 2015 Elsevier Ltd. All rights reserved.
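
    The two-objective trade-off described above (minimize remaining TN load, minimize BMP cost) can be illustrated with a toy Pareto search. The site costs, removal efficiencies, baseline load and random sampling below are invented for illustration; the actual framework uses GIS-based siting and a multi-objective optimization algorithm rather than random sampling.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites = 30
cost = rng.uniform(5_000, 50_000, n_sites)      # $ per candidate BMP site (made up)
removal = rng.uniform(50, 400, n_sites)         # kg TN removed per year per site (made up)
baseline_load = 20_000                          # kg TN per year without BMPs (made up)

def evaluate(mask):
    """Return (remaining TN load, total cost) for a 0/1 placement vector."""
    return baseline_load - removal[mask].sum(), cost[mask].sum()

# Sample candidate placements and evaluate both objectives.
solutions = []
for _ in range(1_000):
    mask = rng.random(n_sites) < rng.uniform(0.05, 0.6)
    solutions.append((*evaluate(mask), mask))

# Keep the non-dominated (Pareto) set: no other solution is better on both objectives.
pareto = [s for s in solutions
          if not any((o[0] <= s[0] and o[1] < s[1]) or
                     (o[0] < s[0] and o[1] <= s[1]) for o in solutions)]
pareto.sort(key=lambda s: s[1])
for load, c, _ in pareto[:10]:
    print(f"cost ${c:>9,.0f}  ->  remaining TN load {load:>8,.0f} kg/yr")
```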

  13. An Overview of Models of Speaking Performance and Its Implications for the Development of Procedural Framework for Diagnostic Speaking Tests

    ERIC Educational Resources Information Center

    Zhao, Zhongbao

    2013-01-01

    This paper aims at developing a procedural framework for the development and validation of diagnostic speaking tests. The researcher reviews the current available models of speaking performance, analyzes the distinctive features and then points out the implications for the development of a procedural framework for diagnostic speaking tests. On…

  14. Stress distribution in Co-Cr implant frameworks after laser or TIG welding.

    PubMed

    de Castro, Gabriela Cassaro; de Araújo, Cleudmar Amaral; Mesquita, Marcelo Ferraz; Consani, Rafael Leonardo Xediek; Nóbilo, Mauro Antônio de Arruda

    2013-01-01

    Lack of passivity has been associated with biomechanical problems in implant-supported prostheses. The aim of this study was to evaluate, by photoelasticity, the passivity of three techniques used to fabricate an implant framework from a Co-Cr alloy. The model was obtained from a steel die simulating an edentulous mandible with 4 external hexagon analog implants with a standard platform. On this model, five frameworks were fabricated for each group: monoblock frameworks (control), and laser- and TIG-welded frameworks. The photoelastic model was made from a flexible epoxy resin. For the photoelastic analysis, the frameworks were bolted onto the model and the maximum shear stress was verified at 34 selected points around the implants and 5 points in the middle of the model. The stresses were compared all over the photoelastic model, between the right, left, and center regions, and between the cervical and apical regions. The values were subjected to two-way ANOVA and Tukey's test (α=0.05). There was no significant difference among the groups and studied areas (p>0.05). It was concluded that the stresses generated around the implants were similar for all techniques.

  15. Personality and the Intergenerational Transmission of Educational Attainment: Evidence from Germany.

    PubMed

    Ryberg, Renee; Bauldry, Shawn; Schultz, Michael A; Steinhoff, Annekatrin; Shanahan, Michael

    2017-10-01

    Research based in the United States, with its relatively open educational system, has found that personality mediates the relationship between parents' and child's educational attainment, and that this mediational pattern is especially beneficial to students from less-educated households. Yet in highly structured, competitive educational systems, personality characteristics may not predict attainment or may be more or less consequential at different points in the educational career. We examine the salience of personality in the educational attainment process in the German educational system. Data come from a longitudinal sample of 682 17- to 25-year-olds (54% female) from the 2005 and 2015 German Socio-Economic Panel (SOEP). Results show that adolescent personality traits (openness, neuroticism, and conscientiousness) are associated with educational attainment, but personality plays a negligible role in the intergenerational transmission of education. Personality is influential before the decision about the type of secondary degree that a student will pursue (during adolescence). After that turning point, when students have entered different pathways through the system, personality is less salient. Cross-national comparisons in a life course framework broaden the scope of current research on non-cognitive skills and processes of socioeconomic attainment, alerting the analyst to the importance of both institutional structures and the changing importance of these skills at different points in the life course.

  16. DEEP ATTRACTOR NETWORK FOR SINGLE-MICROPHONE SPEAKER SEPARATION.

    PubMed

    Chen, Zhuo; Luo, Yi; Mesgarani, Nima

    2017-03-01

    Despite the overwhelming success of deep learning in various speech processing tasks, the problem of separating simultaneous speakers in a mixture remains challenging. Two major difficulties in such systems are the arbitrary source permutation and the unknown number of sources in the mixture. We propose a novel deep learning framework for single-channel speech separation by creating attractor points in a high-dimensional embedding space of the acoustic signals which pull together the time-frequency bins corresponding to each source. Attractor points in this study are created by finding the centroids of the sources in the embedding space, which are subsequently used to determine the similarity of each bin in the mixture to each source. The network is then trained to minimize the reconstruction error of each source by optimizing the embeddings. The proposed model is different from prior works in that it implements an end-to-end training, and it does not depend on the number of sources in the mixture. Two strategies are explored at test time, K-means and fixed attractor points, where the latter requires no post-processing and can be implemented in real-time. We evaluated our system on the Wall Street Journal dataset and show a 5.49% improvement over the previous state-of-the-art methods.
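
    The attractor computation itself is simple to sketch: attractors are per-source centroids in the embedding space, and masks follow from the similarity of each time-frequency bin's embedding to each attractor. The numpy sketch below assumes given embeddings and source-assignment targets and uses a softmax mask for illustration; in the paper the embeddings are learned end-to-end by the network.

```python
import numpy as np

def attractor_masks(embeddings, assignments):
    """
    embeddings:  (n_bins, emb_dim)   embedding of each time-frequency bin
    assignments: (n_bins, n_sources) 1 where a source dominates a bin (training targets)
    returns:     (n_bins, n_sources) soft masks
    """
    # Attractor = centroid of the embeddings belonging to each source.
    attractors = assignments.T @ embeddings / (assignments.sum(0, keepdims=True).T + 1e-8)
    # Similarity of every bin to every attractor -> softmax over sources.
    logits = embeddings @ attractors.T
    logits -= logits.max(axis=1, keepdims=True)
    masks = np.exp(logits)
    return masks / masks.sum(axis=1, keepdims=True)

# Tiny example: 6 bins, 4-dimensional embeddings, 2 sources.
rng = np.random.default_rng(0)
emb = rng.normal(size=(6, 4))
assign = np.array([[1, 0], [1, 0], [0, 1], [0, 1], [1, 0], [0, 1]])
print(attractor_masks(emb, assign).round(2))
```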

  17. Assessment of variability in the hydrological cycle of the Loess Plateau, China: examining dependence structures of hydrological processes

    NASA Astrophysics Data System (ADS)

    Guo, A.; Wang, Y.

    2017-12-01

    Investigating variability in dependence structures of hydrological processes is of critical importance for developing an understanding of mechanisms of hydrological cycles in changing environments. In focusing on this topic, the present work involves the following: (1) identifying and eliminating serial correlation and conditional heteroscedasticity in monthly streamflow (Q), precipitation (P) and potential evapotranspiration (PE) series using the ARMA-GARCH model (ARMA: autoregressive moving average; GARCH: generalized autoregressive conditional heteroscedasticity); (2) describing dependence structures of hydrological processes using a partial copula coupled with the ARMA-GARCH model and identifying their variability via a copula-based likelihood-ratio test method; and (3) determining the conditional probability of annual Q under different climate scenarios on the basis of the above results. This framework enables us to depict hydrological variables in the presence of conditional heteroscedasticity and to examine dependence structures of hydrological processes while excluding the influence of covariates by using the partial copula-based ARMA-GARCH model. Eight major catchments across the Loess Plateau (LP) are used as study regions. Results indicate that (1) the occurrence of change points in dependence structures of Q and P (PE) varies across the LP; (2) change points of P-PE dependence structures in all regions almost fully correspond to the initiation of global warming, i.e., the early 1980s; and (3) conditional probabilities of annual Q under various P and PE scenarios are estimated from the 3-dimensional joint distribution of (Q, P and PE) based on the above change points. These findings shed light on mechanisms of the hydrological cycle and can guide water supply planning and management, particularly in changing environments.
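
    A hedged sketch of the first filtering step is given below: an AR-GARCH model removes serial correlation and conditional heteroscedasticity from a monthly series, and pseudo-observations of the standardized residuals are formed as input to a copula-based dependence analysis. The open-source `arch` package and the synthetic data are assumptions for illustration; the paper does not name its software and works with the Loess Plateau records.

```python
import numpy as np
from scipy import stats
from arch import arch_model

rng = np.random.default_rng(1)
n = 480                                             # 40 years of monthly values (synthetic)
precip = rng.gamma(2.0, 30.0, n)                    # synthetic monthly precipitation
flow = 0.4 * precip + rng.normal(0, 10, n) + 50     # synthetic streamflow, correlated with P

def standardized_residuals(series):
    """Fit AR(1)-GARCH(1,1) and return residuals scaled by conditional volatility."""
    res = arch_model(series, mean="AR", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
    sr = res.resid / res.conditional_volatility
    return sr[np.isfinite(sr)]                      # drop the initial lag observation

# Pseudo-observations (ranks mapped into (0, 1)) are the usual copula input.
zq, zp = standardized_residuals(flow), standardized_residuals(precip)
u = stats.rankdata(zq) / (len(zq) + 1)
v = stats.rankdata(zp) / (len(zp) + 1)

tau, pval = stats.kendalltau(u, v)
print(f"Kendall's tau of the filtered Q-P series: {tau:.3f} (p = {pval:.3g})")
```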

  18. Exploring knowledge exchange: a useful framework for practice and policy.

    PubMed

    Ward, Vicky; Smith, Simon; House, Allan; Hamer, Susan

    2012-02-01

    Knowledge translation is underpinned by a dynamic and social knowledge exchange process but there are few descriptions of how this unfolds in practice settings. This has hampered attempts to produce realistic and useful models to help policymakers and researchers understand how knowledge exchange works. This paper reports the results of research which investigated the nature of knowledge exchange. We aimed to understand whether dynamic and fluid definitions of knowledge exchange are valid and to produce a realistic, descriptive framework of knowledge exchange. Our research was informed by a realist approach. We embedded a knowledge broker within three service delivery teams across a mental health organisation in the UK, each of whom was grappling with specific challenges. The knowledge broker participated in the team's problem-solving process and collected observational fieldnotes. We also interviewed the team members. Observational and interview data were analysed quantitatively and qualitatively in order to determine and describe the nature of the knowledge exchange process in more detail. This enabled us to refine our conceptual framework of knowledge exchange. We found that knowledge exchange can be understood as a dynamic and fluid process which incorporates distinct forms of knowledge from multiple sources. Quantitative analysis illustrated that five broadly-defined components of knowledge exchange (problem, context, knowledge, activities, use) can all be in play at any one time and do not occur in a set order. Qualitative analysis revealed a number of distinct themes which better described the nature of knowledge exchange. By shedding light on the nature of knowledge exchange, our findings problematise some of the linear, technicist approaches to knowledge translation. The revised model of knowledge exchange which we propose here could therefore help to reorient thinking about knowledge exchange and act as a starting point for further exploration and evaluation of the knowledge exchange process. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. A decision-making framework for sediment contamination.

    PubMed

    Chapman, Peter M; Anderson, Janette

    2005-07-01

    A decision-making framework for determining whether or not contaminated sediments are polluted is described. This framework is intended to be sufficiently prescriptive to standardize the decision-making process but without using "cook book" assessments. It emphasizes 4 guidance "rules": (1) sediment chemistry data are only to be used alone for remediation decisions when the costs of further investigation outweigh the costs of remediation and there is agreement among all stakeholders to act; (2) remediation decisions are based primarily on biology; (3) lines of evidence (LOE), such as laboratory toxicity tests and models that contradict the results of properly conducted field surveys, are assumed incorrect; and (4) if the impacts of a remedial alternative will cause more environmental harm than good, then it should not be implemented. Sediments with contaminant concentrations below sediment quality guidelines (SQGs) that predict toxicity to less than 5% of sediment-dwelling infauna and that contain no quantifiable concentrations of substances capable of biomagnifying are excluded from further consideration, as are sediments that do not meet these criteria but have contaminant concentrations equal to or below reference concentrations. Biomagnification potential is initially addressed by conservative (worst case) modeling based on benthos and sediments and, subsequently, by additional food chain data and more realistic assumptions. Toxicity (acute and chronic) and alterations to resident communities are addressed by, respectively, laboratory studies and field observations. The integrative decision point for sediments is a weight of evidence (WOE) matrix combining up to 4 main LOE: chemistry, toxicity, community alteration, and biomagnification potential. Of 16 possible WOE scenarios, 6 result in definite decisions, and 10 require additional assessment. Typically, this framework will be applied to surficial sediments. The possibility that deeper sediments may be uncovered as a result of natural or other processes must also be investigated and may require similar assessment.

  20. Process evaluation of a patient-centred, patient-directed, group-based education program for the management of type 2 diabetes mellitus.

    PubMed

    Odgers-Jewell, Kate; Isenring, Elisabeth; Thomas, Rae; Reidlinger, Dianne P

    2017-07-01

    The present study developed and evaluated a patient-centred, patient-directed, group-based education program for the management of type 2 diabetes mellitus. Two frameworks, the Medical Research Council (MRC) framework for developing and evaluating complex interventions and the RE-AIM framework were followed. Data to develop the intervention were sourced from scoping of the literature and formative evaluation. Program evaluation comprised analysis of primary recruitment of participants through general practitioners, baseline and end-point measures of anthropometry, four validated questionnaires, contemporaneous facilitator notes and telephone interviews with participants. A total of 16 participants enrolled in the intervention. Post-intervention results were obtained from 13 participants, with an estimated mean change from baseline in weight of -0.72 kg (95%CI -1.44 to -0.01), body mass index of -0.25 kg/m² (95%CI -0.49 to -0.01) and waist circumference of -1.04 cm (95%CI -4.52 to 2.44). The group education program was acceptable to participants. The results suggest that recruitment through general practitioners is ineffective, and alternative recruitment strategies are required. This patient-centred, patient-directed, group-based intervention for the management of type 2 diabetes mellitus was both feasible and acceptable to patients. Health professionals should consider the combined use of the MRC and RE-AIM frameworks in the development of interventions to ensure a rigorous design process and to enable the evaluation of all phases of the intervention, which will facilitate translation to other settings. Further research with a larger sample trialling additional recruitment strategies, evaluating further measures of effectiveness and utilising lengthier follow-up periods is required. © 2016 Dietitians Association of Australia.

  1. Integrated Technology Assessment Center (ITAC) Update

    NASA Technical Reports Server (NTRS)

    Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)

    2002-01-01

    The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP, and integrated concept models (ICM) of these concepts are developed. Key technologies of interest are identified and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and technology projections are then fed into ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single-point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to ASTP using a multivariate decision-making process to provide inputs for technology prioritization within the ASTP. The ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO), rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all-rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for their Flexible Aerospace System Solution for Tomorrow (FASST) concept, and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC-based concept in the near term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.
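
    The probabilistic assessment idea can be illustrated with a small Monte Carlo sketch: uncertain technology parameters are sampled and propagated through a simple performance model to produce output distributions and probabilities of meeting a metric. The vehicle numbers, parameter distributions and rocket-equation model below are illustrative assumptions only, not the ITAC concept models or the ModelCenter framework.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
g0, dv = 9.80665, 9_200.0          # required delta-v to orbit, m/s (assumed)
gross = 1_000_000.0                # gross lift-off mass, kg (assumed)

isp = rng.triangular(430, 450, 465, n)            # s, uncertain engine technology level
dry_frac = rng.triangular(0.07, 0.08, 0.10, n)    # dry mass / gross mass, uncertain

mass_ratio = np.exp(dv / (g0 * isp))              # ideal rocket equation: m0 / mf
final_mass = gross / mass_ratio                   # mass delivered to orbit
payload = final_mass - dry_frac * gross           # what remains after structure

print(f"mean payload: {payload.mean()/1e3:8.1f} t")
print(f"5th-95th pct: {np.percentile(payload, 5)/1e3:6.1f} - "
      f"{np.percentile(payload, 95)/1e3:6.1f} t")
print(f"P(payload > 20 t) = {(payload > 20_000).mean():.2f}")
```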

  2. 2D first break tomographic processing of data measured for celebration profiles: CEL01, CEL04, CEL05, CEL06, CEL09, CEL11

    NASA Astrophysics Data System (ADS)

    Bielik, M.; Vozar, J.; Hegedus, E.; Celebration Working Group

    2003-04-01

    This contribution reports preliminary results from the first-arrival P-wave seismic tomographic processing of data measured along the profiles CEL01, CEL04, CEL05, CEL06, CEL09 and CEL11. These profiles were measured in the framework of the seismic project CELEBRATION 2000. The data acquisition and geometric parameters of the processed profiles, the principle of the tomographic processing, the particular processing steps and the program parameters are described. Characteristic data of the observation profiles (shot points, geophone points, total profile lengths, sampling, sensors and record lengths) are given. The fast program package developed by C. Zelt was applied for the tomographic velocity inversion. This process consists of several steps. The first step is the creation of a starting velocity field, for which the arrival times are modelled by the method of finite differences. The next step is the minimization of the differences between the measured and modelled arrival times until the deviation is small. The equivalence problem was reduced by including a priori information in the starting velocity field. The a priori information consists of the depth to the pre-Tertiary basement, estimates of the velocity of the overlying sediments from well logging and/or other seismic velocity data, etc. After checking the reciprocal times, the picks were corrected. The final result of the processing is a reliable set of travel-time curves consistent with the reciprocal times. Picking of the travel-time curves and enhancement of the signal-to-noise ratio of the seismograms were carried out using the PROMAX program system. The tomographic inversion was carried out by a so-called 3D/2D procedure that takes 3D wave propagation into account: a corridor along the profile, containing the outlying shot points and geophone points, was defined, and 3D processing was carried out within this corridor. The preliminary results indicate seismically anomalous zones within the crust and the uppermost part of the upper mantle in an area comprising the Western Carpathians, the North European Platform, the Pannonian Basin and the Bohemian Massif.

  3. Conceptual framework for holistic dialysis management based on key performance indicators.

    PubMed

    Liu, Hu-Chen; Itoh, Kenji

    2013-10-01

    This paper develops a theoretical framework of holistic hospital management based on performance indicators that can be applied to dialysis hospitals, clinics or departments in Japan. Selection of a key indicator set and its validity tests were performed primarily by a questionnaire survey to dialysis experts as well as their statements obtained through interviews. The expert questionnaire asked respondents to rate the degree of "usefulness" for each of 66 indicators on a three-point scale (19 responses collected). Applying the theoretical framework, we selected a minimum set of key performance indicators for dialysis management that can be used in the Japanese context. The indicator set comprised 27 indicators and items that will be collected through three surveys: patient satisfaction, employee satisfaction, and safety culture. The indicators were confirmed by expert judgment from viewpoints of face, content and construct validity as well as their usefulness. This paper established a theoretical framework of performance measurement for holistic dialysis management from primary healthcare stakeholders' perspectives. In this framework, performance indicators were largely divided into healthcare outcomes and performance shaping factors. Indicators of the former type may be applied for the detection of operational problems or weaknesses in a dialysis hospital, clinic or department, while latent causes of each problem can be more effectively addressed by the latter type of indicators in terms of process, structure and culture/climate within the organization. © 2013 The Authors. Therapeutic Apheresis and Dialysis © 2013 International Society for Apheresis.

  4. Systems consolidation revisited, but not revised: The promise and limits of optogenetics in the study of memory.

    PubMed

    Hardt, Oliver; Nadel, Lynn

    2017-12-05

    Episodic memories (in humans) and event-like memories (in non-human animals) require the hippocampus for some time after acquisition, but at remote points seem to depend more on cortical areas instead. Systems consolidation refers to the process that promotes this reorganization of memory. Various theoretical frameworks accounting for this process have been proposed, but clear evidence favoring one or another of these positions has been lacking. Addressing this issue, a recent study deployed some of the most advanced neurobiological technologies - optogenetics and calcium imaging - and provided high resolution, precise observations regarding brain systems involved in recent and remote contextual fear memories. We critically review these findings within their historical context and conclude that they do not resolve the debate concerning systems consolidation. This is because the relevant question concerning the quality of memory at recent and remote time points has not been answered: Does the memory reorganization taking place during systems consolidation result in changes to the content of memory? Copyright © 2017 Elsevier B.V. All rights reserved.

  5. VISdish: A new tool for canting and shape-measuring solar-dish facets.

    PubMed

    Montecchi, Marco; Cara, Giuseppe; Benedetti, Arcangelo

    2017-06-01

    Solar dishes allow us to obtain highly concentrated solar fluxes used to produce electricity or to feed thermal processes/storage. For practical reasons, the reflecting surface is composed of a number of facets. After the dish assembly, facet canting is an important task for improving the concentration of solar radiation around the focal point, as well as the capture ratio at the receiver placed there. Finally, the flux profile should be measured or evaluated to verify the concentration quality. All these tasks can be achieved by the new tool we developed at ENEA, named VISdish. The instrument is based on the visual inspection system (VIS) approach and can work in two functionalities: canting and shape measurement. The shape data are entered in a simulation software for evaluating the flux profile and concentration quality. With respect to prior methods, VISdish offers several advantages: (i) simpler data processing, because the light point source and its reflections are unambiguously related, and (ii) higher accuracy. The instrument functionality is illustrated through the preliminary experimental results obtained on the dish recently installed in ENEA-Casaccia in the framework of the E.U. project OMSoP.

  6. An analysis of the influence of framework aspects on the study design of health economic modeling evaluations.

    PubMed

    Gurtner, Sebastian

    2013-04-01

    Research and practical guidelines have many implications for how to structure a health economic study. A major focus in recent decades has been the quality of health economic research. In practice, the factors influencing a study design are not limited to the quest for quality. Moreover, the framework of the study is important. This research addresses three major questions related to these framework aspects. First, we want to know whether the design of health economic studies has changed over time. Second, we want to know how the subject of a study, whether it is a process or product innovation, influences the parameters of the study design. Third, one of the most important questions we will answer is whether and how the study's source of funding has an impact on the design of the research. To answer these questions, a total of 234 health economic studies were analyzed using a correspondence analysis and a logistic regression analysis. All three categories of framework factors have an influence on the aspects of the study design. Health economic studies have evolved over time, leading to the use of more advanced methods like complex sensitivity analyses. Additionally, the patient's point of view has increased in importance. The evaluation of product innovations has focused more on utility concepts. On the other hand, the source of funding may influence only a few aspects of the study design, such as the use of evaluation methods, the source of data, and the use of certain utility measures. The most important trends in health care, such as the emphasis on the patients' point of view, become increasingly established in health economic evaluations with the passage of time. Although methodological challenges remain, modern information and communication technologies provide a basis for increasing the complexity and quality of health economic studies if used frequently.

  7. Predicting Bradycardia in Preterm Infants Using Point Process Analysis of Heart Rate.

    PubMed

    Gee, Alan H; Barbieri, Riccardo; Paydarfar, David; Indic, Premananda

    2017-09-01

    Episodes of bradycardia are common and recur sporadically in preterm infants, posing a threat to the developing brain and other vital organs. We hypothesize that bradycardias are a result of transient temporal destabilization of the cardiac autonomic control system and that fluctuations in the heart rate signal might contain information that precedes bradycardia. We investigate infant heart rate fluctuations with a novel application of point process theory. In ten preterm infants, we estimate instantaneous linear measures of the heart rate signal, use these measures to extract statistical features of bradycardia, and propose a simplistic framework for prediction of bradycardia. We present the performance of a prediction algorithm using instantaneous linear measures (mean area under the curve = 0.79 ± 0.018) for over 440 bradycardia events. The algorithm achieves an average forecast time of 116 s prior to bradycardia onset (FPR = 0.15). Our analysis reveals that increased variance in the heart rate signal is a precursor of severe bradycardia. This increase in variance is associated with an increase in power from low content dynamics in the LF band (0.04-0.2 Hz) and lower multiscale entropy values prior to bradycardia. Point process analysis of the heartbeat time series reveals instantaneous measures that can be used to predict infant bradycardia prior to onset. Our findings are relevant to risk stratification, predictive monitoring, and implementation of preventative strategies for reducing morbidity and mortality associated with bradycardia in neonatal intensive care units.
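
    A much-simplified sketch of the precursor idea (rising heart-rate variability before a bradycardia) is given below: a sliding-window variance of the instantaneous heart rate is tracked and an alarm is raised when it crosses a threshold. The window length, threshold rule and synthetic beat series are illustrative assumptions; the study itself uses instantaneous point-process estimates rather than windowed statistics.

```python
import numpy as np

def variance_alarm(rr_intervals, window=30, threshold=None):
    """rr_intervals in seconds; returns (instantaneous HR, rolling variance, alarm mask)."""
    hr = 60.0 / np.asarray(rr_intervals)                     # instantaneous heart rate, bpm
    var = np.array([hr[max(0, i - window):i + 1].var() for i in range(len(hr))])
    if threshold is None:
        threshold = 5 * np.median(var)                       # crude baseline-relative limit
    return hr, var, var > threshold

# Synthetic preterm-like trace: stable beats, growing fluctuations, then bradycardia.
rng = np.random.default_rng(3)
rr = np.concatenate([
    rng.normal(0.40, 0.01, 300),                             # ~150 bpm baseline
    rng.normal(0.40, 0.04, 60),                              # variability rises
    rng.normal(0.75, 0.05, 20),                              # bradycardic episode (<100 bpm)
])
hr, var, alarm = variance_alarm(rr)
first_alarm = int(np.argmax(alarm)) if alarm.any() else None
onset = 360                                                  # index of first bradycardic beat
print(f"first alarm at beat {first_alarm}, bradycardia onset at beat {onset}")
```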

  8. Distributed Computing Framework for Synthetic Radar Application

    NASA Technical Reports Server (NTRS)

    Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael

    2006-01-01

    We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data-flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech) and now being used as the basis for next-generation radar processing at JPL, is a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
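
    The component/data-flow-graph pattern can be illustrated generically. The sketch below is not the Pyre API (which is richer and supports deploying components as network services); it only shows the idea of interchangeable processing components wired into a flow.

```python
from typing import Any, Callable, List

class Component:
    """A named processing step that transforms its input and passes it on."""
    def __init__(self, name: str, func: Callable[[Any], Any]):
        self.name, self.func = name, func

    def __call__(self, data: Any) -> Any:
        print(f"[{self.name}] running")
        return self.func(data)

class Pipeline:
    """A linear data-flow graph: components can be swapped without touching callers."""
    def __init__(self, components: List[Component]):
        self.components = components

    def run(self, data: Any) -> Any:
        for component in self.components:
            data = component(data)
        return data

# Example wiring for a notional radar chain: raw samples -> range processing -> focus.
flow = Pipeline([
    Component("ingest",    lambda d: d),
    Component("range_fft", lambda d: [x * 2 for x in d]),    # placeholder processing
    Component("focus",     lambda d: sum(d)),                # placeholder processing
])
print(flow.run([1, 2, 3]))
```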

  9. A novel Bayesian approach to acoustic emission data analysis.

    PubMed

    Agletdinov, E; Pomponi, E; Merson, D; Vinogradov, A

    2016-12-01

    The acoustic emission (AE) technique is a popular tool for materials characterization and non-destructive testing. Originating from the stochastic motion of defects in solids, AE is a random process by nature. A challenging problem arises whenever an attempt is made to identify specific points corresponding to changes in the trends of the fluctuating AE time series. A general Bayesian framework is proposed for the analysis of AE time series, aiming at automatically finding the breakpoints that signal a crossover in the dynamics of the underlying AE sources. Copyright © 2016 Elsevier B.V. All rights reserved.
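
    A minimal sketch of automated breakpoint detection in a fluctuating series is given below: each candidate breakpoint is scored by comparing a two-segment Gaussian model (approximated here with a BIC-penalized likelihood) against a no-change model. The paper's actual Bayesian formulation is more general; the Gaussian segments and synthetic data are assumptions for illustration.

```python
import numpy as np

def gauss_loglik(x):
    """Log-likelihood of x under a Gaussian with its own MLE mean and variance."""
    n, var = len(x), x.var() + 1e-12
    return -0.5 * n * (np.log(2 * np.pi * var) + 1.0)

def best_breakpoint(x, min_seg=20):
    """Scan candidate breakpoints; score = two-segment fit minus no-change fit minus penalty."""
    n = len(x)
    null = gauss_loglik(x)
    best_k, best_score = None, -np.inf
    for k in range(min_seg, n - min_seg):
        # BIC-style penalty: two extra parameters (mean and variance of a second segment).
        score = gauss_loglik(x[:k]) + gauss_loglik(x[k:]) - null - np.log(n)
        if score > best_score:
            best_k, best_score = k, score
    return best_k, best_score

# Synthetic AE-like energy series whose level and spread change at sample 600.
rng = np.random.default_rng(5)
series = np.concatenate([rng.normal(1.0, 0.3, 600), rng.normal(2.0, 0.5, 400)])
k, score = best_breakpoint(series)
print(f"estimated breakpoint at sample {k} (score {score:.1f})")
```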

  10. Liquefaction of Saturated Soil and the Diffusion Equation

    NASA Astrophysics Data System (ADS)

    Sawicki, Andrzej; Sławińska, Justyna

    2015-06-01

    The paper deals with the diffusion equation for pore water pressures with the source term, which is widely promoted in the marine engineering literature. It is shown that such an equation cannot be derived in a consistent way from the mass balance and the Darcy law. The shortcomings of the artificial source term are pointed out, including inconsistencies with experimental data. It is concluded that liquefaction and the preceding process of pore pressure generation and the weakening of the soil skeleton should be described by constitutive equations within the well-known framework of applied mechanics. Relevant references are provided.

  11. Values based practice: a framework for thinking with.

    PubMed

    Mohanna, Kay

    2017-07-01

    Values are those principles that govern behaviours, and values-based practice has been described as a theory and skills base for effective healthcare decision-making where different (and hence potentially conflicting) values are in play. The emphasis is on good process rather than pre-set right outcomes, aiming to achieve balanced decision-making. In this article we will consider the utility of this model by looking at leadership development, a current area of much interest and investment in healthcare. Copeland points out that 'values based leadership behaviors are styles with a moral, authentic and ethical dimension', important qualities in healthcare decision-making.

  12. Study of the time evolution of correlation functions of the transverse Ising chain with ring frustration by perturbative theory

    NASA Astrophysics Data System (ADS)

    Zheng, Zhen-Yu; Li, Peng

    2018-04-01

    We consider the time evolution of the two-point correlation function in the transverse-field Ising chain (TFIC) with ring frustration. The time-evolution procedure we investigate is equivalent to a quench process in which the system is initially prepared in a classical kink state and evolves according to the time-dependent Schrödinger equation. Within a framework of perturbative theory (PT) in the strong kink phase, the evolution of the correlation function is shown to exhibit a qualitatively new behavior in contrast to the traditional case without ring frustration.

  13. Conflict of interest in biomedical research: a view from Europe.

    PubMed

    Salvi, Maurizio

    2003-01-01

    In this paper I address the conflict of interest (CoI) issue from a legal point of view at a European level. We will see that the regulatory framework that exists in Europe does state the need for the independence of ethics committees involved in the authorisation of research and clinical trials. We will see that CoI is an element that has to be closely monitored at national and international levels. Therefore, Member States and Newly Associated States do have to address CoI in the authorisation process of research and clinical protocols in biomedicine.

  14. [Claim and reality of selective contact options : experiences in finalizing selective contracts in urological care].

    PubMed

    Ex, P; Schroeder, A

    2014-08-01

    Selective contracts are an important component, alongside the overall collective healthcare arrangements, for introducing process-related innovations into the healthcare system. Since 2011 the Berufsverband der Deutschen Urologen (BDU, Professional Association of German Urologists) has held negotiations with individual health insurance companies and care providers in order to establish selective contracts, in addition to collective contracts, not only as pilot projects but also as additional forms of care. This article illustrates the experiences of the BDU in the initiation and finalizing of selective contracts, as well as existing weak points in the framework conditions.

  15. Robust group-wise rigid registration of point sets using t-mixture model

    NASA Astrophysics Data System (ADS)

    Ravikumar, Nishant; Gooya, Ali; Frangi, Alejandro F.; Taylor, Zeike A.

    2016-03-01

    A probabilistic framework is proposed for robust, group-wise rigid alignment of point sets using a mixture of Student's t-distributions, especially when the point sets are of varying lengths, are corrupted by an unknown degree of outliers, or contain missing data. Medical images (in particular magnetic resonance (MR) images), their segmentations and consequently the point sets generated from these are highly susceptible to corruption by outliers. This poses a problem for robust correspondence estimation and accurate alignment of shapes, necessary for training statistical shape models (SSMs). To address these issues, this study proposes to use a t-mixture model (TMM) to approximate the underlying joint probability density of a group of similar shapes and align them to a common reference frame. The heavy-tailed nature of t-distributions provides a more robust registration framework in comparison to state-of-the-art algorithms. Significant reduction in alignment errors is achieved in the presence of outliers, using the proposed TMM-based group-wise rigid registration method, in comparison to its Gaussian mixture model (GMM) counterparts. The proposed TMM framework is compared with a group-wise variant of the well-known Coherent Point Drift (CPD) algorithm and two other group-wise methods using GMMs, using both synthetic and real data sets. Rigid alignment errors for groups of shapes are quantified using the Hausdorff distance (HD) and quadratic surface distance (QSD) metrics.
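
    The robustness argument can be illustrated in a few lines: EM for a Student's t model downweights points with large Mahalanobis distance, so the estimated centroid is far less disturbed by outliers than a Gaussian estimate. The sketch below uses a single multivariate t with fixed degrees of freedom on synthetic 2-D data; the full TMM registration additionally handles mixtures of components and rigid transforms.

```python
import numpy as np

def t_centroid(X, dof=3.0, n_iter=50):
    """EM estimate of the mean/covariance of a multivariate t with known dof."""
    mu, cov = X.mean(0), np.cov(X.T)
    d = X.shape[1]
    for _ in range(n_iter):
        diff = X - mu
        maha = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)
        w = (dof + d) / (dof + maha)              # E-step: outliers receive small weights
        mu = (w[:, None] * X).sum(0) / w.sum()    # M-step: weighted mean
        cov = (w[:, None] * (X - mu)).T @ (X - mu) / len(X)
    return mu, cov

# 2-D point set with 10% gross outliers.
rng = np.random.default_rng(2)
inliers = rng.normal([0.0, 0.0], 0.5, size=(180, 2))
outliers = rng.uniform(5, 15, size=(20, 2))
X = np.vstack([inliers, outliers])

print("Gaussian mean :", X.mean(0).round(2))          # dragged towards the outliers
print("t-model mean  :", t_centroid(X)[0].round(2))   # stays near the true centre
```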

  16. Rationality versus reality: the challenges of evidence-based decision making for health policy makers

    PubMed Central

    2010-01-01

    Background: Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. Discussion: We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than on the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. Summary: In this era of increasing adoption of evidence-based healthcare models, the rational choice, utility maximizing assumptions in EBDM and EBPM, must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be improved. PMID:20504357

  17. Rationality versus reality: the challenges of evidence-based decision making for health policy makers.

    PubMed

    McCaughey, Deirdre; Bruning, Nealia S

    2010-05-26

    Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than on the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. In this era of increasing adoption of evidence-based healthcare models, the rational choice, utility maximizing assumptions in EBDM and EBPM, must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be improved.

  18. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
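
    The core "aggregate points" operation can be sketched with plain numpy: bin point observations into fixed-size cells and summarize each cell. This is only an illustration of the operation; the tools described above distribute the same kind of aggregation across a server cluster and are accessed through ArcGIS Server rather than local numpy.

```python
import numpy as np

rng = np.random.default_rng(8)
n_pts = 100_000
lon = rng.uniform(-123.2, -122.2, n_pts)                    # synthetic observation locations
lat = rng.uniform(44.0, 45.0, n_pts)
temp = 15 + 8 * (lat - 44.0) + rng.normal(0, 1, n_pts)      # synthetic temperature readings

cell = 0.1                                                  # aggregation cell size in degrees
ix = ((lon - lon.min()) / cell).astype(int)
iy = ((lat - lat.min()) / cell).astype(int)
nx, ny = ix.max() + 1, iy.max() + 1

counts = np.zeros((nx, ny))
sums = np.zeros((nx, ny))
np.add.at(counts, (ix, iy), 1)                              # points per cell
np.add.at(sums, (ix, iy), temp)                             # summed attribute per cell
mean_temp = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)

print(f"{nx * ny} cells, busiest cell holds {int(counts.max())} points, "
      f"cell mean temperatures span {mean_temp[counts > 0].min():.1f}-"
      f"{mean_temp[counts > 0].max():.1f} degC")
```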

  19. Integration of health into urban spatial planning through impact assessment: Identifying governance and policy barriers and facilitators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Laurence, E-mail: Laurence.carmichael@uwe.ac.uk; Barton, Hugh; Gray, Selena

    This article presents the results of a review of literature examining the barriers and facilitators in integrating health in spatial planning at the local, mainly urban level, through appraisals. Our literature review covered the UK and non-UK experiences of appraisals used to consider health issues in the planning process. We were able to identify four main categories of obstacles and facilitators: first, the different knowledge and conceptual understanding of health by different actors/stakeholders; second, the types of governance arrangements, in particular partnerships, in place and the political context; third, the way institutions work, the responsibilities they have and their capacity and resources; and fourth, the timeliness, comprehensiveness and inclusiveness of the appraisal process. The findings allowed us to draw some lessons on the governance and policy framework regarding the integration of health impact into spatial planning, in particular considering the pros and cons of integrating health impact assessment (HIA) into other forms of impact assessment of spatial planning decisions such as environmental impact assessment (EIA) and strategic environment assessment (SEA). In addition, the research uncovered a gap in the literature, which tends to focus on the mainly voluntary HIA to assess health outcomes of planning decisions and neglects the analysis of regulatory mechanisms such as EIA and SEA. - Highlights: • Governance and policy barriers and facilitators to the integration of health into urban planning. • Review of literature on impact assessment methods used across the world. • Knowledge, partnerships, management/resources and processes can impede integration. • HIA evaluations prevail, uncovering research opportunities for evaluating other techniques.

  20. Defining end-stage renal disease in clinical trials: a framework for adjudication.

    PubMed

    Agarwal, Rajiv

    2016-06-01

    Unlike the definitions of stroke and myocardial infarction, there is no uniformly agreed-upon definition for adjudicating end-stage renal disease (ESRD). ESRD remains the most unambiguous and clinically relevant end point for clinical trialists, regulators, payers and patients with chronic kidney disease. The prescription of dialysis to patients with advanced chronic kidney disease is subjective and great variations exist among physicians and countries. Given the difficulties in diagnosing ESRD, the presence of estimated GFR <15 mL/min/1.73 m² itself has been suggested as an end point. However, this definition is still a surrogate since many patients may live years without being symptomatic or needing dialysis. The purpose of this report is to describe a framework to define when the kidney function ends and when ESRD can be adjudicated. Discussed in this report are (i) the importance of diagnosing symptomatic uremia or advanced asymptomatic uremia, thus establishing the need for dialysis; (ii) establishing the chronicity of dialysis so as to distinguish it from acute dialysis; (iii) establishing ESRD when dialysis is unavailable, refused or considered futile; and (iv) the adjudication process. Several challenges and ambiguities that emerge in clinical trials and their possible solutions are provided. The criteria proposed herein may help to standardize the definition of ESRD and reduce the variability in adjudicating the most important renal end point in clinical trials of chronic kidney disease. Published by Oxford University Press on behalf of ERA-EDTA 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  1. Pre-defined and optional staging for the deployment of enterprise systems: a case study and a framework

    NASA Astrophysics Data System (ADS)

    Lichtenstein, Yossi; Cucuy, Shy; Fink, Lior

    2017-03-01

    The effective deployment of enterprise systems has been a major challenge for many organisations. Customising the new system, changing business processes, and integrating multiple information sources are all difficult tasks. As such, they are typically done in carefully planned stages in a process known as phased implementation. Using ideas from Option Theory, this article critiques aspects of phased implementation. One customer relationship management (CRM) project and its phased implementation are described in detail and ten other enterprise system deployments are summarised as a basis for the observation that almost all deployment stages are pre-defined operational steps rather than decision points. However, Option Theory suggests that optional stages, to be used only when risk materialises, should be integral parts of project plans. Although such optional stages are often more valuable than pre-defined stages, the evidence presented in this article shows that they are only rarely utilised. Therefore, a simple framework is presented; it first identifies risks related to the deployment of enterprise systems, then identifies optional stages that can mitigate these risks, and finally compares the costs and benefits of both pre-defined and optional stages.
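
    The cost-benefit comparison at the heart of the framework can be shown with a toy expected-cost calculation for a pre-defined stage (always executed), an optional stage (executed only if the risk materialises) and no mitigation. All probabilities and costs below are invented for illustration.

```python
# Toy expected-cost comparison; every number here is a made-up illustration.
p_risk = 0.3                 # probability that the deployment risk materialises
cost_predefined = 120_000    # cost of always running the mitigation stage
cost_optional = 150_000      # cost of the stage when triggered on demand (often higher)
cost_impact = 400_000        # cost of the unmitigated risk

expected_predefined = cost_predefined                 # always paid
expected_optional = p_risk * cost_optional            # paid only when the risk occurs
expected_do_nothing = p_risk * cost_impact            # expected damage with no stage

for name, value in [("pre-defined stage", expected_predefined),
                    ("optional stage", expected_optional),
                    ("no mitigation", expected_do_nothing)]:
    print(f"{name:18s} expected cost: {value:9,.0f}")
```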

  2. ``Force,'' ontology, and language

    NASA Astrophysics Data System (ADS)

    Brookes, David T.; Etkina, Eugenia

    2009-06-01

    We introduce a linguistic framework through which one can interpret systematically students’ understanding of and reasoning about force and motion. Some researchers have suggested that students have robust misconceptions or alternative frameworks grounded in everyday experience. Others have pointed out the inconsistency of students’ responses and presented a phenomenological explanation for what is observed, namely, knowledge in pieces. We wish to present a view that builds on and unifies aspects of this prior research. Our argument is that many students’ difficulties with force and motion are primarily due to a combination of linguistic and ontological difficulties. It is possible that students are primarily engaged in trying to define and categorize the meaning of the term “force” as spoken about by physicists. We found that this process of negotiation of meaning is remarkably similar to that engaged in by physicists in history. In this paper we will describe a study of the historical record that reveals an analogous process of meaning negotiation, spanning multiple centuries. Using methods from cognitive linguistics and systemic functional grammar, we will present an analysis of the force and motion literature, focusing on prior studies with interview data. We will then discuss the implications of our findings for physics instruction.

  3. Importance of dust storms in the diagenesis of sandstones: a case study, Entrada sandstone in the Ghost Ranch area, New Mexico, USA

    NASA Astrophysics Data System (ADS)

    Orhan, Hükmü

    1992-04-01

    The importance of dust storms in geological processes has only been studied recently. Case-hardening, desert-varnish formation, duricrust development, reddening and cementation of sediments, and caliche formation are some important geological processes related to dust storms. Dust storms can also be a major source for cements in aeolian sandstones. The Jurassic aeolian Entrada Formation in the Ghost Ranch area is composed of quartz with minor amounts of feldspar and rock fragments, and is cemented with smectite as grain coatings and calcite and kaolinite as pore fillings. Smectite shows a crinkly and honeycomb-like morphology which points to an authigenic origin. The absence of smectite as framework grains and the presence of partially dissolved grains, coated with smectite and smectite egg-shells, indicate an external source. Clay- and fine silt-size particles are believed to be the major source for the cements, smectite and calcite, in the Entrada Formation. The common association of kaolinite with altered feldspar, and the absence of kaolinite in spots heavily cemented with calcite, lead to the conclusions that the kaolinite formation postdates the carbonates and that framework feldspar grains were the source of the kaolinite.

  4. The Use of Crow-AMSAA Plots to Assess Mishap Trends

    NASA Technical Reports Server (NTRS)

    Dawson, Jeffrey W.

    2011-01-01

    Crow-AMSAA (CA) plots are used to model reliability growth. Use of CA plots has expanded into other areas, such as tracking events of interest to management, maintenance problems, and safety mishaps. Safety mishaps can often be successfully modeled using a Poisson probability distribution. CA plots show a Poisson process in log-log space. If the safety mishaps follow a stable homogeneous Poisson process, a linear fit to the points in a CA plot will have a slope of one. Slopes greater than one indicate a nonhomogeneous Poisson process with increasing occurrence. Slopes less than one indicate a nonhomogeneous Poisson process with decreasing occurrence. Changes in slope, known as "cusps," indicate a change in process, which could be an improvement or a degradation. After presenting the CA conceptual framework, examples are given of trending slips, trips and falls, and ergonomic incidents at NASA (from Agency-level data). Crow-AMSAA plotting is a robust tool for trending safety mishaps that can provide insight into safety performance over time.
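
    A minimal sketch of the underlying estimation is given below: cumulative event counts versus cumulative exposure time follow a power law N(t) = lambda * t^beta, which plots as a straight line of slope beta in log-log space; beta above one indicates increasing occurrence and below one decreasing occurrence. The maximum-likelihood fit and synthetic mishap times are illustrative assumptions, not NASA data.

```python
import numpy as np

def crow_amsaa_fit(event_times, t_end):
    """MLE of the Crow-AMSAA (power-law NHPP) parameters for a time-terminated test."""
    t = np.sort(np.asarray(event_times, dtype=float))
    n = len(t)
    beta = n / np.sum(np.log(t_end / t))      # shape: the slope on the log-log plot
    lam = n / t_end ** beta                   # scale
    return beta, lam

# Synthetic mishap times (days) drawn so that the occurrence rate increases (beta ~ 1.5)
# over a 1000-day observation window.
rng = np.random.default_rng(11)
times = 1000 * rng.uniform(0, 1, 40) ** (1 / 1.5)
beta, lam = crow_amsaa_fit(times, t_end=1000)
trend = "increasing" if beta > 1 else "decreasing" if beta < 1 else "stable"
print(f"beta = {beta:.2f} ({trend} occurrence), lambda = {lam:.4f}")
```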

  5. Mobile Ultrasound Plane Wave Beamforming on iPhone or iPad using Metal- based GPU Processing

    NASA Astrophysics Data System (ADS)

    Hewener, Holger J.; Tretbar, Steffen H.

    Mobile and cost-effective ultrasound devices are being used in point-of-care scenarios or the trauma room. To reduce the costs of such devices, we have already presented the possibilities of consumer devices like the Apple iPad for full signal processing of raw data for ultrasound image generation. Using technologies like plane wave imaging to generate a full image with only one excitation/reception event, the acquisition times and power consumption of ultrasound imaging can be reduced for low-power mobile devices based on consumer electronics, realizing the transition from FPGA- or ASIC-based beamforming to more flexible software beamforming. The massively parallel beamforming processing can be done with the Apple framework "Metal" for advanced graphics and general-purpose GPU processing on the iOS platform. We were able to integrate the beamforming reconstruction into our mobile ultrasound processing application with imaging rates of up to 70 Hz on iPad Air 2 hardware.
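
    The beamforming operation that the GPU parallelizes can be sketched compactly: for every image pixel, compute the plane-wave transmit delay plus the per-element receive delay, sample each channel at that delay and sum. The numpy sketch below (array geometry, pulse and sampling figures are illustrative assumptions) reconstructs a single point scatterer from synthetic channel data; the Metal implementation performs the same per-pixel sums in parallel on the GPU.

```python
import numpy as np

c, fs = 1540.0, 40e6                       # speed of sound (m/s), sampling rate (Hz)
n_elem, pitch = 64, 0.3e-3                 # linear array geometry (assumed)
elem_x = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch

# Synthetic channel data for a single point scatterer at (0, 20 mm).
scat_x, scat_z = 0.0, 20e-3
n_samp = 4096
t = np.arange(n_samp) / fs
rf = np.zeros((n_elem, n_samp))
for e in range(n_elem):
    tof = (scat_z + np.hypot(scat_x - elem_x[e], scat_z)) / c    # plane-wave TX + RX delay
    rf[e] = np.exp(-((t - tof) * 5e6) ** 2) * np.cos(2 * np.pi * 5e6 * (t - tof))

def das_beamform(rf, xs, zs):
    """Delay-and-sum image for a zero-degree plane-wave transmit."""
    img = np.zeros((len(zs), len(xs)))
    for iz, z in enumerate(zs):
        for ix, x in enumerate(xs):
            delays = (z + np.hypot(x - elem_x, z)) / c            # per-element delay
            samples = [np.interp(d, t, rf[e]) for e, d in enumerate(delays)]
            img[iz, ix] = abs(sum(samples))
    return img

xs = np.linspace(-5e-3, 5e-3, 41)
zs = np.linspace(15e-3, 25e-3, 41)
image = das_beamform(rf, xs, zs)
iz, ix = np.unravel_index(image.argmax(), image.shape)
print(f"brightest pixel at x = {xs[ix]*1e3:.1f} mm, z = {zs[iz]*1e3:.1f} mm")
```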

  6. It's about time: revisiting temporal processing deficits in dyslexia.

    PubMed

    Casini, Laurence; Pech-Georgel, Catherine; Ziegler, Johannes C

    2018-03-01

    Temporal processing in French children with dyslexia was evaluated in three tasks: a word identification task requiring implicit temporal processing, and two explicit temporal bisection tasks, one in the auditory and one in the visual modality. Normally developing children matched on chronological age and reading level served as a control group. Children with dyslexia exhibited robust deficits in temporal tasks whether they were explicit or implicit and whether they involved the auditory or the visual modality. First, they presented larger perceptual variability when performing temporal tasks, whereas they showed no such difficulties when performing the same task on a non-temporal dimension (intensity). This dissociation suggests that their difficulties were specific to temporal processing and could not be attributed to lapses of attention, reduced alertness, faulty anchoring, or overall noisy processing. In the framework of cognitive models of time perception, these data point to a dysfunction of the 'internal clock' of dyslexic children. These results are broadly compatible with the recent temporal sampling theory of dyslexia. © 2017 John Wiley & Sons Ltd.

  7. Spatial Modeling for Resources Framework (SMRF)

    USDA-ARS?s Scientific Manuscript database

    Spatial Modeling for Resources Framework (SMRF) was developed by Dr. Scott Havens at the USDA Agricultural Research Service (ARS) in Boise, ID. SMRF was designed to increase the flexibility of taking measured weather data and distributing the point measurements across a watershed. SMRF was developed...

  8. A Framework for Restructuring the Military Retirement System

    DTIC Science & Technology

    2013-07-01

    Associate Professor of Economics in the Social Sciences Department at West Point where he teaches econometrics and labor economics. His areas of...others worth considering, but each should be carefully benchmarked against our proposed framework.

  9. Changing behavior towards sustainable practices using Information Technology.

    PubMed

    Iveroth, Einar; Bengtsson, Fredrik

    2014-06-15

    This article addresses the question of how to change individuals' behavior towards more sustainable practices using Information Technology (IT). Following a multidisciplinary and socio-technical perspective, this inquiry is answered by applying a new framework, the Commonality Framework for IT-enabled Change, to a case study of sustainable behavioral change. The framework is grounded in practice theory and is used to analyze the implementation of an IT system aimed at changing citizens' behavior towards more sustainable transport logistics and procurement in Uppsala, Sweden. The article applies a case study research design and the empirical data consist of surveys, in-depth and semi-structured interviews, observations and archival documents. The results show how the change towards sustainable practices is an entanglement of both social and technical-structural elements across time. In this process, structures such as IT are the enablers, and the actors and their social activities are the tipping-point factors that ultimately determine the success of changing individuals' behavior in a more sustainable direction. This article provides a more balanced view than earlier research of how actor- and structure-related properties interact during ongoing work on change towards more sustainable practices. More specifically, the article offers both a lower-level theory and a method with which change processes can be analyzed, where technology is seen in its context and where both technology and the human actor are brought to center stage. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Shared mental models of integrated care: aligning multiple stakeholder perspectives.

    PubMed

    Evans, Jenna M; Baker, G Ross

    2012-01-01

    Health service organizations and professionals are under increasing pressure to work together to deliver integrated patient care. A common understanding of integration strategies may facilitate the delivery of integrated care across inter-organizational and inter-professional boundaries. This paper aims to build a framework for exploring and potentially aligning multiple stakeholder perspectives of systems integration. The authors draw from the literature on shared mental models, strategic management and change, framing, stakeholder management, and systems theory to develop a new construct, Mental Models of Integrated Care (MMIC), which consists of three types of mental models, i.e. integration-task, system-role, and integration-belief. The MMIC construct encompasses many of the known barriers and enablers to integrating care while also providing a comprehensive, theory-based framework of psychological factors that may influence inter-organizational and inter-professional relations. While the existing literature on integration focuses on optimizing structures and processes, the MMIC construct emphasizes the convergence and divergence of stakeholders' knowledge and beliefs, and how these underlying cognitions influence interactions (or lack thereof) across the continuum of care. MMIC may help to: explain what differentiates effective from ineffective integration initiatives; determine system readiness to integrate; diagnose integration problems; and develop interventions for enhancing integrative processes and ultimately the delivery of integrated care. Global interest and ongoing challenges in integrating care underline the need for research on the mental models that characterize the behaviors of actors within health systems; the proposed framework offers a starting point for applying a cognitive perspective to health systems integration.

  11. A proposed new framework for valorization of geoheritage in Norway

    NASA Astrophysics Data System (ADS)

    Dahl, Rolv; Bergengren, Anna; Heldal, Tom

    2015-04-01

    The geological history of Norway is a complex one. The exploitation of geological resources of different kinds has always provided the backbone of the Norwegian community. Nevertheless, the perception of geology and of the geological processes that created the landscape is little appreciated compared to biodiversity and cultural heritage. Some geological localities play an important role in our perception and scientific understanding of the landscape. Other localities are, or could be, important tourist destinations. Others can in turn be important for geoscience education at all levels, whereas others play a major role in the understanding of geodiversity and geoheritage and should be protected as natural monuments. A database based on old registrations has been compiled, and a web mapping server based on old and new registrations has recently been launched. However, no systematic classification and identification of important sites has been done for the last thirty years. We are now calling for a crowdsourcing process in the geological community in order to validate and valorize the registrations, as well as to define new points and areas of interest. Furthermore, we are developing a valorization system for these localities. The framework for this system is based on studies of inventories in other countries, as well as suggestions from ProGEO. The aim is to raise awareness of important sites, and of how they are treated and utilized for scientific or educational purposes, as tourist destinations or as heritage sites. Our presentation will focus on the development of the framework and its implications.

  12. Extraction of 3D Femur Neck Trabecular Bone Architecture from Clinical CT Images in Osteoporotic Evaluation: a Novel Framework.

    PubMed

    Sapthagirivasan, V; Anburajan, M; Janarthanam, S

    2015-08-01

    The early detection of osteoporosis risk enhances the lifespan and quality of life of an individual. A reasonable in-vivo assessment of trabecular bone strength at the proximal femur helps to evaluate fracture risk and, hence, to understand the associated structural dynamics in the occurrence of osteoporosis. The main aim of our study was to develop a framework to automatically determine trabecular bone strength from clinical femur CT images and thereby to estimate its correlation with BMD. All 50 studied South Indian female subjects, aged 30 to 80 years, underwent CT and DXA measurements at the right femur region. Initially, the original CT slices were intensified and an active contour model was utilised for the extraction of the neck region. After processing through a novel procedure called the trabecular enrichment approach (TEA), three-dimensional (3D) trabecular features were extracted. The extracted 3D trabecular features, such as volume fraction (VF), solidity of delta points (SDP) and boundness, demonstrated a significant correlation with femoral neck bone mineral density (r = 0.551, r = 0.432, r = 0.552 respectively) at p < 0.001. High area-under-the-curve values were observed for the extracted features (VF: 85.3 %, 95 % CI: 68.2-100 %; SDP: 82.1 %, 95 % CI: 65.1-98.9 %; boundness: 90.4 %, 95 % CI: 78.7-100 %). The findings suggest that the proposed framework with the TEA method would be useful for spotting women vulnerable to osteoporotic risk.
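
    The reported statistics (Pearson correlation of a trabecular feature with neck BMD, and area under the ROC curve) can be computed as in the sketch below; the feature and BMD arrays are synthetic stand-ins, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins for an extracted 3D trabecular feature and the
    # DXA-derived femoral neck BMD of 50 subjects.
    volume_fraction = rng.normal(0.25, 0.05, 50)
    neck_bmd = 2.0 * volume_fraction + rng.normal(0, 0.05, 50)
    osteoporotic = (neck_bmd < np.percentile(neck_bmd, 30)).astype(int)

    # Pearson correlation between the feature and BMD (cf. r = 0.551 for VF).
    r, p = pearsonr(volume_fraction, neck_bmd)
    print(f"r = {r:.3f}, p = {p:.3g}")

    # ROC area under the curve for discriminating the low-BMD group with the feature.
    auc = roc_auc_score(osteoporotic, -volume_fraction)  # lower VF -> higher risk
    print(f"AUC = {auc:.3f}")
    ```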

  13. Traces of Unconscious Mental Processes in Introspective Reports and Physiological Responses

    PubMed Central

    Ivonin, Leonid; Chang, Huang-Ming; Diaz, Marta; Catala, Andreu; Chen, Wei; Rauterberg, Matthias

    2015-01-01

    Unconscious mental processes have recently started gaining attention in a number of scientific disciplines. One of the theoretical frameworks for describing unconscious processes was introduced by Jung as a part of his model of the psyche. This framework uses the concept of archetypes that represent prototypical experiences associated with objects, people, and situations. Although the validity of the Jungian model remains an open question, this framework is convenient from the practical point of view. Moreover, archetypes have found numerous applications in the areas of psychology and marketing. Therefore, observation of both conscious and unconscious traces related to archetypal experiences seems to be an interesting research endeavor. In a study with 36 subjects, we examined the effects of experiencing conglomerations of unconscious emotions associated with various archetypes on the participants’ introspective reports and patterns of physiological activations. Our hypothesis for this experiment was that physiological data may predict archetypes more precisely than introspective reports due to the implicit nature of archetypal experiences. Introspective reports were collected using the Self-Assessment Manikin (SAM) technique. Physiological measures included cardiovascular, electrodermal, respiratory responses and skin temperature of the subjects. The subjects were stimulated to feel four archetypal experiences and four explicit emotions by means of film clips. The data related to the explicit emotions served as a reference in analysis of archetypal experiences. Our findings indicated that while prediction models trained on the collected physiological data could recognize the archetypal experiences with an accuracy of 55 percent, similar models built based on the SAM data demonstrated a performance of only 33 percent. Statistical tests enabled us to confirm that physiological observations are better suited for observation of implicit psychological constructs like archetypes than introspective reports. PMID:25875608

  14. Operationalizing resilience for adaptive coral reef management under global environmental change

    PubMed Central

    Anthony, Kenneth RN; Marshall, Paul A; Abdulla, Ameer; Beeden, Roger; Bergh, Chris; Black, Ryan; Eakin, C Mark; Game, Edward T; Gooch, Margaret; Graham, Nicholas AJ; Green, Alison; Heron, Scott F; van Hooidonk, Ruben; Knowland, Cheryl; Mangubhai, Sangeeta; Marshall, Nadine; Maynard, Jeffrey A; McGinnity, Peter; McLeod, Elizabeth; Mumby, Peter J; Nyström, Magnus; Obura, David; Oliver, Jamie; Possingham, Hugh P; Pressey, Robert L; Rowlands, Gwilym P; Tamelander, Jerker; Wachenfeld, David; Wear, Stephanie

    2015-01-01

    Cumulative pressures from global climate and ocean change combined with multiple regional and local-scale stressors pose fundamental challenges to coral reef managers worldwide. Understanding how cumulative stressors affect coral reef vulnerability is critical for successful reef conservation now and in the future. In this review, we present the case that strategically managing for increased ecological resilience (capacity for stress resistance and recovery) can reduce coral reef vulnerability (risk of net decline) up to a point. Specifically, we propose an operational framework for identifying effective management levers to enhance resilience and support management decisions that reduce reef vulnerability. Building on a system understanding of biological and ecological processes that drive resilience of coral reefs in different environmental and socio-economic settings, we present an Adaptive Resilience-Based management (ARBM) framework and suggest a set of guidelines for how and where resilience can be enhanced via management interventions. We argue that press-type stressors (pollution, sedimentation, overfishing, ocean warming and acidification) are key threats to coral reef resilience by affecting processes underpinning resistance and recovery, while pulse-type (acute) stressors (e.g. storms, bleaching events, crown-of-thorns starfish outbreaks) increase the demand for resilience. We apply the framework to a set of example problems for Caribbean and Indo-Pacific reefs. A combined strategy of active risk reduction and resilience support is needed, informed by key management objectives, knowledge of reef ecosystem processes and consideration of environmental and social drivers. As climate change and ocean acidification erode the resilience and increase the vulnerability of coral reefs globally, successful adaptive management of coral reefs will become increasingly difficult. Given limited resources, on-the-ground solutions are likely to focus increasingly on actions that support resilience at finer spatial scales, and that are tightly linked to ecosystem goods and services. PMID:25196132

  15. Traces of unconscious mental processes in introspective reports and physiological responses.

    PubMed

    Ivonin, Leonid; Chang, Huang-Ming; Diaz, Marta; Catala, Andreu; Chen, Wei; Rauterberg, Matthias

    2015-01-01

    Unconscious mental processes have recently started gaining attention in a number of scientific disciplines. One of the theoretical frameworks for describing unconscious processes was introduced by Jung as a part of his model of the psyche. This framework uses the concept of archetypes that represent prototypical experiences associated with objects, people, and situations. Although the validity of the Jungian model remains an open question, this framework is convenient from the practical point of view. Moreover, archetypes have found numerous applications in the areas of psychology and marketing. Therefore, observation of both conscious and unconscious traces related to archetypal experiences seems to be an interesting research endeavor. In a study with 36 subjects, we examined the effects of experiencing conglomerations of unconscious emotions associated with various archetypes on the participants' introspective reports and patterns of physiological activations. Our hypothesis for this experiment was that physiological data may predict archetypes more precisely than introspective reports due to the implicit nature of archetypal experiences. Introspective reports were collected using the Self-Assessment Manikin (SAM) technique. Physiological measures included cardiovascular, electrodermal, respiratory responses and skin temperature of the subjects. The subjects were stimulated to feel four archetypal experiences and four explicit emotions by means of film clips. The data related to the explicit emotions served as a reference in analysis of archetypal experiences. Our findings indicated that while prediction models trained on the collected physiological data could recognize the archetypal experiences with an accuracy of 55 percent, similar models built based on the SAM data demonstrated a performance of only 33 percent. Statistical tests enabled us to confirm that physiological observations are better suited for observation of implicit psychological constructs like archetypes than introspective reports.

  16. Measures and Metrics for Feasibility of Proof-of-Concept Studies With Human Immunodeficiency Virus Rapid Point-of-Care Technologies: The Evidence and the Framework.

    PubMed

    Pant Pai, Nitika; Chiavegatti, Tiago; Vijh, Rohit; Karatzas, Nicolaos; Daher, Jana; Smallwood, Megan; Wong, Tom; Engel, Nora

    2017-12-01

    Pilot (feasibility) studies form a vast majority of diagnostic studies with point-of-care technologies but often lack use of clear measures/metrics and a consistent framework for reporting and evaluation. To fill this gap, we systematically reviewed data to (a) catalog feasibility measures/metrics and (b) propose a framework. For the period January 2000 to March 2014, 2 reviewers searched 4 databases (MEDLINE, EMBASE, CINAHL, Scopus), retrieved 1441 citations, and abstracted data from 81 studies. We observed 2 major categories of measures, that is, implementation centered and patient centered, and 4 subcategories of measures, that is, feasibility, acceptability, preference, and patient experience. We defined and delineated metrics and measures for a feasibility framework. We documented impact measures for a comparison. We observed heterogeneity in reporting of metrics as well as misclassification and misuse of metrics within measures. Measures and metrics for feasibility, preference, and patient experience were poorly defined; in contrast, the acceptability measure was the best defined. For example, within feasibility, metrics such as consent, completion, new infection, linkage rates, and turnaround times were misclassified and reported. Similarly, patient experience was variously reported as test convenience, comfort, pain, and/or satisfaction. In contrast, within impact measures, all the metrics were well documented, thus serving as a good baseline comparator. With our framework, we classified, delineated, and defined quantitative measures and metrics for feasibility. Our framework, with its defined measures/metrics, could reduce misclassification and improve the overall quality of reporting for monitoring and evaluation of rapid point-of-care technology strategies and their context-driven optimization.

  17. Automated integration of wireless biosignal collection devices for patient-centred decision-making in point-of-care systems

    PubMed Central

    Menychtas, Andreas; Tsanakas, Panayiotis

    2016-01-01

    The proper acquisition of biosignals data from various biosensor devices and their remote accessibility are still issues that prevent the wide adoption of point-of-care systems in the routine monitoring of chronic patients. This Letter presents an advanced framework for enabling patient monitoring that utilises a cloud computing infrastructure for data management and analysis. The framework also introduces a local mechanism for uniform biosignals collection from wearables and biosignal sensors, together with decision support modules, in order to enable prompt and essential decisions. A prototype smartphone application and the related cloud modules have been implemented to demonstrate the value of the proposed framework. Initial results regarding the performance of the system and its effectiveness in data management and decision-making have been quite encouraging. PMID:27222731

  18. Automated integration of wireless biosignal collection devices for patient-centred decision-making in point-of-care systems.

    PubMed

    Menychtas, Andreas; Tsanakas, Panayiotis; Maglogiannis, Ilias

    2016-03-01

    The proper acquisition of biosignals data from various biosensor devices and their remote accessibility are still issues that prevent the wide adoption of point-of-care systems in the routine monitoring of chronic patients. This Letter presents an advanced framework for enabling patient monitoring that utilises a cloud computing infrastructure for data management and analysis. The framework also introduces a local mechanism for uniform biosignals collection from wearables and biosignal sensors, together with decision support modules, in order to enable prompt and essential decisions. A prototype smartphone application and the related cloud modules have been implemented to demonstrate the value of the proposed framework. Initial results regarding the performance of the system and its effectiveness in data management and decision-making have been quite encouraging.

  19. A pose estimation method for unmanned ground vehicles in GPS denied environments

    NASA Astrophysics Data System (ADS)

    Tamjidi, Amirhossein; Ye, Cang

    2012-06-01

    This paper presents a pose estimation method based on the 1-Point RANSAC EKF (Extended Kalman Filter) framework. The method fuses depth data from a LIDAR and visual data from a monocular camera to estimate the pose of an Unmanned Ground Vehicle (UGV) in a GPS-denied environment. Its estimation framework continuously updates the vehicle's 6D pose state and temporary estimates of the extracted visual features' 3D positions. In contrast to conventional EKF-SLAM (Simultaneous Localization And Mapping) frameworks, the proposed method discards feature estimates from the extended state vector once they are no longer observed for several steps. As a result, the extended state vector always maintains a reasonable size that is suitable for online calculation. The fusion of laser and visual data is performed both in the feature initialization part of the EKF-SLAM process and in the motion prediction stage. A RANSAC pose calculation procedure is devised to produce a pose estimate for the motion model. The proposed method has been successfully tested on the Ford campus LIDAR-Vision dataset. The results are compared with the ground truth data of the dataset and the estimation error is ~1.9% of the path length.
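
    The paper's estimator is a 6D-pose EKF with feature augmentation and 1-Point RANSAC gating; the minimal sketch below shows only the generic EKF predict/update cycle for a planar (3-DOF) pose with a unicycle motion model and a direct position measurement, as a simplified stand-in for that framework rather than a reproduction of it.

    ```python
    import numpy as np

    def ekf_predict(x, P, u, Q, dt):
        """Predict step with a simple unicycle motion model.
        State x = [px, py, theta]; control u = [v, omega]."""
        v, w = u
        px, py, th = x
        x_pred = np.array([px + v * dt * np.cos(th),
                           py + v * dt * np.sin(th),
                           th + w * dt])
        F = np.array([[1, 0, -v * dt * np.sin(th)],
                      [0, 1,  v * dt * np.cos(th)],
                      [0, 0,  1]])
        return x_pred, F @ P @ F.T + Q

    def ekf_update(x, P, z, R):
        """Update with a direct observation of position (e.g., a pose fix
        computed from fused laser/visual features)."""
        H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        return x + K @ y, (np.eye(3) - K @ H) @ P
    ```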

  20. Reminiscence through the Lens of Social Media

    PubMed Central

    Thomas, Lisa; Briggs, Pam

    2016-01-01

    Reminiscence is used to support and create new social bonds and give meaning to life. Originally perceived as a preoccupation of the aged, we now recognize that reminiscence has value throughout the lifespan. Increasingly, social media can be used to both support and prompt reminiscence, with Facebook’s Lookback or Year in Review as recent examples. This work takes prompted reminiscence further, asking what forms and functions of reminiscence are supported by social media. Utilizing the online service MySocialBook, we invited participants to curate content from their personal Facebook account to then be transformed into a printed book. We used that book as a prompt for discussion of the reminiscence function of the curated material, using Westerhof and Bohlmeijer’s (2014) reminiscence framework as a starting point. We conclude that this framework is valuable in understanding the role of social media in reminiscence, but note that earlier models, such as Webster’s Reminiscence Functions Scale, are also relevant. We contribute to the reminiscence debate by adding a technological lens to the process of life review, whilst concurring with other researchers in this field that a robust conceptual framework is lacking, particularly when considering the forms of reminiscence that are most salient for younger people. PMID:27378971

  1. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    PubMed Central

    Elgendi, Mohamed

    2016-01-01

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks in order to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to for the first time as two event-related moving averages (“TERMA”), uses event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. The results recommend that the window sizes of the two moving averages (W1 and W2) follow the inequality (8×W1) ≥ W2 ≥ (2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
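
    A minimal sketch of the TERMA idea is given below, assuming a one-dimensional signal and illustrative window sizes and offset; it enforces the recommended inequality 2×W1 ≤ W2 ≤ 8×W1 and reports one peak per block of interest. The exact enhancement and thresholding rules in the published detectors differ per biosignal, so this is only an approximation of the approach.

    ```python
    import numpy as np

    def moving_average(x, w):
        return np.convolve(x, np.ones(w) / w, mode="same")

    def terma_detect(signal, w1, w2, beta=0.08):
        """Sketch of TERMA-style event detection with two event-related moving
        averages. Window sizes should respect 2*w1 <= w2 <= 8*w1."""
        assert 2 * w1 <= w2 <= 8 * w1, "window sizes violate the recommended inequality"
        x = np.maximum(signal, 0) ** 2            # simple enhancement of peaks
        ma_event = moving_average(x, w1)          # short window follows individual events
        ma_cycle = moving_average(x, w2)          # long window follows the underlying cycle
        threshold = ma_cycle + beta * np.mean(x)  # dynamic threshold
        blocks = ma_event > threshold             # candidate "blocks of interest"
        peaks, start = [], None
        for i, b in enumerate(blocks):
            if b and start is None:
                start = i
            elif not b and start is not None:
                peaks.append(start + int(np.argmax(signal[start:i])))
                start = None
        if start is not None:                     # block running to the end of the signal
            peaks.append(start + int(np.argmax(signal[start:])))
        return peaks
    ```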

  2. Phase transition of the susceptible-infected-susceptible dynamics on time-varying configuration model networks

    NASA Astrophysics Data System (ADS)

    St-Onge, Guillaume; Young, Jean-Gabriel; Laurence, Edward; Murphy, Charles; Dubé, Louis J.

    2018-02-01

    We present a degree-based theoretical framework to study the susceptible-infected-susceptible (SIS) dynamics on time-varying (rewired) configuration model networks. Using this framework on a given degree distribution, we provide a detailed analysis of the stationary state using the rewiring rate to explore the whole range of the time variation of the structure relative to that of the SIS process. This analysis is suitable for the characterization of the phase transition and leads to three main contributions: (1) We obtain a self-consistent expression for the absorbing-state threshold, able to capture both collective and hub activation. (2) We recover the predictions of a number of existing approaches as limiting cases of our analysis, providing thereby a unifying point of view for the SIS dynamics on random networks. (3) We obtain bounds for the critical exponents of a number of quantities in the stationary state. This allows us to reinterpret the concept of hub-dominated phase transition. Within our framework, it appears as a heterogeneous critical phenomenon: observables for different degree classes have a different scaling with the infection rate. This phenomenon is followed by the successive activation of the degree classes beyond the epidemic threshold.
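
    One of the limiting cases mentioned above, the classic heterogeneous mean-field (annealed-network) approximation, can be evaluated with a simple self-consistent iteration; the sketch below assumes a toy degree distribution and a recovery rate of one and is not the paper's full degree-based framework for finite rewiring rates.

    ```python
    import numpy as np

    def hmf_stationary_prevalence(deg, pk, lam, iters=5000, tol=1e-10):
        """Heterogeneous mean-field stationary state of SIS on an annealed network,
        a limit the degree-based framework recovers when rewiring is fast.
        deg/pk give the degree distribution, lam is the per-edge infection rate."""
        mean_k = np.sum(deg * pk)
        theta = 0.01                                             # initial edge prevalence
        for _ in range(iters):
            rho_k = lam * deg * theta / (1.0 + lam * deg * theta)  # stationary rho_k given theta
            new_theta = np.sum(deg * pk * rho_k) / mean_k
            if abs(new_theta - theta) < tol:
                break
            theta = new_theta
        return np.sum(pk * rho_k), theta

    # toy truncated power-law degree distribution
    deg = np.arange(3, 101)
    pk = deg ** -2.5
    pk = pk / pk.sum()
    prev, _ = hmf_stationary_prevalence(deg, pk, lam=0.08)
    print(f"stationary prevalence ~ {prev:.4f}")
    ```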

  3. On the Origin of a Maximum Peak Pressure on the Target Outside of the Stagnation Point upon Normal Impact of a Blunt Projectile and with Underwater Explosion

    NASA Astrophysics Data System (ADS)

    Gonor, Alexander; Hooton, Irene

    2006-07-01

    The impact of a rigid projectile (impactor) against a metal target or a condensed-explosive surface, an important process accompanying the normal entry of a rigid projectile into a target, was overlooked in preceding studies. Within the framework of accurate shock wave theory, the flow field behind the shock wave attached to the perimeter of the adjoined surface was defined. An important result is that the peak pressure rises at points along the target surface away from the stagnation point. The maximum values of the peak pressure are 2.2 to 3.2 times higher for the metallic and soft targets (nitromethane, PBX 9502) than the peak pressure values at the stagnation point. This effect changes the commonly held notion that the maximum peak pressure is reached at the projectile stagnation point. In the present study, the interaction of a spherical decaying blast wave, caused by an underwater explosion, with a piece-wise plane target having corner configurations is investigated. The numerical calculation results in the determination of the vulnerable spots on the target, where the maximum peak overpressure surpasses that for head-on shock wave reflection by a factor of 4.

  4. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    PubMed

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human Influenza A viruses. In both cases, we recover more of the believed features of the viral demographic histories than the GMRF approach does. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.
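
    To illustrate how coalescent intervals carry information about population size, the sketch below implements the classic skyline estimator, which produces one estimate of N per coalescent interval; it is a far cruder tool than the Gaussian-process method of the paper and is shown only for intuition about the underlying point process.

    ```python
    import numpy as np

    def classic_skyline(coal_times, n_samples):
        """Classic skyline estimate of effective population size from the coalescent
        times of an isochronous genealogy with n_samples tips. During an interval
        with k lineages, the waiting time is exponential with rate C(k,2)/N, so
        N_hat = C(k,2) * interval_length."""
        times = np.concatenate(([0.0], np.sort(coal_times)))
        estimates = []
        for i in range(1, len(times)):
            k = n_samples - (i - 1)                 # lineages present during the interval
            w = times[i] - times[i - 1]             # interval length
            estimates.append(k * (k - 1) / 2 * w)   # one estimate of N per interval
        return times[1:], np.array(estimates)

    # toy genealogy of 5 tips (4 coalescent events)
    grid, n_hat = classic_skyline([0.1, 0.25, 0.6, 1.4], n_samples=5)
    print(dict(zip(np.round(grid, 2), np.round(n_hat, 2))))
    ```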

  5. Reconstruction of Consistent 3d CAD Models from Point Cloud Data Using a Priori CAD Models

    NASA Astrophysics Data System (ADS)

    Bey, A.; Chaine, R.; Marc, R.; Thibault, G.; Akkouche, S.

    2011-09-01

    We address the reconstruction of 3D CAD models from point cloud data acquired in industrial environments, using a pre-existing 3D model as an initial estimate of the scene to be processed. Indeed, this prior knowledge can be used to drive the reconstruction so as to generate an accurate 3D model matching the point cloud. In particular, we focus on the cylindrical parts of the 3D models. We propose to state the problem in a probabilistic framework: we search for the 3D model which maximizes a probability taking several constraints into account, such as the relevancy with respect to the point cloud and the a priori 3D model, and the consistency of the reconstructed model. The resulting optimization problem can then be handled using a stochastic exploration of the solution space, based on the random insertion of elements in the configuration under construction, coupled with a greedy management of the conflicts which efficiently improves the configuration at each step. We show that this approach provides reliable reconstructed 3D models by presenting results on industrial data sets.

  6. PointCom: semi-autonomous UGV control with intuitive interface

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham

    2008-04-01

    Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.

  7. Federated Process Framework in a Virtual Enterprise Using an Object-Oriented Database and Extensible Markup Language.

    ERIC Educational Resources Information Center

    Bae, Kyoung-Il; Kim, Jung-Hyun; Huh, Soon-Young

    2003-01-01

    Discusses process information sharing among participating organizations in a virtual enterprise and proposes a federated process framework and system architecture that provide a conceptual design for effective implementation of process information sharing supporting the autonomy and agility of the organizations. Develops the framework using an…

  8. NASA System Safety Framework and Concepts for Implementation

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2012-01-01

    This report has been developed through the National Aeronautics and Space Administration (NASA) Human Exploration and Operations Mission Directorate (HEOMD) Risk Management team's knowledge capture forums. It provides a point-in-time, cumulative summary of actionable key lessons learned on safety frameworks and concepts.

  9. A two-step framework for the registration of HE stained and FTIR images

    NASA Astrophysics Data System (ADS)

    Peñaranda, Francisco; Naranjo, Valery; Verdú, Rafaél.; Lloyd, Gavin R.; Nallala, Jayakrupakar; Stone, Nick

    2016-03-01

    FTIR spectroscopy is an emerging technology with high potential for cancer diagnosis but with particular physical phenomena that require special processing. Little work has been done in the field with the aim of registering hyperspectral Fourier-Transform Infrared (FTIR) spectroscopic images and Hematoxylin and Eosin (HE) stained histological images of contiguous slices of tissue. This registration is necessary to transfer the location of relevant structures that the pathologist may identify in the gold-standard HE images. A two-step registration framework is presented in which a representative gray image extracted from the FTIR hypercube is used as an input. This representative image, which must have a spatial contrast as similar as possible to a gray image obtained from the HE image, is calculated from the spectral variation in the fingerprint region. In the first step of the registration algorithm, a similarity transformation is estimated from interest points, which are automatically detected by the popular SURF algorithm. In the second stage, a variational registration framework defined in the frequency domain compensates for local anatomical variations between the two images. After proper tuning of a few parameters, the proposed registration framework works in an automated way. The method was tested on 7 samples of colon tissue in different stages of cancer. Very promising qualitative and quantitative results were obtained (a mean correlation ratio of 92.16% with a standard deviation of 3.10%).
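
    The first (global) step of such a pipeline can be sketched with OpenCV as below. ORB keypoints are used in place of SURF (which sits in OpenCV's non-free module), and the variational, frequency-domain refinement of the second step is not reproduced; the sketch only estimates and applies a similarity transform from matched interest points.

    ```python
    import cv2
    import numpy as np

    def estimate_similarity(fixed_gray, moving_gray):
        """Estimate a similarity transform (rotation, uniform scale, translation)
        mapping the moving image onto the fixed image from matched keypoints."""
        orb = cv2.ORB_create(2000)
        k1, d1 = orb.detectAndCompute(fixed_gray, None)
        k2, d2 = orb.detectAndCompute(moving_gray, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]
        src = np.float32([k2[m.queryIdx].pt for m in matches])   # points in moving image
        dst = np.float32([k1[m.trainIdx].pt for m in matches])   # points in fixed image
        # RANSAC-based estimation of the partial affine (similarity) transform
        M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
        registered = cv2.warpAffine(moving_gray, M,
                                    (fixed_gray.shape[1], fixed_gray.shape[0]))
        return M, registered
    ```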

  10. Automated segmentation and tracking of non-rigid objects in time-lapse microscopy videos of polymorphonuclear neutrophils.

    PubMed

    Brandes, Susanne; Mokhtari, Zeinab; Essig, Fabian; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo

    2015-02-01

    Time-lapse microscopy is an important technique to study the dynamics of various biological processes. The labor-intensive manual analysis of microscopy videos is increasingly replaced by automated segmentation and tracking methods. These methods are often limited to certain cell morphologies and/or cell stainings. In this paper, we present an automated segmentation and tracking framework that does not have these restrictions. In particular, our framework handles highly variable cell shapes and does not rely on any cell stainings. Our segmentation approach is based on a combination of spatial and temporal image variations to detect moving cells in microscopy videos. This method yields a sensitivity of 99% and a precision of 95% in object detection. The tracking of cells consists of several steps: single-cell tracking based on a nearest-neighbor approach, detection of cell-cell interactions and splitting of cell clusters, and finally combining tracklets using methods from graph theory. The segmentation and tracking framework was applied to synthetic as well as experimental datasets with varying cell densities implying different numbers of cell-cell interactions. We established a validation framework to measure the performance of our tracking technique. The cell tracking accuracy was found to be >99% for all datasets indicating a high accuracy for connecting the detected cells between different time points. Copyright © 2014 Elsevier B.V. All rights reserved.
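
    The single-cell tracking step can be sketched as a greedy nearest-neighbour linking of detected centroids between consecutive frames, as below; the detection of cell-cell interactions, cluster splitting and graph-based tracklet merging described in the paper are not included, and the distance gate is an assumed parameter.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def link_frames(centroids_t, centroids_t1, max_dist=20.0):
        """Greedy nearest-neighbour linking of cell centroids between frame t and
        frame t+1; each target cell is used at most once."""
        tree = cKDTree(centroids_t1)
        dists, idx = tree.query(centroids_t, k=1)
        links, used = [], set()
        for i in np.argsort(dists):                 # closest pairs first
            j = int(idx[i])
            if dists[i] <= max_dist and j not in used:
                links.append((int(i), j))           # cell i in frame t -> cell j in frame t+1
                used.add(j)
        return links

    # toy example: three cells drifting slightly between frames
    print(link_frames(np.array([[0, 0], [10, 5], [30, 30]]),
                      np.array([[1, 1], [11, 4], [29, 31]])))
    ```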

  11. Justice and Negotiation.

    PubMed

    Druckman, Daniel; Wagner, Lynn M

    2016-01-01

    This review article examines the literature regarding the role played by principles of justice in negotiation. Laboratory experiments and high-stakes negotiations reveal that justice is a complex concept, both in relation to attaining just outcomes and to establishing just processes. We focus on how justice preferences guide the process and outcome of negotiated exchanges. Focusing primarily on the two types of principles that have received the most attention, distributive justice (outcomes of negotiation) and procedural justice (process of negotiation), we introduce the topic by reviewing the most relevant experimental and field or archival research on the roles played by these justice principles in negotiation. A discussion of the methods used in these studies precedes a review organized in terms of a framework that highlights the concept of negotiating stages. We also develop hypotheses based on the existing literature to point the way forward for further research on this topic.

  12. Segmentation of Large Unstructured Point Clouds Using Octree-Based Region Growing and Conditional Random Fields

    NASA Astrophysics Data System (ADS)

    Bassier, M.; Bonduel, M.; Van Genechten, B.; Vergauwen, M.

    2017-11-01

    Point cloud segmentation is a crucial step in scene understanding and interpretation. The goal is to decompose the initial data into sets of workable clusters with similar properties. Additionally, it is a key aspect in the automated procedure from point cloud data to BIM. Current approaches typically only segment a single type of primitive such as planes or cylinders. Also, current algorithms suffer from oversegmenting the data and are often sensor or scene dependent. In this work, a method is presented to automatically segment large unstructured point clouds of buildings. More specifically, the segmentation is formulated as a graph optimisation problem. First, the data is oversegmented with a greedy octree-based region growing method. The growing is conditioned on the segmentation of planes as well as smooth surfaces. Next, the candidate clusters are represented by a Conditional Random Field after which the most likely configuration of candidate clusters is computed given a set of local and contextual features. The experiments show that the proposed method is a fast and reliable framework for unstructured point cloud segmentation. Processing speeds up to 40,000 points per second are recorded for the region growing. Additionally, the recall and precision of the graph clustering are approximately 80%. Overall, oversegmentation is reduced by nearly 22% by clustering the data. These clusters will be classified and used as a basis for the reconstruction of BIM models.
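
    A stripped-down version of the oversegmentation step is sketched below: greedy region growing over a k-nearest-neighbour graph with a normal-deviation threshold. The octree structure, the smooth-surface condition and the Conditional Random Field clustering of the paper are omitted, and the per-point unit normals are assumed to be given.

    ```python
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def region_growing(points, normals, k=10, angle_thresh_deg=10.0):
        """Greedy region growing: a point joins the current region when its normal
        deviates less than a threshold from the seed point's normal."""
        nn = NearestNeighbors(n_neighbors=k).fit(points)
        _, neigh = nn.kneighbors(points)
        cos_thresh = np.cos(np.deg2rad(angle_thresh_deg))
        labels = -np.ones(len(points), dtype=int)
        region = 0
        for seed in range(len(points)):
            if labels[seed] != -1:
                continue
            stack, labels[seed] = [seed], region
            while stack:
                p = stack.pop()
                for q in neigh[p]:
                    if labels[q] == -1 and abs(np.dot(normals[seed], normals[q])) > cos_thresh:
                        labels[q] = region
                        stack.append(q)
            region += 1
        return labels
    ```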

  13. Social determinants and lifestyles: integrating environmental and public health perspectives.

    PubMed

    Graham, H; White, P C L

    2016-12-01

    Industrialization and urbanization have been associated with an epidemiological transition, from communicable to non-communicable disease, and a geological transition that is moving the planet beyond the stable Holocene epoch in which human societies have prospered. The lifestyles of high-income countries are major drivers of these twin processes. Our objective is to highlight the common causes of chronic disease and environmental change and, thereby, contribute to shared perspectives across public health and the environment. Integrative reviews focused on social determinants and lifestyles as two 'bridging' concepts between the fields of public health and environmental sustainability. We drew on established frameworks to consider the position of the natural environment within social determinants of health (SDH) frameworks and the position of social determinants within environmental frameworks. We drew on evidence on lifestyle factors central to both public health and environmental change (mobility- and diet-related factors). We investigated how public health's focus on individual behaviour can be enriched by environmental perspectives that give attention to household consumption practices. While SDH frameworks can incorporate the biophysical environment, their causal structure positions it as a determinant and one largely separate from the social factors that shape it. Environmental frameworks are more likely to represent the environment and its ecosystems as socially determined. A few frameworks also include human health as an outcome, providing the basis for a combined public health/environmental sustainability framework. Environmental analyses of household impacts broaden public health's concern with individual risk behaviours, pointing to the more damaging lifestyles of high-income households. The conditions for health are being undermined by rapid environmental change. There is scope for frameworks reaching across public health and environmental sustainability and a shared evidence base that captures the health- and environmentally damaging impacts of high-consumption lifestyles. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  14. Hierarchical Higher Order Crf for the Classification of Airborne LIDAR Point Clouds in Urban Areas

    NASA Astrophysics Data System (ADS)

    Niemeyer, J.; Rottensteiner, F.; Soergel, U.; Heipke, C.

    2016-06-01

    We propose a novel hierarchical approach for the classification of airborne 3D lidar points. Spatial and semantic context is incorporated via a two-layer Conditional Random Field (CRF). The first layer operates on a point level and utilises higher order cliques. Segments are generated from the labelling obtained in this way. They are the entities of the second layer, which incorporates larger scale context. The classification result of the segments is introduced as an energy term for the next iteration of the point-based layer. This framework iterates and mutually propagates context to improve the classification results. Potentially wrong decisions can be revised at later stages. The output is a labelled point cloud as well as segments roughly corresponding to object instances. Moreover, we present two new contextual features for the segment classification: the distance and the orientation of a segment with respect to the closest road. It is shown that the classification benefits from these features. In our experiments the hierarchical framework improves the overall accuracies by 2.3% at the point-based level and by 3.0% at the segment-based level, respectively, compared to a purely point-based classification.

  15. Sensitivity study of experimental measures for the nuclear liquid-gas phase transition in the statistical multifragmentation model

    NASA Astrophysics Data System (ADS)

    Lin, W.; Ren, P.; Zheng, H.; Liu, X.; Huang, M.; Wada, R.; Qu, G.

    2018-05-01

    The experimental measures—the multiplicity derivatives, the moment parameters, the bimodal parameter, the fluctuation of the maximum fragment charge number (normalized variance of Zmax, or NVZ), the Fisher exponent (τ), and the Zipf law parameter (ξ)—are examined to search for the liquid-gas phase transition in nuclear multifragmentation processes within the framework of the statistical multifragmentation model (SMM). The sensitivities of these measures are studied. All these measures predict a critical signature at or near the critical point for both the primary and secondary fragments. Among these measures, the total multiplicity derivative and the NVZ provide accurate measures for the critical point from the final cold fragments as well as the primary fragments. The present study will provide a guide for future experiments and analyses in the study of the nuclear liquid-gas phase transition.
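
    Of the measures listed, the normalized variance of Zmax (NVZ) is the simplest to compute from an event ensemble; the sketch below assumes per-event fragment charge lists and uses the usual definition, the variance of Zmax divided by its mean. The toy input is illustrative, not SMM output.

    ```python
    import numpy as np

    def normalized_variance_zmax(events):
        """NVZ over an ensemble of multifragmentation events; 'events' is a list of
        per-event fragment charge lists. A peak of NVZ versus excitation energy is
        one of the critical-point signals examined in the paper."""
        zmax = np.array([max(ev) for ev in events])
        return np.var(zmax) / np.mean(zmax)

    # hypothetical toy ensemble of three events
    print(normalized_variance_zmax([[28, 6, 2, 1], [20, 10, 5], [35, 3, 2, 2]]))
    ```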

  16. Effect of aging and ice structuring proteins on the morphology of frozen hydrated gluten networks.

    PubMed

    Kontogiorgos, Vassilis; Goff, H Douglas; Kasapis, Stefan

    2007-04-01

    The present investigation constitutes an attempt to rationalize the effect of aging and ice structuring proteins (ISPs) on the network morphology of frozen hydrated gluten. In doing so, it employs differential scanning calorimetry, time-domain NMR, dynamic oscillation on shear, creep testing, and electron microscopy. Experimentation and first-principles modeling allow identification and description of the processes of ice formation and recrystallization in molecular terms. It is demonstrated that in the absence of a readily discernible glass transition temperature in gluten-ice composites, the approach of considering the melting point and aging at constant or fluctuating temperature conditions in the vicinity of this point can provide a valid index of functional quality. A theoretical framework supporting the concept of capillary-confined frozen water in the gluten matrix was advanced, and it was found that ISPs were effective in controlling recrystallization both within these confines and within ice in the bulk.

  17. Toward an Ethical Framework for Climate Services

    NASA Astrophysics Data System (ADS)

    Wilby, R.; Adams, P.; Eitland, E.; Hewitson, B.; Shumake, J.; Vaughan, C.; Zebiak, S. E.

    2015-12-01

    Climate services offer information and tools to help stakeholders anticipate and/or manage risks posed by climate change. However, climate services lack a cohesive ethical framework to govern their development and application. This paper describes a prototype, open-ended process to form a set of ethical principles to ensure that climate services are effectively deployed to manage climate risks, realize opportunities, and advance human security. We begin by acknowledging the multiplicity of competing interests and motivations across individuals and institutions. Growing awareness of potential climate impacts has raised interest and investments in climate services and led to the entrance of new providers. User demand for climate services is also rising, as are calls for new types of services. Meanwhile, there is growing pressure from funders to operationalize climate research. Our proposed ethical framework applies reference points founded on diverse experiences in western and developing countries, fundamental and applied climate research, different sectors, gender, and professional practice (academia, private sector, government). We assert that climate service providers should be accountable for both their practices and products by upholding values of integrity, transparency, humility, and collaboration. Principles of practice include: communicating all value judgements; eschewing climate change as a singular threat; engaging in the co-exploration of knowledge; establishing mechanisms for monitoring/evaluating procedures and products; declaring any conflicts of interest. Examples of principles of products include: clear and defensible provenance of information; descriptions of the extent and character of uncertainties using terms that are meaningful to intended users; tools and information that are tailored to the context of the user; and thorough documentation of methods and meta-data. We invite the community to test and refine these points.

  18. A proposed framework to operationalize ESS for the mitigation of soil threats

    NASA Astrophysics Data System (ADS)

    Schwilch, Gudrun; Bernet, Lea; Fleskens, Luuk; Mills, Jane; Stolte, Jannes; van Delden, Hedwig; Verzandvoort, Simone

    2015-04-01

    Despite various research activities in the last decades across the world, many challenges remain to integrate the concept of ecosystem services (ESS) in decision-making, and a coherent approach to assess and value ESS is still lacking. There are a lot of different - often context-specific - ESS frameworks with their own definitions and understanding of terms. Based on a thorough review, the EU FP7 project RECARE (www.recare-project.eu) suggests an adapted framework for ecosystem services related to soils that can be used for practical application in preventing and remediating degradation of soils in Europe. This lays the foundation for the development and selection of appropriate methods to measure, evaluate, communicate and negotiate the services we obtain from soils with stakeholders in order to improve land management. Similar to many ESS frameworks, the RECARE framework distinguishes between an ecosystem and human well-being part. As the RECARE project is focused on soil threats, this is the starting point on the ecosystem part of the framework. Soil threats affect natural capital, such as soil, water, vegetation, air and animals, and are in turn influenced by those. Within the natural capital, the RECARE framework focuses especially on soil and its properties, classified in inherent and manageable properties. The natural capital then enables and underpins soil processes, while at the same time being affected by those. Soil processes, finally, are the ecosystem's capacity to provide services, thus they support the provision of soil functions and ESS. ESS may be utilized to produce benefits for individuals and human society. Those benefits are explicitly or implicitly valued by individuals and human society. The values placed on those benefits influence policy and decision-making and thus lead to a societal response. Individual (e.g. farmers') and societal decision making and policy determine land management and other (human) driving forces, which in turn affect soil threats and natural capital. In order to improve ESS with Sustainable Land Management (SLM) - i.e. measures aimed to prevent or remediate soil threats, the services identified in the framework need to be "manageable" (modifiable) for the stakeholders. To this end, effects of soil threats and prevention / remediation measures are captured by key soil properties as well as through bio-physical (e.g. reduced soil loss), socio-economic (e.g. reduced workload) and socio-cultural (e.g. aesthetics) impact indicators. In order to use such indicators in RECARE, it should be possible to associate the changes in soil processes to impacts of prevention / remediation measures (SLM). This requires the indicators to be sensitive enough to small changes, but still sufficiently robust to provide evidence of the change and attribute it to SLM. The RECARE ESS framework will be presented and discussed in order to further develop its operationalization. Inputs from the conference participants are highly welcome.

  19. Influence of scattering processes on electron quantum states in nanowires

    PubMed Central

    Galenchik, Vadim; Borzdov, Andrei; Borzdov, Vladimir; Komarov, Fadei

    2007-01-01

    In the framework of quantum perturbation theory the self-consistent method of calculation of electron scattering rates in nanowires with the one-dimensional electron gas in the quantum limit is worked out. The developed method allows both the collisional broadening and the quantum correlations between scattering events to be taken into account. It is an alternative per se to the Fock approximation for the self-energy approach based on Green’s function formalism. However, this approach is free of the mathematical difficulties typical of the Fock approximation. Moreover, the developed method is simpler than the Fock approximation from the computational point of view. Using the approximation of stable one-particle quantum states it is proved that the electron scattering processes determine the dependence of electron energy versus its wave vector.

  20. Extracting nursing practice patterns from structured labor and delivery data sets.

    PubMed

    Hall, Eric S; Thornton, Sidney N

    2007-10-11

    This study was designed to demonstrate the feasibility of a computerized care process model that provides real-time case profiling and outcome forecasting. A methodology was defined for extracting nursing practice patterns from structured point-of-care data collected using the labor and delivery information system at Intermountain Healthcare. Data collected during January 2006 were retrieved from Intermountain Healthcare's enterprise data warehouse for use in the study. The knowledge discovery in databases process provided a framework for data analysis including data selection, preprocessing, data-mining, and evaluation. Development of an interactive data-mining tool and construction of a data model for stratification of patient records into profiles supported the goals of the study. Five benefits of the practice pattern extraction capability, which extend to other clinical domains, are listed with supporting examples.

  1. Resource allocation planning with international components

    NASA Technical Reports Server (NTRS)

    Burke, Gene; Durham, Ralph; Leppla, Frank; Porter, David

    1993-01-01

    Dumas, Briggs, Reid and Smith (1989) describe the need for identifying mutually acceptable methodologies for developing standard agreements for the exchange of tracking time or facility use among international components. One possible starting point is the current process used at the Jet Propulsion Laboratory (JPL) in planning the use of tracking resources. While there is a significant promise of better resource utilization by international cooperative agreements, there is a serious challenge to provide convenient user participation given the separate project and network locations. Coordination among users and facility providers will require a more decentralized communication process and a wider variety of automated planning tools to help users find potential exchanges. This paper provides a framework in which international cooperation in the utilization of ground based space communication systems can be facilitated.

  2. Implementation Of Quality Management System For Irradiation Processing Services

    NASA Astrophysics Data System (ADS)

    Lungu, Ion-Bogdan; Manea, Maria-Mihaela

    2015-07-01

    In today's market, due to increasing competitiveness, quality management has established itself as an indispensable tool and a reference point for every business. It is ultimately focused on customer satisfaction, which is a stringent factor for every business. Implementing and maintaining a QMS is a rather difficult, time-consuming and expensive process which must be done with respect to many factors. The aim of this paper is to present a case study of implementing an ISO 9001 QMS in a gamma irradiation treatment service provider. The research goals are the identification of key benefits, reasons, advantages, disadvantages, drawbacks, etc. of a successful QMS implementation and use. Finally, the expected results focus on creating a general framework for implementing an efficient QMS plan that can be easily adapted to other kinds of services and markets.

  3. Simultaneous multi-component seismic denoising and reconstruction via K-SVD

    NASA Astrophysics Data System (ADS)

    Hou, Sian; Zhang, Feng; Li, Xiangyang; Zhao, Qiang; Dai, Hengchang

    2018-06-01

    Data denoising and reconstruction play an increasingly significant role in seismic prospecting for their value in enhancing effective signals, dealing with surface obstacles and reducing acquisition costs. In this paper, we propose a novel method to denoise and reconstruct multi-component seismic data simultaneously. The method lies within the framework of machine learning, and its key points are the definition of a suitable weight function and of a modified inner product operator. The purpose of these two constructs is to perform learning on data with missing samples when the random noise deviation is unknown, and to build a mathematical relationship for each component so that all the information in the multi-component data is incorporated. Two examples, using synthetic and real multi-component data, demonstrate that the new method is a feasible alternative for multi-component seismic data processing.
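
    The abstract does not give the exact form of the weight function or inner product; the sketch below only illustrates the general idea of masking missing samples and coupling components through a joint, weighted inner product. All names and weights are chosen purely for illustration.

```python
import numpy as np

# Hypothetical illustration of a masked, multi-component inner product.
# d1, d2: patches from two seismic components, stacked as (component, samples).
# mask: 1 where a sample was acquired, 0 where it is missing.
def weighted_inner_product(d1, d2, mask, component_weights):
    """Inner product that ignores missing samples and couples components."""
    w = np.asarray(component_weights)[:, None]   # per-component weights
    return float(np.sum(w * mask * d1 * d2))     # masked, weighted sum

# Toy usage: two 2-component patches with missing samples in component 0.
rng = np.random.default_rng(0)
a = rng.standard_normal((2, 64))
b = rng.standard_normal((2, 64))
mask = np.ones((2, 64)); mask[0, 20:30] = 0.0    # simulate a data gap
print(weighted_inner_product(a, b, mask, component_weights=[1.0, 1.0]))
```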

  4. Interactive Visualization of Near Real-Time and Production Global Precipitation Mission Data Online Using CesiumJS

    NASA Astrophysics Data System (ADS)

    Lammers, M.

    2016-12-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high impact events observed by GPM constellation satellites.
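
    The on-the-fly conversion of HDF5 values into JSON described for STORM VG can be illustrated with a small script. The file name, dataset path, and field names below are hypothetical and only show the general pattern; the actual GPM product layout is not specified in the abstract.

```python
import json
import numpy as np
import h5py  # reads HDF5 granules

def hdf5_to_json(path, dataset="/S1/surfacePrecipRate", max_points=1000):
    """Read one variable from an HDF5 granule and emit a compact JSON payload."""
    with h5py.File(path, "r") as f:
        values = np.asarray(f[dataset])
    flat = values.ravel()[:max_points]
    payload = {
        "dataset": dataset,
        "count": int(flat.size),
        "values": [None if np.isnan(v) else float(v) for v in flat],
    }
    return json.dumps(payload)

# Usage (hypothetical file name):
# print(hdf5_to_json("2A.GPM.DPR.granule.HDF5"))
```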

  5. Interactive Visualization of Near Real Time and Production Global Precipitation Measurement (GPM) Mission Data Online Using CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matthew

    2016-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high impact events observed by GPM constellation satellites.

  6. Enhanced intelligence through optimized TCPED concepts for airborne ISR

    NASA Astrophysics Data System (ADS)

    Spitzer, M.; Kappes, E.; Böker, D.

    2012-06-01

    Current multinational operations show an increased demand for high-quality actionable intelligence for different operational levels and users. In order to achieve sufficient availability, quality and reliability of information, various ISR assets are orchestrated within operational theatres. Airborne Intelligence, Surveillance and Reconnaissance (ISR) assets in particular provide - due to their endurance, non-intrusiveness, robustness, wide spectrum of sensors and flexibility to mission changes - significant intelligence coverage of areas of interest. An efficient and balanced utilization of airborne ISR assets calls for advanced concepts for the entire ISR process framework including the Tasking, Collection, Processing, Exploitation and Dissemination (TCPED). Beyond this, the employment of current visualization concepts, shared information bases and information customer profiles, as well as an adequate combination of ISR sensors with different information age and dynamic (online) retasking process elements, enables the optimization of interlinked TCPED processes towards higher process robustness, shorter process duration, more flexibility between ISR missions and, finally, adequate "entry points" for information requirements by operational users and commands. In addition, relevant trade-offs of distributed and dynamic TCPED processes are examined and future trends are depicted.

  7. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-07

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  8. Automatic short axis orientation of the left ventricle in 3D ultrasound recordings

    NASA Astrophysics Data System (ADS)

    Pedrosa, João.; Heyde, Brecht; Heeren, Laurens; Engvall, Jan; Zamorano, Jose; Papachristidis, Alexandros; Edvardsen, Thor; Claus, Piet; D'hooge, Jan

    2016-04-01

    The recent advent of three-dimensional echocardiography has led to an increased interest from the scientific community in left ventricle segmentation frameworks for cardiac volume and function assessment. An automatic orientation of the segmented left ventricular mesh is an important step to obtain a point-to-point correspondence between the mesh and the cardiac anatomy. Furthermore, this would allow for an automatic division of the left ventricle into the standard 17 segments and, thus, fully automatic per-segment analysis, e.g. regional strain assessment. In this work, a method for fully automatic short axis orientation of the segmented left ventricle is presented. The proposed framework aims at detecting the inferior right ventricular insertion point. 211 three-dimensional echocardiographic images were used to validate this framework by comparison to manual annotation of the inferior right ventricular insertion point. A mean unsigned error of 8.05° ± 18.50° was found, whereas the mean signed error was 1.09°. Large deviations between the manual and automatic annotations (> 30°) only occurred in 3.79% of cases. The average computation time was 666 ms in a non-optimized MATLAB environment, which potentiates real-time application. In conclusion, a successful automatic real-time method for orientation of the segmented left ventricle is proposed.

  9. Toward an Analytic Framework of Interdisciplinary Reasoning and Communication (IRC) Processes in Science

    NASA Astrophysics Data System (ADS)

    Shen, Ji; Sung, Shannon; Zhang, Dongmei

    2015-11-01

    Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse contents, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes: integration, translation, transfer, and transformation, and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret the interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.

  10. A Health Economics Approach to US Value Assessment Frameworks-Summary and Recommendations of the ISPOR Special Task Force Report [7].

    PubMed

    Garrison, Louis P; Neumann, Peter J; Willke, Richard J; Basu, Anirban; Danzon, Patricia M; Doshi, Jalpa A; Drummond, Michael F; Lakdawalla, Darius N; Pauly, Mark V; Phelps, Charles E; Ramsey, Scott D; Towse, Adrian; Weinstein, Milton C

    2018-02-01

    This summary section first lists key points from each of the six sections of the report, followed by six key recommendations. The Special Task Force chose to take a health economics approach to the question of whether a health plan should cover and reimburse a specific technology, beginning with the view that the conventional cost-per-quality-adjusted life-year metric has both strengths as a starting point and recognized limitations. This report calls for the development of a more comprehensive economic evaluation that could include novel elements of value (e.g., insurance value and equity) as part of either an "augmented" cost-effectiveness analysis or a multicriteria decision analysis. Given an aggregation of elements to a measure of value, consistent use of a cost-effectiveness threshold can help ensure the maximization of health gain and well-being for a given budget. These decisions can benefit from the use of deliberative processes. The six recommendations are to: 1) be explicit about decision context and perspective in value assessment frameworks; 2) base health plan coverage and reimbursement decisions on an evaluation of the incremental costs and benefits of health care technologies as is provided by cost-effectiveness analysis; 3) develop value thresholds to serve as one important input to help guide coverage and reimbursement decisions; 4) manage budget constraints and affordability on the basis of cost-effectiveness principles; 5) test and consider using structured deliberative processes for health plan coverage and reimbursement decisions; and 6) explore and test novel elements of benefit to improve value measures that reflect the perspectives of both plan members and patients. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  11. A hybrid framework for quantifying the influence of data in hydrological model calibration

    NASA Astrophysics Data System (ADS)

    Wright, David P.; Thyer, Mark; Westra, Seth; McInerney, David

    2018-06-01

    Influence diagnostics aim to identify a small number of influential data points that have a disproportionate impact on the model parameters and/or predictions. The key issues with current influence diagnostic techniques are that the regression-theory approaches do not provide hydrologically relevant influence metrics, while the case-deletion approaches are computationally expensive to calculate. The main objective of this study is to introduce a new two-stage hybrid framework that overcomes these challenges, by delivering hydrologically relevant influence metrics in a computationally efficient manner. Stage one uses computationally efficient regression-theory influence diagnostics to identify the most influential points based on Cook's distance. Stage two then uses case-deletion influence diagnostics to quantify the influence of points using hydrologically relevant metrics. To illustrate the application of the hybrid framework, we conducted three experiments on 11 hydro-climatologically diverse Australian catchments using the GR4J hydrological model. The first experiment investigated how many data points from stage one need to be retained in order to reliably identify those points that have the highest influence on hydrologically relevant metrics. We found that a choice of 30-50 is suitable for hydrological applications similar to those explored in this study (30 points identified the most influential data 98% of the time and reduced the required recalibrations by 99% for a 10 year calibration period). The second experiment found little evidence of a change in the magnitude of influence with increasing calibration period length from 1, 2, 5 to 10 years. Even for 10 years the impact of influential points can still be high (>30% influence on maximum predicted flows). The third experiment compared the standard least squares (SLS) objective function with the weighted least squares (WLS) objective function on a 10 year calibration period. In two out of three flow metrics there was evidence that SLS, with the assumption of homoscedastic residual error, identified data points with higher influence (largest changes of 40%, 10%, and 44% for the maximum, mean, and low flows, respectively) than WLS, with the assumption of heteroscedastic residual errors (largest changes of 26%, 6%, and 6% for the maximum, mean, and low flows, respectively). The hybrid framework complements existing model diagnostic tools and can be applied to a wide range of hydrological modelling scenarios.
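
    Stage one of the hybrid framework relies on Cook's distance from regression-theory influence diagnostics. A minimal sketch of that screening step is given below, assuming an ordinary least-squares surrogate of the calibration problem; the design matrix, data, and cut-off of 30 points are illustrative, not those used in the study.

```python
import numpy as np

def cooks_distance(X, y):
    """Cook's distance for each observation of an OLS fit y ~ X."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix
    h = np.diag(H)
    s2 = resid @ resid / (n - p)                  # residual variance
    return (resid**2 / (p * s2)) * h / (1.0 - h)**2

# Screening step: keep the ~30 highest-ranked points for case-deletion analysis.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.standard_normal(200)])
y = X @ np.array([1.0, 2.0]) + rng.standard_normal(200)
top30 = np.argsort(cooks_distance(X, y))[::-1][:30]
```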

  12. Contactless sub-millimeter displacement measurements

    NASA Astrophysics Data System (ADS)

    Sliepen, Guus; Jägers, Aswin P. L.; Bettonvil, Felix C. M.; Hammerschlag, Robert H.

    2008-07-01

    Weather effects on foldable domes, as used at the DOT and GREGOR, are investigated, in particular the correlation between the wind field and the stresses caused to both metal framework and tent clothing. Camera systems measure, without contact, the displacement of several dome points. The stresses follow from the measured deformation pattern. The cameras placed near the dome floor do not disturb telescope operations. In the set-ups of DOT and GREGOR, these cameras are up to 8 meters away from the measured points and must be able to detect displacements of less than 0.1 mm. The cameras have a FireWire (IEEE1394) interface to eliminate the need for frame grabbers. Each camera captures 15 images of 640 × 480 pixels per second. All data is processed on-site in real-time. In order to get the best estimate for the displacement within the constraints of available processing power, all image processing is done in Fourier-space, with all convolution operations being pre-computed once. A sub-pixel estimate of the peak of the correlation function is made. This makes it possible to process the images of four cameras using only one commodity PC with a dual-core processor, and achieve an effective sensitivity of up to 0.01 mm. The deformation measurements are well correlated to the simultaneous wind measurements. The results are of high interest for upscaling the dome design (ELTs and solar telescopes).
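
    The pipeline described above, correlation computed entirely in Fourier space followed by a sub-pixel estimate of the correlation peak, can be sketched as follows. The use of a phase-correlation formulation and a parabolic sub-pixel refinement are assumptions for illustration, since the exact estimator is not given in the abstract.

```python
import numpy as np

def subpixel_shift(ref, img):
    """Estimate the (dy, dx) shift of img relative to ref via phase correlation."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)

    def parabolic(c, i, axis_len):
        # 1-D parabolic interpolation around the integer peak (sub-pixel part).
        m, p = (i - 1) % axis_len, (i + 1) % axis_len
        denom = c[m] - 2.0 * c[i] + c[p]
        return 0.0 if denom == 0 else 0.5 * (c[m] - c[p]) / denom

    dy = peak[0] + parabolic(corr[:, peak[1]], peak[0], corr.shape[0])
    dx = peak[1] + parabolic(corr[peak[0], :], peak[1], corr.shape[1])
    # Wrap shifts into a signed range around zero.
    dy = dy - corr.shape[0] if dy > corr.shape[0] / 2 else dy
    dx = dx - corr.shape[1] if dx > corr.shape[1] / 2 else dx
    return dy, dx
```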

  13. Migrant Farm Child Abuse and Neglect within an Ecosystem Framework.

    ERIC Educational Resources Information Center

    Tan, Gerdean G.; And Others

    1991-01-01

    Discusses environmental stress within ecosystem framework as predictor of migrant farm child abuse and neglect. Reviews relationship among individual, family, community, and cultural elements as primary etiologic factor in maltreatment of migrant children. Points to need for strengthening family and neighborhood systems through changes in…

  14. Use of the PHM Framework to Create Safe-Sex Ads Targeted to Mature Women 50 and Older.

    PubMed

    Morton, Cynthia R; Kim, Hyojin

    2015-01-01

    This research applies Witte's persuasive health message (PHM) framework to the development of creative concepts that promote sexual health strategies to senior-aged women. The PHM framework proposes an integrated approach to improving message effectiveness and maximizing persuasion in health communication campaigns. A focus group method was used to explore two research questions focused on message effectiveness and persuasion. The findings suggest the PHM framework can be a useful starting point for ensuring that health communicators identify the criteria most relevant to successful ad promotions.

  15. The positioning of palliative care in acute care: A multiperspective qualitative study in the context of metastatic melanoma.

    PubMed

    Fox, Jennifer; Windsor, Carol; Connell, Shirley; Yates, Patsy

    2016-06-01

    The positioning and meaning of palliative care within the healthcare system lack clarity, which adds a level of complexity to the process of transition to palliative care. This study explores the transition to the palliative care process in the acute care context of metastatic melanoma. A theoretical framework drawing on interpretive and critical traditions informs this research. The pragmatism of symbolic interactionism and the critical theory of Habermas brought a broad orientation to the research. Integration of the theoretical framework and grounded-theory methods facilitated data generation and analysis of 29 interviews with patients, family carers, and healthcare professionals. The key analytical findings depict a scope of palliative care that was uncertain for users of the system and for those working within the system. Becoming "palliative" is not a defined event; nor is there unanimity around referral to a palliative care service. As such, ambiguity and tension contribute to the difficulties involved in negotiating the transition to palliative care. Our findings point to uncertainty around the scopes of practice in the transition to palliative care. The challenge in the transition process lies in achieving greater coherency of care within an increasingly specialized healthcare system. The findings may not only inform those within a metastatic melanoma context but may contribute more broadly to palliative practices within the acute care setting.

  16. Creative Practices Embodied, Embedded, and Enacted in Architectural Settings: Toward an Ecological Model of Creativity.

    PubMed

    Malinin, Laura H

    2015-01-01

    Memoirs by eminently creative people often describe architectural spaces and qualities they believe instrumental for their creativity. However, places designed to encourage creativity have had mixed results, with some found to decrease creative productivity for users. This may be due, in part, to a lack of suitable empirical theory or model to guide design strategies. Relationships between creative cognition and features of the physical environment remain largely uninvestigated in the scientific literature, despite general agreement among researchers that human cognition is physically and socially situated. This paper investigates what role architectural settings may play in creative processes by examining documented first-person and biographical accounts of creativity with respect to three central theories of situated cognition. First, the embodied thesis argues that cognition encompasses both the mind and the body. Second, the embedded thesis maintains that people exploit features of the physical and social environment to increase their cognitive capabilities. Third, the enaction thesis describes cognition as dependent upon a person's interactions with the world. Common themes inform three propositions, illustrated in a new theoretical framework describing relationships between people and their architectural settings with respect to different cognitive processes of creativity. The framework is intended as a starting point toward an ecological model of creativity, which may be used to guide future creative process research and architectural design strategies to support user creative productivity.

  17. Creative Practices Embodied, Embedded, and Enacted in Architectural Settings: Toward an Ecological Model of Creativity

    PubMed Central

    Malinin, Laura H.

    2016-01-01

    Memoirs by eminently creative people often describe architectural spaces and qualities they believe instrumental for their creativity. However, places designed to encourage creativity have had mixed results, with some found to decrease creative productivity for users. This may be due, in part, to a lack of suitable empirical theory or model to guide design strategies. Relationships between creative cognition and features of the physical environment remain largely uninvestigated in the scientific literature, despite general agreement among researchers that human cognition is physically and socially situated. This paper investigates what role architectural settings may play in creative processes by examining documented first-person and biographical accounts of creativity with respect to three central theories of situated cognition. First, the embodied thesis argues that cognition encompasses both the mind and the body. Second, the embedded thesis maintains that people exploit features of the physical and social environment to increase their cognitive capabilities. Third, the enaction thesis describes cognition as dependent upon a person’s interactions with the world. Common themes inform three propositions, illustrated in a new theoretical framework describing relationships between people and their architectural settings with respect to different cognitive processes of creativity. The framework is intended as a starting point toward an ecological model of creativity, which may be used to guide future creative process research and architectural design strategies to support user creative productivity. PMID:26779087

  18. Interactive Design and Visualization of Branched Covering Spaces.

    PubMed

    Roy, Lawrence; Kumar, Prashant; Golbabaei, Sanaz; Zhang, Yue; Zhang, Eugene

    2018-01-01

    Branched covering spaces are a mathematical concept which originates from complex analysis and topology and has applications in tensor field topology and geometry remeshing. Given a manifold surface and an N-way rotational symmetry field, a branched covering space is a manifold surface that has an N-to-1 map to the original surface except at the ramification points, which correspond to the singularities in the rotational symmetry field. Understanding the notion and mathematical properties of branched covering spaces is important to researchers in tensor field visualization and geometry processing, and their application areas. In this paper, we provide a framework to interactively design and visualize the branched covering space (BCS) of an input mesh surface and a rotational symmetry field defined on it. In our framework, the user can visualize not only the BCSs but also their construction process. In addition, our system allows the user to design the geometric realization of the BCS using mesh deformation techniques as well as connecting tubes. This enables the user to verify important facts about BCSs such as that they are manifold surfaces around singularities, as well as the Riemann-Hurwitz formula which relates the Euler characteristic of the BCS to that of the original mesh. Our system is evaluated by student researchers in scientific visualization and geometry processing as well as faculty members in mathematics at our university who teach topology. We include their evaluations and feedback in the paper.
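
    The Riemann-Hurwitz relation mentioned in the abstract, for an N-sheeted branched cover, can be stated in its standard form as follows (the abstract itself does not spell it out):

```latex
% Riemann-Hurwitz formula for an N-sheeted branched covering space \tilde{S}
% of a surface S, with ramification points p and local ramification indices e_p:
\chi(\tilde{S}) \;=\; N\,\chi(S) \;-\; \sum_{p}\bigl(e_p - 1\bigr)
```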

  19. Analysis of data characterizing tide and current fluxes in coastal basins

    NASA Astrophysics Data System (ADS)

    Armenio, Elvira; De Serio, Francesca; Mossa, Michele

    2017-07-01

    Many coastal monitoring programmes have been carried out to investigate in situ hydrodynamic patterns and correlated physical processes, such as sediment transport or spreading of pollutants. The key point is the necessity to transform this growing amount of data provided by marine sensors into information for users. The present paper aims to outline that it is possible to recognize the recurring and typical hydrodynamic processes of a coastal basin, by conveniently processing some selected marine field data. The illustrated framework is made up of two steps. First, a sequence of analyses using classic, computationally inexpensive methods was executed in both the time and frequency domains on detailed field measurements of waves, tides, and currents. After this, some indicators of the hydrodynamic state of the basin were identified and evaluated. Namely, the assessment of the net flow through a connecting channel, the time delay of current peaks between upper and bottom layers, the ratio of peak ebb to peak flood currents, and the tidal asymmetry factor exemplify results on the vertical structure of the flow, on the correlation between currents and tide, and on flood/ebb dominance. To demonstrate how this simple and generic framework could be applied, a case study is presented, referring to Mar Piccolo, a shallow water basin located in the inner part of the Ionian Sea (southern Italy).
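
    Two of the indicators named above, the peak ebb to peak flood ratio and the layer-to-layer time delay of current peaks, can be computed directly from current time series. The sketch below assumes regularly sampled along-channel velocities (positive for flood) and uses a plain cross-correlation for the delay, which is one reasonable implementation rather than the one used in the paper.

```python
import numpy as np

def ebb_flood_ratio(u):
    """Ratio of peak ebb to peak flood current (u > 0 taken as flood)."""
    return np.max(-u[u < 0]) / np.max(u[u > 0])

def peak_delay(u_top, u_bottom, dt):
    """Time lag (s) of upper-layer currents relative to the bottom layer."""
    a = u_top - u_top.mean()
    b = u_bottom - u_bottom.mean()
    xcorr = np.correlate(a, b, mode="full")
    lag = np.argmax(xcorr) - (len(a) - 1)
    return lag * dt
```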

  20. Identifying determinants of medication adherence following myocardial infarction using the Theoretical Domains Framework and the Health Action Process Approach.

    PubMed

    Presseau, Justin; Schwalm, J D; Grimshaw, Jeremy M; Witteman, Holly O; Natarajan, Madhu K; Linklater, Stefanie; Sullivan, Katrina; Ivers, Noah M

    2017-10-01

    Despite evidence-based recommendations, adherence with secondary prevention medications post-myocardial infarction (MI) remains low. Taking medication requires behaviour change, and using behavioural theories to identify what factors determine adherence could help to develop novel adherence interventions. Compare the utility of different behaviour theory-based approaches for identifying modifiable determinants of medication adherence post-MI that could be targeted by interventions. Two studies were conducted with patients 0-2, 3-12, 13-24 or 25-36 weeks post-MI. Study 1: 24 patients were interviewed about barriers and facilitators to medication adherence. Interviews were conducted and coded using the Theoretical Domains Framework. Study 2: 201 patients answered a telephone questionnaire assessing Health Action Process Approach constructs to predict intention and medication adherence (MMAS-8). Study 1: domains identified: Beliefs about Consequences, Memory/Attention/Decision Processes, Behavioural Regulation, Social Influences and Social Identity. Study 2: 64, 59, 42 and 58% reported high adherence at 0-2, 3-12, 13-24 and 25-36 weeks. Social Support and Action Planning predicted adherence at all time points, though the relationship between Action Planning and adherence decreased over time. Using two behaviour theory-based approaches provided complementary findings and identified modifiable factors that could be targeted to help translate intention into action to improve medication adherence post-MI.

  1. E-learning process maturity level: a conceptual framework

    NASA Astrophysics Data System (ADS)

    Rahmah, A.; Santoso, H. B.; Hasibuan, Z. A.

    2018-03-01

    ICT advancement is a certainty whose impact influences many domains, including learning in both formal and informal situations. It leads to a new mindset: we should not only utilize the available ICT to support the learning process, but also improve it gradually, taking many factors into account. This phenomenon is called e-learning process evolution. Accordingly, this study explores the maturity level concept to provide a direction for gradual improvement and progression monitoring of the individual e-learning process. An extensive literature review, observation, and construct formation were conducted to develop a conceptual framework for the e-learning process maturity level. The conceptual framework consists of the learner, the e-learning process, continuous improvement, the evolution of the e-learning process, technology, and learning objectives. The evolution of the e-learning process is depicted as the current versus expected conditions of the e-learning process maturity level. The study concludes that the e-learning process maturity level conceptual framework may guide the evolution roadmap for the e-learning process, accelerate the evolution, and decrease the negative impact of ICT. The conceptual framework will be verified and tested in a future study.

  2. The Intelligence-Religiosity Nexus: A Representative Study of White Adolescent Americans

    ERIC Educational Resources Information Center

    Nyborg, Helmuth

    2009-01-01

    The present study examined whether IQ relates systematically to denomination and income within the framework of the "g" nexus, using representative data from the National Longitudinal Study of Youth (NLSY97). Atheists score 1.95 IQ points higher than Agnostics, 3.82 points higher than Liberal persuasions, and 5.89 IQ points higher than…

  3. The Point of the Point: Washington's Student Achievement Initiative through the Looking Glass of a Community College

    ERIC Educational Resources Information Center

    Li, Amy Y.

    2017-01-01

    For 8 years Washington State has operated a performance funding policy, the Student Achievement Initiative (SAI). The policy allocates appropriations to the state's 34 community and technical colleges based on points earned through student achievement of college-readiness, retention, and completion milestones. Grounded in a conceptual framework of…

  4. Sample size and classification error for Bayesian change-point models with unlabelled sub-groups and incomplete follow-up.

    PubMed

    White, Simon R; Muniz-Terrera, Graciela; Matthews, Fiona E

    2018-05-01

    Many medical (and ecological) processes involve a change of shape, whereby one trajectory changes into another trajectory at a specific time point. There has been little investigation into the study design needed to investigate these models. We consider the class of fixed effect change-point models with an underlying shape comprising two joined linear segments, also known as broken-stick models. We extend this model to include two sub-groups with different trajectories at the change-point, a change class and a no-change class, and also include a missingness model to account for individuals with incomplete follow-up. Through a simulation study, we consider the relationship of sample size to the estimates of the underlying shape, the existence of a change-point, and the classification error of sub-group labels. We use a Bayesian framework to account for the missing labels, and the analysis of each simulation is performed using standard Markov chain Monte Carlo techniques. Our simulation study is inspired by cognitive decline as measured by the Mini-Mental State Examination, where our extended model is appropriate due to the commonly observed mixture of individuals within studies who do or do not exhibit accelerated decline. We find that even for studies of modest size (n = 500, with 50 individuals observed past the change-point) in the fixed effect setting, a change-point can be detected and reliably estimated across a range of observation errors.
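
    A minimal simulation of the broken-stick shape underlying these models, with a change and a no-change sub-group, can look like the sketch below; all parameter values are assumed for illustration and the study's own priors and MCMC details are not reproduced here.

```python
import numpy as np

def broken_stick(t, intercept, slope1, slope2, change_point):
    """Two joined linear segments: the slope changes at the change point."""
    return intercept + slope1 * t + slope2 * np.maximum(t - change_point, 0.0)

# Simulate a 'change' and a 'no change' individual with observation noise.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 21)                                   # follow-up times
y_change = broken_stick(t, 28.0, -0.1, -1.5, 6.0) + rng.normal(0, 1.0, t.size)
y_stable = broken_stick(t, 28.0, -0.1,  0.0, 6.0) + rng.normal(0, 1.0, t.size)
```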

  5. Robotic Online Path Planning on Point Cloud.

    PubMed

    Liu, Ming

    2016-05-01

    This paper deals with the path-planning problem for mobile wheeled or tracked robots that drive in 2.5-D environments, where the traversable surface is usually considered as a 2-D manifold embedded in a 3-D ambient space. Specifically, we aim at solving the 2.5-D navigation problem using a raw point cloud as input. The proposed method is independent of traditional surface parametrization or reconstruction methods, such as a meshing process, which generally have high computational complexity. Instead, we utilize the output of a 3-D tensor voting framework on the raw point clouds. The computation of tensor voting is accelerated by an optimized implementation on a graphics processing unit. Based on the tensor voting results, a novel local Riemannian metric is defined using the saliency components, which helps the modeling of the latent traversable surface. Using the proposed metric, we show by experiments that geodesics in the 3-D tensor space lead to rational path-planning results. Compared to traditional methods, the results reveal the advantages of the proposed method in terms of smoothing the robot maneuver while considering the minimum travel distance.
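
    The idea of planning over a point cloud with a metric derived from saliency can be sketched as a shortest-path search on a k-nearest-neighbour graph whose edge costs blend Euclidean distance with a traversability penalty. The weighting below is an assumption for illustration, not the Riemannian metric defined in the paper.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def plan_on_points(points, saliency, start, goal, k=8, alpha=5.0):
    """Shortest path over a k-NN graph with saliency-penalised edge costs."""
    n = len(points)
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    rows, cols, costs = [], [], []
    for i in range(n):
        for j in np.argsort(d2[i])[1:k + 1]:          # k nearest neighbours of i
            penalty = alpha * 0.5 * (saliency[i] + saliency[j])
            rows.append(i); cols.append(j)
            costs.append(np.sqrt(d2[i, j]) * (1.0 + penalty))
    graph = csr_matrix((costs, (rows, cols)), shape=(n, n))
    dist, pred = dijkstra(graph, indices=start, return_predecessors=True)
    path, node = [], goal
    while node != start and node >= 0:                # walk predecessors back
        path.append(node); node = pred[node]
    return [start] + path[::-1], dist[goal]
```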

  6. Reflectance from images: a model-based approach for human faces.

    PubMed

    Fuchs, Martin; Blanz, Volker; Lensch, Hendrik; Seidel, Hans-Peter

    2005-01-01

    In this paper, we present an image-based framework that acquires the reflectance properties of a human face. A range scan of the face is not required. Based on a morphable face model, the system estimates the 3D shape and establishes point-to-point correspondence across images taken from different viewpoints and across different individuals' faces. This provides a common parameterization of all reconstructed surfaces that can be used to compare and transfer BRDF data between different faces. Shape estimation from images compensates for deformations of the face during the measurement process, such as facial expressions. In the common parameterization, regions of homogeneous materials on the face surface can be defined a priori. We apply analytical BRDF models to express the reflectance properties of each region and we estimate their parameters in a least-squares fit from the image data. For each of the surface points, the diffuse component of the BRDF is locally refined, which provides high detail. We present results for multiple analytical BRDF models, rendered at novel orientations and lighting conditions.
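
    Fitting an analytical BRDF per material region by least squares can be sketched as below. The Lambertian-plus-Phong model, the variable names, and the use of scipy.optimize.least_squares are illustrative assumptions, since the paper considers multiple analytical models and does not fix an optimizer.

```python
import numpy as np
from scipy.optimize import least_squares

def phong_brdf(params, n, l, v):
    """Lambertian + Phong lobe; n, l, v are unit normal/light/view vectors (m, 3)."""
    kd, ks, shininess = params
    r = 2.0 * np.sum(n * l, axis=1, keepdims=True) * n - l      # reflected light
    diffuse = kd * np.clip(np.sum(n * l, axis=1), 0.0, None)
    specular = ks * np.clip(np.sum(r * v, axis=1), 0.0, None) ** shininess
    return diffuse + specular

def fit_region(observed, n, l, v):
    """Least-squares fit of (kd, ks, shininess) for one material region."""
    residual = lambda p: phong_brdf(p, n, l, v) - observed
    return least_squares(residual, x0=[0.5, 0.2, 20.0],
                         bounds=([0, 0, 1], [1, 1, 500])).x
```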

  7. A voting-based statistical cylinder detection framework applied to fallen tree mapping in terrestrial laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2017-07-01

    This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.
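
    The pairwise voting idea can be sketched as follows: on a cylinder, the axis direction is orthogonal to the surface normals at any two nearby points, so their cross product casts a vote, and votes are aggregated with a kernel density estimator whose maxima indicate candidate axes. The normal estimation, the bandwidth choice, and the use of scipy.stats.gaussian_kde are illustrative assumptions, not the exact estimator of the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

def axis_votes(points, normals, pairs):
    """Each nearby point pair votes for a cylinder axis direction."""
    votes = []
    for i, j in pairs:
        v = np.cross(normals[i], normals[j])          # orthogonal to both normals
        norm = np.linalg.norm(v)
        if norm > 1e-6:
            v = v / norm
            votes.append(v if v[2] >= 0 else -v)      # resolve antipodal ambiguity
    return np.asarray(votes)

def strongest_axis(votes):
    """KDE over vote directions (data-driven bandwidth); the densest vote wins."""
    kde = gaussian_kde(votes.T)                        # Scott's rule by default
    return votes[np.argmax(kde(votes.T))]
```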

  8. SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality.

    PubMed

    Chen, Long; Tang, Wen; John, Nigel W; Wan, Tao Ruan; Zhang, Jian Jun

    2018-05-01

    While Minimally Invasive Surgery (MIS) offers considerable benefits to patients, it also imposes big challenges on a surgeon's performance due to well-known issues and restrictions associated with the field of view (FOV), hand-eye misalignment and disorientation, as well as the lack of stereoscopic depth perception in monocular endoscopy. Augmented Reality (AR) technology can help to overcome these limitations by augmenting the real scene with annotations, labels, tumour measurements or even a 3D reconstruction of anatomy structures at the target surgical locations. However, previous research attempts at using AR technology in monocular MIS surgical scenes have been mainly focused on the information overlay without addressing correct spatial calibrations, which could lead to incorrect localization of annotations and labels, and inaccurate depth cues and tumour measurements. In this paper, we present a novel intra-operative dense surface reconstruction framework that is capable of providing geometry information from only monocular MIS videos for geometry-aware AR applications such as site measurements and depth cues. We address a number of compelling issues in augmenting a scene for a monocular MIS environment, such as drifting and inaccurate planar mapping. A state-of-the-art Simultaneous Localization And Mapping (SLAM) algorithm used in robotics has been extended to deal with monocular MIS surgical scenes for reliable endoscopic camera tracking and salient point mapping. A robust global 3D surface reconstruction framework has been developed for building a dense surface using only unorganized sparse point clouds extracted from the SLAM. The 3D surface reconstruction framework employs the Moving Least Squares (MLS) smoothing algorithm and the Poisson surface reconstruction framework for real-time processing of the point cloud data set. Finally, the 3D geometric information of the surgical scene allows better understanding and accurate placement of AR augmentations based on a robust 3D calibration. We demonstrate the clinical relevance of our proposed system through two examples: (a) measurement of the surface; (b) depth cues in monocular endoscopy. The performance and accuracy evaluations of the proposed framework consist of two steps. First, we have created a computer-generated endoscopy simulation video to quantify the accuracy of the camera tracking by comparing the results of the video camera tracking with the recorded ground-truth camera trajectories. The accuracy of the surface reconstruction is assessed by evaluating the Root Mean Square Distance (RMSD) of surface vertices of the reconstructed mesh with that of the ground truth 3D models. An error of 1.24 mm for the camera trajectories has been obtained and the RMSD for surface reconstruction is 2.54 mm, which compare favourably with previous approaches. Second, in vivo laparoscopic videos are used to examine the quality of accurate AR-based annotation and measurement, and the creation of depth cues. These results show the potential promise of our geometry-aware AR technology to be used in MIS surgical scenes. The results show that the new framework is robust and accurate in dealing with challenging situations such as the rapid endoscopy camera movements in monocular MIS scenes. Both camera tracking and surface reconstruction based on a sparse point cloud are effective and operated in real-time.
This demonstrates the potential of our algorithm for accurate AR localization and depth augmentation with geometric cues and correct surface measurements in MIS with monocular endoscopes. Copyright © 2018 Elsevier B.V. All rights reserved.
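
    The surface-accuracy metric used in the evaluation, the Root Mean Square Distance between reconstructed vertices and the ground-truth model, can be sketched as a nearest-neighbour distance computation; the use of scipy's cKDTree is an implementation assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def rmsd_to_ground_truth(reconstructed_vertices, ground_truth_vertices):
    """RMS of nearest-neighbour distances from the reconstruction to ground truth."""
    tree = cKDTree(ground_truth_vertices)
    distances, _ = tree.query(reconstructed_vertices)   # closest GT vertex per point
    return float(np.sqrt(np.mean(distances ** 2)))
```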

  9. Mean-Lagrangian formalism and covariance of fluid turbulence.

    PubMed

    Ariki, Taketo

    2017-05-01

    A mean-field-based Lagrangian framework is developed for fluid turbulence theory, which enables physically objective discussions, especially of the history effect. The mean flow serves as a purely geometrical object of Lie group theory, providing useful operations to measure the objective rate and history integration of a general tensor field. The proposed framework is applied, on the one hand, to a one-point closure model, yielding an objective expression of the turbulence viscoelastic effect. Application to two-point closure, on the other hand, is also discussed, where a natural extension of the known Lagrangian correlation is discovered on the basis of an extended covariance group.

  10. Modeling Synergistic Drug Inhibition of Mycobacterium tuberculosis Growth in Murine Macrophages

    DTIC Science & Technology

    2011-01-01

    important application of metabolic network modeling is the ability to quantitatively model metabolic enzyme inhibition and predict bacterial growth... describe the extensions of this framework to model drug-induced growth inhibition of M. tuberculosis in macrophages... starting point, we used the previously developed iNJ661v model to represent the metabolic... [Fig. 1 caption: Mathematical framework: a set of coupled models used to...]

  11. Fundamental Studies of Crystal Growth of Microporous Materials

    NASA Technical Reports Server (NTRS)

    Singh, Ramsharan; Doolittle, John, Jr.; Payra, Pramatha; Dutta, Prabir K.; George, Michael A.; Ramachandran, Narayanan; Schoeman, Brian J.

    2003-01-01

    Microporous materials are framework structures with well-defined porosity, often of molecular dimensions. Zeolites contain aluminum and silicon atoms in their framework and are the most extensively studied amongst all microporous materials. Framework structures with P, Ga, Fe, Co, Zn, B, Ti and a host of other elements have also been made. The typical synthesis of microporous materials involves mixing the framework elements (or compounds thereof) in a basic solution, followed by aging in some cases and then heating at elevated temperatures. This process is termed hydrothermal synthesis, and involves complex chemical and physical changes. Because of a limited understanding of this process, most synthesis advancements happen by a trial-and-error approach. There is considerable interest in understanding the synthesis process at a molecular level with the expectation that eventually new framework structures will be built by design. The basic issues in the microporous materials crystallization process include: (a) Nature of the molecular units responsible for the crystal nuclei formation; (b) Nature of the nuclei and nucleation process; (c) Growth process of the nuclei into crystal; (d) Morphological control and size of the resulting crystal; (e) Surface structure of the resulting crystals; and (f) Transformation of frameworks into other frameworks or condensed structures.

  12. A Framework for Validating Traffic Simulation Models at the Vehicle Trajectory Level

    DOT National Transportation Integrated Search

    2017-03-01

    Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...

  13. Developmental Implications of the Levels of Processing Memory Framework.

    ERIC Educational Resources Information Center

    Naus, Mary J.

    The levels of processing framework for understanding memory development has generated little empirical or theoretical work that furthers an understanding of the developmental memory system. Although empirical studies by those testing the levels of processing framework have demonstrated that mnemonic strategies employed by children are the critical…

  14. A Framework for Ethical Deliberation in Special Education.

    ERIC Educational Resources Information Center

    Howe, Kenneth R.; Miramontes, Ofelia B.

    1991-01-01

    The case of a school district refusing to supply an interpreter for an above-average student with a hearing impairment is used as a point of departure for this discussion of a framework for ethical deliberation and the special role-related obligations that help define the ethics of special education. (PB)

  15. Time, Space, and Mass at the Operational Level of War: The Dynamics of the Culminating Point,

    DTIC Science & Technology

    1988-04-28

    theoretical framework for operational culmination and then examining the theory as reflected in recent history. This paper focuses on the concept of...the paper first examines key definitions and provides a theoretical framework for understanding culmination. Next, it considers the application of the

  16. Disability and the Moral Point of View.

    ERIC Educational Resources Information Center

    Hyland, Terry

    1987-01-01

    Discussions of disability should be within a clearly-defined moral framework if the disabled person's rights are to be translated into society's duty to the disabled. An ethical system based on modern versions of utilitarianism is suggested as a moral framework, supplemented by prescriptions based on social justice and respect. (Author/CB)

  17. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework

    EPA Science Inventory

    Best management practices (BMPs) are perceived as being effective in reducing nutrient loads transported from non-point sources (NPS) to receiving water bodies. The objective of this study was to develop a modeling-optimization framework that can be used by watershed management p...

  18. On Teaching a Fractured Macroeconomics: Thoughts

    ERIC Educational Resources Information Center

    Salemi, Michael K.

    1987-01-01

    Discusses Galbraith's (see SO516713) three major points, 1) that the Joint Council's "Framework" should not hide the fact that macroeconomics is messy and political; 2) that the emphasis in the "Framework" is misplaced; and 3) that in certain areas, such as aggregate supply and demand, it is wrong. (JDH)

  19. Researches on Adolescent Thought: A Framework.

    ERIC Educational Resources Information Center

    Vaidya, Narendera

    This document presents research studies/findings and provides a developing point of view on adolescent thought. The first chapter discusses the nature and definitions of thinking. The second and third chapters discuss frameworks for adolescent thought (focusing on the Gestalt school, Geneva school, and accelerated learning) and survey studies on…

  20. 76 FR 37620 - Risk-Based Capital Standards: Advanced Capital Adequacy Framework-Basel II; Establishment of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-28

    ... systems. E. Quantitative Methods for Comparing Capital Frameworks The NPR sought comment on how the... industry while assessing levels of capital. This commenter points out maintaining reliable comparative data over time could make quantitative methods for this purpose difficult. For example, evaluating asset...

  1. Conceptual design of industrial process displays.

    PubMed

    Pedersen, C R; Lind, M

    1999-11-01

    Today, process displays used in industry are often designed on the basis of piping and instrumentation diagrams without any method of ensuring that the needs of the operators are fulfilled. Therefore, a method for a systematic approach to the design of process displays is needed. This paper discusses aspects of process display design taking into account both the designer's and the operator's points of view. Three aspects are emphasized: the operator tasks, the display content and the display form. The distinction between these three aspects is the basis for proposing an outline for a display design method that matches the industrial practice of modular plant design and satisfies the needs of reusability of display design solutions. The main considerations in display design in the industry are to specify the operator's activities in detail, to extract the information the operators need from the plant design specification and documentation, and finally to present this information. The form of the display is selected from existing standardized display elements such as trend curves, mimic diagrams, ecological interfaces, etc. Further knowledge is required to invent new display elements. That is, knowledge about basic visual means of presenting information and how humans perceive and interpret these means and combinations. This knowledge is required in the systematic selection of graphical items for a given display content. The industrial part of the method is first illustrated in the paper by a simple example from a plant with batch processes. Later the method is applied to develop a supervisory display for a condenser system in a nuclear power plant. The differences between the continuous plant domain of power production and the batch processes from the example are analysed and broad categories of display types are proposed. The problems involved in specification and invention of a supervisory display are analysed and conclusions from these problems are made. It is concluded that the design method proposed provides a framework for the progress of the display design and is useful in pin-pointing the actual problems. The method was useful in reducing the number of existing displays that could fulfil the requirements of the supervision task. The method provided at the same time a framework for dealing with the problems involved in inventing new displays based on structured analysis. However the problems in a systematic approach to display invention still need consideration.

  2. Development of a Planetary Web GIS at the ``Photothèque Planétaire'' in Orsay

    NASA Astrophysics Data System (ADS)

    Marmo, C.

    2012-09-01

    The “Photothèque Planétaire d'Orsay” belongs to the Regional Planetary Image Facilities (RPIF) network started by NASA in 1984. The original purpose of the RPIF was mainly to provide easy access to data from US space missions throughout the world. The “Photothèque” itself specializes in planetary data processing and distribution for research and public outreach. Planetary data are heterogeneous, and combining different observations is particularly challenging, especially if they belong to different data-sets. A common description framework is needed, similar to the existing Geographical Information Systems (GIS) that have been developed for manipulating Earth data. In their present state, GIS software and standards cannot directly be applied to other planets because they still lack flexibility in managing coordinate systems. Yet, the GIS framework serves as an excellent starting point for the implementation of a Virtual Observatory for Planetary Sciences, provided it is made more generic and inter-operable. The “Photothèque Planétaire d'Orsay” has produced some planetary GIS examples using historical and public data-sets. Our main project is a Web-based visualization system for planetary data, which features direct point-and-click access to quantitative measurements. Thanks to being compatible with all recent web browsers, our interface can also be used for public outreach and to make data accessible for education and training.

  3. “A child’s nightmare. Mum comes and comforts her child.” Attachment evaluation as a guide in the assessment and treatment in a clinical case study

    PubMed Central

    Salcuni, Silvia; Di Riso, Daniela; Lis, Adriana

    2014-01-01

    There is a gap between proposed theoretical attachment frameworks, measures of attachment in the assessment phase, and their relationship with changes in outcome after psychodynamically oriented psychotherapy. Based on a clinical case study of a young woman with Panic Attack Disorder, this paper examined psychotherapy outcome findings by comparing initial and post-treatment assessments according to the mental functioning described in the S- and M-Axes of the Psychodynamic Diagnostic Manual. Treatment planning and post-treatment changes were described with the main aim of illustrating, from a clinical point of view, why a psychodynamic approach with specific attention to an "attachment theory stance" was considered the treatment of choice for this patient. The Symptom Check List 90 Revised (SCL-90-R) and the Shedler–Westen Assessment Procedure (SWAP–200) were administered to detect the patient's symptomatic perception and the clinician's diagnostic point of view, respectively; the Adult Attachment Interview and the Adult Attachment Projective Picture System (AAP) were also administered to attend to the patient's unconscious internal organization and changes in defense processes. A qualitative description of how the treatment unfolded is included. The findings highlight the important contribution of attachment theory within a 22-month psychodynamic psychotherapy framework, promoting resolution of the patient's symptoms and adjustment.

  4. Promotion of a healthy public living environment: participatory design of public toilets with visually impaired persons.

    PubMed

    Siu, Kin Wai Michael; Wong, M M Y

    2013-07-01

    The principal objective of a healthy living environment is to improve the quality of everyday life. Visually impaired persons (VIPs) encounter many difficulties in everyday life through a series of barriers, particularly in relation to public toilets. This study aimed to explore the concerns of VIPs in accessing public toilets, and identify methods for improvement. Considerations about user participation are also discussed. Adopting a case study approach, VIPs were invited to participate in the research process. In addition to in-depth interviews and field visits, models and a simulated full-scale environment were produced to facilitate the VIPs to voice their opinions. The key findings indicate that the design of public toilets for promoting public health should be considered and tackled from a three-level framework: plain, line and point. Governments, professionals and the public need to consider the quality of public toilets in terms of policy, implementation and management. VIPs have the right to access public toilets. Governments and professionals should respect the particular needs and concerns of VIPs. A three-level framework (plain, line and point) is required to consider the needs of VIPs in accessing public toilets, and user participation is a good way to reveal the actual needs of VIPs. Copyright © 2013 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  5. The mechanical problems on additive manufacturing of viscoelastic solids with integral conditions on a surface increasing in the growth process

    NASA Astrophysics Data System (ADS)

    Parshin, D. A.; Manzhirov, A. V.

    2018-04-01

    Quasistatic mechanical problems of the additive manufacturing of aging viscoelastic solids are investigated. The processes of piecewise-continuous accretion of such solids are considered. The consideration is carried out in the framework of the linear mechanics of growing solids. A theorem is proved on the commutativity of integration over an arbitrary surface that increases during the growth process with the time-wise integral operator of viscoelasticity, whose lower limit depends on the point of the solid. This theorem provides an efficient way to construct, on the basis of the Saint-Venant principle, solutions of nonclassical boundary-value problems describing the mechanical behaviour of additively formed solids with integral satisfaction of boundary conditions on the surfaces that expand due to the influx of additional material to the formed solid. The constructed solutions trace the evolution of the stress-strain state of the solids under consideration during and after the processes of their additive formation. An example of applying the proved theorem is given.
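
    As a purely schematic illustration (the notation below is ours, not the authors'), the integral operator of aging viscoelasticity acts with a lower limit that depends on the moment at which a material point was attached to the growing solid, and the theorem asserts, roughly, that applying such an operator commutes with integration over the growing surface:

    ```latex
    % Schematic only; all symbols are illustrative, not the authors' notation.
    % Aging-viscoelasticity operator with a point-dependent lower limit \tau_0(x):
    (\mathcal{L}f)(x,t) \;=\; f(x,t) \;+\; \int_{\tau_0(x)}^{t} K(t,\tau)\, f(x,\tau)\, \mathrm{d}\tau
    % Commutativity with integration over the growing surface S(t):
    \int_{S(t)} (\mathcal{L}f)(x,t)\, \mathrm{d}S(x)
      \;=\; \Big( \mathcal{L}^{\ast} \!\int_{S(\,\cdot\,)} f(x,\,\cdot\,)\, \mathrm{d}S(x) \Big)(t)
    % where \mathcal{L}^{\ast} denotes the corresponding operator acting on the
    % surface-integrated quantity; the precise statement and the role of the
    % Saint-Venant principle are given in the paper.
    ```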

  6. Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets.

    PubMed

    Datta, Abhirup; Banerjee, Sudipto; Finley, Andrew O; Gelfand, Alan E

    2016-01-01

    Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. This article develops a class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets. We establish that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices. We embed the NNGP as a sparsity-inducing prior within a rich hierarchical modeling framework and outline how computationally efficient Markov chain Monte Carlo (MCMC) algorithms can be executed without storing or decomposing large matrices. The number of floating point operations (flops) per iteration of this algorithm is linear in the number of spatial locations, thereby rendering substantial scalability. We illustrate the computational and inferential benefits of the NNGP over competing methods using simulation studies and also analyze forest biomass from a massive U.S. Forest Inventory dataset at a scale that precludes alternative dimension-reducing methods. Supplementary materials for this article are available online.

  7. Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets

    PubMed Central

    Datta, Abhirup; Banerjee, Sudipto; Finley, Andrew O.; Gelfand, Alan E.

    2018-01-01

    Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. This article develops a class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets. We establish that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices. We embed the NNGP as a sparsity-inducing prior within a rich hierarchical modeling framework and outline how computationally efficient Markov chain Monte Carlo (MCMC) algorithms can be executed without storing or decomposing large matrices. The number of floating point operations (flops) per iteration of this algorithm is linear in the number of spatial locations, thereby rendering substantial scalability. We illustrate the computational and inferential benefits of the NNGP over competing methods using simulation studies and also analyze forest biomass from a massive U.S. Forest Inventory dataset at a scale that precludes alternative dimension-reducing methods. Supplementary materials for this article are available online. PMID:29720777
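
    A minimal numerical sketch of the nearest-neighbor (Vecchia-type) construction behind the NNGP, assuming an exponential covariance; the function names and the choice of m are ours, and the dense matrix A is used only for readability (in practice it is stored sparsely):

    ```python
    import numpy as np

    def exponential_cov(x, y, sigma2=1.0, phi=1.0):
        """Exponential covariance between two coordinate sets (n x d and m x d)."""
        d = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
        return sigma2 * np.exp(-phi * d)

    def nngp_factors(coords, cov_fn, m=10):
        """Sparse Cholesky-type factors of the NNGP approximation.

        The joint density at the ordered locations is replaced by
        w_i = sum_{j in N(i)} A[i, j] * w_j + eta_i,  eta_i ~ N(0, D[i]),
        where N(i) contains at most m previously ordered nearest neighbors.
        The implied precision matrix (I - A)^T diag(1/D) (I - A) is sparse."""
        n = coords.shape[0]
        A = np.zeros((n, n))   # sparse in practice; dense here only for readability
        D = np.zeros(n)
        D[0] = cov_fn(coords[:1], coords[:1])[0, 0]
        for i in range(1, n):
            dists = np.linalg.norm(coords[:i] - coords[i], axis=1)
            nb = np.argsort(dists)[:min(m, i)]               # neighbor set N(i)
            C_nn = cov_fn(coords[nb], coords[nb])            # neighbor covariance
            c_in = cov_fn(coords[nb], coords[i:i + 1])[:, 0]
            w = np.linalg.solve(C_nn, c_in)                  # kriging weights
            A[i, nb] = w
            D[i] = cov_fn(coords[i:i + 1], coords[i:i + 1])[0, 0] - c_in @ w
        return A, D

    # Example: coords = np.random.rand(500, 2); A, D = nngp_factors(coords, exponential_cov)
    ```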

  8. The semantic anatomical network: Evidence from healthy and brain-damaged patient populations.

    PubMed

    Fang, Yuxing; Han, Zaizhu; Zhong, Suyu; Gong, Gaolang; Song, Luping; Liu, Fangsong; Huang, Ruiwang; Du, Xiaoxia; Sun, Rong; Wang, Qiang; He, Yong; Bi, Yanchao

    2015-09-01

    Semantic processing is central to cognition and is supported by widely distributed gray matter (GM) regions and white matter (WM) tracts. The exact manner in which GM regions are anatomically connected to process semantics remains unknown. We mapped the semantic anatomical network (connectome) by conducting diffusion imaging tractography in 48 healthy participants across 90 GM "nodes," and correlating the integrity of each obtained WM edge with semantic performance across 80 brain-damaged patients. Fifty-three WM edges were obtained whose lower integrity was associated with semantic deficits; together with their linked GM nodes, these edges constitute a semantic WM network. Graph analyses of this network revealed three structurally segregated modules that point to distinct semantic processing components, and identified network hubs and connectors that are central in the communication across the subnetworks. Together, our results provide an anatomical framework of the human semantic network, advancing the understanding of the structural substrates supporting semantic processing. © 2015 Wiley Periodicals, Inc.
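
    A minimal sketch of the kind of graph analysis described (module detection plus hub/connector identification), assuming a weighted adjacency matrix of the recovered WM network is available; the specific algorithms (greedy modularity, betweenness) and the top-5 cut-off are illustrative choices, not necessarily those used in the paper:

    ```python
    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    def analyze_structural_network(adjacency, node_labels, n_top=5):
        """Module detection plus hub/connector identification for a weighted
        white-matter network (edge weight = tract integrity)."""
        G = nx.from_numpy_array(np.asarray(adjacency, dtype=float))
        G = nx.relabel_nodes(G, dict(enumerate(node_labels)))

        # Structurally segregated modules via modularity maximization.
        modules = list(greedy_modularity_communities(G, weight="weight"))

        # Hubs: nodes with high weighted degree (strength).
        strength = dict(G.degree(weight="weight"))
        hubs = sorted(strength, key=strength.get, reverse=True)[:n_top]

        # Connectors: high betweenness; convert weights to distances first,
        # so that a stronger edge behaves as a shorter path.
        for _, _, d in G.edges(data=True):
            d["length"] = 1.0 / d["weight"]
        betweenness = nx.betweenness_centrality(G, weight="length")
        connectors = sorted(betweenness, key=betweenness.get, reverse=True)[:n_top]

        return modules, hubs, connectors
    ```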

  9. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivities models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
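
    As one hedged illustration of what a system sensitivities model around a point design could look like, the sketch below computes one-at-a-time finite-difference sensitivities of system outputs (e.g. mass, power, cost) to key requirements; the function and parameter names are hypothetical and are not part of the process described in the paper:

    ```python
    import numpy as np

    def local_sensitivities(design_model, x0, rel_step=0.01):
        """One-at-a-time finite-difference sensitivities of system outputs
        to key requirements, evaluated around the point design x0.
        `design_model` maps a requirements vector to an outputs vector;
        all names here are hypothetical."""
        x0 = np.asarray(x0, dtype=float)
        y0 = np.asarray(design_model(x0), dtype=float)
        J = np.zeros((y0.size, x0.size))     # local sensitivity (Jacobian) matrix
        for j in range(x0.size):
            dx = np.zeros_like(x0)
            dx[j] = rel_step * (abs(x0[j]) if x0[j] != 0 else 1.0)
            y_plus = np.asarray(design_model(x0 + dx), dtype=float)
            J[:, j] = (y_plus - y0) / dx[j]
        return y0, J    # point-design outputs and their local trade-space slopes
    ```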

  10. Dynamics of the Molten Contact Line

    NASA Technical Reports Server (NTRS)

    Sonin, Ain A.; Schiaffino, Stefano

    1996-01-01

    In contrast to the ordinary contact line problem, virtually no information is available on the similar problem associated with a molten material spreading on a solid which is below the melt's fusion point. The latter is a more complex problem in which heat transfer and solidification take place simultaneously with spreading, and it requires answers not only for the hot melt's advance speed over the cold solid as a function of contact angle, but also for how one is to predict the point at which the molten contact line is arrested by freezing. These issues are of importance in evolving methods of materials processing. The purpose of our work is to develop, based on both experiments and theory, an understanding of the dynamic processes that occur when a molten droplet touches a subcooled solid, spreads partly over it by capillary action, and freezes. We seek answers to the following basic questions. First, what is the relationship between the melt's contact line speed and the apparent (dynamic) contact angle? Second, at what point will the contact line motion be arrested by freezing? The talk will describe three components of our work: (1) deposition experiments with small molten droplets; (2) investigation of the dynamics of the molten contact line by means of a novel forced spreading method; and (3) an attempt to provide a theoretical framework for answering the basic questions posed above.

  11. Early environments and the ecology of inflammation

    PubMed Central

    McDade, Thomas W.

    2012-01-01

    Recent research has implicated inflammatory processes in the pathophysiology of a wide range of chronic degenerative diseases, although inflammation has long been recognized as a critical line of defense against infectious disease. However, current scientific understandings of the links between chronic low-grade inflammation and diseases of aging are based primarily on research in high-income nations with low levels of infectious disease and high levels of overweight/obesity. From a comparative and historical point of view, this epidemiological situation is relatively unique, and it may not capture the full range of ecological variation necessary to understand the processes that shape the development of inflammatory phenotypes. The human immune system is characterized by substantial developmental plasticity, and a comparative, developmental, ecological framework is proposed to cast light on the complex associations among early environments, regulation of inflammation, and disease. Recent studies in the Philippines and lowland Ecuador reveal low levels of chronic inflammation, despite higher burdens of infectious disease, and point to nutritional and microbial exposures in infancy as important determinants of inflammation in adulthood. By shaping the regulation of inflammation, early environments moderate responses to inflammatory stimuli later in life, with implications for the association between inflammation and chronic diseases. Attention to the eco-logics of inflammation may point to promising directions for future research, enriching our understanding of this important physiological system and informing approaches to the prevention and treatment of disease. PMID:23045646

  12. Personality and the Intergenerational Transmission of Educational Attainment: Evidence from Germany

    PubMed Central

    Ryberg, Renee; Bauldry, Shawn; Schultz, Michael A.; Steinhoff, Annekatrin; Shanahan, Michael

    2018-01-01

    Research based in the United States, with its relatively open educational system, has found that personality mediates the relationship between parents' and children's educational attainment, and that this mediational pattern is especially beneficial to students from less-educated households. Yet in highly structured, competitive educational systems, personal characteristics may not predict attainment, or may be more or less consequential at different points in the educational career. We examine the salience of personality in the educational attainment process in the German educational system. Data come from a longitudinal sample of 682 seventeen- to twenty-five-year-olds (54% female) from the 2005 and 2015 German Socio-Economic Panel (SOEP). Results show that adolescent personality traits (openness, neuroticism, and conscientiousness) are associated with educational attainment, but personality plays a negligible role in the intergenerational transmission of education. Personality is influential before the decision about the type of secondary degree that a student will pursue (during adolescence). After that turning point, when students have entered different pathways through the system, personality is less salient. Cross-national comparisons in a life course framework broaden the scope of current research on non-cognitive skills and processes of socioeconomic attainment, alerting the analyst to the importance of both institutional structures and the changing importance of these skills at different points in the life course. PMID:28707154

  13. A framework for the definition of standardized protocols for measuring upper-extremity kinematics.

    PubMed

    Kontaxis, A; Cutti, A G; Johnson, G R; Veeger, H E J

    2009-03-01

    Increasing interest in upper extremity biomechanics has led to closer investigations of both segment movements and detailed joint motion. Unfortunately, conceptual and practical differences among the motion analysis protocols used to date reduce compatibility for data pooling and cross-validation analyses, and so weaken the body of knowledge. This difficulty highlights a need for standardised protocols, each addressing a set of questions of comparable content. The aim of this work is therefore to open a discussion and propose a flexible framework to support: (1) the definition of standardised protocols, (2) a standardised description of these protocols, and (3) the formulation of general recommendations. The proposed framework is composed of two nested flowcharts. The first defines what a motion analysis protocol is by pointing out its role in a motion analysis study. The second flowchart describes the steps to build a protocol, which requires decisions on the joints or segments to be investigated and the description of their mechanical equivalent model, the definition of the anatomical or functional coordinate frames, the choice of marker or sensor configuration and the validity of their use, the definition of the activities to be measured, and the refinements that can be applied to the final measurements. Finally, general recommendations are proposed for each of the steps based on the current literature, and open issues are highlighted for future investigation and standardisation. Standardisation of motion analysis protocols is urgent. The proposed framework can guide this process through the rationalisation of the approach.

  14. From Field Notes to Data Portal - A Scalable Data QA/QC Framework for Tower Networks: Progress and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.

    2017-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
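
    Point-based automated quality flagging of the kind mentioned above typically combines simple per-point tests. The sketch below is a generic illustration with made-up thresholds; it is not NEON's actual algorithm or its published parameters:

    ```python
    import numpy as np

    def flag_point_quality(values, valid_range, max_step, window=9, spike_mad=5.0):
        """Point-based automated quality flagging: range, step, and spike tests.
        Thresholds here are hypothetical placeholders.
        Returns an integer flag per point (0 = pass, 1 = fail)."""
        v = np.asarray(values, dtype=float)
        flags = np.zeros(v.size, dtype=int)

        # Range test: value outside physically plausible bounds.
        lo, hi = valid_range
        flags[(v < lo) | (v > hi)] = 1

        # Step test: jump between consecutive points exceeds max_step.
        flags[np.abs(np.diff(v, prepend=v[0])) > max_step] = 1

        # Spike test: deviation from a rolling median exceeds spike_mad * MAD.
        half = window // 2
        for i in range(v.size):
            seg = v[max(0, i - half): i + half + 1]
            med = np.median(seg)
            mad = np.median(np.abs(seg - med)) or 1e-9
            if abs(v[i] - med) > spike_mad * mad:
                flags[i] = 1
        return flags
    ```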

  15. Rapid, sensitive, and selective fluorescent DNA detection using iron-based metal-organic framework nanorods: Synergies of the metal center and organic linker.

    PubMed

    Tian, Jingqi; Liu, Qian; Shi, Jinle; Hu, Jianming; Asiri, Abdullah M; Sun, Xuping; He, Yuquan

    2015-09-15

    Considerable recent attention has been paid to homogeneous fluorescent DNA detection using nanostructures as a universal "quencher", but it still remains a great challenge to develop such a nanosensor with the benefits of low cost, high speed, sensitivity, and selectivity. In this work, we report the use of iron-based metal-organic framework nanorods as a highly efficient sensing platform for fluorescent DNA detection. It takes only about 4 min to complete the whole "mix-and-detect" process, with a low detection limit of 10 pM and strong discrimination of single point mutations. Control experiments reveal that the remarkable sensing behavior is a consequence of the synergies of the metal center and the organic linker. This work elucidates how composition control of nanostructures can significantly impact their sensing properties, enabling new opportunities for the rational design of functional materials for analytical applications. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. A clustering-based graph Laplacian framework for value function approximation in reinforcement learning.

    PubMed

    Xu, Xin; Huang, Zhenhua; Graves, Daniel; Pedrycz, Witold

    2014-12-01

    In order to deal with sequential decision problems with large or continuous state spaces, feature representation and function approximation have been a major research topic in reinforcement learning (RL). In this paper, a clustering-based graph Laplacian framework is presented for feature representation and value function approximation (VFA) in RL. By making use of clustering techniques, namely K-means clustering or fuzzy C-means clustering, a graph Laplacian is constructed by subsampling in Markov decision processes (MDPs) with continuous state spaces. The basis functions for VFA can be automatically generated from spectral analysis of the graph Laplacian. The clustering-based graph Laplacian is integrated with a class of approximate policy iteration algorithms called representation policy iteration (RPI) for RL in MDPs with continuous state spaces. Simulation and experimental results show that, compared with previous RPI methods, the proposed approach needs fewer sample points to compute an efficient set of basis functions, and the learning control performance can be improved for a variety of parameter settings.
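
    A minimal sketch of the idea (not the paper's exact algorithm or settings): subsample a continuous state space with K-means, build a Gaussian-weighted graph over the cluster centers, and take the smoothest Laplacian eigenvectors as basis functions for value function approximation:

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist
    from sklearn.cluster import KMeans

    def laplacian_basis(states, n_clusters=50, sigma=1.0, n_basis=10):
        """Clustering-based graph Laplacian basis functions for VFA.
        All parameter values are illustrative."""
        centers = KMeans(n_clusters=n_clusters, n_init=10).fit(states).cluster_centers_

        # Gaussian affinity graph over cluster centers and its combinatorial Laplacian.
        d = cdist(centers, centers)
        W = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        L = np.diag(W.sum(axis=1)) - W

        # Eigenvectors with the smallest eigenvalues are the smoothest modes.
        _, eigvecs = np.linalg.eigh(L)
        basis = eigvecs[:, :n_basis]

        def features(x):
            # Map an arbitrary state to features via its nearest cluster center.
            idx = int(np.argmin(np.linalg.norm(centers - np.asarray(x), axis=1)))
            return basis[idx]

        return features
    ```

    The resulting feature map can then be plugged into a linear value-function approximator inside an approximate policy iteration loop such as RPI.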

  17. Research and implementation of role-playing teaching mode supported by gamification

    NASA Astrophysics Data System (ADS)

    Cui, Xu; Zhang, Zhenglei; Sun, Lei

    2017-08-01

    The paper designs a role-playing teaching mode supported by gamification to stimulate the interest of learners. In creating the teaching mode, investigation and research identified incentive factors, teaching mode and course selection as the most important factors. Under the guidance of these three factors, a learning framework for the role-playing teaching mode, called the Gamification Learning Framework (GM1.0), is determined. In the design of GM1.0, problem cases that students are interested in are first collected, and three courses are selected: Algorithm Design, Data Structure and Program Design. The knowledge points of the three courses are then extracted and merged into the problem cases to form game maps. Finally, learners take on a role-playing character, join games supported by the game maps, and complete selected tasks, reaching higher task levels through checkpoint upgrades, experience promotions and medal awards. In this way, learners' enthusiasm for learning can be stimulated and their innovation abilities gradually improved.

  18. In situ analysis of the organic framework in the prismatic layer of mollusc shell.

    PubMed

    Tong, Hua; Hu, Jiming; Ma, Wentao; Zhong, Guirong; Yao, Songnian; Cao, Nianxing

    2002-06-01

    A novel in situ analytic approach was constructed by means of ion sputtering, decalcification and deproteinization techniques combined with scanning electron microscopy (SEM) and transmission electron microscopy (TEM) ultrastructural analysis. The method was employed to determine the spatial distribution of the organic framework outside and inside the crystal, and the spatial geometrical relationship of the organic/inorganic interface, in the prismatic layer of Cristaria plicata (Leach). The results show that there is a substructure of organic matrix in the intracrystalline region. The prismatic layer forms according to a strict hierarchical configuration of regular pattern. Each unit of the organic template of the prismatic layer uniquely determines the column crystal's growth direction, spatial orientation and size. Cavity templates are responsible for supporting, limiting size and shape, and determining the spatial orientation of crystal growth, while the intracrystalline organic matrix is responsible for providing nucleation points and inducing the nucleation process of calcite. The stereo hierarchical fabrication of the prismatic layer was elucidated for the first time.

  19. Quantifying the Energy Landscape Statistics in Proteins - a Relaxation Mode Analysis

    NASA Astrophysics Data System (ADS)

    Cai, Zhikun; Zhang, Yang

    The energy landscape, the hypersurface in configurational space, has been a useful concept in describing complex processes that occur over very long time scales, such as the multistep slow relaxations of supercooled liquids and the folding of polypeptide chains into structured proteins. Despite extensive simulation studies, its experimental characterization still remains a challenge. To address this challenge, we developed a relaxation mode analysis (RMA) for liquids under a framework analogous to the normal mode analysis for solids. Using RMA, important statistics of the activation barriers of the energy landscape become accessible from experimentally measurable two-point correlation functions, e.g. using quasi-elastic and inelastic scattering experiments. We observed a prominent coarsening effect of the energy landscape. The results were further confirmed by direct sampling of the energy landscape using a metadynamics-like adaptive autonomous basin-climbing computation. We first demonstrate RMA in a supercooled liquid when dynamical cooperativity emerges in the landscape-influenced regime. We then show that this framework reveals encouraging energy landscape statistics when applied to proteins.

  20. A socio-technical analytical framework on the EHR-organizational innovation interplay: Insights from a public hospital in Greece.

    PubMed

    Emmanouilidou, Maria

    2015-01-01

    The healthcare sector globally is confronted with increasing internal and external pressures that urge a radical reform of health systems' status quo. The role of technological innovations such as Electronic Health Records (EHR) is recognized as instrumental in this transition process, as they are expected to accelerate organizational innovations. This is why the widespread uptake of EHR systems is a top priority on the global healthcare agenda. The successful co-deployment of EHR systems and organizational innovations within the context of secondary healthcare institutions is, however, a complex and multifaceted issue. Existing research in the field has made little progress, emphasizing the need for further contributions that incorporate a holistic perspective. This paper organizes insights about the EHR-organizational innovation interplay, drawn from a public hospital in Greece, into a socio-technical analytical framework, providing a multilevel set of action points for the eHealth roadmap with worldwide relevance.
