Sample records for interaction methodology applicable

  1. Exploring the bases for a mixed reality stroke rehabilitation system, Part I: A unified approach for representing action, quantitative evaluation, and interactive feedback

    PubMed Central

    2011-01-01

    Background: Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology that can be generalized to interactive stroke rehabilitation, is presently unavailable. Results: This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. Our resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions: The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They furthermore can be applied to other rehabilitation needs beyond stroke. PMID:21875441

  2. User Interaction Modeling and Profile Extraction in Interactive Systems: A Groupware Application Case Study †

    PubMed Central

    Tîrnăucă, Cristina; Duque, Rafael; Montaña, José L.

    2017-01-01

    A relevant goal in human–computer interaction is to produce applications that are easy to use and well-adjusted to their users’ needs. To address this problem it is important to know how users interact with the system. This work constitutes a methodological contribution capable of identifying the context of use in which users perform interactions with a groupware application (synchronous or asynchronous) and provides, using machine learning techniques, generative models of how users behave. Additionally, these models are transformed into a text that describes in natural language the main characteristics of the interaction of the users with the system. PMID:28726762
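The "generative models of how users behave" described above can be illustrated, in simplified form, by a first-order Markov chain estimated from a log of interaction events. This is a minimal sketch, not the paper's actual machine-learning pipeline; the event names are hypothetical:

```python
from collections import defaultdict

def fit_markov_chain(event_log):
    """Estimate first-order transition probabilities from a sequence of UI events."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, curr in zip(event_log, event_log[1:]):
        counts[prev][curr] += 1
    model = {}
    for prev, nexts in counts.items():
        total = sum(nexts.values())
        model[prev] = {evt: n / total for evt, n in nexts.items()}
    return model

# Hypothetical log of one user's actions in a groupware session
log = ["open_doc", "edit", "edit", "save", "edit", "save", "open_doc", "edit"]
model = fit_markov_chain(log)
print(model["edit"])  # probabilities of the next event after "edit"
```

A model like this can then be verbalized ("after editing, this user usually saves"), which is the spirit of the natural-language descriptions the abstract mentions.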

  3. Elastic interactions between two-dimensional geometric defects

    NASA Astrophysics Data System (ADS)

    Moshe, Michael; Sharon, Eran; Kupferman, Raz

    2015-12-01

    In this paper, we introduce a methodology applicable to a wide range of localized two-dimensional sources of stress. This methodology is based on a geometric formulation of elasticity. Localized sources of stress are viewed as singular defects—point charges of the curvature associated with a reference metric. The stress field in the presence of defects can be solved using a scalar stress function that generalizes the classical Airy stress function to the case of materials with nontrivial geometry. This approach allows the calculation of interaction energies between various types of defects. We apply our methodology to two physical systems: shear-induced failure of amorphous materials and the mechanical interaction between contracting cells.
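For orientation, the generalized Airy stress function mentioned above reduces, for a flat reference metric, to the classical relations of 2D linear elasticity, with defects entering as singular curvature sources (a standard linearized result; sign conventions vary between references):

```latex
% Classical Airy relations:
\sigma_{xx} = \partial_y^2 \chi, \qquad
\sigma_{yy} = \partial_x^2 \chi, \qquad
\sigma_{xy} = -\,\partial_x \partial_y \chi .
% In the geometric formulation a defect acts as a source of
% Gaussian curvature \bar{K} of the reference metric:
\Delta^2 \chi = -E\,\bar{K}, \qquad
\bar{K}(\mathbf{x}) = q\,\delta^2(\mathbf{x})
\quad \text{for a point defect of charge } q ,
```

where E is the Young's modulus and the biharmonic equation is sourced by the reference curvature rather than vanishing as in the defect-free case.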

  4. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  5. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
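A toy version of the verification question posed above (does a reduced interface display still let the operator predict the machine unambiguously?) can be sketched by checking that abstracting machine states into display modes keeps the transition relation deterministic. The machine and mode mappings below are invented for illustration, not taken from the paper's case study:

```python
def interface_is_unambiguous(transitions, display):
    """Check that, seen through the display abstraction, every
    (displayed mode, event) pair leads to a single next displayed mode;
    otherwise the operator cannot reliably predict the machine."""
    abstract = {}
    for (state, event), nxt in transitions.items():
        key = (display[state], event)
        target = display[nxt]
        if key in abstract and abstract[key] != target:
            return False  # same displayed mode + same action, different outcomes
        abstract[key] = target
    return True

# Hypothetical machine: two internal states shown as the same mode "AUTO"
transitions = {("hold", "press"): "climb", ("track", "press"): "descend"}
display_bad = {"hold": "AUTO", "track": "AUTO", "climb": "CLB", "descend": "DES"}
display_ok = {"hold": "HOLD", "track": "TRK", "climb": "CLB", "descend": "DES"}
print(interface_is_unambiguous(transitions, display_bad))  # False
print(interface_is_unambiguous(transitions, display_ok))   # True
```

The coarse display collapses two states whose responses to the same pilot action differ, which is exactly the kind of design error the formal methodology is meant to detect.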

  6. Watermark: An Application and Methodology for Interactive and Intelligent Decision Support for Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.

    2016-12-01

    As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources to effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merges topical analysis of context-rich corpora from policy texts with multi-attributed solution sets from integrated models of water resource and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study highlighting multi-criteria spatial decision making and uncertainty comparisons.

  7. Modal interactions due to friction in the nonlinear vibration response of the "Harmony" test structure: Experiments and simulations

    NASA Astrophysics Data System (ADS)

    Claeys, M.; Sinou, J.-J.; Lambelin, J.-P.; Todeschini, R.

    2016-08-01

    The nonlinear vibration response of an assembly with friction joints - named "Harmony" - is studied both experimentally and numerically. The experimental results exhibit a softening effect and an increase of dissipation with excitation level. Modal interactions due to friction are also evidenced. The numerical methodology proposed groups together well-known structural dynamic methods, including finite elements, substructuring, Harmonic Balance and continuation methods. On the one hand, the application of this methodology proves its capacity to treat a complex system where several friction movements occur at the same time. On the other hand, the main contribution of this paper is the experimental and numerical study of evidence of modal interactions due to friction. The simulation methodology succeeds in reproducing complex forms of dynamic behavior, such as these modal interactions.

  8. Review of Land Use Models: Theory and Application

    DOT National Transportation Integrated Search

    1997-01-01

    This paper discusses methodology in reviewing land use models and identifying desired attributes for recommending a model for application by the Delaware Valley Planning Commission (DVRPC). The need for land-use transportation interaction is explored...

  9. From Know How to Do Now: Instructional Applications of Simulated Interactions within Teacher Education

    ERIC Educational Resources Information Center

    Dotger, Benjamin H.

    2011-01-01

    The induction years of teaching challenge novice educators to quickly transition from what they learned as teacher candidates into what they can do as emerging professionals. This article outlines a simulated interaction methodology to help bridge teacher preparation and practice. Building from examples of simulated interactions between teacher…

  10. Applications of Tutoring Systems in Specialized Subject Areas: An Analysis of Skills, Methodologies, and Results.

    ERIC Educational Resources Information Center

    Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.

    2003-01-01

    This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…

  11. Remote sensing applied to agriculture: Basic principles, methodology, and applications

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Mendonca, F. J.

    1981-01-01

    The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; reflectance, transmittance, and absorptance of plants; interactions of plants and soils with reflected energy; leaf morphology; and factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.

  12. Verification of nonlinear dynamic structural test results by combined image processing and acoustic analysis

    NASA Astrophysics Data System (ADS)

    Tene, Yair; Tene, Noam; Tene, G.

    1993-08-01

    An interactive data fusion methodology of video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of a heavy transportable bridge collapse during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks and rupture of high performance structural materials. Videotape recording by a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.

  13. Aircraft optimization by a system approach: Achievements and trends

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1992-01-01

    Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as enabling technology for practical implementation of the methodology.

  14. Flow Cytometric Analysis of Bimolecular Fluorescence Complementation: A High Throughput Quantitative Method to Study Protein-protein Interaction

    PubMed Central

    Wang, Li; Carnegie, Graeme K.

    2013-01-01

    Among methods to study protein-protein interaction inside cells, Bimolecular Fluorescence Complementation (BiFC) is relatively simple and sensitive. BiFC is based on the production of fluorescence using two non-fluorescent fragments of a fluorescent protein (Venus, a Yellow Fluorescent Protein variant, is used here). Non-fluorescent Venus fragments (VN and VC) are fused to two interacting proteins (in this case, AKAP-Lbc and PDE4D3), yielding fluorescence due to VN-AKAP-Lbc-VC-PDE4D3 interaction and the formation of a functional fluorescent protein inside cells. BiFC provides information on the subcellular localization of protein complexes and the strength of protein interactions based on fluorescence intensity. However, BiFC analysis using microscopy to quantify the strength of protein-protein interaction is time-consuming and somewhat subjective due to heterogeneity in protein expression and interaction. By coupling flow cytometric analysis with BiFC methodology, the fluorescent BiFC protein-protein interaction signal can be accurately measured for a large quantity of cells in a short time. Here, we demonstrate an application of this methodology to map regions in PDE4D3 that are required for the interaction with AKAP-Lbc. This high throughput methodology can be applied to screening factors that regulate protein-protein interaction. PMID:23979513
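Computationally, the flow-cytometry readout described above amounts to thresholding per-cell fluorescence and summarizing the BiFC-positive population. A minimal sketch with synthetic intensities (arbitrary units; the constructs and threshold are illustrative, not the paper's data):

```python
from statistics import median

def bifc_summary(intensities, threshold):
    """Return the fraction of BiFC-positive cells and their median intensity."""
    positive = [x for x in intensities if x >= threshold]
    fraction = len(positive) / len(intensities)
    med = median(positive) if positive else 0.0
    return fraction, med

# Synthetic per-cell Venus fluorescence for two hypothetical PDE4D3 constructs
full_length = [5, 220, 340, 180, 12, 410, 260, 8, 300, 190]
truncated   = [5, 14, 9, 22, 7, 11, 18, 6, 10, 13]
print(bifc_summary(full_length, threshold=50))  # strong positive population
print(bifc_summary(truncated, threshold=50))    # signal lost with the region removed
```

Comparing such summaries across deletion constructs is, in essence, how interaction-mapping results like those for PDE4D3 and AKAP-Lbc are quantified.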

  15. Flow cytometric analysis of bimolecular fluorescence complementation: a high throughput quantitative method to study protein-protein interaction.

    PubMed

    Wang, Li; Carnegie, Graeme K

    2013-08-15

    Among methods to study protein-protein interaction inside cells, Bimolecular Fluorescence Complementation (BiFC) is relatively simple and sensitive. BiFC is based on the production of fluorescence using two non-fluorescent fragments of a fluorescent protein (Venus, a Yellow Fluorescent Protein variant, is used here). Non-fluorescent Venus fragments (VN and VC) are fused to two interacting proteins (in this case, AKAP-Lbc and PDE4D3), yielding fluorescence due to VN-AKAP-Lbc-VC-PDE4D3 interaction and the formation of a functional fluorescent protein inside cells. BiFC provides information on the subcellular localization of protein complexes and the strength of protein interactions based on fluorescence intensity. However, BiFC analysis using microscopy to quantify the strength of protein-protein interaction is time-consuming and somewhat subjective due to heterogeneity in protein expression and interaction. By coupling flow cytometric analysis with BiFC methodology, the fluorescent BiFC protein-protein interaction signal can be accurately measured for a large quantity of cells in a short time. Here, we demonstrate an application of this methodology to map regions in PDE4D3 that are required for the interaction with AKAP-Lbc. This high throughput methodology can be applied to screening factors that regulate protein-protein interaction.

  16. The Role of Research in Making Interactive Products Effective.

    ERIC Educational Resources Information Center

    Rossi, Robert J.

    1986-01-01

    Argues that research and development (R&D) methods should be utilized to develop new technologies for training and retailing and describes useful research tools--critical incident methodology, task analysis, performance recording. Discussion covers R&D applications to interactive systems development in the areas of product need, customer…

  17. Employing WebGL to develop interactive stereoscopic 3D content for use in biomedical visualization

    NASA Astrophysics Data System (ADS)

    Johnston, Semay; Renambot, Luc; Sauter, Daniel

    2013-03-01

    Web Graphics Library (WebGL), the forthcoming web standard for rendering native 3D graphics in a browser, represents an important addition to the biomedical visualization toolset. It is projected to become a mainstream method of delivering 3D online content due to shrinking support for third-party plug-ins. Additionally, it provides a virtual reality (VR) experience to web users accommodated by the growing availability of stereoscopic displays (3D TV, desktop, and mobile). WebGL's value in biomedical visualization has been demonstrated by applications for interactive anatomical models, chemical and molecular visualization, and web-based volume rendering. However, a lack of instructional literature specific to the field prevents many from utilizing this technology. This project defines a WebGL design methodology for a target audience of biomedical artists with a basic understanding of web languages and 3D graphics. The methodology was informed by the development of an interactive web application depicting the anatomy and various pathologies of the human eye. The application supports several modes of stereoscopic displays for a better understanding of 3D anatomical structures.

  18. Transportation Systems Evaluation

    NASA Technical Reports Server (NTRS)

    Fanning, M. L.; Michelson, R. A.

    1972-01-01

    A methodology for the analysis of transportation systems consisting of five major interacting elements is reported. The analysis begins with the causes of travel demand: geographic, economic, and demographic characteristics as well as attitudes toward travel. Through the analysis, the interaction of these factors with the physical and economic characteristics of the transportation system is determined. The result is an evaluation of the system from the point of view of both passenger and operator. The methodology is applicable to the intraurban transit systems as well as major airlines. Applications of the technique to analysis of a PRT system and a study of intraurban air travel are given. In the discussion several unique models or techniques are mentioned: i.e., passenger preference modeling, an integrated intraurban transit model, and a series of models to perform airline analysis.

  19. Application of control theory to dynamic systems simulation

    NASA Technical Reports Server (NTRS)

    Auslander, D. M.; Spear, R. C.; Young, G. E.

    1982-01-01

    Control theory is applied to dynamic systems simulation. Theory and methodology applicable to controlled ecological life support systems are considered. Spatial effects on system stability, design of control systems with uncertain parameters, and an interactive computing language (PARASOL-II) designed for dynamic system simulation, report-quality graphics, data acquisition, and simple real-time control are discussed.

  20. Cumulative impact assessment: Application of a methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witmer, G.W.; Bain, M.B.; Irving, J.S.

    We expanded upon the Federal Energy Regulatory Commission's (FERC) Cluster Impact Assessment Procedure (CIAP) to provide a practical methodology for assessing potential cumulative impacts from multiple hydroelectric projects within a river basin. The objectives in designing the methodology were to allow the evaluation of a large number of combinations of proposed projects and to minimize constraints on the use of ecological knowledge for planning and regulating hydroelectric development at the river basin level. Interactive workshops and evaluative matrices were used to identify preferred development scenarios in the Snohomish (Washington) and Salmon (Idaho) River Basins. Although the methodology achieved its basic objectives, some difficulties were encountered. These revolved around issues of (1) data quality and quantity, (2) alternatives analysis, (3) determination of project interactions, (4) determination of cumulative impact thresholds, and (5) the use of evaluative techniques to express degrees of impact. 8 refs., 1 fig., 2 tabs.

  1. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with implementation of safety measures. The resultant system tells us the extent of reduction of risk by each successive safety measure. It also indicates, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.
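The interactive link between risk assessment and safety measures can be illustrated with a minimal fault-tree calculation: recompute the top-event probability after a measure lowers a basic-event probability. The events, gate structure, and numbers below are invented for illustration, assuming independent basic events:

```python
def or_gate(probs):
    """P(at least one event occurs), for independent basic events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur), for independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical top event: release occurs if (leak AND ignition) OR overpressure
leak, ignition, overpressure = 0.10, 0.20, 0.05
baseline = or_gate([and_gate([leak, ignition]), overpressure])
# Safety measure: a relief valve cuts the overpressure probability tenfold
improved = or_gate([and_gate([leak, ignition]), overpressure / 10])
print(f"baseline {baseline:.4f} -> improved {improved:.4f}")
```

Iterating this recomputation measure by measure is the kind of feedback loop the SCAP methodology makes explicit.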

  2. Analysis of the impact of safeguards criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, M.F.; Reardon, P.T.

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables. That is, the effect of a given variable often depends on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.

  3. Evaluative methodology for comprehensive water quality management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, H. L.

    Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.

  4. P-Soccer: Soccer Games Application using Kinect

    NASA Astrophysics Data System (ADS)

    Nasir, Mohamad Fahim Mohamed; Suparjoh, Suriawati; Razali, Nazim; Mustapha, Aida

    2018-05-01

    This paper presents a soccer game application called P-Soccer that uses Kinect as the interaction medium between users and the game characters. P-Soccer focuses on training penalty kicks with one character taking the kick. The game was developed following the Game Development Life Cycle (GDLC) methodology. Results from alpha and beta testing showed that target users were satisfied with the overall game design and theme, as well as the interactivity with the main character in the game.

  5. Web-4D-QSAR: A web-based application to generate 4D-QSAR descriptors.

    PubMed

    Ataide Martins, João Paulo; Rougeth de Oliveira, Marco Antônio; Oliveira de Queiroz, Mário Sérgio

    2018-06-05

    A web-based application is developed to generate 4D-QSAR descriptors using the LQTA-QSAR methodology, based on molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package. The LQTAGrid module calculates the intermolecular interaction energies at each grid point, considering probes and all aligned conformations resulting from MD simulations. These interaction energies are the independent variables or descriptors employed in a QSAR analysis. A friendly front end web interface, built using the Django framework and Python programming language, integrates all steps of the LQTA-QSAR methodology in a way that is transparent to the user, and in the backend, GROMACS and LQTAGrid are executed to generate 4D-QSAR descriptors to be used later in the process of QSAR model building. © 2018 Wiley Periodicals, Inc.

  6. Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Wieting, A. R.

    1979-01-01

    The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.

  7. Protein-ligand docking using FFT based sampling: D3R case study.

    PubMed

    Padhorny, Dzmitry; Hall, David R; Mirzaei, Hanieh; Mamonov, Artem B; Moghadasi, Mohammad; Alekseenko, Andrey; Beglov, Dmitri; Kozakov, Dima

    2018-01-01

    Fast Fourier transform (FFT) based approaches have been successful in application to modeling of relatively rigid protein-protein complexes. Recently, we have been able to adapt the FFT methodology to treatment of flexible protein-peptide interactions. Here, we report our latest attempt to expand the capabilities of the FFT approach to treatment of flexible protein-ligand interactions in application to the D3R PL-2016-1 challenge. Based on the D3R assessment, our FFT approach in conjunction with Monte Carlo minimization off-grid refinement was among the top performing methods in the challenge. The potential advantage of our method is its ability to globally sample the protein-ligand interaction landscape, which will be explored in further applications.
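The core of FFT-based sampling is that every translational placement of a ligand grid against a receptor grid is scored by a single correlation. The direct sum below (1D, toy occupancy grids invented for illustration) is exactly what an FFT evaluates in O(n log n) instead of O(n²):

```python
def correlation_scores(receptor, ligand):
    """Score every translational offset of `ligand` along `receptor`
    by the overlap sum; an FFT computes the same array in O(n log n)."""
    n, m = len(receptor), len(ligand)
    scores = []
    for shift in range(n - m + 1):
        scores.append(sum(receptor[shift + i] * ligand[i] for i in range(m)))
    return scores

# Toy 1D grids: positive values = favorable contact, negative = clash penalty
receptor = [0, 0, 1, 2, 2, 1, 0, -3]
ligand = [1, 2, 1]
scores = correlation_scores(receptor, ligand)
best = max(range(len(scores)), key=scores.__getitem__)
print(scores, "best offset:", best)
```

Real docking codes repeat this over rotations and several energy terms on 3D grids; the global coverage of placements is the advantage the abstract attributes to the FFT approach.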

  8. Design Requirements for Communication-Intensive Interactive Applications

    NASA Astrophysics Data System (ADS)

    Bolchini, Davide; Garzotto, Franca; Paolini, Paolo

    Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in profound coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement, to call for action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons-learned from direct experience, we draw on the concepts of brand, value, communication goals, information and persuasion requirements to systematically guide analysts to master the multifaceted connections of these elements as drivers to inform successful communication designs.

  9. Grounded theory as a method for research in speech and language therapy.

    PubMed

    Skeat, J; Perry, A

    2008-01-01

    The use of qualitative methodologies in speech and language therapy has grown over the past two decades, and there is now a body of literature, both generally describing qualitative research and detailing its applicability to health practice(s). However, there has been only limited profession-specific discussion of qualitative methodologies and their potential application to speech and language therapy. This paper aims to describe the methodology of grounded theory and to explain how it might usefully be applied to areas of speech and language research where theoretical frameworks or models are lacking. Grounded theory as a methodology for inductive theory-building from qualitative data is explained and discussed. Some differences between 'modes' of grounded theory are clarified and areas of controversy within the literature are highlighted. The past application of grounded theory to speech and language therapy, and its potential for informing research and clinical practice, are examined. This paper provides an in-depth critique of a qualitative research methodology, including an overview of the main difference between two major 'modes'. The article supports the application of a theory-building approach in the profession, which is sometimes complex to learn and apply, but worthwhile in its results. Grounded theory as a methodology has much to offer speech and language therapists and researchers. Although the majority of research and discussion around this methodology has rested within sociology and nursing, grounded theory can be applied by researchers in any field, including speech and language therapists. The benefit of the grounded theory method to researchers and practitioners lies in its application to social processes and human interactions. The resulting theory may support further research in the speech and language therapy profession.

  10. Single-molecule pull-down (SiMPull) for new-age biochemistry: methodology and biochemical applications of single-molecule pull-down (SiMPull) for probing biomolecular interactions in crude cell extracts.

    PubMed

    Aggarwal, Vasudha; Ha, Taekjip

    2014-11-01

    Macromolecular interactions play a central role in many biological processes. Protein-protein interactions have mostly been studied by co-immunoprecipitation, which cannot provide quantitative information on all possible molecular connections present in a complex. We review a new approach that allows cellular proteins and biomolecular complexes to be studied in real time at the single-molecule level. This technique is called single-molecule pull-down (SiMPull) because it integrates principles of conventional immunoprecipitation with powerful single-molecule fluorescence microscopy. SiMPull is used to count how many copies of each protein are present in the physiological complexes found in the cytosol and in membranes. Concurrently, it serves as a single-molecule biochemical tool for functional studies on the pulled-down proteins. In this review, we focus on the detailed methodology of SiMPull, its salient features, and a wide range of biological applications in comparison with other biosensing tools. © 2014 WILEY Periodicals, Inc.

  11. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays only a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not provide adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains the information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems that are not fully automated and whose operational constraints can be posed in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.

  12. A 3D, fully Eulerian, VOF-based solver to study the interaction between two fluids and moving rigid bodies using the fictitious domain method

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Raessi, Mehdi

    2016-04-01

    We present a three-dimensional (3D) and fully Eulerian approach to capturing the interaction between two fluids and moving rigid structures by using the fictitious domain and volume-of-fluid (VOF) methods. The solid bodies can have arbitrarily complex geometry and can pierce the fluid-fluid interface, forming contact lines. The three-phase interfaces are resolved and reconstructed by using a VOF-based methodology. Then, a consistent scheme is employed for transporting mass and momentum, allowing for simulations of three-phase flows with large density ratios. The Eulerian approach significantly simplifies the numerical resolution of the kinematics of rigid bodies of complex geometry with six degrees of freedom. The fluid-structure interaction (FSI) is computed using the fictitious domain method. The methodology was developed in a message passing interface (MPI) parallel framework accelerated with graphics processing units (GPUs). The computationally intensive solution of the pressure Poisson equation is ported to GPUs, while the remaining calculations are performed on CPUs. The performance and accuracy of the methodology are assessed using an array of test cases, focusing individually on the flow solver and on the FSI in surface-piercing configurations. Finally, an application of the proposed methodology to simulations of ocean wave energy converters is presented.
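
    As a rough illustration of the volume-fraction bookkeeping that VOF methods rest on, the sketch below advects a one-dimensional volume fraction with a first-order upwind flux. This is a minimal sketch under simplified assumptions; the function name and parameters are illustrative only, and the actual solver described above is 3D, interface-reconstructing, and coupled to rigid-body motion.

```python
def advect_vof(f, u, dx, dt, steps):
    """Advect volume fractions f with constant velocity u > 0 (first-order upwind)."""
    f = list(f)
    c = u * dt / dx                      # CFL number; stability needs c <= 1
    assert 0.0 < c <= 1.0
    for _ in range(steps):
        new = f[:]
        for i in range(1, len(f)):
            # f_i^{n+1} = f_i^n - c * (f_i^n - f_{i-1}^n)
            new[i] = f[i] - c * (f[i] - f[i - 1])
        f = new
    return f

# A sharp interface (fluid on the left) drifts rightward and smears slightly,
# which is why production VOF solvers reconstruct the interface at every step:
f0 = [1.0] * 5 + [0.0] * 5
f1 = advect_vof(f0, u=1.0, dx=1.0, dt=0.5, steps=4)
```

    With these illustrative parameters the volume fractions remain bounded in [0, 1] while the front advances, but the first-order scheme diffuses the interface, motivating the geometric reconstruction used in real VOF solvers.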

  13. Geo-Engineering through Internet Informatics (GEMINI)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doveton, John H.; Watney, W. Lynn

    The program was a 3-year interdisciplinary effort to develop methodologies and an interactive, integrated Internet website named GEMINI (Geo-Engineering Modeling through Internet Informatics) that builds real-time geo-engineering reservoir models over the Internet using the latest technology in Web applications.

  14. How do young and senior cytopathologists interact with digital cytology?

    PubMed

    Giovagnoli, Maria Rosaria; Giarnieri, Enrico; Carico, Elisabetta; Giansanti, Daniele

    2010-01-01

    Today, thanks to advances in information technology, the scenario for the use of digital cytology has changed radically. New competitive systems, such as client-server architectures, are now available in digital cytology, and their application in telemedicine should be investigated. A new interactive tool designed for the final destination user (the cytopathologist) has been proposed. Taking into account the different expertise of the subjects of the study, the investigation focused both on senior cytopathologists and on younger student pathologists. The methodology was tested on 10 students of a Master in cytopathology and on 3 senior cytopathologists. The study showed that the use of digital cytology applications is effective and feasible for telediagnosis. In particular, the study of younger and senior investigators showed that, although they interacted with the novel technology of the virtual slide in different ways, all of them reached the objective of a "correct diagnosis". This investigation, besides confirming the effectiveness of digital cytology, also showed other indirect and tangible cost-benefit and quantitative advantages, in particular for the learning methodologies of the students of the Master itself and for the biomedical personnel involved in diagnosis.

  15. Techniques of EMG signal analysis: detection, processing, classification and applications

    PubMed Central

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human computer interaction. A comparison study is also given to show performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signal and its analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
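
    To make the processing step concrete, the sketch below computes three time-domain features that are standard in the EMG-analysis literature surveyed here: mean absolute value (MAV), root mean square (RMS), and zero-crossing count (ZC). It is a minimal illustration rather than any specific pipeline from the paper, and the sample window is made up.

```python
import math

def emg_features(window):
    """Time-domain EMG features: mean absolute value (MAV), root mean
    square (RMS), and zero-crossing count (ZC) for one analysis window."""
    n = len(window)
    mav = sum(abs(v) for v in window) / n
    rms = math.sqrt(sum(v * v for v in window) / n)
    # count sign changes between consecutive samples
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return {"MAV": mav, "RMS": rms, "ZC": zc}

# One short analysis window of a (made-up) raw EMG trace:
window = [0.2, -0.1, 0.4, -0.3, 0.1, -0.2]
feats = emg_features(window)
```

    Feature vectors like this one are what downstream classifiers (for example, for grasp recognition or prosthetic control) are typically trained on; practical systems add a noise threshold to the zero-crossing test.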

  16. Multithreaded hybrid feature tracking for markerless augmented reality.

    PubMed

    Lee, Taehee; Höllerer, Tobias

    2009-01-01

    We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction.

  17. Navier-Stokes simulation of plume/Vertical Launching System interaction flowfields

    NASA Astrophysics Data System (ADS)

    York, B. J.; Sinha, N.; Dash, S. M.; Anderson, L.; Gominho, L.

    1992-01-01

    The application of Navier-Stokes methodology to the analysis of Vertical Launching System/missile exhaust plume interactions is discussed. The complex 3D flowfields related to the Vertical Launching System are computed utilizing the PARCH/RNP Navier-Stokes code. PARCH/RNP solves the fully-coupled system of fluid, two-equation turbulence (k-epsilon) and chemical species equations via the implicit, approximately factored, Beam-Warming algorithm utilizing a block-tridiagonal inversion procedure.

  18. Interactive Macroeconomics

    NASA Astrophysics Data System (ADS)

    Di Guilmi, Corrado; Gallegati, Mauro; Landini, Simone

    2017-04-01

    Preface; List of tables; List of figures; 1. Introduction; Part I. Methodological Notes and Tools: 2. The state space notion; 3. The master equation; Part II. Applications to HIA Based Models: 4. Financial fragility and macroeconomic dynamics I: heterogeneity and interaction; 5. Financial fragility and macroeconomic dynamics II: learning; Part III. Conclusions: 6. Conclusive remarks; Part IV. Appendices and Complements: Appendix A: Complements to Chapter 3; Appendix B: Solving the ME to solve the ABM; Appendix C: Specifying transition rates; Index.

  19. Advancements in mass spectrometry for biological samples: Protein chemical cross-linking and metabolite analysis of plant tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Adam

    2015-01-01

    This thesis presents work on advancements and applications of methodology for the analysis of biological samples using mass spectrometry. Included in this work are improvements to chemical cross-linking mass spectrometry (CXMS) for the study of protein structures and mass spectrometry imaging and quantitative analysis to study plant metabolites. Applications include using matrix-assisted laser desorption/ionization-mass spectrometry imaging (MALDI-MSI) to further explore metabolic heterogeneity in plant tissues and chemical interactions at the interface between plants and pests. Additional work was focused on developing liquid chromatography-mass spectrometry (LC-MS) methods to investigate metabolites associated with plant-pest interactions.

  20. Large-Scale Networked Virtual Environments: Architecture and Applications

    ERIC Educational Resources Information Center

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  1. Synergogy: A New Strategy for Education, Training, and Development.

    ERIC Educational Resources Information Center

    Mouton, Jane Srygley; Blake, Robert R.

    The premises, methodologies, and applications of synergogy, a new approach to education and training, are discussed. The synergogic approach allows members of small teams to learn from one another through structured interactions. After examining education within the context of human relationships, consideration is given to the way that the…

  2. Engaging or Distracting: Children's Tablet Computer Use in Education

    ERIC Educational Resources Information Center

    McEwen, Rhonda N.; Dubé, Adam K.

    2015-01-01

    Communications studies and psychology offer analytical and methodological tools that when combined have the potential to bring novel perspectives on human interaction with technologies. In this study of children using simple and complex mathematics applications on tablet computers, cognitive load theory is used to answer the question: how…

  3. A Low Cost Course Information Syndication System

    ERIC Educational Resources Information Center

    Ajayi, A. O.; Olajubu, E. A.; Bello, S. A.; Soriyan, H. A.; Obamuyide, A. V.

    2011-01-01

    This study presents a cost effective, reliable, and convenient mobile web-based system to facilitate the dissemination of course information to students, to support interaction that goes beyond the classroom. The system employed the Really Simple Syndication (RSS) technology and was developed using Rapid Application Development (RAD) methodology.…

  4. Interrelationship of Nondestructive Evaluation Methodologies Applied to Testing of Composite Overwrapped Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Leifeste, Mark R.

    2007-01-01

    Composite Overwrapped Pressure Vessels (COPVs) are commonly used in spacecraft for containment of pressurized gases and fluids, offering strength and weight savings. The stored energy is capable of causing extensive spacecraft damage and personal injury in the event of sudden failure. These apparently simple structures, composed of a metallic, media-impermeable liner and a fiber/resin composite overwrap, are in fact complex structures in which numerous material and structural phenomena interact during pressurized use, requiring multiple, interrelated methodologies to monitor and understand subtle changes critical to safe use. Testing of COPVs at the NASA Johnson Space Center White Sands Test Facility (WSTF) has employed multiple in-situ, real-time nondestructive evaluation (NDE) methodologies as well as pre- and post-test comparative techniques to monitor changes in material and structural parameters during advanced pressurized testing. The use of NDE methodologies and their relationship to monitoring changes is discussed based on testing of real-world spacecraft COPVs. Lessons learned are used to present recommendations for testing, along with a discussion of potential applications to vessel health monitoring in future applications.

  5. Combining user logging with eye tracking for interactive and dynamic applications.

    PubMed

    Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise

    2015-12-01

    User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions cause changes in the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained in this particular position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking with user logging (mouse and keyboard actions) on cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking, namely marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and static stimuli in multiple research fields.
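
    The referencing idea can be sketched as follows: each logged viewport state (the geographic bounding box shown on screen at a gaze sample's timestamp) lets pixel coordinates be converted into geographic coordinates, so the fixated map object can be identified even after panning and zooming. This is a simplified linear sketch assuming an unprojected rectangular viewport; the function and parameter names are hypothetical, not from the paper.

```python
def gaze_to_geo(px, py, viewport, screen_w, screen_h):
    """Map a gaze sample's pixel (px, py) into the geographic bounding box
    currently shown on screen. viewport = (lon_min, lat_min, lon_max, lat_max);
    pixel y grows downward while latitude grows upward."""
    lon_min, lat_min, lon_max, lat_max = viewport
    lon = lon_min + (px / screen_w) * (lon_max - lon_min)
    lat = lat_max - (py / screen_h) * (lat_max - lat_min)
    return lon, lat

# Gaze at the screen centre while the logged viewport covers a sample extent:
lon, lat = gaze_to_geo(400, 300, viewport=(2.5, 49.5, 6.5, 51.5),
                       screen_w=800, screen_h=600)
```

    Re-running this conversion with whichever viewport was active at each timestamp is what allows gaze positions from many participants to be compared in one geographic frame.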

  6. Development of description framework of pharmacodynamics ontology and its application to possible drug-drug interaction reasoning.

    PubMed

    Imai, Takeshi; Hayakawa, Masayo; Ohe, Kazuhiko

    2013-01-01

    Prediction of synergistic or antagonistic effects of drug-drug interactions (DDIs) in vivo has been of considerable interest over the years. Formal representation of pharmacological knowledge, such as an ontology, is indispensable for machine reasoning about possible DDIs. However, current pharmacology knowledge bases are not sufficient to provide formal representation of DDI information. Against this background, this paper presents: (1) a description framework for a pharmacodynamics ontology; and (2) a methodology for utilizing the pharmacodynamics ontology to detect different types of possible DDI pairs, with supporting information such as the underlying pharmacodynamic mechanisms. We also evaluated our methodology on drugs related to the noradrenaline signal transduction process, and 11 different types of possible DDI pairs were detected. The main features of our methodology are its capability to explain the reason for a possible DDI and its ability to distinguish different types of DDIs. These features will be useful not only for providing supporting information to prescribers, but also for large-scale monitoring of drug safety.

  7. General transfer matrix formalism to calculate DNA-protein-drug binding in gene regulation: application to OR operator of phage lambda.

    PubMed

    Teif, Vladimir B

    2007-01-01

    The transfer matrix methodology is proposed as a systematic tool for the statistical-mechanical description of DNA-protein-drug binding involved in gene regulation. We show that a genetic system of several cis-regulatory modules is calculable using this method, considering explicitly the site-overlapping, competitive, cooperative binding of regulatory proteins, their multilayer assembly and DNA looping. In the methodological section, the matrix models are solved for the basic types of short- and long-range interactions between DNA-bound proteins, drugs and nucleosomes. We apply the matrix method to gene regulation at the O(R) operator of phage lambda. The transfer matrix formalism allowed the description of the lambda-switch at a single-nucleotide resolution, taking into account the effects of a range of inter-protein distances. Our calculations confirm previously established roles of the contact CI-Cro-RNAP interactions. Concerning long-range interactions, we show that while the DNA loop between the O(R) and O(L) operators is important at the lysogenic CI concentrations, the interference between the adjacent promoters P(R) and P(RM) becomes more important at small CI concentrations. A large change in the expression pattern may arise in this regime due to anticooperative interactions between DNA-bound RNA polymerases. The applicability of the matrix method to more complex systems is discussed.
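
    A minimal sketch of the transfer-matrix idea, assuming a 1D lattice whose sites are either empty or bound by a single protein species with nearest-neighbour cooperativity; the full method described above additionally treats site overlap, multiple ligand species, multilayer assembly and DNA looping.

```python
def partition_function(n_sites, K, omega):
    """Transfer-matrix partition function for a 1D lattice whose sites are
    empty (statistical weight 1) or protein-bound (weight K), with an extra
    cooperativity factor omega when two bound sites are adjacent."""
    v = [1.0, 0.0]  # weights conditioned on the previous site: (empty, bound)
    for _ in range(n_sites):
        empty = v[0] + v[1]                   # current site empty
        bound = v[0] * K + v[1] * K * omega   # current site bound
        v = [empty, bound]
    return v[0] + v[1]

# Sanity check: with omega = 1 the sites are independent, so Z = (1 + K)**N:
Z = partition_function(3, K=2.0, omega=1.0)
```

    For two sites with K = 1 and omega = 2, enumerating the four configurations (weights 1, 1, 1 and 2) gives Z = 5, which the recursion reproduces; occupancies and regulatory outputs then follow from derivatives or ratios of such partition functions.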

  8. General transfer matrix formalism to calculate DNA–protein–drug binding in gene regulation: application to OR operator of phage λ

    PubMed Central

    Teif, Vladimir B.

    2007-01-01

    The transfer matrix methodology is proposed as a systematic tool for the statistical–mechanical description of DNA–protein–drug binding involved in gene regulation. We show that a genetic system of several cis-regulatory modules is calculable using this method, considering explicitly the site-overlapping, competitive, cooperative binding of regulatory proteins, their multilayer assembly and DNA looping. In the methodological section, the matrix models are solved for the basic types of short- and long-range interactions between DNA-bound proteins, drugs and nucleosomes. We apply the matrix method to gene regulation at the OR operator of phage λ. The transfer matrix formalism allowed the description of the λ-switch at a single-nucleotide resolution, taking into account the effects of a range of inter-protein distances. Our calculations confirm previously established roles of the contact CI–Cro–RNAP interactions. Concerning long-range interactions, we show that while the DNA loop between the OR and OL operators is important at the lysogenic CI concentrations, the interference between the adjacent promoters PR and PRM becomes more important at small CI concentrations. A large change in the expression pattern may arise in this regime due to anticooperative interactions between DNA-bound RNA polymerases. The applicability of the matrix method to more complex systems is discussed. PMID:17526526

  9. Using Hidden Markov Models to characterise intermittent social behaviour in fish shoals

    NASA Astrophysics Data System (ADS)

    Bode, Nikolai W. F.; Seitz, Michael J.

    2018-02-01

    The movement of animals in groups is widespread in nature. Understanding this phenomenon presents an important problem in ecology with many applications that range from conservation to robotics. Underlying all group movements are interactions between individual animals and it is therefore crucial to understand the mechanisms of this social behaviour. To date, despite promising methodological developments, there are few applications to data of practical statistical techniques that inferentially investigate the extent and nature of social interactions in group movement. We address this gap by demonstrating the usefulness of a Hidden Markov Model approach to characterise individual-level social movement in published trajectory data on three-spined stickleback shoals (Gasterosteus aculeatus) and novel data on guppy shoals (Poecilia reticulata). With these models, we formally test for speed-mediated social interactions and verify that they are present. We further characterise this inferred social behaviour and find that despite the substantial shoal-level differences in movement dynamics between species, it is qualitatively similar in guppies and sticklebacks. It is intermittent, occurring in varying numbers of individuals at different time points. The speeds of interacting fish follow a bimodal distribution, indicating that they are either stationary or move at a preferred mean speed, and social fish with more social neighbours move at higher speeds, on average. Our findings and methodology present steps towards characterising social behaviour in animal groups.
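
    The core HMM machinery can be sketched with a toy two-state model, with 'slow'/'fast' hidden movement states standing in for the behavioural states inferred in the paper, and a forward pass computing the likelihood of an observed speed-class sequence. All states, symbols and probabilities below are illustrative, not the paper's fitted model.

```python
def forward(obs, start_p, trans_p, emit_p):
    """Forward algorithm: returns P(obs | HMM) by summing over hidden paths."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in start_p}
    for o in obs[1:]:
        alpha = {s: sum(alpha[r] * trans_p[r][s] for r in alpha) * emit_p[s][o]
                 for s in start_p}
    return sum(alpha.values())

# Toy parameters: a fish is in a 'slow' or 'fast' hidden state, and we observe
# a coarse speed class ('low'/'high') at each time step.
start = {"slow": 0.6, "fast": 0.4}
trans = {"slow": {"slow": 0.8, "fast": 0.2},
         "fast": {"slow": 0.3, "fast": 0.7}}
emit = {"slow": {"low": 0.9, "high": 0.1},
        "fast": {"low": 0.2, "high": 0.8}}
p = forward(["low", "high", "high"], start, trans, emit)
```

    Comparing such likelihoods between models whose emissions do or do not depend on neighbours' movement is one way an HMM framework can formally test for social interactions.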

  10. Heuristic decomposition for non-hierarchic systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.; Hajela, P.

    1991-01-01

    Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.

  11. Coarse-grained molecular dynamics simulations for giant protein-DNA complexes

    NASA Astrophysics Data System (ADS)

    Takada, Shoji

    Biomolecules are highly hierarchic and intrinsically flexible; thus, computational modeling calls for multi-scale methodologies. We have been developing a coarse-grained biomolecular model in which, on average, 10-20 atoms are grouped into one coarse-grained (CG) particle. Interactions among CG particles are tuned based on atomistic interactions and the fluctuation matching algorithm. CG molecular dynamics methods enable us to simulate much longer time-scale motions of much larger molecular systems than fully atomistic models. After broad sampling of structures with CG models, we can easily reconstruct atomistic models, from which one can continue conventional molecular dynamics simulations if desired. Here, we describe our CG modeling methodology for protein-DNA complexes, together with various biological applications, such as the DNA duplication initiation complex, model chromatins, and transcription factor dynamics in a chromatin-like environment.
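
    The grouping step can be sketched as follows, assuming one bead per atom group placed at the mass-weighted centre of its atoms; the atom names, masses and coordinates are illustrative, and the subsequent interaction-tuning (fluctuation matching) step is not shown.

```python
def coarse_grain(atoms, groups):
    """atoms: {name: (mass, (x, y, z))}; groups: lists of atom names.
    Returns one centre-of-mass position per group (one CG bead)."""
    beads = []
    for group in groups:
        m_tot = sum(atoms[name][0] for name in group)
        com = tuple(sum(atoms[name][0] * atoms[name][1][k] for name in group) / m_tot
                    for k in range(3))
        beads.append(com)
    return beads

# Illustrative backbone atoms of a single residue mapped onto one bead:
atoms = {"N": (14.0, (0.0, 0.0, 0.0)),
         "CA": (12.0, (1.0, 0.0, 0.0)),
         "C": (12.0, (2.0, 0.0, 0.0))}
beads = coarse_grain(atoms, [["N", "CA", "C"]])
```

    Running dynamics on the beads instead of the atoms is what buys the longer time scales; an atomistic structure can later be rebuilt around each bead for conventional simulation.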

  12. Practical aspects of protein co-evolution.

    PubMed

    Ochoa, David; Pazos, Florencio

    2014-01-01

    Co-evolution is a fundamental aspect of Evolutionary Theory. At the molecular level, co-evolutionary linkages between protein families have long been used as indicators of protein interactions and functional relationships. Due to the complexity of the problem and the amount of genomic data required for these approaches to achieve good performance, it took a relatively long time from the appearance of the first ideas and concepts to the quotidian application of these approaches and their incorporation into the standard toolboxes of bioinformaticians and molecular biologists. Today, these methodologies are mature (both in terms of performance and usability/implementation), and the genomic information that feeds them is large enough to allow their general application. This review summarizes the current landscape of co-evolution-based methodologies, with a strong emphasis on describing interesting cases where their application to important biological systems, alone or in combination with other computational and experimental approaches, has yielded new insight.

  13. Practical aspects of protein co-evolution

    PubMed Central

    Ochoa, David; Pazos, Florencio

    2014-01-01

    Co-evolution is a fundamental aspect of Evolutionary Theory. At the molecular level, co-evolutionary linkages between protein families have long been used as indicators of protein interactions and functional relationships. Due to the complexity of the problem and the amount of genomic data required for these approaches to achieve good performance, it took a relatively long time from the appearance of the first ideas and concepts to the quotidian application of these approaches and their incorporation into the standard toolboxes of bioinformaticians and molecular biologists. Today, these methodologies are mature (both in terms of performance and usability/implementation), and the genomic information that feeds them is large enough to allow their general application. This review summarizes the current landscape of co-evolution-based methodologies, with a strong emphasis on describing interesting cases where their application to important biological systems, alone or in combination with other computational and experimental approaches, has yielded new insight. PMID:25364721

  14. Application of atomic force microscopy as a nanotechnology tool in food science.

    PubMed

    Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng

    2007-05-01

    Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including its principles, operation, and analysis. Applications of AFM in food science and technology research are then reported, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interactions, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can provide insightful knowledge of food properties, and that AFM analysis can be used to elucidate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology, and the development of corresponding methodologies for complicated food systems, would lead to a more in-depth understanding of food properties at the macromolecular level and enlarge their applications. The AFM results could greatly improve food processing and storage technologies.

  15. Tracking problem solving by multivariate pattern analysis and Hidden Markov Model algorithms.

    PubMed

    Anderson, John R

    2012-03-01

    Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy at determining both what problem-solving step the students are taking and whether they are performing that step correctly. The second "model discovery" application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and these substates are associated with different fluency in algebra problem solving. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Canine neuroanatomy: Development of a 3D reconstruction and interactive application for undergraduate veterinary education

    PubMed Central

    Raffan, Hazel; Guevar, Julien; Poyade, Matthieu; Rea, Paul M.

    2017-01-01

    Current methods used to communicate and present the complex arrangement of vasculature related to the brain and spinal cord are limited in undergraduate veterinary neuroanatomy training. Traditionally, it is taught with 2-dimensional (2D) diagrams, photographs and medical imaging scans, which show a fixed viewpoint. 2D representations of 3-dimensional (3D) objects, however, lead to a loss of spatial information, which can present problems when translating this to the patient. Computer-assisted learning packages with interactive 3D anatomical models have become established in medical training, yet equivalent resources are scarce in veterinary education. For this reason, we set out to develop a workflow methodology for creating an interactive model depicting the vasculature of the canine brain that could be used in undergraduate education. Using MR images of a dog and several commonly available software programs, we show how combining image editing, segmentation and surface generation, 3D modeling and texturing can result in the creation of a fully interactive application for veterinary training. In addition to clearly identifying a workflow methodology for the creation of this dataset, we have also demonstrated how an interactive tutorial and self-assessment tool can be incorporated into it. In conclusion, we present a workflow which has been successful in developing a 3D reconstruction of the canine brain and associated vasculature through segmentation, surface generation and post-processing of readily available medical imaging data. The reconstructed model was implemented in an interactive application for veterinary education, designed to target the problems associated with learning neuroanatomy, primarily the inability to visualise complex spatial arrangements from 2D resources. The lack of similar resources in this field suggests this workflow is original within a veterinary context. There is great potential to explore this method and introduce a new dimension into veterinary education and training. PMID:28192461

  18. Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.

    1991-01-01

    The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.
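The closely coupled discipline interactions described here can be made concrete with a toy sketch. The two "analyses" below are invented linear stand-ins, not anything from the thesis; they show the fixed-point (Gauss-Seidel) resolution of coupling variables that any decomposition approach must account for.

```python
# Two hypothetical discipline analyses whose outputs feed each other.

def discipline_a(x, y2):
    # Invented analysis: coupling output y1 depends on design x and on y2.
    return 0.5 * x + 0.3 * y2

def discipline_b(x, y1):
    # Invented analysis: coupling output y2 depends on design x and on y1.
    return 0.2 * x + 0.4 * y1

def converge_coupling(x, tol=1e-10, max_iter=100):
    """Iterate the two analyses until the coupling variables stabilise."""
    y1, y2 = 0.0, 0.0
    for _ in range(max_iter):
        y1_new = discipline_a(x, y2)
        y2_new = discipline_b(x, y1_new)
        if abs(y1_new - y1) < tol and abs(y2_new - y2) < tol:
            return y1_new, y2_new
        y1, y2 = y1_new, y2_new
    raise RuntimeError("coupling did not converge")

y1, y2 = converge_coupling(x=1.0)
```

Decomposition methods like the one in this thesis aim to exploit the structure of exactly this kind of loop, rather than iterating it blindly.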

  19. Interactive Materials In The Teaching Of Astronomy

    NASA Astrophysics Data System (ADS)

    Macêdo, J. A.; Voelzke, M. R.

    2014-10-01

    This study presents the results of a survey conducted at the Federal Institute of Education, Science and Technology of the North of Minas Gerais (IFNMG), which aimed to investigate the potential of interactive materials in the teaching of astronomy. An advanced training course involving learning activities on basic concepts of astronomy was offered to thirty-two licentiate students in Physics, Mathematics and Biological Science. The following steps were taken: i) analysis of the pedagogical projects (PPC) of the licentiate programmes at the IFNMG Januária campus, the locus of the research; ii) analysis of students' preconceptions about astronomy and digital technologies, identified by the application of an initial questionnaire; iii) preparation of the course taking into account the students' previous knowledge; iv) application of the educational proposal, developed under a part-time presence modality, using various interactive tools; v) application and analysis of the final questionnaire. The study was conducted with a combined qualitative and quantitative methodology, together with content analysis. The results indicated that at the IFNMG only the licentiate course in Physics includes astronomy content, diluted across various subjects of the curriculum; that students' prior knowledge of astronomy was limited; and that there was evidence of meaningful learning of concepts related to astronomy and of the viability of resources involving digital technologies in the teaching of astronomy, which may contribute to broadening the methodological options of future teachers and to meeting their training needs.

  20. Interaction Dynamics Between a Flexible Rotor and an Auxiliary Clearance Bearing

    NASA Technical Reports Server (NTRS)

    Lawen, James L., Jr.; Flowers, George T.

    1996-01-01

    This study investigates the application of synchronous interaction dynamics methodology to the design of auxiliary bearing systems. The technique is applied to a flexible rotor system, and comparisons are made between the behavior predicted by this analysis method and the observed simulation response characteristics. Of particular interest is the influence of coupled shaft/bearing vibration modes on rotordynamical behavior. Experimental studies are also performed to validate the simulation results and provide insight into the expected behavior of such a system.

  1. On the Pressure of a Neutron Gas Interacting with the Non-Uniform Magnetic Field of a Neutron Star

    NASA Astrophysics Data System (ADS)

    Skobelev, V. V.

    2018-04-01

    On the basis of simple arguments, practically not going beyond the scope of an undergraduate course in general physics, we estimate the additional pressure (at zero temperature) of degenerate neutron matter due to its interaction with the non-uniform magnetic field of a neutron star. This work has methodological and possibly scientific value as an intuitive application of the content of such a course to a solution of topical problems of astrophysics.

  2. Calculating the habitable zones of multiple star systems with a new interactive Web site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Müller, Tobias W. A.; Haghighipour, Nader

    We have developed a comprehensive methodology and an interactive Web site for calculating the habitable zone (HZ) of multiple star systems. Using the concept of spectral weight factor, as introduced in our previous studies of HZ calculations in and around binary star systems, we calculate the contribution of each star (based on its spectral energy distribution) to the total flux received at the top of the atmosphere of an Earth-like planet, and use the models of the HZ of the Sun to determine the boundaries of the HZ in multiple star systems. Our interactive Web site for carrying out these calculations is publicly available at http://astro.twam.info/hz. We discuss the details of our methodology and present its application to some of the multiple star systems detected by the Kepler space telescope. We also present the instructions for using our interactive Web site, and demonstrate its capabilities by calculating the HZ for two interesting analytical solutions of the three-body problem.
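A minimal sketch of the flux-weighting idea: each star's flux at the planet is scaled by a spectral weight and summed, and the planet sits in the HZ when the total falls between inner and outer flux limits. All numbers below (luminosities, weights, limits) are invented for illustration; the actual spectral weight factors and solar HZ models are those of the paper and the Web site.

```python
# Flux in units of the flux Earth receives from the Sun; distances in AU.

def weighted_flux(planet_xy, stars):
    """stars: list of (x, y, luminosity_solar, spectral_weight)."""
    px, py = planet_xy
    total = 0.0
    for sx, sy, lum, weight in stars:
        d2 = (px - sx) ** 2 + (py - sy) ** 2
        total += weight * lum / d2
    return total

# Hypothetical binary: two Sun-like stars 0.5 AU apart, weights ~1.
stars = [(-0.25, 0.0, 1.0, 1.0), (0.25, 0.0, 1.0, 1.0)]
INNER, OUTER = 1.1, 0.36   # assumed effective solar HZ flux limits

flux = weighted_flux((1.5, 0.0), stars)
in_hz = OUTER <= flux <= INNER
```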

  3. Detection of differential viewing patterns to erotic and non-erotic stimuli using eye-tracking methodology.

    PubMed

    Lykins, Amy D; Meana, Marta; Kambe, Gretchen

    2006-10-01

    As a first step in the investigation of the role of visual attention in the processing of erotic stimuli, eye-tracking methodology was employed to measure eye movements during erotic scene presentation. Because eye-tracking is a novel methodology in sexuality research, we attempted to determine whether the eye-tracker could detect differences (should they exist) in visual attention to erotic and non-erotic scenes. A total of 20 men and 20 women were presented with a series of erotic and non-erotic images, and their eye movements were tracked during image presentation. Comparisons between erotic and non-erotic image groups showed significant differences on two of three dependent measures of visual attention (number of fixations and total time) in both men and women. As hypothesized, there was a significant Stimulus x Scene Region interaction, indicating that participants visually attended to the body more in the erotic stimuli than in the non-erotic stimuli, as evidenced by a greater number of fixations and longer total time devoted to that region. These findings provide support for the application of eye-tracking methodology as a measure of visual attentional capture in sexuality research. Future applications of this methodology to expand our knowledge of the role of cognition in sexuality are suggested.
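The dependent measures lend themselves to a simple aggregation. The sketch below uses invented fixation counts, not the study's data, purely to show how a Stimulus x Scene Region interaction would surface as a larger body-region advantage for one stimulus type.

```python
from collections import defaultdict

records = [
    # (stimulus, region, fixation_count) -- invented observations
    ("erotic", "body", 14), ("erotic", "body", 11), ("erotic", "face", 6),
    ("non-erotic", "body", 7), ("non-erotic", "body", 5), ("non-erotic", "face", 8),
]

def mean_fixations(records):
    """Mean fixation count per (stimulus, region) cell."""
    sums, counts = defaultdict(float), defaultdict(int)
    for stimulus, region, n in records:
        sums[(stimulus, region)] += n
        counts[(stimulus, region)] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

means = mean_fixations(records)
# The interaction appears as a body-region advantage specific to erotic scenes:
body_advantage = means[("erotic", "body")] - means[("non-erotic", "body")]
```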

  4. Applying a contemporary grounded theory methodology.

    PubMed

    Licqurish, Sharon; Seibold, Carmel

    2011-01-01

    The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1960s, but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, the post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.

  5. Air-coupled laser vibrometry: analysis and applications.

    PubMed

    Solodov, Igor; Döring, Daniel; Busse, Gerd

    2009-03-01

    Acousto-optic interaction between a narrow laser beam and acoustic waves in air is analyzed theoretically. The photoelastic relation in air is used to derive the phase modulation of laser light in air-coupled reflection vibrometry induced by the angular spatial spectral components comprising the acoustic beam. Maximum interaction was found for the zero spatial acoustic component propagating normal to the laser beam. The angular dependence of the imaging efficiency is determined for the axial and nonaxial acoustic components with regard to the laser beam steering in the scanning mode. The sensitivity of air-coupled vibrometry is compared with conventional "Doppler" reflection vibrometry. Applications of the methodology for visualization of linear and nonlinear air-coupled fields are demonstrated.

  6. On the Comparison of the Long Penetration Mode (LPM) Supersonic Counterflowing Jet to the Supersonic Screech Jet

    NASA Technical Reports Server (NTRS)

    Farr, Rebecca A.; Chang, Chau-Lyan; Jones, Jess H.; Dougherty, N. Sam

    2015-01-01

    Topics include: classic tonal screech noise created by under-expanded supersonic jets; the Long Penetration Mode (LPM) supersonic phenomenon (an under-expanded counter-flowing jet in a supersonic free stream, demonstrated in several wind tunnel tests and modeled in several computational fluid dynamics (CFD) simulations); discussion of LPM acoustic feedback and fluid interactions, analogous to the aero-acoustic interactions seen in screech jets; lessons learned in applying methodologies developed and successfully demonstrated in the study of screech jets to LPM, with discussion of mechanically induced excitation in fluid oscillators in general; and the conclusion that the large body of work done on jet screech and other aero-acoustic phenomena can have direct application to the study and applications of LPM cold-flow jets.

  7. Recent advances in jointed quantum mechanics and molecular mechanics calculations of biological macromolecules: schemes and applications coupled to ab initio calculations.

    PubMed

    Hagiwara, Yohsuke; Tateno, Masaru

    2010-10-20

    We review the recent research on the functional mechanisms of biological macromolecules using theoretical methodologies coupled to ab initio quantum mechanical (QM) treatments of reaction centers in proteins and nucleic acids. Since in most cases such biological molecules are large, the computational costs of performing ab initio calculations for the entire structures are prohibitive. Instead, simulations that are jointed with molecular mechanics (MM) calculations are crucial to evaluate the long-range electrostatic interactions, which significantly affect the electronic structures of biological macromolecules. Thus, we focus our attention on the methodologies/schemes and applications of jointed QM/MM calculations, and discuss the critical issues to be elucidated in biological macromolecular systems. © 2010 IOP Publishing Ltd

  8. Multi-scale structural community organisation of the human genome.

    PubMed

    Boulos, Rasha E; Tremblay, Nicolas; Arneodo, Alain; Borgnat, Pierre; Audit, Benjamin

    2017-04-11

    Structural interaction frequency matrices between all genome loci are now experimentally achievable thanks to high-throughput chromosome conformation capture technologies. This poses a new methodological challenge for computational biology, which consists in objectively extracting from these data the structural motifs characteristic of genome organisation. We deployed the fast multi-scale community mining algorithm based on spectral graph wavelets to characterise the networks of intra-chromosomal interactions in human cell lines. We observed that there exist structural domains of all sizes up to chromosome length and demonstrated that the set of structural communities forms a hierarchy of chromosome segments. Hence, at all scales, chromosome folding predominantly involves interactions between neighbouring sites rather than the formation of links between distant loci. Multi-scale structural decomposition of human chromosomes provides an original framework to question structural organisation and its relationship to functional regulation across the scales. By construction the proposed methodology is independent of the precise assembly of the reference genome and is thus directly applicable to genomes whose assembly is not fully determined.
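A toy stand-in for the multi-scale idea: the paper's actual algorithm uses spectral graph wavelets, but a simple threshold on adjacent-bin contact strengths (invented values below) already illustrates how varying the scale parameter yields nested domains forming a hierarchy.

```python
def segment(contacts, threshold):
    """contacts[i] = interaction strength between bin i and bin i+1.
    Cut the chromosome wherever the adjacent-bin contact is weak."""
    domains, start = [], 0
    for i, strength in enumerate(contacts):
        if strength < threshold:      # weak boundary: cut here
            domains.append((start, i))
            start = i + 1
    domains.append((start, len(contacts)))
    return domains

# Invented adjacent-bin contact profile with strong and weak boundaries.
contacts = [9, 8, 1, 9, 9, 4, 9, 2, 8, 9]

fine   = segment(contacts, threshold=5)   # cuts at every moderate boundary
coarse = segment(contacts, threshold=2)   # only the weakest boundary cut
# Every coarse domain is a union of fine domains: a hierarchy of segments.
```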

  9. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    NASA Astrophysics Data System (ADS)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.
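The reduction to a transportation problem can be sketched in miniature. This single-commodity instance (invented supplies, demands and costs) ignores the multicommodity and isotopic-quality aspects the DRE actually handles; it only shows the shape of the linear program being solved.

```python
from scipy.optimize import linprog

# Two suppliers, two consumers; decision variables are the four flows
# x11, x12, x21, x22 (supplier i -> consumer j), minimised by unit cost.
cost = [1, 2, 3, 1]
A_eq = [
    [1, 1, 0, 0],        # supplier 1 ships exactly its supply
    [0, 0, 1, 1],        # supplier 2 ships exactly its supply
    [1, 0, 1, 0],        # consumer 1 receives exactly its demand
    [0, 1, 0, 1],        # consumer 2 receives exactly its demand
]
b_eq = [5, 5, 4, 6]      # supplies then demands (balanced: 10 = 10)

result = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
# result.x is the optimal shipment plan; result.fun its total cost.
```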

  10. A recommended workflow methodology in the creation of an educational and training application incorporating a digital reconstruction of the cerebral ventricular system and cerebrospinal fluid circulation to aid anatomical understanding.

    PubMed

    Manson, Amy; Poyade, Matthieu; Rea, Paul

    2015-10-19

    The use of computer-aided learning in education can be advantageous, especially when interactive three-dimensional (3D) models are used to aid learning of complex 3D structures. The anatomy of the ventricular system of the brain is difficult to fully understand as it is seldom seen in 3D, as is the flow of cerebrospinal fluid (CSF). This article outlines a workflow for the creation of an interactive training tool for the cerebral ventricular system, an educationally challenging area of anatomy. This outline is based on the use of widely available computer software packages. Using MR images of the cerebral ventricular system and several widely available commercial and free software packages, the techniques of 3D modelling, texturing, sculpting, image editing and animation were combined to create a workflow for the creation of an interactive educational and training tool. This was focussed on cerebral ventricular system anatomy and the flow of cerebrospinal fluid. We have successfully created a robust methodology by using key software packages in the creation of an interactive education and training tool. This has resulted in an application which details the anatomy of the ventricular system, and the flow of cerebrospinal fluid, using an anatomically accurate 3D model. In addition, our established workflow pattern presented here also shows how tutorials, animations and self-assessment tools can be embedded into the training application. Through our creation of an established workflow for generating educational and training material demonstrating cerebral ventricular anatomy and the flow of cerebrospinal fluid, this approach has enormous potential to be adopted into student training in this field. With the digital age advancing rapidly, it has the potential to be used as an innovative tool alongside other methodologies for the training of future healthcare practitioners and scientists. This workflow could be used in the creation of other tools, which could be developed for use not only on desktop and laptop computers but also on smartphones, tablets and fully immersive stereoscopic environments. It could also form the basis on which to build surgical simulations enhanced with haptic interaction.

  11. Methodology to explore emergent behaviours of the interactions between water resources and ecosystem under a pluralistic approach

    NASA Astrophysics Data System (ADS)

    García-Santos, Glenda; Madruga de Brito, Mariana; Höllermann, Britta; Taft, Linda; Almoradie, Adrian; Evers, Mariele

    2018-06-01

    Understanding the interactions between water resources and their social dimensions is crucial for effective and sustainable water management. The identification of sensitive control variables and feedback loops of a specific human-hydro-scape can enhance knowledge about the potential factors and/or agents leading to the current water resources and ecosystems situation, which in turn supports the decision-making process towards desirable futures. Our study presents the utility of a system dynamics modeling approach for water management and decision-making for the case of a forest ecosystem under risk of wildfires. We use the pluralistic water research concept to explore different scenarios and simulate the emergent behaviour of water interception and net precipitation after a wildfire in a forest ecosystem. Through a case study, we illustrate the applicability of this new methodology.
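A minimal system-dynamics sketch in this spirit, with a single invented stock (canopy cover), a wildfire shock and made-up parameters; it is not the authors' model. Interception scales with canopy, so net precipitation jumps after the fire and declines as the canopy recovers.

```python
def simulate(years, fire_year, rain=1000.0, regrowth=0.3, max_interception=0.25):
    """Euler integration of one stock (canopy cover fraction) per year."""
    canopy = 1.0
    net_precip = []
    for year in range(years):
        if year == fire_year:
            canopy = 0.1                                  # fire removes most canopy
        canopy += regrowth * canopy * (1.0 - canopy)      # logistic regrowth flow
        interception = max_interception * canopy * rain   # mm/year intercepted
        net_precip.append(rain - interception)            # mm/year reaching soil
    return net_precip

series = simulate(years=20, fire_year=5)
# Net precipitation spikes right after the fire, then relaxes back as the
# canopy (and hence interception) recovers -- the emergent behaviour of the loop.
```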

  12. A New Method to Study Heterodimerization of Membrane Proteins and Its Application to Fibroblast Growth Factor Receptors*

    PubMed Central

    Del Piccolo, Nuala; Sarabipour, Sarvenaz; Hristova, Kalina

    2017-01-01

    The activity of receptor tyrosine kinases (RTKs) is controlled through their lateral association in the plasma membrane. RTKs are believed to form both homodimers and heterodimers, and the different dimers are believed to play unique roles in cell signaling. However, RTK heterodimers remain poorly characterized, as compared with homodimers, because of limitations in current experimental methods. Here, we develop a FRET-based methodology to assess the thermodynamics of hetero-interactions in the plasma membrane. To demonstrate the utility of the methodology, we use it to study the hetero-interactions between three fibroblast growth factor receptors—FGFR1, FGFR2, and FGFR3—in the absence of ligand. Our results show that all possible FGFR heterodimers form, suggesting that the biological roles of FGFR heterodimers may be as significant as the homodimer roles. We further investigate the effect of two pathogenic point mutations in FGFR3 (A391E and G380R) on heterodimerization. We show that each of these mutations stabilize most of the heterodimers, with the largest effects observed for FGFR3 wild-type/mutant heterodimers. We thus demonstrate that the methodology presented here can yield new knowledge about RTK interactions and can further our understanding of signal transduction across the plasma membrane. PMID:27927983
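The thermodynamic quantity at stake can be illustrated with generic mass-action algebra (not the paper's FRET analysis): for a monomer-dimer equilibrium 2M &lt;-&gt; D with dissociation constant Kd = [M]^2 / [D], the dimeric fraction at total concentration T follows from the quadratic T = M + 2M^2/Kd. A dimer-stabilising mutation corresponds to a lower Kd.

```python
import math

def dimeric_fraction(total, kd):
    """Fraction of molecules in dimers for 2M <-> D with Kd = M^2 / D."""
    # Solve 2M^2/Kd + M - T = 0 for the free monomer concentration M.
    monomer = (-kd + math.sqrt(kd * kd + 8.0 * kd * total)) / 4.0
    dimer = monomer * monomer / kd
    return 2.0 * dimer / total

# Invented concentrations/constants, chosen only to show the trend:
wild_type = dimeric_fraction(total=1.0, kd=1.0)
mutant    = dimeric_fraction(total=1.0, kd=0.1)   # stabilised dimer: lower Kd
# The stabilised variant spends a larger fraction of molecules in dimers.
```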

  13. Defined surface immobilization of glycosaminoglycan molecules for probing and modulation of cell-material interactions.

    PubMed

    Wang, Kai; Luo, Ying

    2013-07-08

    As one important category of biological molecules on the cell surface and in the extracellular matrix (ECM), glycosaminoglycans (GAGs) have been widely studied for biomedical applications. With the understanding that the biological functions of GAGs are driven by the complex dynamics of physiological and pathological processes, methodologies are desired to allow the elucidation of cell-GAG interactions with molecular level precision. In this study, a microtiter plate-based system was devised through a new surface modification strategy involving polydopamine (PDA) and GAG molecules functionalized with hydrazide chemical groups. A small library of GAGs including hyaluronic acid (with different molecular weights), heparin, and chondroitin sulfate was successfully immobilized via defined binding sites onto the microtiter plate surface under facile aqueous conditions. The methodology then allowed parallel studies of the GAG-modified surfaces in a high-throughput format. The results show that immobilized GAGs possess distinct properties to mediate protein adsorption, cell adhesion, and inflammatory responses, with each property showing dependence on the type and molecular weight of specific GAG molecules. The PDA-assisted immobilization of hydrazide-functionalized GAGs allows biomimetic attachment of GAG molecules and retains their bioactivity, providing a new methodology to systematically probe fundamental cell-GAG interactions to modulate the bioactivity and biocompatibility of biomaterials.

  14. The case for multimodal analysis of atypical interaction: questions, answers and gaze in play involving a child with autism.

    PubMed

    Muskett, Tom; Body, Richard

    2013-01-01

    Conversation analysis (CA) continues to accrue interest within clinical linguistics as a methodology that can enable elucidation of structural and sequential orderliness in interactions involving participants who produce ostensibly disordered communication behaviours. However, it can be challenging to apply CA to re-examine clinical phenomena that have initially been defined in terms of linguistics, as a logical starting point for analysis may be to focus primarily on the organisation of language ("talk") in such interactions. In this article, we argue that CA's methodological power can only be fully exploited in this research context when a multimodal analytic orientation is adopted, where due consideration is given to participants' co-ordinated use of multiple semiotic resources including, but not limited to, talk (e.g., gaze, embodied action, object use and so forth). To evidence this argument, a two-layered analysis of unusual question-answer sequences in a play episode involving a child with autism is presented. It is thereby demonstrated that only when the scope of enquiry is broadened to include gaze and other embodied action can an account be generated of orderliness within these sequences. This finding has important implications for CA's application as a research methodology within clinical linguistics.

  15. Games network and application to PAs system.

    PubMed

    Chettaoui, C; Delaplace, F; Manceny, M; Malo, M

    2007-02-01

    In this article, we present a game-theory-based framework, named games network, for modeling biological interactions. After introducing the theory, we describe more precisely the methodology for modeling biological interactions. We then apply it to the plasminogen activator system (PAs), a signal transduction pathway involved in cancer cell migration. The games network theory extends game theory by including the locality of interactions. Each game in a games network represents local interactions between biological agents. The PAs system is implicated in cytoskeleton modifications via regulation of actin and microtubules, which in turn favour cell migration. The games network model has given us a better understanding of the regulation involved in the PAs system.

  16. Design, Synthesis, and Evaluation of N- and C-Terminal Protein Bioconjugates as G Protein-Coupled Receptor Agonists.

    PubMed

    Healey, Robert D; Wojciechowski, Jonathan P; Monserrat-Martinez, Ana; Tan, Susan L; Marquis, Christopher P; Sierecki, Emma; Gambin, Yann; Finch, Angela M; Thordarson, Pall

    2018-02-21

    A G protein-coupled receptor (GPCR) agonist protein, thaumatin, was site-specifically conjugated at the N- or C-terminus with a fluorophore for visualization of GPCR:agonist interactions. The N-terminus was specifically conjugated using a synthetic 2-pyridinecarboxaldehyde reagent. The interaction profiles observed for N- and C-terminal conjugates differed: N-terminal conjugates interacted very weakly with the GPCR of interest, whereas C-terminal conjugates bound to the receptor. These chemical biology tools allow therapeutic protein:GPCR interactions to be monitored and visualized. The methodology used for site-specific bioconjugation represents an advance in the application of 2-pyridinecarboxaldehydes for N-terminal-specific bioconjugations.

  17. Tracking Problem Solving by Multivariate Pattern Analysis and Hidden Markov Model Algorithms

    ERIC Educational Resources Information Center

    Anderson, John R.

    2012-01-01

    Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application…
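The combination can be sketched with a toy hidden Markov model; the states, labels and all probabilities below are invented. The hidden states stand for phases of problem solving, the observations for per-scan labels from a multivariate pattern classifier, and the forward algorithm yields second-by-second beliefs over the phases.

```python
STATES = ["encode", "compute", "respond"]
INIT = [0.8, 0.1, 0.1]
TRANS = [[0.6, 0.3, 0.1],          # row: from-state, column: to-state
         [0.1, 0.7, 0.2],
         [0.1, 0.2, 0.7]]
EMIT = [[0.7, 0.2, 0.1],           # P(classifier label | hidden state)
        [0.2, 0.6, 0.2],
        [0.1, 0.2, 0.7]]

def normalise(v):
    total = sum(v)
    return [x / total for x in v]

def forward(observations):
    """Return the normalised belief over states after each observation."""
    alpha = [INIT[s] * EMIT[s][observations[0]] for s in range(3)]
    beliefs = [normalise(alpha)]
    for obs in observations[1:]:
        alpha = [sum(alpha[p] * TRANS[p][s] for p in range(3)) * EMIT[s][obs]
                 for s in range(3)]
        beliefs.append(normalise(alpha))
    return beliefs

beliefs = forward([0, 1, 1, 2])    # invented label sequence from the classifier
# The argmax of each belief vector tracks the inferred phase over time.
```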

  18. The application of ab initio calculations to molecular spectroscopy

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.

    1989-01-01

    The state of the art in ab initio molecular structure calculations is reviewed with an emphasis on recent developments, such as full configuration-interaction benchmark calculations and atomic natural orbital basis sets. It is found that new developments in methodology, combined with improvements in computer hardware, are leading to unprecedented accuracy in solving problems in spectroscopy.

  20. Crystalline bipyridinium radical complexes and uses thereof

    DOEpatents

    Fahrenbach, Albert C.; Barnes, Jonathan C.; Li, Hao; Stoddart, J. Fraser; Basuray, Ashish Neil; Sampath, Srinivasan

    2015-09-01

    Described herein are methods of generating 4,4'-bipyridinium radical cations (BIPY•+), and methods for utilizing the radical-radical interactions that ensue between two or more BIPY•+ radical cations for the creation of novel materials for applications in nanotechnology. Synthetic methodologies, crystallographic engineering techniques, methods of physical characterization, and end uses are described.

  1. Preparation for Instruction. A Module of Instruction in Teacher Education. Prepared for Project RAFT.

    ERIC Educational Resources Information Center

    Handley, Herbert M., Ed.

    This module, developed by the Research Applications for Teaching (RAFT) project, was written to assist students to write lesson plans that are effective and interactive. Students are given directions for the preparation of behavioral objectives and for the selection of appropriate instructional methodologies to meet the widely varying needs of…

  2. Perspective on nanoparticle technology for biomedical use

    PubMed Central

    Raliya, Ramesh; Chadha, Tandeep Singh; Hadad, Kelsey; Biswas, Pratim

    2016-01-01

    This review gives a short overview of the widespread use of nanostructured and nanocomposite materials for disease diagnostics, drug delivery, imaging and biomedical sensing applications. Nanoparticle interaction with a biological matrix/entity is greatly influenced by the particle's morphology, crystal phase, surface chemistry, functionalization, and physicochemical and electronic properties. Various nanoparticle synthesis routes, characterization, and functionalization methodologies to be used for biomedical applications ranging from drug delivery to molecular probing of underlying mechanisms and concepts are described with several examples (150 references). PMID:26951098

  3. A Framework for Spatial Interaction Analysis Based on Large-Scale Mobile Phone Data

    PubMed Central

    Li, Weifeng; Cheng, Xiaoyun; Guo, Gaohua

    2014-01-01

    An overall understanding of spatial interaction and exact knowledge of its dynamic evolution are required in urban and transportation planning. This study aimed to analyze spatial interaction based on large-scale mobile phone data. The newly arisen mass dataset required a new methodology compatible with its peculiar characteristics. A three-stage framework was proposed in this paper, including data preprocessing, critical activity identification, and spatial interaction measurement. The proposed framework introduced frequent pattern mining and measured spatial interaction by the obtained associations. A case study of three communities in Shanghai was carried out as a verification of the proposed method and a demonstration of its practical application. The spatial interaction patterns and the representative features proved the rationality of the proposed framework. PMID:25435865
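The abstract does not give the mining algorithm itself, but the core idea of frequent pattern mining here (counting zone pairs that frequently co-occur in one user's activity records and taking the counts as interaction strength) can be sketched as follows. All zone names and records below are hypothetical, purely for illustration:

```python
from collections import Counter
from itertools import combinations

def pairwise_interactions(records):
    """Count how often each pair of zones co-occurs in a single user-day
    activity set -- a toy stand-in for frequent-pattern mining on
    tower-log data."""
    pair_counts = Counter()
    for zones in records:  # one list of visited zones per user-day
        for pair in combinations(sorted(set(zones)), 2):
            pair_counts[pair] += 1
    return pair_counts

# hypothetical user-day records: zones visited, inferred from tower logs
records = [
    ["CBD", "Residential-A"],
    ["CBD", "Residential-A", "Mall"],
    ["CBD", "Residential-B"],
    ["Residential-A", "Mall"],
]
counts = pairwise_interactions(records)
strongest = counts.most_common(1)[0]
print(strongest)  # (('CBD', 'Residential-A'), 2)
```

In a full pipeline the support counts would be thresholded (the "critical activity identification" stage) before being read as spatial interaction measures.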

  4. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    This report presents the results of a fourth year effort of a research program, conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typically used in aerospace propulsion system components. Material data for Inconel 718 has been analyzed using the developed methodology.
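The abstract does not reproduce the multifactor interaction equation itself; in the published Boyce-Chamis form from the NASA probabilistic strength literature (the symbols here follow that literature and are an assumption, not a quotation from this report), a typical statement is:

```latex
\frac{S}{S_0} \;=\; \prod_{i=1}^{n} \left( \frac{A_{iF} - A_i}{A_{iF} - A_{i0}} \right)^{a_i}
```

where S is the current strength, S_0 a reference strength, A_i the current value of the i-th primitive variable (e.g., temperature), A_{i0} and A_{iF} its reference and ultimate values, and a_i an empirical exponent; PROMISS randomizes the variables while the regression step (PROMISC) fits the exponents a_i from experimental data.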

  5. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep, and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    The results of a fourth year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA) are presented. The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation was randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typically used in aerospace propulsion system components. Material data for Inconel 718 was analyzed using the developed methodology.

  6. Recent Developments and Applications of the MMPBSA Method

    PubMed Central

    Wang, Changhao; Greene, D'Artagnan; Xiao, Li; Qi, Ruxi; Luo, Ray

    2018-01-01

    The Molecular Mechanics Poisson-Boltzmann Surface Area (MMPBSA) approach has been widely applied as an efficient and reliable free energy simulation method to model molecular recognition, such as for protein-ligand binding interactions. In this review, we focus on recent developments and applications of the MMPBSA method. The methodology review covers solvation terms, the entropy term, extensions to membrane proteins and high-speed screening, and new automation toolkits. Recent applications in various important biomedical and chemical fields are also reviewed. We conclude with a few future directions aimed at making MMPBSA a more robust and efficient method. PMID:29367919
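The free-energy decomposition that MMPBSA rests on can be stated compactly (this is the standard textbook form of the method, not a quotation from the review):

```latex
\Delta G_{\mathrm{bind}} \;\approx\; \left\langle \Delta E_{\mathrm{MM}} \right\rangle + \left\langle \Delta G_{\mathrm{PB}} \right\rangle + \left\langle \Delta G_{\mathrm{np}} \right\rangle - T\,\Delta S
```

where the angle brackets denote averages over snapshots from molecular dynamics, ΔE_MM is the gas-phase molecular-mechanics energy (internal, electrostatic, and van der Waals terms), ΔG_PB the polar solvation free energy from the Poisson-Boltzmann equation, ΔG_np the nonpolar solvation term (typically fitted to solvent-accessible surface area), and -TΔS the solute entropy change, the term the review singles out for methodological attention.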

  7. Determination of Quantum Chemistry Based Force Fields for Molecular Dynamics Simulations of Aromatic Polymers

    NASA Technical Reports Server (NTRS)

    Jaffe, Richard; Langhoff, Stephen R. (Technical Monitor)

    1995-01-01

    Ab initio quantum chemistry calculations for model molecules can be used to parameterize force fields for molecular dynamics simulations of polymers. Emphasis in our research group is on using quantum chemistry-based force fields for molecular dynamics simulations of organic polymers in the melt and glassy states, but the methodology is applicable to simulations of small molecules, multicomponent systems and solutions. Special attention is paid to deriving reliable descriptions of the non-bonded and electrostatic interactions. Several procedures have been developed for deriving and calibrating these parameters. Our force fields for aromatic polyimide simulations will be described. In this application, the intermolecular interactions are the critical factor in determining many properties of the polymer (including its color).

  8. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research; 'écouter' is French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available freeware mind-mapping software. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  9. Digital Literacy Integration in Educational Practice: Creating a Learning Community, through a Geographic Project in Mytilene Senior High School, Greece

    ERIC Educational Resources Information Center

    Exarchou, Evi; Klonari, Aikaterini; Lambrinos, Nikos; Vaitis, Michalis

    2017-01-01

    This study focused on the analysis of Grade-12 (Senior) students' sociocultural constructivist interactions using Web 2.0 applications during a geographical research process. In the study methodology context, a transdisciplinary case study (TdCS) with ethnographic and research action data was designed, implemented and analyzed in real teaching…

  10. Synthesis Guidebook. Volume 1. Methodology Definition

    DTIC Science & Technology

    1992-10-16

    OV I. Introduction. "Synthesis Transition Strategies" (Williams 1990b) discusses a strategy for the incremental transitioning of Synthesis into... strategies for mitigating those risks. Use checkpoints, reviews, and metrics to reveal flaws and misconceptions. 4. Interaction With Other Activities... markets and customer requirements lead to evolving product and process needs of Application Engineering projects. OV.2. Fundamentals of Synthesis

  11. Monitoring Ligand-Activated Protein-Protein Interactions Using Bioluminescent Resonance Energy Transfer (BRET) Assay.

    PubMed

    Coriano, Carlos; Powell, Emily; Xu, Wei

    2016-01-01

    The bioluminescent resonance energy transfer (BRET) assay has been extensively used in cell-based and in vivo imaging systems for detecting protein-protein interactions in the native environment of living cells. These protein-protein interactions are essential for the functional response of many signaling pathways to environmental chemicals. BRET has been used as a toxicological tool for identifying chemicals that either induce or inhibit these protein-protein interactions. This chapter focuses on describing the toxicological applications of BRET and its optimization as a high-throughput detection system in live cells. Here we review the construction of BRET fusion proteins, describe the BRET methodology, and outline strategies to overcome obstacles that may arise. Furthermore, we describe the advantage of BRET over other resonance energy transfer methods for monitoring protein-protein interactions.
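As a rough illustration of the readout this assay produces, the BRET signal is commonly reported as a background-corrected acceptor/donor emission ratio. The sketch below uses hypothetical plate-reader counts and a donor-only control; the exact filter wavelengths and correction scheme depend on the BRET variant and donor/acceptor pair, which this chapter's protocol would specify:

```python
def bret_ratio(acceptor_em, donor_em, ctrl_acceptor_em, ctrl_donor_em):
    """Background-corrected BRET ratio: the acceptor/donor emission ratio
    of the sample minus the same ratio from a donor-only control, which
    removes donor emission bleed-through into the acceptor channel."""
    return acceptor_em / donor_em - ctrl_acceptor_em / ctrl_donor_em

# hypothetical plate-reader counts
ratio = bret_ratio(acceptor_em=5200, donor_em=40000,
                   ctrl_acceptor_em=1200, ctrl_donor_em=40000)
print(round(ratio, 3))  # 0.1
```

A chemical that disrupts the protein-protein interaction would push this corrected ratio toward zero, which is the basis of the high-throughput screen described above.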

  12. Improved Discovery of Molecular Interactions in Genome-Scale Data with Adaptive Model-Based Normalization

    PubMed Central

    Brown, Patrick O.

    2013-01-01

    Background High throughput molecular-interaction studies using immunoprecipitations (IP) or affinity purifications are powerful and widely used in biology research. One of many important applications of this method is to identify the set of RNAs that interact with a particular RNA-binding protein (RBP). Here, the unique statistical challenge presented is to delineate a specific set of RNAs that are enriched in one sample relative to another, typically a specific IP compared to a non-specific control to model background. The choice of normalization procedure critically impacts the number of RNAs that will be identified as interacting with an RBP at a given significance threshold – yet existing normalization methods make assumptions that are often fundamentally inaccurate when applied to IP enrichment data. Methods In this paper, we present a new normalization methodology that is specifically designed for identifying enriched RNA or DNA sequences in an IP. The normalization (called adaptive or AD normalization) uses a basic model of the IP experiment and is not a variant of mean, quantile, or other methodology previously proposed. The approach is evaluated statistically and tested with simulated and empirical data. Results and Conclusions The adaptive (AD) normalization method results in a greatly increased range in the number of enriched RNAs identified, fewer false positives, and overall better concordance with independent biological evidence, for the RBPs we analyzed, compared to median normalization. The approach is also applicable to the study of pairwise RNA, DNA and protein interactions such as the analysis of transcription factors via chromatin immunoprecipitation (ChIP) or any other experiments where samples from two conditions, one of which contains an enriched subset of the other, are studied. PMID:23349766

  13. Impact of excipient interactions on solid dosage form stability.

    PubMed

    Narang, Ajit S; Desai, Divyakant; Badawy, Sherif

    2012-10-01

    Drug-excipient interactions in solid dosage forms can affect drug product stability in physical aspects such as organoleptic changes and dissolution slowdown, or chemically by causing drug degradation. Recent research has allowed the distinction in chemical instability resulting from direct drug-excipient interactions and from drug interactions with excipient impurities. A review of chemical instability in solid dosage forms highlights common mechanistic themes applicable to multiple degradation pathways. These common themes include the role of water and microenvironmental pH. In addition, special aspects of solid-state reactions with excipients and/or excipient impurities add to the complexity in understanding and modeling reaction pathways. This paper discusses mechanistic basis of known drug-excipient interactions with case studies and provides an overview of common underlying themes. Recent developments in the understanding of degradation pathways further impact methodologies used in the pharmaceutical industry for prospective stability assessment. This paper discusses these emerging aspects in terms of limitations of drug-excipient compatibility studies, emerging paradigms in accelerated stability testing, and application of mathematical modeling for prediction of drug product stability.

  14. Bio-inspired algorithms applied to molecular docking simulations.

    PubMed

    Heberlé, G; de Azevedo, W F

    2011-01-01

    Nature as a source of inspiration has been shown to have a great beneficial impact on the development of new computational methodologies. In this scenario, analyses of the interactions between a protein target and a ligand can be simulated by biologically inspired algorithms (BIAs). These algorithms mimic biological systems to create new paradigms for computation, such as neural networks, evolutionary computing, and swarm intelligence. This review provides a description of the main concepts behind BIAs applied to molecular docking simulations. Special attention is devoted to evolutionary algorithms, guided-directed evolutionary algorithms, and Lamarckian genetic algorithms. Recent applications of these methodologies to protein targets identified in the Mycobacterium tuberculosis genome are described.

  15. Variation of Kozinets' framework and application to nursing research.

    PubMed

    Witney, Cynthia; Hendricks, Joyce; Cope, Vicki

    2016-05-01

    Online communities are new sites for undertaking research, with their textual interactions providing a rich source of data in real time. 'Ethnonetnography' is a research methodology based on ethnography that can be used in these online communities. In this study, the researcher and a specialist breast care nurse (SBCN) were immersed in the online community, adding to patients' breast cancer care and providing a nursing research component to the community. To examine Kozinets' (2010) framework for ethnonetnography and how it may be varied for use in a purpose-built, disease-specific, online support community. The online community provided an area where members could communicate with each other. Kozinets' (2010) framework was varied in that the research was carried out in a purpose-built community of which an SBCN was a member who could provide support and advice. The application of the ethnonetnographic methodology has wide implications for clinical nursing practice and research. Ethnonetnography can be used to study disease-specific communities in a focused manner and can provide immediate benefits through the inclusion of an expert nurse and contemporaneous application of research findings to patient care. With ethical permission and the permission of online community members, nurse researchers can enter already established online communities. Ethnonetnography is ideally suited to nursing research as it provides the immediacy of evidence-based interaction with an expert nurse. These real-time responses improve support for those experiencing a critical life event.

  16. Galerkin finite element scheme for magnetostrictive structures and composites

    NASA Astrophysics Data System (ADS)

    Kannan, Kidambi Srinivasan

    The ever-increasing role of magnetostrictives in actuation and sensing applications is an indication of their importance in the emerging field of smart structures technology. As newer, and more complex, applications are developed, there is a growing need for a reliable computational tool that can effectively address the magneto-mechanical interactions and other nonlinearities in these materials and in structures incorporating them. This thesis presents a continuum level quasi-static, three-dimensional finite element computational scheme for modeling the nonlinear behavior of bulk magnetostrictive materials and particulate magnetostrictive composites. Models for magnetostriction must deal with two sources of nonlinearity: nonlinear body forces/moments in the equilibrium equations governing magneto-mechanical interactions in deformable and magnetized bodies, and nonlinear coupled magneto-mechanical constitutive models for the material of interest. In the present work, classical differential formulations for nonlinear magneto-mechanical interactions are recast in integral form using the weighted-residual method. A discretized finite element form is obtained by applying the Galerkin technique. The finite element formulation is based upon three-dimensional eight-noded (isoparametric) brick element interpolation functions and magnetostatic infinite elements at the boundary. Two alternative possibilities are explored for establishing the nonlinear incremental constitutive model: characterization in terms of magnetic field or in terms of magnetization. The former methodology is the one most commonly used in the literature. In this work, a detailed comparative study of both methodologies is carried out. The computational scheme is validated, qualitatively and quantitatively, against experimental measurements published in the literature on structures incorporating the magnetostrictive material Terfenol-D.
The influence of nonlinear body forces and body moments of magnetic origin, on the response of magnetostrictive structures to complex mechanical and magnetic loading conditions, is carefully examined. While monolithic magnetostrictive materials have been commercially available since the late eighties, attention in the smart structures research community has recently focused upon building and using magnetostrictive particulate composite structures for conventional actuation applications and novel sensing methodologies in structural health monitoring. A particulate magnetostrictive composite element has been developed in the present work to model such structures. This composite element incorporates interactions between magnetostrictive particles by combining a numerical micromechanical analysis based on magneto-mechanical Green's functions with a homogenization scheme based upon the Mori-Tanaka approach. This element has been applied to the simulation of particulate actuators and sensors reported in the literature. Simulation results are compared to experimental data for validation purposes. The computational schemes developed, for bulk materials and for composites, are expected to be of great value to researchers and designers of novel applications based on magnetostrictives.
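The weighted-residual/Galerkin step described above can be stated generically (this is the textbook form of the method, not the thesis's specific magneto-mechanical equations): for a governing operator L and load f, the residual of the discrete solution is forced to be orthogonal to each shape function,

```latex
\int_{\Omega} N_i \,\big( \mathcal{L}(u^h) - f \big)\, d\Omega = 0, \qquad u^h = \sum_{j=1}^{8} N_j\, u_j
```

where, in the eight-noded brick element used here, the N_j are the trilinear isoparametric shape functions and the u_j the nodal unknowns; choosing the weights w_i = N_i (rather than some other weighting) is what distinguishes the Galerkin technique within the weighted-residual family.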

  17. Usability testing in medical informatics: cognitive approaches to evaluation of information systems and user interfaces.

    PubMed Central

    Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.

    1997-01-01

    This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data including video recording of health care workers as they interact with systems, such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to the study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced by work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively-based usability analyses are detailed and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620

  18. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique of existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in an understanding of how and why different outcomes arose for students participating in AEEs. Our approach offers a mechanism for clarification without which evidence-based effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice we hope to assist others seeking to conduct similar research.

  19. Selected Tether Applications Cost Model

    NASA Technical Reports Server (NTRS)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  20. Interactive educational simulators in diabetes care.

    PubMed

    Lehmann, E D

    1997-01-01

    Since the Diabetes Control and Complications Trial demonstrated the substantial benefits of tight glycaemic control there has been renewed interest in the application of information technology (IT) based techniques for improving the day-to-day care of patients with diabetes mellitus. Computer-based educational approaches have a great deal of potential for patient use, and may offer a means of training more health-care professionals to deliver such improved care. In this article the potential role of IT in diabetes education is reviewed, focusing in particular on the application of compartmental models in both computer-based interactive simulators and educational video games. Close attention is devoted to practical applications, available today, for use by patients, their relatives, students and health-care professionals. The novel features and potential benefits of such methodologies are highlighted and some of the limitations of currently available software are discussed. The need for improved graphical user interfaces, and for further efforts to evaluate such programs and demonstrate an educational benefit from their use, are identified as hurdles to their more widespread application. The review concludes with a look to the future and the type of modelling features which should be provided in the next generation of interactive diabetes simulators and educational video games.

  1. Methodologies for screening of bacteria-carbohydrate interactions: anti-adhesive milk oligosaccharides as a case study.

    PubMed

    Lane, Jonathan A; Mariño, Karina; Rudd, Pauline M; Carrington, Stephen D; Slattery, Helen; Hickey, Rita M

    2012-07-01

    Many studies have demonstrated the capacity of glycan-based compounds to disrupt microbial binding to mucosal epithelia. Therefore, oligosaccharides have potential application in the prevention of certain bacterial diseases. However, current screening methods for the identification of anti-adhesive oligosaccharides have limitations: they are time-consuming and require large amounts of oligosaccharides. There is a need to develop analytical techniques which can quickly screen for, and structurally define, anti-adhesive oligosaccharides prior to using human cell line models of infection. Considering this, we have developed a rapid method for screening complex oligosaccharide mixtures for potential anti-adhesive activity against bacteria. Our approach involves the use of whole bacterial cells to "deplete" free oligosaccharides from solution. As a case study, the free oligosaccharides from the colostrum of Holstein Friesian cows were screened for interactions with whole Escherichia coli cells. Reductions in oligosaccharide concentrations were determined by High pH Anion Exchange Chromatography and Hydrophilic Interaction Liquid Chromatography (HILIC-HPLC). Oligosaccharide structures were confirmed by a combination of HILIC-HPLC, exoglycosidase digestion and off-line negative ion mode MS/MS. The depletion assay confirmed selective bacterial interaction with certain bovine oligosaccharides which previous studies, using other methodologies, had shown to interact with E. coli. In particular, the bacterial cells depleted the following oligosaccharides in a population-dependent manner: 3'-sialyllactose, disialyllactose, and 6'-sialyllactosamine. The assay methodology was further validated by studies in which we demonstrated the inhibitory activity of 3'-sialyllactose, and a mixture of bovine colostrum oligosaccharides, on E. coli adhesion to differentiated HT-29 cells. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. A methodological approach for designing a usable ontology-based GUI in healthcare.

    PubMed

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface to allow clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  3. Interactive Concept of Operations Narrative Simulators

    NASA Technical Reports Server (NTRS)

    Denham, Andre R.

    2017-01-01

    This paper reports on an exploratory design and development project. Specifically, this paper discusses the design and development of Interactive Concept of Operations Narrative Simulators (ICONS) as a means of enhancing the functionality of traditional Concept of Operations documents by leveraging the affordances provided by applications commonly used within the Interactive Fiction literary genre. Recommendations for an ICONS design and development methodology, along with a detailed description of a practical proof-of-concept ICONS created using this approach, are discussed. The report concludes with a discussion of how ICONS can be extended to the K-12 mathematics education domain and used to assist those involved with strategic planning at Marshall Space Flight Center.

  4. Decentralized PID controller for TITO systems using characteristic ratio assignment with an experimental application.

    PubMed

    Hajare, V D; Patre, B M

    2015-11-01

    This paper presents a decentralized PID controller design method for two input two output (TITO) systems with time delay using the characteristic ratio assignment (CRA) method. The ability of the CRA method to design controllers for a desired transient response has been explored for TITO systems. The design methodology uses an ideal decoupler to reduce the interaction. Each decoupled subsystem is reduced to a first order plus dead time (FOPDT) model to design independent diagonal controllers. Based on the specified overshoot and settling time, the controller parameters are computed using the CRA method. To verify the performance of the proposed controller, two benchmark simulation examples are presented. To demonstrate the applicability of the proposed controller, experimentation is performed on a real-life interacting coupled-tank level system. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
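The decoupling step can be illustrated at steady state. For a 2x2 plant gain matrix, a common textbook form of the ideal decoupler makes G(0)D diagonal, so each loop sees only its own channel; the gains below are hypothetical, and the paper's dynamic decoupler additionally handles time delays, which this static sketch ignores:

```python
def ideal_static_decoupler(g11, g12, g21, g22):
    """Steady-state ideal decoupler for a 2x2 (TITO) plant gain matrix:
        D = [[1, -g12/g11], [-g21/g22, 1]]
    chosen so that the product G(0) @ D has zero off-diagonal terms."""
    return [[1.0, -g12 / g11], [-g21 / g22, 1.0]]

def matmul2(a, b):
    """Plain 2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# hypothetical steady-state gains of a coupled two-tank process
G = [[2.0, 0.5], [0.4, 1.5]]
D = ideal_static_decoupler(2.0, 0.5, 0.4, 1.5)
GD = matmul2(G, D)
print(GD[0][1], GD[1][0])  # off-diagonal interaction terms: (near) zero
```

With the interaction removed, each diagonal element of GD can be fitted to an FOPDT model and an independent PID tuned by CRA for the desired overshoot and settling time.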

  5. Cloud, Aerosol, and Volcanic Ash Retrievals Using ATSR and SLSTR with ORAC

    NASA Astrophysics Data System (ADS)

    McGarragh, Gregory; Poulsen, Caroline; Povey, Adam; Thomas, Gareth; Christensen, Matt; Sus, Oliver; Schlundt, Cornelia; Stapelberg, Stefan; Stengel, Martin; Grainger, Don

    2015-12-01

    The Optimal Retrieval of Aerosol and Cloud (ORAC) is a generalized optimal estimation system that retrieves cloud, aerosol and volcanic ash parameters using satellite imager measurements in the visible to infrared. Use of the same algorithm for different sensors and parameters leads to consistency that facilitates inter-comparison and interaction studies. ORAC currently supports ATSR, AVHRR, MODIS and SEVIRI. In this proceeding we discuss the ORAC retrieval algorithm applied to ATSR data including the retrieval methodology, the forward model, uncertainty characterization and discrimination/classification techniques. Application of ORAC to SLSTR data is discussed including the additional features that SLSTR provides relative to the ATSR heritage. The ORAC level 2 and level 3 results are discussed and an application of level 3 results to the study of cloud/aerosol interactions is presented.

  6. Analysis of gap junctional intercellular communications using a dielectrophoresis-based microchip.

    PubMed

    Tellez-Gabriel, Marta; Charrier, Céline; Brounais-Le Royer, Bénédicte; Mullard, Mathilde; Brown, Hannah K; Verrecchia, Franck; Heymann, Dominique

    2017-03-01

    Gap junctions are transmembrane structures that directly connect the cytoplasm of adjacent cells, making intercellular communications possible. It has been shown that the behaviour of several tumours - such as bone tumours - is related to gap junction intercellular communications (GJIC). Several methodologies are available for studying GJIC, based on measuring different parameters that are useful for multiple applications, such as the study of carcinogenesis for example. These methods nevertheless have several limitations. The present manuscript describes the setting up of a dielectrophoresis (DEP)-based lab-on-a-chip platform for the real-time study of gap junctional intercellular communication between osteosarcoma cells and the main cells accessible to their microenvironment. We conclude that using the DEParray technology for GJIC assessment has several advantages compared with current techniques. This methodology is less harmful to cell integrity; cells can be recovered after interaction for further molecular analysis; it is possible to study GJIC in real time; and cell interactions can be promoted using up to five different populations. This new methodology overcomes several difficulties in performing experiments to answer questions about the GJIC process that cannot be addressed with current techniques. Copyright © 2017 Elsevier GmbH. All rights reserved.

  7. Applying graphs and complex networks to football metric interpretation.

    PubMed

    Arriaza-Ardiles, E; Martín-González, J M; Zuniga, M D; Sánchez-Flores, J; de Saa, Y; García-Manso, J M

    2018-02-01

This work presents a methodology for analysing the interactions between players in a football team, from the point of view of graph theory and complex networks. We model the complex network of passing interactions between players of the same team in 32 official matches of the Liga de Fútbol Profesional (Spain), using a passing/reception graph. This methodology allows us to understand the play structure of the team, by analysing the offensive phases of game-play. We utilise two different strategies for characterising the contribution of the players to the team: the clustering coefficient, and centrality metrics (closeness and betweenness). We show the application of this methodology by analysing the performance of a professional Spanish team according to these metrics and the distribution of passing/reception in the field. Keeping in mind the dynamic nature of collective sports, in the future we will incorporate metrics that allow us to analyse the performance of the team also according to the circumstances of game-play and to different contextual variables such as the utilisation of the field space, the time, and the ball, according to specific tactical situations. Copyright © 2017 Elsevier B.V. All rights reserved.
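    The two player-contribution metrics named above, closeness centrality and the local clustering coefficient, can be sketched on a hypothetical passing graph. The five players and pass counts below are invented for illustration and are not data from the study:

```python
from collections import deque

# Hypothetical pass counts between five players (directed: passer -> receiver).
passes = {
    "GK": {"DF": 12, "MF": 3},
    "DF": {"MF": 15, "FW": 2, "GK": 5},
    "MF": {"FW": 10, "DF": 8, "WG": 7},
    "WG": {"FW": 6, "MF": 4},
    "FW": {"MF": 5},
}
players = sorted(passes)

def closeness(src):
    """Closeness centrality: inverse mean shortest-path length from src."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in passes.get(u, {}):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    reached = [d for n, d in dist.items() if n != src]
    return (len(reached) / sum(reached)) if reached else 0.0

def clustering(node):
    """Local clustering coefficient on the undirected projection."""
    undirected = {p: set() for p in players}
    for u, outs in passes.items():
        for v in outs:
            undirected[u].add(v)
            undirected[v].add(u)
    nbrs = undirected[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a in nbrs for b in nbrs if a < b and b in undirected[a])
    return 2 * links / (k * (k - 1))

for p in players:
    print(p, round(closeness(p), 3), round(clustering(p), 3))
```

    A high-closeness, high-clustering midfielder would show up here as a hub that both reaches teammates quickly and sits inside tightly knit passing triangles.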

  8. Interactive Genetic Algorithm - An Adaptive and Interactive Decision Support Framework for Design of Optimal Groundwater Monitoring Plans

    NASA Astrophysics Data System (ADS)

    Babbar-Sebens, M.; Minsker, B. S.

    2006-12-01

In the water resources management field, decision making encompasses many kinds of engineering, social, and economic constraints and objectives. Representing all of these problem-dependent criteria through models (analytical or numerical) and various formulations (e.g., objectives, constraints, etc.) within an optimization-simulation system can be a very non-trivial issue. Most models and formulations utilized for discerning desirable traits in a solution can only approximate the decision maker's (DM) true preference criteria, and they often fail to consider important qualitative and incomputable phenomena related to the management problem. In our research, we have proposed novel decision support frameworks that allow DMs to actively participate in the optimization process. The DMs explicitly indicate their true preferences based on their subjective criteria and the results of various simulation models and formulations. The feedback from the DMs is then used to guide the search process towards solutions that are "all-rounders" from the perspective of the DM. The two main research questions explored in this work are: a) Does interaction between the optimization algorithm and a DM assist the system in searching for groundwater monitoring designs that are robust from the DM's perspective?, and b) How can an interactive search process be made more effective when human factors, such as human fatigue and cognitive learning processes, affect the performance of the algorithm?
The application of these frameworks on a real-world groundwater long-term monitoring (LTM) case study in Michigan highlighted the following salient advantages: a) in contrast to the non-interactive optimization methodology, the proposed interactive frameworks were able to identify low cost monitoring designs whose interpolation maps respected the expected spatial distribution of the contaminants, b) for many same-cost designs, the interactive methodologies were able to propose multiple alternatives that met the DM's preference criteria, therefore allowing the expert to select among several strong candidate designs depending on her/his LTM budget, c) two of the methodologies - Case-Based Micro Interactive Genetic Algorithm (CBMIGA) and Interactive Genetic Algorithm with Mixed Initiative Interaction (IGAMII) - were also able to assist in controlling human fatigue and adapt to the DM's learning process.
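    The interactive-GA idea described above can be sketched minimally: a standard genetic algorithm whose fitness blends a quantitative cost with a subjective rating, with the decision maker stood in for by a fixed preference function. All names, parameters, and the preference rule below are hypothetical, not the CBMIGA/IGAMII implementations:

```python
import random

random.seed(7)

def cost(design):
    # Hypothetical monitoring-network cost: number of active wells.
    return sum(design)

def dm_feedback(design):
    # Stand-in for the decision maker: a fixed preference that wells
    # 2 and 5 must stay active (e.g. near known contaminant plumes).
    return 1.0 if design[2] and design[5] else 0.0

def fitness(design):
    # Blend the quantitative objective (low cost) with the DM rating.
    return dm_feedback(design) * 10 - cost(design)

def evolve(n_wells=8, pop_size=20, generations=30):
    pop = [[random.randint(0, 1) for _ in range(n_wells)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_wells)
            child = a[:cut] + b[cut:]           # one-point crossover
            i = random.randrange(n_wells)
            child[i] ^= random.random() < 0.1   # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

    In the interactive frameworks the `dm_feedback` call is the point where a human rates candidate designs; the fatigue-management methods in the record reduce how often that call is made.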

  9. An endochronic theory for transversely isotropic fibrous composites

    NASA Technical Reports Server (NTRS)

    Pindera, M. J.; Herakovich, C. T.

    1981-01-01

A rational methodology for modelling both the nonlinear elastic and the dissipative response of transversely isotropic fibrous composites is developed and illustrated with the aid of the observed response of graphite-polyimide off-axis coupons. The methodology is based on the internal variable formalism employed within the context of classical irreversible thermodynamics and entails extension of Valanis' endochronic theory to transversely isotropic media. Applicability of the theory to prediction of various response characteristics of fibrous composites is illustrated by accurately modelling such often-observed phenomena as: stiffening reversible behavior along the fiber direction; dissipative response in shear and transverse tension characterized by power laws with different hardening exponents; permanent strain accumulation; nonlinear unloading and reloading; and stress-interaction effects.

  10. From systems biology to dynamical neuropharmacology: proposal for a new methodology.

    PubMed

    Erdi, P; Kiss, T; Tóth, J; Ujfalussy, B; Zalányi, L

    2006-07-01

The concepts and methods of systems biology are extended to neuropharmacology in order to test and design drugs for the treatment of neurological and psychiatric disorders. Computational modelling that integrates compartmental neural modelling techniques with detailed kinetic descriptions of the pharmacological modulation of transmitter-receptor interaction is offered as a method to test the electrophysiological and behavioural effects of putative drugs. Moreover, an inverse method is suggested for controlling a neural system to realise a prescribed temporal pattern. In particular, as an application of the proposed new methodology, a computational platform is offered to analyse the generation and pharmacological modulation of theta rhythm related to anxiety.

  11. Organic Carbamates in Drug Design and Medicinal Chemistry

    PubMed Central

    2016-01-01

    The carbamate group is a key structural motif in many approved drugs and prodrugs. There is an increasing use of carbamates in medicinal chemistry and many derivatives are specifically designed to make drug–target interactions through their carbamate moiety. In this Perspective, we present properties and stabilities of carbamates, reagents and chemical methodologies for the synthesis of carbamates, and recent applications of carbamates in drug design and medicinal chemistry. PMID:25565044

  12. Identification of New Potential Scientific and Technology Areas for DoD Application. Summary of Activities

    DTIC Science & Technology

    1986-07-31

designer will be able to more rapidly assemble a total software package from perfected modules that can be easily debugged or replaced with more...antinuclear interactions e. gravitational effects of antimatter 2. possible machine parameters and lattice design 3. electron and stochastic cooling needs 4...implementation, reliability requirements; development of design environments and of experimental methodology; technology transfer methods from

  13. Project MAC Progress Report 11

    DTIC Science & Technology

    1974-12-01

whether a subroutine would be useful as a part of some larger program, and, if so, how to use it [8]. The programming methodology employed by CALICO...7. Seriff, Marc, How to Write Programs for the CALICO Environment. SYS. 14.04 (unpublished). 8. Reeve, Chris, Marty Draper, D. E. Burmaster, and J...Introduction Automatic Programming Group A. Introduction B. Understanding How a User Might Interact with a Knowledge-Based Application System C

  14. Organic carbamates in drug design and medicinal chemistry.

    PubMed

    Ghosh, Arun K; Brindisi, Margherita

    2015-04-09

    The carbamate group is a key structural motif in many approved drugs and prodrugs. There is an increasing use of carbamates in medicinal chemistry and many derivatives are specifically designed to make drug-target interactions through their carbamate moiety. In this Perspective, we present properties and stabilities of carbamates, reagents and chemical methodologies for the synthesis of carbamates, and recent applications of carbamates in drug design and medicinal chemistry.

  15. The relationship between structure and function in locally observed complex networks

    NASA Astrophysics Data System (ADS)

    Comin, Cesar H.; Viana, Matheus P.; Costa, Luciano da F.

    2013-01-01

Recently, studies looking at the small-scale interactions taking place in complex networks have started to unveil the wealth of interactions that occur between groups of nodes. Such findings call for a new systematic methodology to quantify, at the node level, how dynamics are influenced (or differentiated) by the structure of the underlying system. Here we define a new measure that, based on the dynamical characteristics obtained for a large set of initial conditions, compares the dynamical behavior of the nodes present in the system. Through this measure, we find that the geographic and Barabási-Albert models have a high capacity for generating networks that exhibit groups of nodes with distinct dynamics compared to the rest of the network. The application of our methodology is illustrated with respect to two real systems. In the first we use the neuronal network of the nematode Caenorhabditis elegans to show that the interneurons of the ventral cord of the nematode present a very large dynamical differentiation when compared to the rest of the network. The second application concerns the SIS epidemic model on an airport network, where we quantify how different the distribution of infection times of high- and low-degree nodes can be, when compared to the expected value for the network.
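    The second application, comparing infection times of high- and low-degree nodes, can be illustrated with a minimal simulation on a toy hub-and-chain network. An SI-style sweep is used here as a simplification of the SIS dynamics (recovery is ignored, since only first-infection times are recorded), and the topology and parameters are illustrative:

```python
import random

random.seed(1)

# A small hub-and-spoke network: node 0 is the hub (high degree),
# leaves 1..9 attach to it, plus a short chain 9-10-11 of low-degree nodes.
edges = [(0, i) for i in range(1, 10)] + [(9, 10), (10, 11)]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def first_infection_times(seed_node, beta=0.3, steps=50):
    """Each step, every infected node infects each susceptible
    neighbour with probability beta; record first-infection times."""
    infected = {seed_node}
    t_inf = {seed_node: 0}
    for t in range(1, steps):
        new = set()
        for u in infected:
            for v in adj[u]:
                if v not in infected and random.random() < beta:
                    new.add(v)
                    t_inf.setdefault(v, t)
        infected |= new
    return t_inf

# Average first-infection time over many runs seeded at a random leaf.
runs = 500
tot = {n: 0.0 for n in adj}
for _ in range(runs):
    times = first_infection_times(seed_node=random.choice(range(1, 9)))
    for n in adj:
        tot[n] += times.get(n, 50)
avg = {n: tot[n] / runs for n in adj}
print("hub (0):", round(avg[0], 2), " chain end (11):", round(avg[11], 2))
```

    The high-degree hub is reached far sooner on average than the low-degree chain end, the kind of degree-dependent differentiation in infection-time distributions the abstract quantifies.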

  16. From face to interface recognition: a differential geometric approach to distinguish DNA from RNA binding surfaces.

    PubMed

    Shazman, Shula; Elber, Gershon; Mandel-Gutfreund, Yael

    2011-09-01

Protein-nucleic acid interactions play a critical role in all steps of the gene expression pathway. Nucleic acid (NA) binding proteins interact with their partners, DNA or RNA, via distinct regions on their surface that are characterized by an ensemble of chemical, physical and geometrical properties. In this study, we introduce a novel methodology based on differential geometry, commonly used in face recognition, to characterize and predict NA binding surfaces on proteins. Applying the method to experimentally solved three-dimensional structures of proteins, we successfully distinguish double-stranded DNA (dsDNA) from single-stranded RNA (ssRNA) binding proteins, with 83% accuracy. We show that the method is insensitive to conformational changes that occur upon binding and can be applicable for de novo protein-function prediction. Remarkably, when concentrating on the zinc finger motif, we distinguish successfully between RNA and DNA binding interfaces possessing the same binding motif even within the same protein, as demonstrated for the RNA polymerase transcription factor TFIIIA. In conclusion, we present a novel methodology to characterize protein surfaces, which can accurately tell apart dsDNA from ssRNA binding interfaces. The strength of our method in recognizing fine-tuned differences on NA binding interfaces makes it applicable to many other molecular recognition problems, with potential implications for drug design.

  17. Prediction of nearfield jet entrainment by an interactive mixing/afterburning model

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Pergament, H. S.; Wilmoth, R. G.

    1978-01-01

    The development of a computational model (BOAT) for calculating nearfield jet entrainment, and its application to the prediction of nozzle boattail pressures, is discussed. BOAT accounts for the detailed turbulence and thermochemical processes occurring in the nearfield shear layers of jet engine (and rocket) exhaust plumes while interfacing with the inviscid exhaust and external flowfield regions in an overlaid, interactive manner. The ability of the model to analyze simple free shear flows is assessed by detailed comparisons with fundamental laboratory data. The overlaid methodology and the entrainment correction employed to yield the effective plume boundary conditions are assessed via application of BOAT in conjunction with the codes comprising the NASA/LRC patched viscous/inviscid model for determining nozzle boattail drag for subsonic/transonic external flows. Comparisons between the predictions and data on underexpanded laboratory cold air jets are presented.

  18. The application of remote sensing to resource management and environmental quality programs in Kansas

    NASA Technical Reports Server (NTRS)

    Barr, B. G.; Martinko, E. A.

    1976-01-01

Activities of the Kansas Applied Remote Sensing Program (KARS) designed to establish interactions on cooperative projects with decision makers in Kansas agencies in the development and application of remote sensing procedures are reported. Cooperative demonstration projects undertaken with several different agencies involved three principal areas of effort: Wildlife Habitat and Environmental Analysis; Urban and Regional Analysis; Agricultural and Rural Analysis. These projects were designed to concentrate remote sensing concepts and methodologies on existing agency problems to ensure the continued relevance of the program and maximize the possibility for immediate operational use. Completed projects are briefly discussed.

  19. Computer-assisted visual interactive recognition and its prospects of implementation over the Internet

    NASA Astrophysics Data System (ADS)

    Zou, Jie; Gattani, Abhishek

    2005-01-01

    When completely automated systems don't yield acceptable accuracy, many practical pattern recognition systems involve the human either at the beginning (pre-processing) or towards the end (handling rejects). We believe that it may be more useful to involve the human throughout the recognition process rather than just at the beginning or end. We describe a methodology of interactive visual recognition for human-centered low-throughput applications, Computer Assisted Visual InterActive Recognition (CAVIAR), and discuss the prospects of implementing CAVIAR over the Internet. The novelty of CAVIAR is image-based interaction through a domain-specific parameterized geometrical model, which reduces the semantic gap between humans and computers. The user may interact with the computer anytime that she considers its response unsatisfactory. The interaction improves the accuracy of the classification features by improving the fit of the computer-proposed model. The computer makes subsequent use of the parameters of the improved model to refine not only its own statistical model-fitting process, but also its internal classifier. The CAVIAR methodology was applied to implement a flower recognition system. The principal conclusions from the evaluation of the system include: 1) the average recognition time of the CAVIAR system is significantly shorter than that of the unaided human; 2) its accuracy is significantly higher than that of the unaided machine; 3) it can be initialized with as few as one training sample per class and still achieve high accuracy; and 4) it demonstrates a self-learning ability. We have also implemented a Mobile CAVIAR system, where a pocket PC, as a client, connects to a server through wireless communication. The motivation behind a mobile platform for CAVIAR is to apply the methodology in a human-centered pervasive environment, where the user can seamlessly interact with the system for classifying field-data. 
Deploying CAVIAR to a networked mobile platform poses the challenge of classifying field images and programming under constraints of display size, network bandwidth, processor speed, and memory size. Editing of the computer-proposed model is performed on the handheld while statistical model fitting and classification take place on the server. The possibility that the user can easily take several photos of the object poses an interesting information fusion problem. The advantage of the Internet is that the patterns identified by different users can be pooled together to benefit all peer users. When users identify patterns with CAVIAR in a networked setting, they also collect training samples and provide opportunities for machine learning from their intervention. CAVIAR implemented over the Internet provides a perfect test bed for, and extends, the concept of the Open Mind Initiative proposed by David Stork. Our experimental evaluation focuses on human time, machine and human accuracy, and machine learning. We devoted much effort to evaluating the use of our image-based user interface and to developing principles for the evaluation of interactive pattern recognition systems. The Internet architecture and Mobile CAVIAR methodology have many applications. We are exploring the directions of teledermatology, face recognition, and education.

  20. Agent oriented programming

    NASA Technical Reports Server (NTRS)

    Shoham, Yoav

    1994-01-01

The goal of our research is a methodology for creating robust software in distributed and dynamic environments. The approach taken is to endow software objects with explicit information about one another, to have them interact through a commitment mechanism, and to equip them with a speech-act-based communication language. System-level applications include software interoperation and compositionality. A government application of specific interest is an infrastructure for coordination among multiple planners. Daily-activity applications include personal software assistants, such as programmable email, scheduling, and news-group agents. Research topics include definition of the mental state of agents, design of agent languages as well as interpreters for those languages, and mechanisms for coordination within agent societies such as artificial social laws and conventions.

  1. Protein–protein interactions and selection: yeast-based approaches that exploit guanine nucleotide-binding protein signaling.

    PubMed

    Ishii, Jun; Fukuda, Nobuo; Tanaka, Tsutomu; Ogino, Chiaki; Kondo, Akihiko

    2010-05-01

For elucidating protein–protein interactions, many methodologies have been developed during the past two decades. For investigation of interactions inside cells under physiological conditions, yeast is an attractive organism with which to quickly screen for promising candidates using versatile genetic technologies, and various types of approaches are now available. Among them, a variety of unique systems using the guanine nucleotide-binding protein (G-protein) signaling pathway in yeast have been established to investigate the interactions of proteins for biological study and pharmaceutical research. G-proteins involved in various cellular processes are mainly divided into two groups: small monomeric G-proteins, and heterotrimeric G-proteins. In this minireview, we summarize the basic principles and applications of yeast-based screening systems using these two types of G-protein, which are typically used for elucidating biological protein interactions but are differentiated from traditional yeast two-hybrid systems.

  2. Towards Inter- and Intra- Cellular Protein Interaction Analysis: Applying the Betweenness Centrality Graph Measure for Node Importance

    NASA Astrophysics Data System (ADS)

    Barton, Alan J.; Haqqani, Arsalan S.

    2011-11-01

Three public biological network data sets (KEGG, GeneRIF and Reactome) are collected and described. Two problems are investigated (inter- and intra-cellular interactions) via augmentation of the collected networks with the problem-specific data. Results include an estimate of the importance of proteins for the interaction of inflammatory cells with the blood-brain barrier via the computation of Betweenness Centrality. Subsequently, the interactions may be validated from a number of differing perspectives, including comparison with (i) existing biological results, (ii) the literature, and (iii) new hypothesis-driven biological experiments. Novel therapeutic and diagnostic targets for inhibiting inflammation at the blood-brain barrier in a number of brain diseases, including Alzheimer's disease, stroke and multiple sclerosis, thus become possible. In addition, this methodology may also be applicable to investigating the breast cancer tumour microenvironment.
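    Betweenness centrality, the node-importance measure used above, can be sketched by brute force on a toy two-module network. The node names are invented, and real analyses use efficient algorithms such as Brandes' rather than this pairwise enumeration:

```python
from collections import deque
from itertools import permutations

# Toy undirected interaction network; "B" bridges two modules, mimicking
# a protein that mediates signalling between two cellular compartments.
edges = [("A1", "A2"), ("A2", "A3"), ("A1", "A3"),
         ("A3", "B"), ("B", "C1"),
         ("C1", "C2"), ("C2", "C3"), ("C1", "C3")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def shortest_paths(s, t):
    """Enumerate all shortest paths s -> t via BFS with predecessor lists."""
    dist, preds = {s: 0}, {s: []}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                preds[v] = [u]
                q.append(v)
            elif dist[v] == dist[u] + 1:
                preds[v].append(u)
    def unwind(v):
        if v == s:
            return [[s]]
        return [p + [v] for u in preds[v] for p in unwind(u)]
    return unwind(t) if t in dist else []

def betweenness(node):
    """Fraction of shortest paths between other node pairs through node."""
    score = 0.0
    for s, t in permutations(adj, 2):
        if node in (s, t):
            continue
        paths = shortest_paths(s, t)
        if paths:
            score += sum(node in p for p in paths) / len(paths)
    return score / 2  # each unordered pair is counted twice

ranking = sorted(adj, key=betweenness, reverse=True)
print(ranking[0])  # the bridging node
```

    The bridge node tops the ranking because every shortest path between the two modules must pass through it, which is exactly why high betweenness flags candidate mediator proteins.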

  3. Clustering drug-drug interaction networks with energy model layouts: community analysis and drug repurposing.

    PubMed

    Udrescu, Lucreţia; Sbârcea, Laura; Topîrceanu, Alexandru; Iovanovici, Alexandru; Kurunczi, Ludovic; Bogdan, Paul; Udrescu, Mihai

    2016-09-07

Analyzing drug-drug interactions may unravel previously unknown drug action patterns, leading to the development of new drug discovery tools. We present a new approach to analyzing drug-drug interaction networks, based on clustering and topological community detection techniques that are specific to complex network science. Our methodology uncovers functional drug categories along with the intricate relationships between them. Using modularity-based and energy-model layout community detection algorithms, we link the network clusters to 9 relevant pharmacological properties. Out of the 1141 drugs from the DrugBank 4.1 database, our extensive literature survey and cross-checking with other databases such as Drugs.com, RxList, and DrugBank 4.3 confirm the predicted properties for 85% of the drugs. As such, we argue that network analysis offers a high-level grasp on a wide area of pharmacological aspects, indicating possible unaccounted interactions and missing pharmacological properties that can lead to drug repositioning for the 15% of drugs which seem to be inconsistent with the predicted property. Also, by using network centralities, we can rank drugs according to their interaction potential for both simple and complex multi-pathology therapies. Moreover, our clustering approach can be extended for applications such as analyzing drug-target interactions or phenotyping patients in personalized medicine applications.
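    The modularity score underlying the community detection step can be illustrated on a toy interaction network: a partition that respects the two obvious modules scores higher than one that mixes them. All node names are invented and the graph is far smaller than the DrugBank network:

```python
# Toy "interaction" network with two obvious communities plus one bridge.
edges = [("a", "b"), ("b", "c"), ("a", "c"),
         ("x", "y"), ("y", "z"), ("x", "z"),
         ("c", "x")]
deg = {}
for u, v in edges:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1
m = len(edges)

def modularity(partition):
    """Newman modularity: Q = sum over communities of e_c/m - (d_c/2m)^2,
    where e_c is the number of intra-community edges and d_c the total degree."""
    q = 0.0
    for community in partition:
        inside = sum(1 for u, v in edges if u in community and v in community)
        d = sum(deg[n] for n in community)
        q += inside / m - (d / (2 * m)) ** 2
    return q

good = [{"a", "b", "c"}, {"x", "y", "z"}]   # respects the two triangles
bad = [{"a", "x", "y"}, {"b", "c", "z"}]    # mixes the modules
print(round(modularity(good), 3), round(modularity(bad), 3))
```

    Modularity-maximizing algorithms search over partitions for a high-Q split; the uncovered communities are then interpreted, as in this study, against known pharmacological categories.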

  4. Clustering drug-drug interaction networks with energy model layouts: community analysis and drug repurposing

    PubMed Central

    Udrescu, Lucreţia; Sbârcea, Laura; Topîrceanu, Alexandru; Iovanovici, Alexandru; Kurunczi, Ludovic; Bogdan, Paul; Udrescu, Mihai

    2016-01-01

Analyzing drug-drug interactions may unravel previously unknown drug action patterns, leading to the development of new drug discovery tools. We present a new approach to analyzing drug-drug interaction networks, based on clustering and topological community detection techniques that are specific to complex network science. Our methodology uncovers functional drug categories along with the intricate relationships between them. Using modularity-based and energy-model layout community detection algorithms, we link the network clusters to 9 relevant pharmacological properties. Out of the 1141 drugs from the DrugBank 4.1 database, our extensive literature survey and cross-checking with other databases such as Drugs.com, RxList, and DrugBank 4.3 confirm the predicted properties for 85% of the drugs. As such, we argue that network analysis offers a high-level grasp on a wide area of pharmacological aspects, indicating possible unaccounted interactions and missing pharmacological properties that can lead to drug repositioning for the 15% of drugs which seem to be inconsistent with the predicted property. Also, by using network centralities, we can rank drugs according to their interaction potential for both simple and complex multi-pathology therapies. Moreover, our clustering approach can be extended for applications such as analyzing drug-target interactions or phenotyping patients in personalized medicine applications. PMID:27599720

  5. Recent developments and applications of saturation transfer difference nuclear magnetic resonance (STD NMR) spectroscopy.

    PubMed

    Wagstaff, Jane L; Taylor, Samantha L; Howard, Mark J

    2013-04-05

This review aims to illustrate that STD NMR is not simply a method for drug screening and discovery, but has qualitative and quantitative applications that can answer fundamental and applied biological and biomedical questions involving molecular interactions between ligands and proteins. We begin with a basic introduction to the technique of STD NMR and report on recent advances and biological applications of STD, including studies to follow the interactions of non-steroidal anti-inflammatories, minimum binding requirements for virus infection, and understanding the inhibition of amyloid fibre formation. We expand on this introduction by reporting recent STD NMR studies of live-cell receptor systems, new methodologies using scanning STD, magic-angle spinning STD, and approaches to use STD NMR in a quantitative fashion for the determination of dissociation constants and group epitope mapping (GEM). We finish by outlining new approaches that have the potential to influence future applications of the technique: NMR isotope editing, heteronuclear multidimensional STD, and (19)F STD methods that are becoming more amenable due to the latest NMR equipment technologies.

  6. The McGill Interactive Pediatric OncoGenetic Guidelines: An approach to identifying pediatric oncology patients most likely to benefit from a genetic evaluation.

    PubMed

    Goudie, Catherine; Coltin, Hallie; Witkowski, Leora; Mourad, Stephanie; Malkin, David; Foulkes, William D

    2017-08-01

    Identifying cancer predisposition syndromes in children with tumors is crucial, yet few clinical guidelines exist to identify children at high risk of having germline mutations. The McGill Interactive Pediatric OncoGenetic Guidelines project aims to create a validated pediatric guideline in the form of a smartphone/tablet application using algorithms to process clinical data and help determine whether to refer a child for genetic assessment. This paper discusses the initial stages of the project, focusing on its overall structure, the methodology underpinning the algorithms, and the upcoming algorithm validation process. © 2017 Wiley Periodicals, Inc.

  7. Systems biology: An emerging strategy for discovering novel pathogenetic mechanisms that promote cardiovascular disease.

    PubMed

    Maron, Bradley A; Leopold, Jane A

    2016-09-30

    Reductionist theory proposes that analyzing complex systems according to their most fundamental components is required for problem resolution, and has served as the cornerstone of scientific methodology for more than four centuries. However, technological gains in the current scientific era now allow for the generation of large datasets that profile the proteomic, genomic, and metabolomic signatures of biological systems across a range of conditions. The accessibility of data on such a vast scale has, in turn, highlighted the limitations of reductionism, which is not conducive to analyses that consider multiple and contemporaneous interactions between intermediates within a pathway or across constructs. Systems biology has emerged as an alternative approach to analyze complex biological systems. This methodology is based on the generation of scale-free networks and, thus, provides a quantitative assessment of relationships between multiple intermediates, such as protein-protein interactions, within and between pathways of interest. In this way, systems biology is well positioned to identify novel targets implicated in the pathogenesis or treatment of diseases. In this review, the historical root and fundamental basis of systems biology, as well as the potential applications of this methodology are discussed with particular emphasis on integration of these concepts to further understanding of cardiovascular disorders such as coronary artery disease and pulmonary hypertension.
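    The scale-free networks that systems biology builds on are commonly modelled by preferential attachment, where new nodes link to existing nodes with probability proportional to their current degree. A minimal sketch (with illustrative parameters, not biological data) shows the characteristic heavy-tailed degree distribution emerging:

```python
import random

random.seed(0)

def grow(n_nodes, m=2):
    """Preferential-attachment growth: each new node attaches to m
    existing nodes chosen with probability proportional to degree."""
    targets = [0, 1]      # degree-weighted list of attachment stubs
    edges = [(0, 1)]
    for new in range(2, n_nodes):
        chosen = set()
        while len(chosen) < min(m, new):
            chosen.add(random.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets += [new, t]
    return edges

edges = grow(2000)
deg = {}
for u, v in edges:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1

# Heavy tail: a few hubs carry far more links than the typical node.
degrees = sorted(deg.values(), reverse=True)
print("max degree:", degrees[0], " median degree:", degrees[len(degrees) // 2])
```

    The large gap between the maximum and median degree is the hub structure that makes such networks useful for prioritizing targets: a handful of highly connected proteins dominate the interaction map.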

  8. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  9. Model-Driven Approach for Body Area Network Application Development.

    PubMed

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-05-12

This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.
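    The generative, meta-programming step can be sketched as a template instantiated from a feature selection. The feature names and the generated controller fragment below are purely illustrative, not the paper's actual meta-language:

```python
# A toy generator: map problem-domain features to emitted controller code,
# in the spirit of instantiating a meta-specification per feature selection.
FEATURES = {"sensor": "temperature", "interval_s": 5, "encrypt": True}

TEMPLATE = """\
def read_and_send():
    value = read_{sensor}()
    payload = encrypt(value) if {encrypt} else value
    send(payload)

SAMPLE_INTERVAL = {interval_s}
"""

generated = TEMPLATE.format(**FEATURES)
print(generated)
```

    Re-running the generator with different feature values (the "interactive adjustment of the meta-parameter values") yields the family of concrete sensor-controller programs.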

  10. Model-Driven Approach for Body Area Network Application Development

    PubMed Central

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-01-01

This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application. PMID:27187394

  11. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    NASA Astrophysics Data System (ADS)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as an analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and to encouraging the development of standardised, modular systems dynamics models that can be used in social science research.
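The stock-and-flow core of a systems dynamics model is compact enough to sketch directly. The model below is entirely hypothetical (not from the article): a single latent construct, "motivation", is treated as a stock calibrated to a 0-100 scale, the kind of operationalized range the measurement-theory argument calls for, and is integrated forward with a simple Euler step.

```python
# Hypothetical stock-and-flow sketch: one latent construct ("motivation")
# adjusts toward a goal level; flow = adjustment_rate * (goal - stock).
def simulate_motivation(initial=50.0, goal=80.0, adjustment_rate=0.2,
                        dt=1.0, steps=20):
    """Euler-integrate a single-stock goal-seeking structure."""
    stock = initial
    trajectory = [stock]
    for _ in range(steps):
        flow = adjustment_rate * (goal - stock)  # inflow (negative = outflow)
        stock += flow * dt
        trajectory.append(stock)
    return trajectory

traj = simulate_motivation()
```

With 0 < adjustment_rate * dt < 1 the stock approaches the goal monotonically without overshoot, which is the classic first-order goal-seeking behaviour mode.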

  12. A biased review of biases in Twitter studies on political collective action

    NASA Astrophysics Data System (ADS)

    Cihon, Peter; Yasseri, Taha

    2016-08-01

    In recent years researchers have gravitated to Twitter and other social media platforms as fertile ground for empirical analysis of social phenomena. Social media provides researchers access to trace data of interactions and discourse that once went unrecorded in the offline world. Researchers have sought to use these data to explain social phenomena both particular to social media and applicable to the broader social world. This paper offers a minireview of Twitter-based research on political crowd behaviour. This literature offers insight into particular social phenomena on Twitter, but often fails to use standardized methods that permit interpretation beyond individual studies. Moreover, the literature fails to ground methodologies and results in social or political theory, divorcing empirical research from the theory needed to interpret it. Rather, investigations focus primarily on methodological innovations for social media analyses, but these too often fail to sufficiently demonstrate the validity of such methodologies. This minireview considers a small number of selected papers; we analyse their (often lack of) theoretical approaches, review their methodological innovations, and offer suggestions as to the relevance of their results for political scientists and sociologists.

  13. Considering the cumulative risk of mixtures of chemicals – A challenge for policy makers

    PubMed Central

    2012-01-01

    Background The current paradigm for the assessment of the health risk of chemical substances focuses primarily on the effects of individual substances for determining the doses of toxicological concern in order to inform appropriately the regulatory process. These policy instruments place varying requirements on health and safety data of chemicals in the environment. REACH focuses on safety of individual substances; yet all the other facets of public health policy that relate to chemical stressors put emphasis on the effects of combined exposure to mixtures of chemical and physical agents. This emphasis brings about methodological problems linked to the complexity of the respective exposure pathways; the effect (more complex than simple additivity) of mixtures (the so-called 'cocktail effect'); dose extrapolation, i.e. the extrapolation of the validity of dose-response data to dose ranges that extend beyond the levels used for the derivation of the original dose-response relationship; the integrated use of toxicity data across species (including human clinical, epidemiological and biomonitoring data); and variation in inter-individual susceptibility associated with both genetic and environmental factors. Methods In this paper we give an overview of the main methodologies available today to estimate the human health risk of environmental chemical mixtures, ranging from dose addition to independent action, and from ignoring interactions among the mixture constituents to modelling their biological fate taking into account the biochemical interactions affecting both internal exposure and the toxic potency of the mixture. Results We discuss their applicability, possible options available to policy makers and the difficulties and potential pitfalls in implementing these methodologies in the frame of the currently existing policy framework in the European Union. 
Finally, we suggest a pragmatic solution for policy/regulatory action that would facilitate the evaluation of the health effects of chemical mixtures in the environment and consumer products. Conclusions One universally applicable methodology does not yet exist. Therefore, a pragmatic, tiered approach to regulatory risk assessment of chemical mixtures is suggested, encompassing (a) the use of dose addition to calculate a hazard index that takes into account interactions among mixture components; and (b) the use of the connectivity approach in data-rich situations to integrate mechanistic knowledge at different scales of biological organization. PMID:22759500
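The dose-addition hazard index in step (a) of the suggested tiered approach is a standard formula: the sum of each component's exposure divided by its reference dose. The interaction adjustment below is only an illustrative stand-in (with invented factors) for the interaction-aware step the authors propose; it is not their method.

```python
# Dose-addition hazard index; HI >= 1 flags potential concern.
def hazard_index(exposures, reference_doses):
    return sum(e / rd for e, rd in zip(exposures, reference_doses))

# Illustrative variant: weight each hazard quotient by an assumed pairwise
# interaction factor (>1 synergy, <1 antagonism, 1 plain additivity).
def interaction_adjusted_hi(exposures, reference_doses, interaction_factors):
    return sum(f * e / rd for e, rd, f in
               zip(exposures, reference_doses, interaction_factors))

hi = hazard_index([0.2, 0.5], [1.0, 2.0])                     # 0.45
hi_adj = interaction_adjusted_hi([0.2, 0.5], [1.0, 2.0], [2.0, 1.0])  # 0.65
```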

  14. [Pharmacological treatment conciliation methodology in patients with multiple conditions].

    PubMed

    Alfaro-Lara, Eva Rocío; Vega-Coca, María Dolores; Galván-Banqueri, Mercedes; Nieto-Martín, María Dolores; Pérez-Guerrero, Concepción; Santos-Ramos, Bernardo

    2014-02-01

    To carry out a bibliographic review to identify the different methodologies used in the medication reconciliation process applicable to polypathological patients. We performed a literature review. Data sources The bibliographic review (February 2012) included the following databases: PubMed, EMBASE, CINAHL, PsycINFO and the Spanish Medical Index (IME). The different methodologies identified in those databases for carrying out the reconciliation process in polypathological patients, or failing that, in elderly or polymedicated patients, were studied. Study selection Two hundred and seventy-three articles were retrieved, of which 25 were selected. Data extraction Specifically: the level of care, the sources of information, the use of registration forms, the established time frame, the professional in charge, and the recorded variables, such as reconciliation errors. Most of the selected studies performed reconciliation when the patient was admitted to hospital and after discharge. The main sources of information are the patient interview and the patient's medical history. Most studies do not explicitly state an established time frame, nor do they use a registration form. The main professional in charge is the clinical pharmacologist. Apart from home medication, self-medication habits and phytotherapy are also recorded. Common reconciliation errors range from the omission of drugs to different forms of interaction with other medicinal products (drug interactions). There is large heterogeneity in the methodologies used for reconciliation. No work addresses the specific figure of the polypathological patient, who precisely requires a standardized methodology given the complexity of these patients and their susceptibility to reconciliation errors. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  15. Smartfiles: An OO approach to data file interoperability

    NASA Technical Reports Server (NTRS)

    Haines, Matthew; Mehrotra, Piyush; Vanrosendale, John

    1995-01-01

    Data files for scientific and engineering codes typically consist of a series of raw data values whose descriptions are buried in the programs that interact with these files. In this situation, making even minor changes in the file structure, or sharing files between programs (interoperability), can only be done after careful examination of the data file and the I/O statements of the programs interacting with it. In short, scientific data files lack self-description, and existing self-describing data techniques are not always appropriate or useful for scientific data files. By applying an object-oriented methodology to data files, we can add the intelligence required to improve data interoperability and provide an elegant mechanism for supporting complex, evolving, or multidisciplinary applications, while still supporting legacy codes. As a result, scientists and engineers should be able to share datasets with far greater ease, simplifying multidisciplinary applications and greatly facilitating remote collaboration between scientists.
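The self-describing-file idea can be sketched as a header-plus-payload format: a machine-readable header names and types the fields, so a consuming program no longer hard-codes the layout in its I/O statements. The class and field layout below are hypothetical illustrations, not the Smartfiles API.

```python
# Hypothetical sketch of a self-describing data file: a JSON header
# (field names + struct type codes) followed by packed binary records.
import json
import struct

class SmartFile:
    def __init__(self, fields):
        self.fields = fields                       # e.g. [("pressure", "d")]
        self.fmt = "".join(t for _, t in fields)   # struct format string

    def dumps(self, records):
        header = json.dumps({"fields": self.fields}).encode()
        body = b"".join(struct.pack(self.fmt, *rec) for rec in records)
        return struct.pack("I", len(header)) + header + body

    def loads(self, blob):
        off = struct.calcsize("I")
        (hlen,) = struct.unpack_from("I", blob)
        header = json.loads(blob[off:off + hlen].decode())
        size = struct.calcsize(self.fmt)
        body = blob[off + hlen:]
        return header, [struct.unpack_from(self.fmt, body, i * size)
                        for i in range(len(body) // size)]

sf = SmartFile([("pressure", "d"), ("temp", "d")])
blob = sf.dumps([(101.3, 288.0), (95.1, 280.5)])
header, records = sf.loads(blob)
```

A reader that only has `blob` can recover the field names and types from the header before touching the payload, which is the interoperability property the abstract argues for.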

  16. A new framework for interactive quality assessment with application to light field coding

    NASA Astrophysics Data System (ADS)

    Viola, Irene; Ebrahimi, Touradj

    2017-09-01

    In recent years, light field has experienced a surge of popularity, mainly due to recent advances in acquisition and rendering technologies that have made it more accessible to the public. Thanks to image-based rendering techniques, light field contents can be rendered in real time on common 2D screens, allowing virtual navigation through the captured scenes in an interactive fashion. However, this richer representation of the scene poses the problem of reliable quality assessment for light field contents. In particular, while subjective methodologies that enable interaction have already been proposed, no work has been done on assessing how users interact with light field contents. In this paper, we propose a new framework to subjectively assess the quality of light field contents in an interactive manner and simultaneously track users' behaviour. The framework is successfully used to perform subjective assessment of two coding solutions. Moreover, statistical analysis performed on the results shows an interesting correlation between subjective scores and average interaction time.

  17. A Proposed Performance-Based System for Teacher Interactive Electronic Continuous Professional Development (TIE-CPD)

    ERIC Educational Resources Information Center

    Razak, Rafiza Abdul; Yusop, Farrah Dina; Idris, Aizal Yusrina; Al-Sinaiyah, Yanbu; Halili, Siti Hajar

    2016-01-01

    The paper introduces Teacher Interactive Electronic Continuous Professional Development (TIE-CPD), an online interactive training system. The framework and methodology of TIE-CPD are designed with functionalities comparable with existing e-training systems. The system design and development literature offers several methodology and framework…

  18. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  19. Panning for the gold in health research: incorporating studies' methodological quality in meta-analysis.

    PubMed

    Johnson, Blair T; Low, Robert E; MacDonald, Hayley V

    2015-01-01

    Systematic reviews now routinely assess methodological quality to gauge the validity of the included studies and of the synthesis as a whole. Although trends from higher quality studies should be clearer, it is uncertain how often meta-analyses incorporate methodological quality in models of study results either as predictors, or, more interestingly, in interactions with theoretical moderators. We survey 200 meta-analyses in three health promotion domains to examine when and how meta-analyses incorporate methodological quality. Although methodological quality assessments commonly appear in contemporary meta-analyses (usually as scales), they are rarely incorporated in analyses, and still more rarely analysed in interaction with theoretical determinants of the success of health promotions. The few meta-analyses (2.5%) that did include such an interaction analysis showed that moderator results remained significant in higher quality studies or were present only among higher quality studies. We describe how to model quality interactively with theoretically derived moderators and discuss strengths and weaknesses of this approach and in relation to current meta-analytic practice. In large literatures exhibiting heterogeneous effects, meta-analyses can incorporate methodological quality and generate conclusions that enable greater confidence not only about the substantive phenomenon but also about the role that methodological quality itself plays.
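The quality-by-moderator interaction analysis the authors advocate can be illustrated with a minimal stratified comparison: split studies at a quality cutoff and ask whether the moderator's effect survives in the higher-quality stratum. The study data, weights, and cutoff below are invented for illustration; this is a sketch of the idea, not the survey's analysis code.

```python
# Invented example: each study is (effect_size, inverse_variance_weight,
# quality_score, moderator_present). We compare the moderator's effect
# within high- vs low-quality strata.
def weighted_mean(effects, weights):
    return sum(e * w for e, w in zip(effects, weights)) / sum(weights)

def moderator_effect_by_quality(studies, quality_cutoff):
    """Return moderator effect (with - without) per quality stratum.
    Assumes each stratum contains studies both with and without the
    moderator; a real meta-regression would model this continuously."""
    out = {}
    for label, keep in (("high", lambda q: q >= quality_cutoff),
                        ("low", lambda q: q < quality_cutoff)):
        subset = [s for s in studies if keep(s[2])]
        with_mod = [(e, w) for e, w, _, m in subset if m]
        without = [(e, w) for e, w, _, m in subset if not m]
        out[label] = (weighted_mean(*zip(*with_mod)) -
                      weighted_mean(*zip(*without)))
    return out

studies = [(0.6, 10, 0.9, True), (0.2, 10, 0.9, False),
           (0.5, 10, 0.3, True), (0.5, 10, 0.3, False)]
effects = moderator_effect_by_quality(studies, quality_cutoff=0.5)
```

In this made-up dataset the moderator matters only among higher-quality studies, the pattern the surveyed meta-analyses reported.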

  20. Application of experimental design in geothermal resources assessment of Ciwidey-Patuha, West Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Ashat, Ali; Pratama, Heru Berian

    2017-12-01

    Assessing the size of the Ciwidey-Patuha geothermal field required integrated analysis of all data to determine the optimum capacity to be installed. Resource assessment involves significant uncertainty in subsurface information and multiple development scenarios for the field. Therefore, this paper applies an experimental design approach to the geothermal numerical simulation of Ciwidey-Patuha to generate a probabilistic resource assessment. The process assesses the impact of the evaluated parameters on resources and the interactions between these parameters. The methodology successfully estimated the maximum resources with a polynomial function covering the entire range of possible values of the important reservoir parameters.

  1. Design and development of an IBM/VM menu system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cazzola, D.J.

    1992-10-01

    This report describes a full screen menu system developed using IBM's Interactive System Productivity Facility (ISPF) and the REXX programming language. The software was developed for the 2800 IBM/VM Electrical Computer Aided Design (ECAD) system. The system was developed to deliver electronic drawing definitions to a corporate drawing release system. Although this report documents the status of the menu system when it was retired, the methodologies used and the requirements defined are very applicable to replacement systems.

  2. Parallel Visualization Co-Processing of Overnight CFD Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Edwards, David E.; Haimes, Robert

    1999-01-01

    An interactive visualization system pV3 is being developed for the investigation of advanced computational methodologies employing visualization and parallel processing for the extraction of information contained in large-scale transient engineering simulations. Visual techniques for extracting information from the data in terms of cutting planes, iso-surfaces, particle tracing and vector fields are included in this system. This paper discusses improvements to the pV3 system developed under NASA's Affordable High Performance Computing project.

  3. Quantifying host potentials: indexing postharvest fresh fruits for spotted wing Drosophila, Drosophila suzukii.

    PubMed

    Bellamy, David E; Sisterson, Mark S; Walse, Spencer S

    2013-01-01

    Novel methodology is presented for indexing the relative potential of hosts to function as resources. A Host Potential Index (HPI) was developed as a practical framework to express relative host potential based on combining results from one or more independent studies, such as those examining host selection, utilization, and physiological development of the organism resourcing the host. Several aspects of the HPI are addressed including: 1) model derivation; 2) influence of experimental design on establishing host rankings for a study type (no choice, two-choice, and multiple-choice); and, 3) variable selection and weighting associated with combining multiple studies. To demonstrate application of the HPI, results from the interactions of spotted wing drosophila (SWD), Drosophila suzukii Matsumura (Diptera: Drosophilidae), with seven "reported" hosts (blackberries, blueberries, sweet cherries, table grapes, peaches, raspberries, and strawberries) in a postharvest scenario were analyzed. Four aspects of SWD-host interaction were examined: attraction to host volatiles; population-level oviposition performance; individual-level oviposition performance; and key developmental factors. Application of HPI methodology indicated that raspberries (mean HPI_varied = 301.9±8.39; rank 1 of 7) have the greatest potential to serve as a postharvest host for SWD relative to the other fruit hosts, with grapes (mean HPI_varied = 232.4±3.21; rank 7 of 7) having the least potential.

  4. From face to interface recognition: a differential geometric approach to distinguish DNA from RNA binding surfaces

    PubMed Central

    Shazman, Shula; Elber, Gershon; Mandel-Gutfreund, Yael

    2011-01-01

    Protein nucleic acid interactions play a critical role in all steps of the gene expression pathway. Nucleic acid (NA) binding proteins interact with their partners, DNA or RNA, via distinct regions on their surface that are characterized by an ensemble of chemical, physical and geometrical properties. In this study, we introduce a novel methodology based on differential geometry, commonly used in face recognition, to characterize and predict NA binding surfaces on proteins. Applying the method on experimentally solved three-dimensional structures of proteins, we successfully distinguish double-stranded DNA (dsDNA) binding proteins from single-stranded RNA (ssRNA) binding proteins, with 83% accuracy. We show that the method is insensitive to conformational changes that occur upon binding and can be applied to de novo protein-function prediction. Remarkably, when concentrating on the zinc finger motif, we distinguish successfully between RNA and DNA binding interfaces possessing the same binding motif even within the same protein, as demonstrated for the RNA polymerase transcription-factor, TFIIIA. In conclusion, we present a novel methodology to characterize protein surfaces, which can accurately tell apart dsDNA from ssRNA binding interfaces. The strength of our method in recognizing fine-tuned differences on NA binding interfaces makes it applicable to many other molecular recognition problems, with potential implications for drug design. PMID:21693557

  5. Quantifying Host Potentials: Indexing Postharvest Fresh Fruits for Spotted Wing Drosophila, Drosophila suzukii

    PubMed Central

    Bellamy, David E.; Sisterson, Mark S.; Walse, Spencer S.

    2013-01-01

    Novel methodology is presented for indexing the relative potential of hosts to function as resources. A Host Potential Index (HPI) was developed as a practical framework to express relative host potential based on combining results from one or more independent studies, such as those examining host selection, utilization, and physiological development of the organism resourcing the host. Several aspects of the HPI are addressed including: 1) model derivation; 2) influence of experimental design on establishing host rankings for a study type (no choice, two-choice, and multiple-choice); and, 3) variable selection and weighting associated with combining multiple studies. To demonstrate application of the HPI, results from the interactions of spotted wing drosophila (SWD), Drosophila suzukii Matsumura (Diptera: Drosophilidae), with seven “reported” hosts (blackberries, blueberries, sweet cherries, table grapes, peaches, raspberries, and strawberries) in a postharvest scenario were analyzed. Four aspects of SWD-host interaction were examined: attraction to host volatiles; population-level oviposition performance; individual-level oviposition performance; and key developmental factors. Application of HPI methodology indicated that raspberries (mean HPI_varied = 301.9±8.39; rank 1 of 7) have the greatest potential to serve as a postharvest host for SWD relative to the other fruit hosts, with grapes (mean HPI_varied = 232.4±3.21; rank 7 of 7) having the least potential. PMID:23593439
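The index-combination idea behind the HPI can be sketched as a weighted sum of per-study scores, each normalized within its study so that no single assay dominates on scale alone. The function, weights, and scores below are invented for illustration; the actual HPI derivation and weighting scheme are given in the paper.

```python
# Hypothetical sketch of combining independent study types into one
# host-potential index (not the published HPI formula).
def host_potential_index(scores_by_study, weights):
    """scores_by_study: {study: {host: score}}; weights: {study: weight}.
    Each study's scores are divided by that study's maximum before the
    weighted sum, so studies on different scales combine comparably."""
    hosts = set().union(*[s.keys() for s in scores_by_study.values()])
    hpi = {}
    for host in hosts:
        total = 0.0
        for study, scores in scores_by_study.items():
            peak = max(scores.values())
            total += weights[study] * scores.get(host, 0.0) / peak
        hpi[host] = total
    return sorted(hpi.items(), key=lambda kv: kv[1], reverse=True)

ranking = host_potential_index(
    {"attraction": {"raspberry": 9.0, "grape": 2.0},
     "oviposition": {"raspberry": 30.0, "grape": 5.0}},
    weights={"attraction": 1.0, "oviposition": 2.0})
```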

  6. Beyond Classical Information Theory: Advancing the Fundamentals for Improved Geophysical Prediction

    NASA Astrophysics Data System (ADS)

    Perdigão, R. A. P.; Pires, C. L.; Hall, J.; Bloeschl, G.

    2016-12-01

    Information Theory, in its original and quantum forms, has gradually made its way into various fields of science and engineering. From the very basic concepts of Information Entropy and Mutual Information to Transit Information, Interaction Information and its partitioning into statistical synergy, redundancy and exclusivity, the overall theoretical foundations had matured by the mid-twentieth century. In the Earth Sciences various interesting applications have been devised over the last few decades, such as the design of complex process networks of descriptive and/or inferential nature, wherein earth system processes are "nodes" and statistical relationships between them are designed as information-theoretical "interactions". However, most applications still take the very early concepts along with their many caveats, especially in heavily non-Normal, non-linear and structurally changing scenarios. In order to overcome the traditional limitations of information theory and tackle elusive Earth System phenomena, we introduce a new suite of information dynamic methodologies towards a more physically consistent and information comprehensive framework. The methodological developments are then illustrated on a set of practical examples from geophysical fluid dynamics, where high-order nonlinear relationships elusive to the current non-linear information measures are aptly captured. In doing so, these advances increase the predictability of critical events such as the emergence of hyper-chaotic regimes in ocean-atmospheric dynamics and the occurrence of hydro-meteorological extremes.
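The basic quantities the abstract builds on can be stated directly for discrete samples: Shannon entropy H, and mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y), from which interaction-information partitions are then derived.

```python
# Shannon entropy and mutual information for discrete samples.
from collections import Counter
from math import log2

def entropy(samples):
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Perfectly coupled "processes": knowing one removes all uncertainty
# about the other, so the shared information equals one full bit.
mi = mutual_information([0, 1, 0, 1], [0, 1, 0, 1])   # 1.0 bit
```

Plug-in estimates like these are exactly the "very early concepts" the authors note can mislead in non-Normal, nonlinear settings, motivating their extended measures.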

  7. Investigating Dynamics of Eccentricity in Turbomachines

    NASA Technical Reports Server (NTRS)

    Baun, Daniel

    2010-01-01

    A methodology (and hardware and software to implement the methodology) has been developed as a means of investigating coupling between certain rotordynamic and hydrodynamic phenomena in turbomachines. Originally, the methodology was intended for application in an investigation of coupled rotordynamic and hydrodynamic effects postulated to have caused high synchronous vibration in the space shuttle's high-pressure oxygen turbopump (HPOTP). The methodology can also be applied in investigating (for the purpose of developing means of suppressing) undesired hydrodynamic rotor/stator interactions in turbomachines in general. The methodology and the types of phenomena that can be investigated by use of the methodology are best summarized by citing the original application as an example. In that application, in consideration of the high synchronous vibration in the space-shuttle main engine (SSME) HPOTP, it was determined to be necessary to perform tests to investigate the influence of inducer eccentricity and/or synchronous whirl motion on inducer hydrodynamic forces under prescribed flow and cavitation conditions. It was believed that manufacturing tolerances of the turbopump resulted in some induced runout of the pump rotor. Such runout, if oriented with an inducer blade, would cause that blade to run with tip clearance smaller than the tip clearances of the other inducer blades. It was hypothesized that the resulting hydraulic asymmetry, coupled with alternating blade cavitation, could give rise to the observed high synchronous vibration. In tests performed to investigate this hypothesis, prescribed rotor whirl motions have been imposed on a 1/3-scale water-rig version of the SSME LPOTP inducer (which is also a 4-bladed inducer having similar cavitation dynamics as the HPOTP) in a magnetic-bearing test facility. 
The particular magnetic-bearing test facility, through active vibration control, affords a capability to impose, on the rotor, whirl orbits having shapes and whirl rates prescribed by the user, and to simultaneously measure the resulting hydrodynamic forces generated by the impeller. Active control also made it possible to modulate the inducer-blade running tip clearance and consequently effect alternating blade cavitation. The measured hydraulic forces have been compared and correlated with shroud dynamic-pressure measurements.

  8. Prediction of protein-peptide interactions: application of the XPairIt API to anthrax lethal factor and substrates

    NASA Astrophysics Data System (ADS)

    Hurley, Margaret M.; Sellers, Michael S.

    2013-05-01

    As software and methodology develop, key aspects of molecular interactions such as detailed energetics and flexibility are continuously better represented in docking simulations. In the latest iteration of the XPairIt API and Docking Protocol, we perform a blind dock of a peptide into the cleavage site of the Anthrax lethal factor (LF) metalloprotein. Molecular structures are prepared from RCSB:1JKY and we demonstrate a reasonably accurate docked peptide through analysis of protein motion and, using NCI Plot, visualize and characterize the forces leading to binding. We compare our docked structure to the 1JKY crystal structure and the more recent 1PWV structure, and discuss both captured and overlooked interactions. Our results offer a more detailed look at secondary contact and show that both van der Waals and electrostatic interactions from peptide residues further from the enzyme's catalytic site are significant.

  9. 4D-LQTA-QSAR and docking study on potent Gram-negative specific LpxC inhibitors: a comparison to CoMFA modeling.

    PubMed

    Ghasemi, Jahan B; Safavi-Sohi, Reihaneh; Barbosa, Euzébio G

    2012-02-01

    A quasi 4D-QSAR has been carried out on a series of potent Gram-negative LpxC inhibitors. This approach makes use of the molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package. The methodology is based on the generation of a conformational ensemble profile, CEP, for each compound instead of only one conformation, followed by the calculation of intermolecular interaction energies at each grid point considering probes and all aligned conformations resulting from MD simulations. These interaction energies are the independent variables employed in the QSAR analysis. The proposed methodology was compared to the comparative molecular field analysis (CoMFA) formalism; it jointly explores the main features of CoMFA and 4D-QSAR models. Step-wise multiple linear regression was used to select the most informative variables. After variable selection, multiple linear regression (MLR) and partial least squares (PLS) methods were used to build the regression models. Leave-N-out cross-validation (LNO) and Y-randomization were performed to confirm the robustness of the models, in addition to analysis of an independent test set. The best models provided the following statistics: [Formula in text] (PLS) and [Formula in text] (MLR). A docking study was performed to investigate the major interactions in the protein-ligand complex with the CDOCKER algorithm. Visualization of the descriptors of the best model helps us interpret the model from a chemical point of view, supporting the applicability of this new approach in rational drug design.

  10. General implementation of arbitrary nonlinear quadrature phase gates

    NASA Astrophysics Data System (ADS)

    Marek, Petr; Filip, Radim; Ogawa, Hisashi; Sakaguchi, Atsushi; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira

    2018-02-01

    We propose general methodology of deterministic single-mode quantum interaction nonlinearly modifying single quadrature variable of a continuous-variable system. The methodology is based on linear coupling of the system to ancillary systems subsequently measured by quadrature detectors. The nonlinear interaction is obtained by using the data from the quadrature detection for dynamical manipulation of the coupling parameters. This measurement-induced methodology enables direct realization of arbitrary nonlinear quadrature interactions without the need to construct them from the lowest-order gates. Such nonlinear interactions are crucial for more practical and efficient manipulation of continuous quadrature variables as well as qubits encoded in continuous-variable systems.

  11. Motor Rehabilitation Using Kinect: A Systematic Review.

    PubMed

    Da Gama, Alana; Fallavollita, Pascal; Teichrieb, Veronica; Navab, Nassir

    2015-04-01

    Interactive systems are being developed with the intention of helping engage patients in various therapies. Amid the recent technological advances, Kinect™ from Microsoft (Redmond, WA) has helped pave the way on how user interaction technology facilitates and complements many clinical applications. In order to examine the actual status of Kinect developments for rehabilitation, this article presents a systematic review of articles that involve interactive, evaluative, and technical advances related to motor rehabilitation. A systematic search was performed in the IEEE Xplore and PubMed databases using the key word combination "Kinect AND rehabilitation" with the following inclusion criteria: (1) English language, (2) page count >4, (3) Kinect system for assistive interaction or clinical evaluation, or (4) Kinect system for improvement or evaluation of the sensor tracking or movement recognition. Quality assessment was performed by QualSyst standards. In total, 109 articles were found in the database search, from which 31 were included in the review: 13 were focused on the development of assistive systems for rehabilitation, 3 on evaluation, 3 in the applicability category, 7 on validation of Kinect anatomic and clinical evaluation, and 5 on improvement techniques. Quality analysis of all included articles is also presented with their respective QualSyst checklist scores. Possibilities for research, development, and future work with the Kinect for rehabilitation applications are extensive. Methodological improvements when performing studies in this area need to be further investigated.

  12. River water quality management considering agricultural return flows: application of a nonlinear two-stage stochastic fuzzy programming.

    PubMed

    Tavakoli, Ali; Nikoo, Mohammad Reza; Kerachian, Reza; Soltani, Maryam

    2015-04-01

    In this paper, a new fuzzy methodology is developed to optimize water and waste load allocation (WWLA) in rivers under uncertainty. An interactive two-stage stochastic fuzzy programming (ITSFP) method is utilized to handle parameter uncertainties, which are expressed as fuzzy boundary intervals. An iterative linear programming (ILP) is also used for solving the nonlinear optimization model. To accurately consider the impacts of the water and waste load allocation strategies on the river water quality, a calibrated QUAL2Kw model is linked with the WWLA optimization model. The soil, water, atmosphere, and plant (SWAP) simulation model is utilized to determine the quantity and quality of each agricultural return flow. To control pollution loads of agricultural networks, it is assumed that a part of each agricultural return flow can be diverted to an evaporation pond and also another part of it can be stored in a detention pond. In detention ponds, contaminated water is exposed to solar radiation for disinfecting pathogens. Results of applying the proposed methodology to the Dez River system in the southwestern region of Iran illustrate its effectiveness and applicability for water and waste load allocation in rivers. In the planning phase, this methodology can be used for estimating the capacities of return flow diversion system and evaporation and detention ponds.

  13. Analysis of Feedback Processes in Online Group Interaction: A Methodological Model

    ERIC Educational Resources Information Center

    Espasa, Anna; Guasch, Teresa; Alvarez, Ibis M.

    2013-01-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the…

  14. The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence

    DTIC Science & Technology

    2017-09-01

    A research project, Jul 2017, by CW2 Stockham, Braden E. National... forensic science resources, law enforcement methodologies and procedures, and basic investigative training. In order to determine if these changes would

  15. A succinct overview of virtual reality technology use in Alzheimer's disease.

    PubMed

    García-Betances, Rebeca I; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer's disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers' education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments.

  16. CLIP-related methodologies and their application to retrovirology.

    PubMed

    Bieniasz, Paul D; Kutluay, Sebla B

    2018-05-02

    Virtually every step of HIV-1 replication and numerous cellular antiviral defense mechanisms are regulated by the binding of a viral or cellular RNA-binding protein (RBP) to distinct sequence or structural elements on HIV-1 RNAs. Until recently, these protein-RNA interactions were studied largely by in vitro binding assays complemented with genetics approaches. However, these methods are highly limited in the identification of the relevant targets of RBPs in physiologically relevant settings. Development of crosslinking-immunoprecipitation sequencing (CLIP) methodology has revolutionized the analysis of protein-nucleic acid complexes. CLIP combines immunoprecipitation of covalently crosslinked protein-RNA complexes with high-throughput sequencing, providing a global account of RNA sequences bound by an RBP of interest in cells (or virions) at near-nucleotide resolution. Numerous variants of the CLIP protocol have recently been developed, some with major improvements over the original. Herein, we briefly review these methodologies and give examples of how CLIP has been successfully applied to retrovirology research.

  17. An Assessment of IMPAC - Integrated Methodology for Propulsion and Airframe Controls

    NASA Technical Reports Server (NTRS)

    Walker, G. P.; Wagner, E. A.; Bodden, D. S.

    1996-01-01

    This report documents the work done under a NASA sponsored contract to transition to industry technologies developed under the NASA Lewis Research Center IMPAC (Integrated Methodology for Propulsion and Airframe Control) program. The critical steps in IMPAC are exercised on an example integrated flight/propulsion control design for linear airframe/engine models of a conceptual STOVL (Short Take-Off and Vertical Landing) aircraft, and MATRIXX (TM) executive files to implement each step are developed. The results from the example study are analyzed and lessons learned are listed along with recommendations that will improve the application of each design step. The end product of this research is a set of software requirements for developing a user-friendly control design tool which will automate the steps in the IMPAC methodology. Prototypes for a graphical user interface (GUI) are sketched to specify how the tool will interact with the user, and it is recommended to build the tool around existing computer aided control design software packages.

  18. Wall jet analysis for circulation control aerodynamics. Part 1: Fundamental CFD and turbulence modeling concepts

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; York, B. J.; Sinha, N.; Dvorak, F. A.

    1987-01-01

    An overview of parabolic and PNS (Parabolized Navier-Stokes) methodology developed to treat highly curved sub- and supersonic wall jets is presented. The fundamental database to which these models were applied is discussed in detail. The analysis of strong curvature effects was found to require a semi-elliptic extension of the parabolic modeling to account for turbulent contributions to the normal pressure variations, as well as an extension of the turbulence models utilized to account for the highly enhanced mixing rates observed in situations with large convex curvature. A noniterative, pressure-split procedure is shown to extend parabolic models to account for such normal pressure variations in an efficient manner, requiring minimal additional run time over a standard parabolic approach. A new PNS methodology is presented that extends parabolic methodology via the addition of a characteristic-based wave solver. Applications of this approach to analyzing the interaction of wave and turbulence processes in wall jets are presented.

  19. A facile method to screen inhibitors of protein-protein interactions including MDM2-p53 displayed on T7 phage.

    PubMed

    Ishi, Kazutomo; Sugawara, Fumio

    2008-05-01

    Protein-protein interactions are essential in many biological processes, including the cell cycle and apoptosis. It is currently of great medical interest to inhibit specific protein-protein interactions in order to treat a variety of disease states. Here, we describe a facile multiwell plate assay method using T7 phage display to screen for candidate inhibitors of protein-protein interactions. Because T7 phage display is an effective method for detecting protein-protein interactions, we aimed to utilize this technique to screen for small-molecule inhibitors that disrupt these types of interaction. We used the well-characterized interaction between p53 and MDM2, and an inhibitor of this interaction, nutlin 3, as a model system to establish the new screening method. Phage particles displaying p53 interacted with GST-MDM2 immobilized on 96-well plates, and the interaction was inhibited by nutlin 3. A multiwell plate assay was then performed using a natural product library, which identified dehydroaltenusin as a candidate inhibitor of the p53-MDM2 interaction. We discuss the potential applications of this novel T7 phage display methodology, which we propose to call 'reverse phage display'.

  20. Reassessing SERS enhancement factors: using thermodynamics to drive substrate design.

    PubMed

    Guicheteau, J A; Tripathi, A; Emmons, E D; Christesen, S D; Fountain, Augustus W

    2017-12-04

    Over the past 40 years, fundamental and applied research into Surface-Enhanced Raman Scattering (SERS) has been pursued by academia, industry, and government laboratories. To date, however, SERS has achieved little commercial success as an analytical technique. Researchers are pursuing a variety of paths to help break through the commercial barrier by addressing the reproducibility of both SERS substrates and SERS signals, and by continuing to explore the underlying mechanisms. To this end, investigators use a variety of methodologies, typically studying strongly binding analytes such as aromatic thiols and azarenes, and report SERS enhancement factor calculations. A drawback of the traditional SERS enhancement factor calculation, however, is that it does not yield enough information to understand substrate reproducibility, application potential with other analytes, or the driving factors behind the molecule-metal interaction. Our work at the US Army Edgewood Chemical Biological Center has focused on these questions, and we have shown that thermodynamic principles play a key role in the SERS response and are an essential factor in future designs of substrates and applications. This work discusses the advantages and disadvantages of various experimental techniques used to report SERS enhancement with planar SERS substrates and presents our alternative SERS enhancement value. We report on three types of analysis scenarios that each yield different information concerning the effectiveness of the SERS substrate, the practical application of the substrate, and the thermodynamic properties of the substrate. We believe this work will lead to a greater understanding of substrate design, one based on both thermodynamic and plasmonic properties rather than plasmonic properties alone. This new understanding and potential change in substrate design will enable more applications for SERS-based methodologies, including targeting molecules that are traditionally not easily detected with SERS due to the perceived weak molecule-metal interaction with substrates.
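The traditional substrate enhancement factor that the authors critique is conventionally computed as the per-molecule SERS signal divided by the per-molecule normal-Raman signal. A minimal sketch, with invented intensities and molecule counts:

```python
# Traditional SERS substrate enhancement factor:
#   EF = (I_SERS / N_SERS) / (I_ref / N_ref)
# All intensity and molecule-count values below are invented for illustration.

def enhancement_factor(i_sers, n_sers, i_ref, n_ref):
    """Per-molecule SERS signal divided by per-molecule normal-Raman signal."""
    return (i_sers / n_sers) / (i_ref / n_ref)

ef = enhancement_factor(i_sers=1e5, n_sers=1e8,   # counts, molecules probed on substrate
                        i_ref=1e3, n_ref=1e12)    # counts, molecules probed in bulk
print(f"EF = {ef:.1e}")  # 1.0e+06
```

As the abstract notes, this single number says nothing about substrate-to-substrate reproducibility or the thermodynamics of the molecule-metal interaction, which is the motivation for the alternative enhancement value proposed there.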

  1. A way forward for teaching and learning of Physiology: Students’ perception of the effectiveness of teaching methodologies

    PubMed Central

    Rehan, Rabiya; Ahmed, Khalid; Khan, Hira; Rehman, Rehana

    2016-01-01

    Objective: To compare the perception of medical students on the usefulness of interactive lectures, case-based lectures, and structured interactive sessions (SIS) in the teaching and learning of Physiology. Methods: A cross-sectional study was carried out from January to December 2012 at Bahria University Medical & Dental College, Karachi, with qualitative and quantitative aspects assessed by a self-reported questionnaire and focused group discussion (FGD). The questionnaire was distributed to 100 medical students after completion of the first year of MBBS Physiology teaching. The data were analyzed using SPSS version 15. Differences were considered significant at p-values <0.05 after application of the Friedman test. Responses from the FGD were also analyzed. Results: All the teaching methodologies helped in the understanding of precise learning objectives. The comprehension of structure and function, including understanding of difficult concepts, was best achieved by SIS (p=0.04, p<0.01). SIS enabled adult learning, self-directed learning, peer learning, and critical reasoning more than the other teaching strategies (p<0.01). Conclusion: SIS involved students who used reasoning skills and the power of group discussion to comprehend difficult concepts for a better understanding of Physiology, as compared to interactive and case-based lectures. PMID:28083047
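The Friedman test applied in this study is available in SciPy. A minimal sketch with hypothetical ratings (not the study's data) in which SIS is ranked best by every student:

```python
from scipy.stats import friedmanchisquare

# Hypothetical per-student ratings (1 = worst, 3 = best) of the three
# teaching methods; position i in each list is the same student.
interactive_lecture = [2] * 10
case_based_lecture  = [1] * 10
sis                 = [3] * 10   # structured interactive sessions

stat, p = friedmanchisquare(interactive_lecture, case_based_lecture, sis)
print(stat, p)  # chi-square statistic 20.0 for this perfectly consistent ranking
```

The test ranks the three related samples within each student and asks whether the rank sums differ more than chance would allow, which is why it suits repeated ratings by the same respondents.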

  2. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-03

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights into microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.

  3. FY16 Progress Report on Test Results In Support Of Integrated EPP and SMT Design Methods Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanli; Jetter, Robert I.; Sham, T. -L.

    2016-08-08

    The proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology consists of incorporating an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid using the creep-fatigue interaction diagram (the D diagram) and to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed code rules and to verify their applicability, a series of thermomechanical tests have been initiated. This report presents the recent test results for Type 2 SMT specimens on Alloy 617, pressurization SMT on Alloy 617, Type 1 SMT on Gr. 91, and two-bar thermal ratcheting tests on Alloy 617 with a new thermal loading profile.

  4. Disclosure of Temporary Exposures as Permanent Website Applications through the Patrimonial Survey

    NASA Astrophysics Data System (ADS)

    Corso, Juan; Garcia-Almirall, Pilar; López, Daniel; Casals, Jordi

    2017-10-01

    In the context of web applications for the dissemination of cultural heritage, this article advances a methodology for the optimization of point clouds obtained through Terrestrial Laser Scanner (TLS) technology, identifying the potential of TLS surveys as interactive models that allow cultural heritage to be perpetuated over time. This point-cloud optimization is carried out with free software, focusing its exploitation on an interactive web application that has made it possible to convert two temporary museum exhibitions into permanent exhibitions in virtual format, developed in conjunction with the Museu d'Història de la Ciutat de Barcelona (MUHBA). The case study focuses on the Gothic Palau Reial Major, formed by the chapel of Santa Àgata (built in 1302 on the Roman wall) and the Saló del Tinell (built between 1359 and 1370 on Roman remains), located in the Plaça del Rei in the old town of Barcelona. Visual impact is very important in this application: it requires a faithful model of the building interior in terms of color and lighting, avoiding see-through artifacts by using a dense point cloud without occlusions, which in turn requires a large number of scan positions. This implies a clear methodology using techniques such as photographic projection, given the complexity of the building's lighting, both the artificial lighting and the light through the stained glass. In this process, 84 scan positions provided the required point density, and the resulting clouds were optimized with free programs. The temporary exhibitions of the case studies, produced by the MUHBA in the Saló del Tinell, are: "Indianas, 1736-1847. The origins of industrial Barcelona", exhibited from May 19, 2012 to March 3, 2013, and "El Món del 1714", exhibited from December 20 to September 28, 2014. Both were based on a tour of showcases and display cases presenting museum objects such as looms, cloths, dresses, and books, accompanied by panels with texts and images containing the information each exhibition conveyed. Virtual applications allow such temporary exhibitions to become an interactive model in which the information can be consulted permanently: a virtual tour where the user can interact with the information panels and examine the different objects of the exhibition in detail. The results of this work provide a powerful mechanism for disseminating cultural heritage and bringing to society exhibitions that, as a whole, would otherwise disappear.

  5. HyCCAPP as a tool to characterize promoter DNA-protein interactions in Saccharomyces cerevisiae.

    PubMed

    Guillen-Ahlers, Hector; Rao, Prahlad K; Levenstein, Mark E; Kennedy-Darling, Julia; Perumalla, Danu S; Jadhav, Avinash Y L; Glenn, Jeremy P; Ludwig-Kubinski, Amy; Drigalenko, Eugene; Montoya, Maria J; Göring, Harald H; Anderson, Corianna D; Scalf, Mark; Gildersleeve, Heidi I S; Cole, Regina; Greene, Alexandra M; Oduro, Akua K; Lazarova, Katarina; Cesnik, Anthony J; Barfknecht, Jared; Cirillo, Lisa A; Gasch, Audrey P; Shortreed, Michael R; Smith, Lloyd M; Olivier, Michael

    2016-06-01

    Currently available methods for interrogating DNA-protein interactions at individual genomic loci have significant limitations, making it difficult to work with unmodified cells or to examine single-copy regions without specific antibodies. In this study, we describe a physiological application of the Hybridization Capture of Chromatin-Associated Proteins for Proteomics (HyCCAPP) methodology we have developed. Both novel and known locus-specific DNA-protein interactions were identified at the ENO2 and GAL1 promoter regions of Saccharomyces cerevisiae, revealing subgroups of proteins present at significantly different levels at these loci in cells grown on glucose versus galactose as the carbon source. Results were validated using chromatin immunoprecipitation. Overall, our analysis demonstrates that HyCCAPP is an effective and flexible technology that requires neither specific antibodies nor prior knowledge of locally occurring DNA-protein interactions, and can now be used to identify changes in protein interactions at target regions in the genome in response to physiological challenges. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Development of 3D pseudo pin-by-pin calculation methodology in ANC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, B.; Mayhue, L.; Huria, H.

    2012-07-01

    Advanced core and fuel assembly designs have been developed to improve operational flexibility and economic performance and to further enhance the safety features of nuclear power plants. The simulation of these new designs, along with strongly heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in the industrial codes for core analysis. Control rod insertion during normal operation is one operational feature of the AP1000® plant, the Westinghouse next-generation Pressurized Water Reactor (PWR) design. This design improves operational flexibility and efficiency but significantly challenges conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between different fuel assembly types that is not fully captured by the current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology, along with its application and validation, is discussed in this paper. (authors)

  7. Methodology of development and students' perceptions of a psychiatry educational smartphone application.

    PubMed

    Zhang, Melvyn W B; Ho, Cyrus S H; Ho, Roger C M

    2014-01-01

    Smartphone and smartphone-application usage has become markedly more prevalent over the past decade. Previous research has highlighted the lack of critical appraisal of new applications, and has described a method of creating an application using just an Internet browser and a text editor, although this does not eliminate the challenges clinicians face. Moreover, despite the high rate of smartphone application usage and acceptance, developing smartphone applications tailored to clinicians' needs, including their daily educational needs, remains costly for clinicians and their centers. The objectives of the current research are thus to highlight a cost-effective methodology for developing interactive educational smartphone applications, and to determine whether medical students are receptive to smartphone applications and what their perspectives are on the contents within. In this study, we elaborate on how the Mastering Psychiatry Online Portal and web-based mobile application were developed using HTML5 as the core programming language. The online portal and web-based application were launched in July 2012, and usage data were obtained. Subsequently, a native application was developed, funded by an educational grant, and students were recruited after their end-of-posting clinical examination to complete a survey questionnaire on their perspectives. Since inception, the online portal has received a total of 15,803 views, with 2,109 copies of the online textbook downloaded; 5,895 viewers have watched the training videos from start to finish, and 722 users have accessed the mobile textbook application. A total of 185 students participated in the perspectives survey, with the majority expressing positive views about the implementation of a smartphone application in psychiatry. This is one of the few studies that describes how an educational application can be developed using a simple and cost-effective methodology, and it also demonstrates students' perspectives on smartphones in psychiatric education. Our methods may apply to future research involving the use of technology in education.

  8. LQTA-QSAR: a new 4D-QSAR methodology.

    PubMed

    Martins, João Paulo A; Barbosa, Euzébio G; Pasqualoto, Kerly F M; Ferreira, Márcia M C

    2009-06-01

    A novel 4D-QSAR approach which makes use of the molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package is presented in this study. This new methodology, named LQTA-QSAR (LQTA, Laboratório de Quimiometria Teórica e Aplicada), has a module (LQTAgrid) that calculates intermolecular interaction energies at each grid point, considering probes and all aligned conformations resulting from the MD simulations. These interaction energies are the independent variables, or descriptors, employed in a QSAR analysis. The proposed methodology was compared to other 4D-QSAR and CoMFA formalisms using a set of forty-seven glycogen phosphorylase b inhibitors (data set 1) and a set of forty-four MAP p38 kinase inhibitors (data set 2). The QSAR models for both data sets were built using the ordered predictor selection (OPS) algorithm for variable selection. Model validation was carried out by applying y-randomization and leave-N-out cross-validation in addition to external validation. PLS models for data sets 1 and 2 provided the following statistics: q² = 0.72, r² = 0.81 for 12 selected variables and 2 latent variables, and q² = 0.82, r² = 0.90 for 10 selected variables and 5 latent variables, respectively. Visualization of the descriptors in 3D space was successfully interpreted from the chemical point of view, supporting the applicability of this new approach in rational drug design.

  9. A new methodology to determine kinetic parameters for one- and two-step chemical models

    NASA Technical Reports Server (NTRS)

    Mantel, T.; Egolfopoulos, F. N.; Bowman, C. T.

    1996-01-01

    In this paper, a new methodology is presented for determining kinetic parameters for the simple chemical models and simple transport properties classically used in DNS of premixed combustion. First, a one-dimensional code is used to compute a steady, unstrained laminar methane-air flame in order to verify intrinsic features of laminar flames such as the burning velocity and the temperature and concentration profiles. Second, the flame response to steady and unsteady strain in the opposed-jet configuration is numerically investigated. It appears that, for a well-determined set of parameters, one- and two-step mechanisms reproduce the extinction limit of a laminar flame subjected to a steady strain. Computations with the GRI-Mech mechanism (177 reactions, 39 species) and multicomponent transport properties are used to validate these simplified models. A sensitivity analysis of the preferential diffusion of heat and reactants when the Lewis number is close to unity indicates that the response of the flame to an oscillating strain is very sensitive to this number. As an application of this methodology, the interaction between a two-dimensional vortex pair and a premixed laminar flame is simulated by Direct Numerical Simulation (DNS) using the one- and two-step mechanisms. Comparison with the experimental results of Samaniego et al. (1994) shows a significant improvement in the description of the interaction when the two-step model is used.
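A one-step global mechanism of the kind calibrated in such work has the generic Arrhenius rate form ω = A [F]^a [O]^b exp(-Ea/RT). The parameter values in this sketch are placeholders, not the kinetic parameters determined in the paper:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def one_step_rate(conc_fuel, conc_ox, T,
                  A=1.0e9, a=1.0, b=1.0, Ea=1.5e5):
    """Global one-step reaction rate w = A [F]^a [O]^b exp(-Ea/RT).

    A, a, b, and Ea are placeholder values; in practice such parameters
    are calibrated against detailed-chemistry (e.g., GRI-Mech) computations
    of burning velocity and extinction behavior.
    """
    return A * conc_fuel**a * conc_ox**b * math.exp(-Ea / (R * T))

# Arrhenius behaviour: the rate grows steeply with temperature.
for T in (800.0, 1200.0, 1600.0, 2000.0):
    print(T, one_step_rate(1.0, 2.0, T))
```

The strong exponential temperature dependence is what makes the fitted pre-exponential factor and activation energy the key tuning knobs for matching burning velocity and extinction limits with so few reactions.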

  10. Design and development of an IBM/VM menu system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cazzola, D.J.

    1992-10-01

    This report describes a full-screen menu system developed using IBM's Interactive System Productivity Facility (ISPF) and the REXX programming language. The software was developed for the 2800 IBM/VM Electrical Computer Aided Design (ECAD) system, to deliver electronic drawing definitions to a corporate drawing release system. Although this report documents the status of the menu system when it was retired, the methodologies used and the requirements defined are very applicable to replacement systems.

  11. Enhancing the interaction between nuclear experiment and theory through information and statistics

    DOE PAGES

    Ireland, D. G.; Nazarewicz, W.

    2015-02-05

    This Focus Issue draws from a range of topics within nuclear physics, from studies of individual nucleons to the heaviest of nuclei. The unifying theme, however, is to illustrate the extent to which uncertainty is a key quantity, and to showcase applications of the latest computational methodologies. It is our assertion that a paradigm shift is needed in nuclear physics to enhance the coupling between theory and experiment, and we hope that this collection of articles is a good start.

  12. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  13. Thermophysics modeling of an infrared detector cryochamber for transient operational scenario

    NASA Astrophysics Data System (ADS)

    Singhal, Mayank; Singhal, Gaurav; Verma, Avinash C.; Kumar, Sushil; Singh, Manmohan

    2016-05-01

    An infrared (IR) detector is essentially a transducer capable of converting radiant energy in the infrared regime into a measurable form. The benefit of infrared radiation is that it facilitates viewing objects in darkness or through obscured conditions by detecting the infrared energy they emit. One of the most significant applications of IR detector systems is target acquisition and tracking for projectile systems; IR detectors also find widespread application in industry and the commercial market. The performance of an infrared detector is temperature sensitive: the detector performs best when cooled to cryogenic temperatures, in the range of roughly 120 K. However, the need to operate in such cryogenic regimes adds complexity to the application of IR detectors. This entails a detailed thermophysics analysis to determine the actual cooling load, which is specific to the application and to the detector's interaction with its environment, and which in turn enables the design of cooling methodologies suited to specific scenarios. The focus of the present work is to develop a robust thermophysical numerical methodology for predicting IR cryochamber behavior under transient conditions, the most critical scenario, taking into account all relevant heat loads, including radiation in its original form. The advantage of the developed code over existing commercial software (COMSOL, ANSYS, etc.) is that it handles gas conduction together with radiation terms effectively, employing ubiquitous software such as MATLAB; it also requires much smaller computational resources and is significantly less time intensive. It provides physically correct results, enabling thermal characterization of the cryochamber geometry in conjunction with an appropriate cooling methodology. The code has subsequently been validated experimentally: the observed cooling characteristics are in close agreement with the results predicted by the developed model, demonstrating its efficacy.
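A lumped-capacitance sketch of the transient cool-down problem: a cold mass under radiative and residual-gas conduction loads with a constant cryocooler lift, integrated with SciPy. All parameter values are invented for illustration and are not taken from the paper's cryochamber model:

```python
from scipy.integrate import solve_ivp

SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)

# Illustrative placeholder parameters (not from an actual cryochamber design)
m_c    = 0.5            # cold-mass heat capacity m*c, J/K
area   = 1.0e-3         # radiating area, m^2
eps    = 0.05           # effective emissivity
G_gas  = 6.0e-4         # residual-gas conduction conductance, W/K
T_env  = 300.0          # chamber wall temperature, K
Q_lift = 0.13           # constant cryocooler lift, W

def dTdt(t, T):
    q_rad  = eps * SIGMA * area * (T_env**4 - T[0]**4)  # radiative load
    q_cond = G_gas * (T_env - T[0])                     # gas conduction load
    return [(q_rad + q_cond - Q_lift) / m_c]

sol = solve_ivp(dTdt, (0.0, 5000.0), [300.0], max_step=5.0)
# Settles near ~120 K for these illustrative numbers, where loads balance lift.
print(f"cool-down: {sol.y[0][0]:.0f} K -> {sol.y[0][-1]:.0f} K")
```

The equilibrium temperature is where the radiative plus conduction loads equal the cooler lift; a full cryochamber model adds conduction paths, view factors, and temperature-dependent properties, but the transient balance has this same structure.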

  14. Methylsorb: a simple method for quantifying DNA methylation using DNA-gold affinity interactions.

    PubMed

    Sina, Abu Ali Ibn; Carrascosa, Laura G; Palanisamy, Ramkumar; Rauf, Sakandar; Shiddiky, Muhammad J A; Trau, Matt

    2014-10-21

    The analysis of DNA methylation is becoming increasingly important, both in the clinic and as a research tool to unravel key epigenetic molecular mechanisms in biology. Current methodologies for the quantification of regional DNA methylation (i.e., the average methylation over a region of DNA in the genome) are largely affected by comprehensive DNA sequencing methodologies, which tend to be expensive, tedious, and time-consuming for many applications. Herein, we report an alternative DNA methylation detection method referred to as "Methylsorb", which is based on the inherent affinity of DNA bases to the gold surface (i.e., the trend of the affinity interactions is adenine > cytosine ≥ guanine > thymine).1 Since the degree of gold-DNA affinity interaction is highly sequence dependent, it provides a new capability to detect DNA methylation by simply monitoring the relative adsorption of bisulfite-treated DNA sequences onto a gold chip. Because the selective physical adsorption of DNA fragments to gold enables a direct read-out of regional DNA methylation, the current requirement for DNA sequencing is obviated. To demonstrate the utility of this method, we present data on the regional methylation status of two CpG clusters located in the EN1 and MIR200B genes in MCF7 and MDA-MB-231 cells. The methylation status of these regions was obtained from the change in relative mass on the gold surface with respect to the relative adsorption of an unmethylated DNA source, detected using surface plasmon resonance (SPR) in a label-free and real-time manner. We anticipate that the simplicity of this method, combined with its high accuracy in identifying the methylation status of cytosines in DNA, could find broad application in biology and diagnostics.

  15. Solving the Self-Interaction Problem in Kohn-Sham Density Functional Theory. Application to Atoms

    DOE PAGES

    Daene, M.; Gonis, A.; Nicholson, D. M.; ...

    2014-10-14

    Previously, we proposed a computational methodology that addresses the elimination of the self-interaction error from the Kohn–Sham formulation of the density functional theory. We demonstrated how the exchange potential can be obtained, and presented results of calculations for atomic systems up to Kr carried out within a Cartesian coordinate system. In this paper, we provide complete details of this self-interaction-free method formulated in spherical coordinates based on the explicit equidensity basis ansatz. We also prove analytically that derivatives obtained using this method satisfy the virial theorem for spherical orbitals, where the problem can be reduced to one dimension. We present the results of calculations of ground-state energies of atomic systems throughout the periodic table carried out within the exchange-only mode.

  16. Recent Experiences in Multidisciplinary Analysis and Optimization, part 1

    NASA Technical Reports Server (NTRS)

    Sobieski, J. (Compiler)

    1984-01-01

    Papers presented at the NASA Symposium on Recent Experiences in Multidisciplinary Analysis and Optimization held at NASA Langley Research Center, Hampton, Virginia April 24 to 26, 1984 are given. The purposes of the symposium were to exchange information about the status of the application of optimization and associated analyses in industry or research laboratories to real life problems and to examine the directions of future developments. Information exchange has encompassed the following: (1) examples of successful applications; (2) attempt and failure examples; (3) identification of potential applications and benefits; (4) synergistic effects of optimized interaction and trade-offs occurring among two or more engineering disciplines and/or subsystems in a system; and (5) traditional organization of a design process as a vehicle for or an impediment to the progress in the design methodology.

  17. Social Gaming and Learning Applications: A Driving Force for the Future of Virtual and Augmented Reality?

    NASA Astrophysics Data System (ADS)

    Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang

    Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent a widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.

  18. Benefit-Risk Monitoring of Vaccines Using an Interactive Dashboard: A Methodological Proposal from the ADVANCE Project.

    PubMed

    Bollaerts, Kaatje; De Smedt, Tom; Donegan, Katherine; Titievsky, Lina; Bauchau, Vincent

    2018-03-26

    New vaccines are launched based on their benefit-risk (B/R) profile anticipated from clinical development. Proactive post-marketing surveillance is necessary to assess whether the vaccination uptake and the B/R profile are as expected and, ultimately, whether further public health or regulatory actions are needed. There are several, typically not integrated, facets of post-marketing vaccine surveillance: the surveillance of vaccination coverage, vaccine safety, effectiveness and impact. With this work, we aim to assess the feasibility and added value of using an interactive dashboard as a potential methodology for near real-time monitoring of vaccine coverage and pre-specified health benefits and risks of vaccines. We developed a web application with an interactive dashboard for B/R monitoring. The dashboard is demonstrated using simulated electronic healthcare record data mimicking the introduction of rotavirus vaccination in the UK. The interactive dashboard allows end users to select certain parameters, including expected vaccine effectiveness, age groups, and time periods, and allows calculation of the incremental net health benefit (INHB) as well as the incremental benefit-risk ratio (IBRR) for different sets of preference weights. We assessed the potential added value of the dashboard by user testing amongst a range of stakeholders experienced in the post-marketing monitoring of vaccines. The dashboard was successfully implemented and demonstrated. The feedback from the potential end users was generally positive, although reluctance to use composite B/R measures was expressed. The use of interactive dashboards for B/R monitoring is promising and received support from various stakeholders. In future research, the use of such an interactive dashboard will be further tested with real-life data as opposed to simulated data.
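    The two composite measures the dashboard computes can be sketched directly. Assuming the usual definitions (incremental net health benefit as benefits minus preference-weighted risks, incremental benefit-risk ratio as their quotient), a minimal illustration with made-up event counts, not ADVANCE data, might look like this:

```python
# Illustrative benefit-risk summary measures as commonly defined in the
# B/R literature; the event counts below are invented placeholders.

def inhb(delta_benefit, delta_risk, weight):
    """Incremental net health benefit: benefit gained minus
    preference-weighted risk incurred (same outcome units)."""
    return delta_benefit - weight * delta_risk

def ibrr(delta_benefit, delta_risk):
    """Incremental benefit-risk ratio."""
    return delta_benefit / delta_risk

# hypothetical rotavirus-like scenario, per 100,000 vaccinated
cases_averted = 500.0   # benefit (hypothetical)
excess_events = 2.0     # risk (hypothetical)

# the dashboard's "different sets of preference weights" idea:
for w in (1.0, 10.0, 100.0):
    print(w, inhb(cases_averted, excess_events, w))

print(ibrr(cases_averted, excess_events))
```

    Letting the end user vary the weight w interactively makes explicit how the B/R conclusion depends on how severely a risk event is valued relative to a benefit event.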

  19. The Incremental Launching Method for Educational Virtual Model

    NASA Astrophysics Data System (ADS)

    Martins, Octávio; Sampaio, A. Z.

    This paper describes the application of virtual reality technology to the development of an educational model related to the construction of a bridge. The model allows the visualization of the physical progression of the work following a planned construction sequence, the observation of details of the form of every component of the works, and the study of the type and method of operation of the equipment applied in the construction. The model admits interaction, and thus some degree of collaboration between students and teachers in the analysis of aspects concerning geometric forms, working methodology, or other technical issues observed using the application. The model presents distinct advantages as an educational aid in first-degree courses in Civil Engineering.

  20. DNA barcodes for ecology, evolution, and conservation.

    PubMed

    Kress, W John; García-Robledo, Carlos; Uriarte, Maria; Erickson, David L

    2015-01-01

    The use of DNA barcodes, which are short gene sequences taken from a standardized portion of the genome and used to identify species, is entering a new phase of application as more and more investigations employ these genetic markers to address questions relating to the ecology and evolution of natural systems. The suite of DNA barcode markers now applied to specific taxonomic groups of organisms are proving invaluable for understanding species boundaries, community ecology, functional trait evolution, trophic interactions, and the conservation of biodiversity. The application of next-generation sequencing (NGS) technology will greatly expand the versatility of DNA barcodes across the Tree of Life, habitats, and geographies as new methodologies are explored and developed. Published by Elsevier Ltd.

  1. Network support for turn-taking in multimedia collaboration

    NASA Astrophysics Data System (ADS)

    Dommel, Hans-Peter; Garcia-Luna-Aceves, Jose J.

    1997-01-01

    The effectiveness of collaborative multimedia systems depends on the regulation of access to their shared resources, such as continuous media or instruments used concurrently by multiple parties. Existing applications use only simple protocols to mediate such resource contention. Their cooperative rules follow a strict agenda and are largely application-specific. The inherent problem of floor control lacks a systematic methodology. This paper presents a general model of floor control for correct, scalable, fine-grained and fair resource sharing that integrates user interaction with network conditions, and adaptation to various media types. The notion of turn-taking known from psycholinguistic studies on discourse structure is adapted for this framework. Viewed as a computational analogy to speech communication, online collaboration revolves around dynamically allocated access permissions called floors. The control semantics of floors derives from concurrency-control methodology. An explicit specification and verification of a novel distributed Floor Control Protocol are presented. Hosts assume sharing roles that allow for efficient dissemination of control information, agreeing on a floor holder which is granted mutually exclusive access to a resource. Performance analytic aspects of floor control protocols are also briefly discussed.

  2. Rapid NMR Assignments of Proteins by Using Optimized Combinatorial Selective Unlabeling.

    PubMed

    Dubey, Abhinav; Kadumuri, Rajashekar Varma; Jaipuria, Garima; Vadrevu, Ramakrishna; Atreya, Hanudatta S

    2016-02-15

    A new approach for rapid resonance assignments in proteins based on amino acid selective unlabeling is presented. The method involves choosing a set of multiple amino acid types for selective unlabeling and identifying specific tripeptides surrounding the labeled residues from specific 2D NMR spectra in a combinatorial manner. The methodology directly yields sequence-specific assignments, without requiring a contiguous stretch of amino acid residues to be linked, and is applicable to deuterated proteins. We show that a 2D [15N,1H] HSQC spectrum combined with two 2D spectra can result in ∼50% assignments. The methodology was applied to two proteins: an intrinsically disordered protein (12 kDa) and the 29 kDa (268 residue) α-subunit of Escherichia coli tryptophan synthase, which presents a challenging case with spectral overlaps and missing peaks. The method can augment existing approaches and will be useful for applications such as identifying active-site residues involved in ligand binding, phosphorylation, or protein-protein interactions, even prior to complete resonance assignments. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Conducting interactive experiments online.

    PubMed

    Arechar, Antonio A; Gächter, Simon; Molleman, Lucas

    2018-01-01

    Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a potentially valuable complement to laboratory studies.

  4. Computer Class Role Playing Games, an innovative teaching methodology based on STEM and ICT: first experimental results

    NASA Astrophysics Data System (ADS)

    Maraffi, S.

    2016-12-01

    Context/Purpose: We tested a new teaching and learning technology: a Computer Class Role Playing Game (RPG) that delivers educational activity in classrooms through an interactive game. This approach is new; there are some experiences with educational games, but they are mainly individual rather than class-based. Gaming all together in a class, with a single goal for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To perform the research we tested the game in several classes of different grades, collecting specific questionnaires from teachers and pupils. Results: The experimental results were outstanding: the RPG, our interactive activity, exceeded by 50% the overall satisfaction reported for traditional lessons or PowerPoint-supported teaching. Interpretation: The appreciation of the RPG was in agreement with the class-level outcome identified by the teacher after the experimentation. Our work received excellent feedback from teachers on the efficacy of this new teaching methodology and the results achieved. Using a methodology closer to the students' point of view improves the innovation and creative capacities of learners, and it supports the new role of the teacher as the learners' "coach". Conclusion: This paper presents the first experimental results on the application of this new technology, based on a computer game that projects on a wall in the class an adventure lived by the students. The plots of the adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The pupils participate by interacting with the game through their own tablets or smartphones. The game is based on a mixed-reality learning environment, giving the students the feeling of being IN the adventure.

  5. A first principles based methodology for design of axial compressor configurations

    NASA Astrophysics Data System (ADS)

    Iyengar, Vishwas

    Axial compressors are widely used in many aerodynamic applications. The design of an axial compressor configuration presents many challenges. Until recently, compressor design was done using 2-D viscous flow analyses that solve the flow field around cascades or in meridional planes or 3-D inviscid analyses. With the advent of modern computational methods it is now possible to analyze the 3-D viscous flow and accurately predict the performance of 3-D multistage compressors. It is necessary to retool the design methodologies to take advantage of the improved accuracy and physical fidelity of these advanced methods. In this study, a first-principles based multi-objective technique for designing single stage compressors is described. The study accounts for stage aerodynamic characteristics, rotor-stator interactions and blade elastic deformations. A parametric representation of compressor blades that include leading and trailing edge camber line angles, thickness and camber distributions was used in this study. A design of experiment approach is used to reduce the large combinations of design variables into a smaller subset. A response surface method is used to approximately map the output variables as a function of design variables. An optimized configuration is determined as the extremum of all extrema. This method has been applied to a rotor-stator stage similar to NASA Stage 35. The study has two parts: a preliminary study where a limited number of design variables were used to give an understanding of the important design variables for subsequent use, and a comprehensive application of the methodology where a larger, more complete set of design variables are used. The extended methodology also attempts to minimize the acoustic fluctuations at the rotor-stator interface by considering a rotor-wake influence coefficient (RWIC). 
Results presented include performance map calculations at design and off-design speeds, along with a detailed visualization of the flow field at design and off-design conditions. The present methodology provides a way to systematically screen the plethora of design variables. By selecting the most influential design parameters and by optimizing the blade leading edge and trailing edge mean camber line angles, phenomena such as tip blockages, blade-to-blade shock structures and other loss mechanisms can be weakened or alleviated. These changes to the configuration can have a beneficial effect on total pressure ratio and stage adiabatic efficiency, thereby improving the performance of the axial compression system. Aeroacoustic benefits were found by minimizing the noise-generating mechanisms associated with rotor wake-stator interactions. The new method is reliable, has low time cost, and is readily applicable to routine industrial design optimization of turbomachinery blades.
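    The design-of-experiments-plus-response-surface step described above can be sketched in miniature: sample a performance metric at a few levels of one design variable, fit a quadratic surrogate, and take the surrogate's optimum. The efficiency function and camber values below are invented placeholders, not compressor data.

```python
# Minimal response-surface sketch. The "efficiency" function is a
# stand-in for an expensive CFD evaluation, not a compressor model.

def efficiency(camber_deg):
    # hypothetical stage efficiency peaking at 12 degrees of camber
    return 0.90 - 0.0004 * (camber_deg - 12.0) ** 2

# design of experiments: three levels of the single design variable
levels = [8.0, 12.0, 16.0]
samples = [(x, efficiency(x)) for x in levels]

# exact quadratic surrogate y = a*x^2 + b*x + c through the samples
# (divided-difference form of the interpolating parabola)
(x0, y0), (x1, y1), (x2, y2) = samples
a = ((y2 - y0) / ((x2 - x0) * (x2 - x1))
     - (y1 - y0) / ((x1 - x0) * (x2 - x1)))
b = (y1 - y0) / (x1 - x0) - a * (x0 + x1)
c = y0 - a * x0 ** 2 - b * x0

x_opt = -b / (2 * a)    # vertex of the surrogate
print(round(x_opt, 3))  # recovers the peak at 12.0
```

    In the actual methodology many variables are screened at once, so the surrogate is a multivariate response surface fitted by least squares rather than an exact parabola, but the optimize-the-surrogate logic is the same.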

  6. Analytical group decision making in natural resources: Methodology and application

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
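    The analytic hierarchy process step of this group methodology can be illustrated with a short sketch: priorities are conventionally taken as the principal eigenvector of a reciprocal pairwise comparison matrix, computed here by power iteration. The matrix values are illustrative, not from the fire-research workshop.

```python
# Sketch of the AHP priority step: derive priority weights from a
# pairwise comparison matrix via its principal eigenvector.

def ahp_priorities(matrix, iters=100):
    """Power iteration normalized to sum 1 (Perron eigenvector)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Three alternatives compared on the Saaty 1-9 scale (reciprocal matrix):
# entry [i][j] says how strongly alternative i is preferred over j.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

weights = ahp_priorities(pairwise)
print([round(w, 3) for w in weights])
```

    Aggregating such weight vectors across group members is what yields the quantitative picture of agreement and disagreement described in the abstract.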

  7. Performance-Driven Hybrid Full-Body Character Control for Navigation and Interaction in Virtual Environments

    NASA Astrophysics Data System (ADS)

    Mousas, Christos; Anagnostopoulos, Christos-Nikolaos

    2017-06-01

    This paper presents a hybrid character control interface that provides the ability to synthesize in real-time a variety of actions based on the user's performance capture. The proposed methodology enables three different performance interaction modules: the performance animation control that enables the direct mapping of the user's pose to the character, the motion controller that synthesizes the desired motion of the character based on an activity recognition methodology, and the hybrid control that lies between the performance animation control and the motion controller. With the methodology presented, the user has the freedom to interact within the virtual environment, as well as the ability to manipulate the character and to synthesize a variety of actions that cannot be performed directly by him/her, but which the system synthesizes. Therefore, the user is able to interact with the virtual environment in a more sophisticated fashion. This paper presents examples of different scenarios based on the three different full-body character control methodologies.

  8. TIGGERC: Turbomachinery Interactive Grid Generator for 2-D Grid Applications and Users Guide

    NASA Technical Reports Server (NTRS)

    Miller, David P.

    1994-01-01

    A two-dimensional multi-block grid generator has been developed for a new design and analysis system for studying multiple blade-row turbomachinery problems. TIGGERC is a mouse driven, interactive grid generation program which can be used to modify boundary coordinates and grid packing and generates surface grids using a hyperbolic tangent or algebraic distribution of grid points on the block boundaries. The interior points of each block grid are distributed using a transfinite interpolation approach. TIGGERC can generate a blocked axisymmetric H-grid, C-grid, I-grid or O-grid for studying turbomachinery flow problems. TIGGERC was developed for operation on Silicon Graphics workstations. Detailed discussion of the grid generation methodology, menu options, operational features and sample grid geometries are presented.
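    The hyperbolic tangent point distribution mentioned above is a standard grid-clustering device. A generic one-sided form (not necessarily TIGGERC's exact formula) places points on [0, 1] with spacing that tightens toward one boundary as the stretching parameter grows:

```python
import math

# Generic one-sided hyperbolic-tangent stretching of grid points,
# of the kind used to cluster points toward a wall or block boundary.

def tanh_distribution(n, delta):
    """n+1 points on [0, 1], clustered near 0 for larger delta."""
    return [1.0 + math.tanh(delta * (i / n - 1.0)) / math.tanh(delta)
            for i in range(n + 1)]

pts = tanh_distribution(10, 2.5)
print([round(p, 4) for p in pts])

# endpoints are exact; spacing grows away from the clustered boundary
assert abs(pts[0]) < 1e-12 and abs(pts[-1] - 1.0) < 1e-12
assert (pts[1] - pts[0]) < (pts[-1] - pts[-2])
```

    An algebraic (e.g. geometric-progression) distribution serves the same role when less aggressive clustering is needed; interior block points are then filled in by transfinite interpolation from the boundary distributions.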

  9. Pathway analyses and understanding disease associations

    PubMed Central

    Liu, Yu; Chance, Mark R

    2013-01-01

    High throughput technologies have been applied to investigate the underlying mechanisms of complex diseases, identify disease-associations and help to improve treatment. However it is challenging to derive biological insight from conventional single gene based analysis of “omics” data from high throughput experiments due to sample and patient heterogeneity. To address these challenges, many novel pathway and network based approaches were developed to integrate various “omics” data, such as gene expression, copy number alteration, Genome Wide Association Studies, and interaction data. This review will cover recent methodological developments in pathway analysis for the detection of dysregulated interactions and disease-associated subnetworks, prioritization of candidate disease genes, and disease classifications. For each application, we will also discuss the associated challenges and potential future directions. PMID:24319650

  10. Approximate solution of coupled cluster equations: application to the coupled cluster doubles method and non-covalent interacting systems.

    PubMed

    Smiga, Szymon; Fabiano, Eduardo

    2017-11-15

    We have developed a simplified coupled cluster (SCC) methodology, using the basic idea of scaled MP2 methods. The scheme has been applied to the coupled cluster double equations and implemented in three different non-iterative variants. This new method (especially the SCCD[3] variant, which utilizes a spin-resolved formalism) has been found to be very efficient and to yield an accurate approximation of the reference CCD results for both total and interaction energies of different atoms and molecules. Furthermore, we demonstrate that the equations determining the scaling coefficients for the SCCD[3] approach can generate non-empirical SCS-MP2 scaling coefficients which are in good agreement with previous theoretical investigations.
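    The "scaled MP2" idea the SCC scheme builds on amounts to recombining spin-resolved correlation-energy components with fixed coefficients. A minimal sketch, using Grimme's original SCS-MP2 coefficients (6/5 opposite-spin, 1/3 same-spin) and placeholder energy components:

```python
# Spin-component scaling of a correlation energy. The coefficients
# default to Grimme's SCS-MP2 values; the energy components below are
# hypothetical numbers, not results from the paper.

def scaled_correlation(e_os, e_ss, c_os=6 / 5, c_ss=1 / 3):
    """Spin-component-scaled correlation energy (hartree)."""
    return c_os * e_os + c_ss * e_ss

e_os = -0.240   # hypothetical opposite-spin correlation component
e_ss = -0.060   # hypothetical same-spin correlation component

print(round(scaled_correlation(e_os, e_ss), 6))
```

    The SCCD[3] variant described in the abstract determines such coefficients non-empirically from the coupled cluster doubles equations rather than fixing them at the SCS-MP2 values.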

  11. Determination of some pure compound ideal-gas enthalpies of formation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steele, W. V.; Chirico, R. D.; Nguyen, A.

    1989-06-01

    The results of a study aimed at improvement of group-additivity methodology for estimation of thermodynamic properties of organic substances are reported. Specific weaknesses, where ring corrections were unknown or next-nearest-neighbor interactions were only estimated because of a lack of experimental data, are addressed by experimental studies of enthalpies of combustion in the condensed phase and by vapor pressure measurements. Ideal-gas enthalpies of formation are reported for acrylamide, succinimide, γ-butyrolactone, 2-pyrrolidone, 2,3-dihydrofuran, 3,4-dihydro-2H-pyran, 1,3-cyclohexadiene, 1,4-cyclohexadiene, and 1-methyl-1-phenylhydrazine. Ring corrections, group terms, and next-nearest-neighbor interaction terms useful in the application of group-additivity correlations are derived. 44 refs., 2 figs., 59 tabs.

  12. DHM and serious games: a case-study oil and gas laboratories.

    PubMed

    Santos, V; Zamberlan, M; Streit, P; Oliveira, J; Guimarães, C; Pastura, F; Cid, G

    2012-01-01

    The aim of this paper is to present research on the application of serious games to the design of laboratories in the oil and gas industries. The focus is on virtual human representation acquired from 3D scanning, human interaction, workspace layout, and equipment designed considering ergonomics standards. The laboratory studies were simulated on the Unity3D platform, which allows users to control the DHM (digital human model) in the dynamic virtual scenario in order to simulate work activities. This methodology can change the design process by improving the level of interaction between final users, managers and human factors teams. That helps to better visualize future work settings and improves the level of participation among all stakeholders.

  13. High-Sensitivity Real-Time Imaging of Dual Protein-Protein Interactions in Living Subjects Using Multicolor Luciferases

    PubMed Central

    Hida, Naoki; Awais, Muhammad; Takeuchi, Masaki; Ueno, Naoto; Tashiro, Mayuri; Takagi, Chiyo; Singh, Tanuja; Hayashi, Makoto; Ohmiya, Yoshihiro; Ozawa, Takeaki

    2009-01-01

    Networks of protein-protein interactions play key roles in numerous important biological processes in living subjects. An effective methodology for assessing protein-protein interactions in living cells of interest is the protein-fragment complementation assay (PCA). In particular, assays using fluorescent proteins are powerful techniques, but they do not directly track interactions because of their irreversibility or the time required for chromophore formation. By contrast, PCAs using bioluminescent proteins can overcome these drawbacks. We herein describe an imaging method for real-time analysis of protein-protein interactions using multicolor luciferases with different spectral characteristics. The sensitivity and signal-to-background ratio were improved considerably by developing a carboxy-terminal fragment engineered from a click beetle luciferase. We demonstrate its utility in the spatiotemporal characterization of Smad1–Smad4 and Smad2–Smad4 interactions in early developmental stages of a single living Xenopus laevis embryo. We also demonstrate the value of this method by applying it to specific protein-protein interactions in cell cultures and living mice. This technique supports quantitative analyses and imaging of versatile protein-protein interactions at a selective luminescence wavelength in opaque or strongly auto-fluorescent living subjects. PMID:19536355

  14. Development and application of a hybrid transport methodology for active interrogation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Royston, K.; Walters, W.; Haghighat, A.

    A hybrid Monte Carlo and deterministic methodology has been developed for application to active interrogation systems. The methodology consists of four steps: i) neutron flux distribution due to neutron source transport and subcritical multiplication; ii) generation of the gamma source distribution from (n, γ) interactions; iii) determination of the gamma current at a detector window; iv) detection of gammas by the detector. This paper discusses the theory and results of the first three steps for the case of a cargo container with a sphere of HEU in third-density water cargo. To complete the first step, a response-function formulation has been developed to calculate the subcritical multiplication and neutron flux distribution. Response coefficients are pre-calculated using the MCNP5 Monte Carlo code. The second step uses the calculated neutron flux distribution and Bugle-96 (n, γ) cross sections to find the resulting gamma source distribution. In the third step the gamma source distribution is coupled with a pre-calculated adjoint function to determine the gamma current at a detector window. The AIMS (Active Interrogation for Monitoring Special-Nuclear-Materials) software has been written to output the gamma current for a source-detector assembly scanning across a cargo container using the pre-calculated values, taking significantly less time than a reference MCNP5 calculation. (authors)
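    Steps (ii) and (iii) of the four-step methodology reduce to two contractions: fold the neutron flux with (n, γ) cross sections to get a gamma source, then weight that source by a pre-computed adjoint (importance) function to get the detector-window current. A toy sketch with invented numbers (not Bugle-96 data or the AIMS code):

```python
# Toy version of steps (ii)-(iii) on a tiny region/energy-group mesh.
# All flux, cross-section, and adjoint values are illustrative.

n_regions, n_groups = 3, 2

flux = [[2.0, 1.0], [1.5, 0.5], [0.2, 0.1]]      # phi[r][g] from step (i)
sigma_ngamma = [0.05, 0.30]                      # (n, gamma) xs per group
adjoint = [[0.1, 0.4], [0.2, 0.6], [0.05, 0.2]]  # importance to detector

# step (ii): gamma source distribution from (n, gamma) interactions
gamma_source = [[sigma_ngamma[g] * flux[r][g] for g in range(n_groups)]
                for r in range(n_regions)]

# step (iii): detector current as the adjoint-weighted source sum
current = sum(adjoint[r][g] * gamma_source[r][g]
              for r in range(n_regions) for g in range(n_groups))
print(round(current, 4))
```

    Because the adjoint and the response coefficients are pre-computed once with Monte Carlo, each new source-detector position during a scan costs only these cheap contractions, which is the source of the speedup over a full MCNP5 run.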

  15. Numerical study of external burning flowfields

    NASA Technical Reports Server (NTRS)

    Bittner, Robert D.; Mcclinton, Charles R.

    1991-01-01

    This paper demonstrates the successful application of CFD to modeling an external burning flowfield. The study used the 2D, 3D, and PNS versions of the SPARK code. Various grids, boundary conditions, and ignition methodologies have been employed. Flameholding was achieved through the use of a subsonic outflow condition and a hot block located behind the step to ignite the fuel. Since the resulting burning produces a large subsonic region downstream of the cowl, this entire surface can be pressurized to the level of the back pressure. An evaluation of interactions between the ramjet exhaust and the external burning products demonstrates the complexity of this design issue. The code is now capable of evaluating the external burning effectiveness for flight vehicles using simple injector schemes, and the methodology can be readily applied to other external burning designs.

  16. A Mode of Combined ERP and KMS Knowledge Management System Construction

    NASA Astrophysics Data System (ADS)

    Yuena, Kang; Yangeng, Wen; Qun, Zhou

    The core concerns of ERP and knowledge management are quite similar: both aim to deliver the appropriate knowledge (or goods and funds) to the right people (or positions) at the right time. It is therefore reasonable to believe that adding a knowledge management system to ERP will help companies achieve their goals better. This paper compares, at the methodological level, the Hall three-dimensional structure of knowledge management systems and of ERP, and finds that they are very similar in the time dimension, the logic dimension, and the knowledge dimension. This lays the methodological basis for their simultaneous planning, implementation, and application. The paper then proposes a knowledge-based multi-agent ERP management system model. Finally, it describes the process from planning to implementation of a knowledge-managed ERP system with multi-agent interaction from three perspectives: management thinking, software, and systems.

  17. FTDD973: A multimedia knowledge-based system and methodology for operator training and diagnostics

    NASA Technical Reports Server (NTRS)

    Hekmatpour, Amir; Brown, Gary; Brault, Randy; Bowen, Greg

    1993-01-01

    FTDD973 (973 Fabricator Training, Documentation, and Diagnostics) is an interactive multimedia knowledge-based system and methodology for computer-aided training and certification of operators, as well as tool and process diagnostics, in IBM's CMOS SGP fabrication line (building 973). FTDD973 is an example of what can be achieved with modern multimedia workstations. Knowledge-based systems, hypertext, hypergraphics, high-resolution images, audio, motion video, and animation are technologies that in synergy can be far more useful than each by itself. FTDD973's modular and object-oriented architecture is also an example of how improvements in software engineering are finally making it possible to combine many software modules into one application. FTDD973 is developed in ExperMedia/2, an OS/2 multimedia expert system shell for domain experts.

  18. Decoding the Heart through Next Generation Sequencing Approaches.

    PubMed

    Pawlak, Michal; Niescierowicz, Katarzyna; Winata, Cecilia Lanny

    2018-06-07

    Vertebrate organs develop through a complex process which involves interaction between multiple signaling pathways at the molecular, cell, and tissue levels. Heart development is an example of such a complex process which, when disrupted, results in congenital heart disease (CHD). This complexity necessitates a holistic approach which allows the visualization of genome-wide interaction networks, as opposed to assessment of limited subsets of factors. Genomics offers a powerful solution to address the problem of biological complexity by enabling the observation of molecular processes at a genome-wide scale. The emergence of next generation sequencing (NGS) technology has facilitated the expansion of genomics, increasing its output capacity and applicability in various biological disciplines. The application of NGS in various aspects of heart biology has resulted in new discoveries, generating novel insights into this field of study. Here we review the contributions of NGS technology to the understanding of heart development and its disruption reflected in CHD, and discuss how emerging NGS-based methodologies can contribute to the further understanding of heart repair.

  19. CAGI: Computer Aided Grid Interface. A work in progress

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.; Yu, Tzu-Yi; Vaughn, David

    1992-01-01

    Progress realized in the development of the Computer Aided Grid Interface (CAGI) software system is presented, integrating CAD/CAM geometric system output and/or Initial Graphics Exchange Specification (IGES) files, the geometry manipulations associated with grid generation, and robust grid generation methodologies. CAGI is being developed in a modular fashion and will offer a fast, efficient, and economical response to geometry/grid preparation, allowing basic geometry to be upgraded step by step, interactively and under permanent visual control, while minimizing the differences between the actual hardware surface description and its numerical analog. The computer code GENIE is used as a basis. The Non-Uniform Rational B-Splines (NURBS) representation of sculptured surfaces is utilized for surface grid redistribution. The computer-aided analysis system PATRAN is adapted as a CAD/CAM system. The progress realized in NURBS surface grid generation, the development of an IGES transformer, and geometry adaptation using PATRAN is presented, along with their applicability to grid generation associated with rocket propulsion applications.
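
    The rational representation underlying NURBS can be illustrated with its simplest ingredient. The sketch below (a generic illustration of the technique, not CAGI/GENIE code; all names are hypothetical) evaluates a degree-2 rational Bezier segment and shows why rational weights let NURBS represent conics such as circular arcs exactly:

```python
import math

def rational_bezier(t, points, weights):
    """Evaluate a rational Bezier curve of degree len(points)-1 at t in [0, 1]."""
    n = len(points) - 1
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(points, weights)):
        b = math.comb(n, i) * (1 - t) ** (n - i) * t ** i  # Bernstein basis
        num_x += w * b * x
        num_y += w * b * y
        den += w * b
    return (num_x / den, num_y / den)

# A quarter of the unit circle represented exactly by one rational segment;
# the weight sqrt(2)/2 on the corner control point is what makes it exact.
pts = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
wts = [1.0, math.sqrt(2.0) / 2.0, 1.0]
samples = [rational_bezier(t / 10.0, pts, wts) for t in range(11)]
radii = [math.hypot(x, y) for x, y in samples]
```

    Every sampled point lies on the unit circle to machine precision, which a polynomial (non-rational) Bezier segment cannot achieve.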

  20. Towards understanding of magnetization reversal in Nd-Fe-B nanocomposites: analysis by high-throughput micromagnetic simulations

    NASA Astrophysics Data System (ADS)

    Erokhin, Sergey; Berkov, Dmitry; Ito, Masaaki; Kato, Akira; Yano, Masao; Michels, Andreas

    2018-03-01

    We demonstrate how micromagnetic simulations can be employed to characterize and analyze the magnetic microstructure of nanocomposites. For the example of nanocrystalline Nd-Fe-B, which is a potential material for future permanent-magnet applications, we have compared three different models for the micromagnetic analysis of this material class: (i) a description of the nanocomposite microstructure in terms of Stoner-Wohlfarth particles with and without the magnetodipolar interaction; (ii) a model based on the core-shell representation of the nanograins; (iii) the latter model including a contribution of superparamagnetic clusters. The relevant parameter spaces have been systematically scanned with the aim of establishing which micromagnetic approach can most adequately describe experimental data for this material. According to our results, only the last, most sophisticated model is able to provide an excellent agreement with the measured hysteresis loop. The presented methodology is generally applicable to multiphase magnetic nanocomposites and it highlights the complex interrelationship between the microstructure, magnetic interactions, and the macroscopic magnetic properties.
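
    Model (i) can be sketched for a single particle in reduced units. The toy relaxation below (an illustration of the Stoner-Wohlfarth picture without the magnetodipolar term, not the authors' simulation code; field values and step counts are arbitrary assumptions) sweeps the applied field and lets the moment angle settle into the nearest energy minimum, reproducing an open hysteresis loop:

```python
import math

def sw_branch(h_values, psi, theta0):
    """Reduced Stoner-Wohlfarth energy e = 0.5*sin(theta - psi)**2 - h*cos(theta);
    relax theta into the nearest local minimum at each reduced field h = H/H_K."""
    theta, m = theta0, []
    for h in h_values:
        for _ in range(2000):
            # dE/dtheta = 0.5*sin(2*(theta - psi)) + h*sin(theta)
            grad = 0.5 * math.sin(2.0 * (theta - psi)) + h * math.sin(theta)
            theta -= 0.01 * grad  # plain gradient descent on the energy
        m.append(math.cos(theta))  # magnetization component along the field
    return m

psi = math.pi / 4  # easy axis at 45 degrees to the applied field
h_down = [1.5 - 0.05 * i for i in range(61)]  # sweep +1.5 -> -1.5
h_up = [-1.5 + 0.05 * i for i in range(61)]   # sweep -1.5 -> +1.5
m_down = sw_branch(h_down, psi, theta0=0.0)
m_up = sw_branch(h_up, psi, theta0=math.pi)
```

    At zero field the two branches retain opposite remanent magnetizations, the signature of hysteresis that the more sophisticated models refine.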

  1. The Greek National Observatory of Forest Fires (NOFFi)

    NASA Astrophysics Data System (ADS)

    Tompoulidou, Maria; Stefanidou, Alexandra; Grigoriadis, Dionysios; Dragozi, Eleni; Stavrakoudis, Dimitris; Gitas, Ioannis Z.

    2016-08-01

    Efficient forest fire management is a key element for alleviating the catastrophic impacts of wildfires. Overall, the effective response to fire events necessitates adequate planning and preparedness before the start of the fire season, as well as quantifying the environmental impacts in case of wildfires. Moreover, the estimation of fire danger provides crucial information required for the optimal allocation and distribution of the available resources. The Greek National Observatory of Forest Fires (NOFFi)—established by the Greek Forestry Service in collaboration with the Laboratory of Forest Management and Remote Sensing of the Aristotle University of Thessaloniki and the International Balkan Center—aims to develop a series of modern products and services for supporting the efficient forest fire prevention management in Greece and the Balkan region, as well as to stimulate the development of transnational fire prevention and impacts mitigation policies. More specifically, NOFFi provides three main fire-related products and services: a) a remote sensing-based fuel type mapping methodology, b) a semi-automatic burned area mapping service, and c) a dynamically updatable fire danger index providing mid- to long-term predictions. The fuel type mapping methodology was developed and applied across the country, following an object-oriented approach and using Landsat 8 OLI satellite imagery. The results showcase the effectiveness of the generated methodology in obtaining highly accurate fuel type maps on a national level. The burned area mapping methodology was developed as a semi-automatic object-based classification process, carefully crafted to minimize user interaction and, hence, be easily applicable on a near real-time operational level as well as for mapping historical events. 
NOFFi's products can be visualized through the interactive Fire Forest portal, which allows the involvement and awareness of the relevant stakeholders via the Public Participation GIS (PPGIS) tool.

  2. Interrogating discourse: the application of Foucault's methodological discussion to specific inquiry.

    PubMed

    Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M

    2013-09-01

    Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, and shows how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of the various methodological elements described by Foucault, alongside a specific application of these elements.

  3. Challenges and Opportunities for Harmonizing Research Methodology: Raw Accelerometry.

    PubMed

    van Hees, Vincent T; Thaler-Kall, Kathrin; Wolf, Klaus-Hendrik; Brønd, Jan C; Bonomi, Alberto; Schulze, Mareike; Vigl, Matthäus; Morseth, Bente; Hopstock, Laila Arnesdatter; Gorzelniak, Lukas; Schulz, Holger; Brage, Søren; Horsch, Alexander

    2016-12-07

    Raw accelerometry is increasingly used in physical activity research, but diversity in sensor design, attachment, and signal processing challenges the comparability of research results. Efforts are therefore needed to harmonize the methodology. In this article we reflect on how increased methodological harmonization may be achieved. The authors convened for a two-day workshop (March 2014) themed on the methodological harmonization of raw accelerometry; the discussions at the workshop form the basis of this review. Key stakeholders were identified as manufacturers, method developers, method users (application), publishers, and funders. To facilitate methodological harmonization in raw accelerometry, the following action points were proposed: (i) manufacturers are encouraged to provide detailed specifications of their sensors; (ii) each fundamental step of an algorithm for processing raw accelerometer data should be documented, and ideally also motivated, to facilitate interpretation and discussion; (iii) algorithm developers and method users should be open about uncertainties in the description of the data and in the inference itself; (iv) all new algorithms pitched as "ready for implementation" should be shared with the community to facilitate replication and ongoing evaluation by independent groups; and (v) a dynamic interaction between method stakeholders should be encouraged to facilitate a well-informed harmonization process. The workshop led to the identification of a number of opportunities for harmonizing methodological practice. The discussion, as well as the practical checklists proposed in this review, should provide guidance for stakeholders on how to contribute to increased harmonization.
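
    An example of the kind of transparently documented processing step that action point (ii) calls for is the ENMO summary metric (Euclidean norm minus one g, negatives truncated), commonly used with raw wrist-worn accelerometer data. The sketch below is an illustrative implementation under that definition, not any group's reference code:

```python
import math

def enmo(samples, clip_negative=True):
    """Euclidean Norm Minus One: per-sample acceleration magnitude minus 1 g,
    with negative values truncated to zero. `samples` holds (x, y, z) in g."""
    out = []
    for x, y, z in samples:
        v = math.sqrt(x * x + y * y + z * z) - 1.0
        out.append(max(v, 0.0) if clip_negative else v)
    return out

still = [(0.0, 0.0, 1.0)] * 3                 # stationary sensor: gravity only
moving = [(0.6, 0.0, 0.8), (0.0, 1.0, 1.0)]   # tilted / dynamic acceleration
```

    Documenting choices such as the truncation flag is exactly what makes results from different groups comparable.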

  4. Visualising crystal packing interactions in solid-state NMR: Concepts and applications

    NASA Astrophysics Data System (ADS)

    Zilka, Miri; Sturniolo, Simone; Brown, Steven P.; Yates, Jonathan R.

    2017-10-01

    In this article, we introduce and apply a methodology, based on density functional theory and the gauge-including projector augmented wave approach, to explore the effects of packing interactions on solid-state nuclear magnetic resonance (NMR) parameters. From a so-termed "magnetic shielding contribution field", a visual map can be made of the contributions to the magnetic shielding of a specific site, partitioning the chemical shift into contributions from specific interactions. The relation to the established approaches of examining the molecule-to-crystal change in the chemical shift and the nucleus-independent chemical shift is established. The approach is applied to a large sample of 71 molecular crystals and to three further specific examples from supramolecular chemistry and pharmaceuticals. This approach extends the NMR crystallography toolkit and provides insight into the development both of cluster-based approaches to the prediction of chemical shifts and of empirical predictions of chemical shifts in solids.

  5. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types concur to the definition of the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach makes it possible, in principle, to consider all possible tsunamigenic sources, from seismic events to landslides, asteroid impacts, and volcanic eruptions. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).
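
    The short-term effect of a triggering event can be illustrated with a deliberately simplified model (an assumption of this sketch, not the paper's TotPTHA formulation; all rates are invented): a piecewise-constant Poisson rate whose multiplier is temporarily raised after the trigger:

```python
import math

def prob_at_least_one(base_rate, windows):
    """P(at least one event) under a piecewise-constant Poisson rate.
    `windows` is a list of (duration_years, rate_multiplier) pairs."""
    expected = sum(base_rate * mult * dur for dur, mult in windows)
    return 1.0 - math.exp(-expected)

base = 0.01  # illustrative long-term mean: one event per 100 years
quiet = prob_at_least_one(base, [(1.0, 1.0)])                 # ordinary year
cascade = prob_at_least_one(base, [(0.1, 50.0), (0.9, 1.0)])  # ~1 month at 50x
```

    Even a brief elevated-rate window dominates the one-year exceedance probability, which is why short-term applications cannot ignore cascade effects.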

  6. Development of a novel 2D color map for interactive segmentation of histological images.

    PubMed

    Chaudry, Qaiser; Sharma, Yachna; Raza, Syed H; Wang, May D

    2012-05-01

    We present a color segmentation approach based on a two-dimensional color map derived from the input image. Pathologists stain tissue biopsies with various colored dyes to see the expression of biomarkers. In these images, because of color variation due to inconsistencies in experimental procedures and lighting conditions, the segmentation used to analyze biological features is usually ad hoc. Many algorithms, like K-means, use a single metric to segment the image into different color classes and rarely provide users with powerful color control. Our interactive 2D color map segmentation technique, based on human color perception and the color distribution of the input image, enables user control without noticeable delay. Our methodology works for different staining types and different types of cancer tissue images. The proposed method shows good accuracy with low response and computational times, making it feasible for user-interactive applications involving the segmentation of histological images.
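
    The baseline the abstract contrasts against can be sketched as plain K-means over pixels represented as points in a 2D color plane (synthetic data and coordinates are hypothetical; this illustrates the generic algorithm, not the authors' 2D color map method):

```python
import random

def kmeans_2d(points, k, iters=50, seed=0):
    """Plain k-means on 2D points (e.g., a chroma plane); returns (labels, centers)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)      # initialize centers from the data
    labels = [0] * len(points)
    for _ in range(iters):
        for i, (x, y) in enumerate(points):   # assignment step
            labels[i] = min(range(k),
                            key=lambda c: (x - centers[c][0]) ** 2
                                          + (y - centers[c][1]) ** 2)
        for c in range(k):                    # update step
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:  # keep an empty cluster's center where it was
                centers[c] = (sum(p[0] for p in members) / len(members),
                              sum(p[1] for p in members) / len(members))
    return labels, centers

# Two well-separated synthetic "stain" clusters in a 2D color map.
stain_a = [(0.10 + 0.01 * i, 0.20) for i in range(10)]
stain_b = [(0.80 + 0.01 * i, 0.70) for i in range(10)]
labels, centers = kmeans_2d(stain_a + stain_b, k=2)
```

    K-means offers no handle for a pathologist to steer the class boundaries, which is the control the proposed interactive color map adds.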

  7. Six methodological steps to build medical data warehouses for research.

    PubMed

    Szirbik, N B; Pelletier, C; Chaussalet, T

    2006-09-01

    We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. The method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strong interdisciplinary nature. The idea emerged during a healthcare research project that involved, among other tasks, grouping information from heterogeneous and distributed information sources. We developed the methodology from the lessons learned while building a data repository containing information about elderly patients' flows in the UK's long-term care (LTC) system. We explain thoroughly the aspects that influenced the methodology's design. The methodology consists of six steps, which can be aligned with various iterative development frameworks; we describe here its alignment with the Rational Unified Process (RUP) framework. The methodology emphasizes current trends, such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project: it helped stakeholders perform better collaborative negotiations, which brought better solutions for the overall system investigated. An insight into the problems faced by others helps lead the negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making, leading ultimately to enhanced global outcomes.

  8. Influence of Food on Paediatric Gastrointestinal Drug Absorption Following Oral Administration: A Review

    PubMed Central

    Batchelor, Hannah K.

    2015-01-01

    The objective of this paper was to review existing information regarding food effects on drug absorption within paediatric populations. Mechanisms that underpin food–drug interactions were examined to consider potential differences between adult and paediatric populations, to provide insights into how this may alter the pharmacokinetic profile in a child. Relevant literature was searched to retrieve information on food–drug interaction studies undertaken on: (i) paediatric oral drug formulations; and (ii) within paediatric populations. The applicability of existing methodology to predict food effects in adult populations was evaluated with respect to paediatric populations where clinical data was available. Several differences in physiology, anatomy and the composition of food consumed within a paediatric population are likely to lead to food–drug interactions that cannot be predicted based on adult studies. Existing methods to predict food effects cannot be directly extrapolated to allow predictions within paediatric populations. Development of systematic methods and guidelines is needed to address the general lack of information on examining food–drug interactions within paediatric populations. PMID:27417362

  9. Identification and super-resolution imaging of ligand-activated receptor dimers in live cells

    NASA Astrophysics Data System (ADS)

    Winckler, Pascale; Lartigue, Lydia; Giannone, Gregory; de Giorgi, Francesca; Ichas, François; Sibarita, Jean-Baptiste; Lounis, Brahim; Cognet, Laurent

    2013-08-01

    Molecular interactions are key to many chemical and biological processes, such as protein function. In many signaling processes they occur in sub-cellular areas displaying nanoscale organization and involving molecular assemblies. The nanometric dimensions and the dynamic nature of the interactions make their investigation complex in live cells. While super-resolution fluorescence microscopies offer live-cell molecular imaging with sub-wavelength resolution, they lack the specificity for distinguishing interacting molecule populations. Here we combine super-resolution microscopy and single-molecule Förster resonance energy transfer (FRET) to identify dimers of receptors induced by ligand binding and to provide super-resolved images of their membrane distribution in live cells. By developing a two-color universal Point-Accumulation-In-the-Nanoscale-Topography (uPAINT) method, dimers of epidermal growth factor receptors (EGFR) activated by EGF are studied at ultra-high densities, revealing preferential cell-edge sub-localization. This methodology, which is specifically devoted to the study of interacting molecules, may find other applications in biological systems where an understanding of molecular organization is crucial.

  10. A Visual Programming Methodology for Tactical Aircrew Scheduling and Other Applications

    DTIC Science & Technology

    1991-12-01

    programming methodology and environment of a user-specific application remains with and is delivered as part of the application, then there is another factor...animation is useful, not only for scheduling applications, but as a general programming methodology. Of course, there are a number of improvements...possible using Excel because there is nothing to prevent access to cells. However, it is easy to imagine a spreadsheet which can support the

  11. Synthesis of qualitative linguistic research--a pilot review integrating and generalizing findings on doctor-patient interaction.

    PubMed

    Nowak, Peter

    2011-03-01

    There is a broad range of qualitative linguistic research (sequential analysis) on doctor-patient interaction that has had only a marginal impact on clinical research and practice. At least in part, this is due to the lack of qualitative research synthesis in the field: available research summaries are not systematic in their methodology. This paper proposes a synthesis methodology for qualitative, sequential analytic research on doctor-patient interaction. The presented methodology is not new, but it specifies the standard methodology of qualitative research synthesis for sequential analytic research. This pilot review synthesizes twelve studies on German-speaking doctor-patient interactions, identifies 45 verbal actions of doctors, and structures them into a systematics of eight interaction components. Three interaction components ("Listening", "Asking for information", and "Giving information") appear central and cover two thirds of the identified action types. This pilot review demonstrates that sequential analytic research can be synthesized in a consistent and meaningful way, thus providing a more comprehensive and unbiased integration of research. Future synthesis of qualitative research in the area of health communication research is much needed. Qualitative research synthesis can support the development of quantitative research and of educational materials in medical training and patient training. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  12. Atomistic and Coarse-Grained Modeling of the Adsorption of Graphene Nanoflakes at the Oil-Water Interface.

    PubMed

    Ardham, Vikram Reddy; Leroy, Frédéric

    2018-03-01

    The high interfacial tension between two immiscible liquids can provide the necessary driving force for the self-assembly of nanoparticles at the interface. In particular, the interface between water and oily liquids (hydrocarbon chains) has been exploited to prepare networks of highly interconnected graphene sheets only a few layers thick, which are well suited for industrial applications. Studying such complex systems through particle-based simulations could greatly enhance the understanding of the various driving forces in action and could possibly give more control over the self-assembly process. However, the interaction potentials used in particle-based simulations are typically derived by reproducing bulk properties and are therefore not suitable for describing systems dominated by interfaces. To address this issue, we introduce a methodology to derive solid-liquid interaction potentials that yield an accurate representation of the balance between interfacial interactions at atomistic and coarse-grained resolutions. Our approach is validated through its ability to lead to the adsorption of graphene nanoflakes at the interface between water and n-hexane. The development of accurate coarse-grained potentials that our approach enables will allow us to perform large-scale simulations to study the assembly of graphene nanoparticles at the interface between immiscible liquids. Our methodology is illustrated through a simulation of many graphene nanoflakes adsorbing at the interface.

  13. A Succinct Overview of Virtual Reality Technology Use in Alzheimer’s Disease

    PubMed Central

    García-Betances, Rebeca I.; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer’s disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers’ education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments. PMID:26029101

  14. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for the conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages, starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports the design of only one FCS architecture (yaw damper plus roll damper), and its rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for applying knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued, and evolved in an ongoing process of development.

  15. Employees' and Managers' Accounts of Interactive Workplace Learning: A Grounded Theory of "Complex Integrative Learning"

    ERIC Educational Resources Information Center

    Armson, Genevieve; Whiteley, Alma

    2010-01-01

    Purpose: The purpose of this paper is to investigate employees' and managers' accounts of interactive learning and what might encourage or inhibit emergent learning. Design/methodology/approach: The approach taken was a constructivist/social constructivist ontology, interpretive epistemology and qualitative methodology, using grounded theory…

  16. Transforming Violent Selves through Reflection in Critical Communicative Research

    ERIC Educational Resources Information Center

    Flecha, Ainhoa; Pulido, Cristina; Christou, Miranda

    2011-01-01

    Currently, teenagers are being socialized into a world of violent realities, not only through social interaction but also through interaction via the media, especially via the Internet. Research conducted using the critical communicative methodology has shown that this methodology helps young people to reflect critically about their violent…

  17. The effects of node exclusion on the centrality measures in graph models of interacting economic agents

    NASA Astrophysics Data System (ADS)

    Caetano, Marco Antonio Leonel; Yoneyama, Takashi

    2015-07-01

    This work concerns the effects felt by a network as a whole when a specific node is perturbed. Many real-world systems can be described by network models in which the interactions of the various agents are represented as the edges of a graph. With a graph model in hand, it is possible to evaluate the effect of deleting some of its edges on the architecture of the network and the values of its nodes. Eventually a node may end up isolated from the rest of the network, and an interesting problem is to have a quantitative measure of the impact of such an event. For instance, in the field of finance, network models are very popular, and the proposed methodology makes it possible to carry out "what if" tests in terms of weakening the links between the economic agents, represented as nodes. The two main concepts employed in the proposed methodology are (i) the vibrational IC (Information Centrality), which provides a measure of the relative importance of a particular node in a network, and (ii) autocatalytic networks, which can indicate the evolutionary trends of the network. Although these concepts were originally proposed in the context of other fields of knowledge, they were also found to be useful in analyzing financial networks. In order to illustrate the applicability of the proposed methodology, a case study using actual data comprising the stock market indices of 12 countries is presented.
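
    The effect of excluding a node can be sketched with a simple distance-based centrality. Harmonic centrality is used below as a stand-in for the paper's vibrational Information Centrality (an assumption of this sketch), and the toy "market" graph and its labels are hypothetical:

```python
from collections import deque

def harmonic(adj, node):
    """Harmonic centrality: sum of 1/d(node, v) over reachable v != node.
    `adj` maps each node to a set of neighbours (undirected, unweighted)."""
    dist, queue = {node: 0}, deque([node])
    while queue:                      # breadth-first search for distances
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return sum(1.0 / d for d in dist.values() if d > 0)

def remove_node(adj, node):
    """Copy of the graph with `node` and all its incident edges deleted."""
    return {u: {v for v in nbrs if v != node} for u, nbrs in adj.items() if u != node}

# Toy network of interacting agents: 'US' is the hub joining two wings.
g = {"US": {"UK", "JP"}, "UK": {"US", "DE"}, "DE": {"UK"}, "JP": {"US"}}
before = harmonic(g, "DE")
after = harmonic(remove_node(g, "US"), "DE")
```

    Deleting the hub isolates part of the network, and the centrality of the remaining nodes drops accordingly; unreachable nodes contribute zero, which is why a harmonic-style measure handles disconnection gracefully.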

  18. Graph theory enables drug repurposing--how a mathematical model can drive the discovery of hidden mechanisms of action.

    PubMed

    Gramatica, Ruggero; Di Matteo, T; Giorgetti, Stefano; Barbiani, Massimo; Bevec, Dorian; Aste, Tomaso

    2014-01-01

    We introduce a methodology to efficiently exploit biomedical knowledge expressed in natural language for repurposing existing drugs towards diseases for which they were not initially intended. Leveraging developments in computational linguistics and graph theory, a methodology is defined to build a graph representation of knowledge, which is automatically analysed to discover hidden relations between any drug and any disease: these relations are specific paths among the biomedical entities of the graph, representing possible Modes of Action for any given pharmacological compound. We propose a measure for the likeliness of these paths based on a stochastic process on the graph. This measure depends on the abundance of indirect paths between a peptide and a disease, rather than solely on the strength of the shortest path connecting them. We provide real-world examples, showing how the method successfully retrieves known pathophysiological Modes of Action and finds new ones by meaningfully selecting and aggregating contributions from known bio-molecular interactions. Applications of this methodology are presented, demonstrating the efficacy of the method for selecting drugs as treatment options for rare diseases.
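
    The path-abundance idea can be sketched as a damped sum of random-walk probabilities over increasing walk lengths. This is an illustrative stand-in for the paper's stochastic measure, not its actual formulation, and the tiny knowledge graph is hypothetical:

```python
def matmul(a, b):
    """Multiply two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def path_likeliness(adj, src, dst, damping=0.5, max_len=6):
    """Sum of damping**k * (P**k)[src][dst] for walk lengths k = 1..max_len,
    where P is the row-stochastic transition matrix of the graph: many short
    indirect routes raise the score, not just one strong shortest path."""
    p = []
    for row in adj:
        s = sum(row)
        p.append([x / s if s else 0.0 for x in row])
    power, score = p, 0.0
    for k in range(1, max_len + 1):
        score += damping ** k * power[src][dst]
        power = matmul(power, p)
    return score

# Hypothetical knowledge graph: 0 = drug, 1-2 = proteins, 3-4 = diseases.
adj = [[0, 1, 1, 0, 0],   # drug binds proteins 1 and 2
       [1, 0, 0, 1, 0],   # protein 1 linked to disease 3
       [1, 0, 0, 1, 1],   # protein 2 linked to diseases 3 and 4
       [0, 1, 1, 0, 0],
       [0, 0, 1, 0, 0]]
strong = path_likeliness(adj, 0, 3)  # two independent indirect routes
weak = path_likeliness(adj, 0, 4)    # a single indirect route
```

    The disease reachable through two proteins scores higher than the one reachable through only one, capturing the "abundance of indirect paths" intuition.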

  19. [Methodological approach to the use of artificial neural networks for predicting results in medicine].

    PubMed

    Trujillano, Javier; March, Jaume; Sorribas, Albert

    2004-01-01

    In clinical practice, there is an increasing interest in obtaining adequate prediction models. Among the available alternatives, artificial neural networks (ANN) are increasingly used. In this review we first introduce the ANN methodology, describing the most common type of ANN, the multilayer perceptron trained with the backpropagation algorithm (MLP). We then compare the MLP with logistic regression (LR). Finally, we show a practical scheme for building an ANN-based application by means of an example with actual data. The main advantage of ANN is their capacity to incorporate nonlinear effects and interactions between the variables of the model without the need to include them a priori. Their main disadvantages are that their parameters are difficult to interpret and that their construction and training involve considerable empiricism. ANN are useful for computing the probability of a given outcome based on a set of predictor variables, and in some cases they obtain better results than LR. The two methodologies, ANN and LR, are complementary and help us to obtain more valid models.
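
    The MLP's ability to capture an interaction that LR cannot is illustrated below on XOR, the classic interaction effect no single logistic unit can fit. The weights are hand-set for clarity (an assumption of this sketch); in practice backpropagation would learn them from data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def mlp_predict(x, hidden, output):
    """One-hidden-layer perceptron; each unit is a (weights, bias) pair."""
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b) for ws, b in hidden]
    ws, b = output
    return sigmoid(sum(w * hi for w, hi in zip(ws, h)) + b)

# Hand-set weights realising XOR as a composition of soft logic gates.
hidden = [([20.0, 20.0], -10.0),    # ~ logical OR of the two inputs
          ([-20.0, -20.0], 30.0)]   # ~ logical NAND of the two inputs
output = ([20.0, 20.0], -30.0)      # ~ logical AND of the hidden units

preds = [mlp_predict(x, hidden, output)
         for x in ([0, 0], [0, 1], [1, 0], [1, 1])]
```

    The output probability is high exactly when the inputs differ, something LR could only achieve by adding the interaction term by hand.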

  20. Categorizing Biases in High-Confidence High-Throughput Protein-Protein Interaction Data Sets*

    PubMed Central

    Yu, Xueping; Ivanic, Joseph; Memišević, Vesna; Wallqvist, Anders; Reifman, Jaques

    2011-01-01

    We characterized and evaluated the functional attributes of three yeast high-confidence protein-protein interaction data sets derived from affinity purification/mass spectrometry, protein-fragment complementation assay, and yeast two-hybrid experiments. The interacting proteins retrieved from these data sets formed distinct, partially overlapping sets with different protein-protein interaction characteristics. These differences were primarily a function of the deployed experimental technologies used to recover these interactions. This affected the total coverage of interactions and was especially evident in the recovery of interactions among different functional classes of proteins. We found that the interaction data obtained by the yeast two-hybrid method was the least biased toward any particular functional characterization. In contrast, interacting proteins in the affinity purification/mass spectrometry and protein-fragment complementation assay data sets were over- and under-represented among distinct and different functional categories. We delineated how these differences affected protein complex organization in the network of interactions, in particular for strongly interacting complexes (e.g. RNA and protein synthesis) versus weak and transient interacting complexes (e.g. protein transport). We quantified methodological differences in detecting protein interactions from larger protein complexes, in the correlation of protein abundance among interacting proteins, and in their connectivity of essential proteins. In the latter case, we showed that minimizing inherent methodology biases removed many of the ambiguous conclusions about protein essentiality and protein connectivity. We used these findings to rationalize how biological insights obtained by analyzing data sets originating from different sources sometimes do not agree or may even contradict each other. 
An important corollary of this work was that discrepancies in biological insights did not necessarily imply that one detection methodology was better or worse, but rather that, to a large extent, the insights reflected the methodological biases themselves. Consequently, interpreting the protein interaction data within their experimental or cellular context provided the best avenue for overcoming biases and inferring biological knowledge. PMID:21876202
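    The notion of "distinct, partially overlapping sets" of interactions can be illustrated with a minimal sketch: a generic Jaccard overlap computed on toy protein pairs. This is an illustrative measure only, not the authors' actual metric or data.

```python
def normalize(pairs):
    # Treat each interaction as an unordered protein pair
    return {frozenset(p) for p in pairs}

def jaccard(set_a, set_b):
    # Fraction of interactions shared between two data sets
    a, b = normalize(set_a), normalize(set_b)
    return len(a & b) / len(a | b)

# Toy interaction lists standing in for, e.g., AP/MS and Y2H data sets
apms = [("A", "B"), ("B", "C"), ("C", "D")]
y2h = [("B", "A"), ("C", "E")]
overlap = jaccard(apms, y2h)  # ("A","B") is the one shared pair of 4 total -> 0.25
```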

  1. Methodology of citrate-based biomaterial development and application

    NASA Astrophysics Data System (ADS)

    Tran, M. Richard

    Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and mode of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development, which is able to address a broad spectrum of requirements, would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture.
    To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to meet the needs of a particular application. Next, we introduced a new porogen generation technique, and showed the potential application of the newly developed materials through the fabrication and characterization of scaffold sheets, multiphasic small diameter vascular grafts, and multichanneled nerve guides. Finally, the in vivo applications of citrate-based materials are exemplified through the evaluation of peripheral nerve regeneration using multichanneled guides and the ability to assist in injection-based endoscopic mucosal resection therapy. The results presented in this work show that citric acid can be utilized as a cornerstone in the development of novel biodegradable materials, and combined with microengineering approaches to produce the next generation of tissue engineering scaffolding. These enabling new biomaterials and scaffolding strategies should address many of the existing challenges in tissue engineering and advance the field as a whole.

  2. Physics and complexity.

    PubMed

    Sherrington, David

    2010-03-13

    This paper is concerned with complex macroscopic behaviour arising in many-body systems through the combinations of competitive interactions and disorder, even with simple ingredients at the microscopic level. It attempts to indicate and illustrate the richness that has arisen, in conceptual understanding, in methodology and in application, across a large range of scientific disciplines, together with a hint of some of the further opportunities that remain to be tapped. In doing so, it takes the perspective of physics and tries to show, albeit rather briefly, how physics has contributed and been stimulated.

  3. Sensor Fusion of Gaussian Mixtures for Ballistic Target Tracking in the Re-Entry Phase

    PubMed Central

    Lu, Kelin; Zhou, Rui

    2016-01-01

    A sensor fusion methodology for the Gaussian mixtures model is proposed for ballistic target tracking with unknown ballistic coefficients. To improve the estimation accuracy, a track-to-track fusion architecture is proposed to fuse tracks provided by the local interacting multiple model filters. During the fusion process, the duplicate information is removed by considering the first order redundant information between the local tracks. With extensive simulations, we show that the proposed algorithm improves the tracking accuracy in ballistic target tracking in the re-entry phase applications. PMID:27537883

  4. Heterobimetallic Pd-Sn catalysis: Michael addition reaction with C-, N-, O-, and S-nucleophiles and in situ diagnostics.

    PubMed

    Das, Debjit; Pratihar, Sanjay; Roy, Sujit

    2013-03-15

    An efficient Michael addition reaction of differently substituted enones with carbon, sulfur, oxygen, and nitrogen nucleophiles has been achieved by a new heterobimetallic "Pd-Sn" catalyst system. The nature of the catalytically relevant species and their interactions with the enone moiety has been examined by spectroscopy. The effect of ligand and the coordination mode of enone with "Pd-Sn" heterobimetallic system have been investigated by kinetics and DFT studies. A straightforward application of this methodology is shown in the synthesis of 1,4-oxathiepane core.

  5. Joint symbolic dynamics for the assessment of cardiovascular and cardiorespiratory interactions

    PubMed Central

    Baumert, Mathias; Javorka, Michal; Kabir, Muammar

    2015-01-01

    Beat-to-beat variations in heart period provide information on cardiovascular control and are closely linked to variations in arterial pressure and respiration. Joint symbolic analysis of heart period, systolic arterial pressure and respiration allows for a simple description of their shared short-term dynamics that are governed by cardiac baroreflex control and cardiorespiratory coupling. In this review, we discuss methodology and research applications. Studies suggest that analysis of joint symbolic dynamics provides a powerful tool for identifying physiological and pathophysiological changes in cardiovascular and cardiorespiratory control. PMID:25548272
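    The joint symbolic idea can be sketched as follows; the binary increase/no-increase symbolization and the word length of three are simplifying assumptions for illustration, not the exact scheme used in the review.

```python
def symbolize(series):
    # Map beat-to-beat differences to symbols: 1 = increase, 0 = no increase
    return [1 if b > a else 0 for a, b in zip(series, series[1:])]

def joint_word_distribution(x, y, word_len=3):
    # Build joint symbol words from two simultaneously sampled series
    # and return their relative frequencies.
    sx, sy = symbolize(x), symbolize(y)
    counts = {}
    for i in range(len(sx) - word_len + 1):
        word = (tuple(sx[i:i + word_len]), tuple(sy[i:i + word_len]))
        counts[word] = counts.get(word, 0) + 1
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# Toy beat-to-beat series: heart period (ms) and systolic pressure (mmHg)
hp = [800, 810, 805, 820, 815, 830, 825]
sap = [120, 122, 121, 124, 123, 126, 125]
dist = joint_word_distribution(hp, sap)
```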

  6. Joint symbolic dynamics for the assessment of cardiovascular and cardiorespiratory interactions.

    PubMed

    Baumert, Mathias; Javorka, Michal; Kabir, Muammar

    2015-02-13

    Beat-to-beat variations in heart period provide information on cardiovascular control and are closely linked to variations in arterial pressure and respiration. Joint symbolic analysis of heart period, systolic arterial pressure and respiration allows for a simple description of their shared short-term dynamics that are governed by cardiac baroreflex control and cardiorespiratory coupling. In this review, we discuss methodology and research applications. Studies suggest that analysis of joint symbolic dynamics provides a powerful tool for identifying physiological and pathophysiological changes in cardiovascular and cardiorespiratory control.

  7. Physics and complexity

    PubMed Central

    Sherrington, David

    2010-01-01

    This paper is concerned with complex macroscopic behaviour arising in many-body systems through the combinations of competitive interactions and disorder, even with simple ingredients at the microscopic level. It attempts to indicate and illustrate the richness that has arisen, in conceptual understanding, in methodology and in application, across a large range of scientific disciplines, together with a hint of some of the further opportunities that remain to be tapped. In doing so, it takes the perspective of physics and tries to show, albeit rather briefly, how physics has contributed and been stimulated. PMID:20123753

  8. Shedding Light on Anesthetic Mechanisms: Application of Photoaffinity Ligands

    PubMed Central

    Woll, Kellie A.; Dailey, William P.; Brannigan, Grace; Eckenhoff, Roderic G.

    2016-01-01

    Anesthetic photoaffinity ligands have had an increasing presence within anesthesiology research. These ligands mimic parent general anesthetics, and allow investigators to study anesthetic interactions with receptors and enzymes; identify novel targets; and determine distribution within biological systems. To date nearly all general anesthetics used in medicine have a corresponding photoaffinity ligand represented in the literature. In this review we examine all aspects of the current methodologies, including ligand design, characterization and deployment. Finally we offer points of consideration and highlight the future outlook as more photoaffinity ligands emerge within the field. PMID:27464974

  9. Shedding Light on Anesthetic Mechanisms: Application of Photoaffinity Ligands.

    PubMed

    Woll, Kellie A; Dailey, William P; Brannigan, Grace; Eckenhoff, Roderic G

    2016-11-01

    Anesthetic photoaffinity ligands have had an increasing presence within anesthesiology research. These ligands mimic parent general anesthetics and allow investigators to study anesthetic interactions with receptors and enzymes; identify novel targets; and determine distribution within biological systems. To date, nearly all general anesthetics used in medicine have a corresponding photoaffinity ligand represented in the literature. In this review, we examine all aspects of the current methodologies, including ligand design, characterization, and deployment. Finally we offer points of consideration and highlight the future outlook as more photoaffinity ligands emerge within the field.

  10. Sensor Fusion of Gaussian Mixtures for Ballistic Target Tracking in the Re-Entry Phase.

    PubMed

    Lu, Kelin; Zhou, Rui

    2016-08-15

    A sensor fusion methodology for the Gaussian mixtures model is proposed for ballistic target tracking with unknown ballistic coefficients. To improve the estimation accuracy, a track-to-track fusion architecture is proposed to fuse tracks provided by the local interacting multiple model filters. During the fusion process, the duplicate information is removed by considering the first order redundant information between the local tracks. With extensive simulations, we show that the proposed algorithm improves the tracking accuracy in ballistic target tracking in the re-entry phase applications.

  11. Cellular Automata with Anticipation: Examples and Presumable Applications

    NASA Astrophysics Data System (ADS)

    Krushinsky, Dmitry; Makarenko, Alexander

    2010-11-01

    One of the most promising new methodologies for modelling is the so-called cellular automata (CA) approach. According to this paradigm, models are built from simple elements connected into regular structures with local interaction between neighbours. The patterns of connections usually have a simple geometry (lattices). A classical example of CA is the game `Life' by J. Conway. This paper presents two examples of CA with the anticipation property. These examples include a modification of the game `Life' and a cellular model of crowd movement.
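    For readers unfamiliar with the CA paradigm, a minimal sketch of one standard (non-anticipatory) update step of Conway's `Life' is given below; the anticipatory modification described in the paper is not reproduced here.

```python
from collections import Counter

def life_step(alive):
    # One synchronous update of Conway's Game of Life on an unbounded grid.
    # `alive` is a set of (x, y) coordinates of currently live cells.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in alive
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next step if it has 3 live neighbours,
    # or 2 live neighbours and it is already alive.
    return {
        cell for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in alive)
    }

# A "blinker" oscillates between a horizontal and a vertical bar (period 2)
blinker = {(0, 0), (1, 0), (2, 0)}
```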

  12. Understanding the electron-phonon interaction in polar crystals: Perspective presented by the vibronic theory

    NASA Astrophysics Data System (ADS)

    Pishtshev, A.; Kristoffel, N.

    2017-05-01

    We outline our novel results relating to the physics of the electron-TO-phonon (el-TO-ph) interaction in a polar crystal. We explained why the el-TO-ph interaction becomes effectively strong in a ferroelectric, and showed how the electron density redistribution establishes favorable conditions for soft behavior of the long-wavelength branch of the active TO vibration. In the context of the vibronic theory it has been demonstrated that at the macroscopic level the interaction of electrons with the polar zone-centre TO phonons can be associated with the internal long-range dipole forces. We also elucidated a methodological issue of how local field effects are incorporated within the vibronic theory. These results provided not only substantial support for the vibronic mechanism of ferroelectricity but also presented direct evidence of equivalence between the vibronic and other lattice dynamics models. The corresponding comparison allowed us to introduce an original parametrization for the constants of the vibronic interaction in terms of key material constants. The applicability of the suggested formula has been tested for a wide class of polar materials.

  13. Infant feeding: the interfaces between interaction design and cognitive ergonomics in user-centered design.

    PubMed

    Lima, Flavia; Araújo, Lilian Kely

    2012-01-01

    This text presents a discussion of the process of developing interactive products focused on infant behavior, the result of which was an interactive game for encouraging infant feeding. To that end, it describes the use of cognitive psychology concepts combined with interaction design methodology. Through this project, this article shows how the cooperative use of these concepts provides solutions well-adjusted to users' needs, whatever they are. It also notes the closeness of these methodologies to neighbouring areas of knowledge, such as user-centered design and ergonomics.

  14. Study of Fluorinated Quantum Dots-Protein Interactions at the Oil/Water Interface by Interfacial Surface Tension Changes.

    PubMed

    Carrillo-Carrión, Carolina; Gallego, Marta; Parak, Wolfgang J; Carril, Mónica

    2018-05-08

    Understanding the interaction of nanoparticles with proteins and how this interaction modifies the nanoparticles’ surface is crucial before their use in biomedical applications. Since fluorinated materials are emerging as potential imaging probes and delivery vehicles, their interaction with proteins of biological interest must be studied in order to predict their performance in real scenarios. It is known that fluorinated planar surfaces may repel the unspecific adsorption of proteins, but little is known about the same process on fluorinated nanoparticles due to the scarce examples in the literature. In this context, the aim of this work is to propose a simple and fast methodology to study fluorinated nanoparticle-protein interactions based on interfacial surface tension (IFT) measurements. This technique is particularly interesting for fluorinated nanoparticles due to their increased hydrophobicity. Our study is based on the determination of IFT variations due to the interaction of quantum dots of ca. 5 nm inorganic core/shell diameter coated with fluorinated ligands (QD_F) with several proteins at the oil/water interface. Based on the results, we conclude that the presence of QD_F does not disrupt spontaneous protein film formation at the oil/water interface. Although film formation in the presence of QD_F proceeds at a slower rate at very low protein concentrations, the final interfacial tension reached is similar to that obtained in the absence of QD_F. The differential behaviour of the studied proteins (bovine serum albumin, fibrinogen and apotransferrin) has been discussed on the basis of the adsorption affinity of each protein towards the DCM/water interface and their different sizes. Additionally, it has been clearly demonstrated that the proposed methodology can serve as a complementary technique to other reported direct and indirect methods for the evaluation of nanoparticle-protein interactions at low protein concentrations.

  15. Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics

    DTIC Science & Technology

    1988-12-01

    ... dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS: Prediction methods can be classified into two main approaches: (1) correlation methodologies ... data are available. V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES: 1. COMPONENT BUILD-UP IN DRAG: The new correlation can be used for an engineering ...

  16. Educational Design as Conversation: A Conversation Analytical Perspective on Teacher Dialogue

    ERIC Educational Resources Information Center

    van Kruiningen, Jacqueline F.

    2013-01-01

    The aim of this methodological paper is to expound on and demonstrate the value of conversation-analytical research in the area of (informal) teacher learning. The author discusses some methodological issues in current research on interaction in teacher learning and holds a plea for conversation-analytical research on interactional processes in…

  17. Cucurbit[n]uril-Based Microcapsules Self-Assembled within Microfluidic Droplets: A Versatile Approach for Supramolecular Architectures and Materials

    PubMed Central

    2017-01-01

    Conspectus: Microencapsulation is a fundamental concept behind a wide range of daily applications ranging from paints, adhesives, and pesticides to targeted drug delivery, transport of vaccines, and self-healing concretes. The beauty of microfluidics to generate microcapsules arises from the capability of fabricating monodisperse and micrometer-scale droplets, which can lead to microcapsules/particles with fine-tuned control over size, shape, and hierarchical structure, as well as high reproducibility, efficient material usage, and high-throughput manipulation. The introduction of supramolecular chemistry, such as host–guest interactions, endows the resultant microcapsules with stimuli-responsiveness and self-adjusting capabilities, and facilitates hierarchical microstructures with tunable stability and porosity, leading to the maturity of the current microencapsulation industry. Supramolecular architectures and materials have attracted immense attention over the past decade, as they open the possibility to obtain a large variety of aesthetically pleasing structures, with myriad applications in biomedicine, energy, sensing, catalysis, and biomimicry, on account of the inherent reversible and adaptive nature of supramolecular interactions. As a subset of supramolecular interactions, host–guest molecular recognition involves the formation of inclusion complexes between two or more moieties, with specific three-dimensional structures and spatial arrangements, in a highly controllable and cooperative manner. Such highly selective, strong yet dynamic interactions could be exploited as an alternative methodology for programmable and controllable engineering of supramolecular architectures and materials, exploiting reversible interactions between complementary components. Through the engineering of molecular structures, assemblies can be readily functionalized based on host–guest interactions, with desirable physicochemical characteristics.
    In this Account, we summarize the current state of development in the field of monodisperse supramolecular microcapsules, fabricated through the integration of traditional microfluidic techniques and interfacial host–guest chemistry, specifically cucurbit[n]uril (CB[n])-mediated host–guest interactions. Three different strategies, colloidal particle-driven assembly, interfacial condensation-driven assembly and electrostatic interaction-driven assembly, are classified and discussed in detail, presenting the methodology involved in each microcapsule formation process. We highlight the state-of-the-art in design and control over structural complexity with desirable functionality, as well as promising applications, such as cargo delivery, stemming from the assembled microcapsules. On account of its dynamic nature, the CB[n]-mediated host–guest complexation has demonstrated efficient response toward various external stimuli such as UV light, pH change, redox chemistry, and competitive guests. Herein, we also demonstrate different microcapsule modalities, which are engineered with CB[n] host–guest chemistry and can also be disrupted with the aid of external stimuli, for triggered release of payloads. In addition to the overview of recent achievements and current limitations of these microcapsules, we finally summarize several perspectives on tunable cargo loading and triggered release, directions and challenges for this technology, as well as possible strategies for further improvement, which will lead to substantial progress of host–guest chemistry in supramolecular architectures and materials. PMID:28075551

  18. Assessment of cardio-respiratory interactions in preterm infants by bivariate autoregressive modeling and surrogate data analysis.

    PubMed

    Indic, Premananda; Bloch-Salisbury, Elisabeth; Bednarek, Frank; Brown, Emery N; Paydarfar, David; Barbieri, Riccardo

    2011-07-01

    Cardio-respiratory interactions are weak at the earliest stages of human development, suggesting that assessment of their presence and integrity may be an important indicator of development in infants. Despite the valuable research devoted to infant development, there is still a need for specifically targeted standards and methods to assess cardiopulmonary functions in the early stages of life. We present a new methodological framework for the analysis of cardiovascular variables in preterm infants. Our approach is based on a set of mathematical tools that have been successful in quantifying important cardiovascular control mechanisms in adult humans, here specifically adapted to reflect the physiology of the developing cardiovascular system. We applied our methodology in a study of cardio-respiratory responses for 11 preterm infants. We quantified cardio-respiratory interactions using specifically tailored multivariate autoregressive analysis and calculated the coherence as well as gain using causal approaches. The significance of the interactions in each subject was determined by surrogate data analysis. The method was tested in control conditions as well as in two different experimental conditions: with and without the use of mild mechanosensory intervention. Our multivariate analysis revealed a significantly higher coherence, as confirmed by surrogate data analysis, in the frequency range associated with eupneic breathing compared to the other ranges. Our analysis validates the models behind our new approaches, and our results confirm the presence of cardio-respiratory coupling in early stages of development, particularly during periods of mild mechanosensory intervention, thus encouraging further application of our approach.

  19. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object oriented based applications, but fail to capture…

  20. Data mining in soft computing framework: a survey.

    PubMed

    Mitra, S; Pal, S K; Mitra, P

    2002-01-01

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally fuzzy sets are suitable for handling the issues related to understandability of patterns, incomplete/noisy data, mixed media information and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.

  1. An Innovative Structural Mode Selection Methodology: Application for the X-33 Launch Vehicle Finite Element Model

    NASA Technical Reports Server (NTRS)

    Hidalgo, Homero, Jr.

    2000-01-01

    An innovative methodology for determining structural target mode selection and mode selection based on a specific criterion is presented. An effective approach to single out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). The presented Root-Sum-Square (RSS) displacement method computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts the results to locate the modes with the highest values. This method was used to determine the modes which most influenced specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, propellant valves and engine points, for use in flight control stability analysis and flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to be identified in an accurate and efficient manner.
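    The RSS ranking idea can be sketched in a few lines; the mode-shape matrix and DOF selection below are hypothetical placeholders, not the actual X-33 FEM data.

```python
import numpy as np

def rank_modes_by_rss(mode_shapes, selected_dofs, top_k=3):
    """Rank modes by root-sum-square displacement over selected DOFs.

    mode_shapes: (n_dofs, n_modes) array of modal displacements.
    selected_dofs: indices of the DOFs of interest (e.g., actuator points).
    Returns the indices of the top_k modes plus all RSS values.
    """
    sub = mode_shapes[selected_dofs, :]        # displacements at chosen DOFs
    rss = np.sqrt((sub ** 2).sum(axis=0))      # one RSS value per mode
    order = np.argsort(rss)[::-1]              # highest RSS first
    return order[:top_k], rss

# Hypothetical 3-DOF, 3-mode example
shapes = np.array([[0.1, 0.9, 0.0],
                   [0.2, 0.8, 0.1],
                   [0.0, 0.1, 1.0]])
top, rss = rank_modes_by_rss(shapes, selected_dofs=[0, 1], top_k=2)
```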

  2. A computational model for the prediction of jet entrainment in the vicinity of nozzle boattails (the BOAT code)

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Pergament, H. S.

    1978-01-01

    The development of a computational model (BOAT) for calculating nearfield jet entrainment, and its incorporation in an existing methodology for the prediction of nozzle boattail pressures, is discussed. The model accounts for the detailed turbulence and thermochemical processes occurring in the mixing layer formed between a jet exhaust and surrounding external stream while interfacing with the inviscid exhaust and external flowfield regions in an overlaid, interactive manner. The ability of the BOAT model to analyze simple free shear flows is assessed by comparisons with fundamental laboratory data. The overlaid procedure for incorporating variable pressures into BOAT and the entrainment correction employed to yield an effective plume boundary for the inviscid external flow are demonstrated. This is accomplished via application of BOAT in conjunction with the codes comprising the NASA/LRC patched viscous/inviscid methodology for determining nozzle boattail drag for subsonic/transonic external flows.

  3. Parametric and energy consumption optimization of Basic Red 2 removal by electrocoagulation/egg shell adsorption coupling using response surface methodology in a batch system.

    PubMed

    de Carvalho, Helder Pereira; Huang, Jiguo; Zhao, Meixia; Liu, Gang; Yang, Xinyu; Dong, Lili; Liu, Xingjuan

    2016-01-01

    In this study, a response surface methodology (RSM) model was applied to the optimization of Basic Red 2 (BR2) removal using an electrocoagulation/eggshell (ES) coupling process in a batch system. A central composite design was used to evaluate the effects and interactions of process parameters, including current density, reaction time, initial pH and ES dosage, on the BR2 removal efficiency and energy consumption. The analysis of variance revealed high R(2) values (≥85%), indicating that the predictions of the RSM models are adequately applicable for both responses. The optimum conditions, under which a dye removal efficiency of 93.18% and an energy consumption of 0.840 kWh/kg were observed, were a current density of 11.40 mA/cm(2), a reaction time of 5 min and 3 s, an initial pH of 6.5 and an ES dosage of 10.91 g/L.
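    The quadratic response-surface model underlying RSM can be sketched as an ordinary least-squares fit of a second-order polynomial in the factors; the two-factor design points and responses below are hypothetical, not the study's data.

```python
import numpy as np

def fit_quadratic_surface(X, y):
    # Least-squares fit of y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2,
    # the second-order model form used in response surface methodology.
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, x1, x2):
    # Evaluate the fitted surface at a single factor combination
    return coef @ np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

# Hypothetical full 3x3 factorial design (e.g., current density x initial pH)
X = np.array([[1., 4.], [1., 6.5], [1., 9.],
              [2., 4.], [2., 6.5], [2., 9.],
              [3., 4.], [3., 6.5], [3., 9.]])
y = np.array([60., 72., 70., 80., 93., 88., 75., 86., 85.])
coef = fit_quadratic_surface(X, y)
```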

  4. Forecast of future aviation fuels: The model

    NASA Technical Reports Server (NTRS)

    Ayati, M. B.; Liu, C. Y.; English, J. M.

    1981-01-01

    A conceptual model of the commercial air transportation industry is developed which can be used to predict trends in economics, demand, and consumption. The methodology is based on digraph theory, which considers the interaction of variables and the propagation of changes. Air transportation economics are treated by examination of major variables, their relationships, historic trends, and calculation of regression coefficients. A description of the modeling technique and a compilation of historic airline industry statistics used to determine interaction coefficients are included. Results of model validations show negligible difference between actual and projected values over the twenty-eight year period of 1959 to 1976. A limited application of the method presents forecasts of air transportation industry demand, growth, revenue, costs, and fuel consumption to 2020 for two scenarios of future economic growth and energy consumption.

  5. Molecular Force Spectroscopy on Cells

    NASA Astrophysics Data System (ADS)

    Liu, Baoyu; Chen, Wei; Zhu, Cheng

    2015-04-01

    Molecular force spectroscopy has become a powerful tool to study how mechanics regulates biology, especially the mechanical regulation of molecular interactions and its impact on cellular functions. This force-driven methodology has uncovered a wealth of new information of the physical chemistry of molecular bonds for various biological systems. The new concepts, qualitative and quantitative measures describing bond behavior under force, and structural bases underlying these phenomena have substantially advanced our fundamental understanding of the inner workings of biological systems from the nanoscale (molecule) to the microscale (cell), elucidated basic molecular mechanisms of a wide range of important biological processes, and provided opportunities for engineering applications. Here, we review major force spectroscopic assays, conceptual developments of mechanically regulated kinetics of molecular interactions, and their biological relevance. We also present current challenges and highlight future directions.

  6. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site.

    PubMed

    Agüero, A; Pinedo, P; Simón, I; Cancio, D; Moraleda, M; Trueba, C; Pérez-Sánchez, D

    2008-09-15

    A methodological approach that includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS "Reference Biospheres Methodology". Biosphere assessments have to be undertaken with the aim of demonstrating compliance with principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of various radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, as well as its application in a Spanish generic case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact on the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of (36)Cl, (79)Se, (99)Tc, (129)I, (135)Cs, (226)Ra, (231)Pa, (238)U, (237)Np and (239)Pu have been taken. Two exposure groups of infants and adults have been chosen for dose calculations.
Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.
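For orientation, ingestion-pathway dose calculations of the kind mentioned above typically multiply an activity concentration by a consumption rate and a nuclide-specific dose coefficient. A minimal sketch; all concentrations, intake rates and dose coefficients below are illustrative placeholders, not values from the Spanish case study:

```python
def ingestion_dose(conc_bq_per_kg, intake_kg_per_y, dose_coeff_sv_per_bq):
    """Annual committed effective dose (Sv/y) from ingesting one foodstuff."""
    return conc_bq_per_kg * intake_kg_per_y * dose_coeff_sv_per_bq

# (pathway, concentration Bq/kg, intake kg/y, dose coefficient Sv/Bq)
# All numeric values are hypothetical, chosen for illustration only.
pathways = [
    ("drinking water",   1.0, 600.0, 1.1e-10),
    ("leafy vegetables", 5.0,  40.0, 1.1e-10),
]
total = sum(ingestion_dose(c, q, dc) for _, c, q, dc in pathways)
print(f"total annual ingestion dose: {total:.2e} Sv")
```

A full biosphere assessment sums such terms over all exposure pathways (ingestion, inhalation, external irradiation) and exposure groups; this sketch shows only the arithmetic skeleton of the ingestion part.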

  7. Application of Executable Architecture in Early Concept Evaluation using the DoD Architecture Framework

    DTIC Science & Technology

    2016-09-15

[Indexed excerpt contains only table-of-contents fragments: Methodology Overview; III. Methodology; Overview of Research Methodology; Implementation of Methodology.]

  8. Development and application of a DNA microarray-based yeast two-hybrid system

    PubMed Central

    Suter, Bernhard; Fontaine, Jean-Fred; Yildirimman, Reha; Raskó, Tamás; Schaefer, Martin H.; Rasche, Axel; Porras, Pablo; Vázquez-Álvarez, Blanca M.; Russ, Jenny; Rau, Kirstin; Foulle, Raphaele; Zenkner, Martina; Saar, Kathrin; Herwig, Ralf; Andrade-Navarro, Miguel A.; Wanker, Erich E.

    2013-01-01

The yeast two-hybrid (Y2H) system is the most widely applied methodology for systematic protein–protein interaction (PPI) screening and the generation of comprehensive interaction networks. We developed a novel Y2H interaction screening procedure using DNA microarrays for high-throughput quantitative PPI detection. Applying a global pooling and selection scheme to a large collection of human open reading frames, proof-of-principle Y2H interaction screens were performed for the human neurodegenerative disease proteins huntingtin and ataxin-1. Using systematic controls for unspecific Y2H results and quantitative benchmarking, we identified and scored a large number of known and novel partner proteins for both huntingtin and ataxin-1. Moreover, we show that this parallelized screening procedure and the global inspection of Y2H interaction data are uniquely suited to define specific PPI patterns and their alteration by disease-causing mutations in huntingtin and ataxin-1. This approach takes advantage of the specificity and flexibility of DNA microarrays and of the existence of well-established statistical methods for the analysis of DNA microarray data, and allows a quantitative approach toward interaction screens in human and in model organisms. PMID:23275563

  9. Quantum chemical exploration of the intramolecular hydrogen bond interaction in 2-thiazol-2-yl-phenol and 2-benzothiazol-2-yl-phenol in the context of excited-state intramolecular proton transfer: A focus on the covalency in hydrogen bond

    NASA Astrophysics Data System (ADS)

    Paul, Bijan Kumar; Ganguly, Aniruddha; Guchhait, Nikhil

    2014-10-01

The present work demonstrates a computational exploration of the intramolecular H-bond (IMHB) interaction in two model heterocyclic compounds - 2-thiazol-2-yl-phenol (2T2YP) and 2-benzothiazol-2-yl-phenol (2B2YP) - by meticulous application of various quantum chemical tools. Major emphasis is placed on the analysis of the IMHB interaction through calculation of the electron density ρ(r) and its Laplacian ∇²ρ(r) at the bond critical point using the Atoms-In-Molecules methodology. Topological features based on ρ(r) suggest that at the equilibrium geometry the IMHB interaction develops certain characteristics typical of a covalent interaction. The interplay between aromaticity and Resonance-Assisted H-Bonding (RAHB) has also been discussed using both geometrical and magnetic criteria. The occurrence of the IMHB interaction in 2T2YP and 2B2YP has also been examined within the framework of Natural Bond Orbital (NBO) analysis. The ESIPT phenomenon in these molecular systems is also critically addressed in the context of potential energy surface (PES) analysis.

  10. Object-oriented microcomputer software for earthquake seismology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroeger, G.C.

    1993-02-01

A suite of graphically interactive applications for the retrieval, editing and modeling of earthquake seismograms has been developed using object-oriented programming methodology and the C++ language. Retriever is an application that allows the user to search for, browse, and extract seismic data from CD-ROMs produced by the National Earthquake Information Center (NEIC). The user can restrict the date, size, location and depth of desired earthquakes and extract selected data into a variety of common seismic file formats. Reformer is an application that allows the user to edit seismic data and data headers, and perform a variety of signal processing operations on that data. Synthesizer is a program for the generation and analysis of teleseismic P and SH synthetic seismograms. The program provides graphical manipulation of source parameters, crustal structures and seismograms, as well as near real-time response in generating synthetics for arbitrary flat-layered crustal structures. All three applications use class libraries developed for implementing geologic and seismic objects and views. Standard seismogram view objects and objects that encapsulate the reading and writing of different seismic data file formats are shared by all three applications. The focal mechanism views in Synthesizer are based on a generic stereonet view object. Interaction with the native graphical user interface is encapsulated in a class library in order to simplify porting of the software to different operating systems and application programming interfaces. The software was developed on the Apple Macintosh and is being ported to UNIX/X-Window platforms.

  11. Interactive learning and action: realizing the promise of synthetic biology for global health.

    PubMed

    Betten, A Wieke; Roelofsen, Anneloes; Broerse, Jacqueline E W

    2013-09-01

    The emerging field of synthetic biology has the potential to improve global health. For example, synthetic biology could contribute to efforts at vaccine development in a context in which vaccines and immunization have been identified by the international community as being crucial to international development efforts and, in particular, the millennium development goals. However, past experience with innovations shows that realizing a technology's potential can be difficult and complex. To achieve better societal embedding of synthetic biology and to make sure it reaches its potential, science and technology development should be made more inclusive and interactive. Responsible research and innovation is based on the premise that a broad range of stakeholders with different views, needs and ideas should have a voice in the technological development and deployment process. The interactive learning and action (ILA) approach has been developed as a methodology to bring societal stakeholders into a science and technology development process. This paper proposes an ILA in five phases for an international effort, with national case studies, to develop socially robust applications of synthetic biology for global health, based on the example of vaccine development. The design is based on results of a recently initiated ILA project on synthetic biology; results from other interactive initiatives described in the literature; and examples of possible applications of synthetic biology for global health that are currently being developed.

  12. Coupling Protein Side-Chain and Backbone Flexibility Improves the Re-design of Protein-Ligand Specificity.

    PubMed

    Ollikainen, Noah; de Jong, René M; Kortemme, Tanja

    2015-01-01

    Interactions between small molecules and proteins play critical roles in regulating and facilitating diverse biological functions, yet our ability to accurately re-engineer the specificity of these interactions using computational approaches has been limited. One main difficulty, in addition to inaccuracies in energy functions, is the exquisite sensitivity of protein-ligand interactions to subtle conformational changes, coupled with the computational problem of sampling the large conformational search space of degrees of freedom of ligands, amino acid side chains, and the protein backbone. Here, we describe two benchmarks for evaluating the accuracy of computational approaches for re-engineering protein-ligand interactions: (i) prediction of enzyme specificity altering mutations and (ii) prediction of sequence tolerance in ligand binding sites. After finding that current state-of-the-art "fixed backbone" design methods perform poorly on these tests, we develop a new "coupled moves" design method in the program Rosetta that couples changes to protein sequence with alterations in both protein side-chain and protein backbone conformations, and allows for changes in ligand rigid-body and torsion degrees of freedom. We show significantly increased accuracy in both predicting ligand specificity altering mutations and binding site sequences. These methodological improvements should be useful for many applications of protein-ligand design. The approach also provides insights into the role of subtle conformational adjustments that enable functional changes not only in engineering applications but also in natural protein evolution.

  13. Spectrophores as one-dimensional descriptors calculated from three-dimensional atomic properties: applications ranging from scaffold hopping to multi-target virtual screening.

    PubMed

    Gladysz, Rafaela; Dos Santos, Fabio Mendes; Langenaeker, Wilfried; Thijs, Gert; Augustyns, Koen; De Winter, Hans

    2018-03-07

Spectrophores are novel descriptors that are calculated from the three-dimensional atomic properties of molecules. In our current implementation, the atomic properties used to calculate spectrophores include atomic partial charges, atomic lipophilicity indices, atomic shape deviations and atomic softness properties. This approach can easily be widened to include additional atomic properties. Our novel methodology finds its roots in the experimental affinity fingerprinting technology developed in the 1990s by Terrapin Technologies. Here we have translated it into a purely virtual approach using artificial affinity cages and a simplified metric to calculate the interaction between these cages and the atomic properties. A typical spectrophore consists of a vector of 48 real numbers. This makes it highly suitable for the calculation of a wide range of similarity measures for use in virtual screening and for the investigation of quantitative structure-activity relationships in combination with advanced statistical approaches such as self-organizing maps, support vector machines and neural networks. In our present report we demonstrate the applicability of our novel methodology for scaffold hopping as well as virtual screening.
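Because a spectrophore is just a vector of 48 real numbers, similarity-based virtual screening reduces to ranking a library by a vector distance to a query descriptor. A minimal sketch of that use; the descriptor values below are synthetic placeholders, not real spectrophores:

```python
import math

def euclidean(a, b):
    # Euclidean distance between two equal-length descriptor vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rank_library(query, library):
    # Sort (name, vector) entries by ascending distance to the query
    return sorted(library, key=lambda item: euclidean(query, item[1]))

# Synthetic 48-dimensional descriptors (illustrative values only)
query = [0.1 * i for i in range(48)]
library = [
    ("mol_A", [0.1 * i + 0.01 for i in range(48)]),  # near-identical to query
    ("mol_B", [0.2 * i for i in range(48)]),         # clearly different
]
ranked = rank_library(query, library)
print([name for name, _ in ranked])  # nearest neighbour mol_A ranks first
```

In practice one would normalize the descriptors and could substitute other similarity measures (cosine, Pearson correlation) without changing the ranking machinery.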

  14. Measuring child personality when child personality was not measured: Application of a thin-slice approach.

    PubMed

    Tackett, Jennifer L; Smack, Avanté J; Herzhoff, Kathrin; Reardon, Kathleen W; Daoud, Stephanie; Granic, Isabela

    2017-02-01

Recent efforts have demonstrated that thin-slice (TS) assessment - that is, assessment of individual characteristics after only brief exposure to that individual's behaviour - can produce reliable and valid measurements of child personality traits. The extent to which this approach can be generalized to archival data not designed to measure personality, and whether it can be used to measure personality pathology traits in youth, is not yet known. Archival video data of a parent-child interaction task were collected as part of a clinical intervention trial for aggressive children (N = 177). Unacquainted observers independently watched the clips and rated children on normal-range (neuroticism, extraversion, agreeableness, conscientiousness and openness to experience) and pathological (callous-unemotional) personality traits. TS ratings of child personality showed strong internal consistency, valid associations with measures of externalizing problems and temperament, and revealed differentiated subgroups of children based on severity. As such, these findings demonstrate an ecologically valid application of TS methodology and illustrate how researchers and clinicians can extend their existing data by measuring child personality using TS methodology, even in cases where child personality was not originally measured. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Quantum-Mechanics Methodologies in Drug Discovery: Applications of Docking and Scoring in Lead Optimization.

    PubMed

    Crespo, Alejandro; Rodriguez-Granillo, Agustina; Lim, Victoria T

    2017-01-01

The development and application of quantum mechanics (QM) methodologies in computer-aided drug design have flourished in the last 10 years. Despite the natural advantage of QM methods to predict binding affinities with a higher level of theory than methods based on molecular mechanics (MM), there are only a few examples where diverse sets of protein-ligand targets have been evaluated simultaneously. In this work, we review recent advances in QM docking and scoring for those cases in which a systematic analysis has been performed. In addition, we introduce and validate a simplified QM/MM expression to compute protein-ligand binding energies. Overall, QM-based scoring functions are generally better at predicting ligand affinities than those based on classical mechanics. However, the agreement between experimental activities and calculated binding energies is highly dependent on the specific chemical series considered. The advantage of more accurate QM methods is evident in cases where charge transfer and polarization effects are important, for example when metals are involved in the binding process or when dispersion forces play a significant role as in the case of hydrophobic or stacking interactions. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  16. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    PubMed

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by laser irradiation. The reference systems are nonideal crystals containing point defect excesses, an order of magnitude larger than the equilibrium density, due to a preirradiation ion implantation process. The thermal problem, including eventual melting, is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field are then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example, we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage, which can be well controlled by the laser fluence.
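The coupling described above - a thermal-field solution feeding time-dependent rates into a KMC loop - can be sketched generically with an Arrhenius rate evaluated under a prescribed temperature transient. All parameters here are illustrative, and re-evaluating the rate only at event times is itself an approximation (rigorous treatments of time-dependent rates use rejection-based KMC schemes):

```python
import math
import random

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius(prefactor_hz, barrier_ev, temp_k):
    # Thermally activated event rate
    return prefactor_hz * math.exp(-barrier_ev / (K_B * temp_k))

def pulse_temperature(t_s, t0=300.0, peak=1500.0, tau=50e-9):
    # Illustrative stand-in for the phase-field thermal solution:
    # an exponentially decaying heating pulse above ambient t0
    return t0 + peak * math.exp(-t_s / tau)

def kmc_hops(t_end, prefactor=1e13, barrier=1.0, seed=1):
    """Count defect hops up to t_end, re-evaluating the rate each step."""
    rng = random.Random(seed)
    t, hops = 0.0, 0
    while t < t_end:
        rate = arrhenius(prefactor, barrier, pulse_temperature(t))
        t += -math.log(1.0 - rng.random()) / rate  # exponential waiting time
        hops += 1
    return hops
```

As the pulse decays, the rate drops by many orders of magnitude, so almost all events cluster in the hot early transient, which is the qualitative behaviour the abstract's defect-annihilation result relies on.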

  17. Freedom, Flow and Fairness: Exploring How Children Develop Socially at School through Outdoor Play

    ERIC Educational Resources Information Center

    Waite, Sue; Rogers, Sue; Evans, Julie

    2013-01-01

    In this article, we report on a study that sought to discover micro-level social interactions in fluid outdoor learning spaces. Our methodology was centred around the children; our methods moved with them and captured their social interactions through mobile audio-recording. We argue that our methodological approach supported access to…

  18. Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yau, M.; Motamed, M.; Guarro, S.

    2006-07-01

Current Probabilistic Risk Assessment (PRA) methodology is well established for analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and for accounting for the digital system contribution to overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies with potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)

  19. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
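The triggering relationships and cascades described above can be represented as a directed graph of hazards, with cascades as paths through it. A minimal sketch, using hypothetical triggering pairs that are not taken from the authors' case studies:

```python
def cascades_from(source, triggers, path=None):
    """Enumerate triggering chains (cascades) reachable from one hazard."""
    path = (path or []) + [source]
    chains = [path]
    for nxt in triggers.get(source, []):
        if nxt not in path:  # guard against cyclic triggering
            chains.extend(cascades_from(nxt, triggers, path))
    return chains

# Hypothetical directed triggering relationships (illustration only)
triggers = {
    "earthquake": ["landslide", "tsunami"],
    "landslide": ["river damming"],
    "river damming": ["flood"],
}
for chain in cascades_from("earthquake", triggers):
    print(" -> ".join(chain))
```

A fuller model in the spirit of the paper would label each edge with an interaction type (triggering, increased probability, catalysis/impedance) and a likelihood, but the graph traversal that exposes cascades is the same.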

  20. Receptor-based 3D-QSAR in Drug Design: Methods and Applications in Kinase Studies.

    PubMed

    Fang, Cheng; Xiao, Zhiyan

    2016-01-01

Receptor-based 3D-QSAR strategy represents a superior integration of structure-based drug design (SBDD) and three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis. It combines the accurate prediction of ligand poses by the SBDD approach with the good predictability and interpretability of statistical models derived from the 3D-QSAR approach. Extensive efforts have been devoted to the development of receptor-based 3D-QSAR methods, and two alternative approaches have been exploited. One involves computing the binding interactions between a receptor and a ligand to generate structure-based descriptors for QSAR analyses. The other concerns the application of various docking protocols to generate optimal ligand poses so as to provide reliable molecular alignments for the conventional 3D-QSAR operations. This review highlights new concepts and methodologies recently developed in the field of receptor-based 3D-QSAR and, in particular, covers its application in kinase studies.

  1. A Three-Hybrid System to Probe In Vivo Protein-Protein Interactions: Application to the Essential Proteins of the RD1 Complex of M. tuberculosis

    PubMed Central

    Bhalla, Kuhulika; Ghosh, Anamika; Kumar, Krishan; Kumar, Sushil; Ranganathan, Anand

    2011-01-01

    Background Protein-protein interactions play a crucial role in enabling a pathogen to survive within a host. In many cases the interactions involve a complex of proteins rather than just two given proteins. This is especially true for pathogens like M. tuberculosis that are able to successfully survive the inhospitable environment of the macrophage. Studying such interactions in detail may help in developing small molecules that either disrupt or augment the interactions. Here, we describe the development of an E. coli based bacterial three-hybrid system that can be used effectively to study ternary protein complexes. Methodology/Principal Findings The protein-protein interactions involved in M. tuberculosis pathogenesis have been used as a model for the validation of the three-hybrid system. Using the M. tuberculosis RD1 encoded proteins CFP10, ESAT6 and Rv3871 for our proof-of-concept studies, we show that the interaction between the proteins CFP10 and Rv3871 is strengthened and stabilized in the presence of ESAT6, the known heterodimeric partner of CFP10. Isolating peptide candidates that can disrupt crucial protein-protein interactions is another application that the system offers. We demonstrate this by using CFP10 protein as a disruptor of a previously established interaction between ESAT6 and a small peptide HCL1; at the same time we also show that CFP10 is not able to disrupt the strong interaction between ESAT6 and another peptide SL3. Conclusions/Significance The validation of the three-hybrid system paves the way for finding new peptides that are stronger binders of ESAT6 compared even to its natural partner CFP10. Additionally, we believe that the system offers an opportunity to study tri-protein complexes and also perform a screening of protein/peptide binders to known interacting proteins so as to elucidate novel tri-protein complexes. PMID:22087330

  2. 78 FR 77399 - Basic Health Program: Proposed Federal Funding Methodology for Program Year 2015

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-23

    ... American Indians and Alaska Natives F. Example Application of the BHP Funding Methodology III. Collection... effectively 138 percent due to the application of a required 5 percent income disregard in determining the... correct errors in applying the methodology (such as mathematical errors). Under section 1331(d)(3)(ii) of...

  3. Optical trapping and Raman spectroscopy of solid particles.

    PubMed

    Rkiouak, L; Tang, M J; Camp, J C J; McGregor, J; Watson, I M; Cox, R A; Kalberer, M; Ward, A D; Pope, F D

    2014-06-21

The heterogeneous interactions of gas molecules on solid particles are crucial in many areas of science, engineering and technology. Such interactions play a critical role in atmospheric chemistry and in heterogeneous catalysis, a key technology in the energy and chemical industries. Investigating heterogeneous interactions upon single levitated particles can provide significant insight into these important processes. Various methodologies exist for levitating micron-sized particles, including optical, electrical and acoustic techniques. Prior to this study, the optical levitation of solid micron-scale particles had proved difficult to achieve over timescales relevant to the above applications. In this work, a new vertically configured, counter-propagating dual-beam optical trap was optimized to levitate a range of solid particles in air. Silica (SiO2), α-alumina (Al2O3), titania (TiO2) and polystyrene were stably trapped with a high trapping efficiency (Q = 0.42). The longest stable trapping experiment was conducted continuously for 24 hours, and there are no obvious constraints on trapping time beyond this period. Therefore, the methodology described in this paper should be of major benefit to various research communities. The strength of the new technique is demonstrated by the simultaneous levitation and spectroscopic interrogation of silica particles by Raman spectroscopy. In particular, the adsorption of water upon silica was investigated under controlled relative humidity environments. Furthermore, the collision and coagulation behaviour of silica particles with microdroplets of sulphuric acid was followed using both optical imaging and Raman spectroscopy.
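For context, the trapping efficiency Q quoted above is the standard dimensionless figure of merit for optical traps, relating the trapping force F to the incident laser power P, with n the refractive index of the surrounding medium and c the speed of light:

```latex
F = \frac{Q\, n\, P}{c}
```

So Q = 0.42 means that roughly 42% of the momentum flux of the incident beam is converted into trapping force. This relation comes from the general optical-trapping literature rather than from the abstract itself.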

  4. Computational and Biochemical Docking of the Irreversible Cocaine Analog RTI 82 Directly Demonstrates Ligand Positioning in the Dopamine Transporter Central Substrate-binding Site*

    PubMed Central

    Dahal, Rejwi Acharya; Pramod, Akula Bala; Sharma, Babita; Krout, Danielle; Foster, James D.; Cha, Joo Hwan; Cao, Jianjing; Newman, Amy Hauck; Lever, John R.; Vaughan, Roxanne A.; Henry, L. Keith

    2014-01-01

    The dopamine transporter (DAT) functions as a key regulator of dopaminergic neurotransmission via re-uptake of synaptic dopamine (DA). Cocaine binding to DAT blocks this activity and elevates extracellular DA, leading to psychomotor stimulation and addiction, but the mechanisms by which cocaine interacts with DAT and inhibits transport remain incompletely understood. Here, we addressed these questions using computational and biochemical methodologies to localize the binding and adduction sites of the photoactivatable irreversible cocaine analog 3β-(p-chlorophenyl)tropane-2β-carboxylic acid, 4′-azido-3′-iodophenylethyl ester ([125I]RTI 82). Comparative modeling and small molecule docking indicated that the tropane pharmacophore of RTI 82 was positioned in the central DA active site with an orientation that juxtaposed the aryliodoazide group for cross-linking to rat DAT Phe-319. This prediction was verified by focused methionine substitution of residues flanking this site followed by cyanogen bromide mapping of the [125I]RTI 82-labeled mutants and by the substituted cysteine accessibility method protection analyses. These findings provide positive functional evidence linking tropane pharmacophore interaction with the core substrate-binding site and support a competitive mechanism for transport inhibition. This synergistic application of computational and biochemical methodologies overcomes many uncertainties inherent in other approaches and furnishes a schematic framework for elucidating the ligand-protein interactions of other classes of DA transport inhibitors. PMID:25179220

  5. Optimization of pyDock for the new CAPRI challenges: Docking of homology-based models, domain-domain assembly and protein-RNA binding.

    PubMed

    Pons, Carles; Solernou, Albert; Perez-Cano, Laura; Grosdidier, Solène; Fernandez-Recio, Juan

    2010-11-15

    We describe here our results in the last CAPRI edition. We have participated in all targets, both as predictors and as scorers, using our pyDock docking methodology. The new challenges (homology-based modeling of the interacting subunits, domain-domain assembling, and protein-RNA interactions) have pushed our computer tools to the limits and have encouraged us to devise new docking approaches. Overall, the results have been quite successful, in line with previous editions, especially considering the high difficulty of some of the targets. Our docking approaches succeeded in five targets as predictors or as scorers (T29, T34, T35, T41, and T42). Moreover, with the inclusion of available information on the residues expected to be involved in the interaction, our protocol would have also succeeded in two additional cases (T32 and T40). In the remaining targets (except T37), results were equally poor for most of the groups. We submitted the best model (in ligand RMSD) among scorers for the unbound-bound target T29, the second best model among scorers for the protein-RNA target T34, and the only correct model among predictors for the domain assembly target T35. In summary, our excellent results for the new proposed challenges in this CAPRI edition showed the limitations and applicability of our approaches and encouraged us to continue developing methodologies for automated biomolecular docking. © 2010 Wiley-Liss, Inc.

  6. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  7. Fragment screening by SPR and advanced application to GPCRs.

    PubMed

    Shepherd, Claire A; Hopkins, Andrew L; Navratilova, Iva

    2014-01-01

Surface plasmon resonance (SPR) is one of the primary biophysical methods for the screening of low molecular weight 'fragment' libraries, due to its low protein consumption and 'label-free' methodology. SPR biosensor interaction analysis is employed to both screen and confirm the binding of compounds in fragment screening experiments, as it provides accurate information on the affinity and kinetics of molecular interactions. The most advanced application of SPR for fragment screening is against membrane protein drug targets, such as G-protein coupled receptors (GPCRs). Biophysical GPCR assays using SPR have been validated with pharmacological measurements approximating cell-based methods, yet provide the advantage of biophysical methods in their ability to measure the weak affinities of low molecular weight fragments. A number of SPR fragment screens against GPCRs have now been disclosed in the literature. SPR fragment screening is proving versatile enough to screen both thermostabilised GPCRs and solubilised wild-type receptors. In this chapter, we discuss the state-of-the-art in GPCR fragment screening by SPR and the technical considerations in performing such experiments. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. On the Comparison of the Long Penetration Mode (LPM) Supersonic Counterflowing Jet to the Supersonic Screech Jet

    NASA Technical Reports Server (NTRS)

    Farr, Rebecca A.; Chang, Chau-Lyan; Jones, Jess H.; Dougherty, N. Sam

    2015-01-01

    The authors provide a brief overview of the classic tonal screech noise problem created by underexpanded supersonic jets, briefly describing the fluid dynamic-acoustics feedback mechanism that has been long established as the basis for this well-known aeroacoustics problem. This is followed by a description of the Long Penetration Mode (LPM) supersonic underexpanded counterflowing jet phenomenon which has been demonstrated in several wind tunnel tests and modeled in several computational fluid dynamics (CFD) simulations. The authors provide evidence from test and CFD analysis of LPM that indicates that acoustics feedback and fluid interaction seen in LPM are analogous to the aeroacoustics interactions seen in screech jets. Finally, the authors propose applying certain methodologies to LPM which have been developed and successfully demonstrated in the study of screech jets and mechanically induced excitation in fluid oscillators for decades. The authors conclude that the large body of work done on jet screech, other aeroacoustic phenomena, and fluid oscillators can have direct application to the study and applications of LPM counterflowing supersonic cold flow jets.

  9. Multiple component end-member mixing model of dilution: hydrochemical effects of construction water at Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    Lu, Guoping; Sonnenthal, Eric L.; Bodvarsson, Gudmundur S.

    2008-12-01

    The standard dual-component and two-member linear mixing model is often used to quantify water mixing of different sources. However, it is no longer applicable when actual mixture concentrations are not exactly known because of dilution. For example, low-water-content (low-porosity) rock samples are leached for pore-water chemical compositions, which therefore are diluted in the leachates. A multicomponent, two-member mixing model of dilution has been developed to quantify mixing of water sources and multiple chemical components experiencing dilution in leaching. This extended mixing model was used to quantify fracture-matrix interaction in construction-water migration tests along the Exploratory Studies Facility (ESF) tunnel at Yucca Mountain, Nevada, USA. The model effectively recovers the spatial distribution of water and chemical compositions released from the construction water, and provides invaluable data on the fracture-matrix interaction. The methodology and formulations described here are applicable to many sorts of mixing-dilution problems, including dilution in petroleum reservoirs, hydrospheres, chemical constituents in rocks and minerals, monitoring of drilling fluids, and leaching, as well as to environmental science studies.
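    The structure of such a two-end-member mixing model with dilution can be sketched in a few lines (an illustrative formulation of ours with hypothetical tracer concentrations, not the paper's equations): modelling each component i as c_obs[i] = d * (f*c1[i] + (1-f)*c2[i]), where f is the mixing fraction of end-member 1 and d the dilution factor, becomes linear after substituting a = d*f and b = d*(1-f), so both unknowns follow from an ordinary least-squares fit over the components:

    ```python
    def fit_mixing_dilution(c1, c2, c_obs):
        """Least-squares fit of a = d*f and b = d*(1-f) via 2x2 normal equations."""
        s11 = sum(x * x for x in c1)
        s22 = sum(y * y for y in c2)
        s12 = sum(x * y for x, y in zip(c1, c2))
        r1 = sum(x * z for x, z in zip(c1, c_obs))
        r2 = sum(y * z for y, z in zip(c2, c_obs))
        det = s11 * s22 - s12 * s12
        a = (r1 * s22 - r2 * s12) / det
        b = (r2 * s11 - r1 * s12) / det
        d = a + b          # recovered dilution factor
        f = a / d          # recovered mixing fraction of end-member 1
        return f, d

    # Synthetic check: construction water vs. pore water, diluted 10x in leaching.
    # Tracer concentrations below are hypothetical, purely for illustration.
    construction = [120.0, 5.0, 30.0]
    pore_water = [10.0, 0.5, 300.0]
    true_f, true_d = 0.7, 0.1
    observed = [true_d * (true_f * x + (1 - true_f) * y)
                for x, y in zip(construction, pore_water)]
    f, d = fit_mixing_dilution(construction, pore_water, observed)
    ```

    With more than two chemical components the system is overdetermined, which is what lets the method recover both the mixing fraction and the unknown dilution at once.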

  10. In-plane optical anisotropy of layered gallium telluride

    DOE PAGES

    Huang, Shengxi; Tatsumi, Yuki; Ling, Xi; ...

    2016-08-16

    Layered gallium telluride (GaTe) has attracted much attention recently, due to its extremely high photoresponsivity, short response time, and promising thermoelectric performance. Different from most commonly studied two-dimensional (2D) materials, GaTe has in-plane anisotropy and a low symmetry with the C 2h 3 space group. Investigating the in-plane optical anisotropy, including the electron–photon and electron–phonon interactions of GaTe is essential in realizing its applications in optoelectronics and thermoelectrics. In this work, the anisotropic light-matter interactions in the low-symmetry material GaTe are studied using anisotropic optical extinction and Raman spectroscopies as probes. Our polarized optical extinction spectroscopy reveals the weak anisotropy in optical extinction spectra for visible light of multilayer GaTe. Polarized Raman spectroscopy proves to be sensitive to the crystalline orientation of GaTe, and shows the intricate dependences of Raman anisotropy on flake thickness, photon and phonon energies. Such intricate dependences can be explained by theoretical analyses employing first-principles calculations and group theory. Furthermore, these studies are a crucial step toward the applications of GaTe especially in optoelectronics and thermoelectrics, and provide a general methodology for the study of the anisotropy of light-matter interactions in 2D layered materials with in-plane anisotropy.

  11. Fuzzy Logic-Based Audio Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Malcangi, M.

    2008-11-01

    Audio and audio-pattern recognition is becoming one of the most important technologies for automatically controlling embedded systems. Fuzzy logic may be the most important enabling methodology due to its ability to model such applications rapidly and economically. An audio and audio-pattern recognition engine based on fuzzy logic has been developed for use in very low-cost, deeply embedded systems to automate human-to-machine and machine-to-machine interaction. This engine consists of simple digital signal-processing algorithms for feature extraction and normalization, and a set of pattern-recognition rules tuned manually or automatically by a self-learning process.

  12. Gesture Recognition for Educational Games: Magic Touch Math

    NASA Astrophysics Data System (ADS)

    Kye, Neo Wen; Mustapha, Aida; Azah Samsudin, Noor

    2017-08-01

    Children nowadays have problems learning and understanding basic mathematical operations because they are not interested in studying or learning mathematics. This project proposes an educational game called Magic Touch Math that focuses on basic mathematical operations, targeted at children between the ages of three and five, using gesture recognition to interact with the game. Magic Touch Math was developed in accordance with the Game Development Life Cycle (GDLC) methodology. The prototype has helped children learn basic mathematical operations via intuitive gestures. It is hoped that the application can get children motivated and interested in mathematics.

  13. Reinforcement learning improves behaviour from evaluative feedback

    NASA Astrophysics Data System (ADS)

    Littman, Michael L.

    2015-05-01

    Reinforcement learning is a branch of machine learning concerned with using experience gained through interacting with the world and evaluative feedback to improve a system's ability to make behavioural decisions. It has been called the artificial intelligence problem in a microcosm because learning algorithms must act autonomously to perform well and achieve their goals. Partly driven by the increasing availability of rich data, recent years have seen exciting advances in the theory and practice of reinforcement learning, including developments in fundamental technical areas such as generalization, planning, exploration and empirical methodology, leading to increasing applicability to real-life problems.

  14. Reinforcement learning improves behaviour from evaluative feedback.

    PubMed

    Littman, Michael L

    2015-05-28

    Reinforcement learning is a branch of machine learning concerned with using experience gained through interacting with the world and evaluative feedback to improve a system's ability to make behavioural decisions. It has been called the artificial intelligence problem in a microcosm because learning algorithms must act autonomously to perform well and achieve their goals. Partly driven by the increasing availability of rich data, recent years have seen exciting advances in the theory and practice of reinforcement learning, including developments in fundamental technical areas such as generalization, planning, exploration and empirical methodology, leading to increasing applicability to real-life problems.
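    The learn-from-evaluative-feedback loop described in this abstract can be sketched with tabular Q-learning on a toy chain task (an illustrative example of ours, not an experiment from the review): the agent receives only a scalar reward for reaching the goal, and improves its behaviour from that feedback alone.

    ```python
    import random

    # Toy chain environment: states 0..4, actions 0=left / 1=right.
    # Reaching state 4 yields reward 1 and ends the episode.
    random.seed(0)
    N = 5
    ALPHA, GAMMA, EPS = 0.5, 0.9, 0.2
    Q = [[0.0, 0.0] for _ in range(N)]   # Q[state][action]

    def step(s, a):
        s2 = max(s - 1, 0) if a == 0 else s + 1
        return s2, (1.0 if s2 == N - 1 else 0.0), s2 == N - 1

    for _ in range(2000):                # episodes of interaction
        s, done = 0, False
        while not done:
            # epsilon-greedy action selection: mostly exploit, sometimes explore
            a = random.randrange(2) if random.random() < EPS else Q[s].index(max(Q[s]))
            s2, r, done = step(s, a)
            target = r if done else r + GAMMA * max(Q[s2])
            Q[s][a] += ALPHA * (target - Q[s][a])   # evaluative-feedback update
            s = s2

    # Greedy policy after learning: the agent should head right in every state.
    policy = [Q[s].index(max(Q[s])) for s in range(N - 1)]
    ```

    The update rule uses only the observed reward and the bootstrapped value of the next state, which is the essence of learning behavioural decisions from evaluative rather than instructive feedback.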

  15. Incorporating information on predicted solvent accessibility to the co-evolution-based study of protein interactions.

    PubMed

    Ochoa, David; García-Gutiérrez, Ponciano; Juan, David; Valencia, Alfonso; Pazos, Florencio

    2013-01-27

    A widespread family of methods for studying and predicting protein interactions using sequence information is based on co-evolution, quantified as similarity of phylogenetic trees. Part of the co-evolution observed between interacting proteins could be due to co-adaptation caused by inter-protein contacts. In this case, the co-evolution is expected to be more evident when evaluated on the surface of the proteins or the internal layers close to it. In this work we study the effect of incorporating information on predicted solvent accessibility to three methods for predicting protein interactions based on similarity of phylogenetic trees. We evaluate the performance of these methods in predicting different types of protein associations when trees based on positions with different characteristics of predicted accessibility are used as input. We found that predicted accessibility improves the results of two recent versions of the mirrortree methodology in predicting direct binary physical interactions, while it neither improves these methods, nor the original mirrortree method, in predicting other types of interactions. That improvement comes at no cost in terms of applicability since accessibility can be predicted for any sequence. We also found that predictions of protein-protein interactions are improved when multiple sequence alignments with a richer representation of sequences (including paralogs) are incorporated in the accessibility prediction.
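    Mirrortree-style methods typically quantify the "similarity of phylogenetic trees" as the linear correlation between the inter-sequence evolutionary distance matrices of the two families. A minimal sketch with hypothetical distance matrices follows (not the authors' pipeline, which additionally restricts the alignment positions by predicted accessibility):

    ```python
    import math

    def upper_triangle(m):
        """Flatten the strictly upper triangle of a symmetric distance matrix."""
        n = len(m)
        return [m[i][j] for i in range(n) for j in range(i + 1, n)]

    def pearson(xs, ys):
        """Pearson linear correlation coefficient of two equal-length lists."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical evolutionary distance matrices for two protein families
    # over the same four species; a high correlation is taken as evidence
    # of co-evolution, and hence of a possible interaction.
    fam_a = [[0, 2.0, 5.0, 6.0], [2.0, 0, 4.0, 5.0],
             [5.0, 4.0, 0, 3.0], [6.0, 5.0, 3.0, 0]]
    fam_b = [[0, 1.9, 5.2, 6.1], [1.9, 0, 4.1, 4.8],
             [5.2, 4.1, 0, 2.9], [6.1, 4.8, 2.9, 0]]
    score = pearson(upper_triangle(fam_a), upper_triangle(fam_b))
    ```

    Restricting the distance matrices to surface or near-surface positions, as the paper proposes, changes only the alignments that feed the matrices, not this scoring step.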

  16. TSEMA: interactive prediction of protein pairings between interacting families

    PubMed Central

    Izarzugaza, José M. G.; Juan, David; Pons, Carles; Ranea, Juan A. G.; Valencia, Alfonso; Pazos, Florencio

    2006-01-01

    An entire family of methodologies for predicting protein interactions is based on the observed fact that families of interacting proteins tend to have similar phylogenetic trees due to co-evolution. One application of this concept is the prediction of the mapping between the members of two interacting protein families (which protein within one family interacts with which protein within the other). The idea is that the real mapping would be the one maximizing the similarity between the trees. Since the exhaustive exploration of all possible mappings is not feasible for large families, current approaches use heuristic techniques which do not ensure the best solution to be found. This is why it is important to check the results proposed by heuristic techniques and to manually explore other solutions. Here we present TSEMA, the server for efficient mapping assessment. This system calculates an initial mapping between two families of proteins based on a Monte Carlo approach and allows the user to interactively modify it based on performance figures and/or specific biological knowledge. All the explored mappings are graphically shown over a representation of the phylogenetic trees. The system is freely available at . Standalone versions of the software behind the interface are available upon request from the authors. PMID:16845017

  17. Examining emotional expressions in discourse: methodological considerations

    NASA Astrophysics Data System (ADS)

    Hufnagel, Elizabeth; Kelly, Gregory J.

    2017-10-01

    This methodological paper presents an approach for examining emotional expressions through discourse analysis and ethnographic methods. Drawing on trends in the current literature in science education, we briefly explain the importance of emotions in science education and examine the current research methodologies used in interactional emotion studies. We put forth and substantiate a methodological approach that attends to the interactional, contextual, intertextual, and consequential aspects of emotional expressions. Emotional expressions are identified within the discourse in which they are constructed, through semantics, contextualization, and linguistic features. These features make salient four dimensions of emotional expressions: aboutness, frequency, type, and ownership. Drawing on data from a large empirical study of pre-service elementary teachers' emotional expressions about climate change in a science course, we provide illustrative examples to describe what counts as emotional expressions in situ. In doing so we explain how our approach makes salient the nuanced nature of such expressions as well as the broader discourse in which they are constructed and the implications for researching emotional expressions in science education discourse. We suggest reasons why this discourse-orientated research methodology can contribute to the interactional study of emotions in science education contexts.

  18. Protein-Carbohydrate Interactions Studied by NMR: From Molecular Recognition to Drug Design

    PubMed Central

    Fernández-Alonso, María del Carmen; Díaz, Dolores; Berbis, Manuel Álvaro; Marcelo, Filipa; Cañada, Javier; Jiménez-Barbero, Jesús

    2012-01-01

    Diseases that result from infection are, in general, a consequence of specific interactions between a pathogenic organism and the cells. The study of host-pathogen interactions has provided insights for the design of drugs with therapeutic properties. One area that has proved promising for such studies is that of carbohydrates, which participate in biological processes of paramount importance. On the one hand, carbohydrates have been shown to be information carriers of similar, if not higher, importance than such traditionally considered carriers as amino acids and nucleic acids. On the other hand, knowledge of the molecular recognition of sugars by lectins and other carbohydrate-binding proteins has been employed in the development of new biomedical strategies. Biophysical techniques such as X-ray crystallography and NMR spectroscopy currently lead research in this field. In this review, a description of traditional and novel NMR methodologies employed in the study of sugar-protein interactions is briefly presented, in combination with a palette of NMR-based studies related to biologically and/or pharmaceutically relevant applications. PMID:23305367

  19. Peptide Array X-Linking (PAX): A New Peptide-Protein Identification Approach

    PubMed Central

    Okada, Hirokazu; Uezu, Akiyoshi; Soderblom, Erik J.; Moseley, M. Arthur; Gertler, Frank B.; Soderling, Scott H.

    2012-01-01

    Many protein interaction domains bind short peptides based on canonical sequence consensus motifs. Here we report the development of a peptide array-based proteomics tool to identify proteins directly interacting with ligand peptides from cell lysates. Array-formatted bait peptides containing an amino acid-derived cross-linker are photo-induced to crosslink with interacting proteins from lysates of interest. Indirect associations are removed by high-stringency washes under denaturing conditions. Covalently trapped proteins are subsequently identified by LC-MS/MS and screened by cluster analysis and domain scanning. We apply this methodology to peptides with different proline-containing consensus sequences and show successful identifications from brain lysates of known and novel proteins containing polyproline motif-binding domains such as EH, EVH1, SH3 and WW domains. These results suggest that the capacity of arrayed peptide ligands to capture and subsequently identify proteins by mass spectrometry is relatively broad and robust. Additionally, the approach is rapid and applicable to cell or tissue fractions from any source, making it a flexible tool for initial protein-protein interaction discovery. PMID:22606326

  20. Scaling of elongation transition thickness during thin-film growth on weakly interacting substrates

    NASA Astrophysics Data System (ADS)

    Lü, B.; Souqui, L.; Elofsson, V.; Sarakinos, K.

    2017-08-01

    The elongation transition thickness (θElong) is a central concept in the theoretical description of thin-film growth dynamics on weakly interacting substrates, via scaling relations of θElong with respect to the rates of key atomistic film-forming processes. To date, these scaling laws have only been confirmed quantitatively by simulations, while experimental proof has remained ambiguous because it has not been possible to measure θElong. Here, we present a method for experimentally determining θElong for Ag films growing on amorphous SiO2: an archetypical weakly interacting film/substrate system. Our results confirm the theoretically predicted θElong scaling behavior, allowing us to calculate the rates of adatom diffusion and island coalescence completion, in good agreement with the literature. The methodology presented herein lays the foundation for studying growth dynamics and cataloging atomistic-process rates for a wide range of weakly interacting film/substrate systems. This may provide insights into the directed growth of metal films with well-controlled morphology and interfacial structure on 2D crystals, including graphene and MoS2, for catalytic and nanoelectronic applications.

  1. Ensemble docking to difficult targets in early-stage drug discovery: Methodology and application to fibroblast growth factor 23.

    PubMed

    Velazquez, Hector A; Riccardi, Demian; Xiao, Zhousheng; Quarles, Leigh Darryl; Yates, Charles Ryan; Baudry, Jerome; Smith, Jeremy C

    2018-02-01

    Ensemble docking is now commonly used in early-stage in silico drug discovery and can be used to attack difficult problems such as finding lead compounds which can disrupt protein-protein interactions. We give an example of this methodology here, as applied to fibroblast growth factor 23 (FGF23), a protein hormone that is responsible for regulating phosphate homeostasis. The first small-molecule antagonists of FGF23 were recently discovered by combining ensemble docking with extensive experimental target validation data (Science Signaling, 9, 2016, ra113). Here, we provide a detailed account of how ensemble-based high-throughput virtual screening was used to identify the antagonist compounds discovered in reference (Science Signaling, 9, 2016, ra113). Moreover, we perform further calculations, redocking those antagonist compounds identified in reference (Science Signaling, 9, 2016, ra113) that performed well on drug-likeness filters, to predict possible binding regions. These predicted binding modes are rescored with the molecular mechanics Poisson-Boltzmann surface area (MM/PBSA) approach to calculate the most likely binding site. Our findings suggest that the antagonist compounds antagonize FGF23 through the disruption of protein-protein interactions between FGF23 and fibroblast growth factor receptor (FGFR). © 2017 John Wiley & Sons A/S.

  2. Proof of concept of a workflow methodology for the creation of basic canine head anatomy veterinary education tool using augmented reality.

    PubMed

    Christ, Roxie; Guevar, Julien; Poyade, Matthieu; Rea, Paul M

    2018-01-01

    Neuroanatomy can be challenging to both teach and learn within the undergraduate veterinary medicine and surgery curriculum. Traditional techniques have been used for many years, but there has now been a progression towards alternative digital models and interactive 3D models to engage the learner. However, digital innovations in the curriculum have typically involved the medical curriculum rather than the veterinary curriculum. Therefore, we aimed to create a simple workflow methodology demonstrating how simply a mobile augmented reality application of basic canine head anatomy can be created. Using canine CT and MRI scans and widely available software programs, we demonstrate how to create an interactive model of head anatomy. This was applied to augmented reality for a popular Android mobile device to demonstrate the user-friendly interface. Here we present the processes, challenges and resolutions for the creation of a highly accurate, data-based anatomical model that could potentially be used in the veterinary curriculum. This proof of concept study provides an excellent framework for the creation of augmented reality training products for veterinary education. The lack of similar resources within this field provides the ideal platform to extend this into other areas of veterinary education and beyond.

  3. Proof of concept of a workflow methodology for the creation of basic canine head anatomy veterinary education tool using augmented reality

    PubMed Central

    Christ, Roxie; Guevar, Julien; Poyade, Matthieu

    2018-01-01

    Neuroanatomy can be challenging to both teach and learn within the undergraduate veterinary medicine and surgery curriculum. Traditional techniques have been used for many years, but there has now been a progression towards alternative digital models and interactive 3D models to engage the learner. However, digital innovations in the curriculum have typically involved the medical curriculum rather than the veterinary curriculum. Therefore, we aimed to create a simple workflow methodology demonstrating how simply a mobile augmented reality application of basic canine head anatomy can be created. Using canine CT and MRI scans and widely available software programs, we demonstrate how to create an interactive model of head anatomy. This was applied to augmented reality for a popular Android mobile device to demonstrate the user-friendly interface. Here we present the processes, challenges and resolutions for the creation of a highly accurate, data-based anatomical model that could potentially be used in the veterinary curriculum. This proof of concept study provides an excellent framework for the creation of augmented reality training products for veterinary education. The lack of similar resources within this field provides the ideal platform to extend this into other areas of veterinary education and beyond. PMID:29698413

  4. The Infant Intestinal Microbiome: Friend or Foe?

    PubMed Central

    Mshvildadze, Maka; Neu, Josef

    2013-01-01

    During the course of mammalian evolution there has been a close relationship between microbes residing in the gastrointestinal (GI) tract and the mammalian host. Interactions of resident intestinal microbes with the luminal contents and the mucosal surface play important roles in normal intestinal development, nutrition and adaptive innate immunity. The GI tract of the premature infant has a large but fragile surface area covered by a thin monolayer of epithelial cells that overlies a highly immunoreactive submucosa. Interactions in the lumen between microbes, nutrients and the intestinal mucosa can range from a healthy homeostasis to an uncontrolled systemic inflammatory response syndrome (SIRS) that leads to multiple organ system failure and death. Recent advances in molecular microbiota analytic methodology that stem from advances in high-throughput sequencing technology have provided us with the tools to determine the GI microbiota in great depth, including the nearly 80% of microbes in the intestine that are very difficult if not impossible to culture by current methodology. Application of these techniques to derive a better understanding of the developing intestinal ecosystem in the premature neonate may hold the key to understanding and eventually preventing several important diseases, including necrotizing enterocolitis (NEC) and late-onset neonatal sepsis, which may be acquired via translocation through the GI tract. PMID:20116944

  5. CANDO and the infinite drug discovery frontier

    PubMed Central

    Minie, Mark; Chopra, Gaurav; Sethi, Geetika; Horst, Jeremy; White, George; Roy, Ambrish; Hatti, Kaushik; Samudrala, Ram

    2014-01-01

    The Computational Analysis of Novel Drug Opportunities (CANDO) platform (http://protinfo.org/cando) uses similarity of compound–proteome interaction signatures to infer homology of compound/drug behavior. We constructed interaction signatures for 3733 human ingestible compounds covering 48,278 protein structures mapping to 2030 indications based on basic science methodologies to predict and analyze protein structure, function, and interactions developed by us and others. Our signature comparison and ranking approach yielded benchmarking accuracies of 12–25% for 1439 indications with at least two approved compounds. We prospectively validated 49/82 ‘high value’ predictions from nine studies covering seven indications, with comparable or better activity to existing drugs, which serve as novel repurposed therapeutics. Our approach may be generalized to compounds beyond those approved by the FDA, and can also consider mutations in protein structures to enable personalization. Our platform provides a holistic multiscale modeling framework of complex atomic, molecular, and physiological systems with broader applications in medicine and engineering. PMID:24980786
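    The core CANDO idea of inferring drug behavior from "similarity of compound–proteome interaction signatures" can be sketched as ranking compounds by the similarity of their signature vectors. The tiny signatures and compound names below are hypothetical (real CANDO signatures span ~48,000 protein structures), and cosine similarity stands in here for whatever comparison metric the platform actually uses:

    ```python
    import math

    def cosine(u, v):
        """Cosine similarity between two interaction-signature vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    # Hypothetical compound-proteome interaction signatures: one score per
    # protein structure (only five structures here, for illustration).
    signatures = {
        "drug_known": [0.9, 0.1, 0.8, 0.2, 0.7],
        "candidate_1": [0.85, 0.15, 0.75, 0.25, 0.65],
        "candidate_2": [0.1, 0.9, 0.2, 0.8, 0.1],
    }
    # Rank the other compounds by signature similarity to a known drug;
    # a close match suggests homologous behavior, i.e. a repurposing lead.
    query = signatures["drug_known"]
    ranked = sorted((c for c in signatures if c != "drug_known"),
                    key=lambda c: cosine(query, signatures[c]), reverse=True)
    ```

    Compounds whose proteome-wide signatures track a known drug's are predicted to share its indications, which is the signature comparison and ranking step described above.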

  6. Using aggregate data to estimate the standard error of a treatment-covariate interaction in an individual patient data meta-analysis.

    PubMed

    Kovalchik, Stephanie A; Cumberland, William G

    2012-05-01

    Subgroup analyses are important to medical research because they shed light on the heterogeneity of treatment effects. A treatment-covariate interaction in an individual patient data (IPD) meta-analysis is the most reliable means to estimate how a subgroup factor modifies a treatment's effectiveness. However, owing to the challenges in collecting participant data, an approach based on aggregate data might be the only option. In these circumstances, it would be useful to assess the relative efficiency and power loss of a subgroup analysis without patient-level data. We present methods that use aggregate data to estimate the standard error of an IPD meta-analysis' treatment-covariate interaction for regression models of a continuous or dichotomous patient outcome. Numerical studies indicate that the estimators have good accuracy. An application to a previously published meta-regression illustrates the practical utility of the methodology. © 2012 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
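    The flavor of an aggregate-data interaction analysis can be sketched with standard inverse-variance pooling of per-trial subgroup differences (an illustrative stand-in with made-up numbers, not the authors' estimator, which targets the IPD interaction's standard error directly):

    ```python
    import math

    # Aggregate data per trial: treatment-effect estimate and its SE in each
    # of two subgroups (hypothetical values). The within-trial interaction is
    # the subgroup difference; trials are pooled by inverse-variance weighting.
    trials = [
        # (effect_subgroup_A, se_A, effect_subgroup_B, se_B)
        (0.30, 0.10, 0.10, 0.12),
        (0.25, 0.08, 0.05, 0.09),
        (0.40, 0.15, 0.20, 0.14),
    ]

    weights, interactions = [], []
    for est_a, se_a, est_b, se_b in trials:
        var = se_a ** 2 + se_b ** 2          # variance of the within-trial difference
        interactions.append(est_a - est_b)   # within-trial interaction estimate
        weights.append(1.0 / var)

    pooled = sum(w, * d for w, d in zip(weights, interactions)) if False else \
             sum(w * d for w, d in zip(weights, interactions)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))   # SE of the pooled interaction
    ```

    The pooled standard error shrinks below any single trial's, which is what a relative-efficiency comparison against the full IPD analysis would quantify.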

  7. Plasmonic metasurface cavity for simultaneous enhancement of optical electric and magnetic fields in deep subwavelength volume.

    PubMed

    Hong, Jongwoo; Kim, Sun-Je; Kim, Inki; Yun, Hansik; Mun, Sang-Eun; Rho, Junsuk; Lee, Byoungho

    2018-05-14

    Achieving simultaneous plasmonic enhancement of nanoscale light-matter interactions, in both the electric and the magnetic sense, with an easily reproducible fabrication method and a systematic theoretical design rule has been difficult. In this paper, a novel concept of a flat nanofocusing device is proposed for simultaneously squeezing both electric and magnetic fields into a deep-subwavelength volume (~λ³/538) over a large area. Based on funneled unit-cell structures and surface plasmon-assisted coherent interactions between them, a plasmonic metasurface cavity is constructed as a periodic array of rectangular nanocavities, each connected to a tapered nanoantenna. The average enhancement factors of the electric and magnetic field intensities reach about 60 and 22 in the nanocavities, respectively. The outstanding performance of the proposed device is verified numerically and experimentally. We expect that this work will expand methodologies for optical near-field manipulation over large areas and related potential applications, including nanophotonic sensors, nonlinear responses, and quantum interactions.

  8. Logic regression and its extensions.

    PubMed

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
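    The kind of Boolean combination logic regression searches for can be illustrated with a brute-force scan over two-SNP terms (purely illustrative: the real method uses simulated annealing over full logic trees embedded in a generalized linear model, and the data below are synthetic):

    ```python
    from itertools import combinations

    # Synthetic binary SNP data: the outcome is 1 exactly when SNP0 AND SNP2
    # are both 1, so the scan should recover that interacting pair.
    X = [
        [1, 0, 1, 0], [1, 1, 1, 1], [0, 0, 1, 0], [1, 0, 0, 1],
        [0, 1, 0, 0], [1, 1, 1, 0], [0, 0, 0, 1], [1, 0, 1, 1],
    ]
    y = [int(row[0] and row[2]) for row in X]

    OPS = {"AND": lambda a, b: a and b, "OR": lambda a, b: a or b}

    def accuracy(i, j, op):
        """Fraction of outcomes explained by the logic term op(SNP_i, SNP_j)."""
        preds = [int(OPS[op](row[i], row[j])) for row in X]
        return sum(p == t for p, t in zip(preds, y)) / len(y)

    # Exhaustively score every two-variable Boolean term and keep the best.
    best = max(((i, j, op) for i, j in combinations(range(4), 2) for op in OPS),
               key=lambda t: accuracy(*t))
    ```

    Exhaustive search is only feasible for tiny terms like these; the combinatorial growth in tree size and variable count is exactly why logic regression relies on an adaptive stochastic search.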

  9. Enzymatic detection of As(III) in aqueous solution using alginate immobilized pumpkin urease: optimization of process variables by response surface methodology.

    PubMed

    Talat, Mahe; Prakash, Om; Hasan, S H

    2009-10-01

    Urease immobilized on alginate was utilized to detect and quantify As(3+) in aqueous solution. Urease from the seeds of pumpkin (vegetable waste) was purified to apparent homogeneity by heat treatment and gel filtration (Sephadex G-200). The enzyme was further entrapped in 3.5% alginate beads. Urea hydrolysis by the enzyme revealed a clear dependence on the concentration and interaction time of As(3+). The process variables affecting the quantitation of As(3+) were investigated using a central composite design with Minitab 15 software. The predicted results were found to be in good agreement (R² = 96.71%) with the experimental results, indicating the applicability of the proposed model. The multiple regression analysis and ANOVA showed that enzyme activity decreased with increasing As(3+) concentration and interaction time. 3D surface and contour plots of As(3+) concentration versus interaction time were helpful for predicting the residual activity of the enzyme for a particular As(3+) concentration at a particular time.
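    A central composite design like the one set up here in Minitab consists of factorial, axial (star) and center runs in coded units. The sketch below generates such a design for two factors; the rotatability choice (alpha = sqrt(2)), the number of center points, and the decoded factor ranges are assumptions for illustration, not the paper's actual settings:

    ```python
    import math
    from itertools import product

    # Coded central composite design for two factors (e.g. As(3+) concentration
    # and interaction time), assuming a rotatable design with five center points.
    alpha = math.sqrt(2)                                       # rotatable for k=2
    factorial = list(product((-1.0, 1.0), repeat=2))           # 4 corner runs
    axial = [(a, 0.0) for a in (-alpha, alpha)] + \
            [(0.0, a) for a in (-alpha, alpha)]                # 4 star runs
    center = [(0.0, 0.0)] * 5                                  # 5 center runs
    design = factorial + axial + center                        # 13 runs total

    def decode(coded, low, high):
        """Map a coded level (-1..+1 spans low..high) to an actual setting."""
        mid, half = (high + low) / 2.0, (high - low) / 2.0
        return mid + coded * half

    # Hypothetical factor ranges, purely for illustration.
    runs = [(decode(x1, 5.0, 25.0), decode(x2, 10.0, 50.0)) for x1, x2 in design]
    ```

    The replicated center points estimate pure error, and the axial points (which fall outside the factorial range) are what let the fitted quadratic surface capture the curvature shown in the 3D and contour plots.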

  10. High-Throughput, Data-Rich Cellular RNA Device Engineering

    PubMed Central

    Townshend, Brent; Kennedy, Andrew B.; Xiang, Joy S.; Smolke, Christina D.

    2015-01-01

    Methods for rapidly assessing sequence-structure-function landscapes and developing conditional gene-regulatory devices are critical to our ability to manipulate and interface with biology. We describe a framework for engineering RNA devices from preexisting aptamers that exhibit ligand-responsive ribozyme tertiary interactions. Our methodology utilizes cell sorting, high-throughput sequencing, and statistical data analyses to enable parallel measurements of the activities of hundreds of thousands of sequences from RNA device libraries in the absence and presence of ligands. Our tertiary interaction RNA devices exhibit improved performance in terms of gene silencing, activation ratio, and ligand sensitivity as compared to optimized RNA devices that rely on secondary structure changes. We apply our method to building biosensors for diverse ligands and determine consensus sequences that enable ligand-responsive tertiary interactions. These methods advance our ability to develop broadly applicable genetic tools and to elucidate understanding of the underlying sequence-structure-function relationships that empower rational design of complex biomolecules. PMID:26258292

  11. Passing messages between biological networks to refine predicted interactions.

    PubMed

    Glass, Kimberly; Huttenhower, Curtis; Quackenbush, John; Yuan, Guo-Cheng

    2013-01-01

    Regulatory network reconstruction is a fundamental problem in computational biology. There are significant limitations to such reconstruction using individual datasets, and increasingly people attempt to construct networks using multiple, independent datasets obtained from complementary sources, but methods for this integration are lacking. We developed PANDA (Passing Attributes between Networks for Data Assimilation), a message-passing model using multiple sources of information to predict regulatory relationships, and used it to integrate protein-protein interaction, gene expression, and sequence motif data to reconstruct genome-wide, condition-specific regulatory networks in yeast as a model. The resulting networks were not only more accurate than those produced using individual data sets and other existing methods, but they also captured information regarding specific biological mechanisms and pathways that were missed using other methodologies. PANDA is scalable to higher eukaryotes, applicable to specific tissue or cell type data and conceptually generalizable to include a variety of regulatory, interaction, expression, and other genome-scale data. An implementation of the PANDA algorithm is available at www.sourceforge.net/projects/panda-net.

  12. The development of interior noise and vibration criteria

    NASA Technical Reports Server (NTRS)

    Leatherwood, J. D.; Clevenson, S. A.; Stephens, D. G.

    1990-01-01

    A generalized model was developed for estimating passenger discomfort response to combined noise and vibration. This model accounts for broadband noise and vibration spectra and multiple axes of vibration as well as the interactive effects of combined noise and vibration. The model has the unique capability of transforming individual components of the noise/vibration environment into subjective comfort units and then combining these comfort units to produce a total index of passenger discomfort and useful sub-indices that typify passenger comfort within the environment. An overview of the model development is presented including the methodology employed, major elements of the model, model applications, and a brief description of a commercially available portable ride comfort meter based directly upon the model algorithms. Also discussed are potential criteria formats that account for the interactive effects of noise and vibration on human discomfort response.

  13. Protein-Protein Interface and Disease: Perspective from Biomolecular Networks.

    PubMed

    Hu, Guang; Xiao, Fei; Li, Yuqian; Li, Yuan; Vongsangnak, Wanwipa

    Protein-protein interactions are involved in many important biological processes and molecular mechanisms of disease association. Structural studies of interfacial residues in protein complexes provide information on protein-protein interactions. Characterizing protein-protein interfaces, including binding sites and allosteric changes, thus poses a pressing challenge. With special focus on protein complexes, approaches based on network theory have been proposed to meet this challenge. In this review we examine protein-protein interfaces from the perspective of biomolecular networks and their roles in disease. We first describe the different roles of protein complexes in disease through several structural aspects of interfaces. We then discuss some recent advances in predicting hot spots and communication pathway analysis in terms of amino acid networks. Finally, we highlight possible future aspects of this area with respect to both methodology development and applications for disease treatment.

  14. Advances in Lipidomics for Cancer Biomarkers Discovery

    PubMed Central

    Perrotti, Francesca; Rosa, Consuelo; Cicalini, Ilaria; Sacchetta, Paolo; Del Boccio, Piero; Genovesi, Domenico; Pieragostino, Damiana

    2016-01-01

    Lipids play critical functions in cellular survival, proliferation, interaction and death, since they are involved in chemical-energy storage, cellular signaling, cell membranes, and cell–cell interactions. These cellular processes are strongly related to carcinogenesis pathways, particularly to transformation, progression, and metastasis, suggesting that bioactive lipids mediate a number of oncogenic processes. The current review gives a synopsis of the lipidomic approach to tumor characterization; we provide an overview of potential lipid biomarkers in the oncology field and of the principal lipidomic methodologies applied. The novel lipidomic biomarkers are reviewed in an effort to underline their role in diagnosis, in prognostic characterization and in prediction of therapeutic outcomes. Lipidomic investigation by mass spectrometry yields new insights into the molecular mechanisms underlying cancer disease. This new understanding will promote clinical applications in drug discovery and personalized therapy. PMID:27916803

  15. Control system software, simulation, and robotic applications

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  16. Development of an UPLC-MS/MS micromethod for quantitation of cinitapride in plasma and its application in a pharmacokinetic interaction trial.

    PubMed

    Marcelín-Jiménez, Gabriel; Contreras, Leticia; Esquivel, Javier; Ávila, Óscar; Batista, Dany; Ángeles, Alionka P; García-González, Alberto

    2017-03-01

    Cinitapride (CIN) is a benzamide-derived molecule used for the treatment of gastroesophageal reflux and dyspepsia. Its pharmacokinetics are controversial due to the use of supratherapeutic doses and the lack of sensitive methodology. Therefore, a sensitive and accurate micromethod was developed for its quantitation in human plasma. CIN was extracted from 300 µl of heparinized plasma by liquid-liquid extraction using cisapride as internal standard, and analyzed by ultra-performance liquid chromatography with positive-mode multiple reaction monitoring MS. The method proved to be rapid, accurate and stable over the range 50-2000 pg/ml and was successfully validated and applied in a pharmacokinetic interaction trial, where it was demonstrated that oral co-administration of simethicone does not modify the bioavailability of CIN.

  17. Probing protein-lipid interactions by FRET between membrane fluorophores

    NASA Astrophysics Data System (ADS)

    Trusova, Valeriya M.; Gorbenko, Galyna P.; Deligeorgiev, Todor; Gadjev, Nikolai

    2016-09-01

    Förster resonance energy transfer (FRET) is a powerful fluorescence technique that has found numerous applications in medicine and biology. One area where FRET proved to be especially informative involves the intermolecular interactions in biological membranes. The present study was focused on developing and verifying a Monte-Carlo approach to analyzing the results of FRET between the membrane-bound fluorophores. This approach was employed to quantify FRET from benzanthrone dye ABM to squaraine dye SQ-1 in the model protein-lipid system containing a polycationic globular protein lysozyme and negatively charged lipid vesicles composed of phosphatidylcholine and phosphatidylglycerol. It was found that acceptor redistribution between the lipid bilayer and protein binding sites resulted in the decrease of FRET efficiency. Quantification of this effect in terms of the proposed methodology yielded both structural and binding parameters of lysozyme-lipid complexes.
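
The quantitative core of such an analysis can be sketched in a few lines. The sketch below uses an illustrative 2-D membrane geometry, not the lysozyme-lipid model of the study: donors and acceptors are scattered randomly in a plane, and for each donor the summed transfer rate S = Σᵢ (R₀/rᵢ)⁶ over acceptors at distances rᵢ gives an efficiency E = S/(1+S), which is then averaged by Monte Carlo.

```python
import numpy as np

def fret_efficiency_mc(acceptor_density, R0=5.0, n_donors=400,
                       box=100.0, seed=0):
    """Monte-Carlo mean FRET efficiency for donors and acceptors placed
    uniformly at random in a 2-D plane (toy geometry; units arbitrary)."""
    rng = np.random.default_rng(seed)
    n_acc = max(1, int(acceptor_density * box * box))
    donors = rng.uniform(0.0, box, size=(n_donors, 2))
    acceptors = rng.uniform(0.0, box, size=(n_acc, 2))
    # all donor-acceptor distances, clipped to avoid overflow at r -> 0
    d = np.linalg.norm(donors[:, None, :] - acceptors[None, :, :], axis=-1)
    d = np.maximum(d, 1e-3)
    S = ((R0 / d) ** 6).sum(axis=1)      # summed transfer rates per donor
    return float(np.mean(S / (1.0 + S)))
```

Lowering the in-plane acceptor density (acceptors redistributing away from the bilayer to protein binding sites, as in the study) lowers the predicted efficiency, which is the trend the proposed methodology quantifies.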

  18. Voice interactive electronic warning systems (VIEWS) - An applied approach to voice technology in the helicopter cockpit

    NASA Technical Reports Server (NTRS)

    Voorhees, J. W.; Bucher, N. M.

    1983-01-01

    The cockpit has been one of the most rapidly changing areas of new aircraft design over the past thirty years. In connection with these developments, a pilot can now be considered a decision maker/system manager as well as a vehicle controller. There is, however, a trend towards an information overload in the cockpit, and information processing problems begin to occur for the rotorcraft pilot. One approach to overcome the arising difficulties is based on the utilization of voice technology to improve the information transfer rate in the cockpit with respect to both input and output. Attention is given to the background of speech technology, the application of speech technology within the cockpit, voice interactive electronic warning system (VIEWS) simulation, and methodology. Information subsystems are considered along with a dynamic simulation study, and data collection.

  19. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each method's strengths are utilized.
The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are interfaced. This capability rapidly provides the high-fidelity results needed in the early design phase. Moreover, the capability is applicable to the general field of engineering science and mechanics. Hence, it provides a collaborative capability that accounts for interactions among engineering analysis methods.

  20. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    PubMed

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
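
The moving least squares decoupling step mentioned above can be sketched as a locally weighted regression. This is an illustrative sketch under assumed names, kernel, and basis choices, not the MagOne calibration code: around each query reading, calibration samples are weighted by a Gaussian kernel, a local affine model from 3-axis magnetic readings to 3-axis forces is fitted, and the model is evaluated at the query.

```python
import numpy as np

def mls_force(b_query, B, F, h=1.0, reg=1e-9):
    """Moving least squares sketch: predict force from a magnetic reading
    b_query given calibration readings B (n x 3) and forces F (n x 3)."""
    # Gaussian weights centred on the query reading
    w = np.exp(-np.sum((B - b_query) ** 2, axis=1) / (2.0 * h * h))
    A = np.hstack([B, np.ones((len(B), 1))])        # affine basis [b, 1]
    AtW = A.T * w                                   # weighted design matrix
    G = AtW @ A + reg * np.eye(A.shape[1])          # regularised normal matrix
    coef = np.linalg.solve(G, AtW @ F)              # local affine coefficients
    return np.append(b_query, 1.0) @ coef
```

Because the model is refitted around every query, the overall mapping can be nonlinear and can absorb cross-talk between axes even though each local fit is affine, which is the attraction of the approach for decoupling three-axis sensor signals.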

  1. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    PubMed Central

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908

  2. Optimization of ultrasonic-assisted preparation of dietary fiber from corn pericarp using response surface methodology.

    PubMed

    Wang, Anna; Wu, Ligen; Li, Xiulin

    2013-09-01

    Corn pericarp, an industrial by-product of corn starch production, is an important cereal source of dietary fiber with claimed health benefits; nevertheless, it is usually discarded or used as animal feed. Pre-ultrasound treatment is critical for achieving rapid preparation of desired components from plant materials and for preserving the structural and molecular properties of these compounds. Ultrasonic-assisted preparation was used to produce dietary fiber from corn pericarp and optimized using response surface methodology. The optimal corn pericarp particle size (mesh size 40), ratio of liquid to solid (25 mL g⁻¹), ultrasonic power (180 W) and ultrasonic time (80 min) were determined from the response surface analysis. The interaction between corn pericarp particle size and ultrasonic time had a highly significant effect on the yield of dietary fiber, and the interaction between ultrasonic power and ultrasonic time had a significant effect. The maximum yield of dietary fiber was 86.84%, which agreed closely with the predicted value. Using ultrasonic-assisted preparation, it may be possible to enhance the yield of dietary fiber from corn pericarp. © 2013 Society of Chemical Industry.
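
The response-surface step used in records like this one can be sketched generically: fit the standard two-factor second-order model to coded factor settings and solve the gradient equations for the predicted optimum. The coefficients below are illustrative, not the study's data.

```python
import numpy as np

def fit_response_surface(X, y):
    """Least-squares fit of the two-factor RSM model
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def stationary_point(beta):
    """Solve grad(y) = 0 for the fitted surface (predicted optimum)."""
    _, b1, b2, b11, b22, b12 = beta
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    return np.linalg.solve(H, -np.array([b1, b2]))
```

In practice the coded optimum is mapped back to physical units (power, time, etc.), and whether the stationary point is a maximum is checked from the sign of the fitted quadratic terms.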

  3. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Technical Reports Server (NTRS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-01-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.

  4. Chromatin immunoprecipitation (ChIP) method for non-model fruit flies (Diptera: Tephritidae) and evidence of histone modifications.

    PubMed

    Nagalingam, Kumaran; Lorenc, Michał T; Manoli, Sahana; Cameron, Stephen L; Clarke, Anthony R; Dudley, Kevin J

    2018-01-01

    Interactions between DNA and proteins located in the cell nucleus play an important role in controlling physiological processes by specifying, augmenting and regulating context-specific transcription events. Chromatin immunoprecipitation (ChIP) is a widely used methodology to study DNA-protein interactions and has been successfully used in various cell types for over three decades. More recently, by combining ChIP with genomic screening technologies and Next Generation Sequencing (e.g. ChIP-seq), it has become possible to profile DNA-protein interactions (including covalent histone modifications) across entire genomes. However, the applicability of ChIP-chip and ChIP-seq has rarely been extended to non-model species because of a number of technical challenges. Here we report a method that can be used to identify genome wide covalent histone modifications in a group of non-model fruit fly species (Diptera: Tephritidae). The method was developed by testing and refining protocols that have been used in model organisms, including Drosophila melanogaster. We demonstrate that this method is suitable for a group of economically important pest fruit fly species, viz., Bactrocera dorsalis, Ceratitis capitata, Zeugodacus cucurbitae and Bactrocera tryoni. We also report an example ChIP-seq dataset for B. tryoni, providing evidence for histone modifications in the genome of a tephritid fruit fly for the first time. Since tephritids are major agricultural pests globally, this methodology will be a valuable resource to study taxa-specific evolutionary questions and to assist with pest management. It also provides a basis for researchers working with other non-model species to undertake genome wide DNA-protein interaction studies.

  5. Calculation of Derivative Thermodynamic Hydration and Aqueous Partial Molar Properties of Ions Based on Atomistic Simulations.

    PubMed

    Dahlgren, Björn; Reif, Maria M; Hünenberger, Philippe H; Hansen, Niels

    2012-10-09

    The raw ionic solvation free energies calculated on the basis of atomistic (explicit-solvent) simulations are extremely sensitive to the boundary conditions and treatment of electrostatic interactions used during these simulations. However, as shown recently [Kastenholz, M. A.; Hünenberger, P. H. J. Chem. Phys. 2006, 124, 224501 and Reif, M. M.; Hünenberger, P. H. J. Chem. Phys. 2011, 134, 144104], the application of an appropriate correction scheme allows for a conversion of the methodology-dependent raw data into methodology-independent results. In this work, methodology-independent derivative thermodynamic hydration and aqueous partial molar properties are calculated for the Na(+) and Cl(-) ions at P° = 1 bar and T° = 298.15 K, based on the SPC water model and on ion-solvent Lennard-Jones interaction coefficients previously reoptimized against experimental hydration free energies. The hydration parameters considered are the hydration free energy and enthalpy. The aqueous partial molar parameters considered are the partial molar entropy, volume, heat capacity, volume-compressibility, and volume-expansivity. Two alternative calculation methods are employed to access these properties. Method I relies on the difference in average volume and energy between two aqueous systems involving the same number of water molecules, either in the absence or in the presence of the ion, along with variations of these differences corresponding to finite pressure and/or temperature changes. Method II relies on the calculation of the hydration free energy of the ion, along with variations of this free energy corresponding to finite pressure and/or temperature changes. Both methods are used considering two distinct variants in the application of the correction scheme.
    In variant A, the raw values from the simulations are corrected after the application of finite differences in pressure and/or temperature, based on correction terms specifically designed for derivative parameters at P° and T°. In variant B, these raw values are corrected prior to differentiation, based on corresponding correction terms appropriate for the different simulation pressures P and temperatures T. The results corresponding to the different calculation schemes show that, except for the hydration free energy itself, accurate methodological independence and quantitative agreement with even the most reliable experimental parameters (ion-pair properties) are not yet reached. Nevertheless, approximate internal consistency and qualitative agreement with experimental results can be achieved, but only when an appropriate correction scheme is applied, along with a careful consideration of standard-state issues. In this sense, the main merit of the present study is to set a clear framework for these types of calculations and to point toward directions for future improvements, with the ultimate goal of reaching a consistent and quantitative description of single-ion hydration thermodynamics in molecular dynamics simulations.
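
The finite-difference machinery shared by both methods is simple at its core: a property such as enthalpy or free energy is evaluated at perturbed temperatures or pressures, and a central difference yields the derivative property. The sketch below uses a toy analytic enthalpy model with made-up coefficients, not SPC simulation data.

```python
def central_diff(f, x, h):
    """Second-order central finite difference, used to convert values at
    perturbed temperatures/pressures into derivative properties."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Toy hydration enthalpy model (illustrative coefficients only):
# H(T) = a + b*T + c*T^2, so the heat capacity c_P = dH/dT = b + 2*c*T.
a, b, c = -400.0, 0.05, 1.0e-4

def H(T):
    return a + b * T + c * T * T

T0 = 298.15
cP = central_diff(H, T0, h=10.0)   # e.g. simulate at T0 +/- 10 K
```

For a quadratic model the central difference is exact; for real simulation data the step size trades truncation error against statistical noise in the sampled averages, which is one reason the correction-before or correction-after ordering (variants A and B) matters.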

  6. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This final report covers all the work done on this project. The goal of the project is technology transfer of methodologies that improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate and graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology for inclusion in a graduate-level Design Methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.

  7. Proceedings of the Annual Meeting of the Association for Education in Journalism and Mass Communication (86th, Kansas City, Missouri, July 30-August 2, 2003). Communication Theory & Methodology Division.

    ERIC Educational Resources Information Center

    2003

    The Communication Theory & Methodology Division of the proceedings contains the following 14 papers: "Interaction As a Unit of Analysis for Interactive Media Research: A Conceptualization" (Joo-Hyun Lee and Hairong Li); "Towards a Network Approach of Human Action: Theoretical Concepts and Empirical Observations in Media…

  8. On the intersection of phonetic detail and the organization of interaction: clinical connections.

    PubMed

    Walker, Gareth; Local, John

    2013-01-01

    The analysis of language use in real-world contexts poses particular methodological challenges. We codify responses to these challenges as a series of methodological imperatives. To demonstrate the relevance of these imperatives to clinical investigation, we present analyses of single episodes of interaction where one participant has a speech and/or language impairment: atypical prosody, echolalia and dysarthria. We demonstrate there is considerable heuristic and analytic value in taking this approach to analysing the organization of interaction involving individuals with a speech and/or language impairment.

  9. Nucleon-nucleon interactions via Lattice QCD: Methodology. HAL QCD approach to extract hadronic interactions in lattice QCD

    NASA Astrophysics Data System (ADS)

    Aoki, Sinya

    2013-07-01

    We review the potential method in lattice QCD, which has recently been proposed to extract nucleon-nucleon interactions via numerical simulations. We focus on the methodology of this approach by emphasizing the strategy of the potential method, the theoretical foundation behind it, and special numerical techniques. We compare the potential method with the standard finite volume method in lattice QCD, in order to clarify the pros and cons of the approach. We also present several numerical results for nucleon-nucleon potentials.

  10. Segmentation of epidermal tissue with histopathological damage in images of haematoxylin and eosin stained human skin

    PubMed Central

    2014-01-01

    Background Digital image analysis has the potential to address issues surrounding traditional histological techniques including a lack of objectivity and high variability, through the application of quantitative analysis. A key initial step in image analysis is the identification of regions of interest. A widely applied methodology is that of segmentation. This paper proposes the application of image analysis techniques to segment skin tissue with varying degrees of histopathological damage. The segmentation of human tissue is challenging as a consequence of the complexity of the tissue structures and inconsistencies in tissue preparation, hence there is a need for a new robust method with the capability to handle the additional challenges materialising from histopathological damage. Methods A new algorithm has been developed which combines enhanced colour information, created following a transformation to the L*a*b* colourspace, with general image intensity information. A colour normalisation step is included to enhance the algorithm’s robustness to variations in the lighting and staining of the input images. The resulting optimised image is subjected to thresholding and the segmentation is fine-tuned using a combination of morphological processing and object classification rules. The segmentation algorithm was tested on 40 digital images of haematoxylin & eosin (H&E) stained skin biopsies. Accuracy, sensitivity and specificity of the algorithmic procedure were assessed through the comparison of the proposed methodology against manual methods. Results Experimental results show the proposed fully automated methodology segments the epidermis with a mean specificity of 97.7%, a mean sensitivity of 89.4% and a mean accuracy of 96.5%. When a simple user interaction step is included, the specificity increases to 98.0%, the sensitivity to 91.0% and the accuracy to 96.8%. The algorithm segments effectively for different severities of tissue damage. 
Conclusions Epidermal segmentation is a crucial first step in a range of applications including melanoma detection and the assessment of histopathological damage in skin. The proposed methodology is able to segment the epidermis with different levels of histological damage. The basic method framework could be applied to segmentation of other epithelial tissues. PMID:24521154

  11. Statistical media design for efficient polyhydroxyalkanoate production in Pseudomonas sp. MNNG-S.

    PubMed

    Saranya, V; Rajeswari, V; Abirami, P; Poornimakkani, K; Suguna, P; Shenbagarathai, R

    2016-07-03

    Polyhydroxyalkanoate (PHA) is a promising polymer for various biomedical applications, but production rates must be improved substantially before such end uses become practical. When cost-effective production was attempted with cheaper agricultural residues such as molasses, traces of toxins were incorporated into the polymer, making it unfit for biomedical applications. On the other hand, chemically defined media are increasingly popular for producing compounds with biomedical applications, although such media lack favorable characteristics, such as efficient utilization at large scale, compared to complex media. This article aims to determine the specific nutritional requirements of Pseudomonas sp. MNNG-S for efficient polyhydroxyalkanoate production. Response surface methodology (RSM) was used to statistically design media for PHA production based on the interactive effects of five significant variables (sucrose; potassium dihydrogen phosphate; ammonium sulfate; magnesium sulfate; trace elements). The interactive effects of sucrose with ammonium sulfate, ammonium sulfate with combined potassium phosphate, and trace elements with magnesium sulfate were found to be significant (p < .001). The optimization approach adopted in this study increased PHA production more than fourfold (from 0.85 g L(-1) to 4.56 g L(-1)).

  12. A review of human factors challenges of complex adaptive systems: discovering and understanding chaos in human performance.

    PubMed

    Karwowski, Waldemar

    2012-12-01

    In this paper, the author explores a need for a greater understanding of the true nature of human-system interactions from the perspective of the theory of complex adaptive systems, including the essence of complexity, emergent properties of system behavior, nonlinear systems dynamics, and deterministic chaos. Human performance, more often than not, constitutes complex adaptive phenomena with emergent properties that exhibit nonlinear dynamical (chaotic) behaviors. The complexity challenges in the design and management of contemporary work systems, including service systems, are explored. Examples of selected applications of the concepts of nonlinear dynamics to the study of human physical performance are provided. Understanding and applications of the concepts of the theory of complex adaptive and dynamical systems should significantly improve the effectiveness of human-centered design efforts for a large system of systems. Performance of many contemporary work systems and environments may be sensitive to the initial conditions and may exhibit dynamic nonlinear properties and chaotic system behaviors. Human-centered design of emergent human-system interactions requires application of the theories of nonlinear dynamics and complex adaptive systems. The success of future human-systems integration efforts requires the fusion of paradigms, knowledge, design principles, and methodologies of human factors and ergonomics with those of the science of complex adaptive systems as well as modern systems engineering.
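
The "sensitive to the initial conditions" claim has a compact textbook illustration: the logistic map at r = 4 is deterministic yet chaotic, so two trajectories started a hair apart diverge until they are effectively unrelated. This generic example is offered only to make the concept concrete; it is not drawn from the paper.

```python
def logistic_trajectory(x0, r=4.0, n=30):
    """Iterate the logistic map x -> r*x*(1-x), a standard example of
    deterministic chaos at r = 4: nearby initial conditions diverge
    exponentially (positive Lyapunov exponent)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs
```

After a few dozen iterations an initial difference of 10⁻⁷ grows to order one, which is why long-horizon prediction of chaotic human-system performance is argued to be fundamentally limited even when the dynamics are fully deterministic.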

  13. The Design, Development, and Evaluation of a Qualitative Data Collection Application for Pregnant Women.

    PubMed

    Keedle, Hazel; Schmied, Virginia; Burns, Elaine; Dahlen, Hannah

    2018-01-01

    This article explores the development and evaluation of a smartphone mobile software application (app) to collect qualitative data. The app was specifically designed to capture real-time qualitative data from women planning a vaginal birth after caesarean delivery. This article outlines the design and development of the app to include funding, ethics, and the recruitment of an app developer, as well as the evaluation of using the app by seven participants. Data collection methods used in qualitative research include interviews and focus groups (either online, face-to-face, or by phone), participant diaries, or observations of interactions. This article identifies an alternative data collection methodology using a smartphone app to collect real-time data. The app provides real-time data and instant access to data alongside the ability to access participants from a variety of locations. This allows the researcher to gain insight into the experiences of participants through audio or video recordings in longitudinal studies without the need for constant interactions or interviews with participants. Using smartphone applications can allow researchers to access participants who are traditionally hard to reach and access their data in real time. Evaluating these apps before use in research is invaluable. © 2017 Sigma Theta Tau International.

  14. Design of experiments applications in bioprocessing: concepts and approach.

    PubMed

    Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S

    2014-01-01

    Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for an efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE towards development of different bioprocessing unit operations. However, a systematic approach for evaluation of the different DOE designs and for choosing the optimal design for a given application has not been published yet. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers.
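    The model-selection step mentioned above can be made concrete with a small sketch. The example below is illustrative only (synthetic data, not the authors' case studies): it fits linear and quadratic response models to a single coded factor and picks the one with the lower Akaike information criterion.

```python
# Hypothetical illustration of AIC-based model selection for a DOE response.
import numpy as np

def aic(n, rss, k):
    """AIC for least-squares fits: n * ln(RSS/n) + 2k."""
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 15)                                    # coded factor levels
y = 2.0 + 1.5 * x + 3.0 * x**2 + rng.normal(0, 0.1, x.size)   # synthetic response

models = {}
for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    models[degree] = aic(x.size, rss, degree + 1)              # k = params in fit

best = min(models, key=models.get)
print(best)  # the quadratic model should win for this curved response
```

In practice the candidate models would come from the chosen design (I-optimal, D-optimal, central composite, or Box-Behnken), but the selection criterion is applied the same way.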

  15. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
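    The core PFA idea, using an analytical failure model together with uncertainties about its parameters to estimate a failure probability, can be sketched as follows. This is a minimal illustration under invented assumptions (log-normal fatigue-life scatter, hypothetical parameter uncertainties), not the documented PFA software.

```python
# Monte Carlo propagation of parameter uncertainty into a failure probability.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

# Hypothetical uncertain parameters of an analytical fatigue-life model:
median_life = rng.normal(1e5, 1e4, n_samples)   # uncertain median life (cycles)
scatter = rng.uniform(0.3, 0.5, n_samples)      # uncertain log-scale scatter

# Log-normal cycles-to-failure given the sampled model parameters.
life = np.exp(np.log(median_life) + rng.normal(0, 1, n_samples) * scatter)

service_life = 5e4                               # required service life (cycles)
p_failure = np.mean(life < service_life)         # estimated failure probability
print(p_failure)
```

In the actual methodology this distribution would then be updated with test and flight experience via the prescribed statistical procedures.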

  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  17. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  18. Regeneration of Bombyx mori silk nanofibers and nanocomposite fibrils by the electrospinning process

    NASA Astrophysics Data System (ADS)

    Ayutsede, Jonathan Eyitouyo

    In recent years, there has been significant interest in the utilization of natural materials for novel nanoproducts such as tissue engineered scaffolds. Silkworm silk fibers represent one of the strongest natural fibers known. Silkworm silk, a protein-based natural biopolymer, has received renewed interest in recent years due to its unique properties (strength, toughness) and potential applications such as smart textiles, protective clothing and tissue engineering. The traditional 10–20 μm diameter, triangular-shaped Bombyx mori fibers have remained unchanged over the years. However, in our study, we examine the scientific implication and potential applications of reducing the diameter to the nanoscale, changing the triangular shape of the fiber and adding nanofillers in the form of single wall carbon nanotubes (SWNT) by the electrospinning process. The electrospinning process preserves the natural conformation of the silk (random and beta-sheet). The feasibility of changing the properties of the electrospun nanofibers by post processing treatments (annealing and chemical treatment) was investigated. B. mori silk fibroin solution (formic acid) was successfully electrospun to produce uniform nanofibers (as small as 12 nm). Response Surface Methodology (RSM) was applied for the first time to experimental results of electrospinning, to develop a processing window that can reproduce regenerated silk nanofibers of a predictable size (d < 100 nm). SWNT-silk multifunctional nanocomposite fibers were fabricated for the first time with anticipated properties (mechanical, thermal and electrically conductive) that may have scientific applications (nerve regeneration, stimulation of cell-scaffold interaction).
In order to realize these applications, the following areas need to be addressed: a systematic investigation of the dispersion of the nanotubes in the silk matrix, a determination of new methodologies for characterizing the nanofiber properties and establishing the nature of the silk-SWNT interactions. A new visualization system was developed to characterize the transport properties of the nanofibrous assemblies. The morphological, chemical, structural and mechanical properties of the nanofibers were determined by field emission environmental scanning electron microscopy, Fourier transform infrared and Raman spectroscopy, wide angle x-ray diffraction and a microtensile tester, respectively.
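    The Response Surface Methodology step described above, building a processing window that predicts fiber diameter from process variables, can be illustrated with a toy quadratic surface fit. All variable names, ranges, and data below are invented for illustration.

```python
# Fit a second-order response surface: diameter = f(concentration, voltage).
import numpy as np

rng = np.random.default_rng(1)
conc = rng.uniform(8, 15, 40)     # wt% silk concentration (hypothetical factor)
volt = rng.uniform(15, 30, 40)    # applied voltage in kV (hypothetical factor)
# Synthetic "true" response: diameter grows with concentration, dips mid-voltage.
diam = 20 + 8 * (conc - 8) + 0.15 * (volt - 22) ** 2 + rng.normal(0, 2, 40)

# Design matrix for a full quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2.
X = np.column_stack([np.ones_like(conc), conc, volt,
                     conc**2, volt**2, conc * volt])
beta, *_ = np.linalg.lstsq(X, diam, rcond=None)

def predict(c, v):
    """Predicted fiber diameter at a point inside the processing window."""
    return float(beta @ np.array([1, c, v, c**2, v**2, c * v]))

print(round(predict(9, 22), 1))
```

In a real RSM study the factor settings would come from a designed experiment rather than random sampling, but the surface-fitting and prediction step is the same.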

  19. Modelling Ecological Cognitive Rehabilitation Therapies for Building Virtual Environments in Brain Injury.

    PubMed

    Martínez-Moreno, J M; Sánchez-González, P; Luna, M; Roig, T; Tormos, J M; Gómez, E J

    2016-01-01

    Brain Injury (BI) has become one of the most common causes of neurological disability in developed countries. Cognitive disorders result in a loss of independence and of patients' quality of life. Cognitive rehabilitation aims to promote patients' skills to achieve their highest degree of personal autonomy. New technologies such as virtual reality or interactive video allow developing rehabilitation therapies based on reproducible Activities of Daily Living (ADLs), increasing the ecological validity of the therapy. However, the lack of frameworks to formalize and represent the definition of this kind of therapy can be a barrier to widespread use of interactive virtual environments in clinical routine. To provide neuropsychologists with a methodology and an instrument to design and evaluate cognitive rehabilitation therapeutic intervention strategies based on ADLs performed in interactive virtual environments. The proposed methodology is used to model therapeutic interventions during virtual ADLs, considering cognitive deficits, expected abnormal interactions and therapeutic hypotheses. It allows identifying abnormal behavioural patterns and designing intervention strategies in order to achieve errorless-based rehabilitation. An ADL case study ('buying bread') is defined according to the guidelines established by the ADL intervention model. This case study is developed, as a proof of principle, using interactive video technology and is used to assess the feasibility of the proposed methodology in the definition of therapeutic intervention procedures. The proposed methodology provides neuropsychologists with an instrument to design and evaluate ADL-based therapeutic intervention strategies, addressing current limitations of virtual scenarios, to be used for the ecological rehabilitation of cognitive deficits in daily clinical practice.
The developed case study proves the potential of the methodology to design therapeutic intervention strategies; however, our current work is devoted to designing more experiments in order to present more evidence about its value.

  20. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  1. Electronic structure, dielectric response, and surface charge distribution of RGD (1FUV) peptide.

    PubMed

    Adhikari, Puja; Wen, Amy M; French, Roger H; Parsegian, V Adrian; Steinmetz, Nicole F; Podgornik, Rudolf; Ching, Wai-Yim

    2014-07-08

    Long and short range molecular interactions govern molecular recognition and self-assembly of biological macromolecules. Microscopic parameters in the theories of these molecular interactions are either phenomenological or need to be calculated within a microscopic theory. We report a unified methodology for the ab initio quantum mechanical (QM) calculation that yields all the microscopic parameters, namely the partial charges as well as the frequency-dependent dielectric response function, that can then be taken as input for macroscopic theories of electrostatic, polar, and van der Waals-London dispersion intermolecular forces. We apply this methodology to obtain the electronic structure of the cyclic tripeptide RGD-4C (1FUV). This ab initio unified methodology yields the relevant parameters entering the long range interactions of biological macromolecules, providing accurate data for the partial charge distribution and the frequency-dependent dielectric response function of this peptide. These microscopic parameters determine the range and strength of the intricate intermolecular interactions between potential docking sites of the RGD-4C ligand and its integrin receptor.

  2. End State: The Fallacy of Modern Military Planning

    DTIC Science & Technology

    2017-04-06

    operational planning for non-linear, complex scenarios requires application of non-linear, advanced planning techniques such as design methodology...cannot be approached in a linear, mechanistic manner by a universal planning methodology. Theater/global campaign plans and theater strategies offer no...strategic environments, and instead prescribes a universal linear methodology that pays no mind to strategic complexity. This universal application

  3. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in non-real-time simulations as well as piloted simulations. Results from the non-real-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  4. Concurrent measurement of "real-world" stress and arousal in individuals with psychosis: assessing the feasibility and validity of a novel methodology.

    PubMed

    Kimhy, David; Delespaul, Philippe; Ahn, Hongshik; Cai, Shengnan; Shikhman, Marina; Lieberman, Jeffrey A; Malaspina, Dolores; Sloan, Richard P

    2010-11-01

    Psychosis has been repeatedly suggested to be affected by increases in stress and arousal. However, there is a dearth of evidence supporting the temporal link between stress, arousal, and psychosis during "real-world" functioning. This paucity of evidence may stem from limitations of current research methodologies. Our aim is to test the feasibility and validity of a novel methodology designed to measure concurrent stress and arousal in individuals with psychosis during "real-world" daily functioning. Twenty patients with psychosis completed a 36-hour ambulatory assessment of stress and arousal. We used the experience sampling method with palm computers to assess stress (10 times per day, 10 AM to 10 PM) along with concurrent ambulatory measurement of cardiac autonomic regulation using a Holter monitor. The clocks of the palm computer and Holter monitor were synchronized, allowing the temporal linking of the stress and arousal data. We used power spectral analysis to determine the parasympathetic contributions to autonomic regulation and sympathovagal balance during 5 minutes before and after each experience sample. Patients completed 79% of the experience samples (75% with valid concurrent arousal data). Momentary increases in stress had an inverse correlation with concurrent parasympathetic activity (ρ = -.27, P < .0001) and a positive correlation with sympathovagal balance (ρ = .19, P = .0008). Stress and heart rate were not significantly related (ρ = -.05, P = .3875). The findings support the feasibility and validity of our methodology in individuals with psychosis. The methodology offers a novel way to study in high time resolution the concurrent, "real-world" interactions between stress, arousal, and psychosis. The authors discuss the methodology's potential applications and future research directions.
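    The reported analysis step, correlating time-synchronized momentary stress ratings with a concurrent autonomic index via Spearman's rank correlation, can be sketched on synthetic data. The data, scale, and effect sizes below are invented, not the study's.

```python
# Rank correlation between momentary stress and a parasympathetic index.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 150                                          # valid experience samples
stress = rng.integers(1, 8, n).astype(float)     # hypothetical 1-7 stress rating
# Parasympathetic activity modeled as inversely related to stress, plus noise.
hf_power = 6.0 - 0.4 * stress + rng.normal(0, 1.0, n)

rho, p = spearmanr(stress, hf_power)
print(round(rho, 2), p < 0.05)
```

The same call would be applied to each temporally linked (experience sample, Holter window) pair after synchronization.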

  5. Selection of organisms for the co-evolution-based study of protein interactions.

    PubMed

    Herman, Dorota; Ochoa, David; Juan, David; Lopez, Daniel; Valencia, Alfonso; Pazos, Florencio

    2011-09-12

    The prediction and study of protein interactions and functional relationships based on similarity of phylogenetic trees, exemplified by the mirrortree and related methodologies, is widely used. Although dependence between the performance of these methods and the set of organisms used to build the trees was suspected, so far nobody has assessed it in an exhaustive way, and, in general, previous works used as many organisms as possible. In this work we assess the effect of using different sets of organisms (chosen according to various phylogenetic criteria) on the performance of this methodology in detecting protein interactions of different natures. We show that the performance of three mirrortree-related methodologies depends on the set of organisms used for building the trees, and it is not always directly related to the number of organisms in a simple way. Certain subsets of organisms seem to be more suitable for the prediction of certain types of interactions. This relationship between type of interaction and the optimal set of organisms for detecting them makes sense in the light of the phylogenetic distribution of the organisms and the nature of the interactions. In order to obtain optimal performance when predicting protein interactions, it is recommended to use different sets of organisms depending on the available computational resources and data, as well as the type of interactions of interest.
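    The mirrortree principle underlying this work can be sketched compactly: the co-evolution signal between two protein families is scored as the correlation between their inter-organism distance matrices. The matrices below are synthetic stand-ins for tree-derived distances, chosen only to illustrate the computation.

```python
# Mirrortree-style score: correlate the upper triangles of two distance matrices.
import numpy as np

rng = np.random.default_rng(3)
n_org = 12                                    # organisms shared by both families

# Synthetic symmetric distance matrices; family B mirrors family A with noise.
base = rng.uniform(0.1, 1.0, (n_org, n_org))
dist_a = (base + base.T) / 2
np.fill_diagonal(dist_a, 0.0)
noise = rng.normal(0, 0.05, (n_org, n_org))
dist_b = dist_a + (noise + noise.T) / 2
np.fill_diagonal(dist_b, 0.0)

iu = np.triu_indices(n_org, k=1)              # compare upper triangles only
score = np.corrcoef(dist_a[iu], dist_b[iu])[0, 1]
print(round(score, 2))                        # high score suggests interaction
```

The organism-set question studied in the paper corresponds to choosing which rows/columns (organisms) enter these matrices before the correlation is computed.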

  6. Selection of organisms for the co-evolution-based study of protein interactions

    PubMed Central

    2011-01-01

    Background The prediction and study of protein interactions and functional relationships based on similarity of phylogenetic trees, exemplified by the mirrortree and related methodologies, is widely used. Although dependence between the performance of these methods and the set of organisms used to build the trees was suspected, so far nobody has assessed it in an exhaustive way, and, in general, previous works used as many organisms as possible. In this work we assess the effect of using different sets of organisms (chosen according to various phylogenetic criteria) on the performance of this methodology in detecting protein interactions of different natures. Results We show that the performance of three mirrortree-related methodologies depends on the set of organisms used for building the trees, and it is not always directly related to the number of organisms in a simple way. Certain subsets of organisms seem to be more suitable for the prediction of certain types of interactions. This relationship between type of interaction and the optimal set of organisms for detecting them makes sense in the light of the phylogenetic distribution of the organisms and the nature of the interactions. Conclusions In order to obtain optimal performance when predicting protein interactions, it is recommended to use different sets of organisms depending on the available computational resources and data, as well as the type of interactions of interest. PMID:21910884

  7. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. 
Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
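    The proposed hazard interaction matrix and cascade idea can be illustrated with a toy data structure; the hazard pairs and relation labels below are illustrative examples, not the paper's catalogue.

```python
# A toy hazard interaction matrix: (primary, secondary) -> relation type.
interactions = {
    ("earthquake", "landslide"): "triggering",
    ("earthquake", "tsunami"): "triggering",
    ("storm", "flood"): "triggering",
    ("drought", "wildfire"): "increased probability",
    ("wildfire", "landslide"): "increased probability",
}

def secondary_hazards(primary):
    """Hazards a primary hazard can lead to, with their relation type."""
    return {sec: rel for (pri, sec), rel in interactions.items() if pri == primary}

def cascade(primary, seen=None):
    """Follow interaction links to enumerate the network (cascade) of hazards."""
    seen = seen or {primary}
    for sec in secondary_hazards(primary):
        if sec not in seen:
            seen.add(sec)
            cascade(sec, seen)
    return seen

print(sorted(cascade("earthquake")))
```

A full implementation would also carry the catalysis/impedance relation type and spatial relevance, but the matrix-plus-traversal structure is the same.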

  8. When hydroquinone meets methoxy radical: Hydrogen abstraction reaction from the viewpoint of interacting quantum atoms.

    PubMed

    Petković, Milena; Nakarada, Đura; Etinski, Mihajlo

    2018-05-25

    Interacting Quantum Atoms methodology is used for a detailed analysis of hydrogen abstraction reaction from hydroquinone by methoxy radical. Two pathways are analyzed, which differ in the orientation of the reactants at the corresponding transition states. Although the discrepancy between the two barriers amounts to only 2 kJ/mol, which implies that the two pathways are of comparable probability, the extent of intra-atomic and inter-atomic energy changes differs considerably. We thus demonstrated that Interacting Quantum Atoms procedure can be applied to unravel distinct energy transfer routes in seemingly similar mechanisms. Identification of energy components with the greatest contribution to the variation of the overall energy (intra-atomic and inter-atomic terms that involve hydroquinone's oxygen and the carbon atom covalently bound to it, the transferring hydrogen and methoxy radical's oxygen), is performed using the Relative energy gradient method. Additionally, the Interacting Quantum Fragments approach shed light on the nature of dominant interactions among selected fragments: both Coulomb and exchange-correlation contributions are of comparable importance when considering interactions of the transferring hydrogen atom with all other atoms, whereas the exchange-correlation term dominates interaction between methoxy radical's methyl group and hydroquinone's aromatic ring. This study represents one of the first applications of Interacting Quantum Fragments approach on first order saddle points. © 2018 Wiley Periodicals, Inc.

  9. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models.

    PubMed

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health fields. Analyzing the DTI profiling of the drugs helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug mode of actions. Therefore, it is of high importance to reliably and rapidly predict DTI profiling of the drugs on a genome-scale level. Here, we develop the TargetNet server, which can make real-time DTI predictions based only on molecular structures, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct prediction models. Ensemble learning from these fingerprints was also provided to improve the prediction ability. When the user submits a molecule, the server will predict the activity of the user's molecule across 623 human proteins by the established high quality SAR model, thus generating a DTI profiling that can be used as a feature vector of chemicals for wide applications. The 623 SAR models related to 623 human proteins were strictly evaluated and validated by several model validation strategies, resulting in AUC scores of 75-100 %. We applied the generated DTI profiling to successfully predict potential targets, toxicity classification, drug-drug interactions, and drug mode of action, which sufficiently demonstrated the wide application value of the potential DTI profiling. The TargetNet webserver is designed based on the Django framework in Python, and is freely accessible at http://targetnet.scbdd.com.
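    The modelling approach described, naïve Bayes over binary molecular fingerprints, can be sketched as follows. This is a hedged illustration on random bit-vector "fingerprints", not TargetNet's actual models or data.

```python
# One per-target SAR model: Bernoulli naive Bayes on binary fingerprints.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_mol, n_bits = 400, 64

X = rng.integers(0, 2, (n_mol, n_bits))                  # toy fingerprint matrix
w = rng.normal(0, 1, n_bits)                             # hidden "SAR" weights
y = (X @ w + rng.normal(0, 2, n_mol) > 0).astype(int)    # active / inactive label

model = BernoulliNB().fit(X[:300], y[:300])              # train on 300 molecules
auc = roc_auc_score(y[300:], model.predict_proba(X[300:])[:, 1])
print(round(auc, 2))                                     # held-out AUC
```

The server applies one such model per protein; concatenating the 623 per-target predictions for a query molecule yields the DTI profile used as a feature vector.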

  10. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models

    NASA Astrophysics Data System (ADS)

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health fields. Analyzing the DTI profiling of the drugs helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug mode of actions. Therefore, it is of high importance to reliably and rapidly predict DTI profiling of the drugs on a genome-scale level. Here, we develop the TargetNet server, which can make real-time DTI predictions based only on molecular structures, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct prediction models. Ensemble learning from these fingerprints was also provided to improve the prediction ability. When the user submits a molecule, the server will predict the activity of the user's molecule across 623 human proteins by the established high quality SAR model, thus generating a DTI profiling that can be used as a feature vector of chemicals for wide applications. The 623 SAR models related to 623 human proteins were strictly evaluated and validated by several model validation strategies, resulting in AUC scores of 75-100 %. We applied the generated DTI profiling to successfully predict potential targets, toxicity classification, drug-drug interactions, and drug mode of action, which sufficiently demonstrated the wide application value of the potential DTI profiling. The TargetNet webserver is designed based on the Django framework in Python, and is freely accessible at http://targetnet.scbdd.com.

  11. Application of the Interacting Quantum Atoms Approach to the S66 and Ionic-Hydrogen-Bond Datasets for Noncovalent Interactions.

    PubMed

    Suárez, Dimas; Díaz, Natalia; Francisco, Evelio; Martín Pendás, Angel

    2018-04-17

    The interacting quantum atoms (IQA) method can assess, systematically and in great detail, the strength and physics of both covalent and noncovalent interactions. The lack of a pair density in density functional theory (DFT), which precludes the direct IQA decomposition of the characteristic exchange-correlation energy, has been recently overcome by means of a scaling technique, which can largely expand the applicability of the method. To better assess the utility of the augmented IQA methodology to derive quantum chemical decompositions at the atomic and molecular levels, we report the results of Hartree-Fock (HF) and DFT calculations on the complexes included in the S66 and the ionic H-bond databases of benchmark geometry and binding energies. For all structures, we perform single-point and geometry optimizations using HF and selected DFT methods with triple-ζ basis sets followed by full IQA calculations. Pairwise dispersion energies are accounted for by the D3 method. We analyze the goodness of the HF-D3 and DFT-D3 binding energies, the magnitude of numerical errors, the fragment and atomic distribution of formation energies, etc. It is shown that fragment-based IQA decomposes the formation energies in comparable terms to those of perturbative approaches and that the atomic IQA energies hold the promise of rigorously quantifying atomic and group energy contributions in larger biomolecular systems. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms

    NASA Technical Reports Server (NTRS)

    Kurdila, Andrew J.; Sharpley, Robert C.

    1999-01-01

    This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: 1) wavelet-based methodologies for the compression, transmission, decoding, and visualization of three-dimensional finite element geometry and simulation data in a network environment; 2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and 3) methodologies for interactive algorithm steering for the acceleration of large-scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet-based Particle Image Velocimetry algorithms and reduced-order input-output models for nonlinear systems obtained by utilizing wavelet approximations.

  13. Prototype integration of the joint munitions assessment and planning model with the OSD threat methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynn, R.Y.S.; Bolmarcich, J.J.

    The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and the OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.

  14. Basic principles, methodology, and applications of remote sensing in agriculture

    NASA Technical Reports Server (NTRS)

    Moreira, M. A. (Principal Investigator); Deassuncao, G. V.

    1984-01-01

    The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology that may aid crop forecasting, on the basic concepts of spectral signatures of vegetation, on the methodology of LANDSAT data utilization in agriculture, and on the remote sensing program applications of INPE (Institute for Space Research) in agriculture.

  15. Additive Manufacturing of Functional Elements on Sheet Metal

    NASA Astrophysics Data System (ADS)

    Schaub, Adam; Ahuja, Bhrigu; Butzhammer, Lorenz; Osterziel, Johannes; Schmidt, Michael; Merklein, Marion

    The Laser Beam Melting (LBM) process, despite its advantages of high design flexibility and free-form manufacturing, often sees limited application due to its low productivity and unsuitability for mass production compared to conventional manufacturing processes. In order to overcome these limitations, a hybrid manufacturing methodology is developed combining the additive manufacturing process of laser beam melting with sheet forming processes. With an interest towards the aerospace and medical industries, the material in focus is Ti-6Al-4V. Although Ti-6Al-4V is a commercially established material and its application in the LBM process has been extensively investigated, the combination of LBM of Ti-6Al-4V with sheet metal still needs to be researched. Process dynamics such as high temperature gradients and thermally induced stresses lead to complex stress states at the interaction zone between the sheet and the LBM structure. Within the presented paper, mechanical characterization of hybrid parts is performed by shear testing. The association of shear strength with process parameters is further investigated by analyzing the internal structure of the hybrid geometry at varying energy inputs during the LBM process. In order to compare the hybrid manufacturing methodology with conventional fabrication, the conventional methodologies of subtractive machining and state-of-the-art laser beam melting are evaluated within this work. These processes are analyzed for their mechanical characteristics and productivity by determining the build time and raw material consumption for each case. The paper concludes by presenting the characteristics of the hybrid manufacturing methodology compared to alternative manufacturing technologies.

  16. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the Lean Six Sigma (L6S) tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The article explores these methodologies.

  17. CFD methodology and validation for turbomachinery flows

    NASA Astrophysics Data System (ADS)

    Hirsch, Ch.

    1994-05-01

    The essential problem today, in the application of 3D Navier-Stokes simulations to the design and analysis of turbomachinery components, is the validation of the numerical approximation and of the physical models, in particular the turbulence modelling. Although most of the complex 3D flow phenomena occurring in turbomachinery bladings can be captured with relatively coarse meshes, many detailed flow features are dependent on mesh size and on the turbulence and transition models. A brief review of the present state of the art of CFD methodology is given, with emphasis on the quality and accuracy of numerical approximations related to viscous flow computations. Considerations related to the mesh influence on solution accuracy are stressed. The basic problems of turbulence and transition modelling are discussed next, with a short summary of the main turbulence models and their applications to representative turbomachinery flows. Validations indicate that none of the available turbulence models is able to predict all the detailed flow behavior in complex flow interactions. In order to identify the phenomena that can be captured on coarser meshes, a detailed understanding of the complex 3D flow in compressors and turbines is necessary. Examples of global validations for different flow configurations, representative of compressor and turbine aerodynamics, are presented, including secondary and tip clearance flows.

  18. TEMPO-Oxidized Nanofibrillated Cellulose as a High Density Carrier for Bioactive Molecules.

    PubMed

    Weishaupt, Ramon; Siqueira, Gilberto; Schubert, Mark; Tingaut, Philippe; Maniura-Weber, Katharina; Zimmermann, Tanja; Thöny-Meyer, Linda; Faccio, Greta; Ihssen, Julian

    2015-11-09

    Controlled and efficient immobilization of specific biomolecules is a key technology for introducing new, favorable functions to materials suitable for biomedical applications. Here, we describe an innovative and efficient two-step methodology for the stable immobilization of various biomolecules, including small peptides and enzymes, onto TEMPO-oxidized nanofibrillated cellulose (TO-NFC). The introduction of carboxylate groups to NFC by TEMPO oxidation provided a high surface density of negative charges able to drive the adsorption of biomolecules and take part in covalent cross-linking reactions with 1-ethyl-3-[3-(dimethylamino)propyl]carbodiimide (EDAC) and glutaraldehyde (Ga) chemistry. Up to 0.27 μmol of different biomolecules per mg of TO-NFC could be reversibly immobilized by electrostatic interaction. An additional chemical cross-linking step prevented desorption of more than 80% of these molecules. Using the cysteine protease papain as a model, a highly active papain-TO-NFC conjugate was achieved. Once papain was immobilized, 40% of the initial enzymatic activity was retained, with an increase in kcat from 213 to >700 s(-1) for the covalently immobilized enzymes. The methodology presented in this work expands the range of application of TO-NFC in the biomedical field by enabling well-defined hybrid biomaterials with a high density of functionalization.

  19. Conceptual compression discussion on a multi-linear (FTA) and systematic (FRAM) method in an offshore operation's accident modeling.

    PubMed

    Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza

    2016-12-01

    Risk assessment can be classified into two broad categories: traditional and modern. This paper aims to contrast the functional resonance analysis method (FRAM), as a modern approach, with fault tree analysis (FTA), as a traditional method, in assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. Also, a FRAM network is executed with regard to the nonlinear interaction of human and organizational levels to assess the safety of technological systems. The methodology is implemented for the lifting of structures deep offshore. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.
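
    As a point of reference for the multi-linear side of the comparison, FTA reduces a top event to basic-event probabilities combined through AND/OR gates (assuming statistically independent events). A minimal sketch with invented probabilities for a hypothetical lifting scenario, not the paper's actual tree:

```python
# Top-event probability from basic events via AND/OR gates,
# assuming statistically independent basic events.

def gate_and(*probs):
    """All inputs must fail: product of probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(*probs):
    """Any input fails: complement of the product of survivals."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical tree: top event = crane fault OR (operator error
# AND lapse of supervision); all numbers are illustrative only.
p_crane, p_operator, p_supervision = 0.01, 0.05, 0.10
p_top = gate_or(p_crane, gate_and(p_operator, p_supervision))
```

    The contrast with FRAM is visible even at this scale: the tree fixes the causal structure in advance, whereas FRAM models variability propagating through coupled functions.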

  20. Description and first application of a new technique to measure the gravitational mass of antihydrogen

    NASA Astrophysics Data System (ADS)

    Alpha Collaboration; Amole, C.; Ashkezari, M. D.; Baquero-Ruiz, M.; Bertsche, W.; Butler, E.; Capra, A.; Cesar, C. L.; Charlton, M.; Eriksson, S.; Fajans, J.; Friesen, T.; Fujiwara, M. C.; Gill, D. R.; Gutierrez, A.; Hangst, J. S.; Hardy, W. N.; Hayden, M. E.; Isaac, C. A.; Jonsell, S.; Kurchaninov, L.; Little, A.; Madsen, N.; McKenna, J. T. K.; Menary, S.; Napoli, S. C.; Nolan, P.; Olin, A.; Pusa, P.; Rasmussen, C. Ø.; Robicheaux, F.; Sarid, E.; Silveira, D. M.; So, C.; Thompson, R. I.; van der Werf, D. P.; Wurtele, J. S.; Zhmoginov, A. I.; Charman, A. E.

    2013-04-01

    Physicists have long wondered whether the gravitational interactions between matter and antimatter might be different from those between matter and itself. Although there are many indirect indications that no such differences exist and that the weak equivalence principle holds, there have been no direct, free-fall style, experimental tests of gravity on antimatter. Here we describe a novel direct test methodology; we search for a propensity for antihydrogen atoms to fall downward when released from the ALPHA antihydrogen trap. In the absence of systematic errors, we can reject ratios of the gravitational to inertial mass of antihydrogen >75 at a statistical significance level of 5%; worst-case systematic errors increase the minimum rejection ratio to 110. A similar search places somewhat tighter bounds on a negative gravitational mass, that is, on antigravity. This methodology, coupled with ongoing experimental improvements, should allow us to bound the ratio within the more interesting near equivalence regime.

  1. Description and first application of a new technique to measure the gravitational mass of antihydrogen

    PubMed Central

    Amole, C.; Ashkezari, M. D.; Baquero-Ruiz, M.; Bertsche, W.; Butler, E.; Capra, A.; Cesar, C. L.; Charlton, M.; Eriksson, S.; Fajans, J.; Friesen, T.; Fujiwara, M. C.; Gill, D. R.; Gutierrez, A.; Hangst, J. S.; Hardy, W. N.; Hayden, M. E.; Isaac, C. A.; Jonsell, S.; Kurchaninov, L.; Little, A.; Madsen, N.; McKenna, J. T. K.; Menary, S.; Napoli, S. C.; Nolan, P.; Olin, A.; Pusa, P.; Rasmussen, C. Ø; Robicheaux, F.; Sarid, E.; Silveira, D. M.; So, C.; Thompson, R. I.; van der Werf, D. P.; Wurtele, J. S.; Zhmoginov, A. I.; Charman, A. E.

    2013-01-01

    Physicists have long wondered whether the gravitational interactions between matter and antimatter might be different from those between matter and itself. Although there are many indirect indications that no such differences exist and that the weak equivalence principle holds, there have been no direct, free-fall style, experimental tests of gravity on antimatter. Here we describe a novel direct test methodology; we search for a propensity for antihydrogen atoms to fall downward when released from the ALPHA antihydrogen trap. In the absence of systematic errors, we can reject ratios of the gravitational to inertial mass of antihydrogen >75 at a statistical significance level of 5%; worst-case systematic errors increase the minimum rejection ratio to 110. A similar search places somewhat tighter bounds on a negative gravitational mass, that is, on antigravity. This methodology, coupled with ongoing experimental improvements, should allow us to bound the ratio within the more interesting near equivalence regime. PMID:23653197

  2. Structural Loads Analysis for Wave Energy Converters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    2017-06-03

    This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process.

  3. Satellite Vulnerability to Space Debris- An Improved 3D Risk Assessment Methodology

    NASA Astrophysics Data System (ADS)

    Grassi, Lilith; Destefanis, Roberto; Tiboldo, Francesca; Donath, Therese; Winterboer, Arne; Evand, Leanne; Janovsky, Rolf; Kempf, Scott; Rudolph, Martin; Schafer, Frank; Gelhaus, Johannes

    2013-08-01

    The work described in the present paper, performed as a part of the P2-ROTECT project, presents an enhanced method to evaluate satellite vulnerability to micrometeoroids and orbital debris (MMOD), using the ESABASE2/Debris tool (developed under ESA contract). Starting from the estimation of induced failures on spacecraft (S/C) components and from the computation of lethal impacts (with an energy leading to the loss of the satellite), and considering the equipment redundancies and interactions between components, the debris-induced S/C functional impairment is assessed. The developed methodology, illustrated through its application to a case study satellite, includes the capability to estimate the number of failures on internal components, overcoming the limitations of current tools, which do not allow propagating the debris cloud inside the S/C. The ballistic limit of internal equipment behind a sandwich panel structure is evaluated through the implementation of the Schäfer Ryan Lambert (SRL) Ballistic Limit Equation (BLE).

  4. Computation of three-dimensional nozzle-exhaust flow fields with the GIM code

    NASA Technical Reports Server (NTRS)

    Spradley, L. W.; Anderson, P. G.

    1978-01-01

    A methodology is introduced for constructing numerical analogs of the partial differential equations of continuum mechanics. A general formulation is provided which permits classical finite element and many of the finite difference methods to be derived directly. The approach, termed the General Interpolants Method (GIM), can combine the best features of finite element and finite difference methods. A quasi-variational procedure is used to formulate the element equations, to introduce boundary conditions into the method and to provide a natural assembly sequence. A derivation is given in terms of general interpolation functions from this procedure. Example computations for transonic and supersonic flows in two and three dimensions are given to illustrate the utility of GIM. A three-dimensional nozzle-exhaust flow field is solved including interaction with the freestream and a coupled treatment of the shear layer. Potential applications of the GIM code to a variety of computational fluid dynamics problems are then discussed in terms of existing capability or by extension of the methodology.

  5. Application of response surface methodology for optimization of polygalacturonase production by Aspergillus niger.

    PubMed

    Yadav, Kaushlesh K; Garg, Neelima; Kumar, Devendra; Kumar, Sanjay; Singh, Achal; Muthukumar, M

    2015-01-01

    Polygalacturonase (PG) degrades pectin into D-galacturonic acid monomers and is used widely in the food industry, especially for juice clarification. In the present study, fermentation conditions for polygalacturonase production by Aspergillus niger NAIMCCF-02958, using mango peel as substrate, were optimized using a 2(3) factorial design with central composite rotatable design (CCRD) of response surface methodology (RSM). A maximum PG activity of 723.66 U g(-1) was achieved at pH 4.0, temperature 30 degrees C and 2% inoculum according to the response surface curve. The experimental value of PG activity (607.65 U g(-1)) was higher than the predicted value (511.75 U g(-1)). Under the proposed optimized conditions, the determination coefficient (R2) was equal to 0.66, indicating that the model could explain 66% of the total variation as well as establish the relationship between the variables and the responses. ANOVA and the three-dimensional plots also confirmed interactions among the parameters.
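
    The core of the RSM workflow above, fitting a second-order model to designed runs and locating its stationary point, can be sketched in a few lines. The single-factor data below (activity versus pH) are invented for illustration and are not the study's measurements:

```python
# Least-squares fit of a second-order response model and location of
# its stationary point, the basic step behind an RSM optimization.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic(xs, ys):
    """Normal-equation least squares for y = b0 + b1*x + b2*x**2."""
    rows = [[1.0, x, x * x] for x in xs]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)]
         for i in range(3)]
    b = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    return solve(A, b)

pH = [3.0, 3.5, 4.0, 4.5, 5.0]
activity = [540.0, 660.0, 720.0, 655.0, 535.0]   # hypothetical responses
b0, b1, b2 = fit_quadratic(pH, activity)
pH_opt = -b1 / (2.0 * b2)   # stationary point of the fitted surface
```

    In the multi-factor case the same idea extends to cross-product terms, and the stationary point comes from solving the gradient system rather than a single vertex formula.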

  6. Description and first application of a new technique to measure the gravitational mass of antihydrogen.

    PubMed

    Charman, A E; Amole, C; Ashkezari, M D; Baquero-Ruiz, M; Bertsche, W; Butler, E; Capra, A; Cesar, C L; Charlton, M; Eriksson, S; Fajans, J; Friesen, T; Fujiwara, M C; Gill, D R; Gutierrez, A; Hangst, J S; Hardy, W N; Hayden, M E; Isaac, C A; Jonsell, S; Kurchaninov, L; Little, A; Madsen, N; McKenna, J T K; Menary, S; Napoli, S C; Nolan, P; Olin, A; Pusa, P; Rasmussen, C Ø; Robicheaux, F; Sarid, E; Silveira, D M; So, C; Thompson, R I; van der Werf, D P; Wurtele, J S; Zhmoginov, A I

    2013-01-01

    Physicists have long wondered whether the gravitational interactions between matter and antimatter might be different from those between matter and itself. Although there are many indirect indications that no such differences exist and that the weak equivalence principle holds, there have been no direct, free-fall style, experimental tests of gravity on antimatter. Here we describe a novel direct test methodology; we search for a propensity for antihydrogen atoms to fall downward when released from the ALPHA antihydrogen trap. In the absence of systematic errors, we can reject ratios of the gravitational to inertial mass of antihydrogen >75 at a statistical significance level of 5%; worst-case systematic errors increase the minimum rejection ratio to 110. A similar search places somewhat tighter bounds on a negative gravitational mass, that is, on antigravity. This methodology, coupled with ongoing experimental improvements, should allow us to bound the ratio within the more interesting near equivalence regime.

  7. Application of anaerobic granular sludge for competitive biosorption of methylene blue and Pb(II): Fluorescence and response surface methodology.

    PubMed

    Shi, Li; Wei, Dong; Ngo, Huu Hao; Guo, Wenshan; Du, Bin; Wei, Qin

    2015-10-01

    This study assessed the biosorption capacity of anaerobic granular sludge (AGS) as a biosorbent to remove Pb(II) and methylene blue (MB) from a multi-component aqueous solution. The biosorption data fitted well to the pseudo-second-order kinetic and Langmuir adsorption isotherm models in both single and binary systems. In competitive biosorption systems, Pb(II) and MB suppressed each other's biosorption capacity. Spectroscopic analyses, including Fourier transform infrared spectroscopy (FTIR) and fluorescence spectroscopy, were integrated to explain this interaction. Hydroxyl and amine groups in AGS were the key functional groups for sorption. Three-dimensional excitation-emission matrix (3D-EEM) spectroscopy implied that two main protein-like substances were identified and quenched when Pb(II) or MB were present. Response surface methodology (RSM) confirmed that the removal efficiency of Pb(II) and MB reached its peak when the concentration ratio of Pb(II) and MB achieved a constant value of 1. Copyright © 2015 Elsevier Ltd. All rights reserved.
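
    As an illustration of the isotherm fitting mentioned above, the linearized Langmuir form (Ce/qe versus Ce) recovers the maximum capacity qmax and affinity constant KL by simple linear regression. The equilibrium data below are synthetic, generated from known parameters, and are not the study's measurements:

```python
# Linearized Langmuir fit: Ce/qe = Ce/qmax + 1/(KL*qmax),
# i.e. a straight line in Ce with slope 1/qmax.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic equilibrium data from qmax = 100 mg/g, KL = 0.2 L/mg
# via qe = qmax*KL*Ce / (1 + KL*Ce).
Ce = [5.0, 10.0, 20.0, 50.0, 100.0]
qe = [100 * 0.2 * c / (1 + 0.2 * c) for c in Ce]

slope, intercept = linear_fit(Ce, [c / q for c, q in zip(Ce, qe)])
qmax = 1.0 / slope          # recovered maximum capacity, mg/g
KL = slope / intercept      # recovered affinity constant, L/mg
```

    A pseudo-second-order kinetic fit follows the same pattern with t/qt plotted against t.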

  8. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    PubMed

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and we then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links for all available software implementing multivariate meta-analysis methods are also provided.
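
    For readers unfamiliar with the univariate building block that the multivariate models generalize, fixed-effect inverse-variance pooling can be sketched in a few lines; the per-study log odds ratios and standard errors below are invented for illustration:

```python
# Fixed-effect inverse-variance meta-analysis: studies are weighted by
# the reciprocal of their variance, so precise studies dominate.
import math

def pooled_fixed_effect(estimates, std_errors):
    """Inverse-variance weighted pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# Invented per-study log odds ratios and standard errors.
log_or = [0.30, 0.10, 0.25, 0.45]
se = [0.12, 0.20, 0.15, 0.25]

est, est_se = pooled_fixed_effect(log_or, se)
ci_95 = (est - 1.96 * est_se, est + 1.96 * est_se)
```

    The multivariate methods reviewed above replace the scalar estimates with vectors (e.g. per-genotype effects) and the variances with within-study covariance matrices, but the weighting principle is the same.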

  9. An overview on the use of backscattered sound for measuring suspended particle size and concentration profiles in non-cohesive inorganic sediment transport studies

    NASA Astrophysics Data System (ADS)

    Thorne, Peter D.; Hurther, David

    2014-02-01

    For over two decades, coastal marine scientists studying boundary layer sediment transport processes have been using, and developing, the application of sound for high temporal-spatial resolution measurements of suspended particle size and concentration profiles. To extract the suspended sediment parameters from the acoustic data requires an understanding of the interaction of sound with a suspension of sediments and an inversion methodology. This understanding is distributed around journals in a number of scientific fields and there is no single article that succinctly draws together the different components. In the present work the aim is to provide an overview on the acoustic approach to measuring suspended sediment parameters and assess its application in the study of non-cohesive inorganic suspended sediment transport processes.

  10. Applications of a constrained mechanics methodology in economics

    NASA Astrophysics Data System (ADS)

    Janová, Jitka

    2011-11-01

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the undergraduate level and (ii) to enable the students to gain a deeper understanding of the principles and methods routinely used in mechanics by looking at the well-known methodology from the different perspective of economics. Two constrained dynamic economic problems are presented using the economic terminology in an intuitive way. First, the Phillips model of the business cycle is presented as a system of forced oscillations and the general problem of two interacting economies is solved by the nonholonomic dynamics approach. Second, the Cass-Koopmans-Ramsey model of economic growth is solved as a variational problem with a velocity-dependent constraint using the vakonomic approach. The specifics of the solution interpretation in economics compared to mechanics are discussed in detail, a discussion of the nonholonomic and vakonomic approaches to constrained problems in mechanics and economics is provided, and an economic interpretation of the Lagrange multipliers (possibly surprising for the students of physics) is carefully explained. This paper can be used by undergraduate students of physics interested in interdisciplinary physics applications to gain an understanding of the current scientific approach to economics based on a physical background, or by university teachers as an attractive supplement to classical mechanics lessons.
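
    The Phillips model's reduction to a damped, forced oscillator can be made concrete by numerically integrating y'' + 2*zeta*omega*y' + omega**2*y = F*cos(w_f*t). The parameter values here are illustrative, not taken from the paper:

```python
# RK4 integration of a damped, forced linear oscillator: the form the
# Phillips business-cycle model takes when written as a second-order ODE.
import math

def rk4_step(f, t, state, h):
    """One fourth-order Runge-Kutta step for a first-order system."""
    k1 = f(t, state)
    k2 = f(t + h / 2, [s + h / 2 * k for s, k in zip(state, k1)])
    k3 = f(t + h / 2, [s + h / 2 * k for s, k in zip(state, k2)])
    k4 = f(t + h, [s + h * k for s, k in zip(state, k3)])
    return [s + h / 6 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

# Illustrative parameters: damping ratio, natural frequency,
# forcing amplitude and forcing frequency.
zeta, omega, F, w_f = 0.2, 1.0, 0.5, 1.0

def rhs(t, state):
    y, v = state   # output gap and its rate of change
    return [v, F * math.cos(w_f * t) - 2 * zeta * omega * v - omega ** 2 * y]

state, t, h = [0.0, 0.0], 0.0, 0.01
trajectory = []
for _ in range(5000):          # integrate to t = 50
    state = rk4_step(rhs, t, state, h)
    t += h
    trajectory.append(state[0])
```

    At resonance (w_f = omega) the steady-state amplitude is F/(2*zeta*omega**2) = 1.25 here, which the tail of the trajectory settles onto once the transients decay.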

  11. Bringing the Unidata IDV to the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.; Oxelson Ganter, J.

    2015-12-01

    Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who depend upon it and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of cloud computing: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform and include a brief discussion of the underlying technologies involved.

  12. Metal Based Synthetic Strategies and the Examination of Structure Determining Factors in Alkaline Earth Metal Compounds

    NASA Astrophysics Data System (ADS)

    Takahashi, Yuriko

    Recent decades have witnessed a large expansion of organometallic heavier alkaline earth metal chemistry. However, continued growth of this promising area of chemistry has been slowed by severe restrictions and limitations in viable synthetic methodologies, leading to difficulties in preparing and characterizing the target compounds. There is clearly a need for the further development of synthetic methodologies and detailed structure-function analysis that will promote the further advancement of organoalkaline earth metal chemistry in applications as diverse as materials chemistry and catalysis. This thesis work greatly extends the synthetic options currently available for organoalkaline earth metal species by introducing redox transmetallation protolysis (RTP), a reaction based on the readily available Ph3Bi as a non-toxic transmetallation agent. Based on a straightforward one-pot procedure and work-up, Ph3Bi-based RTP presents a powerful synthetic alternative for the facile preparation of a large variety of heavy alkaline earth metal compounds. The second part of the thesis explores the effect of secondary non-covalent interactions on the coordination chemistry as well as the thermal properties of a series of novel alkali, alkaline earth, rare earth, and heterobimetallic alkali/alkaline earth fluoroalkoxides. These compounds showcase the significance of non-covalent M···F-C and agostic interactions on metal stabilization and structural features, providing critical input on ligand design for advanced metal organic chemical vapor deposition (MOCVD) precursor materials. This work also showcases the impact of M···F-C interactions over M···co-ligand coordination, a critical precursor design element as well.

  13. Building an adaptive agent to monitor and repair the electrical power system of an orbital satellite

    NASA Technical Reports Server (NTRS)

    Tecuci, Gheorghe; Hieb, Michael R.; Dybala, Tomasz

    1995-01-01

    Over several years we have developed a multistrategy apprenticeship learning methodology for building knowledge-based systems. Recently we have developed and applied our methodology to building intelligent agents. This methodology allows a subject matter expert to build an agent in the same way in which the expert would teach a human apprentice. The expert will give the agent specific examples of problems and solutions, explanations of these solutions, or supervise the agent as it solves new problems. During such interactions, the agent learns general rules and concepts, continuously extending and improving its knowledge base. In this paper we present initial results on applying this methodology to build an intelligent adaptive agent for monitoring and repair of the electrical power system of an orbital satellite, stressing the interaction with the expert during apprenticeship learning.

  14. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
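
    The PFA idea of propagating parameter uncertainty through an engineering model to a failure probability is often realized by Monte Carlo sampling. A hedged stress-strength sketch with invented distributions (not the report's models):

```python
# Monte Carlo stress-strength interference: sample uncertain load and
# strength, count cases where the load exceeds the strength.
import random

random.seed(42)   # reproducible sketch

def failure_probability(n_samples=100_000):
    """Estimate P(stress >= strength) by direct sampling."""
    failures = 0
    for _ in range(n_samples):
        stress = random.gauss(400.0, 40.0)    # applied load, MPa (invented)
        strength = random.gauss(600.0, 50.0)  # material strength, MPa
        if stress >= strength:
            failures += 1
    return failures / n_samples

p_fail = failure_probability()
```

    For these invented normals the margin is N(200, sqrt(40**2 + 50**2)), so the true failure probability is about 9e-4; the estimate converges to it as n_samples grows. In a PFA setting, the sampled distributions would themselves be updated against test and flight experience.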

  15. Methodology for identification of pore forming antimicrobial peptides from soy protein subunits β-conglycinin and glycinin.

    PubMed

    Xiang, Ning; Lyu, Yuan; Zhu, Xiao; Bhunia, Arun K; Narsimhan, Ganesan

    2016-11-01

    Antimicrobial peptides (AMPs) inactivate microbial cells through pore formation in the cell membrane. Because of their different mode of action compared to antibiotics, AMPs can be effectively used to combat drug-resistant bacteria in human health. AMPs can also be used to replace antibiotics in animal feed and can be immobilized on food packaging films. In this research, we developed a methodology based on mechanistic evaluation of peptide-lipid bilayer interaction to identify AMPs from soy protein. Production of AMPs from soy protein is an attractive, cost-saving alternative for commercial consideration, because soy protein is an abundant and common protein resource. This methodology is also applicable for identification of AMPs from any protein. Initial screening of peptide segments from soy glycinin (11S) and soy β-conglycinin (7S) subunits was based on their hydrophobicity, hydrophobic moment and net charge. A delicate balance between hydrophilic and hydrophobic interactions is necessary for pore formation: high hydrophobicity decreases the peptide solubility in the aqueous phase, whereas high hydrophilicity limits binding of the peptide to the bilayer. Out of several candidates chosen from the initial screening, two peptides satisfied the criteria for antimicrobial activity, viz. (i) lipid-peptide binding in the surface state and (ii) pore formation in the transmembrane state of the aggregate. This method of identification of antimicrobial activity via molecular dynamics simulation was shown to be robust in that it is insensitive to the number of peptides employed in the simulation, the initial peptide structure and the force field. Their antimicrobial activity against Listeria monocytogenes and Escherichia coli was further confirmed by a spot-on-lawn test. Copyright © 2016 Elsevier Inc. All rights reserved.
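
    The initial screening criteria named above (hydrophobicity, hydrophobic moment and net charge) are straightforward to compute. A minimal sketch using the commonly tabulated Eisenberg consensus scale; the peptide sequence is hypothetical, not one of the soy-derived candidates:

```python
# Screening descriptors for a candidate AMP: mean hydrophobicity,
# Eisenberg hydrophobic moment, and formal net charge.
import math

# Eisenberg consensus hydrophobicity values for the residues used below.
H = {'A': 0.62, 'G': 0.48, 'I': 1.38, 'K': -1.50, 'L': 1.06,
     'R': -2.53, 'V': 1.08, 'W': 0.81}
CHARGE = {'K': 1, 'R': 1, 'D': -1, 'E': -1}

def mean_hydrophobicity(seq):
    return sum(H[a] for a in seq) / len(seq)

def hydrophobic_moment(seq, delta_deg=100.0):
    """Eisenberg mu_H for an ideal alpha-helix (100 degrees per residue)."""
    d = math.radians(delta_deg)
    x = sum(H[a] * math.cos(i * d) for i, a in enumerate(seq))
    y = sum(H[a] * math.sin(i * d) for i, a in enumerate(seq))
    return math.hypot(x, y) / len(seq)

def net_charge(seq):
    """Formal charge at neutral pH from Lys/Arg (+) and Asp/Glu (-)."""
    return sum(CHARGE.get(a, 0) for a in seq)

peptide = "KLWKKILKVAGR"   # hypothetical amphipathic candidate
screen = (mean_hydrophobicity(peptide),
          hydrophobic_moment(peptide),
          net_charge(peptide))
```

    A high moment with a cationic net charge flags a segment as a candidate; the abstract's subsequent binding and pore-formation criteria then require molecular dynamics simulation rather than descriptor arithmetic.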

  16. SafeNet: a methodology for integrating general-purpose unsafe devices in safe-robot rehabilitation systems.

    PubMed

    Vicentini, Federico; Pedrocchi, Nicola; Malosio, Matteo; Molinari Tosatti, Lorenzo

    2014-09-01

    Robot-assisted neurorehabilitation often involves networked systems of sensors ("sensory rooms") and powerful devices in physical interaction with weak users, so safety is unquestionably a primary concern. Some lightweight robot platforms and purpose-designed devices include safety properties through redundant sensors or intrinsically safe design (e.g. compliance and backdrivability, limited exchange of energy). Nonetheless, the entire "sensory room" must be fail-safe and safely monitored as a system at large. Yet the sensor capabilities and control algorithms used in functional therapies generally require frequent updates or re-configurations, making a safety-grade release of such devices hardly sustainable in cost-effectiveness and development time. As a result, promising integrated platforms for human-in-the-loop therapies have not found clinical application and manufacturing support, because global fail-safe properties could not be maintained. In the general context of cross-machinery safety standards, the paper presents a methodology called SafeNet for extending the safety of Human Robot Interaction (HRI) systems that use unsafe components, including sensors and controllers. SafeNet treats the robotic system as a device at large and applies the principles of functional safety (as in ISO 13849-1) through a set of architectural procedures and implementation rules. The enabled capability of monitoring a network of unsafe devices through redundant computational nodes allows the use of any custom sensors and algorithms, usually planned and assembled at therapy planning-time rather than at platform design-time. A case study presents an actual implementation of the proposed methodology: a specific architectural solution is applied to an example of robot-assisted upper-limb rehabilitation with online motion tracking. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
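
    The core idea of monitoring unsafe devices through redundant computational nodes can be illustrated with a minimal heartbeat watchdog. All names below are hypothetical, and real functional-safety implementations (per ISO 13849-1) involve certified channels and diagnostics far beyond this sketch.

    ```python
    import time

    class HeartbeatMonitor:
        """One computational node that tracks the last heartbeat
        of every unsafe device. Hypothetical sketch, not SafeNet's API."""
        def __init__(self, timeout_s):
            self.timeout_s = timeout_s
            self.last_seen = {}

        def heartbeat(self, device_id, t=None):
            self.last_seen[device_id] = time.monotonic() if t is None else t

        def healthy(self, device_id, now=None):
            now = time.monotonic() if now is None else now
            t = self.last_seen.get(device_id)
            return t is not None and (now - t) <= self.timeout_s

    def system_safe(monitors, devices, now=None):
        """Redundant check: the system is considered safe only if ALL
        monitors agree that ALL devices are alive; any missed deadline
        on any node triggers a safe stop."""
        return all(m.healthy(d, now) for m in monitors for d in devices)
    ```

    A supervisor would call `system_safe` every control cycle and command a safe stop of the actuators the moment it returns False.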

  17. Multi-scale characterization of the energy landscape of proteins with application to the C3D/Efb-C complex.

    PubMed

    Haspel, Nurit; Geisbrecht, Brian V; Lambris, John; Kavraki, Lydia

    2010-03-01

    We present a novel multi-level methodology to explore and characterize the low-energy landscape and thermodynamics of proteins. Traditional conformational search methods typically explore only a small portion of the conformational space of proteins and are hard to apply to large proteins because of the amount of computation required. In our multi-scale approach, we first provide an initial characterization of the equilibrium-state ensemble of a protein using an efficient computational conformational sampling method. We then enrich the obtained ensemble by performing short Molecular Dynamics (MD) simulations with selected conformations from the ensemble as starting points. To facilitate analysis of the results, we project the resulting conformations onto a low-dimensional landscape, which lets us focus efficiently on important interactions and examine low-energy regions. This methodology samples the low-energy landscape more extensively than an MD simulation starting from a single crystal structure, as it explores multiple trajectories of the protein. It thus gives a broader view of protein dynamics and can help in understanding complex binding, improving docking results, and more. In this work, we apply the methodology to provide an extensive characterization of the bound complexes of the C3d fragment of human complement component C3 with one of its powerful bacterial inhibitors, the inhibitory domain of the Staphylococcus aureus extracellular fibrinogen-binding protein (Efb-C), and with two of its mutants. We characterize several important interactions along the binding interface and define low free energy regions in the three complexes. Proteins 2010. (c) 2009 Wiley-Liss, Inc.
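
    The projection onto a low-dimensional landscape mentioned above is commonly done with principal component analysis of the ensemble. A minimal sketch, assuming each conformation is a flattened coordinate vector; this is a generic PCA projection, not necessarily the authors' exact procedure.

    ```python
    import numpy as np

    def project_conformations(X, n_components=2):
        """Project an ensemble of conformations onto its top principal
        components. X holds one flattened conformation (3N coordinates)
        per row. Returns the low-dimensional coordinates and the
        fraction of variance captured by each retained component."""
        Xc = X - X.mean(axis=0)               # center the ensemble
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        coords = Xc @ Vt[:n_components].T     # low-dimensional coordinates
        var = S**2 / np.sum(S**2)             # explained variance fractions
        return coords, var[:n_components]
    ```

    Free-energy landscapes are then typically estimated by histogramming `coords` and taking -kT log of the populations.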

  18. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic because one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center-of-mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well-known Gay-Berne ellipsoid-of-revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and time scales. PMID:19281167

  19. Synthesis, structure and DFT study of cymantrenyl Fischer carbene complexes of group VI and VII transition metals

    NASA Astrophysics Data System (ADS)

    Fraser, Roan; van Rooyen, Petrus H.; Landman, Marilé

    2016-02-01

    Bi- and trimetallic carbene complexes of group VI and VII transition metals (Cr, Mo, W, Mn and Re), with CpMn(CO)3 as the initial synthon, have been synthesised according to the classical Fischer methodology. Crystal structures of the novel carbene complexes with general formula [Mx(CO)y-1{C(OEt)(MnCp(CO)3)}], where x = 1 then y = 3 or 6; x = 2 then y = 10, of the complexes are reported. A density functional theory (DFT) study was undertaken to determine natural bonding orbitals (NBOs) and conformational as well as isomeric aspects of the polymetallic complexes. Application of the second-order perturbation theory (SOPT) of the natural bond orbital (NBO) method revealed stabilizing interactions between the methylene C-H bonds and the carbonyl ligands of the carbene metal moiety. These stabilization interactions show a linear decrease for the group VI metal carbene complexes down the group.

  20. Methods for simulation-based analysis of fluid-structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
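
    The POD step can be sketched with a plain SVD of a snapshot matrix; the Galerkin projection then contracts the full-order operator with the retained modes. A minimal sketch under the assumption of a linear full-order operator (real aeroelastic ROMs are considerably more involved).

    ```python
    import numpy as np

    def pod_basis(snapshots, energy=0.99):
        """Build a POD basis from a snapshot matrix (one state vector
        per column), keeping just enough modes to capture the requested
        fraction of the snapshot 'energy' (sum of squared singular
        values)."""
        U, S, _ = np.linalg.svd(snapshots, full_matrices=False)
        cum = np.cumsum(S**2) / np.sum(S**2)
        r = int(np.searchsorted(cum, energy)) + 1
        return U[:, :r], S[:r]

    def reduce_operator(A, Phi):
        """Galerkin-project a linear operator A onto the POD basis Phi:
        A_r = Phi^T A Phi, giving an r x r reduced-order model."""
        return Phi.T @ A @ Phi
    ```

    The reduced system evolves only the r modal coefficients, which is what makes parametric sweeps affordable.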

  1. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws were discovered in the analysis given by Lamport and Melliar-Smith, even though their presentation is unusually precise and detailed; it seems these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects them but is also more precise and easier to follow. Some of the corrections require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment, and this application provides a demonstration of some of the capabilities of the system.
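
    The averaging rule at the heart of the interactive convergence algorithm is simple to state: each clock averages its estimated skews relative to all clocks, first replacing any estimate larger in magnitude than a threshold Δ (presumed faulty) with zero. A didactic sketch of that averaging rule only, not of the full algorithm or its fault-tolerance proof:

    ```python
    def interactive_convergence_adjust(deltas, Delta):
        """One round of clock adjustment as seen from a single clock.
        `deltas[j]` is this clock's estimate of its skew relative to
        clock j (its own entry is 0.0). Estimates exceeding Delta in
        magnitude are presumed faulty and replaced by 0 before taking
        the egocentric mean, which is the correction to apply."""
        clipped = [d if abs(d) <= Delta else 0.0 for d in deltas]
        return sum(clipped) / len(clipped)
    ```

    The property the formal verification establishes is that, under bounds on drift and read error and with fewer than a third of the clocks faulty, repeated application of this rule keeps all good clocks within a fixed skew of one another.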

  2. A methodology for evaluation of an interactive multispectral image processing system

    NASA Technical Reports Server (NTRS)

    Kovalick, William M.; Newcomer, Jeffrey A.; Wharton, Stephen W.

    1987-01-01

    Because of the considerable cost of an interactive multispectral image processing system, an evaluation of a prospective system should be performed to ascertain if it will be acceptable to the anticipated users. Evaluation of a developmental system indicated that the important system elements include documentation, user friendliness, image processing capabilities, and system services. The criteria and evaluation procedures for these elements are described herein. The following factors contributed to the success of the evaluation of the developmental system: (1) careful review of documentation prior to program development, (2) construction and testing of macromodules representing typical processing scenarios, (3) availability of other image processing systems for referral and verification, and (4) use of testing personnel with an applications perspective and experience with other systems. This evaluation was done in addition to and independently of program testing by the software developers of the system.

  3. The Methodology for Developing Mobile Agent Application for Ubiquitous Environment

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Kazutaka; Yoshioka, Nobukazu; Honiden, Shinichi

    This study provides a methodology that enables flexible and reusable development of mobile agent applications for mobility-aware indoor environments. The methodology, named the Workflow-awareness model, is based on the concept of a pair of mobile agents cooperating to perform a given task. A monolithic mobile agent application with numerous concerns in a mobility-aware setting is divided into a master agent (MA) and a shadow agent (SA) according to the type of task. The MA executes the main application logic, which includes monitoring a user's physical movement and coordinating various services. The SA performs additional, environment-dependent tasks to help the MA execute efficiently without losing application logic. "Workflow-awareness" (WFA) means that the SA knows the MA's execution state transitions, so the SA can provide the proper task at the proper time. A prototype implementation of the methodology makes practical use of AspectJ, which automates WFA by weaving communication modules into both the MA and the SA. The usefulness of the methodology is analyzed with respect to efficiency and software engineering. Regarding efficiency, the overhead of WFA is small relative to total execution time; from a software engineering view, WFA provides a mechanism for deploying one application in various situations.
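
    The MA/SA split can be illustrated with a toy pair of agents in which the shadow agent reacts to the master's workflow state transitions. All names are hypothetical, and the explicit callback below stands in for the communication modules that the paper weaves in with AspectJ:

    ```python
    class ShadowAgent:
        """Shadow agent that tracks the master agent's workflow state
        (the WFA idea) and performs an environment-dependent support
        task for each state."""
        def __init__(self):
            self.log = []

        def on_state(self, state):
            # stand-in for a real environment-dependent support task
            self.log.append(f"support:{state}")

    class MasterAgent:
        """Master agent running the main application logic."""
        def __init__(self, shadow):
            self.shadow = shadow
            self.state = "idle"

        def transition(self, state):
            self.state = state
            self.shadow.on_state(state)  # WFA: shadow follows the workflow
    ```

    In the paper's design this notification is not hand-coded: AspectJ weaves it into the MA's state transitions, which is what keeps the application logic and the awareness concern separated.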

  4. Construction and photophysical properties of organic-inorganic nanonetworks based on oligo(phenylenevinylene) and functionalized gold nanoparticles.

    PubMed

    Yang, Jien; Liu, Xiaofeng; Huang, Changshui; Zhou, Chunjie; Li, Yuliang; Zhu, Daoben

    2010-02-22

    Novel organic-inorganic nanonetworks of oligo(phenylenevinylene) (OPV) and gold nanoparticles (GNPs) have been synthesized by an amine-based epoxide ring-opening reaction. The resulting OPV-GNP nanocomposites exhibit homogeneous and well-defined interfaces between the organic ligands and the inorganic nanoparticles, thereby promoting efficient electronic interfacial interaction between the two constituents. The functionalized gold nanoparticles serve as chemical reagents for the construction of the nanohybrids, while the epoxide-terminated OPV acts as a linkage between gold nanoparticles. The new architecture provides a facile methodology for the fabrication of novel organic-inorganic nanohybrids under relatively mild conditions, which facilitates further applications of hybrid materials.

  5. Application of ion chromatography in pharmaceutical and drug analysis.

    PubMed

    Jenke, Dennis

    2011-08-01

    Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript reviews IC applications for the determination of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses, and thus serves as a resource for investigators seeking insight into the use of IC in this field of application.

  6. Mobile Eye Tracking Methodology in Informal E-Learning in Social Groups in Technology-Enhanced Science Centres

    ERIC Educational Resources Information Center

    Magnussen, Rikke; Zachariassen, Maria; Kharlamov, Nikita; Larsen, Birger

    2017-01-01

    This paper presents a methodological discussion of the potential and challenges of involving mobile eye tracking technology in studies of knowledge generation and learning in a science centre context. The methodological exploration is based on eye-tracking studies of audience interaction and knowledge generation in the technology-enhanced health…

  7. Adapting a Methodology from Mathematics Education Research to Chemistry Education Research: Documenting Collective Activity

    ERIC Educational Resources Information Center

    Cole, Renee; Becker, Nicole; Towns, Marcy; Sweeney, George; Wawro, Megan; Rasmussen, Chris

    2012-01-01

    In this report, we adapt and extend a methodology for documenting the collective production of meaning in a classroom community. A cornerstone of the methodological approach that we develop is a close examination of classroom discourse. Our efforts to analyze the collective production of meaning by examining classroom interaction are compatible…

  8. Research in Modeling and Simulation for Airspace Systems Innovation

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.; Kimmel, William M.; Welch, Sharon S.

    2007-01-01

    This viewgraph presentation provides an overview of some of the applied research and simulation methodologies at the NASA Langley Research Center that support aerospace systems innovation. Risk assessment methodologies, complex systems design and analysis methodologies, and aerospace operations simulations are described. Potential areas for future research and collaboration using interactive and distributed simulations are also proposed.

  9. Methodology of management of dredging operations II. Applications.

    PubMed

    Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D

    2006-04-01

    This paper presents a new methodology for the management of dredging operations. Derived in part from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the strengths and complementarities of its predecessors. The methodology was applied at the Port of Dunkirk (France). A characterization of the site's sediments allowed the Port to be divided into zones of probable sediment homogeneity. Moreover, sources of pollution were identified with a view to prevention. Ways of improving the dredged waste were also developed to answer regional needs, from the standpoint of competitive and territorial intelligence; their development required pooling resources among professionals, research centres and local communities, according to principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (MCDMA) tool was used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications have confirmed the relevance of the methodology for the management of dredging operations.
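
    The final MCDMA step, choosing the most relevant scenario, can be illustrated with the simplest aggregation scheme, a weighted sum over normalized criteria scores. The paper does not specify its MCDMA tool, so this is a generic sketch with hypothetical scenario names:

    ```python
    def rank_scenarios(scores, weights):
        """Weighted-sum multicriteria aggregation. `scores` maps each
        scenario name to a vector of normalized scores, one per
        criterion (e.g. cost, environmental impact, reuse potential),
        where higher is better. Returns scenario names, best first."""
        ranked = sorted(
            scores.items(),
            key=lambda kv: sum(w * s for w, s in zip(weights, kv[1])),
            reverse=True)
        return [name for name, _ in ranked]
    ```

    Real MCDMA tools (outranking or reference-point methods, for instance) avoid some known pitfalls of the weighted sum, but the input structure is the same: scenarios, criteria, and weights elicited from the decision makers.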

  10. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation, because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite), a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how the methodologies could be used with the ESA PSS-05-0 Standard. Our outcomes may be used generally by teams who need to build small satellites, and in particular they will be used when we build the on-board software applications for the SATEX-II.

  11. Toward Theory-Based Research in Political Communication.

    ERIC Educational Resources Information Center

    Simon, Adam F.; Iyengar, Shanto

    1996-01-01

    Praises the theoretical and methodological potential of the field of political communication. Calls for greater interaction and cross fertilization among the fields of political science, sociology, economics, and psychology. Briefly discusses relevant research methodologies. (MJP)

  12. Mobile Tele-Mental Health: Increasing Applications and a Move to Hybrid Models of Care

    PubMed Central

    Chan, Steven Richard; Torous, John; Hinton, Ladson; Yellowlees, Peter

    2014-01-01

    Mobile telemental health is defined as the use of mobile phones and other wireless devices as applied to psychiatric and mental health practice. Its applications include treatment monitoring and adherence, health promotion, ecological momentary assessment, and decision support systems. The advantages of mobile telemental health are underscored by its interactivity, just-in-time interventions, low resource requirements and portability. Challenges in realizing this potential include the low penetration rates of health applications on mobile devices, due in part to health literacy, the delay of published research in evaluating newer technologies, and outdated research methodologies. Despite such challenges, one immediate opportunity for mobile telemental health is using mobile devices as videoconferencing media for psychotherapy and psychosocial interventions, enhanced by novel sensor-based monitoring and behavior-prediction algorithms. This paper provides an overview of mobile telemental health and its current trends, as well as future opportunities as applied to patient care in both academic research and commercial ventures. PMID:27429272

  13. Efficient calculation of beyond RPA correlation energies in the dielectric matrix formalism

    NASA Astrophysics Data System (ADS)

    Beuerle, Matthias; Graf, Daniel; Schurkus, Henry F.; Ochsenfeld, Christian

    2018-05-01

    We present efficient methods to calculate beyond random phase approximation (RPA) correlation energies for molecular systems with up to 500 atoms. To reduce the computational cost, we employ the resolution-of-the-identity and a double-Laplace transform of the non-interacting polarization propagator in conjunction with an atomic orbital formalism. Further improvements are achieved using integral screening and the introduction of Cholesky decomposed densities. Our methods are applicable to the dielectric matrix formalism of RPA including second-order screened exchange (RPA-SOSEX), the RPA electron-hole time-dependent Hartree-Fock (RPA-eh-TDHF) approximation, and RPA renormalized perturbation theory using an approximate exchange kernel (RPA-AXK). We give an application of our methodology by presenting RPA-SOSEX benchmark results for the L7 test set of large, dispersion dominated molecules, yielding a mean absolute error below 1 kcal/mol. The present work enables calculating beyond RPA correlation energies for significantly larger molecules than possible to date, thereby extending the applicability of these methods to a wider range of chemical systems.
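
    The Laplace-transform trick referred to above rests on the identity 1/x = ∫₀^∞ e^(-xt) dt, which turns energy denominators into short sums over quadrature points. A generic numerical check of that identity with Gauss-Laguerre quadrature, not the paper's double-Laplace implementation:

    ```python
    import numpy as np

    def inverse_by_laplace(x, n_nodes=40):
        """Approximate 1/x via the Laplace identity
        1/x = integral_0^inf exp(-x t) dt, evaluated with
        Gauss-Laguerre quadrature. The same trick lets RPA-type
        methods replace orbital-energy denominators by a short
        sum over quadrature points, decoupling the indices."""
        t, w = np.polynomial.laguerre.laggauss(n_nodes)
        # integral of exp(-t) f(t) ~= sum w_i f(t_i), f(t) = exp(-(x-1)t)
        return float(np.sum(w * np.exp(-(x - 1.0) * t)))
    ```

    In practice only a handful of optimized quadrature points are needed, which is what makes the atomic-orbital reformulation of the polarization propagator efficient.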

  14. Development of a Multi-Disciplinary Computing Environment (MDICE)

    NASA Technical Reports Server (NTRS)

    Kingsley, Gerry; Siegel, John M., Jr.; Harrand, Vincent J.; Lawrence, Charles; Luker, Joel J.

    1999-01-01

    The growing need for and importance of multi-component and multi-disciplinary engineering analysis has been understood for many years. For many applications, loose (or semi-implicit) coupling is optimal, and allows the use of various legacy codes without requiring major modifications. For this purpose, CFDRC and NASA LeRC have developed a computational environment to enable coupling between various flow analysis codes at several levels of fidelity. This has been referred to as the Visual Computing Environment (VCE), and is being successfully applied to the analysis of several aircraft engine components. Recently, CFDRC and AFRL/VAAC (WL) have extended the framework and scope of VCE to enable complex multi-disciplinary simulations. The chosen initial focus is on aeroelastic aircraft applications. The developed software is referred to as MDICE-AE, an extensible system suitable for integration of several engineering analysis disciplines. This paper describes the methodology, basic architecture, chosen software technologies, salient library modules, and the current status of and plans for MDICE. A fluid-structure interaction application is described in a separate companion paper.

  15. Mobile Tele-Mental Health: Increasing Applications and a Move to Hybrid Models of Care.

    PubMed

    Chan, Steven Richard; Torous, John; Hinton, Ladson; Yellowlees, Peter

    2014-05-06

    Mobile telemental health is defined as the use of mobile phones and other wireless devices as applied to psychiatric and mental health practice. Its applications include treatment monitoring and adherence, health promotion, ecological momentary assessment, and decision support systems. The advantages of mobile telemental health are underscored by its interactivity, just-in-time interventions, low resource requirements and portability. Challenges in realizing this potential include the low penetration rates of health applications on mobile devices, due in part to health literacy, the delay of published research in evaluating newer technologies, and outdated research methodologies. Despite such challenges, one immediate opportunity for mobile telemental health is using mobile devices as videoconferencing media for psychotherapy and psychosocial interventions, enhanced by novel sensor-based monitoring and behavior-prediction algorithms. This paper provides an overview of mobile telemental health and its current trends, as well as future opportunities as applied to patient care in both academic research and commercial ventures.

  16. Methodical and technological aspects of creation of interactive computer learning systems

    NASA Astrophysics Data System (ADS)

    Vishtak, N. M.; Frolov, D. A.

    2017-01-01

    The article presents a methodology for developing an interactive computer training system for power plant personnel. The methods used in the work are a generalization of scientific and methodological sources on the use of computer-based training systems in vocational education, methods of system analysis, and methods of structural and object-oriented modeling of information systems. The relevance of developing interactive computer training systems for personnel preparation in educational and training centres is demonstrated. The development stages of such systems are identified, and the factors in the efficient use of an interactive computer training system are analysed. An algorithm of the work to be performed at each development stage is offered, enabling optimization of the time, financial and labor expenditure on creating the interactive computer training system.

  17. Reduction of streamflow monitoring networks by a reference point approach

    NASA Astrophysics Data System (ADS)

    Cetinkaya, Cem P.; Harmancioglu, Nilgun B.

    2014-05-01

    Adoption of an integrated approach to water management strongly forces policy and decision-makers to focus on hydrometric monitoring systems as well. Existing hydrometric networks need to be assessed and revised against the requirements on water quantity data to support integrated management. One of the questions that a network assessment study should resolve is whether a current monitoring system can be consolidated in view of the increased expenditures in time, money and effort imposed on the monitoring activity. Within the last decade, governmental monitoring agencies in Turkey have foreseen an audit of all their basin networks in view of prevailing economic pressures. In particular, they question how to decide whether monitoring should be continued or terminated at a particular site in a network. The present study was initiated to address this question by examining the applicability of a method called the "reference point approach" (RPA) for network assessment and reduction purposes. The main objective of the study is to develop an easily applicable and flexible network reduction methodology, focusing mainly on the assessment of the performance of existing streamflow monitoring networks in view of variable operational purposes. The methodology is applied to 13 hydrometric stations in the Gediz Basin, along the Aegean coast of Turkey. The results have shown that the simplicity of the method, in contrast to more complicated computational techniques, is an asset that facilitates the involvement of decision makers in applying the methodology, for a more interactive assessment procedure between the monitoring agency and the network designer. The method permits ranking of hydrometric stations with regard to multiple objectives of monitoring and the desired attributes of the basin network. Another distinctive feature of the approach is that it also assists decision making in cases with limited data and metadata. These features of the RPA highlight its advantages over existing network assessment and reduction methods.
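
    The ranking of stations against multiple monitoring objectives can be illustrated by measuring each station's distance to a reference (ideal) point in a normalized criteria space. This is a generic sketch of a reference-point ranking with hypothetical station names, not the paper's exact RPA formulation:

    ```python
    import math

    def rank_stations(performance, reference):
        """Rank monitoring stations by closeness to a reference (ideal)
        point. `performance[name]` is a vector of normalized scores,
        one per monitoring objective. Stations nearest the reference
        rank first; the farthest are candidates for discontinuation."""
        return sorted(performance,
                      key=lambda name: math.dist(performance[name],
                                                 reference))
    ```

    Weighting the coordinates before taking distances is the usual way to let the monitoring agency express priorities among objectives.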

  18. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  19. Review of health information technology usability study methodologies

    PubMed Central

    Bakken, Suzanne

    2011-01-01

    Usability factors are a major obstacle to health information technology (IT) adoption. The purpose of this paper is to review and categorize health IT usability study methods and to provide practical guidance on health IT usability evaluation. 2025 references were initially retrieved from the Medline database from 2003 to 2009 that evaluated health IT used by clinicians. Titles and abstracts were first reviewed for inclusion. Full-text articles were then examined to identify final eligibility studies. 629 studies were categorized into the five stages of an integrated usability specification and evaluation framework that was based on a usability model and the system development life cycle (SDLC)-associated stages of evaluation. Theoretical and methodological aspects of 319 studies were extracted in greater detail and studies that focused on system validation (SDLC stage 2) were not assessed further. The number of studies by stage was: stage 1, task-based or user–task interaction, n=42; stage 2, system–task interaction, n=310; stage 3, user–task–system interaction, n=69; stage 4, user–task–system–environment interaction, n=54; and stage 5, user–task–system–environment interaction in routine use, n=199. The studies applied a variety of quantitative and qualitative approaches. Methodological issues included lack of theoretical framework/model, lack of details regarding qualitative study approaches, single evaluation focus, environmental factors not evaluated in the early stages, and guideline adherence as the primary outcome for decision support system evaluations. Based on the findings, a three-level stratified view of health IT usability evaluation is proposed and methodological guidance is offered based upon the type of interaction that is of primary interest in the evaluation. PMID:21828224

  20. Image Representation and Interactivity: An Exploration of Utility Values, Information-Needs and Image Interactivity

    ERIC Educational Resources Information Center

    Lewis, Elise C.

    2011-01-01

    This study was designed to explore the relationships between users and interactive images. Three factors were identified and provided different perspectives on how users interact with images: image utility, information-need, and images with varying levels of interactivity. The study used a mixed methodology to gain a more comprehensive…

  1. Reusable Software Usability Specifications for mHealth Applications.

    PubMed

    Cruz Zapata, Belén; Fernández-Alemán, José Luis; Toval, Ambrosio; Idri, Ali

    2018-01-25

    One of the key factors in the adoption of mobile technologies, and in particular of mobile health applications, is usability. A usable application is easier to use and understand and improves users' interaction with it. This paper proposes a software requirements catalog for usable mobile health applications, which can be used for the development of new applications or the evaluation of existing ones. The catalog is based on the main sources identified in the literature on usability and mobile health applications. It is organized according to the ISO/IEC/IEEE 29148:2011 standard and follows the SIREN methodology for creating reusable catalogs. The applicability of the catalog was verified by creating an audit method, which was used to evaluate a real app, S Health, created by Samsung Electronics Co. The usability requirements catalog, together with the audit method, identified several usability flaws in the evaluated app, which scored 83%. Some flaws related to the navigation pattern were detected, and further issues related to the startup experience, empty screens and writing style were also found. The way a user navigates through an application improves or deteriorates the user's experience with it. We proposed a reusable usability catalog and an audit method, and used them to evaluate a mobile health application. An audit report was created listing the usability issues identified in the evaluated application.
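
    The audit score reported above (83%) amounts to the fraction of catalog requirements the app satisfies. A trivial sketch, with hypothetical requirement identifiers; the SIREN-based catalog itself defines the real requirement list:

    ```python
    def audit_score(results):
        """Score an app against a usability requirements catalog.
        `results` maps requirement id -> True/False (satisfied or not).
        Returns the percentage of satisfied requirements."""
        return 100.0 * sum(results.values()) / len(results)
    ```

    Real audits typically also weight requirements by severity and record evidence per finding for the audit report.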

  2. Using Modern Methodologies with Maintenance Software

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance, classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks, and those needs often spring up unexpectedly. Keeping track of what everyone is working on is also difficult because each person works on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority; the team specifies the sprint length, usually a month or less. Scrum is typically used for new development of a single application, and its roles comprise a scrum master, a facilitator who keeps everything moving smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: it maintains many software applications, its team members work on disparate applications, it has many users, and its work is interruptible by mission needs, issues, and requirements. In order to use Scrum, the methodology needed to be adapted to MPS; Scrum was chosen because it is adaptable. This paper describes the development of a process for using Scrum, a new development methodology, with a team that works on disparate, interruptible tasks across multiple software applications.

  3. Electronic Structure, Dielectric Response, and Surface Charge Distribution of RGD (1FUV) Peptide

    PubMed Central

    Adhikari, Puja; Wen, Amy M.; French, Roger H.; Parsegian, V. Adrian; Steinmetz, Nicole F.; Podgornik, Rudolf; Ching, Wai-Yim

    2014-01-01

    Long and short range molecular interactions govern molecular recognition and self-assembly of biological macromolecules. Microscopic parameters in the theories of these molecular interactions are either phenomenological or need to be calculated within a microscopic theory. We report a unified methodology for the ab initio quantum mechanical (QM) calculation that yields all the microscopic parameters, namely the partial charges as well as the frequency-dependent dielectric response function, that can then be taken as input for macroscopic theories of electrostatic, polar, and van der Waals-London dispersion intermolecular forces. We apply this methodology to obtain the electronic structure of the cyclic tripeptide RGD-4C (1FUV). This ab initio unified methodology yields the relevant parameters entering the long range interactions of biological macromolecules, providing accurate data for the partial charge distribution and the frequency-dependent dielectric response function of this peptide. These microscopic parameters determine the range and strength of the intricate intermolecular interactions between potential docking sites of the RGD-4C ligand and its integrin receptor. PMID:25001596

  4. A review of the processes by which ultrasound is generated through the interaction of ionizing radiation and irradiated materials: some possible applications.

    PubMed

    Baily, N A

    1992-01-01

    The production of acoustic waves following the absorption of energy deposited by ionizing radiation, with the consequent production of localized thermal spikes, has been confirmed by a number of papers published in the physics literature. This paper reviews the basic theory and presents most of the supporting experimental data. Some of the experimental methods used and the results obtained are summarized. In addition to the rather straightforward and routine use of acoustic phenomena produced by ionizing radiation for the detection and measurement of such radiation, there are some special applications that appear especially attractive for medical physics. Some are unique to ionizing radiation, in that the amplitude of the ultrasound wave is proportional to the energy deposited in small volumes at localized sites of these interactions, while others derive from methodologies already in use with nonionizing radiation. The detection and measurement of this ultrasonic radiation could lead to methods for the study of such fundamental phenomena as track structure, precision localization of therapeutic treatment beams, and possibly even the imaging of internal anatomic structures to provide on-line portal images.

  5. Graph theory in brain-to-brain connectivity: A simulation study and an application to an EEG hyperscanning experiment.

    PubMed

    Toppi, J; Ciaramidaro, A; Vogel, P; Mattia, D; Babiloni, F; Siniatchkin, M; Astolfi, L

    2015-08-01

    Hyperscanning consists of the simultaneous recording of hemodynamic or neuroelectrical signals from two or more subjects acting in a social context. Well-established methodologies for connectivity estimation have already been adapted to hyperscanning purposes, but the extension of the graph theory approach to the multi-subject case is still a challenging issue. In the present work we aim to test the ability of the currently used global graph theory indices to describe the properties of a network formed by two interacting subjects. The testing was conducted first on surrogate brain-to-brain networks reproducing typical social scenarios and then on real EEG hyperscanning data recorded during a Joint Action task. The results of the simulation study highlighted the ability of all the investigated indices to modulate their values according to the level of interaction between subjects. However, only the global efficiency and path length indices proved sensitive to an asymmetry in the communication between the two subjects. These results were then confirmed by the application to real EEG data: global efficiency modulated its values according to the inter-brain density, assuming higher values in the social condition than in the non-social condition.
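    The global efficiency index used in this record can be sketched in plain Python. The two-subject network below is a hypothetical toy example; the node names, edges, and pure-BFS implementation are our assumptions, not the authors' analysis pipeline:

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS distances from src in an unweighted, undirected graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for s in nodes:
        d = shortest_paths(adj, s)
        total += sum(1.0 / d[t] for t in d if t != s)
    return total / (n * (n - 1))

# Hypothetical two-subject network: nodes a0..a3 (subject A) and b0..b3
# (subject B), with two inter-brain edges modelling the interaction.
edges = [("a0", "a1"), ("a1", "a2"), ("a2", "a3"), ("a3", "a0"),
         ("b0", "b1"), ("b1", "b2"), ("b2", "b3"), ("b3", "b0"),
         ("a1", "b1"), ("a3", "b3")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

print(round(global_efficiency(adj), 3))
```

    Removing the inter-brain edges disconnects the two subjects and drives the efficiency down, which is the kind of modulation with inter-brain density the study reports.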

  6. Global-local methodologies and their application to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  7. Application of Fuzzy Logic to Matrix FMECA

    NASA Astrophysics Data System (ADS)

    Shankar, N. Ravi; Prabhu, B. S.

    2001-04-01

    A methodology combining the benefits of fuzzy logic and matrix FMEA is presented in this paper. The methodology extends risk prioritization beyond the conventional Risk Priority Number (RPN) method: fuzzy logic is used to calculate the criticality rank. The matrix approach is also improved to develop a pictorial representation that retains all relevant qualitative and quantitative information on the relationships among several FMEA elements. The methodology is demonstrated by application to an illustrative example.
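    The conventional RPN baseline that the fuzzy approach extends can be sketched as follows; the failure modes and ratings below are invented for illustration only:

```python
# Conventional Risk Priority Number for FMEA:
#   RPN = Severity * Occurrence * Detection, each rated on a 1-10 scale.
# Hypothetical failure modes and ratings (not from the paper).
failure_modes = {
    "seal leak":      (8, 4, 3),
    "motor overheat": (6, 5, 6),
    "sensor drift":   (4, 7, 2),
}

rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}

# Rank failure modes from highest to lowest risk priority.
ranked = sorted(rpn, key=rpn.get, reverse=True)
print(ranked, rpn)
```

    The fuzzy-logic extension replaces this crisp product with membership functions and rules over the same three ratings, avoiding the known limitation that very different (S, O, D) combinations can yield identical RPN values.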

  8. An automated real-time microscopy system for analysis of fluorescence resonance energy transfer

    NASA Astrophysics Data System (ADS)

    Bernardini, André; Wotzlaw, Christoph; Lipinski, Hans-Gerd; Fandrey, Joachim

    2010-05-01

    Molecular imaging based on Fluorescence Resonance Energy Transfer (FRET) is widely used in cellular physiology, both for protein-protein interaction analysis and for detecting conformational changes of single proteins, e.g. during activation of signaling cascades. However, obtaining reliable results from FRET measurements is still hampered by methodological problems such as spectral bleed-through, chromatic aberration, focal plane shifts and false positive FRET. In particular, false positive FRET signals caused by random interaction of the fluorescent dyes can easily lead to misinterpretation of the data. This work introduces a Nipkow disc based FRET microscopy system that is easy to operate without expert knowledge of FRET. The system automatically accounts for all relevant sources of error and provides various result presentations of two-, three- and four-dimensional FRET data. Two examples are given to demonstrate the scope of application. An interaction analysis of the two subunits of the hypoxia-inducible transcription factor 1 demonstrates the use of the system as a tool for protein-protein interaction analysis. As an example of time-lapse observations, the conformational change of the fluorophore-labeled heat shock protein 33 in the presence of oxidant stress is shown.
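    The distance dependence that makes FRET useful for interaction analysis follows the standard Förster relation; the snippet below is the textbook formula, not code from the system described above:

```python
# Förster relation for FRET efficiency:
#   E = 1 / (1 + (r / R0)**6)
# where r is the donor-acceptor distance and R0 the Förster radius of the
# dye pair (typically a few nanometres).
def fret_efficiency(r_nm, r0_nm):
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# At r = R0 the efficiency is 50% by definition; the steep sixth-power
# dependence is what makes FRET a "molecular ruler" at nanometre scale.
print(fret_efficiency(5.0, 5.0))  # 0.5
print(round(fret_efficiency(2.5, 5.0), 3))
```

    This steepness also explains why random dye encounters at short range produce the false positive FRET signals the system is designed to account for.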

  9. Integrating 4-d light-sheet imaging with interactive virtual reality to recapitulate developmental cardiac mechanics and physiology

    NASA Astrophysics Data System (ADS)

    Ding, Yichen; Yu, Jing; Abiri, Arash; Abiri, Parinaz; Lee, Juhyun; Chang, Chih-Chiang; Baek, Kyung In; Sevag Packard, René R.; Hsiai, Tzung K.

    2018-02-01

    There is currently a limited ability to interactively study developmental cardiac mechanics and physiology. We therefore combined light-sheet fluorescence microscopy (LSFM) with virtual reality (VR) to provide a hybrid platform for characterizing 3-dimensional (3-D) architecture and time-dependent cardiac contractile function. By taking advantage of the rapid acquisition, high axial resolution, low phototoxicity, and high fidelity in 3-D and 4-D (3-D spatial + 1-D time or spectra), this VR-LSFM hybrid methodology enables interactive visualization and quantification otherwise not available with conventional methods such as routine optical microscopes. We demonstrate the multi-scale applicability of VR-LSFM to 1) interrogate skin fibroblasts interacting with a hyaluronic acid-based hydrogel, 2) navigate through the endocardial trabecular network during zebrafish development, and 3) localize gene therapy-mediated potassium channel expression in adult murine hearts. We further combined our batch intensity normalized segmentation (BINS) algorithm with deformable image registration (DIR) to interface a VR environment for the analysis of cardiac contraction. The VR-LSFM hybrid platform thus provides an efficient and robust framework for creating a user-directed microenvironment in which to uncover developmental cardiac mechanics and physiology with high spatiotemporal resolution.

  10. Microbial interactions with chromium: basic biological processes and applications in environmental biotechnology.

    PubMed

    Gutiérrez-Corona, J F; Romo-Rodríguez, P; Santos-Escobar, F; Espino-Saldaña, A E; Hernández-Escoto, H

    2016-12-01

    Chromium (Cr) is a highly toxic metal for microorganisms as well as plants and animal cells. Due to its widespread industrial use, Cr has become a serious pollutant in diverse environmental settings. The hexavalent form of the metal, Cr(VI), is considered a more toxic species than the relatively innocuous and less mobile Cr(III) form. The study of the interactions between microorganisms and Cr has been helpful to unravel the mechanisms allowing organisms to survive in the presence of high concentrations of Cr(VI) and to detoxify and remove the oxyanion. Various mechanisms of interactions with Cr have been identified in diverse species of bacteria and fungi, including biosorption, bioaccumulation, reduction of Cr(VI) to Cr(III), and chromate efflux. Some of these systems have been proposed as potential biotechnological tools for the bioremediation of Cr pollution using bioreactors or by in situ treatments. In this review, the interactions of microorganisms with Cr are summarised, emphasising the importance of new research avenues using advanced methodologies, including proteomic, transcriptomic, and metabolomic analyses, as well as the use of techniques based on X-ray absorption spectroscopy and electron paramagnetic resonance spectroscopy.

  11. Atmospheric interaction with nanosatellites from observed orbital decay

    NASA Astrophysics Data System (ADS)

    Macario-Rojas, A.; Smith, K. L.; Crisp, N. H.; Roberts, P. C. E.

    2018-06-01

    Nanosatellites have gained considerable presence in low Earth orbits, wherein the atmospheric interaction with exposed surfaces plays a fundamental role in the evolution of motion. These aspects become relevant with the increasing applicability of nanosatellites to a broader range of mission objectives. This investigation sets out to determine, from observed orbital decay, distinctive drag coefficient development and attributes of atmospheric gas-surface interactions for nanosatellites in the common form of standard 3U CubeSats. Since orbital decay can be measured with relative accuracy, and its mechanism broken down into its constituent sources, the value of drag-related coefficients can be inferred by fitting modelled orbit predictions to observed data, the coefficient of interest being the adjusted parameter. The analysis uses data from ten historical missions with documented passive attitude stabilisation strategies to reduce uncertainties. Findings indicate that it is possible to estimate fitted drag coefficients of CubeSats with physical representativeness. Assessment of atomic oxygen surface coverage derived from the fitted drag coefficients is broadly consistent with theoretical trends. The proposed methodology opens the possibility of assessing atmospheric interaction characteristics using the unprecedented opportunity arising from the numerous observed orbital decays of nanosatellites.
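    The fitting idea, adjusting a drag-related coefficient until the modelled decay matches observation, can be sketched with a deliberately toy decay model. The linear model, constants, and synthetic data below are our assumptions, not the orbit propagator used in the study:

```python
# Toy decay model: the semi-major axis shrinks at a rate proportional to the
# drag coefficient Cd,
#   a(t) = a0 - c * Cd * t,
# where c lumps together density, cross-sectional area, and mass terms.
def model(a0, c, cd, times):
    return [a0 - c * cd * t for t in times]

# Synthetic "observed" decay, generated here with Cd = 2.2 (hypothetical data).
times = list(range(0, 100, 10))
observed = [400.0 - 0.001 * 2.2 * t for t in times]

def sse(cd):
    """Sum of squared residuals between modelled and observed decay."""
    return sum((m - o) ** 2
               for m, o in zip(model(400.0, 0.001, cd, times), observed))

# 1-D scan over candidate Cd values: the fitted coefficient is the one that
# minimises the misfit, mirroring the adjusted-parameter approach above.
fitted = min((cd / 100 for cd in range(150, 301)), key=sse)
print(fitted)
```

    In practice the model is a full orbit propagator and the misfit is minimised over noisy tracking data, but the structure of the inference is the same.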

  12. The Virtual Physiological Human ToolKit.

    PubMed

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  13. Distance learning in academic health education.

    PubMed

    Mattheos, N; Schittek, M; Attström, R; Lyon, H C

    2001-05-01

    Distance learning is an apparent alternative to traditional methods in education of health care professionals. Non-interactive distance learning, interactive courses and virtual learning environments exist as three different generations in distance learning, each with unique methodologies, strengths and potential. Different methodologies have been recommended for distance learning, varying from a didactic approach to a problem-based learning procedure. Accreditation, teamwork and personal contact between the tutors and the students during a course provided by distance learning are recommended as motivating factors in order to enhance the effectiveness of the learning. Numerous assessment methods for distance learning courses have been proposed. However, few studies report adequate tests for the effectiveness of the distance-learning environment. Available information indicates that distance learning may significantly decrease the cost of academic health education at all levels. Furthermore, such courses can provide education to students and professionals not accessible by traditional methods. Distance learning applications still lack the support of a solid theoretical framework and are only evaluated to a limited extent. Cases reported so far tend to present enthusiastic results, while more carefully-controlled studies suggest a cautious attitude towards distance learning. There is a vital need for research evidence to identify the factors of importance and variables involved in distance learning. The effectiveness of distance learning courses, especially in relation to traditional teaching methods, must therefore be further investigated.

  14. Optimizing adsorption of blue pigment from wastewater by nano-porous modified Na-bentonite using spectrophotometry based on response surface method

    NASA Astrophysics Data System (ADS)

    Moradi, Neshat; Salem, Shiva; Salem, Amin

    2018-03-01

    This work highlights the effective activation of bentonite paste to produce a nano-porous powder for removal of cationic dye from wastewater. The effects of activation parameters such as soda and moisture contents, ageing time and temperature were analyzed using response surface methodology (RSM). The significance of the independent variables and their interactions was tested by blending the obtained powders with wastewater and evaluating the adsorption spectrophotometrically. The experiments were carried out by preparing pastes according to a central composite design, the standard RSM method, which was used to evaluate the effects and interactions of the four factors on treatment efficiency. RSM proved an appropriate approach for optimizing the alkali activation. The optimal conditions obtained from the desirable responses were 5.0 wt% soda and 45.0 wt% moisture, with powder activation carried out at 150 °C. To better understand the role of the nano-structured material in dye removal, the adsorbents were characterized through X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy and Brunauer-Emmett-Teller surface area measurement. Finally, the analysis clearly demonstrates that dye adsorption onto the prepared adsorbent is better fitted by the Langmuir isotherm than by the other isotherm models. The low cost of the material and the facile process support further development for commercial applications.
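    An RSM analysis ultimately fits a second-order polynomial to the measured responses and optimises it over the design region. The sketch below uses invented coefficients and only two of the four factors, and replaces the analytic optimisation with a coarse grid search, purely to show the shape of the computation:

```python
# Hypothetical second-order response-surface model (coefficients invented for
# illustration; they are not the paper's fitted values):
#   y = b0 + b1*x1 + b2*x2 + b11*x1**2 + b22*x2**2 + b12*x1*x2
# with x1 = coded soda content and x2 = coded moisture content in [-1, 1].
def response(x1, x2):
    b0, b1, b2, b11, b22, b12 = 80.0, 4.0, 2.5, -3.0, -2.0, 1.5
    return b0 + b1 * x1 + b2 * x2 + b11 * x1 ** 2 + b22 * x2 ** 2 + b12 * x1 * x2

# Coarse grid search over the coded design region for the best response,
# standing in for the analytic optimisation RSM normally performs.
best = max((response(i / 10, j / 10), i / 10, j / 10)
           for i in range(-10, 11) for j in range(-10, 11))
best_y, best_x1, best_x2 = best
print(best_y, best_x1, best_x2)
```

    Because the quadratic terms are negative here, the surface has an interior maximum; a central composite design supplies exactly the axial and factorial runs needed to estimate such quadratic coefficients.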

  15. Chemometric strategy for modeling metabolic biological space along the gastrointestinal tract and assessing microbial influences.

    PubMed

    Martin, François-Pierre J; Montoliu, Ivan; Kochhar, Sunil; Rezzi, Serge

    2010-12-01

    Over the past decade, the analysis of metabolic data with advanced chemometric techniques has offered the potential to explore functional relationships among biological compartments in relation to the structure and function of the intestine. However, the employed methodologies, generally based on regression modeling techniques, have given emphasis to region-specific metabolic patterns, while providing only limited insights into the spatiotemporal metabolic features of the complex gastrointestinal system. Hence, novel approaches are needed to analyze metabolic data to reconstruct the metabolic biological space associated with the evolving structures and functions of an organ such as the gastrointestinal tract. Here, we report the application of multivariate curve resolution (MCR) methodology to model metabolic relationships along the gastrointestinal compartments in relation to its structure and function using data from our previous metabonomic analysis. The method simultaneously summarizes metabolite occurrence and contribution to continuous metabolic signatures of the different biological compartments of the gut tract. This methodology sheds new light onto the complex web of metabolic interactions with gut symbionts that modulate host cell metabolism in surrounding gut tissues. In the future, such an approach will be key to provide new insights into the dynamic onset of metabolic deregulations involved in region-specific gastrointestinal disorders, such as Crohn's disease or ulcerative colitis.

  16. Formal analysis and automatic generation of user interfaces: approach, methodology, and an algorithm.

    PubMed

    Heymann, Michael; Degani, Asaf

    2007-04-01

    We present a formal approach and methodology for the analysis and generation of user interfaces, with special emphasis on human-automation interaction. A conceptual approach for modeling, analyzing, and verifying the information content of user interfaces is discussed. The proposed methodology is based on two criteria: First, the interface must be correct--that is, given the interface indications and all related information (user manuals, training material, etc.), the user must be able to successfully perform the specified tasks. Second, the interface and related information must be succinct--that is, the amount of information (mode indications, mode buttons, parameter settings, etc.) presented to the user must be reduced (abstracted) to the minimum necessary. A step-by-step procedure for generating the information content of the interface that is both correct and succinct is presented and then explained and illustrated via two examples. Every user interface is an abstract description of the underlying system. The correspondence between the abstracted information presented to the user and the underlying behavior of a given machine can be analyzed and addressed formally. The procedure for generating the information content of user interfaces can be automated, and a software tool for its implementation has been developed. Potential application areas include adaptive interface systems and customized/personalized interfaces.
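    The correctness criterion, that an interface abstracting machine states into displayed modes must never hide a behavioural difference, can be illustrated with a toy check. The machine, actions, and the check itself are our sketch, not the paper's algorithm:

```python
# Hypothetical machine: (state, action) -> next state.
transitions = {
    ("s1", "press"): "s3",
    ("s2", "press"): "s4",
    ("s3", "press"): "s1",
    ("s4", "press"): "s2",
}
# Candidate interface abstraction: machine state -> displayed mode.
mode_of = {"s1": "A", "s2": "A", "s3": "B", "s4": "B"}

def abstraction_correct(transitions, mode_of):
    """States sharing a displayed mode must agree, for every user action,
    on the mode they transition to; otherwise the interface is ambiguous."""
    seen = {}
    for (state, action), nxt in transitions.items():
        key = (mode_of[state], action)
        target = mode_of[nxt]
        if seen.setdefault(key, target) != target:
            return False  # same mode + same action -> different modes
    return True

print(abstraction_correct(transitions, mode_of))
```

    Succinctness then corresponds to using as few modes as possible while this check still passes; merging all four states into one mode here would fail it.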

  17. The VeTOOLS Project: an example of how to strengthen collaboration between scientists and Civil Protections in disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Marti, Joan; Bartolini, Stefania; Becerril, Laura

    2016-04-01

    VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO) that aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection agencies in order to share, unify, and exchange procedures, methodologies and technologies to effectively reduce the impacts of volcanic disasters. Its objectives are to 1) improve and develop volcanic risk assessment and management capacities in active volcanic regions; 2) develop universal methodologies, scenario definitions, response strategies and alert protocols to cope with the full range of volcanic threats; 3) improve quantitative methods and tools for vulnerability and risk assessment; and 4) define thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the Sendai Framework resolutions: i) provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; and ii) promote and support the availability and application of science and technology to decision-making. It thus offers a good example of how close collaboration between science and civil protection is an effective way to contribute to DRR. European Commission ECHO Grant SI2.695524

  18. Fracture Analysis of Vessels. Oak Ridge FAVOR, v06.1, Computer Code: Theory and Implementation of Algorithms, Methods, and Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, P. T.; Dickson, T. L.; Yin, S.

    The current regulations to ensure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.

  19. Promoting food security and well-being among poor and HIV/AIDS affected households: lessons from an interactive and integrated approach.

    PubMed

    Swaans, Kees; Broerse, Jacqueline; Meincke, Maylin; Mudhara, Maxwell; Bunders, Joske

    2009-02-01

    Participatory and interdisciplinary approaches have been suggested for developing appropriate agricultural innovations as an alternative strategy to improve food security and well-being among HIV/AIDS-affected households. However, sustainable implementation of such interactive approaches is far from easy and straightforward. This study reports on the Interactive Learning and Action (ILA) approach, a methodology for agricultural innovation that has been adapted to the context of HIV/AIDS. Role players in agriculture and health were brought together to stimulate and sustain innovation among three support groups for poor and affected households in a rural, high HIV/AIDS prevalence area in South Africa. The effectiveness of the approach was evaluated using both outcome and process criteria. The results indicate that an interactive approach, in which service providers/researchers engage themselves as actors to explore the livelihood system and develop appropriate solutions in joint collaboration with resource users, has potential. However, the study also revealed that cooperation among participants and stakeholders at the interface of agriculture and HIV/AIDS is complicated and prone to erosion. Of particular concern was the difficulty of mobilizing members of poor and affected households to participate and to overcome stigma and discrimination. Lessons and potential applications for the further development of interactive approaches are discussed.

  20. The interaction of antibodies with lipid membranes unraveled by fluorescence methodologies

    NASA Astrophysics Data System (ADS)

    Figueira, Tiago N.; Veiga, Ana Salomé; Castanho, Miguel A. R. B.

    2014-12-01

    The interest and investment in antibody therapies has reached an overwhelming scale in the last decade. Yet little attention has been paid by the scientific community to unraveling important interactions of antibodies with biological structures other than their respective epitopes. Lipid membranes are particularly relevant in this regard, as they set the stage for protein-protein recognition, a concept potentially inclusive of antibody-antigen recognition. Fluorescence techniques allow experimental monitoring of protein partition between aqueous and lipid phases, deciphering events of adsorption, insertion and diffusion. This review focuses on the available fluorescence spectroscopy methodologies directed to the study of antibody-membrane interactions.

  1. Exploring Intercultural Interactions in Multicultural Contexts: Proposal and Research Suggestions.

    ERIC Educational Resources Information Center

    Yeh, Jung-huel Becky

    A discussion examines the importance of communication between non-native speakers (NNS/NNS), reviews relevant theories and issues in intercultural interactions and NNS/NNS interactions, and explores methodological issues in interpreting linguistic and interactional data. The intent is to explore features of communication between NNSs from…

  2. Assessing Campus Climates for Lesbian, Gay, Bisexual and Transgender (LGBT) Students: Methodological and Political Issues

    ERIC Educational Resources Information Center

    Brown, Robert D.; Gortmaker, Valerie J.

    2009-01-01

    Methodological and political issues arise during the designing, conducting, and reporting of campus-climate studies for LGBT students. These issues interact; making a decision about a methodological issue (e.g., sample size) has an impact on a political issue (e.g., how well the findings will be received). Ten key questions that must be addressed…

  3. Ethical analysis in HTA of complex health interventions.

    PubMed

    Lysdahl, Kristin Bakke; Oortwijn, Wija; van der Wilt, Gert Jan; Refolo, Pietro; Sacchini, Dario; Mozygemba, Kati; Gerhardus, Ansgar; Brereton, Louise; Hofmann, Bjørn

    2016-03-22

    In the field of health technology assessment (HTA), there are several approaches that can be used for ethical analysis. However, there is a scarcity of literature that critically evaluates and compares the strengths and weaknesses of these approaches when they are applied in practice. In this paper, we analyse the applicability of selected approaches for addressing ethical issues in HTA in the field of complex health interventions. Complex health interventions have been the focus of methodological attention in HTA; however, the potential methodological challenges for ethical analysis are as yet unknown. Six of the most frequently described and applied ethical approaches in HTA were critically assessed against a set of five characteristics of complex health interventions: multiple and changing perspectives, indeterminate phenomena, uncertain causality, unpredictable outcomes, and ethical complexity. The assessments are based on the literature and the authors' experiences of developing, applying and assessing the approaches. The Interactive, participatory HTA approach is, by its nature and flexibility, applicable across most complexity characteristics. Wide Reflective Equilibrium is also flexible, and its openness to different perspectives makes it better suited to complex health interventions than more rigid conventional approaches, such as Principlism and Casuistry. Approaches developed for HTA purposes, such as the HTA Core Model® and the Socratic approach, are fairly applicable to complex health interventions, as one could expect because they include various ethical perspectives. This study shows how the applicability of the selected ethical approaches for addressing ethical issues in HTA of complex health interventions differs. Knowledge of these differences may be helpful when choosing and applying an approach for ethical analysis in HTA. We believe that the study contributes to increasing awareness of and interest in the ethical aspects of complex health interventions in general.

  4. NASA Technology Applications Team: Commercial applications of aerospace technology

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Research Triangle Institute (RTI) is pleased to report the results of NASA contract NASW-4367, 'Operation of a Technology Applications Team'. Through a period of significant change within NASA, the RTI Team has maintained its focus on helping NASA establish partnerships with U.S. industry for dual use development and technology commercialization. Our emphasis has been on outcomes, such as licenses, industry partnerships and commercialization of technologies that are important to NASA in its mission of contributing to the improved competitive position of U.S. industry. RTI's ongoing commitment to quality and customer responsiveness has driven our staff to continuously improve our technology transfer methodologies to meet NASA's requirements. For example, RTI has emphasized the following areas: (1) Methodology For Technology Assessment and Marketing: RTI has developed and implemented effective processes for assessing the commercial potential of NASA technologies. These processes resulted from an RTI study of best practices, hands-on experience, and extensive interaction with the NASA Field Centers to adapt to their specific needs; (2) Effective Marketing Strategies: RTI surveyed industry technology managers to determine effective marketing tools and strategies. The Technology Opportunity Announcement format and content were developed as a result of this industry input. For technologies with a dynamic visual impact, RTI has developed a stand-alone demonstration diskette that was successful in developing industry interest in licensing the technology; and (3) Responsiveness to NASA Requirements: RTI listened to our customer (NASA) and designed our processes to conform with the internal procedures and resources at each NASA Field Center and the direction provided by NASA's Agenda for Change. This report covers the activities of the Research Triangle Institute Technology Applications Team for the period 1 October 1993 through 31 December 1994.

  5. 42 CFR 436.601 - Application of financial eligibility methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... VIRGIN ISLANDS General Financial Eligibility Requirements and Options § 436.601 Application of financial... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...

  6. A Decomposition of Hospital Profitability: An Application of DuPont Analysis to the US Market.

    PubMed

    Turner, Jason; Broom, Kevin; Elliott, Michael; Lee, Jen-Fu

    2015-01-01

    This paper evaluates the drivers of profitability for a large sample of U.S. hospitals. Following a methodology frequently used by financial analysts, we use a DuPont analysis as a framework to evaluate the quality of earnings. By decomposing return on equity (ROE) into profit margin, total asset turnover, and capital structure, the DuPont analysis reveals what drives overall profitability. Profit margin, the efficiency with which services are rendered (total asset turnover), and capital structure are calculated for 3,255 U.S. hospitals between 2007 and 2012 using data from the Centers for Medicare & Medicaid Services' Healthcare Cost Report Information System (CMS Form 2552). The sample is then stratified by ownership, size, system affiliation, teaching status, critical access designation, and urban or non-urban location. Those hospital characteristics and interaction terms are then regressed (OLS) against the ROE and the respective DuPont components. Sensitivity to regression methodology is also investigated using a seemingly unrelated regression. When the sample is stratified by hospital characteristics, the results indicate investor-owned hospitals have higher profit margins, higher efficiency, and are substantially more leveraged. Hospitals in systems are found to have higher ROE, margins, and efficiency but are associated with less leverage. In addition, a number of important and significant interactions between teaching status, ownership, location, critical access designation, and inclusion in a system are documented. Many of the significant relationships, most notably not-for-profit ownership, lose significance or are predominately associated with one interaction effect when interaction terms are introduced as explanatory variables. Results are not sensitive to the alternative methodology.
The results of the DuPont analysis suggest that although there appears to be convergence in the behavior of NFP and IO hospitals, significant financial differences remain depending on their respective hospital characteristics. Those differences are tempered or exacerbated by location, size, teaching status, system affiliation, and critical access designation. With the exception of cost-based reimbursement for critical access hospitals, emerging payment systems are placing additional financial pressures on hospitals. The financial pressures being applied treat hospitals as a monolithic category and, given the delicate and often negative ROE for many hospitals, the long-term stability of the healthcare facility infrastructure may be negatively impacted.
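
    The DuPont identity used in this record decomposes ROE into profit margin, total asset turnover, and an equity multiplier (leverage). A minimal sketch of the decomposition follows; the figures are illustrative, not values from the CMS cost-report data:

```python
def dupont_decomposition(net_income, revenue, total_assets, total_equity):
    """Decompose return on equity (ROE) into its three DuPont components."""
    profit_margin = net_income / revenue             # profitability of services
    asset_turnover = revenue / total_assets          # efficiency of asset use
    equity_multiplier = total_assets / total_equity  # capital structure (leverage)
    roe = profit_margin * asset_turnover * equity_multiplier
    return profit_margin, asset_turnover, equity_multiplier, roe
```

    By construction, the product of the three components equals net income divided by equity, so each factor isolates one driver of overall profitability.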

  7. Passing Messages between Biological Networks to Refine Predicted Interactions

    PubMed Central

    Glass, Kimberly; Huttenhower, Curtis; Quackenbush, John; Yuan, Guo-Cheng

    2013-01-01

    Regulatory network reconstruction is a fundamental problem in computational biology. There are significant limitations to such reconstruction using individual datasets, and increasingly people attempt to construct networks using multiple, independent datasets obtained from complementary sources, but methods for this integration are lacking. We developed PANDA (Passing Attributes between Networks for Data Assimilation), a message-passing model using multiple sources of information to predict regulatory relationships, and used it to integrate protein-protein interaction, gene expression, and sequence motif data to reconstruct genome-wide, condition-specific regulatory networks in yeast as a model. The resulting networks were not only more accurate than those produced using individual data sets and other existing methods, but they also captured information regarding specific biological mechanisms and pathways that were missed using other methodologies. PANDA is scalable to higher eukaryotes, applicable to specific tissue or cell type data and conceptually generalizable to include a variety of regulatory, interaction, expression, and other genome-scale data. An implementation of the PANDA algorithm is available at www.sourceforge.net/projects/panda-net. PMID:23741402
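
    The message-passing idea can be caricatured as repeatedly nudging a motif-derived prior network toward agreement with independent evidence sources (e.g. co-expression and protein-protein interaction). The sketch below is a deliberately simplified stand-in for illustration only, not the published PANDA update rules; all matrices are hypothetical edge-weight grids:

```python
def refine_network(prior, expression, ppi, alpha=0.3, steps=10):
    """Blend a prior edge-weight matrix with two evidence matrices,
    a little at each step (toy stand-in for iterative message passing)."""
    n = len(prior)
    w = [row[:] for row in prior]
    for _ in range(steps):
        w = [[(1 - alpha) * w[i][j]
              + alpha * 0.5 * (expression[i][j] + ppi[i][j])
              for j in range(n)] for i in range(n)]
    return w
```

    Edges supported by both evidence sources are reinforced over the iterations, while edges present only in the prior decay toward the evidence consensus.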

  8. Real-space transmission electron microscopy investigations of attachment of functionalized magnetic nanoparticles to DNA-coils acting as a biosensor.

    PubMed

    Akhtar, Sultan; Strömberg, Mattias; Zardán Gómez de la Torre, Teresa; Russell, Camilla; Gunnarsson, Klas; Nilsson, Mats; Svedlindh, Peter; Strømme, Maria; Leifer, Klaus

    2010-10-21

    The present work provides the first real-space analysis of nanobead-DNA coil interactions. Immobilization of oligonucleotide-functionalized magnetic nanobeads in rolling circle amplified DNA-coils was studied by complex magnetization measurements and transmission electron microscopy (TEM), and a statistical analysis of the number of beads hybridized to the DNA-coils was performed. The average number of beads per DNA-coil using the results from both methods was found to be around 6 and slightly above 2 for samples with 40 and 130 nm beads, respectively. The TEM analysis supported an earlier hypothesis that 40 nm beads are preferably immobilized in the interior of DNA-coils whereas 130 nm beads, to a larger extent, are immobilized closer to the exterior of the coils. The methodology demonstrated in the present work should open up new possibilities for characterization of interactions of a large variety of functionalized nanoparticles with macromolecules, useful for gaining more fundamental understanding of such interactions as well as for optimizing a number of biosensor applications.

  9. Interactive bibliographical database on color

    NASA Astrophysics Data System (ADS)

    Caivano, Jose L.

    2002-06-01

    The paper describes the methodology and results of a project under development, aimed at the elaboration of an interactive bibliographical database on color in all fields of application: philosophy, psychology, semiotics, education, anthropology, physical and natural sciences, biology, medicine, technology, industry, architecture and design, arts, linguistics, geography, history. The project is initially based upon an already developed bibliography, published in different journals, updated on several occasions, and now available on the Internet, with more than 2,000 entries. The interactive database will amplify that bibliography, incorporating hyperlinks and contents (indexes, abstracts, keywords, introductions, or eventually the complete document), and devising mechanisms for information retrieval. The sources to be included are: books, doctoral dissertations, multimedia publications, reference works. The main arrangement will be chronological, but the design of the database will allow rearrangements or selections by different fields: subject, Decimal Classification System, author, language, country, publisher, etc. A further project is to develop another database, including color-specialized journals or newsletters, and articles on color published in international journals, arranged in this case by journal name and date of publication, but allowing also rearrangements or selections by author, subject and keywords.

  10. Sensori-Motor Learning with Movement Sonification: Perspectives from Recent Interdisciplinary Studies.

    PubMed

    Bevilacqua, Frédéric; Boyer, Eric O; Françoise, Jules; Houix, Olivier; Susini, Patrick; Roby-Brami, Agnès; Hanneton, Sylvain

    2016-01-01

    This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe the different research fields that have contributed to movement sonification, from music technology, including gesture-controlled sound synthesis and sonic interaction design, to research on sensori-motor learning with auditory feedback. In particular, we propose to distinguish between sound-oriented tasks and movement-oriented tasks in experiments involving interactive sound feedback. We describe several research questions and recently published results on movement control, learning and perception. Specifically, we studied the effect of auditory feedback on movements in several cases: from experiments on pointing and visuo-motor tracking to more complex tasks where interactive sound feedback can guide movements, and cases of sensory substitution where the auditory feedback can inform on object shapes. We also developed specific methodologies and technologies for designing the sonic feedback and movement sonification. We conclude with a discussion of key future research challenges in sensori-motor learning with movement sonification, and point toward promising applications such as rehabilitation, sports training and product design.

  11. An immersed-boundary method for flow–structure interaction in biological systems with application to phonation

    PubMed Central

    Luo, Haoxiang; Mittal, Rajat; Zheng, Xudong; Bielamowicz, Steven A.; Walsh, Raymond J.; Hahn, James K.

    2008-01-01

    A new numerical approach for modeling a class of flow–structure interaction problems typically encountered in biological systems is presented. In this approach, a previously developed, sharp-interface, immersed-boundary method for incompressible flows is used to model the fluid flow and a new, sharp-interface Cartesian grid, immersed boundary method is devised to solve the equations of linear viscoelasticity that governs the solid. The two solvers are coupled to model flow–structure interaction. This coupled solver has the advantage of simple grid generation and efficient computation on simple, single-block structured grids. The accuracy of the solid-mechanics solver is examined by applying it to a canonical problem. The solution methodology is then applied to the problem of laryngeal aerodynamics and vocal fold vibration during human phonation. This includes a three-dimensional eigen analysis for a multi-layered vocal fold prototype as well as two-dimensional, flow-induced vocal fold vibration in a modeled larynx. Several salient features of the aerodynamics as well as vocal-fold dynamics are presented. PMID:19936017

  12. Selective molecular annealing: in situ small angle X-ray scattering study of microwave-assisted annealing of block copolymers.

    PubMed

    Toolan, Daniel T W; Adlington, Kevin; Isakova, Anna; Kalamiotis, Alexis; Mokarian-Tabari, Parvaneh; Dimitrakis, Georgios; Dodds, Christopher; Arnold, Thomas; Terrill, Nick J; Bras, Wim; Hermida Merino, Daniel; Topham, Paul D; Irvine, Derek J; Howse, Jonathan R

    2017-08-09

    Microwave annealing has emerged as an alternative to traditional thermal annealing approaches for optimising block copolymer self-assembly. A novel sample environment enabling small angle X-ray scattering to be performed in situ during microwave annealing is demonstrated, which has enabled, for the first time, the direct study of the effects of microwave annealing upon the self-assembly behavior of a model, commercial triblock copolymer system [polystyrene-block-poly(ethylene-co-butylene)-block-polystyrene]. Results show that the block copolymer is a poor microwave absorber, resulting in no change in the block copolymer morphology upon application of microwave energy. The block copolymer species may only indirectly interact with the microwave energy when a small molecule microwave-interactive species [diethylene glycol dibenzoate (DEGDB)] is incorporated directly into the polymer matrix. Then significant morphological development is observed at DEGDB loadings ≥6 wt%. Through spatial localisation of the microwave-interactive species, we demonstrate targeted annealing of specific regions of a multi-component system, opening routes for the development of "smart" manufacturing methodologies.

  13. Application of network methods for understanding evolutionary dynamics in discrete habitats.

    PubMed

    Greenbaum, Gili; Fefferman, Nina H

    2017-06-01

    In populations occupying discrete habitat patches, gene flow between habitat patches may form an intricate population structure. In such structures, the evolutionary dynamics resulting from interaction of gene-flow patterns with other evolutionary forces may be exceedingly complex. Several models describing gene flow between discrete habitat patches have been presented in the population-genetics literature; however, these models have usually addressed relatively simple arrangements of habitat patches and have stopped short of providing general methodologies for addressing nontrivial gene-flow patterns. In recent decades, network theory - a branch of discrete mathematics concerned with complex interactions between discrete elements - has been applied to address several problems in population genetics by modelling gene flow between habitat patches using networks. Here, we present the idea and concepts of modelling complex gene flows in discrete habitats using networks. Our goal is to raise awareness of existing network theory applications in molecular ecology studies, as well as to outline the current and potential contribution of network methods to the understanding of evolutionary dynamics in discrete habitats. We review the main branches of network theory that have been, or that we believe potentially could be, applied to population genetics and molecular ecology research. We address applications to theoretical modelling and to empirical population-genetic studies, and we highlight future directions for extending the integration of network science with molecular ecology. © 2017 John Wiley & Sons Ltd.
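
    As a toy illustration of representing gene flow between habitat patches as a weighted directed network, the sketch below summarizes per-patch inflow and outflow from a hypothetical migration-rate matrix (entry m[i][j] is the assumed rate from patch i to patch j; the values are invented for illustration):

```python
def flow_summary(m):
    """Summarize a directed migration-rate matrix: per-patch incoming and
    outgoing gene flow, and the patch with the largest total flow."""
    n = len(m)
    inflow = [sum(m[j][i] for j in range(n) if j != i) for i in range(n)]
    outflow = [sum(m[i][j] for j in range(n) if j != i) for i in range(n)]
    hub = max(range(n), key=lambda i: inflow[i] + outflow[i])
    return inflow, outflow, hub
```

    Summaries of this kind (connectivity, hubs, asymmetric flow) are the simplest examples of the network-level quantities the review discusses.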

  14. Global-local methodologies and their application to nonlinear analysis. [for structural postbuckling study

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  15. Embodied Conversational Agents in Clinical Psychology: A Scoping Review

    PubMed Central

    Lau, Ho Ming; Ruwaard, Jeroen; Riper, Heleen

    2017-01-01

    Background Embodied conversational agents (ECAs) are computer-generated characters that simulate key properties of human face-to-face conversation, such as verbal and nonverbal behavior. In Internet-based eHealth interventions, ECAs may be used for the delivery of automated human support factors. Objective We aim to provide an overview of the technological and clinical possibilities, as well as the evidence base for ECA applications in clinical psychology, to inform health professionals about the activity in this field of research. Methods Given the large variety of applied methodologies, types of applications, and scientific disciplines involved in ECA research, we conducted a systematic scoping review. Scoping reviews aim to map key concepts and types of evidence underlying an area of research, and answer less-specific questions than traditional systematic reviews. Systematic searches for ECA applications in the treatment of mood, anxiety, psychotic, autism spectrum, and substance use disorders were conducted in databases in the fields of psychology and computer science, as well as in interdisciplinary databases. Studies were included if they conveyed primary research findings on an ECA application that targeted one of the disorders. We mapped each study’s background information, how the different disorders were addressed, how ECAs and users could interact with one another, methodological aspects, and the study’s aims and outcomes. Results This study included N=54 publications (N=49 studies). More than half of the studies (n=26) focused on autism treatment, and ECAs were used most often for social skills training (n=23). Applications ranged from simple reinforcement of social behaviors through emotional expressions to sophisticated multimodal conversational systems. Most applications (n=43) were still in the development and piloting phase, that is, not yet ready for routine practice evaluation or application. 
Few studies conducted controlled research into clinical effects of ECAs, such as a reduction in symptom severity. Conclusions ECAs for mental disorders are emerging. State-of-the-art techniques, involving, for example, communication through natural language or nonverbal behavior, are increasingly being considered and adopted for psychotherapeutic interventions in ECA research with promising results. However, evidence on their clinical application remains scarce. At present, their value to clinical practice lies mostly in the experimental determination of critical human support factors. In the context of using ECAs as an adjunct to existing interventions with the aim of supporting users, important questions remain with regard to the personalization of ECAs’ interaction with users, and the optimal timing and manner of providing support. To increase the evidence base with regard to Internet interventions, we propose an additional focus on low-tech ECA solutions that can be rapidly developed, tested, and applied in routine practice. PMID:28487267

  16. Embodied Conversational Agents in Clinical Psychology: A Scoping Review.

    PubMed

    Provoost, Simon; Lau, Ho Ming; Ruwaard, Jeroen; Riper, Heleen

    2017-05-09

    Embodied conversational agents (ECAs) are computer-generated characters that simulate key properties of human face-to-face conversation, such as verbal and nonverbal behavior. In Internet-based eHealth interventions, ECAs may be used for the delivery of automated human support factors. We aim to provide an overview of the technological and clinical possibilities, as well as the evidence base for ECA applications in clinical psychology, to inform health professionals about the activity in this field of research. Given the large variety of applied methodologies, types of applications, and scientific disciplines involved in ECA research, we conducted a systematic scoping review. Scoping reviews aim to map key concepts and types of evidence underlying an area of research, and answer less-specific questions than traditional systematic reviews. Systematic searches for ECA applications in the treatment of mood, anxiety, psychotic, autism spectrum, and substance use disorders were conducted in databases in the fields of psychology and computer science, as well as in interdisciplinary databases. Studies were included if they conveyed primary research findings on an ECA application that targeted one of the disorders. We mapped each study's background information, how the different disorders were addressed, how ECAs and users could interact with one another, methodological aspects, and the study's aims and outcomes. This study included N=54 publications (N=49 studies). More than half of the studies (n=26) focused on autism treatment, and ECAs were used most often for social skills training (n=23). Applications ranged from simple reinforcement of social behaviors through emotional expressions to sophisticated multimodal conversational systems. Most applications (n=43) were still in the development and piloting phase, that is, not yet ready for routine practice evaluation or application. 
Few studies conducted controlled research into clinical effects of ECAs, such as a reduction in symptom severity. ECAs for mental disorders are emerging. State-of-the-art techniques, involving, for example, communication through natural language or nonverbal behavior, are increasingly being considered and adopted for psychotherapeutic interventions in ECA research with promising results. However, evidence on their clinical application remains scarce. At present, their value to clinical practice lies mostly in the experimental determination of critical human support factors. In the context of using ECAs as an adjunct to existing interventions with the aim of supporting users, important questions remain with regard to the personalization of ECAs' interaction with users, and the optimal timing and manner of providing support. To increase the evidence base with regard to Internet interventions, we propose an additional focus on low-tech ECA solutions that can be rapidly developed, tested, and applied in routine practice. ©Simon Provoost, Ho Ming Lau, Jeroen Ruwaard, Heleen Riper. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 09.05.2017.

  17. Virtual standardized patients: an interactive method to examine variation in depression care among primary care physicians

    PubMed Central

    Hooper, Lisa M.; Weinfurt, Kevin P.; Cooper, Lisa A.; Mensh, Julie; Harless, William; Kuhajda, Melissa C.; Epstein, Steven A.

    2009-01-01

    Background Some primary care physicians provide less than optimal care for depression (Kessler et al., Journal of the American Medical Association 291, 2581–90, 2004). However, the literature is not unanimous on the best method to use in order to investigate this variation in care. To capture variations in physician behaviour and decision making in primary care settings, 32 interactive CD-ROM vignettes were constructed and tested. Aim and method The primary aim of this methods-focused paper was to review the extent to which our study method – an interactive CD-ROM patient vignette methodology – was effective in capturing variation in physician behaviour. Specifically, we examined the following questions: (a) Did the interactive CD-ROM technology work? (b) Did we create believable virtual patients? (c) Did the research protocol enable interviews (data collection) to be completed as planned? (d) To what extent was the targeted study sample size achieved? and (e) Did the study interview protocol generate valid and reliable quantitative data and rich, credible qualitative data? Findings Among a sample of 404 randomly selected primary care physicians, our voice-activated interactive methodology appeared to be effective. Specifically, our methodology – combining interactive virtual patient vignette technology, experimental design, and expansive open-ended interview protocol – generated valid explanations for variations in primary care physician practice patterns related to depression care. PMID:20463864

  18. Precise photorealistic visualization for restoration of historic buildings based on tacheometry data

    NASA Astrophysics Data System (ADS)

    Ragia, Lemonia; Sarri, Froso; Mania, Katerina

    2018-03-01

    This paper puts forward a 3D reconstruction methodology applied to the restoration of historic buildings taking advantage of the speed, range and accuracy of a total geodetic station. The measurements representing geo-referenced points produced an interactive and photorealistic geometric mesh of a monument named `Neoria.' `Neoria' is a Venetian building located by the old harbor at Chania, Crete, Greece. The integration of tacheometry acquisition and computer graphics puts forward a novel integrated software framework for the accurate 3D reconstruction of a historical building. The main technical challenge of this work was the production of a precise 3D mesh based on a sufficient number of tacheometry measurements acquired fast and at low cost, employing a combination of surface reconstruction and processing methods. A fully interactive application based on game engine technologies was developed. The user can visualize and walk through the monument and the area around it as well as photorealistically view it at different times of day and night. Advanced interactive functionalities are offered to the user in relation to identifying restoration areas and visualizing the outcome of such works. The user could visualize the coordinates of the points measured, calculate distances and navigate through the complete 3D mesh of the monument. The geographical data are stored in a database connected with the application. Features referencing and associating the database with the monument are developed. The goal was to utilize a small number of acquired data points and present a fully interactive visualization of a geo-referenced 3D model.

  19. Precise photorealistic visualization for restoration of historic buildings based on tacheometry data

    NASA Astrophysics Data System (ADS)

    Ragia, Lemonia; Sarri, Froso; Mania, Katerina

    2018-04-01

    This paper puts forward a 3D reconstruction methodology applied to the restoration of historic buildings taking advantage of the speed, range and accuracy of a total geodetic station. The measurements representing geo-referenced points produced an interactive and photorealistic geometric mesh of a monument named `Neoria.' `Neoria' is a Venetian building located by the old harbor at Chania, Crete, Greece. The integration of tacheometry acquisition and computer graphics puts forward a novel integrated software framework for the accurate 3D reconstruction of a historical building. The main technical challenge of this work was the production of a precise 3D mesh based on a sufficient number of tacheometry measurements acquired fast and at low cost, employing a combination of surface reconstruction and processing methods. A fully interactive application based on game engine technologies was developed. The user can visualize and walk through the monument and the area around it as well as photorealistically view it at different times of day and night. Advanced interactive functionalities are offered to the user in relation to identifying restoration areas and visualizing the outcome of such works. The user could visualize the coordinates of the points measured, calculate distances and navigate through the complete 3D mesh of the monument. The geographical data are stored in a database connected with the application. Features referencing and associating the database with the monument are developed. The goal was to utilize a small number of acquired data points and present a fully interactive visualization of a geo-referenced 3D model.

  20. Biophotonics sensor acclimatization to stem cells environment

    NASA Astrophysics Data System (ADS)

    Mohamad Shahimin, Mukhzeer

    2017-11-01

    The ability to discriminate, characterise and purify biological cells from a heterogeneous population of cells is fundamental to numerous prognosis and diagnosis applications, often forming the basis for current and emerging clinical protocols in stem cell therapy. Current sorting approaches exploit differences in cell density, specific immunologic targets, or receptor-ligand interactions to isolate particular cells. Identification of novel properties by which different cell types may be discerned, and of new ways for their selective manipulation, are clearly fundamental components for improving sorting methodologies. The biophotonic sensor developed by our team is potentially capable of discriminating cells according to their refractive index (which is highly dependent on the organelles inside the cell), size (an indicator of cell stage) and shape (in certain cases an indicator of cell type). The sensor, which already discriminates particles efficiently, is modified to acclimatize to the biological environment, especially for stem cell applications.

  1. Using directed information for influence discovery in interconnected dynamical systems

    NASA Astrophysics Data System (ADS)

    Rao, Arvind; Hero, Alfred O.; States, David J.; Engel, James Douglas

    2008-08-01

    Structure discovery in non-linear dynamical systems is an important and challenging problem that arises in various applications such as computational neuroscience, econometrics, and biological network discovery. Each of these systems has multiple interacting variables, and the key problem is the inference of the underlying structure of the system (which variables are connected to which others) based on the output observations (such as multiple time trajectories of the variables). Since such applications demand the inference of directed relationships among variables in these non-linear systems, current methods that assume a linear structure or that yield undirected variable dependencies are insufficient. Hence, in this work, we present a methodology for structure discovery using an information-theoretic metric called directed time information (DTI). Using both synthetic dynamical systems as well as true biological datasets (kidney development and T-cell data), we demonstrate the utility of DTI in such problems.
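
    The directed flavor of such metrics can be illustrated with a much simpler lagged score: empirical mutual information between one variable's past and another variable's present. This is not the DTI estimator of the paper, merely a minimal toy showing how time-lagged dependence yields a directional measure; the series in the usage example are hypothetical:

```python
import random
from collections import Counter
from math import log

def lagged_mi(x, y, lag=1):
    """Empirical mutual information I(Y_t ; X_{t-lag}) for discrete series."""
    pairs = list(zip(x[:-lag], y[lag:]))
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(a for a, _ in pairs)
    py = Counter(b for _, b in pairs)
    return sum((c / n) * log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())
```

    When x drives y with a one-step delay, the score for the direction x to y exceeds the score for the reverse direction, which is the asymmetry that directed metrics exploit.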

  2. Study of archaeological underwater finds: deterioration and conservation

    NASA Astrophysics Data System (ADS)

    Crisci, G. M.; La Russa, M. F.; Macchione, M.; Malagodi, M.; Palermo, A. M.; Ruffolo, S. A.

    2010-09-01

    This study is aimed at an assessment of the methodologies, instruments and new applications for underwater archaeology. Research focused on study of the various kinds of degradation affecting underwater finds and stone materials aged in underwater environment, efficiency evaluation of various surface cleaning methods and study and mixing of protective products with consolidating resins and antimicrobial biocides to be applied to restored underwater finds. Transmitted light optical microscopy and scanning electron microscopy (SEM) were used to study surface biofilms and the interactions with samples of different stone materials such as brick, marble and granite immersed in the submarine archaeological area of Crotone (South of Italy). Surface cleaning tests were performed with application of ion exchange resins, EDTA, hydrogen peroxide and ultrasound techniques. Capillary water absorption, simulated solar ageing and colourimetric measurements were carried out to evaluate hydrophobic and consolidant properties; to assess biocidal efficacy, heterotrophic micro-organisms (Aspergillus niger) were inoculated on agar plates and growth inhibition was measured.

  3. Prediction of the mechanical properties of zeolite pellets for aerospace molecular decontamination applications.

    PubMed

    Rioland, Guillaume; Dutournié, Patrick; Faye, Delphine; Daou, T Jean; Patarin, Joël

    2016-01-01

    Zeolite pellets containing 5 wt % of binder (methylcellulose or sodium metasilicate) were formed with a hydraulic press. This paper describes a mathematical model to predict the mechanical properties (uniaxial and diametric compression) of these pellets for arbitrary dimensions (height and diameter) using a design of experiments (DOE) methodology. A second-degree polynomial equation including interactions was used to approximate the experimental results. This leads to an empirical model for the estimation of the mechanical properties of zeolite pellets with 5 wt % of binder. The model was verified by additional experimental tests including pellets of different dimensions created with different applied pressures. The optimum dimensions were found to be a diameter of 10-23 mm, a height of 1-3.5 mm and an applied pressure higher than 200 MPa. These pellets are promising for technological uses in molecular decontamination for aerospace-based applications.
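A minimal sketch of the kind of second-degree polynomial DOE model with an interaction term described above, fit by least squares. The factor names (height h, diameter d) follow the abstract, but the column layout and the coefficient values in the usage example are illustrative, not the paper's fitted model.

```python
import numpy as np

def doe_design(h, d):
    """Second-degree DOE design matrix in pellet height h and diameter d,
    including the h*d interaction term."""
    h, d = np.asarray(h, float), np.asarray(d, float)
    return np.column_stack([np.ones_like(h), h, d, h**2, d**2, h * d])

def fit_quadratic_doe(h, d, response):
    """Least-squares coefficients of the polynomial response model."""
    beta, *_ = np.linalg.lstsq(doe_design(h, d), response, rcond=None)
    return beta

def predict(beta, h, d):
    """Evaluate the fitted response surface at new (h, d) points."""
    return doe_design(h, d) @ beta
```

With noise-free responses generated from a known polynomial, the fit recovers the coefficients exactly, which is a quick sanity check before applying the model to measured compression data.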

  4. Novel and Recent Synthesis and Applications of β-Lactams

    NASA Astrophysics Data System (ADS)

    Troisi, Luigino; Granito, Catia; Pindinelli, Emanuela

    In this chapter, a comprehensive overview is presented of the most significant and interesting contributions published from 2000 onwards concerning the preparation of novel β-lactam structures. Among the different synthetic strategies available, both novel methodologies and already known but efficient and versatile ones are covered. Simple modifications of one or more substituents linked to the N-1 nitrogen and the C-3 and C-4 carbon atoms of the β-lactam nucleus were considered as an alternative synthetic route to more complex, polyfunctionalized molecules. Indeed, it is well known and extensively reviewed that the biological activity of this strained four-membered heterocycle depends strictly on the nature of the substituent groups, which affect the reactivity towards the molecular active sites, increasing or lowering the possibility of interaction with the substrates. Finally, a synthetic survey of the most significant biological and pharmacological applications of the 2-azetidinones is reported.

  5. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    During the lifetime of this grant, work was performed in the areas of model development, code development, code validation and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural-network storage of chemical kinetics, ILDM chemical kinetics and assumed-PDF work. Many of these models were then implemented in the code, and many improvements were made to the code as well, including new chemistry integrators, property evaluation schemes, new chemistry models and a turbulence-chemistry interaction methodology. All new models and code improvements were validated, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed-PDF model development and chemical kinetics development. It is expected that this work will continue under the new grant.

  6. Bactericidal Activity of Usnic Acid-Loaded Electrospun Fibers.

    PubMed

    Araújo, Evando S; Pereira, Eugênia C; da Costa, Mateus M; da Silva, Nicácio H; de Oliveira, Helinando P

    2016-01-01

    Usnic acid has been increasingly reported in the literature as one of the most important lichen metabolites, characterized by a rich diversity of applications as an antifungal, antimicrobial, antiprotozoal and antiviral agent. In particular, the antimicrobial activity of usnic acid can be improved by encapsulating the active molecules in enteric electrospun fibers, allowing controlled release of the active molecule at a specific pH. A few patents relevant to the topic are reviewed and cited. The bactericidal activity of usnic acid-loaded electrospun fibers of Eudragit L-100 and polyvinylpyrrolidone was examined against Staphylococcus aureus using the inhibition halo methodology. Controlled release of the active material at high pH is established after 10 minutes of interaction with the medium and results in reasonable activity against S. aureus, as detected by inhibition halos. The strong biological activity of usnic acid-loaded electrospun fibers makes the corresponding material a promising bactericidal agent for wound-healing treatment.

  7. Probing Silica-Biomolecule Interactions by Solid-State NMR and Molecular Dynamics Simulations.

    PubMed

    Brückner, Stephan Ingmar; Donets, Sergii; Dianat, Arezoo; Bobeth, Manfred; Gutiérrez, Rafael; Cuniberti, Gianaurelio; Brunner, Eike

    2016-11-08

    Understanding the molecular interactions between inorganic phases such as silica and organic material is fundamental for chromatographic applications, for tailoring silica-enzyme interactions, and for elucidating the mechanisms of biomineralization. The formation, structure, and properties of the organic/inorganic interface are crucial in this context. Here, we investigate the interaction of selectively 13C-labeled choline with 29Si-labeled monosilicic acid/silica at the molecular level. Silica/choline nanocomposites were analyzed by solid-state NMR spectroscopy in combination with extended molecular dynamics (MD) simulations to understand the silica/organic interface. Cross-polarization magic angle spinning (CP MAS)-based NMR experiments like 1H-13C CP-REDOR (rotational-echo double resonance), 1H-13C HETCOR (heteronuclear correlation), and 1H-29Si-1H double CP are employed to determine spatial parameters. The measurement of 29Si-13C internuclear distances for selectively 13C-labeled choline provides an experimental parameter that allows the direct verification of MD simulations. Atomistic modeling using classical MD methodologies is performed using the INTERFACE force field. The modeling results are in excellent agreement with the experimental data and reveal the relevant molecular conformations as well as the nature and interplay of the interactions between the choline cation and the silica surface. Electrostatic interactions and hydrogen bonding are both important and depend strongly on the hydration level as well as the charge state of the silica surface.

  8. Neutral-neutral and neutral-ion collision integrals for Y2O3-Ar plasma system

    NASA Astrophysics Data System (ADS)

    Dhamale, Gayatri D.; Nath, Swastik; Mathe, Vikas L.; Ghorui, Srikumar

    2017-06-01

    A detailed investigation of the neutral-neutral and neutral-ion collision integrals is reported for Y2O3-Ar plasma, an important functional-material system with unique properties and a wide range of processing applications. The calculated integrals are an indispensable prerequisite for estimating the transport properties needed in CFD modelling of the associated plasma processes. Polarizability plays an important role in determining the integral values. Ambiguities in selecting appropriate polarizability data from the literature and in calculating the effective number of electrons contributing to the polarizability of the ionized species are addressed. The integrals are evaluated using a Lennard-Jones-like phenomenological potential up to (l,s) = (4,4). The interaction potential used is suitable for both neutral-neutral and neutral-ion interactions. For atom-parent ion interactions, the contribution from the inelastic resonant charge-transfer process is properly accounted for together with that from the elastic counterpart. A total of 14 interacting species and 60 different interactions are considered. Key contributing factors, such as the basic electronic properties of the interacting species and the associated polarizability values, are carefully accounted for. The adopted methodology is first benchmarked against data reported in the literature and then applied to the Y2O3-Ar plasma system for estimating the collision integrals. Results are presented in the temperature range of 100 K-100 000 K.

  9. A reliability evaluation methodology for memory chips for space applications when sample size is small

    NASA Technical Reports Server (NTRS)

    Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.

    2003-01-01

    This paper presents a reliability evaluation methodology for obtaining statistical reliability information on memory chips for space applications when the test sample size must be kept small because of the high cost of radiation-hardened memories.
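The abstract does not detail the statistics involved; as one standard way to extract reliability information from a small test sample, the sketch below computes a Clopper-Pearson upper confidence bound on the failure probability. This is an illustrative stand-in, not necessarily the paper's methodology.

```python
import numpy as np
from scipy import stats

def failure_rate_upper_bound(n_tested, n_failed, confidence=0.95):
    """Clopper-Pearson upper confidence bound on the true failure
    probability after observing n_failed failures among n_tested chips."""
    if n_failed >= n_tested:
        return 1.0
    # The exact upper bound is a quantile of a Beta distribution.
    return float(stats.beta.ppf(confidence, n_failed + 1, n_tested - n_failed))
```

With zero failures in 20 parts, the 95% bound reduces to 1 - 0.05**(1/20) ≈ 0.139, showing why small samples only weakly constrain the failure rate even when every part passes.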

  10. SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY

    DTIC Science & Technology

    Section I refers to the possibility of applying the theory and methodology of Project Outcomes to problems of strategic information. It is felt that ... purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of

  11. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  12. A new methodology to integrate planetary quarantine requirements into mission planning, with application to a Jupiter orbiter

    NASA Technical Reports Server (NTRS)

    Howard, R. A.; North, D. W.; Pezier, J. P.

    1975-01-01

    A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.

  13. Interactional Psychology and Organizational Behavior.

    DTIC Science & Technology

    1982-02-01

    effect and then argue, persuasively, that situations control behavior. The fact that actual experimental treatments are typically non-representative of ... Interactional psychology, organizational design, organization theory, person x situation interaction, work socialization, person-environment interaction ... and methodological underpinnings of situationism, and (2) the presentation of the interactionist perspective. For purposes of the present paper

  14. 3D Inhabited Virtual Worlds: Interactivity and Interaction between Avatars, Autonomous Agents, and Users.

    ERIC Educational Resources Information Center

    Jensen, Jens F.

    This paper addresses some of the central questions currently related to 3-Dimensional Inhabited Virtual Worlds (3D-IVWs), their virtual interactions, and communication, drawing from the theory and methodology of sociology, interaction analysis, interpersonal communication, semiotics, cultural studies, and media studies. First, 3D-IVWs--seen as a…

  15. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the linear-quadratic-Gaussian with loop-transfer-recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired target feedback loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
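A full LQG/LTR design also involves a Kalman-filter target loop and the recovery step; the sketch below shows only the regulator-gain computation plus the plant scaling the abstract emphasizes, using SciPy's Riccati solver. The example matrices are illustrative, not the turbofan model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """State-feedback gain K = R^-1 B^T P from the continuous-time
    algebraic Riccati equation; u = -K x stabilizes dx/dt = A x + B u."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

def scale_plant(A, B, Sx, Su):
    """Rescale states (x = Sx x_s) and inputs (u = Su u_s) so all
    signals have comparable magnitudes before shaping the target loop."""
    Sx_inv = np.linalg.inv(Sx)
    return Sx_inv @ A @ Sx, Sx_inv @ B @ Su
```

Scaling changes the closed-loop singular values used in robustness evaluation, which is the abstract's point, but a well-posed LQR problem remains stabilizing in either coordinate system.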

  17. Network representation of protein interactions: Theory of graph description and analysis.

    PubMed

    Kurzbach, Dennis

    2016-09-01

    A methodological framework is presented for the graph theoretical interpretation of NMR data of protein interactions. The proposed analysis generalizes the idea of network representations of protein structures by expanding it to protein interactions. This approach is based on regularization of residue-resolved NMR relaxation times and chemical shift data and subsequent construction of an adjacency matrix that represents the underlying protein interaction as a graph or network. The network nodes represent protein residues. Two nodes are connected if two residues are functionally correlated during the protein interaction event. The analysis of the resulting network enables the quantification of the importance of each amino acid of a protein for its interactions. Furthermore, the determination of the pattern of correlations between residues yields insights into the functional architecture of an interaction. This is of special interest for intrinsically disordered proteins, since the structural (three-dimensional) architecture of these proteins and their complexes is difficult to determine. The power of the proposed methodology is demonstrated using the example of the interaction between the intrinsically disordered protein osteopontin and its natural ligand heparin. © 2016 The Protein Society.
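As a simplified stand-in for the regularized adjacency construction described above, the sketch below connects residues whose per-residue profiles are strongly correlated and scores residue importance by node degree. The correlation threshold and the toy data are illustrative, not the paper's regularization scheme.

```python
import numpy as np

def interaction_network(features, threshold=0.8):
    """Build a residue network: features is a (residues x observables)
    matrix of per-residue NMR data (e.g. relaxation times, chemical-shift
    changes). Residues with strongly correlated profiles are connected."""
    C = np.corrcoef(features)
    A = (np.abs(C) >= threshold).astype(int)
    np.fill_diagonal(A, 0)  # no self-edges
    return A

def degree_centrality(A):
    """Number of functional partners per residue, a simple importance score."""
    return A.sum(axis=1)
```

Residues whose profiles move together across the interaction event end up linked, so a high-degree node flags an amino acid that is functionally coupled to many others.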

  18. Development of Risk Assessment Methodology for Land Application and Distribution and Marketing of Municipal Sludge

    EPA Science Inventory

    This is one of a series of reports that present methodologies for assessing the potential risks to humans or other organisms from the disposal or reuse of municipal sludge. The sludge management practices addressed by this series include land application practices, distribution a...

  19. Multilevel Modeling: A Review of Methodological Issues and Applications

    ERIC Educational Resources Information Center

    Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.

    2009-01-01

    This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…

  20. Selection of higher order regression models in the analysis of multi-factorial transcription data.

    PubMed

    Prazeres da Costa, Olivia; Hoffman, Arthur; Rey, Johannes W; Mansmann, Ulrich; Buch, Thorsten; Tresch, Achim

    2014-01-01

    Many studies examine gene expression data that has been obtained under the influence of multiple factors, such as genetic background, environmental conditions, or exposure to diseases. The interplay of multiple factors may lead to effect modification and confounding. Higher order linear regression models can account for these effects. We present a new methodology for linear model selection and apply it to microarray data of bone marrow-derived macrophages. This experiment investigates the influence of three variable factors: the genetic background of the mice from which the macrophages were obtained, Yersinia enterocolitica infection (two strains, and a mock control), and treatment/non-treatment with interferon-γ. We set up four different linear regression models in a hierarchical order. We introduce the eruption plot as a new practical tool for model selection complementary to global testing. It visually compares the size and significance of effect estimates between two nested models. Using this methodology we were able to select the most appropriate model by keeping only relevant factors showing additional explanatory power. Application to experimental data allowed us to qualify the interaction of factors as either neutral (no interaction), alleviating (co-occurring effects are weaker than expected from the single effects), or aggravating (stronger than expected). We find a biologically meaningful gene cluster of putative C2TA target genes that appear to be co-regulated with MHC class II genes. We introduced the eruption plot as a tool for visual model comparison to identify relevant higher order interactions in the analysis of expression data obtained under the influence of multiple factors. We conclude that model selection in higher order linear regression models should generally be performed for the analysis of multi-factorial microarray data.
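The eruption plot above is a visual tool; the numerical counterpart of comparing two nested linear models is a standard F-test on the extra (e.g. interaction) columns, sketched below with illustrative synthetic data rather than the macrophage microarray data.

```python
import numpy as np
from scipy import stats

def nested_f_test(y, X_reduced, X_full):
    """F-test of whether the full model (extra columns, e.g. interaction
    terms) explains significantly more variance than the reduced model."""
    def fit_rss(X):
        X = np.column_stack([np.ones(len(y)), X])  # add intercept
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return float(resid @ resid), X.shape[1]

    rss_r, p_r = fit_rss(X_reduced)
    rss_f, p_f = fit_rss(X_full)
    df1, df2 = p_f - p_r, len(y) - p_f
    F = ((rss_r - rss_f) / df1) / (rss_f / df2)
    return F, float(stats.f.sf(F, df1, df2))
```

When the data contain a genuine interaction effect, adding the interaction column yields a large F statistic and a small p-value, which is the criterion for keeping the higher-order model.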

  1. How can we make progress with decision support systems in landscape and river basin management? Lessons learned from a comparative analysis of four different decision support systems.

    PubMed

    Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T H; Seppelt, Ralf

    2010-12-01

    This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, appropriate and methodologically sound stakeholder interaction and the definition of 'what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.

  3. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  4. An Application of the Methodology for Assessment of the Sustainability of Air Transport System

    NASA Technical Reports Server (NTRS)

    Janic, Milan

    2003-01-01

    An assessment and operationalization of the concept of a sustainable air transport system is recognized as an important but complex research, operational and policy task. Within the scope of academic efforts to properly address the problem, this paper aims to assess the sustainability of the air transport system. In particular, the paper describes the methodology for assessing sustainability and its potential application. The methodology consists of indicator systems that relate to the operational, economic, social and environmental dimensions of air transport system performance. The particular indicator systems are relevant for particular actors, such as users (air travellers), air transport operators, aerospace manufacturers, local communities, governmental authorities at different levels (local, national, international), international air transport associations, pressure groups and the public. To illustrate the application of the methodology, specific cases are selected to estimate the particular indicators and thus to assess the system's sustainability under given conditions.

  5. Interaction Mechanisms between Air Bubble and Molybdenite Surface: Impact of Solution Salinity and Polymer Adsorption.

    PubMed

    Xie, Lei; Wang, Jingyi; Yuan, Duowei; Shi, Chen; Cui, Xin; Zhang, Hao; Liu, Qi; Liu, Qingxia; Zeng, Hongbo

    2017-03-07

    The surface characteristics of molybdenite (MoS2), such as wettability and surface interactions, have attracted much research interest in a wide range of engineering applications, such as froth flotation. In this work, a bubble-probe atomic force microscope (AFM) technique was employed to directly measure the interaction forces between an air bubble and a molybdenite mineral surface before/after polymer (i.e., guar gum) adsorption treatment. The AFM imaging showed that the polymer coverage on the surface of molybdenite could reach ∼5.6, ∼44.5, and ∼100% after conditioning in 1, 5, and 10 ppm polymer solution, respectively, which coincided with the polymer coverage results based on contact angle measurements. The electrolyte concentration and surface treatment by polymer adsorption were found to significantly affect bubble-mineral interaction and attachment. The experimental force results on bubble-molybdenite (without polymer treatment) agreed well with calculations using a theoretical model based on the Reynolds lubrication theory and the augmented Young-Laplace equation including the effect of disjoining pressure. The overall surface repulsion was enhanced when the NaCl concentration decreased from 100 to 1 mM, which inhibited bubble-molybdenite attachment. After conditioning the molybdenite surface in 1 ppm polymer solution, it was more difficult for air bubbles to attach to the molybdenite surface due to the weakened hydrophobic interaction with a shorter decay length. Increasing the polymer concentration to 5 ppm effectively inhibited bubble attachment on the mineral surface, mainly due to the much reduced hydrophobic interaction as well as the additional steric repulsion between the extended polymer chains and the bubble surface. The results provide quantitative information on the interaction mechanism between air bubbles and molybdenite mineral surfaces on the nanoscale, with useful implications for the development of effective polymer depressants and a fundamental understanding of bubble-solid interactions in mineral flotation. The methodologies used in this work can be readily extended to studying similar interfacial interactions in many other engineering applications such as froth flotation deinking and bitumen extraction in the oil sands industry.

  6. Modeling Negotiation by a Participatory Approach

    NASA Astrophysics Data System (ADS)

    Torii, Daisuke; Ishida, Toru; Bousquet, François

    In a participatory approach used by social scientists, role-playing games (RPGs) are effective for understanding the real thinking and behavior of stakeholders, but RPGs are not sufficient for handling a dynamic process like negotiation. In this study, a participatory simulation in which user-controlled avatars and autonomous agents coexist is introduced into the participatory approach for modeling negotiation. To establish a modeling methodology for negotiation, we tackled the following two issues. First, to enable domain experts to concentrate on interaction design for participatory simulation, we adopted an architecture in which an interaction layer controls agents, and defined three types of interaction descriptions (interaction protocol, interaction scenario and avatar control scenario). Second, to enable domain experts and stakeholders to capitalize on participatory simulation, we established a four-step process for acquiring a negotiation model: 1) surveys of and interviews with stakeholders, 2) RPG, 3) interaction design, and 4) participatory simulation. Finally, we discuss our methodology through a case study in agricultural economics in northeast Thailand.

  7. Electrical tweezer for highly parallelized electrorotation measurements over a wide frequency bandwidth.

    PubMed

    Rohani, Ali; Varhue, Walter; Su, Yi-Hsuan; Swami, Nathan S

    2014-07-01

    Electrorotation (ROT) is a powerful tool for characterizing the dielectric properties of cells and bioparticles. However, its application has been somewhat limited by the need to mitigate disruptions to particle rotation by translation under positive DEP and by frictional interactions with the substrate. While these disruptions may be overcome by implementing particle positioning schemes or field cages, these methods restrict the frequency bandwidth to the negative DEP range and permit only single particle measurements within a limited spatial extent of the device geometry away from field nonuniformities. Herein, we present an electrical tweezer methodology based on a sequence of electrical signals, composed of negative DEP using 180-degree phase-shifted fields for trapping and levitation of the particles, followed by 90-degree phase-shifted fields over a wide frequency bandwidth for highly parallelized electrorotation measurements. Through field simulations of the rotating electrical field under this wave-sequence, we illustrate the enhanced spatial extent for electrorotation measurements, with no limitations to frequency bandwidth. We apply this methodology to characterize subtle modifications in morphology and electrophysiology of Cryptosporidium parvum with varying degrees of heat treatment, in terms of shifts in the electrorotation spectra over the 0.05-40 MHz region. Given the single particle sensitivity and the ability for highly parallelized electrorotation measurements, we envision its application toward characterizing heterogeneous subpopulations of microbial and stem cells. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
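The frequency dependence of the electrorotation torque is commonly summarized by -Im[K(ω)] of the Clausius-Mossotti factor; the sketch below computes this spectrum for a homogeneous sphere, a textbook simplification rather than the paper's oocyst model, and the permittivity/conductivity values in the usage example are illustrative.

```python
import numpy as np

def rot_spectrum(freqs, eps_p, sig_p, eps_m, sig_m):
    """-Im[K(w)] of the Clausius-Mossotti factor for a homogeneous
    sphere; its sign sets the electrorotation direction and its
    magnitude scales the torque. eps/sig are the real permittivity
    (F/m) and conductivity (S/m) of particle (p) and medium (m)."""
    w = 2 * np.pi * np.asarray(freqs, dtype=float)
    cp = eps_p - 1j * sig_p / w  # complex permittivity of particle
    cm = eps_m - 1j * sig_m / w  # complex permittivity of medium
    K = (cp - cm) / (cp + 2 * cm)
    return -K.imag
```

Sweeping the frequency traces out the Maxwell-Wagner dispersion peak whose position and height shift when the particle's electrophysiology changes, which is the quantity the ROT measurements characterize.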

  9. Astronomy education through interactive materials

    NASA Astrophysics Data System (ADS)

    Voelzke, Marcos Rincon; Antunes de Macêdo, Josué

    2015-08-01

    This study presents the results of a survey conducted at the Federal Institute of Education, Science and Technology of the North of Minas Gerais (IFNMG), which aimed to investigate the potential of interactive materials in the teaching of astronomy. An advanced training course involving learning activities on basic concepts of astronomy was offered to thirty-two licentiate students in Physics, Mathematics, and Biological Sciences, using a mixed methodology combined with the three pedagogical moments. Among other findings, the study showed the viability of using digital technologies and interactive materials in the teaching of astronomy, which may broaden the methodological options available to future teachers and meet their training needs.

  11. A Methodology and Implementation for Annotating Digital Images for Context-appropriate Use in an Academic Health Care Environment

    PubMed Central

    Goede, Patricia A.; Lauman, Jason R.; Cochella, Christopher; Katzman, Gregory L.; Morton, David A.; Albertine, Kurt H.

    2004-01-01

    Use of digital medical images has become common over the last several years, coincident with the release of inexpensive, mega-pixel quality digital cameras and the transition to digital radiology operation by hospitals. One problem that clinicians, medical educators, and basic scientists encounter when handling images is the difficulty of using business and graphic arts commercial-off-the-shelf (COTS) software in multicontext authoring and interactive teaching environments. The authors investigated and developed software-supported methodologies to help clinicians, medical educators, and basic scientists become more efficient and effective in their digital imaging environments. The software that the authors developed provides the ability to annotate images based on a multispecialty methodology for annotation and visual knowledge representation. This annotation methodology is designed by consensus, with contributions from the authors and physicians, medical educators, and basic scientists in the Departments of Radiology, Neurobiology and Anatomy, Dermatology, and Ophthalmology at the University of Utah. The annotation methodology functions as a foundation for creating, using, reusing, and extending dynamic annotations in a context-appropriate, interactive digital environment. The annotation methodology supports the authoring process as well as output and presentation mechanisms. The annotation methodology is the foundation for a Windows implementation that allows annotated elements to be represented as structured eXtensible Markup Language and stored separate from the image(s). PMID:14527971

  12. Cycle time reduction using lean six sigma in make-to-order (MTO) environment: Conceptual framework

    NASA Astrophysics Data System (ADS)

    Man, Siti Mariam; Zain, Zakiyah; Nawawi, Mohd Kamal Mohd

    2015-12-01

    This paper outlines the framework for application of lean six sigma (LSS) methodology to improve semiconductor assembly cycle time in a make-to-order (MTO) business environment. The cycle time reduction is the prime objective in the context of an overall productivity improvement particularly in the MTO environment. The interaction of the production rate and cycle time is described, while the emphasis is on Define-Measure-Analyze-Improve-Control (DMAIC) and Plan-Do-Check-Act (PDCA) activities. A framework for the conceptual understanding is provided along with practical implementation issues. A relevant measure for the degree of flexibility (DOF) in the context of quick setup is also discussed.

  13. The contribution of statistical physics to evolutionary biology.

    PubMed

    de Vladar, Harold P; Barton, Nicholas H

    2011-08-01

    Evolutionary biology shares many concepts with statistical physics: both deal with populations, whether of molecules or organisms, and both seek to simplify evolution in very many dimensions. Often, methodologies have undergone parallel and independent development, as with stochastic methods in population genetics. Here, we discuss aspects of population genetics that have embraced methods from physics: non-equilibrium statistical mechanics, travelling waves and Monte-Carlo methods, among others, have been used to study polygenic evolution, rates of adaptation and range expansions. These applications indicate that evolutionary biology can further benefit from interactions with other areas of statistical physics; for example, by following the distribution of paths taken by a population through time. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Supramolecular Differentiation for Construction of Anisotropic Fullerene Nanostructures by Time-Programmed Control of Interfacial Growth.

    PubMed

    Bairi, Partha; Minami, Kosuke; Hill, Jonathan P; Nakanishi, Waka; Shrestha, Lok Kumar; Liu, Chao; Harano, Koji; Nakamura, Eiichi; Ariga, Katsuhiko

    2016-09-27

    Supramolecular assembly can be used to construct a wide variety of ordered structures by exploiting the cumulative effects of multiple noncovalent interactions. However, the construction of anisotropic nanostructures remains subject to some limitations. Here, we demonstrate the preparation of anisotropic fullerene-based nanostructures by supramolecular differentiation, which is the programmed control of multiple assembly strategies. We have carefully combined interfacial assembly and local phase separation phenomena. Two fullerene derivatives, PhH and C12H, were together formed into self-assembled anisotropic nanostructures by using this approach. This technique is applicable for the construction of anisotropic nanostructures without requiring complex molecular design or complicated methodology.

  15. Using Networks To Understand Medical Data: The Case of Class III Malocclusions

    PubMed Central

    Scala, Antonio; Auconi, Pietro; Scazzocchio, Marco; Caldarelli, Guido; McNamara, James A.; Franchi, Lorenzo

    2012-01-01

    A system of elements that interact with or regulate each other can be represented by a mathematical object called a network. While network analysis has been successfully applied to high-throughput biological systems, less has been done regarding its application in more applied fields of medicine; here we show an application based on standard medical diagnostic data. We apply network analysis to Class III malocclusion, one of the most difficult orofacial anomalies to understand and treat. We hypothesize that different interactions of the skeletal components can contribute to pathological disequilibrium; in order to test this hypothesis, we apply network analysis to 532 Class III young female patients. The topology of the Class III malocclusion obtained by network analysis shows a strong co-occurrence of abnormal skeletal features. The pattern of these occurrences influences the vertical and horizontal balance of disharmony in skeletal form and position. Patients with more unbalanced orthodontic phenotypes show a preponderance of the pathological skeletal nodes and minor relevance of adaptive dentoalveolar equilibrating nodes. Furthermore, by applying Power Graph analysis we identify some functional modules among orthodontic nodes. These modules correspond to groups of tightly inter-related features and presumably constitute the key regulators of plasticity and the sites of unbalance of the growing dentofacial Class III system. The data of the present study show that, at their most basic abstraction level, orofacial characteristics can be represented as graphs, using nodes to represent orthodontic characteristics and edges to represent their various types of interactions. The applications of this mathematical model could improve the interpretation of quantitative, patient-specific information and help to better target therapy. Last but not least, the methodology we have applied in analyzing orthodontic features can easily be applied to other fields of medical science. PMID:23028552
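The core abstraction described above (features as nodes, co-occurrence as weighted edges) can be sketched in a few lines. This is an illustrative reading of the approach, not the authors' code; the feature names and patient records below are hypothetical.

```python
# Minimal sketch of a feature co-occurrence network: nodes are orthodontic
# features, edge weights count how often two abnormal features co-occur
# across patients. Data below are hypothetical.
from collections import Counter
from itertools import combinations

def cooccurrence_network(patients):
    """patients: list of sets of abnormal features observed per patient.
    Returns a Counter mapping sorted feature pairs to co-occurrence counts."""
    edges = Counter()
    for features in patients:
        for pair in combinations(sorted(features), 2):
            edges[pair] += 1
    return edges

patients = [
    {"maxillary retrusion", "mandibular protrusion"},
    {"mandibular protrusion", "anterior crossbite", "maxillary retrusion"},
    {"anterior crossbite", "mandibular protrusion"},
]
net = cooccurrence_network(patients)
print(net[("mandibular protrusion", "maxillary retrusion")])  # co-occur in 2 patients
```

In a real analysis, the resulting edge list would feed a graph library for topology and module detection; here the Counter alone shows the strongly co-occurring pairs.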

  16. Changes in the reflectance of ex situ leaves: A methodological approach

    NASA Astrophysics Data System (ADS)

    Ponzoni, Flavio Jorge; Inoe, Mario Takao

    1992-04-01

    The main aspects of the interaction between electromagnetic radiation and detached leaves are presented. An experiment with detached Eucalyptus and Araucaria leaves is described, including the methodologies used for collecting and storing the reflectance data.

  17. Methodologies and Methods for User Behavioral Research.

    ERIC Educational Resources Information Center

    Wang, Peiling

    1999-01-01

    Discusses methodological issues in empirical studies of information-related behavior in six specific research areas: information needs and uses; information seeking; relevance judgment; online searching (including online public access catalog, online database, and the Web); human-system interactions; and reference transactions. (Contains 191…

  18. Study design options in evaluating gene-environment interactions: practical considerations for a planned case-control study of pediatric leukemia.

    PubMed

    Goodman, Michael; Dana Flanders, W

    2007-04-01

    We compare methodological approaches for evaluating gene-environment interaction, using a planned study of pediatric leukemia as a practical example. We considered three design options: a full case-control study (Option I), a case-only study (Option II), and a partial case-control study (Option III), in which information on controls is limited to environmental exposure only. For each design option we determined its ability to measure the main effects of environmental factor E and genetic factor G, and the interaction between E and G. Using the leukemia study example, we calculated the sample sizes required to detect an odds ratio (OR) of 2.0 for E alone, an OR of 10 for G alone, and a G x E interaction OR of 3. Option I allows measuring both main effects and interaction, but requires a total sample size of 1,500 cases and 1,500 controls. Option II allows measuring only interaction, but requires just 121 cases. Option III allows calculating the main effect of E and the interaction, but not the main effect of G, and requires a total of 156 cases and 133 controls. In this case, the partial case-control study (Option III) appears to be more efficient with respect to its ability to answer the research questions for the amount of resources required. The design options considered in this example are not limited to observational epidemiology and may be applicable in studies of pharmacogenomics, survivorship, and other areas of pediatric ALL research.
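The case-only estimate mentioned for Option II has a simple closed form: under the assumption that G and E are independent in the source population, the multiplicative interaction OR can be estimated from cases alone as a cross-product ratio. The counts below are hypothetical, purely for illustration.

```python
# Case-only estimate of gene-environment interaction (multiplicative scale).
# Valid only under the assumption that G and E are independent in the
# source population. Counts are hypothetical.

def case_only_interaction_or(a, b, c, d):
    """a: G+E+ cases, b: G+E- cases, c: G-E+ cases, d: G-E- cases.
    Returns the case-only interaction odds ratio (a*d)/(b*c)."""
    return (a * d) / (b * c)

# Hypothetical cross-classification of 121 cases:
or_ge = case_only_interaction_or(30, 10, 27, 54)
print(round(or_ge, 2))  # 6.0
```

This is why Option II needs so few subjects: no control series is collected, at the price of losing the main effects and depending on the G-E independence assumption.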

  19. Rhythm Patterns Interaction - Synchronization Behavior for Human-Robot Joint Action

    PubMed Central

    Mörtl, Alexander; Lorenz, Tamara; Hirche, Sandra

    2014-01-01

    Interactive behavior among humans is governed by the dynamics of movement synchronization in a variety of repetitive tasks. This requires the interaction partners to perform, for example, rhythmic limb swinging or even goal-directed arm movements. Inspired by that essential feature of human interaction, we present a novel concept and design methodology to synthesize goal-directed synchronization behavior for robotic agents in repetitive joint action tasks. The agents’ tasks are described by closed movement trajectories and interpreted as limit cycles, for which instantaneous phase variables are derived based on oscillator theory. Events segmenting the trajectories into multiple primitives are introduced as anchoring points for enhanced synchronization modes. Utilizing both continuous phases and discrete events in a unifying view, we design a continuous dynamical process synchronizing the derived modes. Inverse to the derivation of phases, we also address the generation of goal-directed movements from the behavioral dynamics. The developed concept is implemented on an anthropomorphic robot. For evaluation of the concept, an experiment is designed and conducted in which the robot performs a prototypical pick-and-place task jointly with human partners. The effectiveness of the designed behavior is successfully evidenced by objective measures of phase and event synchronization. Feedback gathered from the participants of our exploratory study suggests a subjectively pleasant sense of interaction created by the interactive behavior. The results highlight potential applications of the synchronization concept both in motor coordination among robotic agents and in enhanced social interaction between humanoid agents and humans. PMID:24752212
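One common way to obtain an instantaneous phase for a closed movement trajectory, as oscillator-based approaches often do, is to center the trajectory and take the unwrapped polar angle. The sketch below illustrates this idea only; it is an assumption-laden simplification, not the authors' implementation.

```python
# Illustrative sketch: instantaneous phase of a closed 2-D trajectory via the
# unwrapped polar angle around the trajectory centroid. Not the paper's method
# in detail, just the standard oscillator-theoretic starting point.
import math

def instantaneous_phase(points):
    """points: list of (x, y) samples along a closed trajectory.
    Returns an unwrapped phase (radians) per sample."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    raw = [math.atan2(y - cy, x - cx) for x, y in points]
    phase = [raw[0]]
    for a in raw[1:]:
        d = a - (phase[-1] % (2 * math.pi))
        # unwrap: keep increments within (-pi, pi]
        while d <= -math.pi:
            d += 2 * math.pi
        while d > math.pi:
            d -= 2 * math.pi
        phase.append(phase[-1] + d)
    return phase

# One full revolution on a unit circle accumulates ~2*pi of phase.
circle = [(math.cos(2 * math.pi * k / 100), math.sin(2 * math.pi * k / 100))
          for k in range(101)]
total = instantaneous_phase(circle)[-1] - instantaneous_phase(circle)[0]
```

Phases defined this way give each agent a scalar coordinate along its own limit cycle, which is what a synchronization controller can then couple across agents.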

  20. Understanding the Structure-Function Relationships of Dendrimers in Environmental and Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Wang, Bo

    We are living in an era wherein nanoparticles (NPs) are widely applied in our lives. Dendrimers are special polymeric NPs with unique physicochemical properties, which have been intensely explored for a variety of applications. Current studies on dendrimers are bottlenecked by an insufficient understanding of their structure and dynamic behavior at the molecular level. Using primarily computational approaches supplemented by a range of experimental techniques, this dissertation aims to establish structure-function relationships of dendrimers in environmental and biomedical applications. More specifically, it thoroughly investigates the interactions between dendrimers and different nanomaterials and biomolecules, including carbon-based NPs, metal-based NPs, and proteins/peptides. These results not only provide profound knowledge for evaluating the impacts of dendrimers on environmental and biological systems but also facilitate the design of next-generation functional polymeric nanomaterials. The dissertation is organized as follows. Chapter 1 provides an overview of current progress in dendrimer studies and introduces the methodology of Discrete Molecular Dynamics (DMD), my major research tool. Two directions for utilizing dendrimers are discussed in the following chapters. Chapter 2 focuses on environmental applications of dendrimers, where two back-to-back studies are presented. I start by describing some interesting experimental observations, i.e., dendrimers dispersing model oil molecules. I then reveal, through computational modeling, why different dendrimer surface chemistries lead to different remediation efficiencies. Finally, I demonstrate different scenarios of dendrimer-small molecule association. Chapter 3 centers on dendrimers in biomedical applications and includes two subtopics. In the first, I discuss dendrimers as surfactants that modulate the interactions between proteins and NPs; some fundamental concepts regarding NP-protein interactions, such as the NP-protein corona, are also explained. In the second, I look into amyloid protein aggregation mediated by dendrimers, which holds promise for combating amyloidogenic diseases. Chapter 4 concludes the dissertation and briefly introduces my ongoing projects and future research directions concerning dendrimers. This dissertation presents a systematic study of dendrimers in environmental and biomedical applications, which may provide valuable information for future dendrimer design and thus benefit nanobiotechnology.

  1. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

    An object-oriented methodology is proposed in this research to harmonize several different markup languages. First, we adopt the Unified Modeling Language (UML) as the data model to formalize the concept and process of harmonization between eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on this data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology, which can be generalized to various application domains.

  2. Applications of decision analysis and related techniques to industrial engineering problems at KSC

    NASA Technical Reports Server (NTRS)

    Evans, Gerald W.

    1995-01-01

    This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).

  3. Crossing trend analysis methodology and application for Turkish rainfall records

    NASA Astrophysics Data System (ADS)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, the regression line, and Şen's innovative trend analysis, and the literature has many papers on the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and without any assumption about the type of the probability distribution function. The validity of the method is demonstrated through an extensive Monte Carlo simulation technique and a comparison with other existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
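The crossing-count criterion can be sketched compactly: among candidate trend lines through the series centroid, prefer the slope whose residuals change sign most often. This is a simplified reading of the idea under stated assumptions (a finite slope grid, synthetic data), not Şen's published procedure.

```python
# Sketch of crossing-based trend selection: the trend line through the
# centroid whose residuals cross zero the most is preferred. Slope grid
# and data are illustrative.

def crossings(residuals):
    """Number of sign changes (up- plus down-crossings) in a residual series."""
    signs = [1 if r >= 0 else -1 for r in residuals]
    return sum(1 for s0, s1 in zip(signs, signs[1:]) if s0 != s1)

def best_crossing_slope(y, slopes):
    """Pick the slope (line through the series centroid) maximizing crossings."""
    n = len(y)
    tc, yc = (n - 1) / 2.0, sum(y) / n  # centroid of (time, value) pairs
    return max(slopes, key=lambda m: crossings(
        [y[t] - (yc + m * (t - tc)) for t in range(n)]))

y = [0.1, 1.2, 0.3, 1.9, 1.1, 2.8, 1.9, 3.4]  # noisy upward drift
slope = best_crossing_slope(y, [i / 10 for i in range(11)])
```

A slope near the true drift detrends the series best, leaving residuals that oscillate around zero and therefore cross it most often; a distribution-free criterion, as the abstract emphasizes.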

  4. Assessment of Methodological Quality of Economic Evaluations in Belgian Drug Reimbursement Applications

    PubMed Central

    Simoens, Steven

    2013-01-01

    Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474

  6. Railroad classification yard design methodology study Elkhart Yard Rehabilitation : a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...

  7. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818

  8. Navy Community of Practice for Programmers and Developers

    DTIC Science & Technology

    2016-12-01

    execute cyber missions. The methodology employed in this research is human-centered design via a social interaction prototype, which allows us to learn...for Navy programmers and developers. Chapter V details the methodology used to design the proposed CoP. This chapter summarizes the results from...thirty years the term has evolved to incorporate ideas from numerous design methodologies and movements [57]. In the 1980s, revealed design began to

  9. Security Quality Requirements Engineering (SQUARE) Methodology

    DTIC Science & Technology

    2005-11-01

    such as Joint Application Development and the Accelerated Requirements Method [Wood 89, Hubbard 99] • Soft Systems Methodology [Checkland 89...investigated were misuse cases [Jacobson 92], Soft Systems Methodology (SSM) [Checkland 89], Quality Function Deployment (QFD) [QFD 05], Con- trolled...html (2005). [Checkland 89] Checkland, Peter. Soft Systems Methodology . Rational Analysis for a Problematic World. New York, NY: John Wiley & Sons

  10. Radioactive waste disposal fees-Methodology for calculation

    NASA Astrophysics Data System (ADS)

    Bemš, Július; Králík, Tomáš; Kubančák, Ján; Vašíček, Jiří; Starý, Oldřich

    2014-11-01

    This paper summarizes the methodological approach used to calculate fees for low- and intermediate-level radioactive waste disposal and for spent fuel disposal. The methodology itself is based on a simulation of the cash flows related to the operation of the waste disposal system. The paper includes a demonstration of the methodology's application under the conditions of the Czech Republic.

  11. Setting up the Interactive Educational Process in Higher Education

    ERIC Educational Resources Information Center

    Ponomariova, Olga Nikolaevna; Vasin?, Olga Nikolaevna

    2016-01-01

    This article aims to discuss the opportunities in the interactive teaching in higher education. The study presents the methodological approach of understanding the notions of "teaching technology" and "interactive teaching methods". The originality of the study consists in the authors' definition of the situation in "the…

  12. Application of Resource Description Framework to Personalise Learning: Systematic Review and Methodology

    ERIC Educational Resources Information Center

    Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus

    2017-01-01

    The paper is aimed to present a methodology of learning personalisation based on applying Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…

  13. Enhanced α-amylase production by a marine protist, Ulkenia sp. using response surface methodology and genetic algorithm.

    PubMed

    Shirodkar, Priyanka V; Muraleedharan, Usha Devi

    2017-11-26

    Amylases are a group of enzymes with a wide variety of industrial applications. Enhancement of α-amylase production from marine protists (thraustochytrids) has been attempted for the first time by applying statistically designed experiments, using response surface methodology (RSM) and a genetic algorithm (GA) to optimize the most influential process variables. A full factorial central composite experimental design was used to study the cumulative interactive effect of the nutritional components, viz., glucose, corn starch, and yeast extract. RSM was performed on two objectives, that is, growth of Ulkenia sp. AH-2 (ATCC® PRA-296) and α-amylase activity. When GA was applied to maximize the enzyme activity, the optimal α-amylase activity was found to be 71.20 U/mL, close to that obtained by RSM (71.93 U/mL); both were in agreement with the predicted value of 72.37 U/mL. Optimal growth at the optimized process variables was found to be 1.89 (A660 nm). The optimized medium increased α-amylase production by 1.2-fold.
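A full factorial central composite design for three factors, as used above, has a standard layout in coded units: 2^3 factorial points, 2x3 axial (star) points, and replicated center points. The sketch below generates that layout; the axial distance alpha and the number of center replicates are illustrative choices, not values from the paper.

```python
# Central composite design in coded units for k factors (here k = 3, e.g.
# glucose, corn starch, yeast extract). alpha and n_center are illustrative.
from itertools import product

def central_composite_design(k, alpha=1.682, n_center=6):
    """Coded design points: 2^k factorial, 2k axial (star), n_center centers."""
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            point = [0.0] * k
            point[i] = a
            axial.append(point)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

design = central_composite_design(3)
print(len(design))  # 8 factorial + 6 axial + 6 center = 20 runs
```

Each coded point is then mapped to actual concentrations, the responses (growth, enzyme activity) measured, and a quadratic response surface fitted, over which a GA or RSM optimizer can search.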

  14. Large Scale Many-Body Perturbation Theory calculations: methodological developments, data collections, validation

    NASA Astrophysics Data System (ADS)

    Govoni, Marco; Galli, Giulia

    Green's function based many-body perturbation theory (MBPT) methods are well established approaches to compute quasiparticle energies and electronic lifetimes. However, their application to large systems - for instance to heterogeneous systems, nanostructured, disordered, and defective materials - has been hindered by high computational costs. We will discuss recent MBPT methodological developments leading to an efficient formulation of electron-electron and electron-phonon interactions, and that can be applied to systems with thousands of electrons. Results using a formulation that does not require the explicit calculation of virtual states, nor the storage and inversion of large dielectric matrices will be presented. We will discuss data collections obtained using the WEST code, the advantages of the algorithms used in WEST over standard techniques, and the parallel performance. Work done in collaboration with I. Hamada, R. McAvoy, P. Scherpelz, and H. Zheng. This work was supported by MICCoM, as part of the Computational Materials Sciences Program funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division and by ANL.

  15. A Novel Multi-scale Simulation Strategy for Turbulent Reacting Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Sutherland C.

    In this project, a new methodology was proposed to bridge the gap between Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES). This novel methodology, titled Lattice-Based Multiscale Simulation (LBMS), creates a lattice structure of One-Dimensional Turbulence (ODT) models. ODT has been shown to capture turbulent combustion with high fidelity by fully resolving interactions between turbulence and diffusion. By creating a lattice of coupled ODT models, LBMS overcomes the principal shortcoming of ODT: its inability to capture large-scale, three-dimensional flow structures. By spacing the lattice lines sufficiently far apart, LBMS avoids the curse of dimensionality that creates the untenable computational costs associated with DNS. This project has shown that LBMS is capable of reproducing statistics of isotropic turbulent flows while coarsening the spacing between lines significantly. It also investigates and resolves issues that arise when coupling ODT lines, such as flux reconstruction perpendicular to a given ODT line, preservation of conserved quantities when eddies cross a coarse cell volume, and boundary condition application. Robust parallelization is also investigated.

  16. Testing Mean Differences among Groups: Multivariate and Repeated Measures Analysis with Minimal Assumptions

    PubMed Central

    Bathke, Arne C.; Friedrich, Sarah; Pauly, Markus; Konietschke, Frank; Staffen, Wolfgang; Strobl, Nicolas; Höller, Yvonne

    2018-01-01

    To date, there is a lack of satisfactory inferential techniques for the analysis of multivariate data in factorial designs, when only minimal assumptions on the data can be made. Presently available methods are limited to very particular study designs or assume either multivariate normality or equal covariance matrices across groups, or they do not allow for an assessment of the interaction effects across within-subjects and between-subjects variables. We propose and methodologically validate a parametric bootstrap approach that does not suffer from any of the above limitations, and thus provides a rather general and comprehensive methodological route to inference for multivariate and repeated measures data. As an example application, we consider data from two different Alzheimer’s disease (AD) examination modalities that may be used for precise and early diagnosis, namely, single-photon emission computed tomography (SPECT) and electroencephalogram (EEG). These data violate the assumptions of classical multivariate methods, and indeed classical methods would not have yielded the same conclusions with regard to some of the factors involved. PMID:29565679
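
    The core parametric-bootstrap idea can be sketched as below. This is a deliberate simplification: it uses a plain (unstandardized) Wald-type statistic rather than the authors' exact test statistic, and all data are synthetic. Each group is resampled from a normal distribution with its own estimated covariance, so no equal-covariance assumption is made.

```python
import numpy as np

rng = np.random.default_rng(3)

def wald_stat(groups):
    """Wald-type statistic for H0: all group mean vectors are equal."""
    means = np.array([g.mean(axis=0) for g in groups])
    grand = means.mean(axis=0)
    return sum(len(g) * np.sum((m - grand) ** 2)
               for g, m in zip(groups, means))

def parametric_bootstrap_p(groups, B=500):
    """Approximate the null distribution by resampling each group from
    N(0, S_i) with its own sample covariance S_i and sample size n_i."""
    obs = wald_stat(groups)
    covs = [np.cov(g, rowvar=False) for g in groups]
    ns = [len(g) for g in groups]
    d = groups[0].shape[1]
    null = []
    for _ in range(B):
        boot = [rng.multivariate_normal(np.zeros(d), S, n)
                for S, n in zip(covs, ns)]
        null.append(wald_stat(boot))
    # Add-one correction keeps the p-value strictly positive.
    return (np.sum(np.array(null) >= obs) + 1) / (B + 1)

# Two groups with unequal covariances but equal means: H0 holds.
g1 = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], 40)
g2 = rng.multivariate_normal([0, 0], [[4, 0], [0, 0.5]], 60)
p = parametric_bootstrap_p([g1, g2])
```

    A clearly shifted group would drive the statistic far above every bootstrap replicate, yielding a p-value near 1/(B+1).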

  17. Co-activation patterns in resting-state fMRI signals.

    PubMed

    Liu, Xiao; Zhang, Nanyin; Chang, Catie; Duyn, Jeff H

    2018-02-08

    The brain is a complex system that integrates and processes information across multiple time scales by dynamically coordinating activities over brain regions and circuits. Correlations in resting-state functional magnetic resonance imaging (rsfMRI) signals have been widely used to infer functional connectivity of the brain, providing a metric of functional associations that reflects a temporal average over an entire scan (typically several minutes or longer). Not until recently was the study of dynamic brain interactions at much shorter time scales (seconds to minutes) considered for inference of functional connectivity. One method proposed for this objective seeks to identify and extract recurring co-activation patterns (CAPs) that represent instantaneous brain configurations at single time points. Here, we review the development and recent advancement of CAP methodology and other closely related approaches, as well as their applications and associated findings. We also discuss the potential neural origins and behavioral relevance of CAPs, along with methodological issues and future research directions in the analysis of fMRI co-activation patterns. Copyright © 2018 Elsevier Inc. All rights reserved.
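
    A minimal sketch of the CAP procedure, seed-based frame selection followed by clustering of the retained frames, might look like this. The data, threshold, and cluster count are all toy choices for illustration, not values from the literature.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)
T, V = 600, 50  # time points and voxels (toy sizes)

# Toy rsfMRI data: two alternating "co-activation patterns" plus noise.
pat = rng.normal(0, 1, (2, V))
labels_true = rng.integers(0, 2, T)
data = pat[labels_true] * rng.gamma(2, 1, (T, 1)) + rng.normal(0, 0.5, (T, V))

# Step 1: z-score a seed time course and keep only supra-threshold frames.
seed = data[:, 0]
z = (seed - seed.mean()) / seed.std()
frames = data[z > 1.0]

# Step 2: cluster the retained frames; each centroid is one CAP.
caps, assign = kmeans2(frames, k=2, minit='++', seed=2)
```

    Averaging all supra-threshold frames would recover a conventional seed map; clustering them instead decomposes that average into distinct instantaneous configurations, which is the essence of the CAP approach.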

  18. Science as a service: understanding successful knowledge transfer in a New Zealand research institute.

    PubMed

    Moore, D; Bayne, K; Barnard, T

    2012-01-01

    This paper reports on an exercise conducted within a state-owned body (Crown Research Institute) in New Zealand aimed at building greater understanding of the key factors in successful research programmes. Success was defined in this study as a high level of uptake of the emerging science, with commensurate benefits to both industry and the community. The methodology had three parts: a review of the knowledge and technology transfer literature; a series of 15 semi-structured interviews with science leaders; and a facilitated workshop. The purpose of the review was to generate a robust framework upon which to centre the interview dialogues, and two models were selected. The results varied, reflecting the diversity of research services provided by the organization, but the findings were predominantly new and valuable. The importance of the long-term relationship with the end users was the strongest recurring theme. The methodology may have wider application in both research and consulting settings, for the benefits derived from the interactive process with staff as well as for the specific findings.

  19. A local quasicontinuum method for 3D multilattice crystalline materials: Application to shape-memory alloys

    NASA Astrophysics Data System (ADS)

    Sorkin, V.; Elliott, R. S.; Tadmor, E. B.

    2014-07-01

    The quasicontinuum (QC) method, in its local (continuum) limit, is applied to materials with a multilattice crystal structure. Cauchy-Born (CB) kinematics, which accounts for the shifts of the crystal motif, is used to relate atomic motions to continuum deformation gradients. To avoid failures of CB kinematics, QC is augmented with a phonon stability analysis that detects lattice period extensions and identifies the minimum required periodic cell size. This approach is referred to as Cascading Cauchy-Born kinematics (CCB). In this paper, the method is described and developed. It is then used, along with an effective interaction potential (EIP) model for shape-memory alloys, to simulate the shape-memory effect and pseudoelasticity in a finite specimen. The results of these simulations show that (i) the CCB methodology is an essential tool that is required in order for QC-type simulations to correctly capture the first-order phase transitions responsible for these material behaviors, and (ii) that the EIP model adopted in this work coupled with the QC/CCB methodology is capable of predicting the characteristic behavior found in shape-memory alloys.

  20. Not Your Same Old Story: New Rules for Thematic Apperceptive Techniques (TATs).

    PubMed

    Jenkins, Sharon Rae

    2017-01-01

    Stories told about pictures have been used for both research and clinical practice since the beginning of modern personality assessment. However, with the growing science-practice gap, these thematic apperceptive techniques (TATs) have been used differently in those 2 venues. Scientific validation is presumptively general, but clinical application is idiographic and situation-specific. A bridge is needed. The manualized human-scored narrative analysis systems discussed here are valuable scientist-practitioner tools, but they require a validation literature to support further research publication, maintain their role in clinical training, and justify clinicians' reimbursement by third-party payers. To facilitate wider understanding of manualized TAT methodologies, this article addresses long-standing criticisms of TAT reliability and proposes some strategic solutions to the measurement error problem for both researchers and clinicians, including analyzing person-situation interactions, purposeful situation sampling for within-storyteller comparisons, and uses of small samples. The new rules for TATs include conceptual and methodological standards that researchers should aim to meet and report, reviewers should apply to manuscripts, and clinical assessors can use to analyze their own data and justify third-party payment.

  1. Synchronous Videoconferencing in Distance Education for Pre-Licensure Nursing

    ERIC Educational Resources Information Center

    Scarbrough, John E.

    2015-01-01

    Current nursing education practices typically include methodologies for providing access to students located at a distance from the hosting institution. The majority of methodologies make use of asynchronous formatting in which communication occurs without the benefit of simultaneous, synchronous interaction. The increasing worldwide availability…

  2. A Theoretical and Methodological Evaluation of Leadership Research.

    ERIC Educational Resources Information Center

    Lashbrook, Velma J.; Lashbrook, William B.

    This paper isolates some of the strengths and weaknesses of leadership research by evaluating it from both a theoretical and methodological perspective. The seven theories or approaches examined are: great man, trait, situational, style, functional, social influence, and interaction positions. General theoretical, conceptual, and measurement…

  3. Scalar Implicatures in Child Language: Give Children a Chance

    ERIC Educational Resources Information Center

    Foppolo, Francesca; Guasti, Maria Teresa; Chierchia, Gennaro

    2012-01-01

    Children's pragmatic competence in deriving conversational implicatures (and scalar implicatures in particular) offers an intriguing standpoint to explore how developmental, methodological, and purely theoretical perspectives interact and feed each other. In this paper, we focus mainly on developmental and methodological issues, showing that…

  4. Calculation and mitigation of isotopic interferences in liquid chromatography-mass spectrometry/mass spectrometry assays and its application in supporting microdose absolute bioavailability studies.

    PubMed

    Gu, Huidong; Wang, Jian; Aubry, Anne-Françoise; Jiang, Hao; Zeng, Jianing; Easter, John; Wang, Jun-sheng; Dockens, Randy; Bifano, Marc; Burrell, Richard; Arnold, Mark E

    2012-06-05

    A methodology for the accurate calculation and mitigation of isotopic interferences in liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS) assays and its application in supporting microdose absolute bioavailability studies are reported for the first time. For simplicity, this calculation methodology and the strategy to minimize the isotopic interference are demonstrated using a simple molecular entity, then applied to actual development drugs. The exact isotopic interferences calculated with this methodology were often much less than the traditionally used, overestimated isotopic interferences simply based on the molecular isotope abundance. One application of the methodology is the selection of a stable isotopically labeled internal standard (SIL-IS) for an LC-MS/MS bioanalytical assay. The second application is the selection of an SIL analogue for use in intravenous (i.v.) microdosing for the determination of absolute bioavailability. In the case of microdosing, the traditional approach of calculating isotopic interferences can result in selecting a labeling scheme that overlabels the i.v.-dosed drug or leads to incorrect conclusions on the feasibility of using an SIL drug and analysis by LC-MS/MS. The methodology presented here can guide the synthesis by accurately calculating the isotopic interferences when labeling at different positions, using different selective reaction monitoring (SRM) transitions or adding more labeling positions. This methodology has been successfully applied to the selection of the labeled i.v.-dosed drugs for use in two microdose absolute bioavailability studies, before initiating the chemical synthesis. With this methodology, significant time and cost savings can be achieved in supporting microdose absolute bioavailability studies with stable labeled drugs.
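
    The kind of calculation involved can be illustrated by convolving per-element natural-isotope distributions to obtain a molecule's M+n abundances; the ratio p[n]/p[0] then approximates the unlabeled drug's interference into an (M+n)-labeled SRM channel. The formula below is a hypothetical example, and fragment-specific SRM effects are ignored.

```python
import numpy as np

# Natural isotope mass-shift distributions: abundance at M, M+1, (M+2).
ISO = {
    'C': [0.9893, 0.0107],
    'H': [0.999885, 0.000115],
    'N': [0.99636, 0.00364],
    'O': [0.99757, 0.00038, 0.00205],
}

def isotope_pattern(formula, max_shift=6):
    """Convolve per-atom distributions to get P(mass shift = n), n = 0..max_shift."""
    dist = np.zeros(max_shift + 1)
    dist[0] = 1.0
    for elem, count in formula.items():
        base = np.zeros(max_shift + 1)
        base[:len(ISO[elem])] = ISO[elem]
        for _ in range(count):
            dist = np.convolve(dist, base)[:max_shift + 1]
    return dist / dist.sum()  # renormalize over the retained shifts

# Hypothetical C20H25N3O molecule: the abundance at M+3 relative to M
# approximates its interference into a [13C3]-labeled analogue's channel.
p = isotope_pattern({'C': 20, 'H': 25, 'N': 3, 'O': 1})
interference_at_M3 = p[3] / p[0]
```

    Labeling schemes can then be compared by reading off p[n]/p[0] for each candidate number and position of labels, which is far smaller than an estimate based on total molecular isotope abundance alone.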

  5. Multi-application controls: Robust nonlinear multivariable aerospace controls applications

    NASA Technical Reports Server (NTRS)

    Enns, Dale F.; Bugajski, Daniel J.; Carter, John; Antoniewicz, Bob

    1994-01-01

    This viewgraph presentation describes the general methodology used to apply Honeywell's Multi-Application Control (MACH) and the specific application to the F-18 High Angle-of-Attack Research Vehicle (HARV) including piloted simulation handling qualities evaluation. The general steps include insertion of modeling data for geometry and mass properties, aerodynamics, propulsion data and assumptions, requirements and specifications, e.g. definition of control variables, handling qualities, stability margins and statements for bandwidth, control power, priorities, position and rate limits. The specific steps include choice of independent variables for least squares fits to aerodynamic and propulsion data, modifications to the management of the controls with regard to integrator windup and actuation limiting and priorities, e.g. pitch priority over roll, and command limiting to prevent departures and/or undesirable inertial coupling or inability to recover to a stable trim condition. The HARV control problem is characterized by significant nonlinearities and multivariable interactions in the low speed, high angle-of-attack, high angular rate flight regime. Systematic approaches to the control of vehicle motions modeled with coupled nonlinear equations of motion have been developed. This paper will discuss the dynamic inversion approach which explicitly accounts for nonlinearities in the control design. Multiple control effectors (including aerodynamic control surfaces and thrust vectoring control) and sensors are used to control the motions of the vehicles in several degrees-of-freedom. Several maneuvers will be used to illustrate performance of MACH in the high angle-of-attack flight regime. Analytical methods for assessing the robust performance of the multivariable control system in the presence of math modeling uncertainty, disturbances, and commands have reached a high level of maturity.
The structured singular value (mu) frequency response methodology is presented as a method for analyzing robust performance and the mu-synthesis method will be presented as a method for synthesizing a robust control system. The paper concludes with the author's expectations regarding future applications of robust nonlinear multivariable controls.

  6. Applications of monsoon research: Opportunities to inform decisionmaking and reduce regional vulnerability

    NASA Astrophysics Data System (ADS)

    Ray, A. J.; Garfin, G. M.; Wilder, M.; Lenart, M.; Vásquez-León, M.; Comrie, A. C.

    2007-05-01

    This presentation will describe ongoing efforts to understand interactions between the North American Monsoon and society, in order to develop applications for monsoon research in a highly complex, multicultural and binational region. The North American Monsoon is an annual precipitation regime that begins in early June in Mexico and progresses northward to the southwestern United States. The region includes stakeholders in large urban complexes, productive agricultural areas, and sparsely populated arid and semi-arid ecosystems. The political, cultural, and socioeconomic divisions between the U.S. and Mexico create a broad range of sensitivities to climate variability as well as capacities to use forecasts and other information to cope with climate. We will highlight methodologies to link climate science with society and analyze opportunities for monsoon science to benefit society in four sectors: natural hazards management, agriculture, public health, and water management. We present a synthesized list of stakeholder needs and a calendar of decisions to help scientists link user needs to potential forecasts and products. To ensure usability of forecasts and other research products, we recommend iterative scientist-stakeholder interactions, through integrated assessments. These knowledge-exchange interactions can improve the capacity for stakeholders to use forecasts thoughtfully and inform the development of research, and for the research community to obtain feedback on climate-related products and receive insights to guide research direction. We expect that integrated assessments can capitalize on the opportunities for monsoon science to inform decisionmaking and, in the best instances, reduce regional climate vulnerabilities and enhance regional sustainability.

  7. Understanding Contamination; Twenty Years of Simulating Radiological Contamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emily Snyder; John Drake; Ryan James

    A wide variety of simulated contamination methods have been developed by researchers to reproducibly test radiological decontamination methods. Some twenty years ago a method of non-radioactive contamination simulation was proposed at the Idaho National Laboratory (INL) that mimicked the character of radioactive cesium and zirconium contamination on stainless steel. It involved baking the contamination into the surface of the stainless steel in order to 'fix' it into a tenacious, tightly bound oxide layer. This type of contamination was particularly applicable to nuclear processing facilities (and nuclear reactors) where oxide growth and exchange of radioactive materials within the oxide layer became the predominant model for material/contaminant interaction. Additional simulation methods and their empirically derived basis (from a nuclear fuel reprocessing facility) are discussed. In the last ten years the INL, working with the Defense Advanced Research Projects Agency (DARPA) and the National Homeland Security Research Center (NHSRC), has continued to develop contamination simulation methodologies. The most notable of these newer methodologies was developed to compare the efficacy of different decontamination technologies against radiological dispersal device (RDD, 'dirty bomb') type of contamination. There are many different scenarios for how RDD contamination may be spread, but the most commonly used one at the INL involves the dispersal of an aqueous solution containing radioactive Cs-137. This method was chosen during the DARPA projects and has continued through the NHSRC series of decontamination trials and also gives a tenacious 'fixed' contamination. Much has been learned about the interaction of cesium contamination with building materials, particularly concrete, throughout these tests. The effects of porosity, cation-exchange capacity of the material and the amount of dirt and debris on the surface are very important factors. 
The interaction of the contaminant/substrate with the particular decontamination technology is also very important. Results of decontamination testing from hundreds of contaminated coupons have led to certain conclusions about the contamination and the type of decontamination methods being deployed. A recent addition to the DARPA-initiated methodology simulates the deposition of nuclear fallout. This contamination differs from previous tests in that it has been developed and validated purely to simulate a 'loose' type of contamination. This may represent the first time that a radiologically contaminated 'fallout' simulant has been developed to reproducibly test decontamination methods. While no contaminant/methodology may serve as a complete example of all aspects that could be seen in the field, the study of this family of simulation methods provides insight into the nature of radiological contamination.

  8. Quality control methodology for high-throughput protein-protein interaction screening.

    PubMed

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, or its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basics of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherent imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interactions screens.

  9. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    NASA Astrophysics Data System (ADS)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bidirectional interrelation among ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  10. A New Method, "Reverse Yeast Two-Hybrid Array" (RYTHA), Identifies Mutants that Dissociate the Physical Interaction Between Elg1 and Slx5.

    PubMed

    Lev, Ifat; Shemesh, Keren; Volpe, Marina; Sau, Soumitra; Levinton, Nelly; Molco, Maya; Singh, Shivani; Liefshitz, Batia; Ben Aroya, Shay; Kupiec, Martin

    2017-07-01

    The vast majority of processes within the cell are carried out by proteins working in conjunction. The Yeast Two-Hybrid (Y2H) methodology allows the detection of physical interactions between any two interacting proteins. Here, we describe a novel systematic genetic methodology, "Reverse Yeast Two-Hybrid Array" (RYTHA), that allows the identification of proteins required for modulating the physical interaction between two given proteins. Our assay starts with a yeast strain in which the physical interaction of interest can be detected by growth on media lacking histidine, in the context of the Y2H methodology. By combining the synthetic genetic array technology, we can systematically screen mutant libraries of the yeast Saccharomyces cerevisiae to identify trans-acting mutations that disrupt the physical interaction of interest. We apply this novel method in a screen for mutants that disrupt the interaction between the N-terminus of Elg1 and the Slx5 protein. Elg1 is part of an alternative replication factor C-like complex that unloads PCNA during DNA replication and repair. Slx5 forms, together with Slx8, a SUMO-targeted ubiquitin ligase (STUbL) believed to target proteins for degradation. Our results show that the interaction requires both the STUbL activity and the PCNA unloading by Elg1, and identify topoisomerase I DNA-protein cross-links as a major factor in separating the two activities. Thus, we demonstrate that RYTHA can be applied to gain insights about particular pathways in yeast, by uncovering the connection between the proteasomal ubiquitin-dependent degradation pathway, DNA replication, and repair machinery, which can be separated by the topoisomerase-mediated cross-links to DNA. Copyright © 2017 by the Genetics Society of America.

  11. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.

  12. Probing Dynamic Cell-Substrate Interactions using Photochemically Generated Surface-Immobilized Gradients: Application to Selectin-Mediated Leukocyte Rolling

    PubMed Central

    Herman, Christine T.; Potts, Gregory K.; Michael, Madeline C.; Tolan, Nicole V.

    2014-01-01

    Model substrates presenting biochemical cues immobilized in a controlled and well-defined manner are of great interest for their applications in biointerface studies that elucidate the molecular basis of cell receptor-ligand interactions. Herein, we describe a direct, photochemical method to generate one-component surface-immobilized biomolecular gradients that are applied to the study of selectin-mediated leukocyte rolling. The technique employs benzophenone-modified glass substrates, which upon controlled exposure to UV light (350 – 365 nm) in the presence of protein-containing solutions facilitate the generation of covalently immobilized protein gradients. Conditions were optimized to generate gradient substrates presenting P-selectin and PSGL-1 (P-selectin Glycoprotein Ligand-1) immobilized at site densities over a 5- to 10-fold range (from as low as ~200 molecules/μm2 to as high as 6000 molecules/μm2). The resulting substrates were quantitatively characterized via fluorescence analysis and radioimmunoassays before their use in the leukocyte rolling assays. HL-60 promyelocytes and Jurkat T lymphocytes were assessed for their ability to tether to and roll on substrates presenting immobilized P-selectin and PSGL-1 under conditions of physiologically relevant shear stress. The results of these flow assays reveal the combined effect of immobilized protein site density and applied wall shear stress on cell rolling behavior. Two-component substrates presenting P-selectin and ICAM-1 (intercellular adhesion molecule-1) were also generated to assess the interplay between these two proteins and their effect on cell rolling and adhesion. These proof-of-principle studies verify that the described gradient generation approach yields well-defined gradient substrates that present immobilized proteins over a large range of site densities that are applicable for investigation of cell-materials interactions, including multi-parameter leukocyte flow studies. 
Future applications of this enabling methodology may lead to new insights into the biophysical phenomena and molecular mechanism underlying complex biological processes such as leukocyte recruitment and the inflammatory response. PMID:21614364

  13. FY16 Status Report on Development of Integrated EPP and SMT Design Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetter, R. I.; Sham, T. -L.; Wang, Y.

    2016-08-01

    The goal of the Elastic-Perfectly Plastic (EPP) combined integrated creep-fatigue damage evaluation approach is to incorporate a Simplified Model Test (SMT) data based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods; thus greatly simplifying evaluation of elevated temperature cyclic service. The EPP methodology is based on the idea that creep damage and strain accumulation can be bounded by a properly chosen “pseudo” yield strength used in an elastic-perfectly plastic analysis, thus avoiding the need for stress classification. The original SMT approach is based on the use of elastic analysis. The experimental data, cycles to failure, are correlated using the elastically calculated strain range in the test specimen, and the corresponding component strain is also calculated elastically. The advantage of this approach is that it is no longer necessary to use the damage interaction, or D-diagram, because the damage due to the combined effects of creep and fatigue is accounted for in the test data by means of a specimen that is designed to replicate or bound the stress and strain redistribution that occurs in actual components when loaded in the creep regime. The reference approach to combining the two methodologies and the corresponding uncertainties and validation plans are presented. Results from recent key feature tests are discussed to illustrate the applicability of the EPP methodology and the behavior of materials at elevated temperature when undergoing stress and strain redistribution due to plasticity and creep.

  14. Solute transport with equilibrium aqueous complexation and either sorption or ion exchange: Simulation methodology and applications

    USGS Publications Warehouse

    Lewis, F.M.; Voss, C.I.; Rubin, J.

    1987-01-01

    Methodologies that account for specific types of chemical reactions in the simulation of solute transport can be developed so they are compatible with solution algorithms employed in existing transport codes. This enables the simulation of reactive transport in complex multidimensional flow regimes, and provides a means for existing codes to account for some of the fundamental chemical processes that occur among transported solutes. Two equilibrium-controlled reaction systems demonstrate a methodology for accommodating chemical interaction into models of solute transport. One system involves the sorption of a given chemical species, as well as two aqueous complexations in which the sorbing species is a participant. The other reaction set involves binary ion exchange coupled with aqueous complexation involving one of the exchanging species. The methodology accommodates these reaction systems through the addition of nonlinear terms to the transport equations for the sorbing species. Example simulation results show (1) the effect equilibrium chemical parameters have on the spatial distributions of concentration for complexing solutes; (2) that an interrelationship exists between mechanical dispersion and the various reaction processes; (3) that dispersive parameters of the porous media cannot be determined from reactive concentration distributions unless the reaction is accounted for or the influence of the reaction is negligible; (4) how the concentration of a chemical species may be significantly affected by its participation in an aqueous complex with a second species which also sorbs; and (5) that these coupled chemical processes influencing reactive transport can be demonstrated in two-dimensional flow regimes. © 1987.
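
    As a point of reference for how an equilibrium reaction enters the transport equation, consider the textbook special case of linear equilibrium sorption s = K_d c; the nonlinear complexation and ion-exchange terms treated in the paper generalize this form. The symbols below are the standard ones (bulk density ρ_b, water content θ), not taken from the paper itself:

```latex
% 1-D advection--dispersion of solute concentration c with linear
% equilibrium sorption s = K_d c; sorption appears as the retardation
% factor R multiplying the storage term.
R \frac{\partial c}{\partial t}
  = D \frac{\partial^2 c}{\partial x^2} - v \frac{\partial c}{\partial x},
\qquad
R = 1 + \frac{\rho_b}{\theta} K_d
```

    When the sorption or complexation isotherm is nonlinear, R becomes concentration-dependent and the extra terms must be carried as nonlinear source terms, which is the situation the methodology above is designed to accommodate.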

  15. Innovative educational methods and technologies applicable to continuing professional development in periodontology.

    PubMed

    Mattheos, N; Schoonheim-Klein, M; Walmsley, A D; Chapple, I L C

    2010-05-01

    Continuous professional development (CPD) in Periodontology refers to the overall framework of opportunities that facilitate a life-long learning practice, driven by the learner-practitioner and supported by a variety of institutions and individuals. CPD must address different needs for a great diversity of practitioners. It is clear that no particular methodology or technology is able to successfully accommodate the entire spectrum of CPD in Periodontology. Course designers must choose from and combine a wide array of methodologies and technologies, depending upon the needs of the learners and the objectives of the intended education. Research suggests that 'interactivity', 'flexibility', 'continuity' and 'relevance to learners' practice' are major characteristics of successful CPD. Various methods of mentoring, peer-learning environments and work-based learning have been combined with reflective practice and self-study to form the methodological backbone of CPD courses. Blended learning encompasses a wide array of technologies and methodologies and has been successfully used in CPD courses. Internet-based content learning management systems, portable Internet devices, powerful databases and search engines, together with initiatives such as 'open access' and 'open courseware' provide an array of effective instructional and communication tools. Assessment remains a key issue in CPD, providing learners with valuable feedback and it ensures the credibility and effectiveness of the learning process. Assessment is a multi-level process using different methods for different learning outcomes, as directed by current evidence and best practices. Finally, quality assurance of the education provided must follow CPD courses at all times through a structured and credible process.

  16. The use of concept maps during knowledge elicitation in ontology development processes – the nutrigenomics use case

    PubMed Central

    Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta

    2006-01-01

    Background Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019

  17. Quantifying the Molecular Origins of Opposite Solvent Effects on Protein-Protein Interactions

    PubMed Central

    Vagenende, Vincent; Han, Alvin X.; Pek, Han B.; Loo, Bernard L. W.

    2013-01-01

    Although the nature of solvent-protein interactions is generally weak and non-specific, addition of cosolvents such as denaturants and osmolytes strengthens protein-protein interactions for some proteins, whereas it weakens protein-protein interactions for others. This is exemplified by the puzzling observation that addition of glycerol oppositely affects the association constants of two antibodies, D1.3 and D44.1, with lysozyme. To resolve this conundrum, we develop a methodology based on the thermodynamic principles of preferential interaction theory and the quantitative characterization of local protein solvation from molecular dynamics simulations. We find that changes of preferential solvent interactions at the protein-protein interface quantitatively account for the opposite effects of glycerol on the antibody-antigen association constants. Detailed characterization of local protein solvation in the free and associated protein states reveals how opposite solvent effects on protein-protein interactions depend on the extent of dewetting of the protein-protein contact region and on structural changes that alter cooperative solvent-protein interactions at the periphery of the protein-protein interface. These results demonstrate the direct relationship between macroscopic solvent effects on protein-protein interactions and atom-scale solvent-protein interactions, and establish a general methodology for predicting and understanding solvent effects on protein-protein interactions in diverse biological environments. PMID:23696727
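
    The bookkeeping behind a preferential-interaction analysis can be sketched for a single simulation frame. This is an illustrative sketch under stated assumptions, not the authors' method: it estimates a two-domain preferential interaction coefficient, Γ ≈ n3_local − n1_local·(N3_bulk/N1_bulk), by counting cosolvent and water molecules within a cutoff of the protein; the random coordinates stand in for a real MD trajectory frame, and the cutoff value is hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    protein = rng.uniform(0.0, 10.0, size=(50, 3))     # protein atom positions
    water = rng.uniform(0.0, 50.0, size=(5000, 3))     # water oxygen positions
    glycerol = rng.uniform(0.0, 50.0, size=(500, 3))   # cosolvent positions
    cutoff = 6.0                                       # local-domain cutoff

    def n_local(solvent, protein, cutoff):
        # Count solvent molecules within `cutoff` of any protein atom.
        d = np.linalg.norm(solvent[:, None, :] - protein[None, :, :], axis=-1)
        return int(np.sum(d.min(axis=1) < cutoff))

    n1 = n_local(water, protein, cutoff)       # local waters
    n3 = n_local(glycerol, protein, cutoff)    # local cosolvent
    N1_bulk = len(water) - n1
    N3_bulk = len(glycerol) - n3

    # Negative gamma => cosolvent is preferentially excluded from the
    # protein surface; positive => preferentially bound.
    gamma = n3 - n1 * (N3_bulk / N1_bulk)
    ```

    In practice such counts are averaged over many frames and computed separately for the free and associated protein states; the difference between the two states is what connects local solvation to the change in the association constant.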

  18. Quantifying the molecular origins of opposite solvent effects on protein-protein interactions.

    PubMed

    Vagenende, Vincent; Han, Alvin X; Pek, Han B; Loo, Bernard L W

    2013-01-01

    Although the nature of solvent-protein interactions is generally weak and non-specific, addition of cosolvents such as denaturants and osmolytes strengthens protein-protein interactions for some proteins, whereas it weakens protein-protein interactions for others. This is exemplified by the puzzling observation that addition of glycerol oppositely affects the association constants of two antibodies, D1.3 and D44.1, with lysozyme. To resolve this conundrum, we develop a methodology based on the thermodynamic principles of preferential interaction theory and the quantitative characterization of local protein solvation from molecular dynamics simulations. We find that changes of preferential solvent interactions at the protein-protein interface quantitatively account for the opposite effects of glycerol on the antibody-antigen association constants. Detailed characterization of local protein solvation in the free and associated protein states reveals how opposite solvent effects on protein-protein interactions depend on the extent of dewetting of the protein-protein contact region and on structural changes that alter cooperative solvent-protein interactions at the periphery of the protein-protein interface. These results demonstrate the direct relationship between macroscopic solvent effects on protein-protein interactions and atom-scale solvent-protein interactions, and establish a general methodology for predicting and understanding solvent effects on protein-protein interactions in diverse biological environments.

  19. Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The HARDMAN methodology was applied to the various employment configurations of an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements, and associated costs, of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division) for operating and maintenance support, addressed through the general support maintenance echelon.

  20. Effective Factors in Interactions within Japanese EFL Classrooms

    ERIC Educational Resources Information Center

    Maftoon, Parviz; Ziafar, Meisam

    2013-01-01

    Classroom interactional patterns depend on some contextual, cultural and local factors in addition to the methodologies employed in the classroom. In order to delineate such factors, the focus of classroom interaction research needs to shift from the observables to the unobservables like teachers' and learners' psychological states and cultural…
