Sample records for data-driven objects

  1. C-arm technique using distance driven method for nephrolithiasis and kidney stones detection

    NASA Astrophysics Data System (ADS)

    Malalla, Nuhad; Sun, Pengfei; Chen, Ying; Lipkin, Michael E.; Preminger, Glenn M.; Qin, Jun

    2016-04-01

Distance-driven projection is a state-of-the-art method used for reconstruction in x-ray techniques. C-arm tomography is an x-ray imaging technique that provides three-dimensional information about the object by moving the C-shaped gantry around the patient. With a limited view angle, the C-arm system was investigated as a way to generate volumetric data of the object with low radiation dosage and short examination time. This paper presents a new simulation study of two reconstruction methods based on the distance-driven approach: simultaneous algebraic reconstruction technique (SART) and maximum likelihood expectation maximization (MLEM). The distance-driven method is efficient, with low computational cost and fewer artifacts than other methods such as ray-driven and pixel-driven approaches. Projection images of spherical objects were simulated with a virtual C-arm system with a total view angle of 40 degrees. Results show the ability of the limited-angle C-arm technique to generate three-dimensional images with distance-driven reconstruction.

  2. A metadata-driven approach to data repository design.

    PubMed

    Harvey, Matthew J; McLean, Andrew; Rzepa, Henry S

    2017-01-01

    The design and use of a metadata-driven data repository for research data management is described. Metadata is collected automatically during the submission process whenever possible and is registered with DataCite in accordance with their current metadata schema, in exchange for a persistent digital object identifier. Two examples of data preview are illustrated, including the demonstration of a method for integration with commercial software that confers rich domain-specific data analytics without introducing customisation into the repository itself.

  3. Value Driven Information Processing and Fusion

    DTIC Science & Technology

    2016-03-01

The objective of the project is to develop a general framework for value-driven decentralized information processing, including: optimal data reduction in a network setting for decentralized inference with quantization constraints; interactive fusion that allows queries; and a consensus approach that allows a decentralized approach to achieve the optimal error exponent of its centralized counterpart.

  4. Data Based Instruction in Reading

    ERIC Educational Resources Information Center

    Ediger, Marlow

    2010-01-01

    Data based instruction has received much attention in educational literature. It relates well to measurement driven teaching and learning. Data may come from several sources including mandated tests, district wide testing, formative and summative evaluations, as well as teacher written tests. Objective information is intended for use in data based…

  5. Pareto fronts for multiobjective optimization design on materials data

    NASA Astrophysics Data System (ADS)

    Gopakumar, Abhijith; Balachandran, Prasanna; Gubernatis, James E.; Lookman, Turab

Optimizing multiple properties simultaneously is vital in materials design. Here we apply information-driven, statistical optimization strategies blended with machine learning methods to address multi-objective optimization tasks on materials data. These strategies aim to find the Pareto front consisting of non-dominated data points from a set of candidate compounds with known characteristics. The objective is to find the Pareto front in as few additional measurements or calculations as possible. We show how exploration of the data space to find the front is achieved by using uncertainties in predictions from regression models. We test our proposed design strategies on multiple, independent data sets including those from computations as well as experiments. These include data sets for MAX phases, piezoelectrics and multicomponent alloys.
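The notion of non-dominated points underlying this record can be sketched in a few lines. This is a brute-force filter over toy data; the paper's contribution is choosing measurements efficiently, not the filter itself:

```python
def pareto_front(points):
    """Return the non-dominated points, assuming every objective is maximized.

    A point p is dominated if some other point q is >= p in every
    objective and differs from p (so it is > p in at least one).
    """
    front = []
    for p in points:
        dominated = any(
            q != p and all(qi >= pi for qi, pi in zip(q, p))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Two objectives, both to be maximized (invented candidate values).
candidates = [(1, 5), (2, 4), (3, 3), (2, 2), (4, 1), (1, 1)]
print(pareto_front(candidates))  # -> [(1, 5), (2, 4), (3, 3), (4, 1)]
```

Here (2, 2) and (1, 1) drop out because (2, 4) beats both in every objective; the survivors form the trade-off frontier.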

  6. Transportable Applications Environment (TAE) Tenth Users' Conference

    NASA Technical Reports Server (NTRS)

    Rouff, Chris (Editor); Harris, Elfrieda (Editor); Yeager, Arleen (Editor)

    1993-01-01

    Conference proceedings are represented in graphic visual-aid form. Presentation and panel discussion topics include user experiences with C++ and Ada; the design and interaction of the user interface; the history and goals of TAE; commercialization and testing of TAE Plus; Computer-Human Interaction Models (CHIMES); data driven objects; item-to-item connections and object dependencies; and integration with other software. There follows a list of conference attendees.

  7. Task-Driven Optimization of Fluence Field and Regularization for Model-Based Iterative Reconstruction in Computed Tomography.

    PubMed

    Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster

    2017-12-01

This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective designs of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributed fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all other strategies, the task-driven FFM strategy not only improved the minimum detectability index by at least 17.8% but also yielded higher detectability over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on the computed detectability index. The improvements in detectability show the potential of the task-driven imaging framework to improve imaging performance at a fixed dose, or, equivalently, to provide a similar level of performance at reduced dose.
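The maxi-min objective in this record (maximize the worst-case detectability across the image) can be illustrated with a toy selection. All d' values and pattern names below are invented; the paper's actual optimization is over continuous basis-function coefficients:

```python
# Hypothetical detectability index d' at three image locations for
# each candidate fluence pattern (illustrative numbers only).
d_prime = {
    "uniform":     [1.20, 0.80, 1.00],
    "bowtie":      [1.10, 0.90, 1.05],
    "task_driven": [1.10, 1.05, 1.15],
}

# Maxi-min objective: pick the pattern whose minimum (worst-case)
# detectability across locations is largest.
best = max(d_prime, key=lambda name: min(d_prime[name]))
print(best, min(d_prime[best]))  # -> task_driven 1.05
```

Note that "uniform" has the single highest d' (1.20) yet loses under maxi-min because of its weak spot (0.80); this is exactly why the criterion equalizes performance throughout the image.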

  8. User-Preference-Driven Model Predictive Control of Residential Building Loads and Battery Storage for Demand Response: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xin; Baker, Kyri A.; Christensen, Dane T.

This paper presents a user-preference-driven home energy management system (HEMS) for demand response (DR) with residential building loads and battery storage. The HEMS is based on a multi-objective model predictive control algorithm, where the objectives include energy cost, thermal comfort, and carbon emission. A multi-criterion decision making method originating from social science is used to quickly determine user preferences based on a brief survey and derive the weights of different objectives used in the optimization process. Besides the residential appliances used in traditional DR programs, a home battery system is integrated into the HEMS to improve the flexibility and reliability of the DR resources. Simulation studies have been performed on field data from a residential building stock data set. Appliance models and usage patterns were learned from the data to predict the DR resource availability. Results indicate the HEMS was able to provide a significant amount of load reduction with less than 20% prediction error in both heating and cooling cases.
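The weighted-sum scalarization described here, collapsing cost, comfort, and emissions into one objective with survey-derived weights, can be sketched as follows. Weights and candidate numbers are invented, and a real HEMS would normalize units and optimize over a prediction horizon rather than pick from three discrete actions:

```python
# Hypothetical per-timestep candidate actions for an HVAC/battery setpoint:
# each maps to (energy cost in $, comfort deviation in degC, CO2 in kg),
# assumed already normalized to comparable scales for this sketch.
candidates = {
    "precool": (0.42, 0.2, 1.1),
    "hold":    (0.30, 0.5, 0.8),
    "setback": (0.18, 1.4, 0.5),
}

# Preference weights as might come from a brief user survey
# (illustrative values; they sum to 1).
w_cost, w_comfort, w_co2 = 0.5, 0.3, 0.2

def scalarized(objs):
    """Weighted sum of the three objectives (all minimized)."""
    cost, discomfort, co2 = objs
    return w_cost * cost + w_comfort * discomfort + w_co2 * co2

best = min(candidates, key=lambda name: scalarized(candidates[name]))
print(best)  # -> hold
```

A cost-dominated user (larger w_cost) would flip the choice toward "setback", which is the sense in which the survey-derived weights steer the controller.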

  9. User-Preference-Driven Model Predictive Control of Residential Building Loads and Battery Storage for Demand Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xin; Baker, Kyri A; Isley, Steven C

This paper presents a user-preference-driven home energy management system (HEMS) for demand response (DR) with residential building loads and battery storage. The HEMS is based on a multi-objective model predictive control algorithm, where the objectives include energy cost, thermal comfort, and carbon emission. A multi-criterion decision making method originating from social science is used to quickly determine user preferences based on a brief survey and derive the weights of different objectives used in the optimization process. Besides the residential appliances used in traditional DR programs, a home battery system is integrated into the HEMS to improve the flexibility and reliability of the DR resources. Simulation studies have been performed on field data from a residential building stock data set. Appliance models and usage patterns were learned from the data to predict the DR resource availability. Results indicate the HEMS was able to provide a significant amount of load reduction with less than 20% prediction error in both heating and cooling cases.

  10. Can data-driven benchmarks be used to set the goals of healthy people 2010?

    PubMed Central

    Allison, J; Kiefe, C I; Weissman, N W

    1999-01-01

    OBJECTIVES: Expert panels determined the public health goals of Healthy People 2000 subjectively. The present study examined whether data-driven benchmarks provide a better alternative. METHODS: We developed the "pared-mean" method to define from data the best achievable health care practices. We calculated the pared-mean benchmark for screening mammography from the 1994 National Health Interview Survey, using the metropolitan statistical area as the "provider" unit. Beginning with the best-performing provider and adding providers in descending sequence, we established the minimum provider subset that included at least 10% of all women surveyed on this question. The pared-mean benchmark is then the proportion of women in this subset who received mammography. RESULTS: The pared-mean benchmark for screening mammography was 71%, compared with the Healthy People 2000 goal of 60%. CONCLUSIONS: For Healthy People 2010, benchmarks derived from data reflecting the best available care provide viable alternatives to consensus-derived targets. We are currently pursuing additional refinements to the data-driven pared-mean benchmark approach. PMID:9987466
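The pared-mean procedure described in the METHODS section can be sketched directly. The provider counts below are toy numbers, not the NHIS data:

```python
def pared_mean_benchmark(providers, min_fraction=0.10):
    """Pared-mean benchmark as described in the abstract.

    providers: list of (n_screened, n_surveyed) per provider unit.
    Rank providers by screening rate, take the best performers in
    descending order until the subset covers at least `min_fraction`
    of all surveyed women, then return the pooled screening
    proportion within that subset.
    """
    total_surveyed = sum(n for _, n in providers)
    ranked = sorted(providers, key=lambda p: p[0] / p[1], reverse=True)
    screened = surveyed = 0
    for s, n in ranked:
        screened += s
        surveyed += n
        if surveyed >= min_fraction * total_surveyed:
            break
    return screened / surveyed

# Toy data: (women screened, women surveyed) for five provider units.
data = [(90, 100), (70, 100), (160, 200), (40, 100), (250, 500)]
print(round(pared_mean_benchmark(data), 3))  # -> 0.9
```

With 1,000 women surveyed in total, the single best provider (rate 0.90) already covers the 10% threshold, so the benchmark is its pooled rate; with a higher threshold, more providers would be pared in and the benchmark would fall toward the overall mean.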

  11. Transportable Applications Environment (TAE) Plus: A NASA tool used to develop and manage graphical user interfaces

    NASA Technical Reports Server (NTRS)

    Szczur, Martha R.

    1992-01-01

    The Transportable Applications Environment (TAE) Plus was built to support the construction of graphical user interfaces (GUI's) for highly interactive applications, such as real-time processing systems and scientific analysis systems. It is a general purpose portable tool that includes a 'What You See Is What You Get' WorkBench that allows user interface designers to layout and manipulate windows and interaction objects. The WorkBench includes both user entry objects (e.g., radio buttons, menus) and data-driven objects (e.g., dials, gages, stripcharts), which dynamically change based on values of realtime data. Discussed here is what TAE Plus provides, how the implementation has utilized state-of-the-art technologies within graphic workstations, and how it has been used both within and without NASA.

  12. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Variety Preserved Instance Weighting and Prototype Selection for Probabilistic Multiple Scope Simulations

    DTIC Science & Technology

    2017-05-30

including analysis, control and management of the systems across their multiple scopes. These difficulties will become more significant in the near future...behaviors of the systems, it tends to cover their many scopes. Accordingly, we may obtain better models for the simulations in a data-driven manner...to capture the variety of the instance distribution in a given data set for covering multiple scopes of our objective system in a seamless manner. (2

  14. Nonstationary EO/IR Clutter Suppression and Dim Object Tracking

    NASA Astrophysics Data System (ADS)

    Tartakovsky, A.; Brown, A.; Brown, J.

    2010-09-01

    We develop and evaluate the performance of advanced algorithms which provide significantly improved capabilities for automated detection and tracking of ballistic and flying dim objects in the presence of highly structured intense clutter. Applications include ballistic missile early warning, midcourse tracking, trajectory prediction, and resident space object detection and tracking. The set of algorithms include, in particular, adaptive spatiotemporal clutter estimation-suppression and nonlinear filtering-based multiple-object track-before-detect. These algorithms are suitable for integration into geostationary, highly elliptical, or low earth orbit scanning or staring sensor suites, and are based on data-driven processing that adapts to real-world clutter backgrounds, including celestial, earth limb, or terrestrial clutter. In many scenarios of interest, e.g., for highly elliptic and, especially, low earth orbits, the resulting clutter is highly nonstationary, providing a significant challenge for clutter suppression to or below sensor noise levels, which is essential for dim object detection and tracking. We demonstrate the success of the developed algorithms using semi-synthetic and real data. In particular, our algorithms are shown to be capable of detecting and tracking point objects with signal-to-clutter levels down to 1/1000 and signal-to-noise levels down to 1/4.

  15. Prototype Development: Context-Driven Dynamic XML Ophthalmologic Data Capture Application

    PubMed Central

    Schwei, Kelsey M; Kadolph, Christopher; Finamore, Joseph; Cancel, Efrain; McCarty, Catherine A; Okorie, Asha; Thomas, Kate L; Allen Pacheco, Jennifer; Pathak, Jyotishman; Ellis, Stephen B; Denny, Joshua C; Rasmussen, Luke V; Tromp, Gerard; Williams, Marc S; Vrabec, Tamara R; Brilliant, Murray H

    2017-01-01

Background: The capture and integration of structured ophthalmologic data into electronic health records (EHRs) has historically been a challenge. However, the importance of this activity for patient care and research is critical. Objective: The purpose of this study was to develop a prototype of a context-driven dynamic extensible markup language (XML) ophthalmologic data capture application for research and clinical care that could be easily integrated into an EHR system. Methods: Stakeholders in the medical, research, and informatics fields were interviewed and surveyed to determine data and system requirements for ophthalmologic data capture. On the basis of these requirements, an ophthalmology data capture application was developed to collect and store discrete data elements with important graphical information. Results: The context-driven data entry application supports several features, including ink-over drawing capability for documenting eye abnormalities, context-based Web controls that guide data entry based on preestablished dependencies, and an adaptable database or XML schema that stores Web form specifications and allows for immediate changes in form layout or content. The application utilizes Web services to enable data integration with a variety of EHRs for retrieval and storage of patient data. Conclusions: This paper describes the development process used to create a context-driven dynamic XML data capture application for optometry and ophthalmology. The list of ophthalmologic data elements identified as important for care and research can be used as a baseline list for future ophthalmologic data collection activities. PMID:28903894

  16. Design of a data-driven predictive controller for start-up process of AMT vehicles.

    PubMed

    Lu, Xiaohui; Chen, Hong; Wang, Ping; Gao, Bingzhao

    2011-12-01

    In this paper, a data-driven predictive controller is designed for the start-up process of vehicles with automated manual transmissions (AMTs). It is obtained directly from the input-output data of a driveline simulation model constructed by the commercial software AMESim. In order to obtain offset-free control for the reference input, the predictor equation is gained with incremental inputs and outputs. Because of the physical characteristics, the input and output constraints are considered explicitly in the problem formulation. The contradictory requirements of less friction losses and less driveline shock are included in the objective function. The designed controller is tested under nominal conditions and changed conditions. The simulation results show that, during the start-up process, the AMT clutch with the proposed controller works very well, and the process meets the control objectives: fast clutch lockup time, small friction losses, and the preservation of driver comfort, i.e., smooth acceleration of the vehicle. At the same time, the closed-loop system has the ability to reject uncertainties, such as the vehicle mass and road grade.
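The offset-free trick mentioned here, building the predictor from incremental inputs and outputs, can be sketched on a toy first-order system. All coefficients are hypothetical; the paper's AMESim driveline model is not reproduced:

```python
import math

# Simulated first-order plant y[k] = a*y[k-1] + b*u[k-1] + d, where d is
# an unknown constant disturbance (hypothetical coefficients).
a_true, b_true, d = 0.8, 0.5, 3.0
N = 40
u = [math.sin(1.7 * k) for k in range(N)]  # persistently exciting input
y = [0.0] * N
for k in range(1, N):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + d

# Incremental (velocity-form) data: differencing cancels the constant
# disturbance, so dy[k] = a*dy[k-1] + b*du[k-1] holds exactly. This is
# why the incremental predictor tracks references without offset.
dy = [y[k] - y[k - 1] for k in range(1, N)]
du = [u[k] - u[k - 1] for k in range(1, N)]

# Least-squares fit of (a, b) via the 2x2 normal equations.
S11 = S12 = S22 = r1 = r2 = 0.0
for k in range(1, len(dy)):
    x1, x2, t = dy[k - 1], du[k - 1], dy[k]
    S11 += x1 * x1; S12 += x1 * x2; S22 += x2 * x2
    r1 += x1 * t; r2 += x2 * t
det = S11 * S22 - S12 * S12
a_hat = (S22 * r1 - S12 * r2) / det
b_hat = (S11 * r2 - S12 * r1) / det
print(round(a_hat, 4), round(b_hat, 4))
```

The fitted coefficients recover the true dynamics even though the model never sees the disturbance d, which is the essence of gaining the predictor equation from incremental data.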

  17. Field Model: An Object-Oriented Data Model for Fields

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.

    2001-01-01

We present an extensible, object-oriented data model designed for field data entitled Field Model (FM). FM objects can represent a wide variety of fields, including fields of arbitrary dimension and node type. FM can also handle time-series data. FM achieves generality through carefully selected topological primitives and through an implementation that leverages the potential of templated C++. FM supports fields where the node values are paired with any cell type. Thus FM can represent data where the field nodes are paired with the vertices ("vertex-centered" data), fields where the nodes are paired with the D-dimensional cells in R(sup D) (often called "cell-centered" data), as well as fields where nodes are paired with edges or other cell types. FM is designed to effectively handle very large data sets; in particular FM employs a demand-driven evaluation strategy that works especially well with large field data. Finally, the interfaces developed for FM have the potential to effectively abstract field data based on adaptive meshes. We present initial results with a triangular adaptive grid in R(sup 2) and discuss how the same design abstractions would work equally well with other adaptive-grid variations, including meshes in R(sup 3).

  18. Data Driven Decision Making in the Social Studies

    ERIC Educational Resources Information Center

    Ediger, Marlow

    2010-01-01

    Data driven decision making emphasizes the importance of the teacher using objective sources of information in developing the social studies curriculum. Too frequently, decisions of teachers have been made based on routine and outdated methods of teaching. Valid and reliable tests used to secure results from pupil learning make for better…

  19. Agile data management for curation of genomes to watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Agarwal, D.; Faybishenko, B.; Versteeg, R.

    2015-12-01

A software platform is being developed for data management and assimilation [DMA] as part of the U.S. Department of Energy's Genomes to Watershed Sustainable Systems Science Focus Area 2.0. The DMA components and capabilities are driven by the project science priorities and the development is based on agile development techniques. The goal of the DMA software platform is to enable users to integrate and synthesize diverse and disparate field, laboratory, and simulation datasets, including geological, geochemical, geophysical, microbiological, hydrological, and meteorological data across a range of spatial and temporal scales. The DMA objectives are (a) developing an integrated interface to the datasets, (b) storing field monitoring data and laboratory analytical results of water and sediment samples in a database, (c) providing automated QA/QC analysis of data, and (d) working with data providers to modify high-priority field and laboratory data collection and reporting procedures as needed. The first three objectives are driven by user needs, while the last objective is driven by data management needs. The project needs and priorities are reassessed regularly with the users. After each user session we identify development priorities to match the identified user priorities. For instance, data QA/QC and collection activities have focused on the data and products needed for on-going scientific analyses (e.g. water level and geochemistry). We have also developed, tested and released a broker and portal that integrates diverse datasets from two different databases used for curation of project data. The development of the user interface was based on a user-centered design process involving several user interviews and constant interaction with data providers. The initial version focuses on the most requested feature, i.e., finding the data needed for analyses through an intuitive interface. Once the data is found, the user can immediately plot and download data through the portal. The resulting product has an interface that is more intuitive and presents the highest priority datasets that are needed by the users. Our agile approach has enabled us to build a system that is keeping pace with the science needs while utilizing limited resources.

  20. Data-driven indexing mechanism for the recognition of polyhedral objects

    NASA Astrophysics Data System (ADS)

    McLean, Stewart; Horan, Peter; Caelli, Terry M.

    1992-02-01

    This paper is concerned with the problem of searching large model databases. To date, most object recognition systems have concentrated on the problem of matching using simple searching algorithms. This is quite acceptable when the number of object models is small. However, in the future, general purpose computer vision systems will be required to recognize hundreds or perhaps thousands of objects and, in such circumstances, efficient searching algorithms will be needed. The problem of searching a large model database is one which must be addressed if future computer vision systems are to be at all effective. In this paper we present a method we call data-driven feature-indexed hypothesis generation as one solution to the problem of searching large model databases.
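The idea of feature-indexed hypothesis generation, looking up only the models that share features with the observed data rather than scanning the whole database, can be sketched as an inverted index. Model names and features below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical model database: each polyhedral model is described by a
# set of local features (represented here as string labels).
models = {
    "cube":    {"right_angle_corner", "square_face"},
    "wedge":   {"right_angle_corner", "triangular_face"},
    "pyramid": {"apex_corner", "triangular_face", "square_face"},
}

# Build an inverted index: feature -> models containing that feature.
index = defaultdict(set)
for name, feats in models.items():
    for f in feats:
        index[f].add(name)

def hypotheses(observed_features):
    """Return candidate models sharing any observed feature, ranked by
    the number of matching features (ties broken alphabetically)."""
    votes = defaultdict(int)
    for f in observed_features:
        for name in index.get(f, ()):
            votes[name] += 1
    return sorted(votes, key=lambda n: (-votes[n], n))

print(hypotheses({"triangular_face", "right_angle_corner"}))
```

Lookup cost scales with the number of observed features, not the number of stored models, which is the point of indexing when the database grows to thousands of objects.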

  1. Application of Template Matching for Improving Classification of Urban Railroad Point Clouds

    PubMed Central

    Arastounia, Mostafa; Oude Elberink, Sander

    2016-01-01

    This study develops an integrated data-driven and model-driven approach (template matching) that clusters the urban railroad point clouds into three classes of rail track, contact cable, and catenary cable. The employed dataset covers 630 m of the Dutch urban railroad corridors in which there are four rail tracks, two contact cables, and two catenary cables. The dataset includes only geometrical information (three dimensional (3D) coordinates of the points) with no intensity data and no RGB data. The obtained results indicate that all objects of interest are successfully classified at the object level with no false positives and no false negatives. The results also show that an average 97.3% precision and an average 97.7% accuracy at the point cloud level are achieved. The high precision and high accuracy of the rail track classification (both greater than 96%) at the point cloud level stems from the great impact of the employed template matching method on excluding the false positives. The cables also achieve quite high average precision (96.8%) and accuracy (98.4%) due to their high sampling and isolated position in the railroad corridor. PMID:27973452

  2. Asynchronous Data Retrieval from an Object-Oriented Database

    NASA Astrophysics Data System (ADS)

    Gilbert, Jonathan P.; Bic, Lubomir

    We present an object-oriented semantic database model which, similar to other object-oriented systems, combines the virtues of four concepts: the functional data model, a property inheritance hierarchy, abstract data types and message-driven computation. The main emphasis is on the last of these four concepts. We describe generic procedures that permit queries to be processed in a purely message-driven manner. A database is represented as a network of nodes and directed arcs, in which each node is a logical processing element, capable of communicating with other nodes by exchanging messages. This eliminates the need for shared memory and for centralized control during query processing. Hence, the model is suitable for implementation on a multiprocessor computer architecture, consisting of large numbers of loosely coupled processing elements.

  3. Cognitive Processing Therapy for Spanish-speaking Latinos: A formative study of a model-driven cultural adaptation of the manual to enhance implementation in a usual care setting

    PubMed Central

    Valentine, Sarah E.; Borba, Christina P. C.; Dixon, Louise; Vaewsorn, Adin S.; Guajardo, Julia Gallegos; Resick, Patricia A.; Wiltsey-Stirman, Shannon; Marques, Luana

    2016-01-01

Objective: As part of a larger implementation trial for Cognitive Processing Therapy (CPT) for posttraumatic stress disorder (PTSD) in a community health center, we used formative evaluation to assess relations between iterative cultural adaptation (for Spanish-speaking clients) and implementation outcomes (appropriateness & acceptability) for CPT. Method: Qualitative data for the current study were gathered through multiple sources (providers: N=6; clients: N=22), including CPT therapy sessions, provider field notes, weekly consultation team meetings, and researcher field notes. Findings from conventional and directed content analysis of the data informed refinements to the CPT manual. Results: Data-driven refinements included adaptations related to cultural context (i.e., language, regional variation in wording), urban context (e.g., crime/violence), and literacy level. Qualitative findings suggest improved appropriateness and acceptability of CPT for Spanish-speaking clients. Conclusion: Our study reinforces the need for dual application of cultural adaptation and implementation science to address the PTSD treatment needs of Spanish-speaking clients. PMID:27378013

  4. Sensor modeling and demonstration of a multi-object spectrometer for performance-driven sensing

    NASA Astrophysics Data System (ADS)

    Kerekes, John P.; Presnar, Michael D.; Fourspring, Kenneth D.; Ninkov, Zoran; Pogorzala, David R.; Raisanen, Alan D.; Rice, Andrew C.; Vasquez, Juan R.; Patel, Jeffrey P.; MacIntyre, Robert T.; Brown, Scott D.

    2009-05-01

A novel multi-object spectrometer (MOS) is being explored for use as an adaptive performance-driven sensor that tracks moving targets. Developed originally for astronomical applications, the instrument utilizes an array of micromirrors to reflect light to a panchromatic imaging array. When an object of interest is detected, the individual micromirrors imaging the object are tilted to reflect the light to a spectrometer to collect a full spectrum. This paper will present example sensor performance from empirical data collected in laboratory experiments, as well as our approach in designing optical and radiometric models of the MOS channels and the micromirror array. Simulation of moving vehicles in a high-fidelity, hyperspectral scene is used to generate a dynamic video input for the adaptive sensor. Performance-driven algorithms for feature-aided target tracking and modality selection exploit multiple electromagnetic observables to track moving vehicle targets.

  5. Fleet DNA (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walkokwicz, K.; Duran, A.

    2014-06-01

The Fleet DNA project objectives include capturing and quantifying drive cycle and technology variation for the multitude of medium- and heavy-duty vocations; providing a common data storage warehouse for medium- and heavy-duty vehicle fleet data across DOE activities and laboratories; and integrating existing DOE tools, models, and analyses to provide data-driven decision making capabilities. Fleet DNA advantages include: for Government - providing in-use data for standard drive cycle development, R&D, tech targets, and rule making; for OEMs - real-world usage datasets provide concrete examples of customer use profiles; for fleets - vocational datasets help illustrate how to maximize return on technology investments; for Funding Agencies - ways are revealed to optimize the impact of financial incentive offers; and for researchers - a data source is provided for modeling and simulation.

  6. Harnessing health information to foster disadvantaged teens' community engagement, leadership skills, and career plans: a qualitative evaluation of the Teen Health Leadership Program.

    PubMed

    Keselman, Alla; Ahmed, Einas A; Williamson, Deborah C; Kelly, Janice E; Dutcher, Gale A

    2015-04-01

    This paper describes a qualitative evaluation of a small-scale program aiming to improve health information literacy, leadership skills, and interest in health careers among high school students in a low-income, primarily minority community. Graduates participated in semi-structured interviews, transcripts of which were coded with a combination of objectives-driven and data-driven categories. The program had a positive impact on the participants' health information competency, leadership skills, academic orientation, and interest in health careers. Program enablers included a supportive network of adults, novel experiences, and strong mentorship. The study suggests that health information can provide a powerful context for enabling disadvantaged students' community engagement and academic success.

  7. In-situ Condition Monitoring of Components in Small Modular Reactors Using Process and Electrical Signature Analysis. Final report, volume 1. Development of experimental flow control loop, data analysis and plant monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyaya, Belle; Hines, J. Wesley; Damiano, Brian

    The research and development under this project was focused on the following three major objectives: Objective 1: Identification of critical in-vessel SMR components for remote monitoring and development of their low-order dynamic models, along with a simulation model of an integral pressurized water reactor (iPWR). Objective 2: Development of an experimental flow control loop with motor-driven valves and pumps, incorporating data acquisition and on-line monitoring interface. Objective 3: Development of stationary and transient signal processing methods for electrical signatures, machinery vibration, and for characterizing process variables for equipment monitoring. This objective includes the development of a data analysis toolbox. Themore » following is a summary of the technical accomplishments under this project: - A detailed literature review of various SMR types and electrical signature analysis of motor-driven systems was completed. A bibliography of literature is provided at the end of this report. Assistance was provided by ORNL in identifying some key references. - A review of literature on pump-motor modeling and digital signal processing methods was performed. - An existing flow control loop was upgraded with new instrumentation, data acquisition hardware and software. The upgrading of the experimental loop included the installation of a new submersible pump driven by a three-phase induction motor. All the sensors were calibrated before full-scale experimental runs were performed. - MATLAB-Simulink model of a three-phase induction motor and pump system was completed. The model was used to simulate normal operation and fault conditions in the motor-pump system, and to identify changes in the electrical signatures. - A simulation model of an integral PWR (iPWR) was updated and the MATLAB-Simulink model was validated for known transients. 
The pump-motor model was interfaced with the iPWR model for testing the impact of primary flow perturbations (upsets) on plant parameters and the pump electrical signatures. Additionally, the reactor simulation is being used to generate normal operation data and data with instrumentation faults and process anomalies. A frequency controller was interfaced with the motor power supply in order to vary the electrical supply frequency. The experimental flow control loop was used to generate operational data under varying motor performance characteristics. Coolant leakage events were simulated by varying the bypass loop flow rate. The accuracy of the motor power calculation was improved by incorporating the power factor, computed from motor current and voltage in each phase of the induction motor. - A variety of experimental runs were made for steady-state and transient pump operating conditions. Process, vibration, and electrical signatures were measured using a submersible pump with variable supply frequency. High correlation was seen between motor current and the pump discharge pressure signal; similarly high correlation was exhibited between pump motor power and flow rate. Wide-band analysis indicated high coherence (in the frequency domain) between motor current and vibration signals. - Wide-band operational data from a PWR were acquired from AMS Corporation and used to develop time-series models, and to estimate signal spectra and sensor time constants. All the data were from different pressure transmitters in the system, including the primary and secondary loops. These signals were pre-processed using the wavelet transform to filter both low-frequency and high-frequency bands. This technique of signal pre-processing introduces minimal distortion of the data and results in a more accurate estimation of the time constants of plant sensors using time-series modeling techniques.
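The sensor time-constant estimation mentioned above can be illustrated with a toy calculation. The sketch below models a first-order pressure sensor as an AR(1) process and recovers its time constant from simulated data; the AR(1) formulation, sampling rate, and all numbers are illustrative assumptions, not the project's actual toolbox.

```python
import math
import random

def fit_ar1(y):
    """Least-squares estimate of the AR(1) coefficient a in y[k] = a*y[k-1] + e[k]."""
    num = sum(y[k] * y[k - 1] for k in range(1, len(y)))
    den = sum(y[k - 1] ** 2 for k in range(1, len(y)))
    return num / den

def time_constant(a, dt):
    """Convert an AR(1) pole to a first-order sensor time constant: tau = -dt / ln(a)."""
    return -dt / math.log(a)

# Synthetic sensor noise with a known 2-second time constant, sampled at 10 Hz.
random.seed(1)
dt, tau_true = 0.1, 2.0
a_true = math.exp(-dt / tau_true)
y = [0.0]
for _ in range(20000):
    y.append(a_true * y[-1] + random.gauss(0.0, 1.0))

a_hat = fit_ar1(y)        # estimated AR(1) coefficient
tau_hat = time_constant(a_hat, dt)  # recovered time constant, seconds
```

With enough samples the recovered time constant lands close to the 2-second value used to generate the data, which is the essence of estimating plant-sensor response times from operational noise records.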

  8. Alaska | State, Local, and Tribal Governments | NREL

    Science.gov Websites

Alaska: Advancing Energy Solutions in Alaska. NREL provides objective, data-driven support to aid decision makers in Alaska as they take actions to deploy sustainable energy technologies, prepare for a clean-energy-driven economic transition, and reduce energy burdens in their jurisdictions.

  9. Assessing the Potential for Sediment Gravity-Driven Underflows at the Currently Active Mouth of the Huanghe Delta

    NASA Astrophysics Data System (ADS)

    Mullane, M.; Kumpf, L. L.; Kineke, G. C.

    2017-12-01

The Huanghe (Yellow River), once known for extremely high suspended-sediment concentrations (SSCs) that could produce hyperpycnal plumes (10s of g/l), has experienced a dramatic reduction in sediment load following the construction of several reservoirs, namely the Xiaolangdi reservoir completed in 1999. Except for managed flushing events, SSC in the lower river is now on the order of 1 g/l or less. Adaptations of the Chezy equation for gravity-driven transport show that the dominant parameters driving hyperpycnal underflows include concentration (and therefore density), thickness of the sediment-laden layer, and bed slope. The objectives of this research were to assess the potential for gravity-driven underflows given modern conditions at the active river mouth. Multiple shore-normal transects were conducted during research cruises in mid-July of 2016 and 2017 using a Knudsen dual-frequency echosounder to collect bathymetric data and to document the potential presence of fluid mud layers. An instrumented profiling tripod equipped with a CTD, optical backscatterance sensor, and in-situ pump system was used to sample water column parameters. SSCs were determined from near-bottom and surface water samples. Echosounder data were analyzed for bed slopes at the delta-front and differences in depth of return for the two frequencies (50 and 200 kHz), which could indicate fluid muds. Bathymetric data analysis yielded bed slope measurements near or above threshold values to produce gravity-driven underflows (0.46°). The maximum observed thickness of a potential fluid mud layer was 0.7 m, and the highest sampled near-bed SSCs were nearly 14 g/l for both field campaigns. These results indicate that the modern delta maintains potential for sediment gravity-driven underflows, even during ambient conditions prior to maximum summer discharge.
These results will inform future work quantitatively comparing the contributions of all sediment dispersal mechanisms near the active Huanghe delta environment, including advection of the buoyant river plume and wave resuspension and transport by tidal currents.
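A back-of-the-envelope version of the Chezy-type balance described above can be sketched as follows. The drag coefficient and the grain and seawater densities are assumed values chosen for illustration, not parameters from the study; only the SSC, layer thickness, and bed slope echo the reported observations.

```python
import math

G = 9.81        # gravitational acceleration, m/s^2
RHO_W = 1025.0  # assumed ambient seawater density, kg/m^3
RHO_S = 2650.0  # assumed quartz grain density, kg/m^3

def reduced_gravity(ssc):
    """Reduced gravity g' of a suspension with SSC given in kg/m^3 (numerically g/l)."""
    rho_mix = RHO_W + ssc * (1.0 - RHO_W / RHO_S)  # bulk density of the sediment-laden layer
    return G * (rho_mix - RHO_W) / RHO_W

def chezy_underflow_speed(ssc, thickness, slope_deg, drag=0.02):
    """Chezy-form balance of downslope buoyancy against friction:
    U = sqrt(g' * h * sin(slope) / c_d), with an assumed bulk drag coefficient."""
    gp = reduced_gravity(ssc)
    return math.sqrt(gp * thickness * math.sin(math.radians(slope_deg)) / drag)

# Conditions like those reported: ~14 g/l near-bed SSC, 0.7 m layer, 0.46 deg slope.
u = chezy_underflow_speed(14.0, 0.7, 0.46)
```

Under these assumptions the balance yields a slow but nonzero underflow speed (order 0.1 m/s), consistent with the paper's conclusion that the modern delta retains underflow potential even at modest SSCs.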

  10. [Rationalities of knowledge production: on transformations of objects, technologies and information in biomedicine and the life sciences].

    PubMed

    Paul, Norbert W

    2009-09-01

For decades, scientific change has been interpreted in the light of paradigm shifts and scientific revolutions. The Kuhnian interpretation of scientific change, however, is now increasingly confronted with non-disciplinary thinking in both science and studies on science. This paper explores how research in biomedicine and the life sciences can be characterized by different rationalities, sometimes converging, sometimes contradictory, all present at the same time with varying degrees of influence, impact, and visibility. In general, the rationality of objects is generated by fitting new objects and findings into a new experimental context. The rationality of hypotheses is a move towards the construction of novel explanatory tools and models. This often meshes inseparably with the third, technological rationality, in which a technology-driven, self-supporting and sometimes self-referential refinement of methods and technologies comes along with an extension into other fields. During the second and third phases, the new and emerging fields tend to expand their explanatory reach not only across disciplinary boundaries but also into the social sphere, creating what has been characterized as "exceptionalism" (e.g. genetic exceptionalism or neuro-exceptionalism). Finally, recent biomedicine and the life sciences have reached a level at which experimental work becomes more and more data-driven, because the technologically constructed experimental systems generate a plethora of findings (data) which at some point start to blur the original hypotheses. Under the rationality of information, the materiality of research practices becomes secondary and research objects increasingly recede from view. Ultimately, the credibility of science as a practice becomes more and more dependent on consensus about the applicability and relevance of its results.
The rationality of interest (and accountability) has become increasingly characteristic of a research process which is no longer primarily determined by the desire for knowledge but by the desire for relevance. This paper explores the ways in which object-driven and hypothesis-driven experimental life sciences transformed into domains of experimental research evolving in a technologically constructed, data-driven environment in which they are subjected to constant morphing due to the forces of different rationalities.

  11. Effectiveness of User- and Expert-Driven Web-based Hypertension Programs: an RCT.

    PubMed

    Liu, Sam; Brooks, Dina; Thomas, Scott G; Eysenbach, Gunther; Nolan, Robert P

    2018-04-01

The effectiveness of self-guided Internet-based lifestyle counseling (e-counseling) varies, depending on treatment protocol. Two dominant procedures in e-counseling are expert-driven and user-driven. The influence of these procedures on hypertension management remains unclear. The objective was to assess whether blood pressure improved with expert-driven or user-driven e-counseling over a control intervention in patients with hypertension over a 4-month period. This study used a three-parallel-group, double-blind randomized controlled design. In Toronto, Canada, 128 participants (aged 35-74 years) with hypertension were recruited. Participants were recruited using online and poster advertisements. Data collection took place between June 2012 and June 2014. Data were analyzed from October 2014 to December 2016. Controls received a weekly e-mail newsletter regarding hypertension management. The expert-driven group was prescribed a weekly exercise and diet plan (e.g., increase 1,000 steps/day this week). The user-driven group received a weekly e-mail, which allowed participants to choose their intervention goals (e.g., [1] feel more confident to change my lifestyle, or [2] self-help tips for exercise or a heart-healthy diet). The primary outcome was systolic blood pressure measured at baseline and 4-month follow-up. Secondary outcomes included cholesterol, 10-year Framingham cardiovascular risk, daily steps, and dietary habits. The expert-driven group showed a greater systolic blood pressure decrease than controls at follow-up (expert-driven versus control: -7.5 mmHg, 95% CI= -12.5, -2.6, p=0.01). Systolic blood pressure reduction did not significantly differ between the user- and expert-driven groups. Expert-driven compared with controls also showed a significant improvement in pulse pressure, cholesterol, and Framingham risk score. The expert-driven intervention was significantly more effective than both the user-driven and control groups in increasing daily steps and fruit intake.
It may be advisable to incorporate an expert-driven e-counseling protocol in order to accommodate participants with greater motivation to change their lifestyle behaviors, but more studies are needed. This study is registered at www.clinicaltrials.gov NCT03111836. Copyright © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  12. Mechanisms of object recognition: what we have learned from pigeons

    PubMed Central

    Soto, Fabian A.; Wasserman, Edward A.

    2014-01-01

    Behavioral studies of object recognition in pigeons have been conducted for 50 years, yielding a large body of data. Recent work has been directed toward synthesizing this evidence and understanding the visual, associative, and cognitive mechanisms that are involved. The outcome is that pigeons are likely to be the non-primate species for which the computational mechanisms of object recognition are best understood. Here, we review this research and suggest that a core set of mechanisms for object recognition might be present in all vertebrates, including pigeons and people, making pigeons an excellent candidate model to study the neural mechanisms of object recognition. Behavioral and computational evidence suggests that error-driven learning participates in object category learning by pigeons and people, and recent neuroscientific research suggests that the basal ganglia, which are homologous in these species, may implement error-driven learning of stimulus-response associations. Furthermore, learning of abstract category representations can be observed in pigeons and other vertebrates. Finally, there is evidence that feedforward visual processing, a central mechanism in models of object recognition in the primate ventral stream, plays a role in object recognition by pigeons. We also highlight differences between pigeons and people in object recognition abilities, and propose candidate adaptive specializations which may explain them, such as holistic face processing and rule-based category learning in primates. From a modern comparative perspective, such specializations are to be expected regardless of the model species under study. The fact that we have a good idea of which aspects of object recognition differ in people and pigeons should be seen as an advantage over other animal models. From this perspective, we suggest that there is much to learn about human object recognition from studying the “simple” brains of pigeons. PMID:25352784
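The error-driven learning implicated above in category learning by pigeons and people can be illustrated with a minimal delta-rule sketch; the stimuli, learning rate, and feature coding are invented for the example, not taken from the reviewed experiments.

```python
def train_delta_rule(samples, n_features, lr=0.1, epochs=50):
    """Error-driven (delta-rule) update of stimulus-response weights:
    w <- w + lr * (target - prediction) * x."""
    w = [0.0] * n_features
    for _ in range(epochs):
        for x, target in samples:
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = target - pred          # prediction error drives all learning
            for i in range(n_features):
                w[i] += lr * err * x[i]  # only active features are updated
    return w

# Two toy stimulus-category pairs, each cued by a single active feature.
samples = [([1, 0], 1.0), ([0, 1], 0.0)]
w = train_delta_rule(samples, 2)
```

After training, the weight for the category-predictive feature approaches 1 while the irrelevant feature stays near 0, the signature cue-competition behavior that error-driven accounts use to explain category learning in both species.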

  13. Microwave Driven Actuators Power Allocation and Distribution

    NASA Technical Reports Server (NTRS)

    Forbes, Timothy; Song, Kyo D.

    2000-01-01

The design, fabrication, and test of a power allocation and distribution (PAD) network for microwave-driven actuators are presented in this paper. A circuit that collects power from a rectenna array, amplifies it, and distributes it to actuators was designed and fabricated for space application in an actuator array driven by microwaves. A P-SPICE model was constructed initially for data reduction purposes, and was followed by a working real-world model. A voltage up-converter (VUC) is used to amplify the voltage from the individual rectenna. Testing yielded a 26:1 voltage amplification ratio, with an input voltage of 9 volts and a measured output voltage of 230 VDC. Future work includes the miniaturization of the circuitry, the use of microwave remote control, and voltage amplification technology for each voltage source. The objective of this work is to develop a model system that will collect DC voltage from an array of rectennas and propagate the voltage to an array of actuators.

  14. Validation of Pre-operative Patient Self-Assessment of Cardiac Risk for Non-Cardiac Surgery: Foundations for Decision Support

    PubMed Central

    Manaktala, Sharad; Rockwood, Todd; Adam, Terrence J.

    2013-01-01

Objectives: To better characterize patient understanding of their risk of cardiac complications from non-cardiac surgery and to develop a patient-driven clinical decision support system for preoperative patient risk management. Methods: A patient-driven preoperative self-assessment decision support tool for perioperative assessment was created. Patients’ self-perception of cardiac risk and self-report data for risk factors were compared with gold-standard preoperative physician assessment to evaluate agreement. Results: The patient-generated cardiac risk profile was used for risk score generation and had excellent agreement with the expert physician assessment. However, patients’ subjective self-perception of cardiovascular complication risk had poor agreement with expert assessment. Conclusion: A patient-driven cardiac risk assessment tool provides a high degree of agreement with expert provider assessment, demonstrating clinical feasibility. The limited agreement between provider risk assessment and patient self-perception underscores the need for further work, including focused preoperative patient education on cardiac risk. PMID:24551384
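Rater agreement of the kind evaluated here, patient self-report versus expert assessment, is commonly quantified with a chance-corrected statistic such as Cohen's kappa. A minimal sketch with invented binary risk ratings follows; the abstract does not specify which agreement statistic the study used, so this is illustrative only.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    categories = sorted(set(rater_a) | set(rater_b))
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal category frequencies.
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical risk classes (0 = low, 1 = elevated) from patient vs. expert.
patient = [0, 0, 1, 1, 0, 1, 0, 0]
expert  = [0, 0, 1, 0, 0, 1, 0, 1]
kappa = cohens_kappa(patient, expert)
```

For these invented ratings the raw agreement is 6/8 but kappa is noticeably lower (about 0.47), which is exactly why chance-corrected statistics are preferred when comparing self-perception against expert judgment.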

  15. Interactive Classification of Construction Materials: Feedback Driven Framework for Annotation and Analysis of 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Hess, M. R.; Petrovic, V.; Kuester, F.

    2017-08-01

Digital documentation of cultural heritage structures is increasingly common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback-driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials, with the goal of building more accurate as-built information models of historical structures. User-defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector, or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
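A user-defined classification function of the kind described, keyed to observed color and laser intensity, might look like the following sketch. The material labels and thresholds are invented for illustration and stand in for rules a conservator would tune interactively in the framework.

```python
def classify_point(color_rgb, intensity):
    """Illustrative per-point rule: assign a material label from RGB color and
    normalized laser intensity. Thresholds are invented, not from the paper."""
    r, g, b = color_rgb
    if intensity > 0.8 and r > 200 and g > 200 and b > 200:
        return "plaster"        # bright, highly reflective surface
    if r > 150 and g < 110 and b < 110:
        return "brick"          # reddish, regardless of intensity
    return "unclassified"       # left for manual annotation

# Toy points: ((R, G, B), normalized intensity).
points = [
    ((230, 225, 220), 0.9),
    ((180, 90, 80), 0.4),
    ((60, 60, 60), 0.2),
]
labels = [classify_point(c, i) for c, i in points]
```

In an interactive setting such a function would be run over millions of points, with the user inspecting the result and refining thresholds, which is the feedback loop the framework is built around.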

  16. A method for deriving leading causes of death.

    PubMed Central

    Becker, Roberto; Silvi, John; Ma Fat, Doris; L'Hours, André; Laurenti, Ruy

    2006-01-01

    OBJECTIVE: A standard list for ranking leading causes of death worldwide does not exist. WHO headquarters, regional offices and Member States all use different lists that have varying levels of detail. We sought to derive a standard list to enable countries to identify their leading causes of death and to permit comparison between countries. Our aim is to share the criteria and methodology we used to bring some order to the construction of such a list, to provide a consistent procedure that can be used by others, and to give researchers and data owners an opportunity to utilize the list at national and subnational levels. METHODS: Results were primarily data-driven. Data from individual countries representing different regions of the world were extracted from the WHO Mortality Database. Supplementary information from WHO estimates on mortality was used for regions where data were scarce. In addition, a set of criteria was used to group the candidate causes and to determine other causes that should be included on the list. FINDINGS: A ranking list of the leading causes of death that contains broad cause groupings (such as "all cancers", "all heart diseases" or "all accidents") is not effective and does not identify the leading individual causes within these broad groupings; thus it does not allow policy-makers to generate appropriate health advocacy and cost-effective interventions. Similarly, defining candidate causal groups too narrowly or including diseases that have a low frequency does not meet these objectives. CONCLUSION: For international comparisons, we recommend that countries use this list; it is based on extensive evidence and the application of public health disease-prevention criteria. It is not driven by political or financial motives. This list may be adapted for national statistical purposes. PMID:16628303
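The core of such a ranking procedure, counting deaths per specific cause while excluding overly broad residual groupings, can be sketched as below. The cause names, counts, and the "all ..." prefix convention are illustrative assumptions; the actual WHO list is built from detailed criteria, not a string test.

```python
def rank_leading_causes(deaths_by_cause, top_n=5):
    """Rank specific causes of death by count, excluding broad groupings
    (flagged here, for illustration only, by an 'all ' prefix)."""
    specific = {
        cause: n for cause, n in deaths_by_cause.items()
        if not cause.lower().startswith("all ")
    }
    return sorted(specific.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Invented national tabulation (counts are not real data).
counts = {
    "Ischaemic heart disease": 1200,
    "Stroke": 900,
    "All cancers": 2000,  # broad residual grouping: excluded from ranking
    "Lower respiratory infections": 600,
    "Road traffic injuries": 300,
}
top = rank_leading_causes(counts, top_n=3)
```

Note how the broad "All cancers" grouping would otherwise dominate the ranking while telling policy-makers nothing actionable, which is precisely the failure mode the paper's criteria are designed to avoid.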

  17. The effect of occlusion on the semantics of projective spatial terms: a case study in grounding language in perception.

    PubMed

    Kelleher, John D; Ross, Robert J; Sloan, Colm; Mac Namee, Brian

    2011-02-01

    Although data-driven spatial template models provide a practical and cognitively motivated mechanism for characterizing spatial term meaning, the influence of perceptual rather than solely geometric and functional properties has yet to be systematically investigated. In the light of this, in this paper, we investigate the effects of the perceptual phenomenon of object occlusion on the semantics of projective terms. We did this by conducting a study to test whether object occlusion had a noticeable effect on the acceptance values assigned to projective terms with respect to a 2.5-dimensional visual stimulus. Based on the data collected, a regression model was constructed and presented. Subsequent analysis showed that the regression model that included the occlusion factor outperformed an adaptation of Regier & Carlson's well-regarded AVS model for that same spatial configuration.
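A regression of the reported kind, predicting acceptability ratings from a geometric term plus an occlusion indicator, can be sketched with ordinary least squares. The synthetic ratings below are invented to mimic the qualitative finding (acceptability falls with angular deviation and drops further under occlusion); they are not the study's data, and the study's actual model form may differ.

```python
def lstsq(X, y):
    """Ordinary least squares via the normal equations (X^T X) beta = X^T y,
    solved by Gaussian elimination with partial pivoting; fine for few predictors."""
    k = len(X[0])
    A = [[sum(row[p] * row[q] for row in X) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in reversed(range(k)):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Invented ratings: (angular deviation in degrees, occluded flag, acceptability).
data = [(0.0, 0, 0.95), (15.0, 0, 0.85), (30.0, 0, 0.72),
        (0.0, 1, 0.75), (15.0, 1, 0.66), (30.0, 1, 0.52)]
X = [[1.0, ang, float(occ)] for ang, occ, _ in data]
y = [rating for _, _, rating in data]
b0, b_angle, b_occ = lstsq(X, y)
```

On this toy data the fitted occlusion coefficient is clearly negative, the pattern that would justify adding an occlusion factor to a spatial-template model.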

  18. Data-driven grasp synthesis using shape matching and task-based pruning.

    PubMed

    Li, Ying; Fu, Jiaxin L; Pollard, Nancy S

    2007-01-01

    Human grasps, especially whole-hand grasps, are difficult to animate because of the high number of degrees of freedom of the hand and the need for the hand to conform naturally to the object surface. Captured human motion data provides us with a rich source of examples of natural grasps. However, for each new object, we are faced with the problem of selecting the best grasp from the database and adapting it to that object. This paper presents a data-driven approach to grasp synthesis. We begin with a database of captured human grasps. To identify candidate grasps for a new object, we introduce a novel shape matching algorithm that matches hand shape to object shape by identifying collections of features having similar relative placements and surface normals. This step returns many grasp candidates, which are clustered and pruned by choosing the grasp best suited for the intended task. For pruning undesirable grasps, we develop an anatomically-based grasp quality measure specific to the human hand. Examples of grasp synthesis are shown for a variety of objects not present in the original database. This algorithm should be useful both as an animator tool for posing the hand and for automatic grasp synthesis in virtual environments.
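The database-lookup step of such a data-driven pipeline can be caricatured as nearest-neighbor matching on a compact shape descriptor. The grasp names and descriptor components below are invented for illustration; the paper's actual matcher compares collections of features by relative placement and surface normals rather than a fixed-length vector.

```python
import math

def descriptor_distance(d1, d2):
    """Euclidean distance between simple shape descriptors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(d1, d2)))

def best_grasp(database, query_descriptor):
    """Return the stored grasp whose object descriptor best matches the query."""
    return min(database,
               key=lambda entry: descriptor_distance(entry["descriptor"], query_descriptor))

# Toy database: descriptor = (characteristic width, height, mean normal angle).
database = [
    {"grasp": "power-cylinder", "descriptor": (6.0, 12.0, 0.1)},
    {"grasp": "precision-pinch", "descriptor": (2.0, 3.0, 0.8)},
    {"grasp": "spherical-wrap", "descriptor": (7.0, 7.0, 0.5)},
]
match = best_grasp(database, (6.5, 11.0, 0.2))  # a new, roughly cylindrical object
```

After this retrieval step, the paper's pipeline would still cluster the candidates, prune them with the task-based quality measure, and adapt the chosen hand pose to the new object's surface.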

  19. Neural dynamics of object-based multifocal visual spatial attention and priming: Object cueing, useful-field-of-view, and crowding

    PubMed Central

    Foley, Nicholas C.; Grossberg, Stephen; Mingolla, Ennio

    2015-01-01

How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued locations? What factors underlie individual differences in the timing and frequency of such attentional shifts? How do transient and sustained spatial attentional mechanisms work and interact? How can volition, mediated via the basal ganglia, influence the span of spatial attention? A neural model is developed of how spatial attention in the where cortical stream coordinates view-invariant object category learning in the what cortical stream under free viewing conditions. The model simulates psychological data about the dynamics of covert attention priming and switching requiring multifocal attention without eye movements. The model predicts how “attentional shrouds” are formed when surface representations in cortical area V4 resonate with spatial attention in posterior parietal cortex (PPC) and prefrontal cortex (PFC), while shrouds compete among themselves for dominance. Winning shrouds support invariant object category learning, and active surface-shroud resonances support conscious surface perception and recognition. Attentive competition between multiple objects and cues simulates reaction-time data from the two-object cueing paradigm. The relative strength of sustained surface-driven and fast-transient motion-driven spatial attention controls individual differences in reaction time for invalid cues. Competition between surface-driven attentional shrouds controls individual differences in detection rate of peripheral targets in useful-field-of-view tasks. The model proposes how the strength of competition can be mediated, through learning or momentary changes in volition, by the basal ganglia.
A new explanation of crowding shows how the cortical magnification factor, among other variables, can cause multiple object surfaces to share a single surface-shroud resonance, thereby preventing recognition of the individual objects. PMID:22425615

  20. Neural dynamics of object-based multifocal visual spatial attention and priming: object cueing, useful-field-of-view, and crowding.

    PubMed

    Foley, Nicholas C; Grossberg, Stephen; Mingolla, Ennio

    2012-08-01

How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued locations? What factors underlie individual differences in the timing and frequency of such attentional shifts? How do transient and sustained spatial attentional mechanisms work and interact? How can volition, mediated via the basal ganglia, influence the span of spatial attention? A neural model is developed of how spatial attention in the where cortical stream coordinates view-invariant object category learning in the what cortical stream under free viewing conditions. The model simulates psychological data about the dynamics of covert attention priming and switching requiring multifocal attention without eye movements. The model predicts how "attentional shrouds" are formed when surface representations in cortical area V4 resonate with spatial attention in posterior parietal cortex (PPC) and prefrontal cortex (PFC), while shrouds compete among themselves for dominance. Winning shrouds support invariant object category learning, and active surface-shroud resonances support conscious surface perception and recognition. Attentive competition between multiple objects and cues simulates reaction-time data from the two-object cueing paradigm. The relative strength of sustained surface-driven and fast-transient motion-driven spatial attention controls individual differences in reaction time for invalid cues. Competition between surface-driven attentional shrouds controls individual differences in detection rate of peripheral targets in useful-field-of-view tasks. The model proposes how the strength of competition can be mediated, through learning or momentary changes in volition, by the basal ganglia.
A new explanation of crowding shows how the cortical magnification factor, among other variables, can cause multiple object surfaces to share a single surface-shroud resonance, thereby preventing recognition of the individual objects. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Electrically Driven Liquid Film Boiling Experiment

    NASA Technical Reports Server (NTRS)

    Didion, Jeffrey R.

    2016-01-01

This presentation describes the science background and ground-based results that form the basis of the Electrically Driven Liquid Film Boiling Experiment, an ISS experiment that is manifested for 2021. Objective: Characterize the effects of gravity on the interaction of electric and flow fields in the presence of phase change, specifically pertaining to: a) the effects of microgravity on the electrically generated two-phase flow; b) the effects of microgravity on electrically driven liquid film boiling (including extreme heat fluxes). Electro-wetting of the boiling section will repel the bubbles away from the heated surface in a microgravity environment. Relevance/Impact: Provides a phenomenological foundation for the development of electric-field-based two-phase thermal management systems leveraging EHD, permitting optimization of heat transfer surface-area-to-volume ratios as well as achievement of high heat transfer coefficients, thus resulting in system mass and volume savings. EHD replaces buoyancy- or flow-driven bubble removal from the heated surface. Development Approach: Conduct preliminary experiments in low-gravity and ground-based facilities to refine the technique and obtain preliminary data for model development. The ISS environment is required to characterize the electro-wetting effect on nucleate boiling and CHF in the absence of gravity. The experiment will operate in the FIR and is designed for autonomous operation.

  2. Creating a driving profile for older adults using GPS devices and naturalistic driving methodology.

    PubMed

    Babulal, Ganesh M; Traub, Cindy M; Webb, Mollie; Stout, Sarah H; Addison, Aaron; Carr, David B; Ott, Brian R; Morris, John C; Roe, Catherine M

    2016-01-01

Background/Objectives: Road tests and driving simulators are most commonly used in research studies and clinical evaluations of older drivers. Our objective was to describe the process and associated challenges in adapting an existing, commercial, off-the-shelf (COTS), in-vehicle device for naturalistic, longitudinal research to better understand daily driving behavior in older drivers. Design: The Azuga G2 Tracking Device™ was installed in each participant's vehicle, and we collected data (speed, latitude/longitude) every 30 seconds over 5 months whenever the vehicle was driven. Setting: The Knight Alzheimer's Disease Research Center at Washington University School of Medicine. Participants: Five individuals enrolled in a larger, longitudinal study assessing preclinical Alzheimer disease and driving performance. Participants were aged 65+ years and had normal cognition. Measurements: Spatial components included primary location(s), driving areas, mean centers, and unique destinations. Temporal components included the number of trips taken during different times of the day. Behavioral components included the number of hard-braking, speeding, and sudden-acceleration events. Methods: Individual 30-second observations, each comprising one breadcrumb, and trip-level data were collected and analyzed in R and ArcGIS. Results: Primary locations were confirmed to be 100% accurate when compared to known addresses. Based on the locations of the breadcrumbs, we were able to successfully identify frequently visited locations and general travel patterns. Based on the reported time from the breadcrumbs, we could assess the number of trips driven in daylight vs. night. Data on additional events while driving allowed us to compute the number of adverse driving alerts over the course of the 5-month period.
Conclusions: Compared to the cameras and highly instrumented vehicles used in other naturalistic studies, the compact COTS device was quickly installed and transmitted high volumes of data. Driving profiles for older adults can be created and compared month-to-month or year-to-year, allowing researchers to identify changes in driving patterns that are unavailable in controlled conditions.
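Two of the derived measures, the mean center of breadcrumb locations and day-versus-night trip counts, are straightforward to compute from the raw data. A minimal sketch with invented coordinates, trip times, and a 7:00-19:00 daylight window follows (the study itself used R and ArcGIS; field names here are assumptions):

```python
from datetime import datetime

def mean_center(breadcrumbs):
    """Unweighted mean center (centroid) of breadcrumb lat/lon pairs."""
    lats = [b["lat"] for b in breadcrumbs]
    lons = [b["lon"] for b in breadcrumbs]
    return sum(lats) / len(lats), sum(lons) / len(lons)

def day_night_counts(trip_start_times, day_start=7, day_end=19):
    """Count trips starting in an assumed daylight window vs. at night."""
    day = sum(1 for t in trip_start_times if day_start <= t.hour < day_end)
    return day, len(trip_start_times) - day

# Invented 30-second breadcrumbs and trip start times.
crumbs = [
    {"lat": 38.648, "lon": -90.305},
    {"lat": 38.650, "lon": -90.301},
    {"lat": 38.652, "lon": -90.309},
]
center = mean_center(crumbs)
starts = [datetime(2016, 5, 1, 8, 30),
          datetime(2016, 5, 1, 14, 0),
          datetime(2016, 5, 1, 21, 15)]
day, night = day_night_counts(starts)
```

Tracking how the mean center drifts or how the day/night trip ratio shifts month-to-month is one concrete way a longitudinal driving profile can flag changing behavior.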

  3. Ontology-Based Retrieval of Spatially Related Objects for Location Based Services

    NASA Astrophysics Data System (ADS)

    Haav, Hele-Mai; Kaljuvee, Aivi; Luts, Martin; Vajakas, Toivo

Advanced Location Based Service (LBS) applications have to integrate information stored in GIS, information about users' preferences (profiles), contextual information, and information about the application itself. Ontology engineering provides methods to semantically integrate several data sources. We propose an ontology-driven LBS development framework: the paper describes the architecture of the ontologies and their usage for retrieval of spatially related objects relevant to the user. Our main contribution is to enable personalised ontology-driven LBS by providing a novel approach for defining personalised semantic spatial relationships by means of ontologies. The approach is illustrated by an industrial case study.

  4. Model‐Based Approach to Predict Adherence to Protocol During Antiobesity Trials

    PubMed Central

    Sharma, Vishnu D.; Combes, François P.; Vakilynejad, Majid; Lahu, Gezim; Lesko, Lawrence J.

    2017-01-01

Abstract Development of antiobesity drugs is continuously challenged by high dropout rates during clinical trials. The objective was to develop a population pharmacodynamic model that describes the temporal changes in body weight, considering disease progression, lifestyle intervention, and drug effects. Markov modeling (MM) was applied for quantification and characterization of responder and nonresponder status as key drivers of dropout rates, to ultimately support clinical trial simulations and the outcome in terms of trial adherence. Subjects (n = 4591) from 6 Contrave® trials were included in this analysis. An indirect‐response model developed by van Wart et al. was used as a starting point. Inclusion of drug effect was dose-driven, using a population dose‐ and time‐dependent pharmacodynamic (DTPD) model. Additionally, a population‐pharmacokinetic parameter‐ and data (PPPD)‐driven model was developed using the final DTPD model structure and final parameter estimates from a previously developed population pharmacokinetic model based on available Contrave® pharmacokinetic concentrations. Last, an MM was developed to predict transition rate probabilities among responder, nonresponder, and dropout states, driven by the pharmacodynamic effect resulting from the DTPD or PPPD model. Covariates included in the models were diabetes mellitus and race. The linked DTPD‐MM and PPPD‐MM were able to predict transition rates among responder, nonresponder, and dropout states well. The analysis concluded that body‐weight change is an important factor influencing dropout rates, and the MM depicted that overall a DTPD model‐driven approach provides a reasonable prediction of clinical trial outcome probabilities similar to a pharmacokinetic‐driven approach. PMID:28858397
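The responder/nonresponder/dropout structure can be illustrated with a small three-state Markov simulation. The weekly transition probabilities below are invented for the sketch, not estimates from the Contrave® trials, and the real model drives these rates with the pharmacodynamic effect rather than fixing them.

```python
import random

STATES = ["responder", "nonresponder", "dropout"]

# Hypothetical weekly transition probabilities (rows sum to 1); dropout is absorbing.
P = {
    "responder":    {"responder": 0.85, "nonresponder": 0.10, "dropout": 0.05},
    "nonresponder": {"responder": 0.15, "nonresponder": 0.70, "dropout": 0.15},
    "dropout":      {"responder": 0.00, "nonresponder": 0.00, "dropout": 1.00},
}

def simulate_subject(weeks, rng):
    """Sample one subject's weekly state trajectory; return the final state."""
    state = "nonresponder"  # everyone starts untreated/non-responding
    for _ in range(weeks):
        r, cum = rng.random(), 0.0
        for nxt in STATES:
            cum += P[state][nxt]
            if r < cum:
                state = nxt
                break
    return state

rng = random.Random(0)
final_states = [simulate_subject(16, rng) for _ in range(2000)]
dropout_rate = final_states.count("dropout") / len(final_states)
```

Even with modest weekly dropout hazards, the absorbing dropout state accumulates most subjects over a 16-week trial, which is the adherence problem the linked DTPD-MM approach is meant to predict and mitigate.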

  5. 78 FR 68073 - Announcement of Solicitation of Written Comments on Modifications of Healthy People 2020 Objectives

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ..., health-service, or policy interventions. 3. Objectives should drive actions that will work toward the... populations categorized by race/ethnicity, socioeconomic status, gender, disability status, sexual orientation... care. 9. Healthy People 2020, like past versions, is heavily data driven. Valid, reliable, nationally...

  6. 77 FR 62514 - Announcement of Solicitation of Written Comments on Modifications of Healthy People 2020 Objectives

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-15

    ... interventions. 3. Objectives should drive actions that will work toward the achievement of the proposed targets... populations categorized by race/ethnicity, socioeconomic status, gender, disability status, sexual orientation... care. 9. Healthy People 2020, like past versions, will be heavily data driven. Valid, reliable...

  7. What are the most effective strategies for improving quality and safety of health care?

    PubMed

    Scott, I

    2009-06-01

    There is now a plethora of different quality improvement strategies (QIS) for optimizing health care, some clinician/patient driven, others manager/policy-maker driven. Which of these are most effective remains unclear despite expressed concerns about potential for QIS-related patient harm and wasting of resources. The objective of this study was to review published literature assessing the relative effectiveness of different QIS. Data sources comprising PubMed Clinical Queries, Cochrane Library and its Effective Practice and Organization of Care database, and HealthStar were searched for studies of QIS between January 1985 and February 2008 using search terms based on an a priori QIS classification suggested by experts. Systematic reviews of controlled trials were selected in determining effect sizes for specific QIS, which were compared as a narrative meta-review. Clinician/patient driven QIS were associated with stronger evidence of efficacy and larger effect sizes than manager/policy-maker driven QIS. The most effective strategies (>10% absolute increase in appropriate care or equivalent measure) included clinician-directed audit and feedback cycles, clinical decision support systems, specialty outreach programmes, chronic disease management programmes, continuing professional education based on interactive small-group case discussions, and patient-mediated clinician reminders. Pay-for-performance schemes directed to clinician groups and organizational process redesign were modestly effective. Other manager/policy-maker driven QIS including continuous quality improvement programmes, risk and safety management systems, public scorecards and performance reports, external accreditation, and clinical governance arrangements have not been adequately evaluated with regard to effectiveness. QIS are heterogeneous and methodological flaws in much of the evaluative literature limit validity and generalizability of results. 
Based on current best available evidence, clinician/patient driven QIS appear to be more effective than manager/policy-maker driven QIS although the latter have, in many instances, attracted insufficient robust evaluations to accurately determine their comparative effectiveness.

  8. DDDAS for space applications

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Pham, Khanh D.; Shen, Dan; Chen, Genshe

    2018-05-01

The dynamic data-driven applications systems (DDDAS) paradigm injects measurements into the execution model for enhanced system performance. One area of interest in DDDAS is space situation awareness (SSA). For SSA, data are collected about the space environment to determine object motions, environments, and model updates. Dynamic coupling between data and models enhances the capabilities of each system by complementing models with data for system control, execution, and sensor management. The paper overviews recent developments in SSA made possible by DDDAS techniques, including object detection, resident space object tracking, atmospheric models for enhanced sensing, cyber protection, and information management.

  9. A data-driven, knowledge-based approach to biomarker discovery: application to circulating microRNA markers of colorectal cancer prognosis.

    PubMed

    Vafaee, Fatemeh; Diakos, Connie; Kirschner, Michaela B; Reid, Glen; Michael, Michael Z; Horvath, Lisa G; Alinejad-Rokny, Hamid; Cheng, Zhangkai Jason; Kuncic, Zdenka; Clarke, Stephen

    2018-01-01

    Recent advances in high-throughput technologies have provided an unprecedented opportunity to identify molecular markers of disease processes. This plethora of complex-omics data has simultaneously complicated the problem of extracting meaningful molecular signatures and opened up new opportunities for more sophisticated integrative and holistic approaches. In this era, effective integration of data-driven and knowledge-based approaches for biomarker identification has been recognised as key to improving the identification of high-performance biomarkers, and necessary for translational applications. Here, we have evaluated the role of circulating microRNA as a means of predicting the prognosis of patients with colorectal cancer, which is the second leading cause of cancer-related death worldwide. We have developed a multi-objective optimisation method that effectively integrates a data-driven approach with the knowledge obtained from the microRNA-mediated regulatory network to identify robust plasma microRNA signatures which are reliable in terms of predictive power as well as functional relevance. The proposed multi-objective framework has the capacity to adjust for conflicting biomarker objectives and to incorporate heterogeneous information facilitating systems approaches to biomarker discovery. We have found a prognostic signature of colorectal cancer comprising 11 circulating microRNAs. The identified signature predicts the patients' survival outcome and targets pathways underlying colorectal cancer progression. The altered expression of the identified microRNAs was confirmed in an independent public data set of plasma samples of patients in early stage vs advanced colorectal cancer. Furthermore, the generality of the proposed method was demonstrated across three publicly available miRNA data sets associated with biomarker studies in other diseases.
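The multi-objective framework described above balances conflicting biomarker objectives (predictive power vs. functional relevance). A minimal sketch of that idea is Pareto-optimal selection over two scores; the candidate names and scores below are invented, and the published method uses a far richer optimisation over real expression and regulatory-network data:

```python
# Toy multi-objective marker selection: keep the Pareto-optimal candidates
# when balancing predictive power against functional (network) relevance.
# Names and scores are invented for illustration only.

def pareto_front(candidates):
    """Return names of candidates not dominated on (predictive, relevance)."""
    front = []
    for name, pred, rel in candidates:
        dominated = any(
            (p >= pred and r >= rel) and (p > pred or r > rel)
            for _, p, r in candidates
        )
        if not dominated:
            front.append(name)
    return front

candidates = [
    ("miR-A", 0.90, 0.20),
    ("miR-B", 0.70, 0.70),
    ("miR-C", 0.60, 0.60),   # dominated by miR-B on both objectives
    ("miR-D", 0.30, 0.95),
]
print(pareto_front(candidates))  # ['miR-A', 'miR-B', 'miR-D']
```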

  10. SIOExplorer: Modern IT Methods and Tools for Digital Library Management

    NASA Astrophysics Data System (ADS)

    Sutton, D. W.; Helly, J.; Miller, S.; Chase, A.; Clarck, D.

    2003-12-01

With more geoscience disciplines becoming data-driven, it is increasingly important to utilize modern techniques for data, information and knowledge management. SIOExplorer is a new digital library project with 2 terabytes of oceanographic data collected over the last 50 years on 700 cruises by the Scripps Institution of Oceanography. It is built using a suite of information technology tools and methods that allow for an efficient and effective digital library management system. The library consists of a number of independent collections, each with corresponding metadata formats. The system architecture allows each collection to be built and uploaded based on a collection-dependent metadata template file (MTF). This file is used to create the hierarchical structure of the collection, create metadata tables in a relational database, and populate metadata for each object and for the collection as a whole. Collections comprise arbitrary digital objects stored at the San Diego Supercomputer Center (SDSC) High Performance Storage System (HPSS) and managed using the Storage Resource Broker (SRB), data-handling middleware developed at SDSC. SIOExplorer interoperates with other collections as a data provider through the Open Archives Initiative (OAI) protocol. The user services for SIOExplorer are accessed from CruiseViewer, a Java application served using Java Web Start from the SIOExplorer home page. CruiseViewer is an advanced tool for data discovery and access. It implements general keyword and interactive geospatial search methods for the collections, georeferencing search results on user-selected basemaps such as global topography or crustal age. User services include metadata viewing, opening of selected mime-type digital objects (such as images, documents and grid files), and downloading of objects (including the brokering of proprietary hold restrictions).

  11. Data-driven classification of bipolar I disorder from longitudinal course of mood.

    PubMed

    Cochran, A L; McInnis, M G; Forger, D B

    2016-10-11

The Diagnostic and Statistical Manual of Mental Disorders (DSM) classification of bipolar disorder defines categories to reflect common understanding of mood symptoms rather than scientific evidence. This work aimed to determine whether bipolar I can be objectively classified from longitudinal mood data and whether the resulting classes have clinical associations. Bayesian nonparametric hierarchical models with latent classes and patient-specific models of mood are fit to data from Longitudinal Interval Follow-up Evaluations (LIFE) of bipolar I patients (N=209). Classes are tested for clinical associations. No classes are justified using the time course of DSM-IV mood states. Three classes are justified using the course of subsyndromal mood symptoms. Classes differed in attempted suicides (P=0.017), disability status (P=0.012) and chronicity of affective symptoms (P=0.009). Thus, bipolar I disorder can be objectively classified from mood course, and individuals in the resulting classes share clinical features. Data-driven classification from mood course could be used to enrich sample populations for pharmacological and etiological studies.

  12. Model-Based Approach to Predict Adherence to Protocol During Antiobesity Trials.

    PubMed

    Sharma, Vishnu D; Combes, François P; Vakilynejad, Majid; Lahu, Gezim; Lesko, Lawrence J; Trame, Mirjam N

    2018-02-01

    Development of antiobesity drugs is continuously challenged by high dropout rates during clinical trials. The objective was to develop a population pharmacodynamic model that describes the temporal changes in body weight, considering disease progression, lifestyle intervention, and drug effects. Markov modeling (MM) was applied for quantification and characterization of responder and nonresponder as key drivers of dropout rates, to ultimately support the clinical trial simulations and the outcome in terms of trial adherence. Subjects (n = 4591) from 6 Contrave ® trials were included in this analysis. An indirect-response model developed by van Wart et al was used as a starting point. Inclusion of drug effect was dose driven using a population dose- and time-dependent pharmacodynamic (DTPD) model. Additionally, a population-pharmacokinetic parameter- and data (PPPD)-driven model was developed using the final DTPD model structure and final parameter estimates from a previously developed population pharmacokinetic model based on available Contrave ® pharmacokinetic concentrations. Last, MM was developed to predict transition rate probabilities among responder, nonresponder, and dropout states driven by the pharmacodynamic effect resulting from the DTPD or PPPD model. Covariates included in the models and parameters were diabetes mellitus and race. The linked DTPD-MM and PPPD-MM was able to predict transition rates among responder, nonresponder, and dropout states well. The analysis concluded that body-weight change is an important factor influencing dropout rates, and the MM depicted that overall a DTPD model-driven approach provides a reasonable prediction of clinical trial outcome probabilities similar to a pharmacokinetic-driven approach. © 2017, The Authors. The Journal of Clinical Pharmacology published by Wiley Periodicals, Inc. on behalf of American College of Clinical Pharmacology.

  13. The Impact of Data-Based Science Instruction on Standardized Test Performance

    NASA Astrophysics Data System (ADS)

    Herrington, Tia W.

Increased teacher accountability efforts have resulted in the use of data to improve student achievement. This study addressed teachers' inconsistent use of data-driven instruction in middle school science. Evidence of the impact of data-based instruction on student achievement and school and district practices has been well documented by researchers. In science, less information has been available on teachers' use of data for classroom instruction. Drawing on data-driven decision making theory, the purpose of this study was to examine whether data-based instruction impacted performance on the science Criterion Referenced Competency Test (CRCT) and to explore the factors that impeded its use by a purposeful sample of 12 science teachers at a data-driven school. The research questions addressed in this study included understanding: (a) the association between student performance on the science portion of the CRCT and data-driven instruction professional development, (b) middle school science teachers' perception of the usefulness of data, and (c) the factors that hindered the use of data for science instruction. This study employed a mixed methods sequential explanatory design. Data collected included 8th grade CRCT data, survey responses, and individual teacher interviews. A chi-square test revealed no improvement in CRCT scores following the implementation of professional development on data-driven instruction (χ²(1) = 0.183, p = .67). Results from surveys and interviews revealed that teachers used data to inform their instruction but identified lack of time as the major hindrance to its use. Implications for social change include the development of lesson plans that will empower science teachers to deliver data-based instruction and students to achieve identified academic goals.
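The chi-square comparison reported above can be illustrated on a made-up 2×2 pass/fail table; the counts below are invented for illustration and do not reproduce the study's data:

```python
# Hand-rolled chi-square statistic of independence for a 2x2 table
# (pre/post professional development x pass/fail). Counts are invented.

def chi_square_2x2(table):
    """Return the chi-square statistic (1 df) for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n
        stat += (obs - exp) ** 2 / exp
    return stat

# 60/40 pass rate before vs. 63/37 after: a small, non-significant shift.
table = [[60, 40], [63, 37]]
print(round(chi_square_2x2(table), 3))  # → 0.19
```

A statistic this small on 1 degree of freedom corresponds to a large p-value, i.e. no detectable improvement, mirroring the study's conclusion.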

  14. Object-Driven and Temporal Action Rules Mining

    ERIC Educational Resources Information Center

    Hajja, Ayman

    2013-01-01

    In this thesis, I present my complete research work in the field of action rules, more precisely object-driven and temporal action rules. The drive behind the introduction of object-driven and temporally based action rules is to bring forth an adapted approach to extract action rules from a subclass of systems that have a specific nature, in which…

  15. The ACGT Master Ontology and its applications – Towards an ontology-driven cancer research and management system

    PubMed Central

    Brochhausen, Mathias; Spear, Andrew D.; Cocos, Cristian; Weiler, Gabriele; Martín, Luis; Anguita, Alberto; Stenzhorn, Holger; Daskalaki, Evangelia; Schera, Fatima; Schwarz, Ulf; Sfakianakis, Stelios; Kiefer, Stephan; Dörr, Martin; Graf, Norbert; Tsiknakis, Manolis

    2017-01-01

    Objective This paper introduces the objectives, methods and results of ontology development in the EU co-funded project Advancing Clinico-genomic Trials on Cancer – Open Grid Services for Improving Medical Knowledge Discovery (ACGT). While the available data in the life sciences has recently grown both in amount and quality, the full exploitation of it is being hindered by the use of different underlying technologies, coding systems, category schemes and reporting methods on the part of different research groups. The goal of the ACGT project is to contribute to the resolution of these problems by developing an ontology-driven, semantic grid services infrastructure that will enable efficient execution of discovery-driven scientific workflows in the context of multi-centric, post-genomic clinical trials. The focus of the present paper is the ACGT Master Ontology (MO). Methods ACGT project researchers undertook a systematic review of existing domain and upper-level ontologies, as well as of existing ontology design software, implementation methods, and end-user interfaces. This included the careful study of best practices, design principles and evaluation methods for ontology design, maintenance, implementation, and versioning, as well as for use on the part of domain experts and clinicians. 
Results To date, the results of the ACGT project include (i) the development of a master ontology (the ACGT-MO) based on clearly defined principles of ontology development and evaluation; (ii) the development of a technical infrastructure (the ACGT Platform) that implements the ACGT-MO utilizing independent tools, components and resources that have been developed based on open architectural standards, and which includes an application for updating and evolving the ontology efficiently in response to end-user needs; and (iii) the development of an Ontology-based Trial Management Application (ObTiMA) that integrates the ACGT-MO into the design process of clinical trials in order to guarantee automatic semantic integration without the need to perform a separate mapping process. PMID:20438862

  16. The evolution of meaning: spatio-temporal dynamics of visual object recognition.

    PubMed

    Clarke, Alex; Taylor, Kirsten I; Tyler, Lorraine K

    2011-08-01

    Research on the spatio-temporal dynamics of visual object recognition suggests a recurrent, interactive model whereby an initial feedforward sweep through the ventral stream to prefrontal cortex is followed by recurrent interactions. However, critical questions remain regarding the factors that mediate the degree of recurrent interactions necessary for meaningful object recognition. The novel prediction we test here is that recurrent interactivity is driven by increasing semantic integration demands as defined by the complexity of semantic information required by the task and driven by the stimuli. To test this prediction, we recorded magnetoencephalography data while participants named living and nonliving objects during two naming tasks. We found that the spatio-temporal dynamics of neural activity were modulated by the level of semantic integration required. Specifically, source reconstructed time courses and phase synchronization measures showed increased recurrent interactions as a function of semantic integration demands. These findings demonstrate that the cortical dynamics of object processing are modulated by the complexity of semantic information required from the visual input.

  17. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  18. Experimental studies of characteristic combustion-driven flows for CFD validation

    NASA Technical Reports Server (NTRS)

    Santoro, R. J.; Moser, M.; Anderson, W.; Pal, S.; Ryan, H.; Merkle, C. L.

    1992-01-01

    A series of rocket-related studies intended to develop a suitable data base for validation of Computational Fluid Dynamics (CFD) models of characteristic combustion-driven flows was undertaken at the Propulsion Engineering Research Center at Penn State. Included are studies of coaxial and impinging jet injectors as well as chamber wall heat transfer effects. The objective of these studies is to provide fundamental understanding and benchmark quality data for phenomena important to rocket combustion under well-characterized conditions. Diagnostic techniques utilized in these studies emphasize determinations of velocity, temperature, spray and droplet characteristics, and combustion zone distribution. Since laser diagnostic approaches are favored, the development of an optically accessible rocket chamber has been a high priority in the initial phase of the project. During the design phase for this chamber, the advice and input of the CFD modeling community were actively sought through presentations and written surveys. Based on this procedure, a suitable uni-element rocket chamber was fabricated and is presently under preliminary testing. Results of these tests, as well as the survey findings leading to the chamber design, were presented.

  19. Keys to success for data-driven decision making: Lessons from participatory monitoring and collaborative adaptive management

    USDA-ARS?s Scientific Manuscript database

    Recent years have witnessed a call for evidence-based decisions in conservation and natural resource management, including data-driven decision-making. Adaptive management (AM) is one prevalent model for integrating scientific data into decision-making, yet AM has faced numerous challenges and limit...

  20. Challenges in Biomarker Discovery: Combining Expert Insights with Statistical Analysis of Complex Omics Data

    PubMed Central

    McDermott, Jason E.; Wang, Jing; Mitchell, Hugh; Webb-Robertson, Bobbie-Jo; Hafen, Ryan; Ramey, John; Rodland, Karin D.

    2012-01-01

    Introduction The advent of high throughput technologies capable of comprehensive analysis of genes, transcripts, proteins and other significant biological molecules has provided an unprecedented opportunity for the identification of molecular markers of disease processes. However, it has simultaneously complicated the problem of extracting meaningful molecular signatures of biological processes from these complex datasets. The process of biomarker discovery and characterization provides opportunities for more sophisticated approaches to integrating purely statistical and expert knowledge-based approaches. Areas covered In this review we will present examples of current practices for biomarker discovery from complex omic datasets and the challenges that have been encountered in deriving valid and useful signatures of disease. We will then present a high-level review of data-driven (statistical) and knowledge-based methods applied to biomarker discovery, highlighting some current efforts to combine the two distinct approaches. Expert opinion Effective, reproducible and objective tools for combining data-driven and knowledge-based approaches to identify predictive signatures of disease are key to future success in the biomarker field. We will describe our recommendations for possible approaches to this problem including metrics for the evaluation of biomarkers. PMID:23335946

  1. Challenges in Biomarker Discovery: Combining Expert Insights with Statistical Analysis of Complex Omics Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Wang, Jing; Mitchell, Hugh D.

    2013-01-01

The advent of high throughput technologies capable of comprehensive analysis of genes, transcripts, proteins and other significant biological molecules has provided an unprecedented opportunity for the identification of molecular markers of disease processes. However, it has simultaneously complicated the problem of extracting meaningful signatures of biological processes from these complex datasets. The process of biomarker discovery and characterization provides opportunities both for purely statistical and expert knowledge-based approaches and would benefit from improved integration of the two. Areas covered In this review we will present examples of current practices for biomarker discovery from complex omic datasets and the challenges that have been encountered. We will then present a high-level review of data-driven (statistical) and knowledge-based methods applied to biomarker discovery, highlighting some current efforts to combine the two distinct approaches. Expert opinion Effective, reproducible and objective tools for combining data-driven and knowledge-based approaches to biomarker discovery and characterization are key to future success in the biomarker field. We will describe our recommendations of possible approaches to this problem including metrics for the evaluation of biomarkers.

  2. Revisiting Statistical Aspects of Nuclear Material Accounting

    DOE PAGES

    Burr, T.; Hamada, M. S.

    2013-01-01

Nuclear material accounting (NMA) is the only safeguards system whose benefits are routinely quantified. Process monitoring (PM) is another safeguards system that is increasingly used, and one challenge is how to quantify its benefit. This paper considers PM in the role of enabling frequent NMA, which is referred to as near-real-time accounting (NRTA). We quantify NRTA benefits using period-driven and data-driven testing. Period-driven testing makes a decision to alarm or not at fixed periods. Data-driven testing decides as the data arrive whether to alarm or continue testing. The difference between the period-driven and data-driven viewpoints is illustrated by using one-year and two-year periods. For both one-year and two-year periods, period-driven NMA using once-per-year cumulative material unaccounted for (CUMUF) testing is compared to more frequent Shewhart and joint sequential cusum testing using either MUF or standardized, independently transformed MUF (SITMUF) data. We show that the data-driven viewpoint is appropriate for NRTA and that it can be used to compare safeguards effectiveness. In addition to providing period-driven and data-driven viewpoints, new features include assessing the impact of uncertainty in the estimated covariance matrix of the MUF sequence and the impact of both random and systematic measurement errors.
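The data-driven testing described above — deciding as each new balance arrives whether to alarm or keep observing — can be sketched with a one-sided tabular cusum on a standardized MUF-like sequence. The reference value k, threshold h, and data below are invented for illustration, not safeguards parameters:

```python
# One-sided upper cusum on a standardized loss sequence (SITMUF-like):
# alarm as soon as the cumulative statistic crosses threshold h, rather
# than waiting for a fixed accounting period. k and h are illustrative.

def cusum_alarm(data, k=0.5, h=4.0):
    """Return the 1-based index of the first alarm, or None if none fires."""
    s = 0.0
    for i, x in enumerate(data, start=1):
        s = max(0.0, s + x - k)
        if s > h:
            return i
    return None

in_control = [0.2, -0.4, 0.1, -0.1, 0.3, -0.2, 0.0, 0.1]
shifted = in_control + [1.5, 1.8, 1.6, 1.7, 1.9]  # sustained loss appears
print(cusum_alarm(in_control))  # no alarm on in-control data
print(cusum_alarm(shifted))     # alarms a few balances after the shift
```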

  3. Covariate selection with iterative principal component analysis for predicting physical

    USDA-ARS?s Scientific Manuscript database

    Local and regional soil data can be improved by coupling new digital soil mapping techniques with high resolution remote sensing products to quantify both spatial and absolute variation of soil properties. The objective of this research was to advance data-driven digital soil mapping techniques for ...

  4. Protein-Protein Interface Predictions by Data-Driven Methods: A Review

    PubMed Central

    Xue, Li C; Dobbs, Drena; Bonvin, Alexandre M.J.J.; Honavar, Vasant

    2015-01-01

    Reliably pinpointing which specific amino acid residues form the interface(s) between a protein and its binding partner(s) is critical for understanding the structural and physicochemical determinants of protein recognition and binding affinity, and has wide applications in modeling and validating protein interactions predicted by high-throughput methods, in engineering proteins, and in prioritizing drug targets. Here, we review the basic concepts, principles and recent advances in computational approaches to the analysis and prediction of protein-protein interfaces. We point out caveats for objectively evaluating interface predictors, and discuss various applications of data-driven interface predictors for improving energy model-driven protein-protein docking. Finally, we stress the importance of exploiting binding partner information in reliably predicting interfaces and highlight recent advances in this emerging direction. PMID:26460190

  5. Longitudinal aerodynamic characteristics of light, twin-engine, propeller-driven airplanes

    NASA Technical Reports Server (NTRS)

    Wolowicz, C. H.; Yancey, R. B.

    1972-01-01

    Representative state-of-the-art analytical procedures and design data for predicting the longitudinal static and dynamic stability and control characteristics of light, propeller-driven airplanes are presented. Procedures for predicting drag characteristics are also included. The procedures are applied to a twin-engine, propeller-driven airplane in the clean configuration from zero lift to stall conditions. The calculated characteristics are compared with wind-tunnel and flight data. Included in the comparisons are level-flight trim characteristics, period and damping of the short-period oscillatory mode, and windup-turn characteristics. All calculations are documented.

  6. Creating a driving profile for older adults using GPS devices and naturalistic driving methodology

    PubMed Central

    Babulal, Ganesh M.; Traub, Cindy M.; Webb, Mollie; Stout, Sarah H.; Addison, Aaron; Carr, David B.; Ott, Brian R.; Morris, John C.; Roe, Catherine M.

    2016-01-01

Background/Objectives: Road tests and driving simulators are most commonly used in research studies and clinical evaluations of older drivers. Our objective was to describe the process and associated challenges in adapting an existing, commercial, off-the-shelf (COTS) in-vehicle device for naturalistic, longitudinal research to better understand daily driving behavior in older drivers. Design: The Azuga G2 Tracking Device™ was installed in each participant's vehicle, and we collected data (speed, latitude/longitude) every 30 seconds over 5 months whenever the vehicle was driven. Setting: The Knight Alzheimer's Disease Research Center at Washington University School of Medicine. Participants: Five individuals enrolled in a larger, longitudinal study assessing preclinical Alzheimer disease and driving performance. Participants were aged 65+ years and had normal cognition. Measurements: Spatial components included Primary Location(s), Driving Areas, Mean Centers and Unique Destinations. Temporal components included the number of trips taken during different times of the day. Behavioral components included the number of hard-braking, speeding and sudden-acceleration events. Methods: Individual 30-second observations, each comprising one breadcrumb, and trip-level data were collected and analyzed in R and ArcGIS. Results: Primary locations were confirmed to be 100% accurate when compared to known addresses. Based on the locations of the breadcrumbs, we were able to successfully identify frequently visited locations and general travel patterns. Based on the reported time from the breadcrumbs, we could assess the number of trips driven in daylight vs. night. Data on additional events while driving allowed us to compute the number of adverse driving alerts over the course of the 5-month period. Conclusions: Compared to the cameras and highly instrumented vehicles of other naturalistic studies, the compact COTS device was quickly installed and transmitted high volumes of data.
Driving Profiles for older adults can be created and compared month-to-month or year-to-year, allowing researchers to identify changes in driving patterns that are unavailable in controlled conditions. PMID:27990264
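Two of the spatial and temporal components named in the abstract above (mean center of fixes, counts of night-time driving) can be sketched directly from breadcrumb records. The records and the 20:00–06:00 night window below are invented assumptions, not the study's definitions:

```python
# Toy analysis of GPS "breadcrumbs": mean center of all fixes and a count
# of night-time fixes. Records are invented; the study used 30-second
# fixes from a commercial in-vehicle device.
from datetime import datetime

breadcrumbs = [
    # (timestamp, latitude, longitude)
    (datetime(2016, 3, 1, 9, 15), 38.6488, -90.3108),
    (datetime(2016, 3, 1, 9, 16), 38.6510, -90.3050),
    (datetime(2016, 3, 1, 21, 40), 38.6400, -90.3200),
]

def mean_center(points):
    """Unweighted mean of latitudes and longitudes (fine over short distances)."""
    lats = [lat for _, lat, _ in points]
    lons = [lon for _, _, lon in points]
    return sum(lats) / len(lats), sum(lons) / len(lons)

def night_fixes(points, start_hour=20, end_hour=6):
    """Count fixes recorded between start_hour and end_hour (local time)."""
    return sum(1 for t, _, _ in points
               if t.hour >= start_hour or t.hour < end_hour)

print(mean_center(breadcrumbs))
print(night_fixes(breadcrumbs))  # one fix after 20:00
```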

  7. Control system for several rotating mirror camera synchronization operation

    NASA Astrophysics Data System (ADS)

    Liu, Ningwen; Wu, Yunfeng; Tan, Xianxiang; Lai, Guoji

    1997-05-01

This paper introduces a single-chip microcomputer control system for the synchronized operation of several rotating-mirror high-speed cameras. The system consists of four parts: the microcomputer control unit (including the synchronization part, the precise measurement part and the time delay part), the shutter control unit, the motor driving unit and the high voltage pulse generator unit. The control system has been used to control the synchronized operation of GSI cameras (driven by a motor) and FJZ-250 rotating mirror cameras (driven by a gas-driven turbine). We have obtained films of the same object from different directions at the same or different speeds.

  8. Building Data-Driven Pathways From Routinely Collected Hospital Data: A Case Study on Prostate Cancer

    PubMed Central

    Clark, Jeremy; Cooper, Colin S; Mills, Robert; Rayward-Smith, Victor J; de la Iglesia, Beatriz

    2015-01-01

    Background Routinely collected data in hospitals is complex, typically heterogeneous, and scattered across multiple Hospital Information Systems (HIS). This big data, created as a byproduct of health care activities, has the potential to provide a better understanding of diseases, unearth hidden patterns, and improve services and cost. The extent and uses of such data rely on its quality, which is not consistently checked, nor fully understood. Nevertheless, using routine data for the construction of data-driven clinical pathways, describing processes and trends, is a key topic receiving increasing attention in the literature. Traditional algorithms do not cope well with unstructured processes or data, and do not produce clinically meaningful visualizations. Supporting systems that provide additional information, context, and quality assurance inspection are needed. Objective The objective of the study is to explore how routine hospital data can be used to develop data-driven pathways that describe the journeys that patients take through care, and their potential uses in biomedical research; it proposes a framework for the construction, quality assessment, and visualization of patient pathways for clinical studies and decision support using a case study on prostate cancer. Methods Data pertaining to prostate cancer patients were extracted from a large UK hospital from eight different HIS, validated, and complemented with information from the local cancer registry. Data-driven pathways were built for each of the 1904 patients and an expert knowledge base, containing rules on the prostate cancer biomarker, was used to assess the completeness and utility of the pathways for a specific clinical study. Software components were built to provide meaningful visualizations for the constructed pathways. 
Results The proposed framework and pathway formalism enable the summarization, visualization, and querying of complex patient-centric clinical information, as well as the computation of quality indicators and dimensions. A novel graphical representation of the pathways allows the synthesis of such information. Conclusions Clinical pathways built from routinely collected hospital data can unearth information about patients and diseases that may otherwise be unavailable or overlooked in hospitals. Data-driven clinical pathways allow for heterogeneous data (ie, semistructured and unstructured data) to be collated over a unified data model and for data quality dimensions to be assessed. This work has enabled further research on prostate cancer and its biomarkers, and on the development and application of methods to mine, compare, analyze, and visualize pathways constructed from routine data. This is an important development for the reuse of big data in hospitals. PMID:26162314
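The core of the pathway construction described above is collating per-patient events from heterogeneous systems into a time-ordered journey. A minimal sketch, assuming a simplified flat event schema (the system names, event codes, and fields below are illustrative, not the paper's actual data model):

```python
from collections import defaultdict

# Hypothetical event records as they might arrive from several hospital
# information systems (HIS); all names and values are made up for illustration.
events = [
    {"patient": "P001", "date": "2010-03-02", "source": "pathology", "event": "biopsy"},
    {"patient": "P001", "date": "2010-01-15", "source": "biochem", "event": "PSA test"},
    {"patient": "P002", "date": "2011-06-01", "source": "radiology", "event": "MRI"},
    {"patient": "P001", "date": "2010-04-20", "source": "oncology", "event": "start treatment"},
]

# A data-driven pathway here is simply each patient's events ordered in time;
# ISO dates sort lexicographically, so no date parsing is needed.
pathways = defaultdict(list)
for e in events:
    pathways[e["patient"]].append(e)
for patient in pathways:
    pathways[patient].sort(key=lambda e: e["date"])

for patient, path in sorted(pathways.items()):
    print(patient, " -> ".join(e["event"] for e in path))
```

On top of such ordered pathways, rule-based completeness checks (e.g., "a treatment start should be preceded by a biomarker test") can be expressed as predicates over each patient's sequence.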

  9. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Gumaste, U.; Ronaghi, M.

    1994-01-01

    Applications of high-performance parallel computation are described for the analysis of complete jet engines, treated as a multidiscipline coupled problem. The coupled problem involves the interaction of structures with gas dynamics, heat conduction, and heat transfer in aircraft engines. The methodology issues addressed include: consistent discrete formulation of coupled problems with emphasis on coupling phenomena; the effect of partitioning strategies, augmentation, and temporal solution procedures; sensitivity of the response to problem parameters; and methods for interfacing multiscale discretizations in different single fields. The computer implementation issues addressed include: parallel treatment of coupled systems; domain decomposition and mesh partitioning strategies; data representation in object-oriented form and mapping to a hardware-driven representation; and tradeoff studies between partitioning schemes and fully coupled treatment.

  10. Data-Driven Districts.

    ERIC Educational Resources Information Center

    LaFee, Scott

    2002-01-01

    Describes the use of data-driven decision-making in four school districts: Plainfield Public Schools, Plainfield, New Jersey; Palo Alto Unified School District, Palo Alto, California; Francis Howell School District in eastern Missouri, northwest of St. Louis; and Rio Rancho Public Schools, near Albuquerque, New Mexico. Includes interviews with the…

  11. Biased Competition in Visual Processing Hierarchies: A Learning Approach Using Multiple Cues.

    PubMed

    Gepperth, Alexander R T; Rebhan, Sven; Hasler, Stephan; Fritsch, Jannik

    2011-03-01

    In this contribution, we present a large-scale hierarchical system for object detection fusing bottom-up (signal-driven) processing results with top-down (model or task-driven) attentional modulation. Specifically, we focus on the question of how the autonomous learning of invariant models can be embedded into a performing system and how such models can be used to define object-specific attentional modulation signals. Our system implements bi-directional data flow in a processing hierarchy. The bottom-up data flow proceeds from a preprocessing level to the hypothesis level where object hypotheses created by exhaustive object detection algorithms are represented in a roughly retinotopic way. A competitive selection mechanism is used to determine the most confident hypotheses, which are used on the system level to train multimodal models that link object identity to invariant hypothesis properties. The top-down data flow originates at the system level, where the trained multimodal models are used to obtain space- and feature-based attentional modulation signals, providing biases for the competitive selection process at the hypothesis level. This results in object-specific hypothesis facilitation/suppression in certain image regions which we show to be applicable to different object detection mechanisms. In order to demonstrate the benefits of this approach, we apply the system to the detection of cars in a variety of challenging traffic videos. Evaluating our approach on a publicly available dataset containing approximately 3,500 annotated video images from more than 1 h of driving, we can show strong increases in performance and generalization when compared to object detection in isolation. Furthermore, we compare our results to a late hypothesis rejection approach, showing that early coupling of top-down and bottom-up information is a favorable approach especially when processing resources are constrained.
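The coupling of bottom-up detection confidence with top-down attentional modulation can be illustrated with a toy winner-take-all step. The multiplicative combination and all numbers below are illustrative assumptions, not the paper's exact mechanism:

```python
# Toy sketch of biased competition among object hypotheses: each hypothesis
# carries a bottom-up detector confidence, and a top-down modulation signal
# (from a learned model) scales it before the competitive selection.
# Hypothesis names, values, and the multiplicative bias are assumptions.
hypotheses = {
    "car@(120,40)": {"bottom_up": 0.62, "top_down": 1.4},   # model expects a car here
    "car@(300,55)": {"bottom_up": 0.70, "top_down": 0.6},   # region suppressed
    "sign@(80,10)": {"bottom_up": 0.55, "top_down": 1.0},   # neutral bias
}

# Combine bottom-up and top-down signals, then pick the most confident hypothesis.
scored = {name: h["bottom_up"] * h["top_down"] for name, h in hypotheses.items()}
winner = max(scored, key=scored.get)
print(winner, round(scored[winner], 3))
```

Note how the hypothesis with the strongest raw detector response loses once the top-down suppression is applied, which is the qualitative effect the abstract describes.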

  12. Developing an OMERACT Core Outcome Set for Assessing Safety Components in Rheumatology Trials: The OMERACT Safety Working Group.

    PubMed

    Klokker, Louise; Tugwell, Peter; Furst, Daniel E; Devoe, Dan; Williamson, Paula; Terwee, Caroline B; Suarez-Almazor, Maria E; Strand, Vibeke; Woodworth, Thasia; Leong, Amye L; Goel, Niti; Boers, Maarten; Brooks, Peter M; Simon, Lee S; Christensen, Robin

    2017-12-01

    Failure to report harmful outcomes in clinical research can introduce bias favoring a potentially harmful intervention. While core outcome sets (COS) are available for benefits in randomized controlled trials in many rheumatic conditions, less attention has been paid to safety in such COS. The Outcome Measures in Rheumatology (OMERACT) Filter 2.0 emphasizes the importance of measuring harms. The Safety Working Group was reestablished at the OMERACT 2016 with the objective to develop a COS for assessing safety components in trials across rheumatologic conditions. The safety issue has previously been discussed at OMERACT, but without a consistent approach to ensure harms were included in COS. Our methods include (1) identifying harmful outcomes in trials of interventions studied in patients with rheumatic diseases by a systematic literature review, (2) identifying components of safety that should be measured in such trials by use of a patient-driven approach including qualitative data collection and statistical organization of data, and (3) developing a COS through consensus processes including everyone involved. Members of OMERACT including patients, clinicians, researchers, methodologists, and industry representatives reached consensus on the need to continue the efforts on developing a COS for safety in rheumatology trials. There was a general agreement about the need to identify safety-related outcomes that are meaningful to patients, framed in terms that patients consider relevant so that they will be able to make informed decisions. The OMERACT Safety Working Group will advance the work previously done within OMERACT using a new patient-driven approach.

  13. A Novel Data-Driven Approach to Preoperative Mapping of Functional Cortex Using Resting-State Functional Magnetic Resonance Imaging

    PubMed Central

    Mitchell, Timothy J.; Hacker, Carl D.; Breshears, Jonathan D.; Szrama, Nick P.; Sharma, Mohit; Bundy, David T.; Pahwa, Mrinal; Corbetta, Maurizio; Snyder, Abraham Z.; Shimony, Joshua S.

    2013-01-01

    BACKGROUND: Recent findings associated with resting-state cortical networks have provided insight into the brain's organizational structure. In addition to their neuroscientific implications, the networks identified by resting-state functional magnetic resonance imaging (rs-fMRI) may prove useful for clinical brain mapping. OBJECTIVE: To demonstrate that a data-driven approach to analyze resting-state networks (RSNs) is useful in identifying regions classically understood to be eloquent cortex as well as other functional networks. METHODS: This study included 6 patients undergoing surgical treatment for intractable epilepsy and 7 patients undergoing tumor resection. rs-fMRI data were obtained before surgery and 7 canonical RSNs were identified by an artificial neural network algorithm. Of these 7, the motor and language networks were then compared with electrocortical stimulation (ECS) as the gold standard in the epilepsy patients. The sensitivity and specificity for identifying these eloquent sites were calculated at varying thresholds, which yielded receiver-operating characteristic (ROC) curves and their associated area under the curve (AUC). RSNs were plotted in the tumor patients to observe RSN distortions in altered anatomy. RESULTS: The algorithm robustly identified all networks in all patients, including those with distorted anatomy. When all ECS-positive sites were considered for motor and language, rs-fMRI had AUCs of 0.80 and 0.64, respectively. When the ECS-positive sites were analyzed pairwise, rs-fMRI had AUCs of 0.89 and 0.76 for motor and language, respectively. CONCLUSION: A data-driven approach to rs-fMRI may be a new and efficient method for preoperative localization of numerous functional brain regions. 
ABBREVIATIONS: AUC, area under the curve; BA, Brodmann area; BOLD, blood oxygen level dependent; ECS, electrocortical stimulation; fMRI, functional magnetic resonance imaging; ICA, independent component analysis; MLP, multilayer perceptron; MP-RAGE, magnetization-prepared rapid gradient echo; ROC, receiver-operating characteristic; rs-fMRI, resting-state functional magnetic resonance imaging; RSN, resting-state network. PMID:24264234
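The ROC/AUC evaluation described above (sweeping a threshold over rs-fMRI scores against ECS-positive labels) can be sketched without any libraries. The scores and labels below are made up for illustration:

```python
# Minimal ROC AUC computation: sort sites by descending score, trace the ROC
# curve point by point, and integrate with the trapezoidal rule.
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0, 1, 0]  # 1 = ECS-positive site (hypothetical)

def roc_auc(scores, labels):
    pos = sum(labels)
    neg = len(labels) - pos
    pairs = sorted(zip(scores, labels), reverse=True)
    tp = fp = 0
    points = [(0.0, 0.0)]  # (false-positive rate, true-positive rate)
    for _, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    # Trapezoidal integration over the false-positive rate axis.
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

print(round(roc_auc(scores, labels), 3))
```

An AUC of 0.5 corresponds to chance-level localization; values approaching 1.0 (like the 0.89 reported for motor) indicate that the rs-fMRI scores rank ECS-positive sites above ECS-negative ones.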

  14. The Relationships Between Lost Work Time and Duration of Absence Spells: Proposal for a Payroll Driven Measure of Absenteeism

    PubMed Central

    Hill, James J.; Slade, Martin D.; Cantley, Linda; Vegso, Sally; Fiellin, Martha; Cullen, Mark R.

    2011-01-01

    Objective To propose that a standard measure of absenteeism (the work lost rate [WLR]) be included in future research to facilitate understanding and allow for translation of findings between scientific disciplines. Methods Hourly payroll data derived from “punch clock” reports were used to compare various measures of absenteeism used in the literature and the application of the proposed metric (N = 4000 workers). Results Unpaid hours and full absent days were highly correlated with the WLR (r = 0.896 to 0.898). The highest percentage of unpaid hours (lost work time) is captured by absence spells of 1 and 2 days' duration. Conclusion The proposed WLR metric captures 1) the range and distribution of the individual WLRs, 2) the percentage of subjects with no unpaid hours, and 3) the population WLR, and should be included whenever payroll data are used to measure absenteeism. PMID:18617841
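A sketch of the three summary quantities listed in the conclusion, assuming the WLR is defined as unpaid (lost) hours divided by scheduled hours; both that definition and the payroll record layout below are assumptions for illustration, not the paper's exact specification:

```python
# Hypothetical payroll summaries per worker over one observation period.
workers = [
    {"id": "W1", "scheduled_hours": 2000.0, "unpaid_hours": 40.0},
    {"id": "W2", "scheduled_hours": 2000.0, "unpaid_hours": 0.0},
    {"id": "W3", "scheduled_hours": 1800.0, "unpaid_hours": 90.0},
]

# 1) Individual WLRs (range and distribution).
wlrs = {w["id"]: w["unpaid_hours"] / w["scheduled_hours"] for w in workers}

# 2) Percentage of subjects with no unpaid hours.
pct_no_loss = 100 * sum(1 for w in workers if w["unpaid_hours"] == 0) / len(workers)

# 3) Population WLR: total unpaid hours over total scheduled hours.
population_wlr = (sum(w["unpaid_hours"] for w in workers)
                  / sum(w["scheduled_hours"] for w in workers))

print(wlrs, round(pct_no_loss, 1), round(population_wlr, 4))
```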

  15. Mujeres en accion: design and baseline data.

    PubMed

    Keller, Colleen; Fleury, Julie; Perez, Adriana; Belyea, Michael; Castro, Felipe G

    2011-10-01

    The majority of programs designed to promote physical activity in older Hispanic women include few innovative theory-based interventions that address culturally relevant strategies. The purpose of this report is to describe the design and baseline data for Mujeres en Accion [Women in Action for Health], a 12-month randomized controlled trial to evaluate the effectiveness of a social support physical activity intervention on regular physical activity and cardiovascular health outcomes among midlife and older Hispanic women. This study tests an innovative intervention and includes a theory-driven approach to intervention, exploration of social support as a theoretical mediating variable, use of a Promotora model and a Community Advisory group to incorporate cultural and social approaches and resources, and use of objective measures of physical activity in Hispanic women.

  16. Mujeres en Accion: Design and Baseline Data

    PubMed Central

    Fleury, Julie; Perez, Adriana; Belyea, Michael; Castro, Felipe G.

    2015-01-01

    The majority of programs designed to promote physical activity in older Hispanic women include few innovative theory-based interventions that address culturally relevant strategies. The purpose of this report is to describe the design and baseline data for Mujeres en Accion [Women in Action for Health], a 12-month randomized controlled trial to evaluate the effectiveness of a social support physical activity intervention on regular physical activity and cardiovascular health outcomes among midlife and older Hispanic women. This study tests an innovative intervention and includes a theory-driven approach to intervention, exploration of social support as a theoretical mediating variable, use of a Promotora model and a Community Advisory group to incorporate cultural and social approaches and resources, and use of objective measures of physical activity in Hispanic women. PMID:21298400

  17. DIGGING DEEPER INTO DEEP DATA: MOLECULAR DOCKING AS A HYPOTHESIS-DRIVEN BIOPHYSICAL INTERROGATION SYSTEM IN COMPUTATIONAL TOXICOLOGY.

    EPA Science Inventory

    Developing and evaluating predictive strategies to elucidate the mode of biological activity of environmental chemicals is a major objective of the concerted efforts of the US-EPA's computational toxicology program.

  18. A Perfect Time for Data Use: Using Data-Driven Decision Making to Inform Practice

    ERIC Educational Resources Information Center

    Mandinach, Ellen B.

    2012-01-01

    Data-driven decision making has become an essential component of educational practice across all levels, from chief state school officers to classroom teachers, and has received unprecedented attention in terms of policy and financial support. It was included as one of the four pillars in the American Recovery and Reinvestment Act (2009),…

  19. Advanced software development workstation. Comparison of two object-oriented development methodologies

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    This report is an attempt to clarify some of the concerns raised about the OMT method, specifically that OMT is weaker than the Booch method in a few key areas. This interim report specifically addresses the following issues: (1) is OMT object-oriented or only data-driven?; (2) can OMT be used as a front-end to implementation in C++?; (3) the inheritance concept in OMT is in contradiction with the 'pure and real' inheritance concept found in object-oriented (OO) design; (4) low support for software life-cycle issues, for project and risk management; (5) uselessness of functional modeling for the ROSE project; and (6) problems with event-driven and simulation systems. The conclusion of this report is that both Booch's method and Rumbaugh's method are good OO methods, each with strengths and weaknesses in different areas of the development process.

  20. Nanoporous carbon actuator and methods of use thereof

    DOEpatents

    Biener, Juergen [San Leandro, CA; Baumann, Theodore F [Discovery Bay, CA; Shao, Lihua [Karlsruhe, DE; Weissmueller, Joerg [Stutensee, DE

    2012-07-31

    An electrochemically driveable actuator according to one embodiment includes a nanoporous carbon aerogel composition capable of exhibiting charge-induced reversible strain when wetted by an electrolyte and a voltage is applied thereto. An electrochemically driven actuator according to another embodiment includes a nanoporous carbon aerogel composition wetted by an electrolyte; and a mechanism for causing charge-induced reversible strain of the composition. A method for electrochemically actuating an object according to one embodiment includes causing charge-induced reversible strain of a nanoporous carbon aerogel composition wetted with an electrolyte to actuate the object by the strain.

  1. The Business Education Curriculum

    ERIC Educational Resources Information Center

    Rader, Martha; Meggison, Peter

    2007-01-01

    The business education curriculum encompasses the educational experiences of business students at all levels. Business education curricula include a variety of programs, courses, units, course objectives, student competencies, assessments, and extracurricular activities that have evolved over the years. Curricula are driven by numerous internal…

  2. Simulation of Physical Experiments in Immersive Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Wasfy, Tamer M.

    2001-01-01

    An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.

  3. Application of Data-Driven Evidential Belief Functions to Prospectivity Mapping for Aquamarine-Bearing Pegmatites, Lundazi District, Zambia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carranza, E. J. M., E-mail: carranza@itc.nl; Woldai, T.; Chikambwe, E. M.

    A case application of data-driven estimation of evidential belief functions (EBFs) is demonstrated for prospectivity mapping in Lundazi district (eastern Zambia). Spatial data used to represent recognition criteria of prospectivity for aquamarine-bearing pegmatites include mapped granites, mapped faults/fractures, mapped shear zones, and radioelement concentration ratios derived from gridded airborne radiometric data. Data-driven estimates of EBFs take into account not only (a) the spatial association between an evidential map layer and target deposits but also (b) the spatial relationships between classes of evidence in an evidential map layer. Data-driven estimates of EBFs can indicate which spatial data provide positive or negative evidence of prospectivity. Data-driven estimates of EBFs of only the spatial data providing positive evidence of prospectivity were integrated according to Dempster's rule of combination. The map of integrated degrees of belief was used to delineate zones of relative degrees of prospectivity for aquamarine-bearing pegmatites. The predictive map has at least an 85% prediction rate and at least a 79% success rate in delineating training and validation deposits, respectively. The results illustrate the usefulness of data-driven estimation of EBFs in GIS-based predictive mapping of mineral prospectivity. The results also show the usefulness of EBFs in managing uncertainties associated with evidential maps.
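Dempster's rule of combination, used above to integrate the evidential layers, can be sketched for a two-element frame of discernment. The mass values below are illustrative, not the paper's estimated EBFs:

```python
# Minimal Dempster's rule of combination for two mass functions over the
# frame {prospective, non-prospective}; mass on the full frame Theta
# represents uncertainty. All mass values are illustrative assumptions.
def dempster_combine(m1, m2):
    # Focal elements are frozensets; their intersections drive the combination.
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to the empty set
    # Normalize by the non-conflicting mass (1 - K).
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

P = frozenset(["prospective"])
N = frozenset(["non-prospective"])
Theta = P | N

m_granite = {P: 0.6, Theta: 0.4}          # hypothetical evidence from mapped granites
m_radiom = {P: 0.5, N: 0.2, Theta: 0.3}   # hypothetical evidence from radiometric ratios

m = dempster_combine(m_granite, m_radiom)
print({tuple(sorted(s)): round(v, 3) for s, v in m.items()})
```

Repeating the combination across all positive evidential layers yields the integrated degrees of belief that were mapped in the study.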

  4. Double-driven shield capacitive type proximity sensor

    NASA Technical Reports Server (NTRS)

    Vranish, John M. (Inventor)

    1993-01-01

    A capacity type proximity sensor comprised of a capacitance type sensor, a capacitance type reference, and two independent and mutually opposing driven shields respectively adjacent to the sensor and reference and which are coupled in an electrical bridge circuit configuration and driven by a single frequency crystal controlled oscillator is presented. The bridge circuit additionally includes a pair of fixed electrical impedance elements which form adjacent arms of the bridge and which comprise either a pair of precision resistances or capacitors. Detection of bridge unbalance provides an indication of the mutual proximity between an object and the sensor. Drift compensation is also utilized to improve performance and thus increase sensor range and sensitivity.

  5. How much are Chevrolet Volts in The EV Project driven in EV Mode?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smart, John

    2013-08-01

    This report summarizes key conclusions from analysis of data collected from Chevrolet Volts participating in The EV Project. Topics include how many miles are driven in EV mode, how far vehicles are driven between charging events, and how much energy is charged from the electric grid per charging event.

  6. Teaching Assistant Professional Development in Biology: Designed for and Driven by Multidimensional Data

    PubMed Central

    Long, Tammy M.; Ebert-May, Diane

    2014-01-01

    Graduate teaching assistants (TAs) are increasingly responsible for instruction in undergraduate science, technology, engineering, and mathematics (STEM) courses. Various professional development (PD) programs have been developed and implemented to prepare TAs for this role, but data about effectiveness are lacking and are derived almost exclusively from self-reported surveys. In this study, we describe the design of a reformed PD (RPD) model and apply Kirkpatrick's Evaluation Framework to evaluate multiple outcomes of TA PD before, during, and after implementing RPD. This framework allows evaluation that includes both direct measures and self-reported data. In RPD, TAs created and aligned learning objectives and assessments and incorporated more learner-centered instructional practices in their teaching. However, these data are inconsistent with TAs’ self-reported perceptions about RPD and suggest that single measures are insufficient to evaluate TA PD programs. PMID:26086654

  7. Size matters: large objects capture attention in visual search.

    PubMed

    Proulx, Michael J

    2010-12-23

    Can objects or events ever capture one's attention in a purely stimulus-driven manner? A recent review of the literature set out the criteria required to find stimulus-driven attentional capture independent of goal-directed influences, and concluded that no published study has satisfied that criteria. Here visual search experiments assessed whether an irrelevantly large object can capture attention. Capture of attention by this static visual feature was found. The results suggest that a large object can indeed capture attention in a stimulus-driven manner and independent of displaywide features of the task that might encourage a goal-directed bias for large items. It is concluded that these results are either consistent with the stimulus-driven criteria published previously or alternatively consistent with a flexible, goal-directed mechanism of saliency detection.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Overview of NREL's work in Alaska. NREL provides objective, data-driven support to aid decision-makers in Alaska as they deploy advanced energy technologies and reduce energy burdens across the nation's largest state. NREL's technical assistance, research, and outreach activities are providing the catalyst for transforming the way Alaska uses energy.

  9. Making Data-Driven Decisions: Silent Reading

    ERIC Educational Resources Information Center

    Trudel, Heidi

    2007-01-01

    Due in part to conflicting opinions and research results, the practice of sustained silent reading (SSR) in schools has been questioned. After a frustrating experience with SSR, the author of this article began a data-driven decision-making process to gain new insights on how to structure silent reading in a classroom, including a comparison…

  10. Automatic labeling and characterization of objects using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA-supported scientific databases are usually developed, managed, and populated in a tedious, error-prone, and self-limiting way in terms of what can be described in a relational Database Management System (DBMS). The next generation of Earth remote sensing platforms, i.e., the Earth Observation System (EOS), will be capable of generating data at a rate of over 300 Mb per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog, and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data are then dynamically allocated to an object-oriented database where they can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven, knowledge-based scientific information system.

  11. The utilization of neural nets in populating an object-oriented database

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA-supported scientific databases are usually developed, managed, and populated in a tedious, error-prone, and self-limiting way in terms of what can be described in a relational Database Management System (DBMS). The next generation of Earth remote sensing platforms (i.e., the Earth Observation System (EOS)) will be capable of generating data at a rate of over 300 Mb per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog, and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data are then dynamically allocated to an object-oriented database where they can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven, knowledge-based scientific information system.

  12. Moving Towards a Science-Driven Workbench for Earth Science Solutions

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Djorgovski, S. G.; Law, E.; Yang, C. P.; Keiser, K.

    2017-12-01

    The NSF-funded EarthCube Integration and Test Environment (ECITE) prototype was proposed as a 2015 Integrated Activities project and resulted in the prototyping of an EarthCube federated cloud environment and the Integration and Testing Framework. The ECITE team has worked with EarthCube science and technology governance committees to define the types of integration, testing, and evaluation necessary to achieve and demonstrate interoperability and functionality that benefit and support the objectives of the EarthCube cyber-infrastructure. The scope of ECITE also reaches beyond NSF and EarthCube to the broader Earth science community, such as the Earth Science Information Partners (ESIP), to incorporate lessons learned from other testbed activities and ultimately provide broader community benefits. This presentation will discuss evolving ECITE ideas for a science-driven workbench that starts with documented science use cases; maps the use cases to solution scenarios identifying the available technology and data resources that match each use case; generates solution workflows and test plans; tests and evaluates the solutions in a cloud environment; and finally documents identified technology and data gaps to help drive the development of additional EarthCube resources.

  13. The Application of Jason-1 Measurements to Estimate the Global Near Surface Ocean Circulation for Climate Research

    NASA Technical Reports Server (NTRS)

    Niiler, Peran P.

    2004-01-01

    The scientific objective of this research program was to utilize drifter data, Jason-1 altimeter data, and a variety of wind data to determine the time-mean and time-variable wind-driven surface currents of the global ocean. Accomplishing this task required the interpolation of 6-hourly winds onto drifter tracks and the computation of the wind-coherent motions of the drifters. These calculations showed that the Ekman current model proposed by Ralph and Niiler for the tropical Pacific was valid for all the oceans south of 40N latitude. Improvements to the RN99 model were computed, and poster presentations of the results were given in several ocean science venues, including the November 2004 GODAE meeting in St. Petersburg, FL.

  14. Perceptual processing during trauma, priming and the development of intrusive memories

    PubMed Central

    Sündermann, Oliver; Hauschildt, Marit; Ehlers, Anke

    2013-01-01

    Background Intrusive reexperiencing in posttraumatic stress disorder (PTSD) is commonly triggered by stimuli with perceptual similarity to those present during the trauma. Information processing theories suggest that perceptual processing during the trauma and enhanced perceptual priming contribute to the easy triggering of intrusive memories by these cues. Methods Healthy volunteers (N = 51) watched neutral and trauma picture stories on a computer screen. Neutral objects that were unrelated to the content of the stories briefly appeared in the interval between the pictures. Dissociation and data-driven processing (as indicators of perceptual processing) and state anxiety during the stories were assessed with self-report questionnaires. After filler tasks, participants completed a blurred object identification task to assess priming and a recognition memory task. Intrusive memories were assessed with telephone interviews 2 weeks and 3 months later. Results Neutral objects were more strongly primed if they occurred in the context of trauma stories than if they occurred during neutral stories, although the effect size was only moderate (ηp² = .08) and only significant when trauma stories were presented first. Regardless of story order, enhanced perceptual priming predicted intrusive memories at 2-week follow-up (N = 51), but not at 3 months (n = 40). Data-driven processing, dissociation and anxiety increases during the trauma stories also predicted intrusive memories. Enhanced perceptual priming and data-driven processing were associated with lower verbal intelligence. Limitations It is unclear to what extent these findings generalize to real-life traumatic events and whether they are specific to negative emotional events. Conclusions The results provide some support for the role of perceptual processing and perceptual priming in reexperiencing symptoms. PMID:23207970

  15. A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty

    USGS Publications Warehouse

    Friedel, Michael J.

    2011-01-01

This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors for the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
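
    The multi-component objective described in this abstract can be sketched in a few lines of Python. The weighting scheme, the toy data, and the `unit_error` counts below are illustrative assumptions, not the paper's actual fitness function:

```python
import math

def rmse(predicted, observed):
    """Root-mean-squared error between model predictions and observations."""
    n = len(observed)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

def multi_component_fitness(predicted, observed, unit_error, weight=1.0):
    """Combine prediction error with a dimensional-consistency penalty.

    `unit_error` counts how far a candidate equation is from being
    dimensionally consistent (0 means consistent); `weight` trades off
    the two components. Both the form and the weight are illustrative.
    """
    return rmse(predicted, observed) + weight * unit_error

# Rank two hypothetical candidate equations on the same data.
observed = [1.0, 2.0, 4.0]
candidate_a = [1.1, 2.1, 3.8]   # accurate and dimensionally consistent
candidate_b = [1.0, 2.0, 4.0]   # exact fit but unit-inconsistent
fit_a = multi_component_fitness(candidate_a, observed, unit_error=0)
fit_b = multi_component_fitness(candidate_b, observed, unit_error=2)
```

    A genetic-programming run would evaluate a fitness of this shape for every candidate equation and retain the fittest; here candidate A wins (lower fitness) despite its imperfect fit, because candidate B is dimensionally inconsistent.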

  16. Rethinking Canada's Higher Education Policy.

    ERIC Educational Resources Information Center

    Polster, Claire

    2002-01-01

    Rather than a public good that is freely shared, Canadian university research is increasingly privatized and commercialized and thus rendered accessible only to those who can pay for it. The effects include erosion of collegiality, institutional democracy, curiosity-driven basic research, objectivity, and consideration of disadvantaged groups. All…

  17. Aerodynamic Optimization of Rocket Control Surface Geometry Using Cartesian Methods and CAD Geometry

    NASA Technical Reports Server (NTRS)

    Nelson, Andrea; Aftosmis, Michael J.; Nemec, Marian; Pulliam, Thomas H.

    2004-01-01

Aerodynamic design is an iterative process involving geometry manipulation and complex computational analysis subject to physical constraints and aerodynamic objectives. A design cycle consists of first establishing the performance of a baseline design, which is usually created with low-fidelity engineering tools, and then progressively optimizing the design to maximize its performance. Optimization techniques have evolved from relying exclusively on designer intuition and insight in traditional trial-and-error methods, to sophisticated local and global search methods. Recent attempts at automating the search through a large design space with formal optimization methods include both database-driven and direct-evaluation schemes. Databases are being used in conjunction with surrogate and neural network models as a basis on which to run optimization algorithms. Optimization algorithms are also being driven by the direct evaluation of objectives and constraints using high-fidelity simulations. Surrogate methods use data points obtained from simulations, and possibly gradients evaluated at the data points, to create mathematical approximations of a database. Neural network models work in a similar fashion, using a number of high-fidelity database calculations as training iterations to create a database model. Optimal designs are obtained by coupling an optimization algorithm to the database model. Evaluation of the current best design then gives either a new local optimum or increases the fidelity of the approximation model for the next iteration. Surrogate methods have also been developed that iterate on the selection of data points to decrease the uncertainty of the approximation model prior to searching for an optimal design. The database approximation models for each of these cases, however, become computationally expensive with an increase in dimensionality. Thus the method of using optimization algorithms to search a database model becomes problematic as the number of design variables is increased.
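
    The database-driven surrogate scheme described above can be sketched as a loop that fits a cheap model to the database, optimizes the cheap model, and feeds the new high-fidelity evaluation back into the database. The one-dimensional quadratic objective and the five-iteration budget are hypothetical stand-ins for a real aerodynamic simulation:

```python
import numpy as np

def expensive_objective(x):
    """Stand-in for a high-fidelity simulation (hypothetical 1-D objective)."""
    return (x - 1.7) ** 2 + 0.5

# Seed the database with a few high-fidelity evaluations.
xs = np.array([0.0, 1.0, 3.0])
ys = expensive_objective(xs)

for _ in range(5):
    # Fit a quadratic surrogate to the database and minimize it analytically.
    c2, c1, c0 = np.polyfit(xs, ys, 2)
    x_new = -c1 / (2 * c2)          # vertex of the fitted parabola
    # Evaluate the true objective at the surrogate optimum; grow the database.
    xs = np.append(xs, x_new)
    ys = np.append(ys, expensive_objective(x_new))
```

    Because the toy objective is itself quadratic, the surrogate recovers the optimum (x = 1.7) after a single refit; a real design problem would need many more iterations, and, as the abstract notes, the surrogate fit itself becomes expensive as the number of design variables grows.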

  18. Audiologist-driven versus patient-driven fine tuning of hearing instruments.

    PubMed

    Boymans, Monique; Dreschler, Wouter A

    2012-03-01

Two methods of fine tuning the initial settings of hearing aids were compared: an audiologist-driven approach, using real-ear measurements, and a patient-driven fine-tuning approach, using feedback from real-life situations. The patient-driven fine tuning was conducted with the Amplifit® II system using audiovisual clips. The audiologist-driven fine tuning was based on the NAL-NL1 prescription rule. Both settings were compared using the same hearing aids in two 6-week trial periods following a randomized blinded cross-over design. After each trial period, the settings were evaluated by insertion-gain measurements. Performance was evaluated by speech tests in quiet, in noise, and in time-reversed speech, presented at 0° and with spatially separated sound sources. Subjective results were evaluated using extensive questionnaires and audiovisual video clips. A total of 73 participants were included. On average, higher gain values were found for the audiologist-driven settings than for the patient-driven settings, especially at 1000 and 2000 Hz. Better objective performance was obtained with the audiologist-driven settings for speech perception in quiet and in time-reversed speech. This was supported by better scores on a number of subjective judgments and in the subjective ratings of video clips. The perception of loud sounds was rated higher with the audiologist-driven settings than with the patient-driven settings, but the overall preference was in favor of the audiologist-driven settings for 67% of the participants.

  19. View subspaces for indexing and retrieval of 3D models

    NASA Astrophysics Data System (ADS)

    Dutagaci, Helin; Godil, Afzal; Sankur, Bülent; Yemez, Yücel

    2010-02-01

View-based indexing schemes for 3D object retrieval are gaining popularity since they provide good retrieval results. These schemes are coherent with the theory that humans recognize objects based on their 2D appearances. The view-based techniques also allow users to search with various queries such as binary images, range images and even 2D sketches. The previous view-based techniques use classical 2D shape descriptors such as Fourier invariants, Zernike moments, Scale Invariant Feature Transform-based local features and 2D Digital Fourier Transform coefficients. These methods describe each object independently of the others. In this work, we explore data-driven subspace models, such as Principal Component Analysis, Independent Component Analysis and Nonnegative Matrix Factorization, to describe the shape information of the views. We treat the depth images obtained from various points of the view sphere as 2D intensity images and train a subspace to extract the inherent structure of the views within a database. We also show the benefit of categorizing shapes according to their eigenvalue spread. Both the shape categorization and data-driven feature set conjectures are tested on the PSB database and compared with competitor view-based 3D shape retrieval algorithms.
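
    A minimal sketch of the data-driven subspace idea, using PCA via SVD. The 8x8 random "views" are stand-ins for depth images; a real system would render the views from the PSB models rather than draw them at random:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a database of depth images rendered from the view sphere:
# each row is one flattened 8x8 depth view (64 pixels), 50 views in total.
views = rng.normal(size=(50, 64))

# Train a PCA subspace: centre the data and keep the top-k principal axes.
mean = views.mean(axis=0)
centered = views - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 10
basis = vt[:k]                      # (k, 64) orthonormal subspace basis

def describe(view):
    """Project a (possibly unseen) view into the learned subspace."""
    return (view - mean) @ basis.T  # k-dimensional shape descriptor

# Retrieval then reduces to nearest-neighbour search among descriptors.
descriptors = centered @ basis.T    # (50, k)
query = views[7]
dists = np.linalg.norm(descriptors - describe(query), axis=1)
best = int(np.argmin(dists))
```

    The ICA and NMF variants mentioned in the abstract would replace only the basis-learning step; the projection and nearest-neighbour retrieval remain the same.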

  20. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  1. Investigating the Determinants of Adults' Participation in Higher Education

    ERIC Educational Resources Information Center

    Owusu-Agyeman, Yaw

    2016-01-01

    This study investigates the determinants of adult learners' participation in higher education in a lifelong learning environment. The author argues that the determinants of adult learners' participation in higher education include individual demands, state and institutional policy objectives and industry-driven demands rather than demographic…

  2. Space Objects Maneuvering Detection and Prediction via Inverse Reinforcement Learning

    NASA Astrophysics Data System (ADS)

    Linares, R.; Furfaro, R.

    This paper determines the behavior of Space Objects (SOs) using inverse Reinforcement Learning (RL) to estimate the reward function that each SO is using for control. The approach discussed in this work can be used to analyze maneuvering of SOs from observational data. The inverse RL problem is solved using the Feature Matching approach. This approach determines the optimal reward function that a SO is using while maneuvering by assuming that the observed trajectories are optimal with respect to the SO's own reward function. This paper uses estimated orbital elements data to determine the behavior of SOs in a data-driven fashion.
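
    The feature-matching idea can be sketched as an iteration that adjusts linear reward weights until trajectories simulated under the estimated reward reproduce the observed feature expectations. The feature values and the closed-form "simulator" below are hypothetical placeholders for a real SO control problem:

```python
import numpy as np

# Hypothetical feature expectations: means of hand-crafted trajectory
# features (e.g. fuel use, slew rate) over the observed SO maneuvers.
mu_expert = np.array([0.8, 0.1, 0.3])

def simulate_feature_expectations(w):
    """Stand-in for rolling out the optimal policy under reward r(s) = w.f(s).

    A real implementation would solve the SO's control problem for the
    current weights; here the response to w is a smooth map chosen only
    so the loop has something to converge against.
    """
    return 0.5 * mu_expert + 0.5 * np.tanh(w)

# Feature matching: adjust w until simulated expectations match the
# expert's, i.e. until the observed maneuvers look optimal under w.
w = np.zeros(3)
for _ in range(200):
    mu_agent = simulate_feature_expectations(w)
    w += 0.5 * (mu_expert - mu_agent)   # move w toward matching features
gap = np.linalg.norm(mu_expert - simulate_feature_expectations(w))
```

    At convergence the residual `gap` is near zero: the recovered weights make the observed trajectories look optimal, which is exactly the assumption the abstract states.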

  3. Inferring Binary and Trinary Stellar Populations in Photometric and Astrometric Surveys

    NASA Astrophysics Data System (ADS)

    Widmark, Axel; Leistedt, Boris; Hogg, David W.

    2018-04-01

    Multiple stellar systems are ubiquitous in the Milky Way but are often unresolved and seen as single objects in spectroscopic, photometric, and astrometric surveys. However, modeling them is essential for developing a full understanding of large surveys such as Gaia and connecting them to stellar and Galactic models. In this paper, we address this problem by jointly fitting the Gaia and Two Micron All Sky Survey photometric and astrometric data using a data-driven Bayesian hierarchical model that includes populations of binary and trinary systems. This allows us to classify observations into singles, binaries, and trinaries, in a robust and efficient manner, without resorting to external models. We are able to identify multiple systems and, in some cases, make strong predictions for the properties of their unresolved stars. We will be able to compare such predictions with Gaia Data Release 4, which will contain astrometric identification and analysis of binary systems.
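
    A heavily simplified sketch of the classification step: an unresolved equal-mass binary is 2.5·log10(2) ≈ 0.75 mag brighter than a single star, so Bayes' rule over per-class photometric likelihoods yields posterior class probabilities. The magnitudes, scatter, and priors below are illustrative, and the paper's hierarchical model is far richer than this two-class toy:

```python
import math

# A single star of this (hypothetical) type has absolute magnitude M0 = 5.0
# with Gaussian scatter 0.2 mag; an unresolved equal-mass binary is
# 2.5*log10(2) mag brighter. Priors are illustrative.
M0, sigma = 5.0, 0.2
delta_binary = 2.5 * math.log10(2.0)
priors = {"single": 0.6, "binary": 0.4}
means = {"single": M0, "binary": M0 - delta_binary}

def gauss(x, mu, s):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def classify(observed_mag):
    """Posterior probability of each class via Bayes' rule."""
    joint = {c: priors[c] * gauss(observed_mag, means[c], sigma) for c in priors}
    z = sum(joint.values())
    return {c: p / z for c, p in joint.items()}

post = classify(4.3)   # overluminous for a single star
```

    An observation well above the single-star sequence is assigned overwhelmingly to the binary class; the full model marginalizes over stellar parameters and adds a trinary component rather than fixing per-class Gaussians.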

  4. Development of a High Accuracy Angular Measurement System for Langley Research Center Hypersonic Wind Tunnel Facilities

    NASA Technical Reports Server (NTRS)

    Newman, Brett; Yu, Si-bok; Rhew, Ray D. (Technical Monitor)

    2003-01-01

Modern experimental and test activities demand innovative and adaptable procedures to maximize data content and quality while working within severely constrained budgetary and facility resource environments. This report describes development of a high-accuracy angular measurement capability for NASA Langley Research Center hypersonic wind tunnel facilities to overcome these constraints. Specifically, utilization of micro-electro-mechanical sensors including accelerometers and gyros, coupled with software-driven data acquisition hardware, integrated within a prototype measurement system, is considered. Development methodology addresses basic design requirements formulated from wind tunnel facility constraints and current operating procedures, as well as engineering and scientific test objectives. A description of the analytical framework governing relationships between time-dependent multi-axis acceleration and angular rate sensor data and the desired three-dimensional Eulerian angular state of the test model is given. Calibration procedures for identifying and estimating critical parameters in the sensor hardware are also addressed.

  5. KNMI DataLab experiences in serving data-driven innovations

    NASA Astrophysics Data System (ADS)

    Noteboom, Jan Willem; Sluiter, Raymond

    2016-04-01

Climate change research and innovations in weather forecasting rely more and more on (Big) data. Besides increasing data from traditional sources (such as observation networks, radars and satellites), the use of open data, crowd-sourced data and the Internet of Things (IoT) is emerging. To deploy these sources of data optimally in our services and products, KNMI has established a DataLab to serve data-driven innovations in collaboration with public and private sector partners. Big data management, data integration, data analytics including machine learning, and data visualization techniques are playing an important role in the DataLab. Cross-domain data-driven innovations that arise from public-private collaborative projects and research programmes can be explored, prototyped, and/or piloted in the KNMI DataLab. Furthermore, advice can be requested on (Big) data techniques and data sources. In support of collaborative (Big) data science activities, scalable environments are offered with facilities for data integration, data analysis and visualization. In addition, Data Science expertise is provided directly or from a pool of internal and external experts. At the EGU conference, the experiences gained and best practices in operating the KNMI DataLab to serve data-driven innovations for weather and climate applications are presented.

  6. The fiber walk: a model of tip-driven growth with lateral expansion.

    PubMed

    Bucksch, Alexander; Turk, Greg; Weitz, Joshua S

    2014-01-01

    Tip-driven growth processes underlie the development of many plants. To date, tip-driven growth processes have been modeled as an elongating path or series of segments, without taking into account lateral expansion during elongation. Instead, models of growth often introduce an explicit thickness by expanding the area around the completed elongated path. Modeling expansion in this way can lead to contradictions in the physical plausibility of the resulting surface and to uncertainty about how the object reached certain regions of space. Here, we introduce fiber walks as a self-avoiding random walk model for tip-driven growth processes that includes lateral expansion. In 2D, the fiber walk takes place on a square lattice and the space occupied by the fiber is modeled as a lateral contraction of the lattice. This contraction influences the possible subsequent steps of the fiber walk. The boundary of the area consumed by the contraction is derived as the dual of the lattice faces adjacent to the fiber. We show that fiber walks generate fibers that have well-defined curvatures, and thus enable the identification of the process underlying the occupancy of physical space. Hence, fiber walks provide a base from which to model both the extension and expansion of physical biological objects with finite thickness.
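
    The self-avoiding skeleton of a fiber walk can be sketched on a 2-D square lattice as follows; the lateral-contraction step that models the space consumed by the fiber's thickness, which is the paper's key addition, is deliberately omitted here:

```python
import random

def self_avoiding_walk(steps, seed=42):
    """Grow a path on the 2-D square lattice, never revisiting a site.

    Sketches only the self-avoiding random-walk core of a fiber walk;
    a full fiber walk would also contract the lattice around occupied
    sites to represent lateral expansion.
    """
    rng = random.Random(seed)
    path = [(0, 0)]
    visited = {(0, 0)}
    for _ in range(steps):
        x, y = path[-1]
        moves = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        free = [m for m in moves if m not in visited]
        if not free:        # the tip is trapped and can grow no further
            break
        nxt = rng.choice(free)
        path.append(nxt)
        visited.add(nxt)
    return path

fiber = self_avoiding_walk(100)
```

    Every site appears at most once in the returned path, and consecutive sites are lattice neighbours, so the path traces a non-self-intersecting "fiber" of unit width.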

  7. The Fiber Walk: A Model of Tip-Driven Growth with Lateral Expansion

    PubMed Central

    Bucksch, Alexander; Turk, Greg; Weitz, Joshua S.

    2014-01-01

    Tip-driven growth processes underlie the development of many plants. To date, tip-driven growth processes have been modeled as an elongating path or series of segments, without taking into account lateral expansion during elongation. Instead, models of growth often introduce an explicit thickness by expanding the area around the completed elongated path. Modeling expansion in this way can lead to contradictions in the physical plausibility of the resulting surface and to uncertainty about how the object reached certain regions of space. Here, we introduce fiber walks as a self-avoiding random walk model for tip-driven growth processes that includes lateral expansion. In 2D, the fiber walk takes place on a square lattice and the space occupied by the fiber is modeled as a lateral contraction of the lattice. This contraction influences the possible subsequent steps of the fiber walk. The boundary of the area consumed by the contraction is derived as the dual of the lattice faces adjacent to the fiber. We show that fiber walks generate fibers that have well-defined curvatures, and thus enable the identification of the process underlying the occupancy of physical space. Hence, fiber walks provide a base from which to model both the extension and expansion of physical biological objects with finite thickness. PMID:24465607

  8. A Federated Network for Translational Cancer Research Using Clinical Data and Biospecimens

    PubMed Central

    Becich, Michael J.; Bollag, Roni J.; Chavan, Girish; Corrigan, Julia; Dhir, Rajiv; Feldman, Michael D.; Gaudioso, Carmelo; Legowski, Elizabeth; Maihle, Nita J.; Mitchell, Kevin; Murphy, Monica; Sakthivel, Mayur; Tseytlin, Eugene; Weaver, JoEllen

    2015-01-01

    Advances in cancer research and personalized medicine will require significant new bridging infrastructures, including more robust biorepositories that link human tissue to clinical phenotypes and outcomes. In order to meet that challenge, four cancer centers formed the TIES Cancer Research Network, a federated network that facilitates data and biospecimen sharing among member institutions. Member sites can access pathology data that is de-identified and processed with the TIES natural language processing system, which creates a repository of rich phenotype data linked to clinical biospecimens. TIES incorporates multiple security and privacy best practices that, combined with legal agreements, network policies and procedures, enable regulatory compliance. The TIES Cancer Research Network now provides integrated access to investigators at all member institutions, where multiple investigator-driven pilot projects are underway. Examples of federated search across the network illustrate the potential impact on translational research, particularly for studies involving rare cancers, rare phenotypes, and specific biologic behaviors. The network satisfies several key desiderata including local control of data and credentialing, inclusion of rich phenotype information, and applicability to diverse research objectives. The TIES Cancer Research Network presents a model for a national data and biospecimen network. PMID:26670560

  9. Statistical Estimation of Orbital Debris Populations with a Spectrum of Object Size

    NASA Technical Reports Server (NTRS)

Xu, Y.-L.; Horstman, M.; Krisko, P. H.; Liou, J.-C.; Matney, M.; Stansbery, E. G.; Stokely, C. L.; Whitlock, D.

    2008-01-01

    Orbital debris is a real concern for the safe operations of satellites. In general, the hazard of debris impact is a function of the size and spatial distributions of the debris populations. To describe and characterize the debris environment as reliably as possible, the current NASA Orbital Debris Engineering Model (ORDEM2000) is being upgraded to a new version based on new and better quality data. The data-driven ORDEM model covers a wide range of object sizes from 10 microns to greater than 1 meter. This paper reviews the statistical process for the estimation of the debris populations in the new ORDEM upgrade, and discusses the representation of large-size (greater than or equal to 1 m and greater than or equal to 10 cm) populations by SSN catalog objects and the validation of the statistical approach. Also, it presents results for the populations with sizes of greater than or equal to 3.3 cm, greater than or equal to 1 cm, greater than or equal to 100 micrometers, and greater than or equal to 10 micrometers. The orbital debris populations used in the new version of ORDEM are inferred from data based upon appropriate reference (or benchmark) populations instead of the binning of the multi-dimensional orbital-element space. This paper describes all of the major steps used in the population-inference procedure for each size-range. Detailed discussions on data analysis, parameter definition, the correlation between parameters and data, and uncertainty assessment are included.

  10. Incorporating Target Priorities in the Sensor Tasking Reward Function

    NASA Astrophysics Data System (ADS)

    Gehly, S.; Bennett, J.

    2016-09-01

    Orbital debris tracking poses many challenges, most fundamentally the need to track a large number of objects from a limited number of sensors. The use of information theoretic sensor allocation provides a means to efficiently collect data on the multitarget system. An additional need of the community is the ability to specify target priorities, driven both by user needs and environmental factors such as collision warnings. This research develops a method to incorporate target priorities in the sensor tasking reward function, allowing for several applications in different tasking modes such as catalog maintenance, calibration, and collision monitoring. A set of numerical studies is included to demonstrate the functionality of the method.
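
    One way to sketch priority weighting in the tasking reward: scale an information-theoretic gain (here, the entropy reduction of a 1-D Gaussian state estimate) by a per-target priority. The targets, priorities, and uncertainties below are invented for illustration:

```python
import math

def information_gain(sigma_prior, sigma_post):
    """Entropy reduction (in nats) for a 1-D Gaussian state estimate."""
    return math.log(sigma_prior / sigma_post)

def tasking_reward(targets):
    """Pick the target maximizing priority-weighted information gain.

    `targets` maps a target id to (priority, prior sigma, posterior sigma
    expected if the sensor observes it). All numbers are illustrative.
    """
    scores = {tid: p * information_gain(s0, s1)
              for tid, (p, s0, s1) in targets.items()}
    return max(scores, key=scores.get), scores

targets = {
    "debris-A": (1.0, 10.0, 5.0),   # routine catalog maintenance
    "debris-B": (5.0, 4.0, 3.0),    # conjunction warning: high priority
}
best, scores = tasking_reward(targets)
```

    The conjunction-warning target wins here even though it offers less raw information gain, which is the behaviour the abstract describes: user needs and collision warnings reshape the tasking schedule without replacing the information-theoretic core.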

  11. Design and Deployment of a Pediatric Cardiac Arrest Surveillance System

    PubMed Central

    Newton, Heather Marie; McNamara, Leann; Engorn, Branden Michael; Jones, Kareen; Bernier, Meghan; Dodge, Pamela; Salamone, Cheryl; Bhalala, Utpal; Jeffers, Justin M.; Engineer, Lilly; Diener-West, Marie; Hunt, Elizabeth Anne

    2018-01-01

Objective: We aimed to increase detection of pediatric cardiopulmonary resuscitation (CPR) events and collection of physiologic and performance data for use in quality improvement (QI) efforts. Materials and Methods: We developed a workflow-driven surveillance system that leveraged organizational information technology systems to trigger CPR detection and analysis processes. We characterized detection by notification source, type, location, and year, and compared it to previous methods of detection. Results: From 1/1/2013 through 12/31/2015, there were 2,986 unique notifications associated with 2,145 events, 317 requiring CPR. PICU and PEDS-ED accounted for 65% of CPR events, whereas floor care areas were responsible for only 3% of events. 100% of PEDS-OR and >70% of PICU CPR events would not have been included in QI efforts. Performance data from both the defibrillator and the bedside monitor increased annually (2013: 1%; 2014: 18%; 2015: 27%). Discussion: After deployment of this system, detection has increased ∼9-fold and performance data collection increased annually. Had the system not been deployed, 100% of PEDS-OR and 50–70% of PICU, NICU, and PEDS-ED events would have been missed. Conclusion: By leveraging hospital information technology and medical device data, identification of pediatric cardiac arrest with an associated increased capture in the proportion of objective performance data is possible. PMID:29854451

  12. Ionized and Neutral Outflows in the QUEST QSOs

    NASA Astrophysics Data System (ADS)

    Veilleux, Sylvain

    2011-10-01

The role of galactic winds in gas-rich mergers is of crucial importance to understand galaxy and SMBH evolution. In recent months, our group has had three major scientific breakthroughs in this area: (1) The discovery with Herschel of massive molecular (OH-absorbing) outflows in several ULIRGs, including the nearest quasar, Mrk 231. (2) The independent discovery from mm-wave interferometric observations in the same object of a spatially resolved molecular (CO-emitting) wind with estimated mass outflow rate 3x larger than the star formation rate and spatially coincident with blueshifted neutral (Na ID-absorbing) gas in optical long-slit spectra. (3) The unambiguous determination from recent Gemini/IFU observations that the Na ID outflow in this object is wide-angle, thus driven by a QSO wind rather than a jet. This powerful outflow may be the long-sought "smoking gun" of quasar mechanical feedback purported to transform gas-rich mergers. However, our Herschel survey excludes all FIR-faint (UV-bright) "classic" QSOs by necessity. So here we propose a complementary FUV absorption-line survey of both FIR-bright and FIR-faint QSOs from the same parent sample. New (19 targets) and archival (11) spectra will be used to study, for the first time, the gaseous environments of QSOs as a function of host properties and age across the merger sequence ULIRG → QSO. These data will allow us to distinguish between ionized & neutral quasar-driven outflows, starburst-driven winds, and tidal debris around the mergers. They will also be uniquely suited for a shallow but broad study of the warm & warm-hot intergalactic media, complementary to ongoing surveys that are deeper but narrower.

  13. Digital Suicide Prevention: Can Technology Become a Game-changer?

    PubMed

    Vahabzadeh, Arshya; Sahin, Ned; Kalali, Amir

    2016-01-01

    Suicide continues to be a leading cause of death and has been recognized as a significant public health issue. Rapid advances in data science can provide us with useful tools for suicide prevention, and help to dynamically assess suicide risk in quantitative data-driven ways. In this article, the authors highlight the most current international research in digital suicide prevention, including the use of machine learning, smartphone applications, and wearable sensor driven systems. The authors also discuss future opportunities for digital suicide prevention, and propose a novel Sensor-driven Mental State Assessment System.

  14. Wilmington Area Planning Council, New Castle County, Delaware and Cecil County, Maryland : a performance-based approach to integrating congestion management into the metropolitan planning process

    DOT National Transportation Integrated Search

    2009-04-01

    The Wilmington Area Planning Council takes an objectives-driven, performance-based approach to its metropolitan transportation planning, including paying special attention to integrating its Congestion Management Process into its planning efforts. Th...

  15. EOS Operations Systems: EDOS Implemented Changes to Reduce Operations Costs

    NASA Technical Reports Server (NTRS)

    Cordier, Guy R.; Gomez-Rosa, Carlos; McLemore, Bruce D.

    2007-01-01

The authors describe in this paper the progress achieved to date with the reengineering of the Earth Observing System (EOS) Data and Operations System (EDOS), the experience gained in the process, and the ensuing reduction of ground systems operations costs. The reengineering effort included a major methodology change: applying a data-driven system approach to an existing schedule-driven system.

  16. Boosting Bayesian parameter inference of nonlinear stochastic differential equation models by Hamiltonian scale separation.

    PubMed

    Albert, Carlo; Ulzega, Simone; Stoop, Ruedi

    2016-04-01

    Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.
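
    The Hamiltonian Monte Carlo machinery the abstract builds on can be sketched for a one-dimensional standard-normal posterior, a stand-in for the polymer-like partition function; the step size, trajectory length, and target are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_neg_log_post(q):
    """Gradient of the negative log-posterior (standard normal stand-in)."""
    return q

def leapfrog(q, p, step, n_steps):
    """Volume-preserving integrator for the Hamiltonian dynamics."""
    p = p - 0.5 * step * grad_neg_log_post(q)
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p - step * grad_neg_log_post(q)
    q = q + step * p
    p = p - 0.5 * step * grad_neg_log_post(q)
    return q, p

def hmc(n_samples, step=0.2, n_steps=10):
    """Single-time-scale HMC; the paper adds multiple time-scale integration."""
    samples, q = [], 0.0
    for _ in range(n_samples):
        p0 = rng.normal()                       # resample momentum
        q_new, p_new = leapfrog(q, p0, step, n_steps)
        # Metropolis correction for integrator error (H = U + K).
        h_old = 0.5 * q ** 2 + 0.5 * p0 ** 2
        h_new = 0.5 * q_new ** 2 + 0.5 * p_new ** 2
        if rng.random() < np.exp(h_old - h_new):
            q = q_new
        samples.append(q)
    return np.array(samples)

draws = hmc(5000)
```

    The paper's contribution sits on top of this loop: splitting the Hamiltonian into fast (between-measurement) and slow modes so the leapfrog can take different step sizes per scale, and solving the fast harmonic part analytically in 1-D.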

  17. Developmental changes in face visual scanning in autism spectrum disorder as assessed by data-based analysis

    PubMed Central

    Amestoy, Anouck; Guillaud, Etienne; Bouvard, Manuel P.; Cazalets, Jean-René

    2015-01-01

Individuals with autism spectrum disorder (ASD) present reduced visual attention to faces. However, contradictory conclusions have been drawn about the strategies involved in visual face scanning due to the various methodologies implemented in the study of facial screening. Here, we used a data-driven approach to compare children and adults with ASD subjected to the same free-viewing task and to address developmental aspects of face scanning, including its temporal patterning, in healthy children and adults. Four groups (54 subjects) were included in the study: typical adults, typically developing children, and adults and children with ASD. Eye tracking was performed on subjects viewing unfamiliar faces. Fixations were analyzed using a data-driven approach that employed spatial statistics to provide an objective, unbiased definition of the areas of interest. Typical adults expressed a spatial and temporal strategy for visual scanning that differed from the three other groups, involving a sequential fixation of the right eye (RE), left eye (LE), and mouth. Typically developing children, as well as adults and children with ASD, exhibited similar fixation patterns, and they always started by looking at the RE. Children (typical or with ASD) subsequently looked at the LE or the mouth. Based on the present results, the patterns of fixation for static faces that mature from childhood to adulthood in typical subjects are not found in adults with ASD. In the ASD groups, fixation patterns appear to remain blocked in an immature state that cannot be differentiated from the typical developmental child patterns of fixation. PMID:26236264

  18. Model-Driven Development for PDS4 Software and Services

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Algermissen, S. S.; Cayanan, M. D.; Joyner, R. S.; Hardman, S. H.; Padams, J. H.

    2018-04-01

    PDS4 data product labels provide the information necessary for processing the referenced digital object. However, significantly more information is available in the PDS4 Information Model. This additional information is made available for use, by both software and services, to configure, promote resiliency, and improve interoperability.

  19. Integrating watershed- and farm-scale modeling framework for targeting critical source areas while maintaining farm economic viability

    USDA-ARS?s Scientific Manuscript database

    Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source (NPS) pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals...

  20. Testing protocol for predicting driven pile behavior within pre-bored soil : research project capsule.

    DOT National Transportation Integrated Search

    2014-02-01

The objective of this project is to compile the state-of-the-art and best practice results available on the subject of pre-bored piles and develop a research and instrumentation testing plan for field data collection and select multiple pile d...

  1. Implementing a Successful Faculty, Data Driven Model for Program Review.

    ERIC Educational Resources Information Center

    Beal, Suzanne; Davis, Shirley

    Frederick Community College (Maryland) utilizes both the Instructional Accountability Program Review (IAPR) and the Career Program Review (CPR) to assess program outcomes and determine progress in meeting goals and objectives. The IAPR is a comprehensive review procedure conducted by faculty and associate deans to evaluate all transfer, career,…

  2. Assessment of Soil Moisture Data Requirements by the Potential SMAP Data User Community: Review of SMAP Mission User Community

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.; Escobar, Vanessa M.

    2013-01-01

    NASA's Soil Moisture Active and Passive (SMAP) mission is planned for launch in October 2014 and will provide global measurements of soil moisture and freeze thaw state. The project is driven by both basic research and applied science goals. Understanding how application driven end-users will apply SMAP data, prior to the satellite's launch, is an important goal of NASA's applied science program and SMAP mission success. Because SMAP data are unique, there are no direct proxy data sets that can be used in research and operational studies to determine how the data will interact with existing processes. The objective of this study is to solicit data requirements, accuracy needs, and current understanding of the SMAP mission from the potential user community. This study showed that the data to be provided by the SMAP mission did substantially meet the user community needs. Although there was a broad distribution of requirements stated, the SMAP mission fit within these requirements.

  3. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems, with higher productivity and quality as the results. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders, including employees, managers, executives, stockholders, boards, suppliers, and customers. It is also driven by information about competitors and emerging technology. Much information is based on processing of data and the resulting biases of the processors. Thus, stakeholders can base inputs on faulty perceptions that are not grounded in reality. Prior to processing, the data used may be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology, and primary data collection. The reliability and validity of data, as well as the management of sources and information, is a critical element of strategy formulation. The paper explores data collection, processing, and analyses from secondary and primary sources, information generation, and report presentation for strategy formulation, and contrasts this with the data and information used to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.

  4. Reading visually embodied meaning from the brain: Visually grounded computational models decode visual-object mental imagery induced by written text.

    PubMed

    Anderson, Andrew James; Bruni, Elia; Lopopolo, Alessandro; Poesio, Massimo; Baroni, Marco

    2015-10-15

    Embodiment theory predicts that mental imagery of object words recruits neural circuits involved in object perception. The degree of visual imagery present in routine thought and how it is encoded in the brain is largely unknown. We test whether fMRI activity patterns elicited by participants reading objects' names include embodied visual-object representations, and whether we can decode the representations using novel computational image-based semantic models. We first apply the image models in conjunction with text-based semantic models to test predictions of visual-specificity of semantic representations in different brain regions. Representational similarity analysis confirms that fMRI structure within ventral-temporal and lateral-occipital regions correlates most strongly with the image models and conversely text models correlate better with posterior-parietal/lateral-temporal/inferior-frontal regions. We use an unsupervised decoding algorithm that exploits commonalities in representational similarity structure found within both image model and brain data sets to classify embodied visual representations with high accuracy (8/10) and then extend it to exploit model combinations to robustly decode different brain regions in parallel. By capturing latent visual-semantic structure our models provide a route into analyzing neural representations derived from past perceptual experience rather than stimulus-driven brain activity. Our results also verify the benefit of combining multimodal data to model human-like semantic representations.
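    The representational similarity analysis used here can be sketched in a few lines: build a representational dissimilarity matrix (RDM) for each data source, then correlate their upper triangles. This is an illustrative toy, not the authors' pipeline, and the `rdm`/`rsa` helper names are hypothetical:

    ```python
    import numpy as np

    def rdm(patterns):
        # Representational dissimilarity matrix: 1 - Pearson correlation
        # between the response patterns of every pair of conditions (rows).
        return 1.0 - np.corrcoef(patterns)

    def rsa(model_patterns, brain_patterns):
        # Compare two RDMs by correlating their upper triangles
        # (diagonal excluded, since self-dissimilarity is always zero).
        a, b = rdm(model_patterns), rdm(brain_patterns)
        iu = np.triu_indices_from(a, k=1)
        return np.corrcoef(a[iu], b[iu])[0, 1]
    ```

    Comparing RDMs rather than raw responses is what lets a model's feature space and fMRI voxel space be related without any common coordinate system.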

  5. The chemical information ontology: provenance and disambiguation for chemical data on the biological semantic web.

    PubMed

    Hastings, Janna; Chepelev, Leonid; Willighagen, Egon; Adams, Nico; Steinbeck, Christoph; Dumontier, Michel

    2011-01-01

    Cheminformatics is the application of informatics techniques to solve chemical problems in silico. There are many areas in biology where cheminformatics plays an important role in computational research, including metabolism, proteomics, and systems biology. One critical aspect in the application of cheminformatics in these fields is the accurate exchange of data, which is increasingly accomplished through the use of ontologies. Ontologies are formal representations of objects and their properties using a logic-based ontology language. Many such ontologies are currently being developed to represent objects across all the domains of science. Ontologies enable the definition, classification, and support for querying objects in a particular domain, enabling intelligent computer applications to be built which support the work of scientists both within the domain of interest and across interrelated neighbouring domains. Modern chemical research relies on computational techniques to filter and organise data to maximise research productivity. The objects which are manipulated in these algorithms and procedures, as well as the algorithms and procedures themselves, enjoy a kind of virtual life within computers. We will call these information entities. Here, we describe our work in developing an ontology of chemical information entities, with a primary focus on data-driven research and the integration of calculated properties (descriptors) of chemical entities within a semantic web context. Our ontology distinguishes algorithmic, or procedural information from declarative, or factual information, and renders of particular importance the annotation of provenance to calculated data. The Chemical Information Ontology is being developed as an open collaborative project. More details, together with a downloadable OWL file, are available at http://code.google.com/p/semanticchemistry/ (license: CC-BY-SA).

  7. A Cyber Enabled Collaborative Environment for Creating, Sharing and Using Data and Modeling Driven Curriculum Modules for Hydrology Education

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.; Fox, S.; Iverson, E. A. R.

    2014-12-01

    With access to emerging datasets and computational tools, there is a need to bring these capabilities into hydrology classrooms. However, developing curriculum modules that use data and models to augment classroom teaching is hindered by a steep technology learning curve, rapid technology turnover, and the lack of an organized community cyberinfrastructure (CI) for the dissemination, publication, and sharing of the latest tools and curriculum material for hydrology and geoscience education. The objective of this project is to overcome some of these limitations by developing a cyber-enabled collaborative environment for publishing, sharing, and adopting data- and modeling-driven curriculum modules in hydrology and geoscience classrooms. The CI is based on Carleton College's Science Education Resource Center (SERC) Content Management System. Building on its existing community authoring capabilities, the system is being extended to allow assembly of new teaching activities by drawing on a collection of interchangeable building blocks, each of which represents a step in the modeling process. Currently the system hosts more than 30 modules or steps, which can be combined to create multiple learning units. Two specific units, Unit Hydrograph and Rational Method, have been used in undergraduate hydrology classrooms at Purdue University and Arizona State University. The structure of the CI and the lessons learned from its implementation, including preliminary results from student assessments of learning, will be presented.

  8. The Use of Linking Adverbials in Academic Essays by Non-Native Writers: How Data-Driven Learning Can Help

    ERIC Educational Resources Information Center

    Garner, James Robert

    2013-01-01

    Over the past several decades, the TESOL community has seen an increased interest in the use of data-driven learning (DDL) approaches. Most studies of DDL have focused on the acquisition of vocabulary items, including a wide range of information necessary for their correct usage. One type of vocabulary that has yet to be properly investigated has…

  9. Limited angle CT reconstruction by simultaneous spatial and Radon domain regularization based on TV and data-driven tight frame

    NASA Astrophysics Data System (ADS)

    Zhang, Wenkun; Zhang, Hanming; Wang, Linyuan; Cai, Ailong; Li, Lei; Yan, Bin

    2018-02-01

    Limited angle computed tomography (CT) reconstruction is widely performed in medical diagnosis and industrial testing because of the size of objects, engine/armor inspection requirements, and limited scan flexibility. Limited angle reconstruction necessitates the use of optimization-based methods that utilize additional sparse priors. However, most conventional methods solely exploit sparsity priors of the spatial domain. When CT projections suffer from serious data deficiency or noise, obtaining reconstructed images of acceptable quality becomes difficult and challenging. To solve this problem, this paper develops an adaptive reconstruction method for the limited angle CT problem. The proposed method uses a simultaneous spatial and Radon domain regularization model based on total variation (TV) and a data-driven tight frame. The data-driven tight frame, derived from wavelet transformation, exploits sparsity priors of the sinogram in the Radon domain. Unlike existing works that utilize a pre-constructed sparse transformation, the framelets of the data-driven regularization model can be adaptively learned from the latest projection data during iterative reconstruction to provide optimal sparse approximations for a given sinogram. At the same time, an effective alternating direction method is designed to solve the simultaneous spatial and Radon domain regularization model. Experiments on both simulated and real data demonstrate that the proposed algorithm performs better in artifact suppression and detail preservation than algorithms using only a spatial domain regularization model. Quantitative evaluations also indicate that the proposed algorithm, with its learning strategy, performs better than dual-domain algorithms without a learned regularization model.
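    As a toy illustration of the spatial-domain prior, the anisotropic total variation of a 2-D image is just the sum of absolute differences between neighbouring pixels. This minimal sketch shows only the regularizer itself, not the full TV-plus-tight-frame reconstruction model of the paper:

    ```python
    import numpy as np

    def total_variation(img):
        # Anisotropic TV: sum of absolute differences between
        # horizontally adjacent and vertically adjacent pixels.
        img = np.asarray(img, dtype=float)
        return np.abs(np.diff(img, axis=1)).sum() + np.abs(np.diff(img, axis=0)).sum()
    ```

    A piecewise-constant image has low TV while noise and streak artifacts raise it, which is why penalizing TV suppresses artifacts while preserving sharp edges.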

  10. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

    Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on academic and learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the process of synthesizing and transforming data into information to support educators’ decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840

  11. Digital Suicide Prevention: Can Technology Become a Game-changer?

    PubMed Central

    Sahin, Ned; Kalali, Amir

    2016-01-01

    Suicide continues to be a leading cause of death and has been recognized as a significant public health issue. Rapid advances in data science can provide us with useful tools for suicide prevention, and help to dynamically assess suicide risk in quantitative data-driven ways. In this article, the authors highlight the most current international research in digital suicide prevention, including the use of machine learning, smartphone applications, and wearable sensor driven systems. The authors also discuss future opportunities for digital suicide prevention, and propose a novel Sensor-driven Mental State Assessment System. PMID:27800282

  12. Large Field Visualization with Demand-Driven Calculation

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Henze, Chris

    1999-01-01

    We present a system designed for the interactive definition and visualization of fields derived from large data sets: the Demand-Driven Visualizer (DDV). The system allows the user to write arbitrary expressions to define new fields, and then apply a variety of visualization techniques to the result. Expressions can include differential operators and numerous other built-in functions, all of which are evaluated at specific field locations completely on demand. The payoff of following a demand-driven design philosophy throughout becomes particularly evident when working with large time-series data, where the costs of eager evaluation alternatives can be prohibitive.
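    The demand-driven idea, evaluating a derived field only at the locations actually requested and caching the results, can be sketched in a few lines. The `Field` class below is a hypothetical illustration of the principle, not the DDV implementation:

    ```python
    class Field:
        # A field evaluated lazily: values are computed only at the
        # sample points actually requested, and memoized thereafter.
        def __init__(self, fn):
            self.fn = fn
            self.cache = {}

        def __call__(self, point):
            if point not in self.cache:
                self.cache[point] = self.fn(point)
            return self.cache[point]

        def __add__(self, other):
            # Derived fields are expressions over existing fields;
            # nothing is computed until the result is sampled.
            return Field(lambda p: self(p) + other(p))
    ```

    Sampling a derived expression touches only the requested points of each operand, which is what keeps costs manageable when eager evaluation of a whole large time-series field would be prohibitive.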

  13. Palm Beach Community College Strategic Plan, 1999-2004.

    ERIC Educational Resources Information Center

    Samuels, Seymour

    This report addresses strategies and action plans for Palm Beach Community College (PBCC) (Florida) between 1999-2004. As part of a commitment to achieve specific, measurable end results, the college has set various objectives, including: (1) develop, implement and institutionalize a mission driven strategic budget for the 1999-2000 fiscal year;…

  14. The UARS (Upper Atmosphere Research Satellite): A program to study global ozone change

    NASA Technical Reports Server (NTRS)

    1989-01-01

    NASA's Upper Atmosphere Research Satellite (UARS) program, its goals and objectives are described. Also included are its significance to upper atmosphere science, the experimental and theoretical investigations that comprise it, and the compelling issues of global change, driven by human activities, that led NASA to plan and implement it.

  15. Dynamic Data Driven Applications Systems (DDDAS)

    DTIC Science & Technology

    2012-05-03

    response) – Earthquakes, hurricanes, tornados, wildfires, floods, landslides, tsunamis, … • Critical infrastructure systems – electric power grid, … • Multiphase Flow, Weather and Climate, Structural Mechanics, Seismic Processing, Aerodynamics, Geophysical Fluids, Quantum Chemistry, Actinide Chemistry, … Alloys • Approach and objectives: consider porous SMAs: similar macroscopic behavior, but mass/weight is less, and thus attractive for …

  16. How Is European Governance Configuring the EHEA?

    ERIC Educational Resources Information Center

    Magalhães, António; Veiga, Amélia; Sousa, Sofia; Ribeiro, Filipa

    2012-01-01

    This article focuses on the interaction between the European dimension driven by the creation of the European Higher Education Area (EHEA) and the development of national reforms to fulfil that objective. On the basis of data gathered in eight countries involved in EuroHESC project TRUE (Transforming European Universities), the curricular and the…

  17. Identifying Measures Used for Assessing Quality of YouTube Videos with Patient Health Information: A Review of Current Literature

    PubMed Central

    Fernandez-Luque, Luis; Armayones, Manuel; Lau, Annie YS

    2013-01-01

    Background Recent publications on YouTube have advocated its potential for patient education. However, a reliable description of what could be considered quality information for patient education on YouTube is missing. Objective To identify topics associated with the concept of quality information for patient education on YouTube in the scientific literature. Methods A literature review was performed in MEDLINE, ISI Web of Knowledge, Scopus, and PsychINFO. Abstract selection was first conducted by two independent reviewers; discrepancies were discussed in a second abstract review with two additional independent reviewers. Full text of selected papers were analyzed looking for concepts, definitions, and topics used by its authors that focused on the quality of information on YouTube for patient education. Results In total, 456 abstracts were extracted and 13 papers meeting eligibility criteria were analyzed. Concepts identified related to quality of information for patient education are categorized as expert-driven, popularity-driven, or heuristic-driven measures. These include (in descending order): (1) quality of content in 10/13 (77%), (2) view count in 9/13 (69%), (3) health professional opinion in 8/13 (62%), (4) adequate length or duration in 6/13 (46%), (5) public ratings in 5/13 (39%), (6) adequate title, tags, and description in 5/13 (39%), (7) good description or a comprehensive narrative in 4/13 (31%), (8) evidence-based practices included in video in 4/13 (31%), (9) suitability as a teaching tool in 4/13 (31%), (10) technical quality in 4/13 (31%), (11) credentials provided in video in 4/13 (31%), (12) enough amount of content to identify its objective in 3/13 (23%), and (13) viewership share in 2/13 (15%). Conclusions Our review confirms that the current topics linked to quality of information for patient education on YouTube are unclear and not standardized. 
Although expert-driven, popularity-driven, or heuristic-driven measures are used as proxies to estimate the quality of video information, caution should be applied when using YouTube for health promotion and patient educational material. PMID:23612432

  18. Data-Driven Diffusion Of Innovations: Successes And Challenges In 3 Large-Scale Innovative Delivery Models

    PubMed Central

    Dorr, David A.; Cohen, Deborah J.; Adler-Milstein, Julia

    2018-01-01

    Failed diffusion of innovations may be linked to an inability to use and apply data, information, and knowledge to change perceptions of current practice and motivate change. Using qualitative and quantitative data from three large-scale health care delivery innovations—accountable care organizations, advanced primary care practice, and EvidenceNOW—we assessed where data-driven innovation is occurring and where challenges lie. We found that implementation of some technological components of innovation (for example, electronic health records) has occurred among health care organizations, but core functions needed to use data to drive innovation are lacking. Deficits include the inability to extract and aggregate data from the records; gaps in sharing data; and challenges in adopting advanced data functions, particularly those related to timely reporting of performance data. The unexpectedly high costs and burden incurred during implementation of the innovations have limited organizations’ ability to address these and other deficits. Solutions that could help speed progress in data-driven innovation include facilitating peer-to-peer technical assistance, providing tailored feedback reports to providers from data aggregators, and using practice facilitators skilled in using data technology for quality improvement to help practices transform. Policy efforts that promote these solutions may enable more rapid uptake of and successful participation in innovative delivery system reforms. PMID:29401031

  20. Development of anomaly detection models for deep subsurface monitoring

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.

    2017-12-01

    Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or, for that matter, the system's nominal state), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used, which include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline experiment data (no leak) and then tested on the leak experiment data. Performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.
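    As an illustration of the kind of online detector described, one trained on baseline (no-leak) data and applied point by point to a streaming signal, a rolling z-score test is a common minimal baseline. This sketch is hypothetical and far simpler than the algorithms compared in the study:

    ```python
    from collections import deque

    class OnlineZScoreDetector:
        # Flags a sample as anomalous when it lies more than `threshold`
        # standard deviations from the mean of a recent nominal window.
        def __init__(self, window=50, threshold=4.0):
            self.buf = deque(maxlen=window)
            self.threshold = threshold

        def update(self, x):
            # Returns True if x looks anomalous relative to the window.
            if len(self.buf) >= 10:  # require a minimum baseline
                mean = sum(self.buf) / len(self.buf)
                var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
                std = var ** 0.5
                is_anom = std > 0 and abs(x - mean) / std > self.threshold
            else:
                is_anom = False
            if not is_anom:
                self.buf.append(x)  # learn only from nominal points
            return is_anom
    ```

    Excluding flagged points from the window keeps an ongoing leak from contaminating the baseline statistics, a simple analogue of training the detector on no-leak experiment data only.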

  1. Reproducibility of data-driven dietary patterns in two groups of adult Spanish women from different studies.

    PubMed

    Castelló, Adela; Lope, Virginia; Vioque, Jesús; Santamariña, Carmen; Pedraz-Pingarrón, Carmen; Abad, Soledad; Ederra, Maria; Salas-Trejo, Dolores; Vidal, Carmen; Sánchez-Contador, Carmen; Aragonés, Nuria; Pérez-Gómez, Beatriz; Pollán, Marina

    2016-08-01

    The objective of the present study was to assess the reproducibility of data-driven dietary patterns in different samples extracted from similar populations. Dietary patterns were extracted by applying principal component analyses to the dietary information collected from a sample of 3550 women recruited from seven screening centres belonging to the Spanish breast cancer (BC) screening network (Determinants of Mammographic Density in Spain (DDM-Spain) study). The resulting patterns were compared with three dietary patterns obtained from a previous Spanish case-control study on female BC (Epidemiological study of the Spanish group for breast cancer research (GEICAM: grupo Español de investigación en cáncer de mama)) using the dietary intake data of 973 healthy participants. The level of agreement between patterns was determined using both the congruence coefficient (CC) between the pattern loadings (considering patterns with a CC≥0·85 as fairly similar) and the linear correlation between patterns scores (considering as fairly similar those patterns with a statistically significant correlation). The conclusions reached with both methods were compared. This is the first study exploring the reproducibility of data-driven patterns from two studies and the first using the CC to determine pattern similarity. We were able to reproduce the EpiGEICAM Western pattern in the DDM-Spain sample (CC=0·90). However, the reproducibility of the Prudent (CC=0·76) and Mediterranean (CC=0·77) patterns was not as good. The linear correlation between pattern scores was statistically significant in all cases, highlighting its arbitrariness for determining pattern similarity. We conclude that the reproducibility of widely prevalent dietary patterns is better than the reproducibility of more population-specific patterns. More methodological studies are needed to establish an objective measurement and threshold to determine pattern similarity.
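    The congruence coefficient used to compare pattern loadings is a simple normalized inner product. A minimal sketch follows, with the study's CC ≥ 0.85 cut-off for "fairly similar" patterns; the function names are illustrative:

    ```python
    import math

    def congruence_coefficient(a, b):
        # Tucker's congruence coefficient between two loading vectors:
        # their inner product divided by the product of their norms.
        num = sum(x * y for x, y in zip(a, b))
        den = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
        return num / den

    def fairly_similar(a, b, cutoff=0.85):
        # The similarity criterion used in the study: CC >= 0.85.
        return congruence_coefficient(a, b) >= cutoff
    ```

    Unlike a Pearson correlation of pattern scores, the loadings are not mean-centred, so the coefficient is sensitive to the sign and overall level of the loadings as well as their shape.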

  2. Superwind Outflows in Seyfert Galaxies? : Large-Scale Radio Maps of an Edge-On Sample

    NASA Astrophysics Data System (ADS)

    Colbert, E.; Gallimore, J.; Baum, S.; O'Dea, C.

    1995-03-01

    Large-scale galactic winds (superwinds) are commonly found flowing out of the nuclear region of ultraluminous infrared and powerful starburst galaxies. Stellar winds and supernovae from the nuclear starburst provide the energy to drive these superwinds. The outflowing gas escapes along the rotation axis, sweeping up and shock-heating clouds in the halo, which produces optical line emission, radio synchrotron emission, and X-rays. These features can most easily be studied in edge-on systems, so that the wind emission is not confused with that from the disk. We have begun a systematic search for superwind outflows in Seyfert galaxies. In an earlier optical emission-line survey, we found extended minor axis emission and/or double-peaked emission line profiles in >~30% of the sample objects. We present here large-scale (6cm VLA C-config) radio maps of 11 edge-on Seyfert galaxies, selected (without bias) from a distance-limited sample of 23 edge-on Seyferts. These data have been used to estimate the frequency of occurrence of superwinds. Preliminary results indicate that four (36%) of the 11 objects observed and six (26%) of the 23 objects in the distance-limited sample have extended radio emission oriented perpendicular to the galaxy disk. This emission may be produced by a galactic wind blowing out of the disk. Two (NGC 2992 and NGC 5506) of the nine objects for which we have both radio and optical data show good evidence for a galactic wind in both datasets. We suggest that galactic winds occur in >~30% of all Seyferts. A goal of this work is to find a diagnostic that can be used to distinguish between large-scale outflows that are driven by starbursts and those that are driven by an AGN. The presence of starburst-driven superwinds in Seyferts, if established, would have important implications for the connection between starburst galaxies and AGN.

  3. Dynamic Data Driven Methods for Self-aware Aerospace Vehicles

    DTIC Science & Technology

    2015-04-08

    structural response model that incorporates multiple degradation or failure modes, including damaged panel strength (BVID, thru-hole), damaged panel stiffness (BVID, thru-hole), loose fastener, fretted fastener hole, and disbonded surface. • A new data-driven approach for the online updating of the flight... between the first and second plies. The panels were reinforced around the borders of the panel with through holes to simulate mounting the wing skins to

  4. Item and source memory for emotional associates is mediated by different retrieval processes.

    PubMed

    Ventura-Bort, Carlos; Dolcos, Florin; Wendt, Julia; Wirkner, Janine; Hamm, Alfons O; Weymar, Mathias

    2017-12-12

    Recent event-related potential (ERP) data showed that neutral objects encoded in emotional background pictures were better remembered than objects encoded in neutral contexts, when recognition memory was tested one week later. In the present study, we investigated whether this long-term memory advantage for items is also associated with correct memory for contextual source details. Furthermore, we were interested in the possibly dissociable contribution of familiarity and recollection processes (using a Remember/Know procedure). The results revealed that item memory performance was mainly driven by the subjective experience of familiarity, irrespective of whether the objects were previously encoded in emotional or neutral contexts. Correct source memory for the associated background picture, however, was driven by recollection and enhanced when the content was emotional. In ERPs, correctly recognized old objects evoked frontal ERP Old/New effects (300-500ms), irrespective of context category. As in our previous study (Ventura-Bort et al., 2016b), retrieval for objects from emotional contexts was associated with larger parietal Old/New differences (600-800ms), indicating stronger involvement of recollection. Thus, the results suggest a stronger contribution of recollection-based retrieval to item and contextual background source memory for neutral information associated with an emotional event. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Molecular Gas in Obscured and Extremely Red Quasars at z ˜ 2.5

    NASA Astrophysics Data System (ADS)

    Alexandroff, Rachael; Zakamska, Nadia; Hamann, Fred; Greene, Jenny; Rahman, Mubdi

    2018-01-01

    Quasar feedback is a key element of modern galaxy evolution theory. During powerful episodes of feedback, quasar-driven winds are suspected of removing large amounts of molecular gas from the host galaxy, thus limiting supplies for star formation and ultimately curtailing the maximum mass of galaxies. Here we present Karl G. Jansky Very Large Array (VLA) observations of the CO(1-0) transition in 11 powerful obscured and extremely red quasars (ERQs) at z ~ 2.5. Previous observations have shown that several of these targets display signatures of powerful quasar-driven winds in their ionized gas. Molecular emission is not detected in any individual object, whether kinematically disturbed by a quasar wind or in equilibrium with the host galaxy, nor is molecular gas detected in a combined stack of all objects (equivalent to an exposure time of over 10 hours with the VLA). This is in contrast with previous suggestions that such objects should reside in gas-rich, extremely star-forming galaxies. Possible explanations include a paucity of molecular gas or an excess of high-excitation molecular gas, both of which could be results of quasar feedback. In the radio continuum, we detect average point-like (<5 kpc) emission with luminosity νL_ν[33 GHz] = 2.2 × 10⁴² erg s⁻¹, consistent with optically thin (α ≈ -1.0) synchrotron emission with some possible contribution from thermal free-free emission. The continuum radio emission of these radio-intermediate objects may be a by-product of radiatively driven winds or may be due to weak jets confined to the host galaxy.

  6. Toward a Literature-Driven Definition of Big Data in Healthcare

    PubMed Central

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with log(n×p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488
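
    The volume criterion above is directly computable. A minimal sketch, assuming the logarithm is base 10 (an assumption consistent with the paper's order-of-magnitude framing; the example datasets are hypothetical):

```python
import math

def is_big_data(n, p):
    """Baro et al.'s proposed volume threshold: a dataset is 'big' when
    log10(n * p) >= 7, with n statistical individuals and p variables.
    The base-10 logarithm is assumed from context."""
    return math.log10(n * p) >= 7

# Hypothetical examples:
is_big_data(5000, 500000)  # GWAS-scale: log10(2.5e9) ~ 9.4 -> True
is_big_data(200, 30)       # small clinical trial -> False
```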

  7. Overcoming Barriers to Integrating Behavioral Health and Primary Care Services

    PubMed Central

    Grazier, Kyle L.; Smiley, Mary L.; Bondalapati, Kirsten S.

    2016-01-01

    Objective: Despite barriers, organizations with varying characteristics have achieved full integration of primary care services with providers and services that identify, treat, and manage those with mental health and substance use disorders. What are the key factors and common themes in stories of this success? Methods: A systematic literature review and snowball sampling technique was used to identify organizations. Site visits and key informant interviews were conducted with 6 organizations that had over time integrated behavioral health and primary care services. Case studies of each organization were independently coded to identify traits common to multiple organizations. Results: Common characteristics include prioritized vulnerable populations, extensive community collaboration, team approaches that included the patient and family, diversified funding streams, and data-driven approaches and practices. Conclusions: While significant barriers to integrating behavioral health and primary care services exist, case studies of organizations that have successfully overcome these barriers share certain common factors. PMID:27380923

  8. Kangaroo mother care: a systematic review of barriers and enablers

    PubMed Central

    Labar, Amy S; Wall, Stephen; Atun, Rifat

    2016-01-01

    Abstract Objective To investigate factors influencing the adoption of kangaroo mother care in different contexts. Methods We searched PubMed, Embase, Scopus, Web of Science and the World Health Organization’s regional databases, for studies on “kangaroo mother care” or “kangaroo care” or “skin-to-skin care” from 1 January 1960 to 19 August 2015, without language restrictions. We included programmatic reports and hand-searched references of published reviews and articles. Two independent reviewers screened articles and extracted data on carers, health system characteristics and contextual factors. We developed a conceptual model to analyse the integration of kangaroo mother care in health systems. Findings We screened 2875 studies and included 112 studies that contained qualitative data on implementation. Kangaroo mother care was applied in different ways in different contexts. The studies show that there are several barriers to implementing kangaroo mother care, including the need for time, social support, medical care and family acceptance. Barriers within health systems included organization, financing and service delivery. In the broad context, cultural norms influenced perceptions and the success of adoption. Conclusion Kangaroo mother care is a complex intervention that is behaviour driven and includes multiple elements. Success of implementation requires high user engagement and stakeholder involvement. Future research includes designing and testing models of specific interventions to improve uptake. PMID:26908962

  9. Hand Rehabilitation Learning System With an Exoskeleton Robotic Glove.

    PubMed

    Ma, Zhou; Ben-Tzvi, Pinhas; Danoff, Jerome

    2016-12-01

    This paper presents a hand rehabilitation learning system, the SAFE Glove, a device that can be utilized to enhance the rehabilitation of subjects with disabilities. This system is able to learn fingertip motion and force for grasping different objects and then record and analyze the common movements of hand function including grip and release patterns. The glove is then able to reproduce these movement patterns in playback fashion to assist a weakened hand to accomplish these movements, or to modulate the assistive level based on the user's or therapist's intent for the purpose of hand rehabilitation therapy. Preliminary data have been collected from healthy hands. To demonstrate the glove's ability to manipulate the hand, the glove has been fitted on a wooden hand and the grasping of various objects was performed. To further prove that hands can be safely driven by this haptic mechanism, force sensor readings placed between each finger and the mechanism are plotted. These experimental results demonstrate the potential of the proposed system in rehabilitation therapy.

  10. Role of fusiform and anterior temporal cortical areas in facial recognition.

    PubMed

    Nasr, Shahin; Tootell, Roger B H

    2012-11-15

    Recent fMRI studies suggest that cortical face processing extends well beyond the fusiform face area (FFA), including unspecified portions of the anterior temporal lobe. However, the exact location of such anterior temporal region(s), and their role during active face recognition, remain unclear. Here we demonstrate that (in addition to FFA) a small bilateral site in the anterior tip of the collateral sulcus ('AT'; the anterior temporal face patch) is selectively activated during recognition of faces but not houses (a non-face object). In contrast to the psychophysical prediction that inverted and contrast reversed faces are processed like other non-face objects, both FFA and AT (but not other visual areas) were also activated during recognition of inverted and contrast reversed faces. However, response accuracy was better correlated to recognition-driven activity in AT, compared to FFA. These data support a segregated, hierarchical model of face recognition processing, extending to the anterior temporal cortex. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Role of Fusiform and Anterior Temporal Cortical Areas in Facial Recognition

    PubMed Central

    Nasr, Shahin; Tootell, Roger BH

    2012-01-01

    Recent FMRI studies suggest that cortical face processing extends well beyond the fusiform face area (FFA), including unspecified portions of the anterior temporal lobe. However, the exact location of such anterior temporal region(s), and their role during active face recognition, remain unclear. Here we demonstrate that (in addition to FFA) a small bilateral site in the anterior tip of the collateral sulcus (‘AT’; the anterior temporal face patch) is selectively activated during recognition of faces but not houses (a non-face object). In contrast to the psychophysical prediction that inverted and contrast reversed faces are processed like other non-face objects, both FFA and AT (but not other visual areas) were also activated during recognition of inverted and contrast reversed faces. However, response accuracy was better correlated to recognition-driven activity in AT, compared to FFA. These data support a segregated, hierarchical model of face recognition processing, extending to the anterior temporal cortex. PMID:23034518

  12. Text mining for adverse drug events: the promise, challenges, and state of the art.

    PubMed

    Harpaz, Rave; Callahan, Alison; Tamang, Suzanne; Low, Yen; Odgers, David; Finlayson, Sam; Jung, Kenneth; LePendu, Paea; Shah, Nigam H

    2014-10-01

    Text mining is the computational process of extracting meaningful information from large amounts of unstructured text. It is emerging as a tool to leverage underutilized data sources that can improve pharmacovigilance, including the objective of adverse drug event (ADE) detection and assessment. This article provides an overview of recent advances in pharmacovigilance driven by the application of text mining, and discusses several data sources-such as biomedical literature, clinical narratives, product labeling, social media, and Web search logs-that are amenable to text mining for pharmacovigilance. Given the state of the art, it appears text mining can be applied to extract useful ADE-related information from multiple textual sources. Nonetheless, further research is required to address remaining technical challenges associated with the text mining methodologies, and to conclusively determine the relative contribution of each textual source to improving pharmacovigilance.

  13. Text Mining for Adverse Drug Events: the Promise, Challenges, and State of the Art

    PubMed Central

    Harpaz, Rave; Callahan, Alison; Tamang, Suzanne; Low, Yen; Odgers, David; Finlayson, Sam; Jung, Kenneth; LePendu, Paea; Shah, Nigam H.

    2014-01-01

    Text mining is the computational process of extracting meaningful information from large amounts of unstructured text. Text mining is emerging as a tool to leverage underutilized data sources that can improve pharmacovigilance, including the objective of adverse drug event detection and assessment. This article provides an overview of recent advances in pharmacovigilance driven by the application of text mining, and discusses several data sources—such as biomedical literature, clinical narratives, product labeling, social media, and Web search logs—that are amenable to text-mining for pharmacovigilance. Given the state of the art, it appears text mining can be applied to extract useful ADE-related information from multiple textual sources. Nonetheless, further research is required to address remaining technical challenges associated with the text mining methodologies, and to conclusively determine the relative contribution of each textual source to improving pharmacovigilance. PMID:25151493

  14. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
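
    The five classes of digital objects enumerated above can be organized as a simple manifest for a model run. This is an illustrative structure only, with hypothetical file names; it is not HydroShare's actual resource schema:

```python
# Illustrative manifest covering the five tracked digital-object classes for
# one reproducible model run; every file name here is hypothetical.
manifest = {
    "raw_data":           ["precip_raw.csv", "streamflow_raw.csv"],
    "processing_scripts": ["clean_forcing.py"],
    "model_inputs":       ["forcing.nc", "parameters.txt"],
    "model_results":      ["simulated_flow.nc"],
    "model_code":         {"model": "SUMMA",
                           "dependencies": ["netCDF library", "Fortran compiler"]},
}
```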

  15. Toward Computational Cumulative Biology by Combining Models of Biological Datasets

    PubMed Central

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations—for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database. PMID:25427176

  16. Toward computational cumulative biology by combining models of biological datasets.

    PubMed

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations-for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database.
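
    The decomposition step described above can be sketched as follows: represent each earlier model and the new dataset as feature vectors, fit nonnegative mixing weights by projected gradient descent, and read the weights as relevance scores. This is a toy stand-in for the paper's combination model, under the assumption that the vectors have been precomputed:

```python
def decompose(target, models, steps=2000, lr=0.01):
    """Fit nonnegative weights w so that sum_k w[k] * models[k] ~ target
    (least squares with a nonnegativity projection). The returned weights
    rank the earlier models by relevance to the new dataset."""
    m, d = len(models), len(target)
    w = [1.0 / m] * m
    for _ in range(steps):
        # residual of the current reconstruction
        r = [sum(w[k] * models[k][i] for k in range(m)) - target[i]
             for i in range(d)]
        for k in range(m):
            grad = sum(r[i] * models[k][i] for i in range(d))
            w[k] = max(0.0, w[k] - lr * grad)      # project onto w[k] >= 0
    return w
```

    A dataset identical to one earlier model should recover a weight near 1 for that model and near 0 for the rest.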

  17. Applications of artificial intelligence to space station and automated software techniques: High level robot command language

    NASA Technical Reports Server (NTRS)

    Mckee, James W.

    1989-01-01

    The objective is to develop a system that will allow a person not necessarily skilled in the art of programming robots to quickly and naturally create the necessary data and commands to enable a robot to perform a desired task. The system will use a menu-driven graphical user interface. This interface will allow the user to input data to select objects to be moved. There will be an embedded expert system to process the knowledge about objects and the robot to determine how they are to be moved. There will be automatic path planning to avoid obstacles in the work space and to create a near-optimal path. The system will contain the software to generate the required robot instructions.

  18. Metadata-Driven SOA-Based Application for Facilitation of Real-Time Data Warehousing

    NASA Astrophysics Data System (ADS)

    Pintar, Damir; Vranić, Mihaela; Skočir, Zoran

    Service-oriented architecture (SOA) has already been widely recognized as an effective paradigm for achieving integration of diverse information systems. SOA-based applications can cross boundaries of platforms, operating systems and proprietary data standards, commonly through the usage of Web Services technology. Metadata, on the other hand, is also commonly regarded as a potential integration tool, given the fact that standardized metadata objects can provide useful information about the specifics of unfamiliar information systems with which one wishes to communicate, an approach commonly called "model-based integration". This paper presents the result of research regarding possible synergy between those two integration facilitators. This is accomplished with a vertical example of a metadata-driven SOA-based business process that provides ETL (Extraction, Transformation and Loading) and metadata services to a data warehousing system in need of real-time ETL support.

  19. The ``Missing Compounds'' affair in functionality-driven material discovery

    NASA Astrophysics Data System (ADS)

    Zunger, Alex

    2014-03-01

    In the paradigm of ``data-driven discovery,'' underlying one of the leading streams of the Materials Genome Initiative (MGI), one attempts to compute, high-throughput style, as many properties as possible of the N (about 10⁵-10⁶) compounds listed in databases of previously known compounds. One then inspects the ensuing Big Data, searching for useful trends. The alternative and complementary paradigm of ``functionality-directed search and optimization'' used here searches instead for the n ≪ N configurations and compositions that have the desired value of the target functionality. Examples include the use of genetic and other search methods that optimize the structure or identity of atoms on lattice sites, using atomistic electronic-structure (such as first-principles) approaches in search of a given electronic property. This addresses a few of the bottlenecks that have faced the alternative data-driven/high-throughput/Big Data philosophy: (i) When the configuration space is theoretically of infinite size, building a complete database as in data-driven discovery is impossible, yet searching for the optimum functionality is still a well-posed problem. (ii) The configuration space that we explore might include artificially grown, kinetically stabilized systems (such as 2D layer stacks, superlattices, colloidal nanostructures, and fullerenes) that are not listed in the compound databases used by data-driven approaches. (iii) A large fraction of chemically plausible compounds have not been experimentally synthesized, so in the data-driven approach these are often skipped. In our approach we search explicitly for such ``Missing Compounds''. It is likely that many interesting material properties will be found in cases (i)-(iii) that elude high-throughput searches based on databases encapsulating existing knowledge.
    I will illustrate (a) functionality-driven discovery of topological insulators and valley-split quantum-computer semiconductors, as well as (b) the use of ``first-principles thermodynamics'' to discern which of the previously ``missing compounds'' should, in fact, exist and in which structure. Synthesis efforts by the Poeppelmeier group at NU realized 20 never-before-made half-Heusler compounds out of the 20 predicted ones, in our predicted space groups. This type of theory-led experimental search for designed materials with target functionalities may shorten the current process of discovering interesting functional materials. Supported by the DOE, Office of Science, Energy Frontier Research Center for Inverse Design.

  20. Older driver estimates of driving exposure compared to in-vehicle data in the Candrive II study.

    PubMed

    Porter, Michelle M; Smith, Glenys A; Cull, Andrew W; Myers, Anita M; Bédard, Michel; Gélinas, Isabelle; Mazer, Barbara L; Marshall, Shawn C; Naglie, Gary; Rapoport, Mark J; Tuokko, Holly A; Vrkljan, Brenda H

    2015-01-01

    Most studies on older adults' driving practices have relied on self-reported information. With technological advances it is now possible to objectively measure the everyday driving of older adults in their own vehicles over time. The purpose of this study was to examine the ability of older drivers to accurately estimate their kilometers driven over one year relative to objectively measured driving exposure. A subsample (n = 159 of 928; 50.9% male) of Candrive II participants (aged ≥ 70 years) was used in these analyses, based on strict criteria for data collected from questionnaires as well as an OttoView-CD Autonomous Data Logging Device installed in their vehicles over the first year of the prospective cohort study. Although there was no significant difference overall between the self-reported and objectively measured distance categories, only moderate agreement was found (weighted kappa = 0.57; 95% confidence interval, 0.47-0.67). Almost half (45.3%) chose the wrong distance category, and some people misestimated their distance driven by up to 20,000 km. Those who misjudged in the low mileage group (≤5000 km) consistently underestimated, whereas the reverse was found for those in the high distance categories (≥20,000 km); that is, they always overestimated their driving distance. Although self-reported driving distance categories may be adequate for studies entailing broad group comparisons, caution should be used in interpreting results. Use of self-reported estimates for individual assessments should be discouraged.
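
    The agreement statistic reported above (weighted kappa) can be computed directly from the two sets of category ratings. The following is a generic textbook implementation, not the study's analysis code; the study does not state whether linear or quadratic weights were used, so the linear default here is an assumption:

```python
def weighted_kappa(a, b, k, weights="linear"):
    """Weighted Cohen's kappa for two raters over k ordered categories
    labelled 0..k-1. Disagreements are penalised by the distance between
    categories (linear or quadratic weights)."""
    n = len(a)
    obs = [[0.0] * k for _ in range(k)]          # observed contingency table
    for x, y in zip(a, b):
        obs[x][y] += 1
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    observed = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * row[i] * col[j] / n for i in range(k) for j in range(k))
    return 1.0 - observed / expected
```

    Perfect agreement yields 1.0, chance-level agreement yields 0, and systematic disagreement is negative.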

  1. A Pan-Carina Young Stellar Object Catalog: Intermediate-mass Young Stellar Objects in the Carina Nebula Identified Via Mid-infrared Excess Emission

    NASA Astrophysics Data System (ADS)

    Povich, Matthew S.; Smith, Nathan; Majewski, Steven R.; Getman, Konstantin V.; Townsley, Leisa K.; Babler, Brian L.; Broos, Patrick S.; Indebetouw, Rémy; Meade, Marilyn R.; Robitaille, Thomas P.; Stassun, Keivan G.; Whitney, Barbara A.; Yonekura, Yoshinori; Fukui, Yasuo

    2011-05-01

    We present a catalog of 1439 young stellar objects (YSOs) spanning the 1.42 deg² field surveyed by the Chandra Carina Complex Project (CCCP), which includes the major ionizing clusters and the most active sites of ongoing star formation within the Great Nebula in Carina. Candidate YSOs were identified via infrared (IR) excess emission from dusty circumstellar disks and envelopes, using data from the Spitzer Space Telescope (the Vela-Carina survey) and the Two-Micron All Sky Survey. We model the 1-24 μm IR spectral energy distributions of the YSOs to constrain physical properties. Our Pan-Carina YSO Catalog (PCYC) is dominated by intermediate-mass (2 M⊙ < m ≲ 10 M⊙) objects with disks, including Herbig Ae/Be stars and their less evolved progenitors. The PCYC provides a valuable complementary data set to the CCCP X-ray source catalogs, identifying 1029 YSOs in Carina with no X-ray detection. We also catalog 410 YSOs with X-ray counterparts, including 62 candidate protostars. Candidate protostars with X-ray detections tend to be more evolved than those without. In most cases, X-ray emission apparently originating from intermediate-mass, disk-dominated YSOs is consistent with the presence of low-mass companions, but we also find that X-ray emission correlates with cooler stellar photospheres and higher disk masses. We suggest that intermediate-mass YSOs produce X-rays during their early pre-main-sequence evolution, perhaps driven by magnetic dynamo activity during the convective atmosphere phase, but this emission dies off as the stars approach the main sequence. Extrapolating over the stellar initial mass function scaled to the PCYC population, we predict a total population of >2 × 10⁴ YSOs and a present-day star formation rate (SFR) of >0.008 M⊙ yr⁻¹. The global SFR in the Carina Nebula, averaged over the past ~5 Myr, has been approximately constant.

  2. Social influence on selection behaviour: Distinguishing local- and global-driven preferential attachment

    PubMed Central

    Pan, Xue; Liu, Kecheng

    2017-01-01

    Social influence drives human selection behaviour when numerous objects compete for limited attention, leading to ‘rich get richer’ dynamics in which popular objects tend to attract even more attention. However, evidence shows that both the global information of the whole system and the local information among one’s friends significantly influence one’s selections. A key question therefore arises: is it the local or the global information that is more determinative for one’s selection? Here we compare local-based and global-based influence. We show that selection behaviour is mainly driven by the local popularity of the objects, while global popularity plays a supplementary role, driving behaviour only when there is little local information for the user to refer to. We therefore propose a network model to describe the mechanism of user-object interaction evolution under social influence, in which users perform either local-driven or global-driven preferential attachment to the objects, i.e., the probability that an object is selected by a target user is proportional to either its local or its global popularity. The simulation suggests that about 75% of the attachments should be driven by local popularity to reproduce the empirical observations. This means that, at least in the studied context where users chose businesses on Yelp, there is a 75% probability that a user makes a selection according to local popularity. The proposed model and the numerical findings may shed some light on the study of social influence and evolving social systems. PMID:28406984

  3. Social influence on selection behaviour: Distinguishing local- and global-driven preferential attachment.

    PubMed

    Pan, Xue; Hou, Lei; Liu, Kecheng

    2017-01-01

    Social influence drives human selection behaviour when numerous objects compete for limited attention, leading to 'rich get richer' dynamics in which popular objects tend to attract even more attention. However, evidence shows that both the global information of the whole system and the local information among one's friends significantly influence one's selections. A key question therefore arises: is it the local or the global information that is more determinative for one's selection? Here we compare local-based and global-based influence. We show that selection behaviour is mainly driven by the local popularity of the objects, while global popularity plays a supplementary role, driving behaviour only when there is little local information for the user to refer to. We therefore propose a network model to describe the mechanism of user-object interaction evolution under social influence, in which users perform either local-driven or global-driven preferential attachment to the objects, i.e., the probability that an object is selected by a target user is proportional to either its local or its global popularity. The simulation suggests that about 75% of the attachments should be driven by local popularity to reproduce the empirical observations. This means that, at least in the studied context where users chose businesses on Yelp, there is a 75% probability that a user makes a selection according to local popularity. The proposed model and the numerical findings may shed some light on the study of social influence and evolving social systems.
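
    The mixed attachment mechanism described above is straightforward to simulate. A minimal sketch, assuming a fixed random friend circle per user and +1 smoothing so every object starts selectable (both simplifying assumptions; the paper's model is richer):

```python
import random

def simulate(n_users, n_objects, n_steps, p_local=0.75, seed=42):
    """Each step a random user selects one object: with probability p_local,
    proportionally to the object's popularity among the user's friends;
    otherwise, proportionally to its global popularity."""
    rng = random.Random(seed)
    # fixed random friend circles (simplifying assumption)
    friends = [set(rng.sample(range(n_users), k=min(5, n_users)))
               for _ in range(n_users)]
    global_pop = [1] * n_objects                    # +1 smoothing
    local_pop = [[1] * n_objects for _ in range(n_users)]
    for _ in range(n_steps):
        u = rng.randrange(n_users)
        weights = local_pop[u] if rng.random() < p_local else global_pop
        obj = rng.choices(range(n_objects), weights=weights, k=1)[0]
        global_pop[obj] += 1
        for v in range(n_users):                    # v's friends' picks shape v's local view
            if u in friends[v]:
                local_pop[v][obj] += 1
    return global_pop
```

    Run long enough, the popularity distribution becomes heavy-tailed, reproducing the 'rich get richer' effect.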

  4. Evaluation of respondent-driven sampling.

    PubMed

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. 
Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling, and caution is required when interpreting findings based on this sampling method.
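
    The coupon-based recruitment process evaluated above can be sketched as a toy simulation. The code below is a deliberately crude illustration under invented assumptions (random friendship lists, a binary trait, naive sample proportion); it is not the survey's actual recruitment protocol or an RDS inference estimator.

```python
import random

def rds_sample(pop_size, friends, n_seeds=5, coupons=3, target=250, seed=0):
    """Toy respondent-driven sample: seeds recruit up to `coupons` friends,
    who recruit in turn, until `target` people have been sampled."""
    rng = random.Random(seed)
    sampled, queue = set(), []
    for s in rng.sample(range(pop_size), n_seeds):
        sampled.add(s)
        queue.append(s)
    while queue and len(sampled) < target:
        recruiter = queue.pop(0)
        candidates = [f for f in friends[recruiter] if f not in sampled]
        for f in rng.sample(candidates, min(coupons, len(candidates))):
            if len(sampled) >= target:
                break
            sampled.add(f)
            queue.append(f)
    return sampled

# toy population: a binary trait with true prevalence ~30% and crude
# (possibly self-including) friendship lists -- illustrative only
rng = random.Random(42)
N = 2000
traits = [rng.random() < 0.3 for _ in range(N)]
friends = {i: rng.sample(range(N), 8) for i in range(N)}
sample = rds_sample(N, friends)
est = sum(traits[i] for i in sample) / len(sample)  # naive sample proportion
true = sum(traits) / N
```

    Comparing `est` against `true` over many synthetic populations is the simulation analogue of the paper's comparison of RDS estimates against total-population data.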

  5. A Wide-field Survey for Transiting Hot Jupiters and Eclipsing Pre-main-sequence Binaries in Young Stellar Associations

    NASA Astrophysics Data System (ADS)

    Oelkers, Ryan J.; Macri, Lucas M.; Marshall, Jennifer L.; DePoy, Darren L.; Lambas, Diego G.; Colazo, Carlos; Stringer, Katelyn

    2016-09-01

    The past two decades have seen a significant advancement in the detection, classification, and understanding of exoplanets and binaries. This is due, in large part, to the increase in use of small-aperture telescopes (<20 cm) to survey large areas of the sky to milli-mag precision with rapid cadence. The vast majority of the planetary and binary systems studied to date consist of main-sequence or evolved objects, leading to a dearth of knowledge of properties at early times (<50 Myr). Only a dozen binaries and one candidate transiting Hot Jupiter are known among pre-main-sequence objects, yet these are the systems that can provide the best constraints on stellar formation and planetary migration models. The deficiency in the number of well-characterized systems is driven by the inherent and aperiodic variability found in pre-main-sequence objects, which can mask and mimic eclipse signals. Hence, a dramatic increase in the number of young systems with high-quality observations is highly desirable to guide further theoretical developments. We have recently completed a photometric survey of three nearby (<150 pc) and young (<50 Myr) moving groups with a small-aperture telescope. While our survey reached the requisite photometric precision, the temporal coverage was insufficient to detect Hot Jupiters. Nevertheless, we discovered 346 pre-main-sequence binary candidates, including 74 high-priority objects for further study. This paper includes data taken at The McDonald Observatory of The University of Texas at Austin.

  6. Shaping innovations in long-term care for stroke survivors with multimorbidity through stakeholder engagement.

    PubMed

    Sadler, Euan; Porat, Talya; Marshall, Iain; Hoang, Uy; Curcin, Vasa; Wolfe, Charles D A; McKevitt, Christopher

    2017-01-01

    Stroke, like many long-term conditions, tends to be managed in isolation of its associated risk factors and multimorbidity. With increasing access to clinical and research data there is the potential to combine data from a variety of sources to inform interventions to improve healthcare. A 'Learning Health System' (LHS) is an innovative model of care which transforms integrated data into knowledge to improve healthcare. The objective of this study is to develop a process of engaging stakeholders in the use of clinical and research data to co-produce potential solutions, informed by an LHS, to improve long-term care for stroke survivors with multimorbidity. We used a stakeholder engagement study design informed by co-production principles to engage stakeholders, including service users, carers, general practitioners and other health and social care professionals, service managers, commissioners of services, policy makers, third sector representatives and researchers. Over a 10-month period we used a range of methods including stakeholder group meetings, focus groups, nominal group techniques (priority setting and consensus building) and interviews. Qualitative data were recorded, transcribed and analysed thematically. 37 participants took part in the study. The concept of how data might drive intervention development was difficult to convey and understand. The engagement process led stakeholders to identify four priority areas of data and information need: 1) improving continuity of care; 2) improving management of mental health consequences; 3) better access to health and social care; and 4) targeting multiple risk factors. These priorities informed preliminary intervention designs. The final choice of intervention was agreed by consensus, informed by consideration of the gap in evidence and local service provision, and availability of robust data. 
This shaped a co-produced decision support tool to improve secondary prevention after stroke for further development. Stakeholder engagement to identify data-driven solutions is feasible but requires resources. While a number of potential interventions were identified, the final choice rested not just on stakeholder priorities but also on data availability. Further work is required to evaluate the impact and implementation of data-driven interventions for long-term stroke survivors.

  7. Shaping innovations in long-term care for stroke survivors with multimorbidity through stakeholder engagement

    PubMed Central

    Porat, Talya; Marshall, Iain; Hoang, Uy; Curcin, Vasa; Wolfe, Charles D. A.; McKevitt, Christopher

    2017-01-01

    Background Stroke, like many long-term conditions, tends to be managed in isolation of its associated risk factors and multimorbidity. With increasing access to clinical and research data there is the potential to combine data from a variety of sources to inform interventions to improve healthcare. A ‘Learning Health System’ (LHS) is an innovative model of care which transforms integrated data into knowledge to improve healthcare. The objective of this study is to develop a process of engaging stakeholders in the use of clinical and research data to co-produce potential solutions, informed by an LHS, to improve long-term care for stroke survivors with multimorbidity. Methods We used a stakeholder engagement study design informed by co-production principles to engage stakeholders, including service users, carers, general practitioners and other health and social care professionals, service managers, commissioners of services, policy makers, third sector representatives and researchers. Over a 10-month period we used a range of methods including stakeholder group meetings, focus groups, nominal group techniques (priority setting and consensus building) and interviews. Qualitative data were recorded, transcribed and analysed thematically. Results 37 participants took part in the study. The concept of how data might drive intervention development was difficult to convey and understand. The engagement process led stakeholders to identify four priority areas of data and information need: 1) improving continuity of care; 2) improving management of mental health consequences; 3) better access to health and social care; and 4) targeting multiple risk factors. These priorities informed preliminary intervention designs. The final choice of intervention was agreed by consensus, informed by consideration of the gap in evidence and local service provision, and availability of robust data. 
This shaped a co-produced decision support tool to improve secondary prevention after stroke for further development. Conclusions Stakeholder engagement to identify data-driven solutions is feasible but requires resources. While a number of potential interventions were identified, the final choice rested not just on stakeholder priorities but also on data availability. Further work is required to evaluate the impact and implementation of data-driven interventions for long-term stroke survivors. PMID:28475606

  8. Human Papillomavirus Drives Tumor Development Throughout the Head and Neck: Improved Prognosis Is Associated With an Immune Response Largely Restricted to the Oropharynx

    PubMed Central

    Chakravarthy, Ankur; Henderson, Stephen; Thirdborough, Stephen M.; Ottensmeier, Christian H.; Su, Xiaoping; Lechner, Matt; Feber, Andrew; Thomas, Gareth J.

    2016-01-01

    Purpose In squamous cell carcinomas of the head and neck (HNSCC), the increasing incidence of oropharyngeal squamous cell carcinomas (OPSCCs) is attributable to human papillomavirus (HPV) infection. Despite commonly presenting at late stage, HPV-driven OPSCCs are associated with improved prognosis compared with HPV-negative disease. HPV DNA is also detectable in nonoropharyngeal tumors (non-OPSCC), but its pathogenic role and clinical significance are unclear. The objectives of this study were to determine whether HPV plays a causal role in non-OPSCC and to investigate whether HPV confers a survival benefit in these tumors. Methods Meta-analysis was used to build a cross-tissue gene-expression signature for HPV-driven cancer. Classifiers trained by machine-learning approaches were used to predict the HPV status of 520 HNSCCs profiled by The Cancer Genome Atlas project. DNA methylation data were similarly used to classify 464 HNSCCs, and these analyses were integrated with genomic, histopathology, and survival data to permit a comprehensive comparison of HPV transcript-positive OPSCC and non-OPSCC. Results HPV-driven tumors accounted for 4.1% of non-OPSCCs. Regardless of anatomic site, HPV+ HNSCCs shared highly similar gene expression and DNA methylation profiles; nonkeratinizing, basaloid histopathological features; and lack of TP53 or CDKN2A alterations. Improved overall survival, however, was largely restricted to HPV-driven OPSCCs, which were associated with increased levels of tumor-infiltrating lymphocytes compared with HPV-driven non-OPSCCs. Conclusion Our analysis identified a causal role for HPV in transcript-positive non-OPSCCs throughout the head and neck. Notably, however, HPV-driven non-OPSCCs display a distinct immune microenvironment and clinical behavior compared with HPV-driven OPSCCs. PMID:27863190

  9. Data-driven discovery of new Dirac semimetal materials

    NASA Astrophysics Data System (ADS)

    Yan, Qimin; Chen, Ru; Neaton, Jeffrey

    In recent years, a significant amount of materials property data from high-throughput computations based on density functional theory (DFT), together with the application of database technologies, has enabled the rise of data-driven materials discovery. In this work, we extend the data-driven materials discovery framework to the realm of topological semimetals, with the aim of accelerating the discovery of novel Dirac semimetals. We implement currently available workflows and develop new ones to data-mine the Materials Project database for novel Dirac semimetals with desirable band structures and symmetry-protected topological properties. This data-driven effort relies on the successful development of several automatic data generation and analysis tools, including a workflow for the automatic identification of topological invariants and pattern-recognition techniques to find specific features in a massive number of computed band structures. Utilizing this approach, we identified more than 15 novel Dirac-point and Dirac-nodal-line systems that had not previously been theoretically predicted or experimentally identified. This work is supported by the Materials Project Predictive Modeling Center through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231.

  10. Video Analysis of Granular Gases in a Low-Gravity Environment

    NASA Astrophysics Data System (ADS)

    Lewallen, Erin

    2004-10-01

    Granular Agglomeration in Non-Gravitating Systems is a research project undertaken by the University of Tulsa Granular Dynamics Group. The project investigates the effects of weightlessness on granular systems by studying the dynamics of a "gas" of 1-mm diameter brass ball bearings driven at various amplitudes and frequencies in low-gravity. Models predict that particles in systems subjected to these conditions should exhibit clustering behavior due to energy loss through multiple inelastic collisions. Observation and study of clustering in our experiment could shed light on this phenomenon as a possible mechanism by which particles in space coalesce to form stable objects such as planetesimals and planetary ring systems. Our experiment has flown on NASA's KC-135 low gravity aircraft. Data analysis techniques for video data collected during these flights include modification of images using Adobe Photoshop and development of ball identification and tracking programs written in Interactive Data Language. By tracking individual balls, we aim to establish speed distributions for granular gases and thereby obtain values for granular temperature.

  11. A Hypothesis-Driven Approach to Site Investigation

    NASA Astrophysics Data System (ADS)

    Nowak, W.

    2008-12-01

    Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task versus the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for other sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contaminations. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test. Otherwise, false confidence in the resulting answer would be pretended. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data worth context versus a monetary punishment term in the hypothesis-driven context. 
The basic principle is discussed and illustrated for the case of a hypothetical contaminant spill and the exceedance of critical contaminant levels at a downstream location. A tempting and important side question is whether site investigation could be tweaked towards a yes or no answer in maliciously biased campaigns through unfair formulation of the optimization objective.
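
    One way to make the proposed criterion, the chance of providing a false yes/no answer, concrete is a small Monte-Carlo sketch. The conjugate-normal toy model, numbers, and names below are illustrative assumptions, not the geostatistical machinery of the abstract; the sketch only shows how the error chance of the hypothesis test can be estimated and driven down by collecting more samples.

```python
import random

def chance_of_false_answer(prior_mean, prior_sd, limit, n_obs, obs_sd,
                           truth, seed=0, trials=2000):
    """Monte-Carlo sketch: take `n_obs` noisy measurements of the true
    concentration, form the posterior mean (conjugate normal update with
    known noise variance), answer the yes/no question 'does the concentration
    exceed the limit?', and count how often that answer is wrong."""
    rng = random.Random(seed)
    wrong = 0
    for _ in range(trials):
        obs = [truth + rng.gauss(0, obs_sd) for _ in range(n_obs)]
        # conjugate normal update of the mean
        prec = 1 / prior_sd**2 + n_obs / obs_sd**2
        post_mean = (prior_mean / prior_sd**2 + sum(obs) / obs_sd**2) / prec
        answer_yes = post_mean > limit
        truth_yes = truth > limit
        wrong += (answer_yes != truth_yes)
    return wrong / trials
```

    In the optimal-design setting, the sampling positions and numbers would be chosen to minimize exactly this error chance; here only the sample count varies.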

  12. IRDS prototyping with applications to the representation of EA/RA models

    NASA Technical Reports Server (NTRS)

    Lekkos, Anthony A.; Greenwood, Bruce

    1988-01-01

    The requirements and system overview for the Information Resources Dictionary System (IRDS) are described, and a formal design specification is presented for a scaled-down IRDS implementation compatible with the proposed FIPS IRDS standard. The major design objectives for this IRDS include a menu-driven user interface, implementation of basic IRDS operations, and PC compatibility. The IRDS was implemented using the Smalltalk/V object-oriented programming system and an AT&T 6300 personal computer running under MS-DOS 3.1. The difficulties encountered in using Smalltalk are discussed.

  13. Data-driven reverse engineering of signaling pathways using ensembles of dynamic models.

    PubMed

    Henriques, David; Villaverde, Alejandro F; Rocha, Miguel; Saez-Rodriguez, Julio; Banga, Julio R

    2017-02-01

    Despite significant efforts and remarkable progress, the inference of signaling networks from experimental data remains very challenging. The problem is particularly difficult when the objective is to obtain a dynamic model capable of predicting the effect of novel perturbations not considered during model training. The problem is ill-posed due to the nonlinear nature of these systems, the fact that only a fraction of the involved proteins and their post-translational modifications can be measured, and limitations on the technologies used for growing cells in vitro, perturbing them, and measuring their variations. As a consequence, there is a pervasive lack of identifiability. To overcome these issues, we present a methodology called SELDOM (enSEmbLe of Dynamic lOgic-based Models), which builds an ensemble of logic-based dynamic models, trains them to experimental data, and combines their individual simulations into an ensemble prediction. It also includes a model reduction step to prune spurious interactions and mitigate overfitting. SELDOM is a data-driven method, in the sense that it does not require any prior knowledge of the system: the interaction networks that act as scaffolds for the dynamic models are inferred from data using mutual information. We have tested SELDOM on a number of experimental and in silico signal transduction case-studies, including the recent HPN-DREAM breast cancer challenge. We found that its performance is highly competitive compared to state-of-the-art methods for the purpose of recovering network topology. More importantly, the utility of SELDOM goes beyond basic network inference (i.e. uncovering static interaction networks): it builds dynamic (based on ordinary differential equation) models, which can be used for mechanistic interpretations and reliable dynamic predictions in new experimental conditions (i.e. not used in the training). 
For this task, SELDOM's ensemble prediction is not only consistently better than predictions from individual models, but also often outperforms the state of the art represented by the methods used in the HPN-DREAM challenge.
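
    The scaffold-inference step that SELDOM performs, inferring an interaction network from data via mutual information, can be sketched compactly. The discretization, threshold, and function names below are illustrative assumptions; the sketch shows the principle (keep edges with high pairwise mutual information), not SELDOM's actual algorithm.

```python
from collections import Counter
from math import log

def mutual_information(x, y):
    """Mutual information (in nats) between two equal-length discrete sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def infer_scaffold(data, threshold=0.1):
    """Keep an edge between every pair of variables whose mutual information
    exceeds the threshold; the result plays the role of the static interaction
    network handed to the dynamic models."""
    names = list(data)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if mutual_information(data[a], data[b]) > threshold]
```

    In the full method, each scaffold edge would then be equipped with a logic-based dynamic model and pruned during the model-reduction step.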

  14. Data-driven reverse engineering of signaling pathways using ensembles of dynamic models

    PubMed Central

    Henriques, David; Villaverde, Alejandro F.; Banga, Julio R.

    2017-01-01

    Despite significant efforts and remarkable progress, the inference of signaling networks from experimental data remains very challenging. The problem is particularly difficult when the objective is to obtain a dynamic model capable of predicting the effect of novel perturbations not considered during model training. The problem is ill-posed due to the nonlinear nature of these systems, the fact that only a fraction of the involved proteins and their post-translational modifications can be measured, and limitations on the technologies used for growing cells in vitro, perturbing them, and measuring their variations. As a consequence, there is a pervasive lack of identifiability. To overcome these issues, we present a methodology called SELDOM (enSEmbLe of Dynamic lOgic-based Models), which builds an ensemble of logic-based dynamic models, trains them to experimental data, and combines their individual simulations into an ensemble prediction. It also includes a model reduction step to prune spurious interactions and mitigate overfitting. SELDOM is a data-driven method, in the sense that it does not require any prior knowledge of the system: the interaction networks that act as scaffolds for the dynamic models are inferred from data using mutual information. We have tested SELDOM on a number of experimental and in silico signal transduction case-studies, including the recent HPN-DREAM breast cancer challenge. We found that its performance is highly competitive compared to state-of-the-art methods for the purpose of recovering network topology. More importantly, the utility of SELDOM goes beyond basic network inference (i.e. uncovering static interaction networks): it builds dynamic (based on ordinary differential equation) models, which can be used for mechanistic interpretations and reliable dynamic predictions in new experimental conditions (i.e. not used in the training). 
For this task, SELDOM’s ensemble prediction is not only consistently better than predictions from individual models, but also often outperforms the state of the art represented by the methods used in the HPN-DREAM challenge. PMID:28166222

  15. Heterogeneous postsurgical data analytics for predictive modeling of mortality risks in intensive care units.

    PubMed

    Yun Chen; Hui Yang

    2014-01-01

    The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from these rich datasets is limited. There is a dire need to go beyond current medical practices and develop data-driven methods and tools that will enable and support (i) the handling of big data, (ii) the extraction of data-driven knowledge, and (iii) the exploitation of acquired knowledge for optimizing clinical decisions. The present study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in the ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed a postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches and yields better results in the evaluation of real-world ICU data from 4000 subjects in the database. This research shows the great potential of data-driven analytics to improve the quality of healthcare services.
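
    The time-asynchronization problem mentioned above (variables recorded at irregular, unaligned times) is commonly handled by resampling onto a common grid before feature extraction. The sketch below uses last-observation-carried-forward and three simple summary features as one plausible illustration; it is not the paper's actual pre-processing pipeline, and the feature names are invented.

```python
def align_locf(records, grid):
    """Resample an irregularly-sampled variable onto a common time grid by
    last-observation-carried-forward; None until the first observation."""
    out, i, last = [], 0, None
    records = sorted(records)  # (time, value) pairs, in time order
    for t in grid:
        while i < len(records) and records[i][0] <= t:
            last = records[i][1]
            i += 1
        out.append(last)
    return out

def extract_features(series):
    """Per-variable summary features of the kind fed to a mortality model."""
    vals = [v for v in series if v is not None]
    if not vals:
        return {"mean": None, "last": None, "trend": None}
    return {"mean": sum(vals) / len(vals),
            "last": vals[-1],
            "trend": vals[-1] - vals[0]}
```

    The resulting fixed-length feature vectors, one set per patient, are what a downstream classifier (e.g. for mortality risk) can consume despite the heterogeneity of the raw recordings.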

  16. Computational neuroscience approach to biomarkers and treatments for mental disorders.

    PubMed

    Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo

    2017-04-01

    Psychiatry research has long experienced stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility, for psychiatric therapies including neurofeedback, of the disorder-specific features found by the data-driven approach. 
Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical psychiatry. © 2016 The Authors. Psychiatry and Clinical Neurosciences © 2016 Japanese Society of Psychiatry and Neurology.

  17. An ontology-driven semantic mash-up of gene and biological pathway information: Application to the domain of nicotine dependence

    PubMed Central

    Sahoo, Satya S.; Bodenreider, Olivier; Rutter, Joni L.; Skinner, Karen J.; Sheth, Amit P.

    2008-01-01

    Objectives This paper illustrates how Semantic Web technologies (especially RDF, OWL, and SPARQL) can support information integration and make it easy to create semantic mashups (semantically integrated resources). In the context of understanding the genetic basis of nicotine dependence, we integrate gene and pathway information and show how three complex biological queries can be answered by the integrated knowledge base. Methods We use an ontology-driven approach to integrate two gene resources (Entrez Gene and HomoloGene) and three pathway resources (KEGG, Reactome and BioCyc), for five organisms, including humans. We created the Entrez Knowledge Model (EKoM), an information model in OWL for the gene resources, and integrated it with the extant BioPAX ontology designed for pathway resources. The integrated schema is populated with data from the pathway resources, publicly available in BioPAX-compatible format, and gene resources for which a population procedure was created. The SPARQL query language is used to formulate queries over the integrated knowledge base to answer the three biological queries. Results Simple SPARQL queries could easily identify hub genes, i.e., those genes whose gene products participate in many pathways or interact with many other gene products. The identification of the genes expressed in the brain turned out to be more difficult, due to the lack of a common identification scheme for proteins. Conclusion Semantic Web technologies provide a valid framework for information integration in the life sciences. Ontology-driven integration represents a flexible, sustainable and extensible solution to the integration of large volumes of information. Additional resources, which enable the creation of mappings between information sources, are required to compensate for heterogeneity across namespaces. Resource page http://knoesis.wright.edu/research/lifesci/integration/structured_data/JBI-2008/ PMID:18395495
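
    In the paper, the hub-gene question is answered with a SPARQL aggregation query over the integrated RDF knowledge base. The sketch below reproduces the same aggregation over an in-memory list of (subject, predicate, object) triples; the triples and the predicate name `participatesIn` are invented illustrations, not the actual EKoM/BioPAX vocabulary.

```python
from collections import defaultdict

# toy triples (subject, predicate, object); the vocabulary is illustrative
triples = [
    ("geneA", "participatesIn", "pathway1"),
    ("geneA", "participatesIn", "pathway2"),
    ("geneA", "participatesIn", "pathway3"),
    ("geneB", "participatesIn", "pathway1"),
    ("geneC", "participatesIn", "pathway2"),
]

def hub_genes(triples, min_pathways=2):
    """SPARQL-style GROUP BY / HAVING: genes participating in many pathways."""
    pathways = defaultdict(set)
    for s, p, o in triples:
        if p == "participatesIn":
            pathways[s].add(o)
    return sorted(g for g, ps in pathways.items() if len(ps) >= min_pathways)
```

    The real query would additionally traverse ontology mappings between gene and pathway namespaces, which is exactly where the paper reports heterogeneity problems.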

  18. Full body musculoskeletal model for muscle-driven simulation of human gait

    PubMed Central

    Rajagopal, Apoorva; Dembia, Christopher L.; DeMers, Matthew S.; Delp, Denny D.; Hicks, Jennifer L.; Delp, Scott L.

    2017-01-01

    Objective Musculoskeletal models provide a non-invasive means to study human movement and predict the effects of interventions on gait. Our goal was to create an open-source, three-dimensional musculoskeletal model with high-fidelity representations of the lower limb musculature of healthy young individuals that can be used to generate accurate simulations of gait. Methods Our model includes bony geometry for the full body, 37 degrees of freedom to define joint kinematics, Hill-type models of 80 muscle-tendon units actuating the lower limbs, and 17 ideal torque actuators driving the upper body. The model’s musculotendon parameters are derived from previous anatomical measurements of 21 cadaver specimens and magnetic resonance images of 24 young healthy subjects. We tested the model by evaluating its computational time and accuracy of simulations of healthy walking and running. Results Generating muscle-driven simulations of normal walking and running took approximately 10 minutes on a typical desktop computer. The differences between our muscle-generated and inverse dynamics joint moments were within 3% (RMSE) of the peak inverse dynamics joint moments in both walking and running, and our simulated muscle activity showed qualitative agreement with salient features from experimental electromyography data. Conclusion These results suggest that our model is suitable for generating muscle-driven simulations of healthy gait. We encourage other researchers to further validate and apply the model to study other motions of the lower-extremity. Significance The model is implemented in the open source software platform OpenSim. The model and data used to create and test the simulations are freely available at https://simtk.org/home/full_body/, allowing others to reproduce these results and create their own simulations. PMID:27392337
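
    The Hill-type muscle-tendon units mentioned above compute active force as activation times maximum isometric force, scaled by force-length and force-velocity factors. The sketch below illustrates that multiplicative structure only; the Gaussian and hyperbolic curve shapes and all constants are stand-in assumptions, not the exact curves of the published model.

```python
from math import exp

def hill_muscle_force(act, f_max, lm, vm, l_opt=1.0, v_max=10.0):
    """Minimal Hill-type active force: activation * F_max * f_L(length) *
    f_V(velocity), with illustrative curve shapes."""
    fl = exp(-((lm / l_opt - 1.0) ** 2) / 0.45)          # Gaussian force-length
    v = max(-1.0, min(1.0, vm / v_max))                  # normalized fiber velocity
    fv = (1.0 - v) / (1.0 + 4.0 * v) if v >= 0 else 1.4  # shortening vs lengthening
    return act * f_max * fl * fv
```

    In a gait simulation, such per-muscle forces are summed through moment arms to produce the joint moments that are compared against inverse dynamics, as in the validation described above.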

  19. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict the future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and of the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and the synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
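
    The event consumption/creation cycle described above maps naturally onto a priority queue of timestamped events. The sketch below is a generic event-driven engine in that spirit; the class name, event names, and interface are invented for illustration and are not EDSE's actual API.

```python
import heapq

class EventSim:
    """Minimal event-driven engine: events are timestamped; consuming an
    event may schedule (create) further events."""
    def __init__(self):
        self.queue, self.now, self.log, self._n = [], 0.0, [], 0

    def schedule(self, delay, name, action=None):
        self._n += 1  # tie-breaker keeps heap comparisons well-defined
        heapq.heappush(self.queue, (self.now + delay, self._n, name, action))

    def run(self, until=float("inf")):
        while self.queue and self.queue[0][0] <= until:
            self.now, _, name, action = heapq.heappop(self.queue)
            self.log.append((self.now, name))  # event consumption
            if action:
                action(self)  # event creation: handlers may schedule new events

sim = EventSim()
sim.schedule(2.0, "sensor-read",
             lambda s: s.schedule(1.0, "anomaly-check"))
sim.schedule(5.0, "heartbeat")
sim.run()
```

    In a model-based setting, the handlers would be derived from the causal model, and observed monitoring data would be injected as externally scheduled events, which is where the synchronization and filtering properties come in.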

  20. Pre-Exposure Prophylaxis (PrEP) as an Additional Tool for HIV Prevention Among Men Who Have Sex With Men in Belgium: The Be-PrEP-ared Study Protocol

    PubMed Central

    Nöstlinger, Christiana; Wouters, Kristien; Fransen, Katrien; Crucitti, Tania; Kenyon, Chris; Buyze, Jozefien; Schurmans, Céline; Laga, Marie; Vuylsteke, Bea

    2017-01-01

    Background Pre-exposure prophylaxis (PrEP) is a promising and effective tool to prevent HIV. With the approval of Truvada as daily PrEP by the European Commission in August 2016, individual European Member states prepare themselves for PrEP implementation following the examples of France and Norway. However, context-specific data to guide optimal implementation is currently lacking. Objective With this demonstration project we evaluate whether daily and event-driven PrEP, provided within a comprehensive prevention package, is a feasible and acceptable additional prevention tool for men who have sex with men (MSM) at high risk of acquiring HIV in Belgium. The study’s primary objective is to document the uptake, acceptability, and adherence to both daily and event-driven PrEP, while several secondary objectives have been formulated including impact of PrEP use on sexual behavior. Methods The Be-PrEP-ared study is a phase 3, single-site, open-label prospective cohort study with a large social science component embedded in the trial. A total of 200 participants choose between daily or event-driven PrEP use and may switch, discontinue, or restart their regimen at the 3-monthly visits for a duration of 18 months. Data are collected on several platforms: an electronic case report form, a Web-based tool where participants register their sexual behavior and pill use, a more detailed electronic self-administered questionnaire completed during study visits on a tablet computer, and in-depth interviews among a selected sample of participants. To answer the primary objective, the recruitment rate, (un)safe sex behavior during the last 6 months, percentage of reported intention to use PrEP in the future, retention rates in different regimens, and attitudes towards PrEP use will be analyzed. Adherence will be monitored using self-reported adherence, pill count, tenofovir drug levels in blood samples, and the perceived skills to adhere. 
Results All participants are currently enrolled, and the last study visit is planned to take place around Q3 2018. Conclusions As PrEP is not yet available for use in Belgium, this study will provide insights into how to optimally implement PrEP within the current health care provision and will shape national and European guidelines with regard to the place of PrEP in HIV prevention strategies. ClinicalTrial EU Clinical Trial 2015-000054-37; https://www.clinicaltrialsregister.eu/ctr-search/trial/2015-000054-37/BE (Archived by WebCite at http://www.webcitation.org/6nacjSdmM). PMID:28135199

  1. A new practice-driven approach to develop software in a cyber-physical system environment

    NASA Astrophysics Data System (ADS)

    Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei

    2016-02-01

    Cyber-physical systems (CPS) are an emerging area that cannot work efficiently without proper software handling of data and business logic. Software and middleware are the soul of a CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) is often used to develop CPS software, but it needs some improvements to suit the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice and has evolved within software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way, and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach has different emphases and measures at every stage, making it better suited to the characteristics of event-driven CPS. In CPS software development, one should focus on events more than on functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. The approach is also easy to apply in practice owing to some simplifications, and the running result illustrates its validity.

  2. Genetic Programming for Automatic Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Chadalawada, Jayashree; Babovic, Vladan

    2017-04-01

    One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volumes of data and high levels of uncertainty. Existing hydrological models vary in conceptualization and process representation, and each is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used to integrate alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework informed by prior understanding and data include: choice of the technique for inducing knowledge from data; identification of alternative structural hypotheses; definition of rules and constraints for meaningful, intelligent combination of model component hypotheses; and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture the dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow-duration-curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach for conceptual hydrological modeling: 1. Motivation and theoretical development, Water Resources Research, 47(11).

  3. Combining Knowledge and Data Driven Insights for Identifying Risk Factors using Electronic Health Records

    PubMed Central

    Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.

    2012-01-01

    Background: The ability to identify the risk factors related to an adverse condition, e.g., a heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data-driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge- and data-driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the Area Under the ROC Curve (AUC) as the performance metric. The combined knowledge and data driven risk factors significantly outperform knowledge-based risk factors alone. Furthermore, the additional risk factors are confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge and data driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
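    The core idea of a sparse regression with separate regularization terms for known and candidate risk factors can be illustrated on synthetic data. This is a hedged sketch of the general technique, not the authors' actual model: a logistic regression where known features receive a ridge penalty while data-driven candidates receive an L1 penalty applied via a proximal soft-threshold step. All variable names, penalty weights, and the synthetic cohort are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, k_known, k_cand = 500, 3, 10
    X = rng.normal(size=(n, k_known + k_cand))

    # synthetic ground truth: 3 known risk factors plus 1 truly predictive candidate
    w_true = np.zeros(k_known + k_cand)
    w_true[:k_known] = [1.5, -1.0, 0.8]
    w_true[k_known] = 1.2
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ w_true))))

    def fit(X, y, known_idx, lam_l2=0.01, lam_l1=0.1, lr=0.1, iters=2000):
        """Logistic regression with a ridge penalty on known factors and an
        L1 penalty (proximal soft-threshold) on data-driven candidates."""
        w = np.zeros(X.shape[1])
        cand = np.ones(X.shape[1], dtype=bool)
        cand[known_idx] = False
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-(X @ w)))
            grad = X.T @ (p - y) / len(y)
            grad[~cand] += 2 * lam_l2 * w[~cand]   # ridge gradient on known factors
            w -= lr * grad
            # soft-threshold candidate coefficients toward exact zero
            w[cand] = np.sign(w[cand]) * np.maximum(np.abs(w[cand]) - lr * lam_l1, 0.0)
        return w

    w = fit(X, y, known_idx=np.arange(k_known))
    ```

    The L1 term drives irrelevant candidate coefficients to exactly zero, so the surviving nonzero candidates play the role of the "additional potential risk factors derived from data."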

  4. Summary of Research 2002

    DTIC Science & Technology

    2005-01-01

    dissipation, nonuniformity, and nonlinearity are included. A possible future objective is to theoretically investigate nonradiating sources in two and...dissipation, nonuniformity, and nonlinearity. The presence of any of these effects causes radiation to “leak” from the driven region. This radiation was...The utility of LWIR spectral imagery for plume detection was studied. PRESENTATION: Olsen, R.C., Ganer, J. and Van Dyke, E., “Terrain

  5. Hierarchical, parallel computing strategies using component object model for process modelling responses of forest plantations to interacting multiple stresses

    Treesearch

    J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech

    2000-01-01

    Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...

  6. The three-dimensional Event-Driven Graphics Environment (3D-EDGE)

    NASA Technical Reports Server (NTRS)

    Freedman, Jeffrey; Hahn, Roger; Schwartz, David M.

    1993-01-01

    Stanford Telecom developed the Three-Dimensional Event-Driven Graphics Environment (3D-EDGE) for NASA Goddard Space Flight Center's (GSFC) Communications Link Analysis and Simulation System (CLASS). 3D-EDGE consists of a library of object-oriented subroutines which allow engineers with little or no computer graphics experience to programmatically manipulate, render, animate, and access complex three-dimensional objects.

  7. Hybrid-coded 3D structured illumination imaging with Bayesian estimation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Chen, Hsi-Hsun; Luo, Yuan; Singh, Vijay R.

    2016-03-01

    Light-induced fluorescence microscopy has long been used to observe and understand objects at the microscale, such as cellular samples. However, the transfer function of a lens-based imaging system limits the resolution, so that the fine and detailed structure of a sample cannot be identified clearly. Resolution-enhancement techniques therefore seek to break the resolution limit of a given objective. In the past decades, resolution-enhanced imaging has been investigated through a variety of strategies, including photoactivated localization microscopy (PALM), stochastic optical reconstruction microscopy (STORM), stimulated emission depletion (STED), and structured illumination microscopy (SIM). Of these methods, only SIM can intrinsically improve the resolution limit of a system without taking the structural properties of the object into account. In this paper, we develop a SIM method combined with Bayesian estimation and with optical sectioning capability rendered by HiLo processing, yielding high resolution throughout a 3D volume. This 3D SIM provides both optical sectioning and resolution enhancement, and is robust to noise owing to the proposed data-driven Bayesian estimation reconstruction. To validate the 3D SIM, we show simulation results for the algorithm and experimental results demonstrating the 3D resolution enhancement.

  8. Lateral-directional aerodynamic characteristics of light, twin-engine, propeller driven airplanes

    NASA Technical Reports Server (NTRS)

    Wolowicz, C. H.; Yancey, R. B.

    1972-01-01

    Analytical procedures and design data for predicting the lateral-directional static and dynamic stability and control characteristics of light, twin engine, propeller driven airplanes for propeller-off and power-on conditions are reported. Although the consideration of power effects is limited to twin engine airplanes, the propeller-off considerations are applicable to single engine airplanes as well. The procedures are applied to a twin engine, propeller driven, semi-low-wing airplane in the clean configuration through the linear lift range. The calculated derivative characteristics are compared with wind tunnel and flight data. Included in the calculated characteristics are the spiral mode, roll mode, and Dutch roll mode over the speed range of the airplane.

  9. Two Facets of Stress And Indirect Effects on Child Diet via Emotion-Driven Eating

    PubMed Central

    Tate, Eleanor B.; Spruijt-Metz, Donna; Pickering, Trevor A.; Pentz, Mary Ann

    2015-01-01

    Objective Stress has been associated with high-calorie, low-nutrient food intake (HCLN) and emotion-driven eating (EDE). However, effects on healthy food intake remain unknown. This study examined two facets of stress (self-efficacy, perceived helplessness) and food consumption, mediated by EDE. Methods Cross-sectional data from fourth-graders (n = 978; 52% female, 28% Hispanic) in an obesity intervention used self-report to assess self-efficacy, helplessness, EDE, fruit/vegetable (FV) intake, and high-calorie/low-nutrient (HCLN) food. Results Higher stress self-efficacy was associated with higher FV intake, β = .354, p < 0.001, and stress perceived helplessness had an indirect effect on HCLN intake through emotion-driven eating, indirect effect = .094, p < 0.001; χ2(347) = 659.930, p < 0.001, CFI = 0.940, TLI = 0.930, RMSEA = 0.030, p = 1.00, adjusting for gender, ethnicity, BMI z-score, and program group. Conclusions and Implications Stress self-efficacy may be more important for healthy food intake and perceived helplessness may indicate emotion-driven eating and unhealthy snack food intake. Obesity prevention programs may consider teaching stress management techniques to avoid emotion-driven eating. PMID:26004248

  10. The New NASA Orbital Debris Engineering Model ORDEM 3.0

    NASA Technical Reports Server (NTRS)

    Krisko, P. H.

    2014-01-01

    The NASA Orbital Debris Program Office (ODPO) has released its latest Orbital Debris Engineering Model, ORDEM 3.0. It supersedes ORDEM 2000, now referred to as ORDEM 2.0. This newer model encompasses the Earth satellite and debris flux environment from altitudes of low Earth orbit (LEO) through geosynchronous orbit (GEO). Debris sizes from 10 µm through larger than 1 m in non-GEO and from 10 cm through larger than 1 m in GEO are available. The inclusive years are 2010 through 2035. The ORDEM model series has always been data driven. ORDEM 3.0 has the benefit of many more hours of data, from existing sources and from new sources, than past ORDEM versions. The object data range in size from 10 µm to larger than 1 m, and include in situ and remote measurements. The in situ data reveal material characteristics of small particles. Mass densities in ORDEM 3.0 are grouped as 'high density', represented by 7.9 g/cc, 'medium density', represented by 2.8 g/cc, and 'low density', represented by 1.4 g/cc. Supporting models have also advanced significantly. The LEO-to-GEO ENvironment Debris model (LEGEND) includes a historical and a future projection component with yearly populations that include launched and maneuvered intact spacecraft and rocket bodies, mission-related debris, and explosion and collision event fragments. LEGEND propagates objects with ephemerides and physical characteristics down to 1 mm in size. The full LEGEND yearly population acts as an a priori condition for a Bayesian statistical model. Specific populations are added from sodium-potassium droplet releases, recent major accidental and deliberate collisions, and known anomalous debris events. This paper elaborates on the upgrades of this model over previous versions. Sample validation results with remote and in situ measurements are shown, and the consequences of including material density are discussed as they relate to heightened risks to crewed and robotic spacecraft.

  11. A VO-Driven Astronomical Data Grid in China

    NASA Astrophysics Data System (ADS)

    Cui, C.; He, B.; Yang, Y.; Zhao, Y.

    2010-12-01

    With the implementation of many ambitious observation projects, including LAMOST, FAST, and the Antarctic observatory at Dome A, observational astronomy in China is stepping into a brand new era with an emerging data avalanche. In the era of e-Science, both these cutting-edge projects and traditional astronomy research need much more powerful data management, sharing and interoperability. Based on the data-grid concept and taking advantage of the IVOA interoperability technologies, China-VO is developing a VO-driven astronomical data grid environment to enable multi-wavelength science and large database science. In this paper, the latest progress and data flow of LAMOST, the architecture of the data grid, and its support for the VO are discussed.

  12. An Evaluation of Relative Damage to the Powertrain System in Tracked Vehicles

    PubMed Central

    Lee, Sang-Ho; Lee, Jeong-Hwan; Goo, Sang-Hwa; Cho, Yong-Cheol; Cho, Ho-Young

    2009-01-01

    The objective of this study was to improve the reliability of the endurance test for the powertrain system of military tracked vehicles. The measurement system, which measures the driving duty applied to the powertrain system during road operation, consists of eight analog channels and two pulse channels, including the propeller shaft output torques for the left and right sides. The data obtained from this measurement system can be used to introduce a new technology that produces the output torque of a torque converter, to analyze revolution counting for endurance and road mobility in the front unit, and to present a relative fatigue damage analysis technique and its results for the driven roads through a cumulative fatigue method. PMID:22573990

  13. Gauging Skills of Hospital Security Personnel: a Statistically-driven, Questionnaire-based Approach

    PubMed Central

    Rinkoo, Arvind Vashishta; Mishra, Shubhra; Rahesuddin; Nabi, Tauqeer; Chandra, Vidha; Chandra, Hem

    2013-01-01

    Objectives This study aims to gauge the technical and soft skills of hospital security personnel so as to enable prioritization of their training needs. Methodology A cross-sectional, questionnaire-based study was conducted in December 2011. Two separate predesigned and pretested questionnaires were used for gauging soft skills and technical skills of the security personnel. Extensive statistical analysis, including multivariate analysis (Pillai-Bartlett trace along with multi-factorial ANOVA) and post-hoc tests (Bonferroni test), was applied. Results The 143 participants performed better on the soft skills front, with an average score of 6.43 and a standard deviation of 1.40. The average technical skills score was 5.09, with a standard deviation of 1.44. The study revealed a need for formal hands-on training, with greater emphasis on technical skills. Multivariate analysis of the available data further helped in identifying 20 security personnel who should be prioritized for soft skills training and a group of 36 security personnel who should receive maximum attention during technical skills training. Conclusion This statistically driven approach can be used as a prototype by healthcare delivery institutions worldwide, after situation-specific customizations, to identify the training needs of any category of healthcare staff. PMID:23559904

  14. Climate-change-driven deterioration of water quality in a mineralized watershed.

    PubMed

    Todd, Andrew S; Manning, Andrew H; Verplanck, Philip L; Crouch, Caitlin; McKnight, Diane M; Dunham, Ryan

    2012-09-04

    A unique 30-year streamwater chemistry data set from a mineralized alpine watershed with naturally acidic, metal-rich water displays dissolved concentrations of Zn and other metals of ecological concern increasing by 100-400% (400-2000 μg/L) during low-flow months, when metal concentrations are highest. SO4 and other major ions show similar increases. A lack of natural or anthropogenic land disturbances in the watershed during the study period suggests that climate change is the underlying cause. Local mean annual and mean summer air temperatures have increased at a rate of 0.2-1.2 °C/decade since the 1980s. Other climatic and hydrologic indices, including stream discharge during low-flow months, do not display statistically significant trends. Consideration of potential specific causal mechanisms driven by rising temperatures suggests that melting of permafrost and falling water tables (from decreased recharge) are probable explanations for the increasing concentrations. The prospect of future widespread increases in dissolved solutes from mineralized watersheds is concerning given likely negative impacts on downstream ecosystems and water resources, and complications created for the establishment of attainable remediation objectives at mine sites.

  15. Climate-change-driven deterioration of water quality in a mineralized watershed

    USGS Publications Warehouse

    Todd, Andrew; Manning, Andrew H.; Verplanck, Philip L.; Crouch, Caitlin; McKnight, Diane M.; Dunham, Ryan

    2012-01-01

    A unique 30-year streamwater chemistry data set from a mineralized alpine watershed with naturally acidic, metal-rich water displays dissolved concentrations of Zn and other metals of ecological concern increasing by 100–400% (400–2000 μg/L) during low-flow months, when metal concentrations are highest. SO4 and other major ions show similar increases. A lack of natural or anthropogenic land disturbances in the watershed during the study period suggests that climate change is the underlying cause. Local mean annual and mean summer air temperatures have increased at a rate of 0.2–1.2 °C/decade since the 1980s. Other climatic and hydrologic indices, including stream discharge during low-flow months, do not display statistically significant trends. Consideration of potential specific causal mechanisms driven by rising temperatures suggests that melting of permafrost and falling water tables (from decreased recharge) are probable explanations for the increasing concentrations. The prospect of future widespread increases in dissolved solutes from mineralized watersheds is concerning given likely negative impacts on downstream ecosystems and water resources, and complications created for the establishment of attainable remediation objectives at mine sites.

  16. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  17. Electron Driven Processes in Atmospheric Behaviour

    NASA Astrophysics Data System (ADS)

    Campbell, L.; Brunger, M. J.; Teubner, P. J. O.

    2006-11-01

    Electron impact plays an important role in many atmospheric processes. Calculation of these is important for basic understanding, atmospheric modeling and remote sensing. Accurate atomic and molecular data, including electron impact cross sections, are required for such calculations. Five electron-driven processes are considered: auroral and dayglow emissions, the reduction of atmospheric electron density by vibrationally excited N2, NO production and infrared emission from NO. In most cases the predictions are compared with measurements. The dependence on experimental atomic and molecular data is also investigated.

  18. A Data-Driven Approach to Interactive Visualization of Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Jun

    Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools that assist grid operators in performing mission-critical tasks and enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving business practices in today’s electric power industry. The conducted investigation, however, has revealed that the existing commercial power grid visualization tools rely heavily on human designers, hindering users’ ability to discover. Additionally, for a large grid, it is very labor-intensive and costly to build and maintain the pre-designed visual displays. This project proposes a data-driven approach to overcome these common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on-the-fly. It does not require a visualization design stage, completely eliminating or significantly reducing the cost of building and maintaining visual displays. The research and development (R&D) conducted in this project is mainly divided into two phases. The first phase (Phases I & II) focused on developing data-driven techniques for visualization of the power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams and fuzzy-model-based rich data visualization for situational awareness. The R&D conducted during the second phase (Phase IIB) focused on enhancing the prototyped data-driven visualization tool based on the gathered requirements and use cases. The goal is to evolve the prototyped tool developed during the first phase into a commercial-grade product. We will use one of the identified application areas as an example to demonstrate how research results achieved in this project are successfully utilized to address an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven promising for building the next-generation power grid visualization tools. Application of this approach has resulted in a state-of-the-art commercial tool currently being leveraged by more than 60 utility organizations in North America and Europe.

  19. Zero curvature-surface driven small objects

    NASA Astrophysics Data System (ADS)

    Dou, Xiaoxiao; Li, Shanpeng; Liu, Jianlin

    2017-08-01

    In this study, we investigate the spontaneous migration of small objects driven by surface tension on a catenoid, formed by a layer of soap film constrained by two rings. Although the mean curvature of the catenoid is zero at every point, the small objects always migrate to a position near one of the rings. Force and energy analyses have been performed to uncover the mechanism, and it is found that the small objects distort the local shape of the liquid film, thus making the whole system energetically favorable. These findings provide inspiration for the design of microfluidics, aquatic robotics, and miniature boats.
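    The zero mean curvature of the catenoid noted above can be checked directly from its profile, a surface of revolution r(z) = c cosh(z/c) between coaxial rings. The sign convention for the principal curvatures below is one common choice; the two contributions cancel pointwise.

    ```latex
    % principal curvatures of a surface of revolution with profile r(z)
    \kappa_1 = -\frac{r''}{\bigl(1 + r'^2\bigr)^{3/2}}, \qquad
    \kappa_2 = \frac{1}{r\,\bigl(1 + r'^2\bigr)^{1/2}}
    % with r = c\cosh(z/c): r' = \sinh(z/c), \; r'' = \tfrac{1}{c}\cosh(z/c),
    % and 1 + r'^2 = \cosh^2(z/c), so
    \kappa_1 = -\frac{1}{c\cosh^{2}(z/c)}, \qquad
    \kappa_2 = \frac{1}{c\cosh^{2}(z/c)}, \qquad
    H = \tfrac{1}{2}\,(\kappa_1 + \kappa_2) = 0
    ```

    This is why a soap film (which minimizes area at zero pressure difference) takes the catenoid shape, even though objects placed on it still perturb the film locally.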

  20. Use of the dispersion ratio in estimating the nonlinear properties of an object of diagnosis

    NASA Technical Reports Server (NTRS)

    Balitskiy, F. Y.; Genkin, M. D.; Ivanova, M. A.; Kobrinskiy, A. A.; Sokolova, A. G.

    1973-01-01

    An experimental investigation for estimating the nonlinearity of a diagnostic object was carried out on a single-stage, spur gear reducer. The linearity of the properties of spur gearing (including the linearity of its mode of operation) was tested. Torsional vibrations of the driven wheel and transverse (to the meshing plane) vibrations of the drive wheel on its support were taken as the two outputs of the object to be analyzed. The results of the investigation showed that the degree of nonlinearity of a reducing gear is essentially connected with its operating mode, so that different mathematical models of it can correspond to different values of the system parameters.

  1. Paving the COWpath: data-driven design of pediatric order sets

    PubMed Central

    Zhang, Yiye; Padman, Rema; Levin, James E

    2014-01-01

    Objective Evidence indicates that users incur significant physical and cognitive costs in the use of order sets, a core feature of computerized provider order entry systems. This paper develops data-driven approaches for automating the construction of order sets that match closely with user preferences and workflow while minimizing physical and cognitive workload. Materials and methods We developed and tested optimization-based models embedded with clustering techniques using physical and cognitive click cost criteria. By judiciously learning from users’ actual actions, our methods identify items for constituting order sets that are relevant according to historical ordering data and grouped on the basis of order similarity and ordering time. We evaluated performance of the methods using 47 099 orders from the year 2011 for asthma, appendectomy and pneumonia management in a pediatric inpatient setting. Results In comparison with existing order sets, those developed using the new approach significantly reduce the physical and cognitive workload associated with usage by 14–52%. This approach is also capable of accommodating variations in clinical conditions that affect order set usage and development. Discussion There is a critical need to investigate the cognitive complexity imposed on users by complex clinical information systems, and to design their features according to ‘human factors’ best practices. Optimizing order set generation using cognitive cost criteria introduces a new approach that can potentially improve ordering efficiency, reduce unintended variations in order placement, and enhance patient safety. Conclusions We demonstrate that data-driven methods offer a promising approach for designing order sets that are generalizable, data-driven, condition-based, and up to date with current best practices. PMID:24674844
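    A data-driven grouping of orders like the one described can be sketched by clustering items on their co-occurrence across historical encounters. This is a simplified illustration, not the authors' optimization model: the ordering history is synthetic, and a greedy Jaccard-similarity threshold stands in for their clustering-embedded optimization with click-cost criteria.

    ```python
    import numpy as np

    # toy ordering history: rows = encounters, columns = orderable items (1 = placed);
    # items 0-2 tend to co-occur, items 3-4 tend to co-occur (hypothetical data)
    rng = np.random.default_rng(3)
    n = 300
    grp_a = rng.binomial(1, 0.9, size=(n, 1)) * np.ones((1, 3), int)
    grp_b = rng.binomial(1, 0.5, size=(n, 1)) * np.ones((1, 2), int)
    orders = np.hstack([grp_a, grp_b]) & rng.binomial(1, 0.95, size=(n, 5))

    def jaccard(M):
        """Pairwise Jaccard similarity between item columns of a 0/1 matrix."""
        inter = M.T @ M
        counts = M.sum(axis=0)
        union = counts[:, None] + counts[None, :] - inter
        return inter / np.maximum(union, 1)

    S = jaccard(orders)

    # greedy grouping: items join the same candidate order set when their
    # co-occurrence similarity with the seed item exceeds a threshold
    threshold = 0.6
    groups, unassigned = [], set(range(orders.shape[1]))
    while unassigned:
        seed = min(unassigned)
        group = {j for j in unassigned if S[seed, j] >= threshold} | {seed}
        groups.append(sorted(group))
        unassigned -= group
    ```

    The resulting groups are candidate order sets built purely from usage data; a production method would additionally weigh the physical and cognitive click costs of including or excluding each item.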

  2. Input variable selection for data-driven models of Coriolis flowmeters for two-phase flow measurement

    NASA Astrophysics Data System (ADS)

    Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao

    2017-03-01

Input variable selection is an essential step in the development of data-driven models for environmental, biological and industrial applications. By eliminating irrelevant or redundant variables, input variable selection identifies a suitable subset of variables as the input of a model; it also simplifies the model structure and improves computational efficiency. This paper describes the procedures of input variable selection for data-driven models for the measurement of liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, including partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN) and tree-based iterative input selection (IIS), are applied in this study. Typical data-driven models incorporating support vector machines (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM based data-driven models and sensitivity analysis. The validation and analysis results suggest that the input variables selected from the PMI algorithm provide more effective information for the models to measure liquid mass flowrate, while the IIS algorithm provides fewer but more effective variables for the models to predict gas volume fraction.
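    A minimal sketch of mutual-information-based input ranking, the idea underlying PMI-style selection. This uses a plain histogram estimator rather than the partial mutual information algorithm itself, and the variable names, sample size, and bin count are assumptions:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based estimate of I(X;Y) in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = p_xy > 0
    return float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=5000)                    # target quantity
relevant = x + 0.1 * rng.normal(size=5000)   # strongly informative candidate
irrelevant = rng.normal(size=5000)           # independent noise candidate
print(mutual_information(relevant, x) > mutual_information(irrelevant, x))  # True
```

    Candidates are ranked by estimated information content; PMI additionally conditions on already-selected inputs to avoid picking redundant variables.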

  3. The Cannon: A data-driven approach to Stellar Label Determination

    NASA Astrophysics Data System (ADS)

    Ness, M.; Hogg, David W.; Rix, H.-W.; Ho, Anna Y. Q.; Zasowski, G.

    2015-07-01

New spectroscopic surveys offer the promise of stellar parameters and abundances (“stellar labels”) for hundreds of thousands of stars; this poses a formidable spectral modeling challenge. In many cases, there is a subset of reference objects for which the stellar labels are known with high(er) fidelity. We take advantage of this with The Cannon, a new data-driven approach for determining stellar labels from spectroscopic data. The Cannon learns from the “known” labels of reference stars how the continuum-normalized spectra depend on these labels by fitting a flexible model at each wavelength; then, The Cannon uses this model to derive labels for the remaining survey stars. We illustrate The Cannon by training the model on only 542 stars in 19 clusters as reference objects, with Teff, log g, and [Fe/H] as the labels, and then applying it to the spectra of 55,000 stars from APOGEE DR10. The Cannon is very accurate. Its stellar labels compare well to the stars for which APOGEE pipeline (ASPCAP) labels are provided in DR10, with rms differences that are basically identical to the stated ASPCAP uncertainties. Beyond the reference labels, The Cannon makes no use of stellar models nor any line-list, but needs a set of reference objects that span label-space. The Cannon performs well at lower signal-to-noise, as it delivers comparably good labels even at one-ninth the APOGEE observing time. We discuss the limitations of The Cannon and its future potential, particularly, to bring different spectroscopic surveys onto a consistent scale of stellar labels.
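    The train-then-label scheme can be sketched with a linear-in-labels flux model fit by least squares at each wavelength pixel. This is a simplified stand-in (the published Cannon uses a quadratic model), and the synthetic spectra, pixel count, and label values are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n_ref, n_pix = 200, 50
labels = rng.normal(size=(n_ref, 2))               # e.g. scaled Teff, [Fe/H]
design = np.hstack([np.ones((n_ref, 1)), labels])  # [1, l1, l2] per star
true_theta = rng.normal(size=(3, n_pix))
flux = design @ true_theta + 0.01 * rng.normal(size=(n_ref, n_pix))

# Training step: fit the flux model coefficients at every wavelength pixel.
theta, *_ = np.linalg.lstsq(design, flux, rcond=None)

# Test step: hold the model fixed and solve for an unseen star's labels.
new_labels = np.array([0.3, -1.2])
new_flux = np.concatenate([[1.0], new_labels]) @ true_theta
sol, *_ = np.linalg.lstsq(theta[1:].T, new_flux - theta[0], rcond=None)
print(np.round(sol, 2))
```

    The recovered labels match the inputs because both steps are least-squares problems in the same generative model; the real survey setting adds per-pixel noise weighting and a richer label polynomial.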

  4. A software toolbox for robotics

    NASA Technical Reports Server (NTRS)

    Sanwal, J. C.

    1985-01-01

A method for programming cooperating manipulators, guided by a geometric description of the task to be performed, is given. This requires a suitable language and a method for describing the workplace and the objects in it in geometric terms. A task-level command language and its implementation for concurrently driven multiple robot arms are described. The language is suitable for driving a cell in which manipulators, end effectors, and sensors are controlled by their own dedicated processors. These processors can communicate with each other through a communication network. A mechanism for keeping track of the history of the commands already executed allows the command language for the manipulators to be event driven. A frame-based world modeling system is utilized to describe the objects in the work environment and any relationships that hold between these objects. This system provides a versatile tool for managing information about the world model. Default actions normally needed are invoked when the database is updated or accessed. Most of the first-level error recovery is also invoked by the database by utilizing the concept of demons. The package can be utilized to generate task-level commands in a problem solver or a planner.

  5. Combined effects of inversion and feature removal on N170 responses elicited by faces and car fronts

    PubMed Central

    Kloth, Nadine; Itier, Roxane J.; Schweinberger, Stefan R.

    2014-01-01

    The face-sensitive N170 is typically enhanced for inverted compared to upright faces. Itier, Alain, Sedore, and McIntosh (2007) recently suggested that this N170 inversion effect is mainly driven by the eye region which becomes salient when the face configuration is disrupted. Here we tested whether similar effects could be observed with non-face objects that are structurally similar to faces in terms of possessing a homogeneous within-class first-order feature configuration. We presented upright and inverted pictures of intact car fronts, car fronts without lights, and isolated lights, in addition to analogous face conditions. Upright cars elicited substantial N170 responses of similar amplitude to those evoked by upright faces. In strong contrast to face conditions however, the car-elicited N170 was mainly driven by the global shape rather than the presence or absence of lights, and was dramatically reduced for isolated lights. Overall, our data confirm a differential influence of the eye region in upright and inverted faces. Results for car fronts do not suggest similar interactive encoding of eye-like features and configuration for non-face objects, even when these objects possess a similar feature configuration as faces. PMID:23485023

  6. Classical Statistics and Statistical Learning in Imaging Neuroscience

    PubMed Central

    Bzdok, Danilo

    2017-01-01

Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-tests and ANOVA. In recent years, statistical learning methods have enjoyed increasing popularity, especially for applications to rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It retraces how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques. PMID:29056896
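    The contrast the paper draws can be made concrete on one simulated data set: a classical two-sample t statistic answers "is there a group difference?", while a cross-validated classifier answers "how well can group membership be predicted out of sample?". The effect size, sample sizes, and nearest-centroid classifier below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated "activation" values for two groups with a genuine mean shift.
a = rng.normal(0.0, 1.0, size=100)
b = rng.normal(0.8, 1.0, size=100)

# Classical inference: pooled two-sample t statistic for the group difference.
sp = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
t = (b.mean() - a.mean()) / (sp * np.sqrt(2 / 100))

# Statistical learning: 5-fold cross-validated out-of-sample accuracy
# of a nearest-centroid classifier on the same data.
x = np.concatenate([a, b])
y = np.repeat([0, 1], 100)
idx = rng.permutation(200)
accs = []
for fold in np.array_split(idx, 5):
    train = np.setdiff1d(idx, fold)
    c0 = x[train][y[train] == 0].mean()
    c1 = x[train][y[train] == 1].mean()
    pred = (np.abs(x[fold] - c1) < np.abs(x[fold] - c0)).astype(int)
    accs.append(float((pred == y[fold]).mean()))
acc = float(np.mean(accs))
print(round(float(t), 2), round(acc, 2))
```

    A large t with a modest accuracy is typical: significance of a mean difference and single-subject predictability are different outcome metrics, which is exactly the paper's point.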

  7. Data driven propulsion system weight prediction model

    NASA Astrophysics Data System (ADS)

    Gerth, Richard J.

    1994-10-01

The objective of the research was to develop a method to predict the weight of paper engines, i.e., engines that are in the early stages of development. The impetus for the project was the Single Stage To Orbit (SSTO) project, where engineers need to evaluate alternative engine designs. Since the SSTO is a performance-driven project, the performance models for alternative designs were well understood. The next tradeoff is weight. Since it is known that engine weight varies with thrust levels, a model is required that would allow discrimination between engines that produce the same thrust. Above all, the model had to be rooted in data, with assumptions that could be justified based on the data. The general approach was to collect data on as many existing engines as possible and build a statistical model of engine weight as a function of various component performance parameters. This was considered a reasonable level at which to begin the project because the data would be readily available, and it would be at the level of most paper engines, prior to detailed component design.
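    The approach can be sketched as an ordinary least-squares regression of weight on performance parameters. The engine database, the choice of thrust and chamber pressure as predictors, and the coefficients are all hypothetical; the point is that two paper engines with the same thrust receive different weight estimates once a second parameter enters the model:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical engine database: thrust (kN), chamber pressure (bar), weight (kg).
thrust = rng.uniform(100, 2000, size=40)
pressure = rng.uniform(50, 250, size=40)
weight = 0.9 * thrust + 2.0 * pressure + rng.normal(0, 30, size=40)

# Ordinary least-squares fit of weight against the performance parameters.
X = np.column_stack([np.ones_like(thrust), thrust, pressure])
coef, *_ = np.linalg.lstsq(X, weight, rcond=None)

# Two paper engines with identical thrust now receive distinct estimates.
paper = np.array([[1.0, 800.0, 80.0], [1.0, 800.0, 200.0]])
preds = paper @ coef
print(np.round(preds, 1))
```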

  8. An evaluation of the effectiveness of a risk-based monitoring approach implemented with clinical trials involving implantable cardiac medical devices.

    PubMed

    Diani, Christopher A; Rock, Angie; Moll, Phil

    2017-12-01

Background Risk-based monitoring is a concept endorsed by the Food and Drug Administration to improve clinical trial data quality by focusing monitoring efforts on critical data elements and higher risk investigator sites. BIOTRONIK approached this by implementing a comprehensive strategy that assesses risk and data quality through a combination of operational controls and data surveillance. This publication demonstrates the effectiveness of a data-driven risk assessment methodology when used in conjunction with a tailored monitoring plan. Methods We developed a data-driven risk assessment system to rank 133 investigator sites comprising 3442 subjects and identify those sites that pose a potential risk to the integrity of data collected in implantable cardiac device clinical trials. This included identification of specific risk factors and a weighted scoring mechanism. We conducted trend analyses for risk assessment data collected over 1 year to assess the overall impact of our data surveillance process combined with other operational monitoring efforts. Results Trending analyses of key risk factors revealed an improvement in the quality of data collected during the observation period. The three risk factors (follow-up compliance rate, unavailability of critical data, and noncompliance rate) correspond closely with the Food and Drug Administration's risk-based monitoring guidance document. Among these three risk factors, 100% (12/12) of the quantiles analyzed showed an increase in data quality. Of these, 67% (8/12) of the improving trends in the worst performing quantiles had p-values less than 0.05, and 17% (2/12) had p-values between 0.05 and 0.06. Among the poorest performing site quantiles, there was a statistically significant decrease in subject follow-up noncompliance rates, protocol noncompliance rates, and incidence of missing critical data.
Conclusion One year after implementation of a comprehensive strategy for risk-based monitoring, including a data-driven risk assessment methodology to target on-site monitoring visits, statistically significant improvement was seen in a majority of measurable risk factors at the worst performing site quantiles. For the three risk factors most critical to the overall compliance of cardiac rhythm management medical device studies (follow-up compliance rate, unavailability of critical data, and noncompliance rate), we measured significant improvement in data quality. Although improvement at the worst performing site quantiles was not significant for some risk factors, such as subject attrition, the data-driven risk assessment highlighted key areas on which to continue focusing both on-site and centralized monitoring efforts. Data-driven surveillance of clinical trial performance provides actionable observations that can improve site performance. Clinical trials that implement risk-based monitoring by leveraging a data-driven quality assessment combined with specific operational procedures may achieve improvements in data quality and resource efficiency.
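    A weighted scoring mechanism of the kind described can be sketched as a normalized weighted sum over risk factors. The factor names, weights, and site values below are illustrative assumptions, not taken from the publication:

```python
def risk_scores(sites, weights):
    """Weighted risk score per site; higher means higher monitoring priority.
    Each factor is min-max normalized across sites before weighting."""
    factors = list(weights)
    lo = {f: min(s[f] for s in sites.values()) for f in factors}
    hi = {f: max(s[f] for s in sites.values()) for f in factors}
    def norm(f, v):
        return 0.0 if hi[f] == lo[f] else (v - lo[f]) / (hi[f] - lo[f])
    return {name: sum(weights[f] * norm(f, s[f]) for f in factors)
            for name, s in sites.items()}

# Hypothetical risk factors and weights, for illustration only.
weights = {"missed_followup_rate": 0.4, "missing_critical_data": 0.4,
           "protocol_deviations": 0.2}
sites = {
    "site_A": {"missed_followup_rate": 0.02, "missing_critical_data": 1,
               "protocol_deviations": 0},
    "site_B": {"missed_followup_rate": 0.15, "missing_critical_data": 9,
               "protocol_deviations": 4},
}
scores = risk_scores(sites, weights)
print(max(scores, key=scores.get))  # site_B ranks as highest risk
```

    Ranking sites by such a score is what lets on-site monitoring visits be targeted at the worst performing quantiles.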

  9. A standard-driven approach for electronic submission to pharmaceutical regulatory authorities.

    PubMed

    Lin, Ching-Heng; Chou, Hsin-I; Yang, Ueng-Cheng

    2018-03-01

Using standards is not only useful for data interchange during the process of a clinical trial, but also useful for analyzing data in a review process. Any step which speeds up approval of new drugs may benefit patients. As a result, adopting standards for regulatory submission has become mandatory in some countries. However, preparing standard-compliant documents, such as an annotated case report form (aCRF), needs a great deal of knowledge and experience. The process is complex and labor-intensive. Therefore, there is a need to use information technology to facilitate this process. Instead of standardizing data after the completion of a clinical trial, this study proposed a standard-driven approach. This approach was achieved by implementing a computer-assisted "standard-driven pipeline" (SDP) in an existing clinical data management system. The SDP used CDISC standards to drive all processes of a clinical trial, such as the design, data acquisition, tabulation, etc. A completed phase I/II trial was used to prove the concept and to evaluate the effects of this approach. By using the CDISC-compliant question library, aCRFs were generated automatically when the eCRFs were completed. For comparison purposes, the data collection process was simulated and the collected data were transformed by the SDP. This new approach reduced the number of missing data fields from sixty-two to eight, and the number of controlled-term mismatch fields was reduced from eight to zero during data tabulation. This standard-driven approach accelerated CRF annotation and assured data tabulation integrity. The benefits of this approach include an improvement in the use of standards during the clinical trial and a reduction in missing and unexpected data during tabulation. The standard-driven approach is an advanced design idea that can be used for future clinical information system development. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Evaluation of Respondent-Driven Sampling

    PubMed Central

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. 
Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling method, and caution is required when interpreting findings based on the sampling method. PMID:22157309
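    The statistical-inference step being evaluated can be illustrated with the standard RDS-II (Volz-Heckathorn) estimator, which reweights each recruit by the inverse of their reported network degree to correct for well-connected people being over-recruited. This is a textbook sketch of one common RDS estimator, not necessarily the exact inference method used in the study; the toy sample is invented:

```python
def rds_ii_estimate(sample):
    """RDS-II (Volz-Heckathorn) proportion estimate: weight each recruit
    by the inverse of their reported network degree."""
    inv = [1.0 / r["degree"] for r in sample]
    positive = sum(w for w, r in zip(inv, sample) if r["positive"])
    return positive / sum(inv)

# Toy sample in which well-connected (high-degree) members are over-recruited
# and also more likely to have the trait of interest.
sample = [
    {"degree": 10, "positive": True},
    {"degree": 10, "positive": True},
    {"degree": 2, "positive": False},
    {"degree": 2, "positive": False},
]
naive = sum(r["positive"] for r in sample) / len(sample)
print(naive, round(rds_ii_estimate(sample), 3))
```

    The weighted estimate falls well below the naive sample proportion; the study's finding is that, in practice, such reweighting did not reliably reduce bias when it occurred.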

  11. A magneto-sensitive skin for robots in space

    NASA Technical Reports Server (NTRS)

    Chauhan, D. S.; Dehoff, P. H.

    1991-01-01

The development of a robot arm proximity sensing skin that can sense intruding objects is described. The purpose of the sensor would be to prevent the robot from colliding with objects in space, including human beings. Eventually a tri-mode system is envisioned, including proximity, tactile, and thermal sensing. To date, the primary emphasis has been on the proximity sensor, which evolved from one based on magneto-inductive principles to the current design, which is based on a capacitive-reflector system. The capacitive sensing element, backed by a reflector driven at the same voltage and in phase with the sensor, is used to reflect field lines away from the grounded robot toward the intruding object. This results in an increased sensing range of up to 12 in. with the reflector on, compared with only 1 in. with it off. It is believed that this design advances the state of the art in capacitive sensor performance.

  12. Policy enabled information sharing system

    DOEpatents

    Jorgensen, Craig R.; Nelson, Brian D.; Ratheal, Steve W.

    2014-09-02

    A technique for dynamically sharing information includes executing a sharing policy indicating when to share a data object responsive to the occurrence of an event. The data object is created by formatting a data file to be shared with a receiving entity. The data object includes a file data portion and a sharing metadata portion. The data object is encrypted and then automatically transmitted to the receiving entity upon occurrence of the event. The sharing metadata portion includes metadata characterizing the data file and referenced in connection with the sharing policy to determine when to automatically transmit the data object to the receiving entity.
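    The data-object structure and policy check described in the abstract can be sketched as follows. Field names, the tag-based policy, and the hex stand-in for the encrypted body are illustrative assumptions, not taken from the patent:

```python
import hashlib

def make_data_object(payload: bytes, tags):
    """Bundle a file-data portion with a sharing-metadata portion.
    Field names are illustrative, not taken from the patent."""
    return {
        "sharing_metadata": {
            "tags": sorted(tags),
            "sha256": hashlib.sha256(payload).hexdigest(),
        },
        "file_data": payload.hex(),  # stand-in for the encrypted body
    }

def should_share(obj, event, policy):
    """Consult the sharing policy: transmit when the tag the policy
    requires for this event appears in the object's metadata."""
    required = policy.get(event)
    return required is not None and required in obj["sharing_metadata"]["tags"]

obj = make_data_object(b"sensor readings", {"radiological", "unclassified"})
policy = {"alarm_triggered": "radiological"}
print(should_share(obj, "alarm_triggered", policy))  # True
```

    The key design point mirrored here is that the decision to transmit is driven by the metadata portion plus the policy, not by inspecting the (encrypted) file data itself.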

  13. The influence of data-driven versus conceptually-driven processing on the development of PTSD-like symptoms.

    PubMed

    Kindt, Merel; van den Hout, Marcel; Arntz, Arnoud; Drost, Jolijn

    2008-12-01

    Ehlers and Clark [(2000). A cognitive model of posttraumatic stress disorder. Behaviour Research and Therapy, 38, 319-345] propose that a predominance of data-driven processing during the trauma predicts subsequent PTSD. We wondered whether, apart from data-driven encoding, sustained data-driven processing after the trauma is also crucial for the development of PTSD. Both hypotheses were tested in two analogue experiments. Experiment 1 demonstrated that relative to conceptually-driven processing (n=20), data-driven processing after the film (n=14), resulted in more intrusions. Experiment 2 demonstrated that relative to the neutral condition (n=24) and the data-driven encoding condition (n=24), conceptual encoding (n=25) reduced suppression of intrusions and a trend emerged for memory fragmentation. The difference between the two encoding styles was due to the beneficial effect of induced conceptual encoding and not to the detrimental effect of data-driven encoding. The data support the viability of the distinction between data-driven/conceptually-driven processing for the understanding of the development of PTSD.

  14. Machine learning and data science in soft materials engineering

    NASA Astrophysics Data System (ADS)

    Ferguson, Andrew L.

    2018-01-01

    In many branches of materials science it is now routine to generate data sets of such large size and dimensionality that conventional methods of analysis fail. Paradigms and tools from data science and machine learning can provide scalable approaches to identify and extract trends and patterns within voluminous data sets, perform guided traversals of high-dimensional phase spaces, and furnish data-driven strategies for inverse materials design. This topical review provides an accessible introduction to machine learning tools in the context of soft and biological materials by ‘de-jargonizing’ data science terminology, presenting a taxonomy of machine learning techniques, and surveying the mathematical underpinnings and software implementations of popular tools, including principal component analysis, independent component analysis, diffusion maps, support vector machines, and relative entropy. We present illustrative examples of machine learning applications in soft matter, including inverse design of self-assembling materials, nonlinear learning of protein folding landscapes, high-throughput antimicrobial peptide design, and data-driven materials design engines. We close with an outlook on the challenges and opportunities for the field.
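    Of the tools surveyed, principal component analysis is the simplest to sketch: high-dimensional "configurations" that really live on a low-dimensional manifold are summarized by the leading eigenvectors of the data covariance. The synthetic two-latent-coordinate data below are an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic "configurations": 2 latent coordinates embedded in 10 dimensions.
latent = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 10))
X = latent @ mixing + 0.05 * rng.normal(size=(500, 10))

# PCA by eigendecomposition of the covariance of the centered data.
Xc = X - X.mean(axis=0)
evals, evecs = np.linalg.eigh(Xc.T @ Xc / (len(Xc) - 1))
explained = evals[::-1] / evals.sum()        # eigh returns ascending order
print(round(float(explained[:2].sum()), 3))  # top 2 PCs carry nearly all variance
```

    When two components explain nearly all of the variance, the ten observed coordinates can be replaced by two collective variables, which is the dimensionality-reduction move the review describes.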

  15. Machine learning and data science in soft materials engineering.

    PubMed

    Ferguson, Andrew L

    2018-01-31

    In many branches of materials science it is now routine to generate data sets of such large size and dimensionality that conventional methods of analysis fail. Paradigms and tools from data science and machine learning can provide scalable approaches to identify and extract trends and patterns within voluminous data sets, perform guided traversals of high-dimensional phase spaces, and furnish data-driven strategies for inverse materials design. This topical review provides an accessible introduction to machine learning tools in the context of soft and biological materials by 'de-jargonizing' data science terminology, presenting a taxonomy of machine learning techniques, and surveying the mathematical underpinnings and software implementations of popular tools, including principal component analysis, independent component analysis, diffusion maps, support vector machines, and relative entropy. We present illustrative examples of machine learning applications in soft matter, including inverse design of self-assembling materials, nonlinear learning of protein folding landscapes, high-throughput antimicrobial peptide design, and data-driven materials design engines. We close with an outlook on the challenges and opportunities for the field.

  16. Writing Development in Secondary/Post Secondary Language Learning: Integrating Multiple Motivating Factors, Explanatory Feedback, and Explanatory Writing Tools to Increase Competence and Confidence in Writing

    ERIC Educational Resources Information Center

    Jefferson, Trevina

    2013-01-01

    Background: This study discusses data-driven results of newly-developed writing tools that are objective, easy, and less time-consuming than standard classroom writing strategies; additionally, multiple motivation triggers and peer evaluation are evaluated together with these new, modernized writing tools. The results are explained separately and…

  17. Mars as a Destination in a Capability-Driven Framework

    NASA Technical Reports Server (NTRS)

    Hoffman, S. J.; Drake, B. G.; Baker, J. D.; Voels, S. A.

    2011-01-01

This paper describes NASA's current plans for the exploration of Mars by human crews within NASA's Capability-Driven Framework (CDF). The CDF describes an approach for progressively extending human explorers farther into the Solar System for longer periods of time as allowed by developments in technology and spacecraft systems. Within this framework, Mars defines the most challenging objective currently envisioned for human spaceflight. The paper first describes the CDF and potential destinations being considered within this framework. For destinations relevant to the exploration of Mars, this includes both the Martian surface and the two moons of Mars. This is followed by a brief review of our evolving understanding of Mars to provide the context for the specific objectives set for human exploration crews. This includes results from robotic missions and goals set for future Martian exploration by NASA's community-based forum, the Mars Exploration Program Analysis Group (MEPAG) and the MEPAG-sponsored Human Exploration of Mars - Science Analysis Group (HEM-SAG). The paper then reviews options available for human crews to reach Mars and return to Earth. This includes a discussion of the rationale used to select from among these options for envisioned Mars exploration missions. The paper then concludes with a description of technological and operational challenges that still face NASA in order to be able to achieve the exploration goals for Mars within the CDF.

  18. Model-Based Analysis for Qualitative Data: An Application in Drosophila Germline Stem Cell Regulation

    PubMed Central

    Pargett, Michael; Rundell, Ann E.; Buzzard, Gregery T.; Umulis, David M.

    2014-01-01

    Discovery in developmental biology is often driven by intuition that relies on the integration of multiple types of data such as fluorescent images, phenotypes, and the outcomes of biochemical assays. Mathematical modeling helps elucidate the biological mechanisms at play as the networks become increasingly large and complex. However, the available data is frequently under-utilized due to incompatibility with quantitative model tuning techniques. This is the case for stem cell regulation mechanisms explored in the Drosophila germarium through fluorescent immunohistochemistry. To enable better integration of biological data with modeling in this and similar situations, we have developed a general parameter estimation process to quantitatively optimize models with qualitative data. The process employs a modified version of the Optimal Scaling method from social and behavioral sciences, and multi-objective optimization to evaluate the trade-off between fitting different datasets (e.g. wild type vs. mutant). Using only published imaging data in the germarium, we first evaluated support for a published intracellular regulatory network by considering alternative connections of the same regulatory players. Simply screening networks against wild type data identified hundreds of feasible alternatives. Of these, five parsimonious variants were found and compared by multi-objective analysis including mutant data and dynamic constraints. With these data, the current model is supported over the alternatives, but support for a biochemically observed feedback element is weak (i.e. these data do not measure the feedback effect well). When also comparing new hypothetical models, the available data do not discriminate. To begin addressing the limitations in data, we performed a model-based experiment design and provide recommendations for experiments to refine model parameters and discriminate increasingly complex hypotheses. PMID:24626201

  19. Skin cancer screening: recommendations for data-driven screening guidelines and a review of the US Preventive Services Task Force controversy

    PubMed Central

    Johnson, Mariah M; Leachman, Sancy A; Aspinwall, Lisa G; Cranmer, Lee D; Curiel-Lewandrowski, Clara; Sondak, Vernon K; Stemwedel, Clara E; Swetter, Susan M; Vetto, John; Bowles, Tawnya; Dellavalle, Robert P; Geskin, Larisa J; Grossman, Douglas; Grossmann, Kenneth F; Hawkes, Jason E; Jeter, Joanne M; Kim, Caroline C; Kirkwood, John M; Mangold, Aaron R; Meyskens, Frank; Ming, Michael E; Nelson, Kelly C; Piepkorn, Michael; Pollack, Brian P; Robinson, June K; Sober, Arthur J; Trotter, Shannon; Venna, Suraj S; Agarwala, Sanjiv; Alani, Rhoda; Averbook, Bruce; Bar, Anna; Becevic, Mirna; Box, Neil; E Carson, William; Cassidy, Pamela B; Chen, Suephy C; Chu, Emily Y; Ellis, Darrel L; Ferris, Laura K; Fisher, David E; Kendra, Kari; Lawson, David H; Leming, Philip D; Margolin, Kim A; Markovic, Svetomir; Martini, Mary C; Miller, Debbie; Sahni, Debjani; Sharfman, William H; Stein, Jennifer; Stratigos, Alexander J; Tarhini, Ahmad; Taylor, Matthew H; Wisco, Oliver J; Wong, Michael K

    2017-01-01

    Melanoma is usually apparent on the skin and readily detected by trained medical providers using a routine total body skin examination, yet this malignancy is responsible for the majority of skin cancer-related deaths. Currently, there is no national consensus on skin cancer screening in the USA, but dermatologists and primary care providers are routinely confronted with making the decision about when to recommend total body skin examinations and at what interval. The objectives of this paper are: to propose rational, risk-based, data-driven guidelines commensurate with the US Preventive Services Task Force screening guidelines for other disorders; to compare our proposed guidelines to recommendations made by other national and international organizations; and to review the US Preventive Services Task Force's 2016 Draft Recommendation Statement on skin cancer screening. PMID:28758010

  20. Big data analytics in hyperspectral imaging for detection of microbial colonies on agar plates (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yoon, Seung-Chul; Park, Bosoon; Lawrence, Kurt C.

    2017-05-01

Various types of optical imaging techniques measuring light reflectivity and scattering can detect microbial colonies of foodborne pathogens on agar plates. Until recently, these techniques were developed to provide solutions for hypothesis-driven studies, which focused on developing tools and batch/offline machine learning methods with well-defined sets of data. These have relatively high accuracy and rapid response time because the tools and methods are often optimized for the collected data. However, they often need to be retrained or recalibrated when new untrained data and/or features are added. A big-data-driven technique is more suitable for online learning of new/ambiguous samples and for mining unknown or hidden features. Although big data research in hyperspectral imaging is emerging in remote sensing, and many tools and methods have been developed in other applications such as bioinformatics, the tools and methods still need to be evaluated and adjusted in applications where conventional batch machine learning algorithms were dominant. The primary objective of this study is to evaluate appropriate big data analytic tools and methods for online learning and mining of foodborne pathogens on agar plates. After the tools and methods are successfully identified, they will be applied to rapidly search the big color and hyperspectral image data of microbial colonies collected in-house over the past 5 years and find the most probable colony or a group of colonies in the collected big data. The metadata, such as collection time, and any unstructured data (e.g. comments) will also be analyzed and presented with output results. The expected results will be a novel, big-data-driven technology to correctly detect and recognize microbial colonies of various foodborne pathogens on agar plates.
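    The batch-versus-online distinction drawn above can be sketched with a streaming logistic-regression classifier: each incoming mini-batch updates the weights, so new samples never require full retraining. The 4-feature "spectra", batch size, and learning rate are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
w = np.zeros(4)

def partial_fit(X, y, lr=0.5):
    """One online gradient update of a streaming logistic-regression model."""
    global w
    p = 1 / (1 + np.exp(-np.clip(X @ w, -30, 30)))
    w -= lr * X.T @ (p - y) / len(y)

# Stream of small batches: the model is updated incrementally as
# new (simulated) colony samples arrive.
true_w = np.array([2.0, -1.0, 0.5, 0.0])
for _ in range(300):
    X = rng.normal(size=(32, 4))
    y = (X @ true_w + 0.1 * rng.normal(size=32) > 0).astype(float)
    partial_fit(X, y)

X_test = rng.normal(size=(1000, 4))
y_test = X_test @ true_w > 0
acc = float(((X_test @ w > 0) == y_test).mean())
print(round(acc, 2))
```

    A batch method would have to refit on the whole archive when new untrained samples appear; the online update amortizes that cost across the stream, which is the property the study sets out to evaluate.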

  1. Nursing staff connect libraries with improving patient care but not with achieving organisational objectives: a grounded theory approach.

    PubMed

    Chamberlain, David; Brook, Richard

    2014-03-01

    Health organisations are often driven by specific targets defined by mission statements, aims and objectives to improve patient care. Health libraries need to demonstrate that they contribute to organisational objectives, but it is not clear how nurses view that contribution. To investigate ward nursing staff motivations, their awareness of ward and organisational objectives; and their attitudes towards the contribution of health library services to improving patient care. Qualitative research using focus group data was combined with content analysis of literature evidence and library statistics (quantitative data). Data were analysed using thematic coding, divided into five group themes: understanding of Trust, Ward and Personal objectives, use of Library, use of other information sources, quality and Issues. Four basic social-psychological processes were then developed. Behaviour indicates low awareness of organisational objectives despite patient-centric motivation. High awareness of library services is shown with some connection made by ward staff between improved knowledge and improved patient care. There was a two-tiered understanding of ward objectives and library services, based on level of seniority. However, evidence-based culture needs to be intrinsic in the organisation before all staff benefit. Libraries can actively engage in this at ward and board level and improve patient care by supporting organisational objectives. © 2014 The author. Health Information and Libraries Journal © 2014 Health Libraries Group.

  2. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
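A random walk Metropolis sampler of the kind the abstract describes can be sketched as follows. The Gaussian "model" and all numbers are placeholders standing in for a phenology model's predicted days to heading; this is not the study's code.

```python
import math
import random

def log_posterior(theta, data, sigma=1.0):
    # Flat prior plus Gaussian likelihood; theta stands in for a
    # single phenology-model parameter.
    return -sum((d - theta) ** 2 for d in data) / (2 * sigma ** 2)

def metropolis(data, n_iter=5000, step=0.5, seed=1):
    """Random walk Metropolis: propose theta' = theta + N(0, step),
    accept with probability min(1, posterior ratio)."""
    random.seed(seed)
    theta, lp = 0.0, log_posterior(0.0, data)
    chain = []
    for _ in range(n_iter):
        proposal = theta + random.gauss(0.0, step)
        lp_new = log_posterior(proposal, data)
        if math.log(random.random()) < lp_new - lp:
            theta, lp = proposal, lp_new
        chain.append(theta)
    return chain

# Synthetic "observations" scattered around a true value of 60 days.
random.seed(42)
data = [random.gauss(60.0, 1.0) for _ in range(50)]
chain = metropolis(data)
posterior = chain[1000:]  # discard burn-in
posterior_mean = sum(posterior) / len(posterior)
```

Retaining the full post-burn-in chain, rather than only a point estimate, is what gives access to the full posterior distribution the authors emphasize.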

  3. Exploring the components of physician volunteer engagement: a qualitative investigation of a national Canadian simulation-based training programme

    PubMed Central

    Sarti, Aimee J; Sutherland, Stephanie; Landriault, Angele; DesRosier, Kirk; Brien, Susan; Cardinal, Pierre

    2017-01-01

    Objectives Conceptual clarity on physician volunteer engagement is lacking in the medical literature. The aim of this study was to present a conceptual framework to describe the elements which influence physician volunteer engagement and to explore volunteer engagement within a national educational programme. Setting The context for this study was the Acute Critical Events Simulation (ACES) programme in Canada, which has successfully evolved into a national educational programme driven by physician volunteers. From 2010 to 2014, the programme recruited 73 volunteer healthcare professionals who contributed to the creation of educational materials and/or served as instructors. Method A conceptual framework was constructed based on an extensive literature review and expert consultation. Secondary qualitative analysis was undertaken on 15 semistructured interviews conducted from 2012 to 2013 with programme directors and healthcare professionals across Canada. An additional 15 interviews were conducted in 2015 with physician volunteers to achieve thematic saturation. Data were analysed iteratively and inductive coding techniques applied. Results From the physician volunteer data, 11 themes emerged. The most prominent themes included volunteer recruitment, retention, exchange, recognition, educator network and quasi-volunteerism. Captured within these interrelated themes were the framework elements, including the synergistic effects of emotional, cognitive and reciprocal engagement. Behavioural engagement was driven by these factors along with a cue to action, which led to contributions to the ACES programme. Conclusion This investigation provides a preliminary framework and supportive evidence towards understanding the complex construct of physician volunteer engagement. This research is particularly pressing today, as growing fiscal constraints challenge medical education to do more with less. PMID:28645956

  4. Indicators of ecosystem function identify alternate states in the sagebrush steppe.

    PubMed

    Kachergis, Emily; Rocca, Monique E; Fernandez-Gimenez, Maria E

    2011-10-01

    Models of ecosystem change that incorporate nonlinear dynamics and thresholds, such as state-and-transition models (STMs), are increasingly popular tools for land management decision-making. However, few models are based on systematic collection and documentation of ecological data, and of these, most rely solely on structural indicators (species composition) to identify states and transitions. As STMs are adopted as an assessment framework throughout the United States, finding effective and efficient ways to create data-driven models that integrate ecosystem function and structure is vital. This study aims to (1) evaluate the utility of functional indicators (indicators of rangeland health, IRH) as proxies for more difficult ecosystem function measurements and (2) create a data-driven STM for the sagebrush steppe of Colorado, USA, that incorporates both ecosystem structure and function. We sampled soils, plant communities, and IRH at 41 plots with similar clayey soils but different site histories to identify potential states and infer the effects of management practices and disturbances on transitions. We found that many IRH were correlated with quantitative measures of functional indicators, suggesting that the IRH can be used to approximate ecosystem function. In addition to a reference state that functions as expected for this soil type, we identified four biotically and functionally distinct potential states, consistent with the theoretical concept of alternate states. Three potential states were related to management practices (chemical and mechanical shrub treatments and seeding history) while one was related only to ecosystem processes (erosion). IRH and potential states were also related to environmental variation (slope, soil texture), suggesting that there are environmental factors within areas with similar soils that affect ecosystem dynamics and should be noted within STMs. 
Our approach generated an objective, data-driven model of ecosystem dynamics for rangeland management. Our findings suggest that the IRH approximate ecosystem processes and can distinguish between alternate states and communities and identify transitions when building data-driven STMs. Functional indicators are a simple, efficient way to create data-driven models that are consistent with alternate state theory. Managers can use them to improve current model-building methods and thus apply state-and-transition models more broadly for land management decision-making.

  5. Framework for a clinical information system.

    PubMed

    Van de Velde, R

    2000-01-01

    The current status of our work towards the design and implementation of a reference architecture for a Clinical Information System is presented. This architecture has been developed and implemented based on components, following a strong underlying conceptual and technological model. Common Object Request Broker and n-tier technology are used, featuring centralised and departmental clinical information systems as the back-end store for all clinical data. Servers located in the 'middle' tier apply the clinical (business) model and application rules to communicate with so-called 'thin-client' workstations. The main characteristics are the focus on modelling and the reuse of both data and business logic, reflecting a shift away from data and functional modelling towards object modelling. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for this approach.

  6. Functional Fixedness in Creative Thinking Tasks Depends on Stimulus Modality.

    PubMed

    Chrysikou, Evangelia G; Motyka, Katharine; Nigro, Cristina; Yang, Song-I; Thompson-Schill, Sharon L

    2016-11-01

    Pictorial examples during creative thinking tasks can lead participants to fixate on these examples and reproduce their elements even when yielding suboptimal creative products. Semantic memory research may illuminate the cognitive processes underlying this effect. Here, we examined whether pictures and words differentially influence access to semantic knowledge for object concepts depending on whether the task is close- or open-ended. Participants viewed either names or pictures of everyday objects, or a combination of the two, and generated common, secondary, or ad hoc uses for them. Stimulus modality effects were assessed quantitatively through reaction times and qualitatively through a novel coding system, which classifies creative output on a continuum from top-down-driven to bottom-up-driven responses. Both analyses revealed differences across tasks. Importantly, for ad hoc uses, participants exposed to pictures generated more top-down-driven responses than those exposed to object names. These findings have implications for accounts of functional fixedness in creative thinking, as well as theories of semantic memory for object concepts.

  7. Functional Fixedness in Creative Thinking Tasks Depends on Stimulus Modality

    PubMed Central

    Chrysikou, Evangelia G.; Motyka, Katharine; Nigro, Cristina; Yang, Song-I; Thompson-Schill, Sharon L.

    2015-01-01

    Pictorial examples during creative thinking tasks can lead participants to fixate on these examples and reproduce their elements even when yielding suboptimal creative products. Semantic memory research may illuminate the cognitive processes underlying this effect. Here, we examined whether pictures and words differentially influence access to semantic knowledge for object concepts depending on whether the task is close- or open-ended. Participants viewed either names or pictures of everyday objects, or a combination of the two, and generated common, secondary, or ad hoc uses for them. Stimulus modality effects were assessed quantitatively through reaction times and qualitatively through a novel coding system, which classifies creative output on a continuum from top-down-driven to bottom-up-driven responses. Both analyses revealed differences across tasks. Importantly, for ad hoc uses, participants exposed to pictures generated more top-down-driven responses than those exposed to object names. These findings have implications for accounts of functional fixedness in creative thinking, as well as theories of semantic memory for object concepts. PMID:28344724

  8. Life sciences domain analysis model

    PubMed Central

    Freimuth, Robert R; Freund, Elaine T; Schick, Lisa; Sharma, Mukesh K; Stafford, Grace A; Suzek, Baris E; Hernandez, Joyce; Hipp, Jason; Kelley, Jenny M; Rokicki, Konrad; Pan, Sue; Buckler, Andrew; Stokes, Todd H; Fernandez, Anna; Fore, Ian; Buetow, Kenneth H

    2012-01-01

    Objective Meaningful exchange of information is a fundamental challenge in collaborative biomedical research. To help address this, the authors developed the Life Sciences Domain Analysis Model (LS DAM), an information model that provides a framework for communication among domain experts and technical teams developing information systems to support biomedical research. The LS DAM is harmonized with the Biomedical Research Integrated Domain Group (BRIDG) model of protocol-driven clinical research. Together, these models can facilitate data exchange for translational research. Materials and methods The content of the LS DAM was driven by analysis of life sciences and translational research scenarios and the concepts in the model are derived from existing information models, reference models and data exchange formats. The model is represented in the Unified Modeling Language and uses ISO 21090 data types. Results The LS DAM v2.2.1 comprises 130 classes and covers several core areas including Experiment, Molecular Biology, Molecular Databases and Specimen. Nearly half of these classes originate from the BRIDG model, emphasizing the semantic harmonization between these models. Validation of the LS DAM against independently derived information models, research scenarios and reference databases supports its general applicability to represent life sciences research. Discussion The LS DAM provides unambiguous definitions for concepts required to describe life sciences research. The processes established to achieve consensus among domain experts will be applied in future iterations and may be broadly applicable to other standardization efforts. Conclusions The LS DAM provides common semantics for life sciences research. Through harmonization with BRIDG, it promotes interoperability in translational science. PMID:22744959

  9. Activity-based costing of health-care delivery, Haiti

    PubMed Central

    Jerome, Gregory; Leandre, Fernet; Browning, Micaela; Warsh, Jonathan; Shah, Mahek; Mistry, Bipin; Faure, Peterson Abnis I; Pierre, Claire; Fang, Anna P; Mugunga, Jean Claude; Gottlieb, Gary; Rhatigan, Joseph; Kaplan, Robert

    2018-01-01

    Abstract Objective To evaluate the implementation of a time-driven activity-based costing analysis at five community health facilities in Haiti. Methods Together with stakeholders, the project team decided that health-care providers should enter start and end times of the patient encounter in every fifth patient’s medical dossier. We trained one data collector per facility, who manually entered the time recordings and patient characteristics in a database and submitted the data to a cloud-based data warehouse each week. We calculated the capacity cost per minute for each resource used. An automated web-based platform multiplied reported time with capacity cost rate and provided the information to health-facilities administrators. Findings Between March 2014 and June 2015, the project tracked the clinical services for 7162 outpatients. The cost of care for specific conditions varied widely across the five facilities, due to heterogeneity in staffing and resources. For example, the average cost of a first antenatal-care visit ranged from 6.87 United States dollars (US$) at a low-level facility to US$ 25.06 at a high-level facility. Within facilities, we observed similar variation in costs, due to factors such as patient comorbidities, patient arrival time, stocking of supplies at facilities and type of visit. Conclusion Time-driven activity-based costing can be implemented in low-resource settings to guide resource allocation decisions. However, the extent to which this information will drive observable changes at patient, provider and institutional levels depends on several contextual factors, including budget constraints, management, policies and the political economy in which the health system is situated. PMID:29403096
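The core time-driven ABC computation, multiplying recorded encounter time by a capacity cost rate, can be sketched as follows. The dollar figures, staffing and minutes are hypothetical and only illustrate the arithmetic, not the study's data.

```python
def capacity_cost_rate(total_resource_cost, available_minutes):
    """Cost per minute of making a resource (e.g. a nurse) available."""
    return total_resource_cost / available_minutes

def visit_cost(time_log, rates):
    """Cost of one encounter: minutes used times each resource's rate, summed."""
    return sum(minutes * rates[resource] for resource, minutes in time_log.items())

# Hypothetical monthly resource costs and available minutes.
rates = {
    "nurse": capacity_cost_rate(1200.0, 8000),      # 0.15 $/min
    "physician": capacity_cost_rate(4800.0, 6000),  # 0.80 $/min
}
antenatal_visit = {"nurse": 20, "physician": 10}    # recorded minutes
cost = visit_cost(antenatal_visit, rates)           # 20*0.15 + 10*0.80 = 11.0
```

The heterogeneity the study reports falls directly out of this formula: facilities with costlier staff (higher rates) or longer encounters (more minutes) yield higher per-visit costs.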

  10. Pre-Big Bang Bubbles from the Gravitational Instability of Generic String Vacua

    NASA Astrophysics Data System (ADS)

    Buonanno, A.; Damour, T.; Veneziano, G.

    1998-06-01

    We formulate the basic postulate of pre-big bang cosmology as one of 'asymptotic past triviality', by which we mean that the initial state is a generic perturbative solution of the tree-level low-energy effective action. Each such singular space-like hypersurface of gravitational collapse becomes, in the string-frame metric, the usual big-bang t = 0 hypersurface, i.e. the place of birth of a baby Friedmann universe after a period of dilaton-driven inflation. Specializing to the spherically-symmetric case, we review and reinterpret previous work on the subject, and propose a simple, scale-invariant criterion for collapse/inflation in terms of asymptotic data at past null infinity. Those data should determine whether, when, and where collapse/inflation occurs, and, when it does, fix its characteristics, including anisotropies on the big bang hypersurface whose imprint could have survived till now. Using Bayesian probability concepts, we finally attempt to answer some fine-tuning objections recently raised against the pre-big bang scenario.

  11. Mechanisms Linking the Gut Microbiome and Glucose Metabolism

    PubMed Central

    Kratz, Mario; Damman, Chris J.; Hullarg, Meredith

    2016-01-01

    Context: Type 2 diabetes mellitus is associated with gastrointestinal dysbiosis involving both compositional and functional changes in the gut microbiome. Changes in diet and supplementation with probiotics and prebiotics (ie, fermentable fibers) can induce favorable changes in gut bacterial species and improve glucose homeostasis. Objective: This paper will review the data supporting several potential mechanisms whereby gut dysbiosis contributes to metabolic dysfunction, including microbiota-driven increases in systemic lipopolysaccharide concentrations, changes in bile acid metabolism, alterations in short-chain fatty acid production, alterations in gut hormone secretion, and changes in circulating branched-chain amino acids. Methods: Data for this review were identified by searching English-language references from PubMed and relevant articles. Conclusions: Understanding the mechanisms linking the gut microbiome to glucose metabolism, and the relevant compositional and functional characteristics of the gut microbiome, will help direct future research to develop more targeted approaches or novel compounds aimed at restoring a healthier gut microbiome as a new approach to prevent and treat type 2 diabetes mellitus and related metabolic conditions. PMID:26938201

  12. Network reconstruction of platelet metabolism identifies metabolic signature for aspirin resistance

    NASA Astrophysics Data System (ADS)

    Thomas, Alex; Rahmanian, Sorena; Bordbar, Aarash; Palsson, Bernhard Ø.; Jamshidi, Neema

    2014-01-01

    To date, there has been no systematic, objective assessment of the metabolic capabilities of the human platelet. A manually curated, functionally tested, and validated biochemical reaction network of platelet metabolism, iAT-PLT-636, was reconstructed using 33 proteomic datasets and 354 literature references. The network contains enzymes mapping to 403 diseases and 231 FDA-approved drugs, alluding to an expansive scope of biochemical transformations that may affect or be affected by disease processes in multiple organ systems. The effect of aspirin (ASA) resistance on platelet metabolism was evaluated using constraint-based modeling, which revealed a redirection of glycolytic, fatty acid, and nucleotide metabolism reaction fluxes in order to accommodate eicosanoid synthesis and reactive oxygen species stress. These results were confirmed with independent proteomic data. The construction and availability of iAT-PLT-636 should stimulate further data-driven, systems analysis of platelet metabolism towards the understanding of pathophysiological conditions including, but not strictly limited to, coagulopathies.

  13. Event-driven processing for hardware-efficient neural spike sorting

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Pereira, João L.; Constandinou, Timothy G.

    2018-02-01

    Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide an efficient new means for hardware implementation that is completely activity-dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or fewer to represent the signals, whilst maintaining signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be achieved with comparable or better accuracy than reference methods whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
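Level-crossing sampling, the representation this work builds on, can be sketched as an encoder that emits an event each time the signal moves one full level; the delta and the toy signal below are illustrative only.

```python
def level_crossing_encode(signal, delta):
    """Emit (sample_index, direction) events whenever the signal moves a
    full level (delta) away from the last encoded level."""
    events, last = [], signal[0]
    for i, v in enumerate(signal[1:], start=1):
        while v - last >= delta:
            last += delta
            events.append((i, +1))
        while last - v >= delta:
            last -= delta
            events.append((i, -1))
    return events

def level_crossing_decode(events, start, delta, length):
    """Reconstruct a staircase approximation from the event stream."""
    out, level, j = [start], start, 0
    for i in range(1, length):
        while j < len(events) and events[j][0] == i:
            level += events[j][1] * delta
            j += 1
        out.append(level)
    return out

sig = [0.0, 0.3, 1.1, 2.6, 2.4, 0.9]
events = level_crossing_encode(sig, delta=1.0)       # [(2, 1), (3, 1), (5, -1)]
rec = level_crossing_decode(events, 0.0, 1.0, len(sig))
```

Only three events are emitted for six samples, and activity scales with signal change rather than with time, which is the hardware saving the abstract points to.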

  14. Watershed Watch: The Importance of Mentors in Student-driven Full Inquiry Undergraduate Research Projects as the Foundation for an Introductory Course in Biogeoscience

    NASA Astrophysics Data System (ADS)

    Rock, B. N.; Hale, S. R.; Graham, K. J.; Hayden, L.; Barber, L.; Perry, C.; Schloss, J.; Sullivan, E.; Yuan, J.; Abebe, E.; Mitchell, L.; Abrams, E.; Gagnon, M.

    2008-12-01

    Watershed Watch (NSF 0525433) engages early undergraduate students from two-year and four-year colleges in student-driven full inquiry-based instruction in the biogeosciences. Program goals for Watershed Watch are to test if inquiry-rich student-driven projects sufficiently engage undeclared students (or noncommittal STEM majors) to declare a STEM major (or remain with their STEM major). A significant component of this program is an intensive two-week Summer course, in which undeclared freshmen research various aspects of a local watershed. Students develop their own research questions and study design, collect and analyze data, and produce a scientific or an oral poster presentation. The course objectives, curriculum and schedule are presented as a model for dissemination for other institutions and programs seeking to develop inquiry-rich courses designed to attract students into biogeoscience disciplines. Data from self-reported student feedback indicated the most important factors explaining high-levels of student motivation and research excellence in the course are 1) working with committed, energetic, and enthusiastic faculty mentors; and 2) faculty mentors demonstrating high degrees of teamwork and coordination.

  15. The `Henry Problem' of `density-driven' groundwater flow versus Tothian `groundwater flow systems' with variable density: A review of the influential Biscayne aquifer data.

    NASA Astrophysics Data System (ADS)

    Weyer, K. U.

    2017-12-01

    Coastal groundwater flow investigations at the Biscayne Bay, south of Miami, Florida, gave rise to the concept of density-driven flow of seawater into coastal aquifers creating a saltwater wedge. Within that wedge, convection-driven return flow of seawater and a dispersion zone were assumed by Cooper et al. (1964) to be the cause of the Biscayne aquifer `sea water wedge'. This conclusion was based on the chloride distribution within the aquifer and on an analytical model concept assuming convection flow within a confined aquifer, without taking non-chemical field data into consideration. This concept was later labelled the `Henry Problem', which any numerical variable-density flow program must be able to simulate to be considered acceptable. Both `density-driven flow' and Tothian `groundwater flow systems' (with or without variable density conditions) are driven by gravitation. The difference between the two is the boundary conditions: `density-driven flow' occurs under hydrostatic boundary conditions, while Tothian `groundwater flow systems' occur under hydrodynamic boundary conditions. Revisiting the Cooper et al. (1964) publication with its record of piezometric field data (heads) showed that the so-called sea water wedge was caused by discharging deep saline groundwater driven by gravitational flow, not by denser sea water. Density-driven flow of seawater into the aquifer was not reflected in the head measurements for low- and high-tide conditions, which had been taken contemporaneously with the chloride measurements; these head measurements had not been included in the flow interpretation. The very same head measurements indicated a clear dividing line between shallow local fresh groundwater flow and saline deep groundwater flow, without the existence of a dispersion zone or a convection cell.
The Biscayne situation emphasizes the need for any chemical interpretation of flow patterns to be supported by head data as energy indicators of flow fields. At the Biscayne site, density-driven flow of seawater did not and does not exist. Instead, this site and the Florida coastline in general are the end points of local fresh and regional saline groundwater flow systems driven by gravity forces, not by density differences.

  16. Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties

    NASA Astrophysics Data System (ADS)

    Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.

    2018-01-01

    Limited data exist on emissions from agriculture-driven deforestation, and the available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties, for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with input datasets (activity data and emissions factors) were used to combine the datasets, such that the most certain datasets contribute the most. This method utilizes all the input data while minimizing the uncertainty of the emissions estimate. The uncertainty of input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. The area of deforestation and the agriculture-driver factor (the extent to which agriculture drives deforestation) were the most uncertain components of the emissions estimates, so improvements in the uncertainties of these estimates will provide the greatest reductions in the uncertainties of emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%) and Africa the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, which peaked at 974 ± 148 Mt CO2 yr-1 in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO2 yr-1), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation are ± 62.4% (average over 1990-2015); uncertainties were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting, and gives credibility to related mitigation initiatives.
We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend that new data providers include this information.
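Combining estimates so that the most certain datasets contribute the most is classic inverse-variance weighting, sketched below with hypothetical numbers (not the paper's data or necessarily its exact scheme).

```python
def inverse_variance_combine(estimates):
    """estimates: list of (value, standard_error). Each value is weighted
    by 1/se^2, so the most certain estimate dominates the combination."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    se = (1.0 / total) ** 0.5  # combined SE is smaller than any input SE
    return value, se

# Two hypothetical deforestation-area estimates (Mha) for one country.
combined, combined_se = inverse_variance_combine([(10.0, 1.0), (14.0, 2.0)])
# combined = 10.8, combined_se ~ 0.89: pulled toward the more certain source
```

Note how the combined value sits much closer to the estimate with the smaller standard error, which is exactly the "most certain datasets contribute the most" behaviour described above.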

  17. WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets

    NASA Astrophysics Data System (ADS)

    Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.

    2014-08-01

    Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept in which metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analyzed, and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damage, assessment of volumetric changes in an open-pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses are possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within his or her property boundaries after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.

  18. Receptive fields selection for binary feature description.

    PubMed

    Fan, Bin; Kong, Qingqun; Trzcinski, Tomasz; Wang, Zhiheng; Pan, Chunhong; Fua, Pascal

    2014-06-01

    Feature description for local image patches is widely used in computer vision. While the conventional way to design local descriptors is based on expert experience and knowledge, learning-based methods for designing local descriptors have become more and more popular because of their good performance and data-driven nature. This paper proposes a novel data-driven method for designing binary feature descriptors, which we call the receptive fields descriptor (RFD). Technically, RFD is constructed by thresholding responses of a set of receptive fields, which are selected from a large number of candidates according to their distinctiveness and correlations in a greedy way. Using two different kinds of receptive fields (namely rectangular pooling areas and Gaussian pooling areas) for selection, we obtain two binary descriptors, RFDR and RFDG, respectively. Image matching experiments on the well-known patch data set and the Oxford data set demonstrate that RFD significantly outperforms state-of-the-art binary descriptors and is comparable with the best float-valued descriptors at a fraction of the processing time. Finally, experiments on object recognition tasks confirm that both RFDR and RFDG successfully bridge the performance gap between binary descriptors and their floating-point competitors.
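The construction described, thresholding pooled responses of selected receptive fields to obtain bits, can be sketched as follows. The patch, fields and thresholds are invented for illustration; the real RFD selects its fields by distinctiveness and correlation, which is not shown here.

```python
def binary_descriptor(patch, fields, thresholds):
    """One bit per rectangular receptive field: the bit is set when the
    mean intensity pooled over the field exceeds its threshold."""
    bits = []
    for (r0, r1, c0, c1), t in zip(fields, thresholds):
        vals = [patch[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        bits.append(1 if sum(vals) / len(vals) > t else 0)
    return bits

def hamming(a, b):
    # Binary descriptors are matched with a cheap Hamming distance.
    return sum(x != y for x, y in zip(a, b))

patch = [[10, 10, 200, 200]] * 4        # dark left half, bright right half
fields = [(0, 4, 0, 2), (0, 4, 2, 4)]   # two rectangular pooling areas
desc = binary_descriptor(patch, fields, thresholds=[100, 100])  # [0, 1]
```

The bit-per-field structure is what makes such descriptors fast: matching reduces to XOR-and-popcount, a fraction of the cost of comparing float-valued vectors.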

  19. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.
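A fuzzy state transition of the kind stored as an overall fuzzy relation in a single rule memory can be sketched with max-min composition; the two-state machine and its rule strengths below are invented for illustration.

```python
def max_min_compose(state, relation):
    """Next fuzzy state: the membership of state j is the max over current
    states i of min(state[i], relation[i][j])."""
    n = len(relation[0])
    return [max(min(state[i], relation[i][j]) for i in range(len(state)))
            for j in range(n)]

# Hypothetical 2-state machine; relation[i][j] = strength of rule i -> j.
R = [[0.9, 0.4],
     [0.2, 0.8]]
s = [1.0, 0.3]                   # current fuzzy state vector
s_next = max_min_compose(s, R)   # [0.9, 0.4]
```

Clocking such a composition once per cycle is what makes the machine synchronous: the state vector at time t+1 depends only on the state vector and the stored relation at time t.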

  20. Automated object-based classification of topography from SRTM data

    PubMed Central

    Drăguţ, Lucian; Eisank, Clemens

    2012-01-01

    We introduce an object-based method to automatically classify topography from SRTM data. The new method relies on the concept of decomposing land-surface complexity into more homogeneous domains. An elevation layer is automatically segmented and classified at three scale levels that represent domains of complexity, using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance and segmentation is performed at these appropriate scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and standard deviation of elevation, respectively. The results reasonably resemble the patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of the classes satisfy the regionalization requirements of maximizing internal homogeneity while minimizing external homogeneity. Most objects have boundaries matching natural discontinuities at the regional level. The method is simple and fully automated. The input data consist of only one layer, which does not need any pre-processing. Both segmentation and classification rely on only two parameters: elevation and standard deviation of elevation. The methodology is implemented as a customized process for the eCognition® software, available as an online download. The results are embedded in a web application with functionalities of visualization and download. PMID:22485060
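The partitioning rule described, splitting segments into sub-domains by mean elevation and its standard deviation, can be sketched as follows; the thresholds and class labels here are hypothetical placeholders, not the paper's nomenclature.

```python
def classify_segment(elevations, elev_threshold, std_threshold):
    """Assign a segment to one of four sub-domains using only the mean
    and standard deviation of its elevation values."""
    n = len(elevations)
    mean = sum(elevations) / n
    std = (sum((e - mean) ** 2 for e in elevations) / n) ** 0.5
    high, rough = mean > elev_threshold, std > std_threshold
    if high and rough:
        return "high rough terrain"
    if high:
        return "high smooth terrain"
    if rough:
        return "low rough terrain"
    return "low smooth terrain"

# Two toy segments (elevations in metres); thresholds chosen arbitrarily.
flat = classify_segment([100, 100, 100], 500, 50)       # low smooth terrain
steep = classify_segment([700, 900, 1100], 500, 50)     # high rough terrain
```

Using only these two statistics per object is what keeps the method simple and fully automated, as the abstract stresses: no additional layers or hand-tuned features are required.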

  2. The application of data mining and cloud computing techniques in data-driven models for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.

    2016-04-01

    Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damage of an instrumented structure without necessitating the mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance using Internet technology and resources. The main challenges in developing such a framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) at various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as to classify the data to trace anomalies in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
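    The feature-extraction step can be illustrated with a minimal sketch, under the assumption (not stated in the abstract) of simple statistical features and a control-chart anomaly rule; the feature names and the 3-sigma threshold are illustrative choices, not the paper's pipeline.

```python
# Illustrative sketch: extract simple statistical damage-sensitive features
# from a vibration signal and flag an anomaly when a feature drifts beyond
# k baseline standard deviations.

from statistics import mean, stdev

def features(signal):
    m = mean(signal)
    return {"mean": m,
            "std": stdev(signal),
            "peak": max(abs(x) for x in signal),
            "rms": (sum(x * x for x in signal) / len(signal)) ** 0.5}

def is_anomalous(value, baseline_values, k=3.0):
    """Simple control-chart rule on a single feature."""
    return abs(value - mean(baseline_values)) > k * stdev(baseline_values)

baseline_rms = [1.0, 1.1, 0.9, 1.05, 0.95]  # RMS from the undamaged structure
print(is_anomalous(2.4, baseline_rms))  # True
```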

  3. The Distribution and Abundance of Bird Species: Towards a Satellite, Data Driven Avian Energetics and Species Richness Model

    NASA Technical Reports Server (NTRS)

    Smith, James A.

    2003-01-01

    This paper addresses the fundamental question of why birds occur where and when they do, i.e., what are the causative factors that determine the spatio-temporal distributions, abundance, or richness of bird species? In this paper we outline the first steps toward building a satellite, data-driven model of avian energetics and species richness based on individual bird physiology, morphology, and interaction with the spatio-temporal habitat. To evaluate our model, we will use the North American Breeding Bird Survey and Christmas Bird Count data for species richness, wintering and breeding range. Long term and current satellite data series include AVHRR, Landsat, and MODIS.

  4. Methods, systems and devices for detecting threatening objects and for classifying magnetic data

    DOEpatents

    Kotter, Dale K [Shelley, ID; Roybal, Lyle G [Idaho Falls, ID; Rohrbaugh, David T [Idaho Falls, ID; Spencer, David F [Idaho Falls, ID

    2012-01-24

    A method for detecting threatening objects in a security screening system. The method includes a step of classifying unique features of magnetic data as representing a threatening object. Another step includes acquiring magnetic data. Another step includes determining if the acquired magnetic data comprises a unique feature.

  5. An Examination of the Effectiveness of Child Endangerment Laws in Preventing Child Fatalities in Alcohol-Involved Motor Vehicle Crashes

    PubMed Central

    Kelley-Baker, Tara; Romano, Eduardo

    2016-01-01

    Objective: The aim of this study was to assess the impact of U.S. child-endangerment laws on the prevalence of child passengers fatally injured in motor vehicle crashes in which the adult driver was drinking. Method: We used data from the 2002–2012 Fatality Analysis Reporting System. We conducted both bivariate and multivariate analyses using Heckman selection models. Results: After adjusting for several cofactors, including driver demographics and blood alcohol concentration, child seat positioning, and seat belt laws, we found that passing a DUI child-endangerment law may have no impact at all on the likelihood of finding impaired drivers among those driving with children. Conclusions: There are a number of reasons why DUI child-endangerment laws have not been effective in saving the lives of young passengers who are driven by adult drinking drivers. These reasons include lack of publicity and education, as well as issues related to enforcement. Potential solutions are suggested that include examining sanctions and strengthening of DUI child endangerment laws. PMID:27588542

  6. Evolving Frameworks for Different Communities of Scientists and End Users

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Keiser, K.

    2016-12-01

    Two evolving frameworks for interdisciplinary science will be described in the context of the Common Data Framework for Earth-Observation Data and the importance of standards and protocols. The Event Data Driven Delivery (ED3) Framework, funded by NASA Applied Sciences, provides the delivery of data based on predetermined subscriptions and associated workflows to various communities of end users. ED3's capabilities are used by scientists, as well as policy and resource managers, when event alerts are triggered to respond to their needs. The EarthCube Integration and Testing Environment (ECITE) Assessment Framework for Technology Interoperability and Integration is being developed to facilitate the EarthCube community's assessment of NSF-funded technologies addressing Earth science problems. ECITE is addressing the translation of geoscience researchers' use cases into technology use cases that apply EarthCube-funded building block technologies (and other existing technologies) for solving science problems. EarthCube criteria for technology assessment include the use of data, metadata, and service standards to improve interoperability and integration across program components. The long-range benefit will be the growth of a cyberinfrastructure with technology components that have been shown to work together to solve known science objectives.

  7. Condom Use among Immigrant Latino Sexual Minorities: Multilevel Analysis after Respondent-Driven Sampling

    PubMed Central

    Rhodes, Scott D.; McCoy, Thomas P.

    2014-01-01

    This study explored correlates of condom use within a respondent-driven sample of 190 Spanish-speaking immigrant Latino sexual minorities, including gay and bisexual men, other men who have sex with men (MSM), and transgender persons, in North Carolina. Five analytic approaches for modeling data collected using respondent-driven sampling (RDS) were compared. Across most approaches, knowledge of HIV and sexually transmitted infections (STIs) and increased condom use self-efficacy predicted consistent condom use, and increased homophobia predicted decreased consistent condom use. The same correlates were not significant in all analyses but were consistent in most. Clustering due to recruitment chains was low, while clustering due to recruiter was substantial. This highlights the importance of accounting for clustering when analyzing RDS data. PMID:25646728

  8. Does faint galaxy clustering contradict gravitational instability?

    NASA Technical Reports Server (NTRS)

    Melott, Adrian L.

    1992-01-01

    It has been argued, based on the weakness of clustering of faint galaxies, that these objects cannot be the precursors of present galaxies in a simple Einstein-de Sitter model universe with clustering driven by gravitational instability. It is shown that the assumptions made about the growth of clustering were too restrictive. In such a universe, the growth of clustering can easily be fast enough to match the data.

  9. DeDaL: Cytoscape 3 app for producing and morphing data-driven and structure-driven network layouts.

    PubMed

    Czerwinska, Urszula; Calzone, Laurence; Barillot, Emmanuel; Zinovyev, Andrei

    2015-08-14

    Visualization and analysis of molecular profiling data together with biological networks are able to provide new mechanistic insights into biological functions. Currently, it is possible to visualize high-throughput data on top of pre-defined network layouts, but they are not always adapted to a given data analysis task. A network layout based simultaneously on the network structure and the associated multidimensional data might be advantageous for data visualization and analysis in some cases. We developed a Cytoscape app, which allows constructing biological network layouts based on the data from molecular profiles imported as values of node attributes. DeDaL is a Cytoscape 3 app, which uses linear and non-linear algorithms of dimension reduction to produce data-driven network layouts based on multidimensional data (typically gene expression). DeDaL implements several data pre-processing and layout post-processing steps such as continuous morphing between two arbitrary network layouts and aligning one network layout with respect to another one by rotating and mirroring. The combination of all these functionalities facilitates the creation of insightful network layouts representing both structural network features and correlation patterns in multivariate data. We demonstrate the added value of applying DeDaL in several practical applications, including an example of a large protein-protein interaction network. DeDaL is a convenient tool for applying data dimensionality reduction methods and for designing insightful data displays based on data-driven layouts of biological networks, built within the Cytoscape environment. DeDaL is freely available for downloading at http://bioinfo-out.curie.fr/projects/dedal/.
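    Two of the ideas above, a data-driven layout from linear dimension reduction of node attributes and continuous morphing between layouts, can be sketched as follows. This is a hedged illustration, not DeDaL's actual code: PCA via SVD stands in for its linear dimension-reduction option, and morphing is plain linear interpolation.

```python
# Illustrative sketch: (1) a data-driven layout from PCA of node attribute
# values (e.g. expression profiles), and (2) continuous morphing as linear
# interpolation between a structure-driven and a data-driven layout.

import numpy as np

def pca_layout(attributes):
    """attributes: (n_nodes, n_features) matrix -> (n_nodes, 2) coordinates."""
    X = attributes - attributes.mean(axis=0)       # center the data
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:2].T                            # first two principal axes

def morph(layout_a, layout_b, t):
    """t=0 gives layout_a, t=1 gives layout_b."""
    return (1 - t) * layout_a + t * layout_b

expr = np.array([[1.0, 0.2, 0.1],
                 [0.9, 0.3, 0.0],
                 [0.1, 1.2, 0.9],
                 [0.0, 1.1, 1.0]])
coords = pca_layout(expr)
print(coords.shape)  # (4, 2)
```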

  10. Palaeo-sea-level and palaeo-ice-sheet databases: problems, strategies, and perspectives

    NASA Astrophysics Data System (ADS)

    Düsterhus, André; Rovere, Alessio; Carlson, Anders E.; Horton, Benjamin P.; Klemann, Volker; Tarasov, Lev; Barlow, Natasha L. M.; Bradwell, Tom; Clark, Jorie; Dutton, Andrea; Gehrels, W. Roland; Hibbert, Fiona D.; Hijma, Marc P.; Khan, Nicole; Kopp, Robert E.; Sivan, Dorit; Törnqvist, Torbjörn E.

    2016-04-01

    Sea-level and ice-sheet databases have driven numerous advances in understanding the Earth system. We describe the challenges and offer best strategies that can be adopted to build self-consistent and standardised databases of geological and geochemical information used to archive palaeo-sea-levels and palaeo-ice-sheets. There are three phases in the development of a database: (i) measurement, (ii) interpretation, and (iii) database creation. Measurement should include the objective description of the position and age of a sample, description of associated geological features, and quantification of uncertainties. Interpretation of the sample may have a subjective component, but it should always include uncertainties and alternative or contrasting interpretations, with any exclusion of existing interpretations requiring a full justification. During the creation of a database, an approach based on accessibility, transparency, trust, availability, continuity, completeness, and communication of content (ATTAC3) must be adopted. It is essential to consider the community that creates and benefits from a database. We conclude that funding agencies should not only consider the creation of original data in specific research-question-oriented projects, but also include the possibility of using part of the funding for IT-related and database creation tasks, which are essential to guarantee accessibility and maintenance of the collected data.

  11. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independent of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries and capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also allowed for the effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  12. Calibration of a distributed hydrologic model using observed spatial patterns from MODIS data

    NASA Astrophysics Data System (ADS)

    Demirel, Mehmet C.; González, Gorka M.; Mai, Juliane; Stisen, Simon

    2016-04-01

    Distributed hydrologic models are typically calibrated against streamflow observations at the outlet of the basin. Along with these observations from gauging stations, satellite-based estimates offer independent evaluation data such as remotely sensed actual evapotranspiration (aET) and land surface temperature. The primary objective of the study is to compare model calibrations against traditional downstream discharge measurements with calibrations against simulated spatial patterns and combinations of both types of observations. While the discharge-based model calibration typically improves the temporal dynamics of the model, it seems to give rise to minimal improvement of the simulated spatial patterns. In contrast, objective functions specifically targeting the spatial pattern performance could potentially increase the spatial model performance. However, most modeling studies, including the model formulations and parameterization, are not designed to actually change the simulated spatial pattern during calibration. This study investigates the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale hydrologic model (mHM). This model is selected as it allows for a change in the spatial distribution of key soil parameters through the optimization of pedo-transfer function parameters and includes options for using fully distributed daily Leaf Area Index (LAI) values directly as input. In addition, the simulated aET can be estimated at a spatial resolution suitable for comparison to the spatial patterns observed with MODIS data. To increase our control over spatial calibration, we introduced three additional parameters to the model. These new parameters are part of an empirical equation to calculate the crop coefficient (Kc) from daily LAI maps, which is used to update potential evapotranspiration (PET) as model input. This is done instead of correcting/updating PET with just a uniform (or aspect-driven) factor used in the mHM model (version 5.3). We selected the 20 most important parameters out of 53 mHM parameters based on a comprehensive sensitivity analysis (Cuntz et al., 2015). We calibrated 1km-daily mHM for the Skjern basin in Denmark using the Shuffled Complex Evolution (SCE) algorithm and inputs at different spatial scales, i.e., meteorological data at 10 km and morphological data at 250 meters. We used correlation coefficients between observed monthly (summer months only) MODIS data calculated from cloud-free days over the calibration period from 2001 to 2008 and simulated aET from mHM over the same period. Similarly, other metrics, e.g., mapcurves and the fraction skill score, are also included in our objective function to assess the co-location of the grid cells. The preliminary results show that multi-objective calibration of mHM against observed streamflow and spatial patterns together does not significantly reduce the spatial errors in aET while it improves the streamflow simulations. This is a strong signal for further investigation of the multi-parameter regionalization affecting spatial aET patterns and weighting the spatial metrics in the objective function relative to the streamflow metrics.
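    One of the spatial-pattern objectives mentioned above, the correlation coefficient between an observed (MODIS-derived) and a simulated aET map restricted to cloud-free cells, can be sketched directly. This is illustrative only; the actual objective function combines several such metrics.

```python
# Sketch of a spatial-pattern objective: Pearson correlation between an
# observed and a simulated aET map, skipping cloud-masked grid cells.

from statistics import mean

def spatial_correlation(observed, simulated, mask):
    """observed/simulated: flat lists of aET per grid cell;
    mask: True where the observation is valid (cloud-free)."""
    obs = [o for o, m in zip(observed, mask) if m]
    sim = [s for s, m in zip(simulated, mask) if m]
    mo, ms = mean(obs), mean(sim)
    num = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    den = (sum((o - mo) ** 2 for o in obs) *
           sum((s - ms) ** 2 for s in sim)) ** 0.5
    return num / den

obs = [2.0, 3.0, 4.0, 9.9]
sim = [1.0, 2.0, 3.0, 0.0]
mask = [True, True, True, False]  # last cell is cloud-masked
print(spatial_correlation(obs, sim, mask))  # 1.0 (perfect pattern match)
```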

  13. Defining datasets and creating data dictionaries for quality improvement and research in chronic disease using routinely collected data: an ontology-driven approach.

    PubMed

    de Lusignan, Simon; Liaw, Siaw-Teng; Michalakidis, Georgios; Jones, Simon

    2011-01-01

    The burden of chronic disease is increasing, and research and quality improvement will be less effective if case finding strategies are suboptimal. To describe an ontology-driven approach to case finding in chronic disease and how this approach can be used to create a data dictionary and make the codes used in case finding transparent. A five-step process: (1) identifying a reference coding system or terminology; (2) using an ontology-driven approach to identify cases; (3) developing metadata that can be used to identify the extracted data; (4) mapping the extracted data to the reference terminology; and (5) creating the data dictionary. Hypertension is presented as an exemplar. A patient with hypertension can be represented by a range of codes, including diagnostic, history, and administrative codes. Metadata can link the coding system and data extraction queries to the correct data mapping and translation tool, which then maps it to the equivalent code in the reference terminology. The code extracted, the term, its domain and subdomain, and the name of the data extraction query can then be automatically grouped and published online as a readily searchable data dictionary. An exemplar is online at: www.clininf.eu/qickd-data-dictionary.html. Adopting an ontology-driven approach to case finding could improve the quality of disease registers and of research based on routine data. It would offer considerable advantages over using limited datasets to define cases. This approach should be considered by those involved in research and quality improvement projects which utilise routine data.
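    Steps 4 and 5 above (mapping extracted codes to a reference terminology and emitting data-dictionary entries) can be sketched minimally. The local codes, mapping table, and field names below are invented for illustration; a real system would map to an actual reference terminology such as SNOMED CT.

```python
# Hedged sketch of steps 4-5: map locally extracted codes to a reference
# terminology and group them into searchable data-dictionary entries.
# All codes and names here are hypothetical.

code_map = {  # local code -> reference terminology code (illustrative)
    "EMIS-H25": "REF-0001",
    "READ-G20": "REF-0001",  # two local codes, one reference concept
}

def dictionary_entry(local_code, term, domain, subdomain, query):
    """Build one data-dictionary entry for an extracted code."""
    return {
        "code": code_map.get(local_code, "UNMAPPED"),
        "term": term,
        "domain": domain,
        "subdomain": subdomain,
        "query": query,  # name of the data extraction query
    }

entry = dictionary_entry("READ-G20", "Essential hypertension",
                         "diagnosis", "cardiovascular", "qickd_htn_cases")
print(entry["code"])  # REF-0001
```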

  14. A data driven method for estimation of B(avail) and appK(D) using a single injection protocol with [¹¹C]raclopride in the mouse.

    PubMed

    Wimberley, Catriona J; Fischer, Kristina; Reilhac, Anthonin; Pichler, Bernd J; Gregoire, Marie Claude

    2014-10-01

    The partial saturation approach (PSA) is a simple, single-injection experimental protocol that estimates both B(avail) and appK(D) without the use of blood sampling. This makes it ideal for use in longitudinal studies of neurodegenerative diseases in the rodent. The aim of this study was to increase the range and applicability of the PSA by developing a data-driven strategy for determining reliable regional estimates of receptor density (B(avail)) and in vivo affinity (1/appK(D)), and to validate the strategy using a simulation model. The data-driven method uses a time window guided by the dynamic equilibrium state of the system, as opposed to a static time window. To test the method, simulations of partial saturation experiments were generated and validated against experimental data. The experimental conditions simulated included a range of receptor occupancy levels and three different B(avail) and appK(D) values to mimic disease states. The effect of using a reference region and of typical PET noise on the stability and accuracy of the estimates was also investigated. The investigations showed that the parameter estimates in a simulated healthy mouse, using the data-driven method, were within 10-30% of the simulated input for the range of occupancy levels simulated. Throughout all experimental conditions simulated, the accuracy and robustness of the estimates using the data-driven method were much improved over the typical method of using a static time window, especially at low receptor occupancy levels. Introducing a reference region caused a bias of approximately 10% over the range of occupancy levels. Based on extensive simulated experimental conditions, it was shown that the data-driven method provides accurate and precise estimates of B(avail) and appK(D) for a broader range of conditions compared to the original method. Copyright © 2014 Elsevier Inc. All rights reserved.
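    The binding model underlying partial saturation estimation can be illustrated with a standard Scatchard linearisation. This is a hedged sketch of the general technique, not the paper's data-driven time-window method: given bound (B) and free (F) concentrations satisfying B = Bavail·F/(appKD + F), the plot of B/F versus B is a line with slope -1/appKD and intercept Bavail/appKD.

```python
# Illustrative sketch: recover Bavail and appKD from bound/free pairs via a
# Scatchard linearisation (B/F = Bavail/appKD - B/appKD) and a least-squares
# line fit. Not the paper's method; shown only to make the two parameters concrete.

def scatchard_fit(bound, free):
    x = bound
    y = [b / f for b, f in zip(bound, free)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
             sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    app_kd = -1.0 / slope
    b_avail = intercept * app_kd
    return b_avail, app_kd

# Noise-free data generated with Bavail = 10, appKD = 2:
free = [0.5, 1.0, 2.0, 4.0]
bound = [10 * f / (2 + f) for f in free]
print(scatchard_fit(bound, free))  # ≈ (10.0, 2.0)
```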

  15. AnaBench: a Web/CORBA-based workbench for biomolecular sequence analysis

    PubMed Central

    Badidi, Elarbi; De Sousa, Cristina; Lang, B Franz; Burger, Gertraud

    2003-01-01

    Background Sequence data analyses such as gene identification, structure modeling or phylogenetic tree inference involve a variety of bioinformatics software tools. Due to the heterogeneity of bioinformatics tools in usage and data requirements, scientists spend much effort on technical issues including data format, storage and management of input and output, and memorization of numerous parameters and multi-step analysis procedures. Results In this paper, we present the design and implementation of AnaBench, an interactive, Web-based bioinformatics Analysis workBench allowing streamlined data analysis. Our philosophy was to minimize the technical effort not only for the scientist who uses this environment to analyze data, but also for the administrator who manages and maintains the workbench. With new bioinformatics tools published daily, AnaBench permits easy incorporation of additional tools. This flexibility is achieved by employing a three-tier distributed architecture and recent technologies including CORBA middleware, Java, JDBC, and JSP. A CORBA server permits transparent access to a workbench management database, which stores information about the users, their data, as well as the description of all bioinformatics applications that can be launched from the workbench. Conclusion AnaBench is an efficient and intuitive interactive bioinformatics environment, which offers scientists application-driven, data-driven and protocol-driven analysis approaches. The prototype of AnaBench, managed by a team at the Université de Montréal, is accessible on-line at: . Please contact the authors for details about setting up a local-network AnaBench site elsewhere. PMID:14678565

  16. A data-driven approach to quality risk management.

    PubMed

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-10-01

    An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real-time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on the quality of clinical trials sponsored by Pfizer. Only a subset of the risk factors had a significant association with quality issues; these included: whether the study used a placebo, whether an agent was a biologic, unusual packaging labels, complex dosing, and over 25 planned procedures. Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety.
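    One of the methods named above, the Wilcoxon rank-sum test, is built on a simple statistic that can be sketched in a few lines: pool the two samples, rank them (averaging ranks for ties), and sum the ranks of one sample. The p-value step (normal approximation or exact tables) is omitted for brevity, and the example data are invented.

```python
# Illustrative sketch: the rank-sum statistic underlying the Wilcoxon test,
# e.g. for comparing a quality metric between trials with and without a
# risk factor. Tied values receive average ranks.

def rank_sum(sample_a, sample_b):
    """Return the sum of ranks of sample_a within the pooled, ranked data."""
    pooled = sorted((v, idx) for idx, v in
                    enumerate(list(sample_a) + list(sample_b)))
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1                      # extend over a run of tied values
        avg = (i + j) / 2 + 1           # average 1-based rank for the run
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg
        i = j + 1
    return sum(ranks[idx] for idx in range(len(sample_a)))

print(rank_sum([1, 2, 3], [4, 5, 6]))  # 6.0 (ranks 1 + 2 + 3)
```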

  17. Inattentional Blindness and Individual Differences in Cognitive Abilities.

    PubMed

    Kreitz, Carina; Furley, Philip; Memmert, Daniel; Simons, Daniel J

    2015-01-01

    People sometimes fail to notice salient unexpected objects when their attention is otherwise occupied, a phenomenon known as inattentional blindness. To explore individual differences in inattentional blindness, we employed both static and dynamic tasks that either presented the unexpected object away from the focus of attention (spatial) or near the focus of attention (central). We hypothesized that noticing in central tasks might be driven by the availability of cognitive resources like working memory, and that noticing in spatial tasks might be driven by the limits on spatial attention like attention breadth. However, none of the cognitive measures predicted noticing in the dynamic central task or in either the static or dynamic spatial task. Only in the central static task did working memory capacity predict noticing, and that relationship was fairly weak. Furthermore, whether or not participants noticed an unexpected object in a static task was only weakly associated with their odds of noticing an unexpected object in a dynamic task. Taken together, our results are largely consistent with the notion that noticing unexpected objects is driven more by stochastic processes common to all people than by stable individual differences in cognitive abilities.

  19. Combined effects of inversion and feature removal on N170 responses elicited by faces and car fronts.

    PubMed

    Kloth, Nadine; Itier, Roxane J; Schweinberger, Stefan R

    2013-04-01

    The face-sensitive N170 is typically enhanced for inverted compared to upright faces. Itier, Alain, Sedore, and McIntosh (2007) recently suggested that this N170 inversion effect is mainly driven by the eye region which becomes salient when the face configuration is disrupted. Here we tested whether similar effects could be observed with non-face objects that are structurally similar to faces in terms of possessing a homogeneous within-class first-order feature configuration. We presented upright and inverted pictures of intact car fronts, car fronts without lights, and isolated lights, in addition to analogous face conditions. Upright cars elicited substantial N170 responses of similar amplitude to those evoked by upright faces. In strong contrast to face conditions however, the car-elicited N170 was mainly driven by the global shape rather than the presence or absence of lights, and was dramatically reduced for isolated lights. Overall, our data confirm a differential influence of the eye region in upright and inverted faces. Results for car fronts do not suggest similar interactive encoding of eye-like features and configuration for non-face objects, even when these objects possess a similar feature configuration as faces. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Visual context modulates potentiation of grasp types during semantic object categorization.

    PubMed

    Kalénine, Solène; Shapiro, Allison D; Flumini, Andrea; Borghi, Anna M; Buxbaum, Laurel J

    2014-06-01

    Substantial evidence suggests that conceptual processing of manipulable objects is associated with potentiation of action. Such data have been viewed as evidence that objects are recognized via access to action features. Many objects, however, are associated with multiple actions. For example, a kitchen timer may be clenched with a power grip to move it but pinched with a precision grip to use it. The present study tested the hypothesis that action evocation during conceptual object processing is responsive to the visual scene in which objects are presented. Twenty-five healthy adults were asked to categorize object pictures presented in different naturalistic visual contexts that evoke either move- or use-related actions. Categorization judgments (natural vs. artifact) were performed by executing a move- or use-related action (clench vs. pinch) on a response device, and response times were assessed as a function of contextual congruence. Although the actions performed were irrelevant to the categorization judgment, responses were significantly faster when actions were compatible with the visual context. This compatibility effect was largely driven by faster pinch responses when objects were presented in use-compatible, as compared with move-compatible, contexts. The present study is the first to highlight the influence of visual scene on stimulus-response compatibility effects during semantic object processing. These data support the hypothesis that action evocation during conceptual object processing is biased toward context-relevant actions.

  1. Visual context modulates potentiation of grasp types during semantic object categorization

    PubMed Central

    Kalénine, Solène; Shapiro, Allison D.; Flumini, Andrea; Borghi, Anna M.; Buxbaum, Laurel J.

    2013-01-01

    Substantial evidence suggests that conceptual processing of manipulable objects is associated with potentiation of action. Such data have been viewed as evidence that objects are recognized via access to action features. Many objects, however, are associated with multiple actions. For example, a kitchen timer may be clenched with a power grip to move it, but pinched with a precision grip to use it. The present study tested the hypothesis that action evocation during conceptual object processing is responsive to the visual scene in which objects are presented. Twenty-five healthy adults were asked to categorize object pictures presented in different naturalistic visual contexts that evoke either move- or use-related actions. Categorization judgments (natural vs. artifact) were performed by executing a move- or use-related action (clench vs. pinch) on a response device, and response times were assessed as a function of contextual congruence. Although the actions performed were irrelevant to the categorization judgment, responses were significantly faster when actions were compatible with the visual context. This compatibility effect was largely driven by faster pinch responses when objects were presented in use- compared to move-compatible contexts. The present study is the first to highlight the influence of visual scene on stimulus-response compatibility effects during semantic object processing. These data support the hypothesis that action evocation during conceptual object processing is biased toward context-relevant actions. PMID:24186270

  2. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first three years, ESDIS evolved the process, involving the data provider community in developing procedures for creating and assigning DOIs and guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one to review the DOI process, whose recommendations were submitted to ESDIS in February 2014; and another, recently tasked, to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually driven system to one that largely automates the DOI process. The new automated features include: a) review of the DOI metadata; b) assignment of an opaque DOI name, if the data provider so chooses; and c) reserving, registering, and updating DOIs. The flexibility of reserving a DOI allows data providers to embed and test the DOI in the data product metadata before formally registering it with EZID. The update process allows any DOI metadata to be changed except the DOI name, unless the name has not yet been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and its source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the important function of identifying the landing page that describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key descriptive elements to be included on each data product's landing page.
This poster will describe in detail the unique automated process and underlying system implemented by ESDIS for registering DOIs, as well as some of the lessons learned from the development of the process. In addition, this paper will summarize the recommendations made by the DOI Process and DOI Landing Page User Working Groups, and the procedures developed for implementing those recommendations.
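
    The reserve-then-register workflow described above maps onto EZID's ANVL-over-HTTP interface. The sketch below only builds the ANVL request body; the DataCite field values and target URL are illustrative placeholders, no real identifier is minted, and the exact escaping rules should be checked against the EZID API guide:

```python
def anvl_escape(text):
    """Percent-encode characters that are structural in ANVL ('%', CR, LF)."""
    return (text.replace("%", "%25")
                .replace("\r", "%0D")
                .replace("\n", "%0A"))

def make_anvl(metadata):
    """Serialize a metadata dict into an ANVL request body (one 'key: value'
    pair per line), the format EZID accepts for mint/update requests."""
    return "\n".join(f"{anvl_escape(k)}: {anvl_escape(v)}"
                     for k, v in metadata.items())

# Reserving a DOI: created in EZID but not yet public, so the provider can
# embed and test it in the product metadata before formal registration.
body = make_anvl({
    "_status": "reserved",
    "_target": "https://example.org/landing/placeholder",  # hypothetical URL
    "datacite.title": "Example Earth Science Data Product",
    "datacite.publisher": "NASA EOSDIS",
})
```

    Updating the record later means sending a new ANVL body for the same identifier; per the abstract, everything except the DOI name itself can change once registered.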

  3. The prioritisation of invasive alien plant control projects using a multi-criteria decision model informed by stakeholder input and spatial data.

    PubMed

    Forsyth, G G; Le Maitre, D C; O'Farrell, P J; van Wilgen, B W

    2012-07-30

    Invasions by alien plants are a significant threat to the biodiversity and functioning of ecosystems and the services they provide. The South African Working for Water program was established to address this problem. It needs to formulate objective and transparent priorities for clearing in the face of multiple and sometimes conflicting demands. This study used the analytic hierarchy process (a multi-criteria decision support technique) to develop and rank criteria for prioritising alien plant control operations in the Western Cape, South Africa. Stakeholder workshops were held to identify a goal and criteria, and to conduct pair-wise comparisons to weight the criteria with respect to invasive alien plant control. The combination of stakeholder input (to develop decision models) with data-driven model solutions enabled us to include many alternatives (water catchments) that it would otherwise not have been feasible to consider. The most important criteria included the capacity to maintain gains made through control operations, the potential to enhance water resources and conserve biodiversity, and threats from priority invasive alien plant species. We selected spatial datasets and used them to generate weights that could be used to objectively compare alternatives with respect to the agreed criteria. The analysis showed that there are many high-priority catchments that are not receiving any funding and low-priority catchments that are receiving substantial allocations. Clearly, there is a need to realign priorities, including directing sufficient funds to the highest-priority catchments to provide effective control. This approach provided a tractable, consensus-based solution that can be used to direct clearing operations. Copyright © 2012 Elsevier Ltd. All rights reserved.
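
    For readers unfamiliar with the analytic hierarchy process, criterion weights can be approximated from a pairwise comparison matrix by the row geometric-mean method. The matrix below is an invented three-criterion example loosely echoing the study's themes, not the workshops' actual judgments:

```python
import math

def ahp_weights(pairwise):
    """Criterion weights from an AHP pairwise comparison matrix, using the
    row geometric-mean approximation to the principal eigenvector."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgments on the Saaty 1-9 scale; entry [i][j] says how much
# more important criterion i is than criterion j (reciprocal matrix).
criteria = ["maintain gains", "enhance water resources", "conserve biodiversity"]
matrix = [
    [1.0,   2.0,   3.0],
    [1 / 2, 1.0,   2.0],
    [1 / 3, 1 / 2, 1.0],
]
weights = ahp_weights(matrix)
```

    The resulting weights sum to one and can then score each catchment against the agreed criteria.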

  4. Cutting the Composite Gordian Knot: Untangling the AGN-Starburst Threads in Single Aperture Spectra

    NASA Astrophysics Data System (ADS)

    Flury, Sophia; Moran, Edward C.

    2018-01-01

    Standard emission line diagnostics are able to segregate star-forming galaxies and Seyfert nuclei, and it is often assumed that ambiguous emission-line galaxies falling between these two populations are “composite” objects exhibiting both types of photoionization. We have developed a method that predicts the most probable H II and AGN components that could plausibly explain the “composite”-class objects solely on the basis of their SDSS spectra. The majority of our analysis is driven by empirical relationships revealed by the SDSS data rather than by theoretical models founded on assumptions. To verify our method, we compared its predictions with publicly released IFU data from the S7 survey and find that composite objects are not, in fact, a simple linear combination of the two types of emission. The data reveal a key component in the mixing sequence: geometric dilution of the ionizing radiation that powers the NLR of the active nucleus. When this effect is accounted for, our model is successful when applied to several composite-class galaxies. Some objects, however, appear to be at variance with the predicted results, suggesting they may not be powered by black hole accretion.

  5. Electromagnetic Properties Analysis on Hybrid-driven System of Electromagnetic Motor

    NASA Astrophysics Data System (ADS)

    Zhao, Jingbo; Han, Bingyuan; Bei, Shaoyi

    2018-01-01

    The hybrid-driven system, composed of permanent magnets and electromagnets, applied in an electromagnetic motor was analyzed. An equivalent magnetic circuit was used to establish the mathematical models of the hybrid-driven system, and based on these models, expressions for the air-gap flux, air-gap magnetic flux density, and electromagnetic force were derived. Taking the air-gap magnetic flux density and electromagnetic force as the main research objects, the hybrid-driven system was studied, and its electromagnetic properties under different working-current modes were examined. The results show that the hybrid-driven design can improve the air-gap magnetic flux density and electromagnetic force more effectively while also guaranteeing output stability. The effectiveness and feasibility of the hybrid-driven system are thus verified, providing a theoretical basis for its design.
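
    The equivalent-magnetic-circuit reasoning can be illustrated with a toy calculation: treating the air gap as the dominant reluctance, flux follows from Hopkinson's law (the magnetic analogue of Ohm's law), and a hybrid drive simply adds the coil's magnetomotive force to the magnet's. The numbers below are arbitrary illustrations, not values from the paper:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (H/m)

def air_gap_flux_density(mmf, gap_length, gap_area):
    """Equivalent magnetic circuit: reluctance R = l / (mu0 * A),
    flux = MMF / R (Hopkinson's law), flux density B = flux / A."""
    reluctance = gap_length / (MU0 * gap_area)   # air-gap reluctance (1/H)
    flux = mmf / reluctance                      # magnetic flux (Wb)
    return flux / gap_area                       # flux density (T)

# Illustrative values: a magnet supplying 800 A-turns of MMF, plus a
# 200-turn coil carrying 2 A in the hybrid mode.
b_magnet_only = air_gap_flux_density(mmf=800, gap_length=1e-3, gap_area=4e-4)
b_hybrid = air_gap_flux_density(mmf=800 + 200 * 2.0,
                                gap_length=1e-3, gap_area=4e-4)
```

    Because flux density is linear in the total MMF in this idealized circuit, the hybrid mode raises the air-gap flux density in direct proportion to the added coil excitation.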

  6. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk Assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques like Bayesian Networks because of their capability to (1) combine, within a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturn previous beliefs in the light of new information; and (3) make predictions even with incomplete data. In this work, we compare Bayesian Networks with other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests, and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among the Bayesian Networks, a clear distinction is made between a purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation, insertion, or aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great use in informing product safety design regulation. Data from the European Registry of Foreign Body Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offered both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well. © The Author(s) 2013.
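
    To make the Bayesian-network advantages concrete, especially prediction from incomplete data, here is a minimal inference-by-enumeration sketch. The network structure and all probabilities are invented for illustration; they are not models or values estimated from the registry:

```python
# Hypothetical two-parent network: object shape and size -> injury severity.
P_shape = {"round": 0.6, "irregular": 0.4}   # P(shape)
P_size = {"small": 0.7, "large": 0.3}        # P(size)
P_severe = {                                  # P(severe injury | shape, size)
    ("round", "small"): 0.30, ("round", "large"): 0.10,
    ("irregular", "small"): 0.15, ("irregular", "large"): 0.05,
}

def p_severe_given(shape=None, size=None):
    """Inference by enumeration: unobserved parents are marginalised out,
    which is how a Bayesian network still predicts with incomplete data."""
    shapes = [shape] if shape is not None else list(P_shape)
    sizes = [size] if size is not None else list(P_size)
    num = den = 0.0
    for sh in shapes:
        for sz in sizes:
            w = ((P_shape[sh] if shape is None else 1.0)
                 * (P_size[sz] if size is None else 1.0))
            num += w * P_severe[(sh, sz)]
            den += w
    return num / den
```

    With full evidence the query reads the conditional table directly; with partial evidence (say, shape known but size missing) the same function averages over the missing parent's prior, and expert judgment can supply any table that data cannot.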

  7. Landscape heritage objects' effect on driving: a combined driving simulator and questionnaire study.

    PubMed

    Antonson, Hans; Ahlström, Christer; Mårdh, Selina; Blomqvist, Göran; Wiklund, Mats

    2014-01-01

    According to the literature, landscape (panoramas, heritage objects, e.g. landmarks) affects people in various ways. Data are primarily developed by asking people about their preferences (interviews, photo sessions, focus groups), and to a lesser degree by measuring how the body reacts to such objects. Measuring personal experience while driving a car through a landscape is rarer still. In this paper we study how different types of objects in the landscape affect drivers during their drive. A high-fidelity moving-base driving simulator was used to measure choice of speed and lateral position in combination with stress (heart rate) and eye tracking. The data were supplemented with questionnaires. Eighteen test drivers (8 men and 10 women) with a mean age of 37 were recruited. The test drivers were exposed to landscape objects of different ages and types, such as a 19th-century church, a wind turbine, a 17th-century milestone, and a bus stop, placed at different distances from the road driven. The findings are in some respects contradictory, but it was concluded that 33% of the test drivers felt stressed during the drive. All test drivers said that they had felt calm at times during the drive, but the reason for this was only to a minor degree connected with the old and modern objects. The open landscape was experienced as conducive to acceleration. Most objects were, to a small degree, experienced (subjective data) as having a speed-reducing effect, much in line with the simulator data (objective data). Objects close to the road affected the drivers' choice of lateral position. No significant differences could be observed in the test drivers' gaze between old and modern objects, but a significant difference was observed between road stretches with faraway objects and stretches without objects. No meaningful, significant differences were found for the drivers' stress levels as measured by heart rate. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Active vibrations and noise control for turboprop application research program activities

    NASA Technical Reports Server (NTRS)

    Paonessa, A.; Concilio, A.; Lecce, Leonardo V.

    1992-01-01

    The objectives of this work include the following: (1) development of active noise control techniques to alleviate the inefficiencies and drawbacks of the passive noise control approach, especially at low frequencies; (2) reduction of structurally radiated noise by applying external forces to the vibrating structure by means of force actuators made of piezoelectric material; and (3) reduction of fuselage vibration levels in propeller-driven aircraft by means of distributed piezoelectric actuators that are actively controlled.

  9. Active vibrations and noise control for turboprop application research program activities

    NASA Astrophysics Data System (ADS)

    Paonessa, A.; Concilio, A.; Lecce, Leonardo V.

    1992-07-01

    The objectives of this work include the following: (1) development of active noise control techniques to alleviate the inefficiencies and drawbacks of the passive noise control approach, especially at low frequencies; (2) reduction of structurally radiated noise by applying external forces to the vibrating structure by means of force actuators made of piezoelectric material; and (3) reduction of fuselage vibration levels in propeller-driven aircraft by means of distributed piezoelectric actuators that are actively controlled.

  10. Data-Driven Haptic Modeling and Rendering of Viscoelastic and Frictional Responses of Deformable Objects.

    PubMed

    Yim, Sunghoon; Jeon, Seokhee; Choi, Seungmoon

    2016-01-01

    In this paper, we present an extended data-driven haptic rendering method capable of reproducing force responses during pushing and sliding interaction on a large surface area. The main part of the approach is a novel input variable set for the training of an interpolation model, which incorporates the position of a proxy - an imaginary contact point on the undeformed surface. This allows us to estimate friction in both sliding and sticking states in a unified framework. Estimating the proxy position is done in real-time based on simulation using a sliding yield surface - a surface defining a border between the sliding and sticking regions in the external force space. During modeling, the sliding yield surface is first identified via an automated palpation procedure. Then, through manual palpation on a target surface, input data and resultant force data are acquired. The data are used to build a radial basis interpolation model. During rendering, this input-output mapping interpolation model is used to estimate force responses in real-time in accordance with the interaction input. Physical performance evaluation demonstrates that our approach achieves reasonably high estimation accuracy. A user study also shows plausible perceptual realism under diverse and extensive exploration.
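
    The rendering stage above maps interaction inputs to forces through a radial basis interpolation model. The following is a generic one-dimensional Gaussian-RBF interpolator (our own textbook-style sketch, not the authors' implementation), with one input dimension standing in for, say, penetration depth:

```python
import math

def gauss_solve(a, b):
    """Solve a·x = b by Gaussian elimination with partial pivoting."""
    n = len(a)
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k] for k in range(r + 1, n))) / m[r][r]
    return x

def rbf_fit(xs, ys, sigma):
    """Solve for weights so the interpolant passes through all (xs, ys)."""
    phi = lambda a, b: math.exp(-(a - b) ** 2 / (2 * sigma ** 2))
    return gauss_solve([[phi(xi, xj) for xj in xs] for xi in xs], ys)

def rbf_eval(xs, weights, sigma, x):
    """Evaluate the fitted interpolant at a new input x."""
    phi = lambda a, b: math.exp(-(a - b) ** 2 / (2 * sigma ** 2))
    return sum(w * phi(x, xi) for w, xi in zip(weights, xs))

# Made-up training samples: input (e.g. depth) -> recorded force response.
xs, ys = [0.0, 1.0, 2.0], [0.0, 1.2, 0.5]
weights = rbf_fit(xs, ys, sigma=1.0)
```

    In the paper the input set is richer (it includes the proxy position and sliding state), but the fit-then-evaluate structure is the same.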

  11. As above, so below? Towards understanding inverse models in BCI

    NASA Astrophysics Data System (ADS)

    Lindgren, Jussi T.

    2018-02-01

    Objective. In brain-computer interfaces (BCI), measurements of the user’s brain activity are classified into commands for the computer. With EEG-based BCIs, the origins of the classified phenomena are often considered to be spatially localized in the cortical volume and mixed in the EEG. We investigate if more accurate BCIs can be obtained by reconstructing the source activities in the volume. Approach. We contrast the physiology-driven source reconstruction with data-driven representations obtained by statistical machine learning. We explain these approaches in a common linear dictionary framework and review the different ways to obtain the dictionary parameters. We consider the effect of source reconstruction on some major difficulties in BCI classification, namely information loss, feature selection and nonstationarity of the EEG. Main results. Our analysis suggests that the approaches differ mainly in their parameter estimation. Physiological source reconstruction may thus be expected to improve BCI accuracy if machine learning is not used or where it produces less optimal parameters. We argue that the considered difficulties of surface EEG classification can remain in the reconstructed volume and that data-driven techniques are still necessary. Finally, we provide some suggestions for comparing approaches. Significance. The present work illustrates the relationships between source reconstruction and machine learning-based approaches for EEG data representation. The provided analysis and discussion should help in understanding, applying, comparing and improving such techniques in the future.
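
    The "common linear dictionary framework" described above can be written compactly. The notation below is our own schematic summary, not reproduced from the paper:

```latex
% Forward (mixing) model: EEG x(t) as a linear mixture of source activations s(t)
x(t) = A\,s(t) + n(t)
% Representation used for classification: a linear operator applied to the EEG
\hat{s}(t) = W\,x(t)
% Physiology-driven choice: A is a lead-field matrix from a head model, with W
% a regularized inverse, e.g. the minimum-norm estimate
W = A^{\top}\!\left(A A^{\top} + \lambda I\right)^{-1}
% Data-driven choice: W is instead estimated statistically from the EEG itself
% (ICA, CSP, ...), with no head model involved
```

    Seen this way, the two approaches share one representation and differ only in how the dictionary parameters A and W are obtained, which is the paper's central point.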

  12. The NDVI: Back to basics

    USDA-ARS?s Scientific Manuscript database

    Ease of access to satellite sensor imagery and image products has driven the use of remote sensing data in many disciplines, including landscape ecology, forestry, environmental and wildlife management, agriculture, and epidemiology. A common format of these data is as vegetation indices and of thes...
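
    For reference, the NDVI itself is simply the normalized difference of near-infrared and red reflectance; a minimal implementation with illustrative reflectance values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    surface reflectance (both typically in [0, 1]); result lies in [-1, 1]."""
    if nir + red == 0:
        return 0.0  # convention for non-reflective pixels; avoids divide-by-zero
    return (nir - red) / (nir + red)

# Dense green vegetation: strong NIR reflection, strong red absorption.
veg = ndvi(0.50, 0.08)
# Bare soil: the two bands are much closer together, so NDVI is low.
soil = ndvi(0.30, 0.25)
```

    Water and clouds typically yield values near or below zero, which is why NDVI separates vegetated from non-vegetated surfaces so robustly across sensors.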

  13. Adaptive Statistical Iterative Reconstruction-Applied Ultra-Low-Dose CT with Radiography-Comparable Radiation Dose: Usefulness for Lung Nodule Detection

    PubMed Central

    Yoon, Hyun Jung; Hwang, Hye Sun; Moon, Jung Won; Lee, Kyung Soo

    2015-01-01

    Objective To assess the performance of adaptive statistical iterative reconstruction (ASIR)-applied ultra-low-dose CT (ULDCT) in detecting small lung nodules. Materials and Methods Thirty patients underwent both ULDCT and standard dose CT (SCT). After determining the reference standard nodules, five observers, blinded to the reference standard reading results, independently evaluated SCT and both subsets of ASIR- and filtered back projection (FBP)-driven ULDCT images. Data assessed by observers were compared statistically. Results Converted effective doses in SCT and ULDCT were 2.81 ± 0.92 and 0.17 ± 0.02 mSv, respectively. A total of 114 lung nodules were detected on SCT as a standard reference. There was no statistically significant difference in sensitivity between ASIR-driven ULDCT and SCT for three out of the five observers (p = 0.678, 0.735, < 0.01, 0.038, and 0.868 for observers 1, 2, 3, 4, and 5, respectively). The sensitivity of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT in three out of the five observers (p < 0.01 for three observers, and p = 0.064 and 0.146 for the other two). In jackknife alternative free-response receiver operating characteristic analysis, the mean figure-of-merit (FOM) values for FBP-driven ULDCT, ASIR-driven ULDCT, and SCT were 0.682, 0.772, and 0.821, respectively; there was no significant difference in FOM between ASIR-driven ULDCT and SCT (p = 0.11), but the FOM of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT and SCT (p = 0.01 and 0.00). Conclusion Adaptive statistical iterative reconstruction-driven ULDCT delivering a radiation dose of only 0.17 mSv offers acceptable sensitivity in nodule detection compared with SCT and performs better than FBP-driven ULDCT. PMID:26357505

  14. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    NASA Astrophysics Data System (ADS)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time, data-driven simulation, prediction, and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and to assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources, including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows, and social information including Twitter feeds. This data model and the associated wildfire resource catalog include a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data model describes how to integrate data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes the 'Observables' captured by each instrument using multiple ontologies, including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog, described using the Web Application Description Language (WADL). We are creating web-based user interfaces and mobile device apps that use the REST interface for dissemination to the wildfire modeling community and to project partners spanning academic, private, and government laboratories, while generating value for emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and stream it into dynamic data-driven wildfire models at scale.

  15. Acquisition and Neural Network Prediction of 3D Deformable Object Shape Using a Kinect and a Force-Torque Sensor.

    PubMed

    Tawbe, Bilal; Cretu, Ana-Maria

    2017-05-11

    The realistic representation of deformations is still an active area of research, especially for deformable objects whose behavior cannot be simply described in terms of elasticity parameters. This paper proposes a data-driven neural-network-based approach for capturing implicitly and predicting the deformations of an object subject to external forces. Visual data, in the form of 3D point clouds gathered by a Kinect sensor, is collected over an object while forces are exerted by means of the probing tip of a force-torque sensor. A novel approach based on neural gas fitting is proposed to describe the particularities of a deformation over the selectively simplified 3D surface of the object, without requiring knowledge of the object material. An alignment procedure, a distance-based clustering, and inspiration from stratified sampling support this process. The resulting representation is denser in the region of the deformation (an average of 96.6% perceptual similarity with the collected data in the deformed area), while still preserving the object's overall shape (86% similarity over the entire surface) and using, on average, only 40% of the vertices in the mesh. A series of feedforward neural networks is then trained to predict the mapping between the force parameters characterizing the interaction with the object and the change in the object shape, as captured by the fitted neural gas nodes. This series of networks allows for the prediction of the deformation of an object when subject to unknown interactions.
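
    Neural gas fitting, the core of the representation above, adapts every node toward each sample with a strength that decays with the node's distance rank, so nodes become denser where the data is dense (e.g. around a deformation). The following is a generic textbook-style sketch; the schedules, parameters, and toy data are ours, not the paper's:

```python
import math
import random

def neural_gas_fit(data, n_nodes, epochs=40, eps0=0.5, lam0=5.0, seed=0):
    """Fit neural gas nodes to sample points.

    For every sample, all nodes are ranked by squared distance and pulled
    toward the sample with strength exp(-rank/lambda); both the learning
    rate and the neighbourhood range are annealed over the epochs."""
    rng = random.Random(seed)
    nodes = [list(rng.choice(data)) for _ in range(n_nodes)]
    for epoch in range(epochs):
        t = epoch / max(1, epochs - 1)
        eps = eps0 * (0.01 / eps0) ** t      # learning rate, annealed
        lam = lam0 * (0.1 / lam0) ** t       # neighbourhood range, annealed
        for x in data:
            ranked = sorted(nodes, key=lambda w: sum((wi - xi) ** 2
                                                     for wi, xi in zip(w, x)))
            for rank, w in enumerate(ranked):
                h = math.exp(-rank / lam)
                for i in range(len(w)):
                    w[i] += eps * h * (x[i] - w[i])
    return nodes

# Toy "surface" samples: a dense patch near the origin (the deformation)
# plus sparser points elsewhere. Coordinates are made up for illustration.
data = [(0.0, 0.0), (0.05, 0.1), (0.1, 0.05), (0.1, 0.1),
        (1.0, 1.0), (0.9, 1.0), (1.0, 0.9)]
nodes = neural_gas_fit(data, n_nodes=5)
```

    In the paper the same idea runs over 3D point clouds, and the fitted node positions become the compact shape descriptor that the feedforward networks learn to predict from force parameters.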

  16. MASCOT HTML and XML parser: an implementation of a novel object model for protein identification data.

    PubMed

    Yang, Chunguang G; Granite, Stephen J; Van Eyk, Jennifer E; Winslow, Raimond L

    2006-11-01

    Protein identification using MS is an important technique in proteomics as well as a major generator of proteomics data. We have designed the protein identification data object model (PDOM) and developed a parser based on this model to facilitate the analysis and storage of these data. The parser works with HTML or XML files saved or exported from MASCOT MS/MS ions search in peptide summary report or MASCOT PMF search in protein summary report. The program creates PDOM objects, eliminates redundancy in the input file, and has the capability to output any PDOM object to a relational database. This program facilitates additional analysis of MASCOT search results and aids the storage of protein identification information. The implementation is extensible and can serve as a template to develop parsers for other search engines. The parser can be used as a stand-alone application or can be driven by other Java programs. It is currently being used as the front end for a system that loads HTML and XML result files of MASCOT searches into a relational database. The source code is freely available at http://www.ccbm.jhu.edu and the program uses only free and open-source Java libraries.

  17. The global epidemiology of waterpipe smoking

    PubMed Central

    Maziak, Wasim; Taleb, Ziyad Ben; Bahelah, Raed; Islam, Farahnaz; Jaber, Rana; Auf, Rehab; Salloum, Ramzi G

    2015-01-01

    Objectives In the past decade, waterpipe smoking (a.k.a. hookah, shisha, narghile) has become a global phenomenon. In this review, we provide an updated picture of the main epidemiological trends in waterpipe smoking globally. Data sources Peer-reviewed publications indexed in major biomedical databases between 2004 and 2014. Search keywords included a combination of: waterpipe, hookah, shisha along with epidemiology, patterns, prevalence and predictors. We also used different spellings of waterpipe terms commonly used. Study selection The focus was on studies with large representative samples, national data or high-quality reports that illuminated aspects of the epidemiology and trends in waterpipe smoking. Data extraction Multiple researchers extracted the data independently and collectively decided on the most important and pertinent studies to include in the review. Data synthesis Waterpipe smoking has become a global phenomenon among youth. The global waterpipe epidemic is likely driven by (1) the introduction of manufactured flavoured tobacco (Maassel); (2) the intersection between waterpipe's social dimension and thriving café culture; (3) the evolution of mass communication media; (4) the lack of regulatory/policy framework specific to the waterpipe. Waterpipe smoking is becoming the most popular tobacco use method among youth in the Middle East, and is quickly gaining popularity elsewhere. Important patterns of waterpipe smoking include the predominance among younger, male, high socioeconomic, and urban groups. Intermittent and social use are also noted patterns. Conclusions Waterpipe smoking has become a global public health problem. Developing surveillance, intervention and regulatory/policy frameworks specific to the waterpipe has become a public health priority. PMID:25298368

  18. Teaching Assistant Professional Development in Biology: Designed for and Driven by Multidimensional Data.

    PubMed

    Wyse, Sara A; Long, Tammy M; Ebert-May, Diane

    2014-01-01

    Graduate teaching assistants (TAs) are increasingly responsible for instruction in undergraduate science, technology, engineering, and mathematics (STEM) courses. Various professional development (PD) programs have been developed and implemented to prepare TAs for this role, but data about effectiveness are lacking and are derived almost exclusively from self-reported surveys. In this study, we describe the design of a reformed PD (RPD) model and apply Kirkpatrick's Evaluation Framework to evaluate multiple outcomes of TA PD before, during, and after implementing RPD. This framework allows evaluation that includes both direct measures and self-reported data. In RPD, TAs created and aligned learning objectives and assessments and incorporated more learner-centered instructional practices in their teaching. However, these data are inconsistent with TAs' self-reported perceptions about RPD and suggest that single measures are insufficient to evaluate TA PD programs. © 2014 Wyse et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  19. A System for Measurement of Convection Aboard Space Station

    NASA Technical Reports Server (NTRS)

    Bogatyrev, Gennady P.; Gorbunov, Aleksei V; Putin, Gennady F.; Ivanov, Alexander I.; Nikitin, Sergei A.; Polezhaev, Vadim I.

    1996-01-01

    A simple device for the direct measurement of buoyancy-driven fluid flows in a low-gravity environment is proposed. A system connecting spacecraft accelerometer data with measurements of thermal convection in an enclosure and with numerical simulations is developed. This system will also permit evaluation of the low-frequency microacceleration component. The goal of the paper is to present the objectives and current results of ground-based experimental and numerical modeling of this convection detector.
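
    The strength of buoyancy-driven convection such a detector measures is commonly characterized by the Rayleigh number. This back-of-envelope sketch uses illustrative air properties and cell dimensions (not flight data) to show why residual microgravity suppresses convection by orders of magnitude:

```python
def rayleigh(g, beta, delta_t, length, nu, alpha):
    """Rayleigh number Ra = g*beta*dT*L^3 / (nu*alpha): the ratio of buoyancy
    forcing to viscous and thermal damping. Buoyancy-driven convection in an
    enclosure only develops above a critical Ra (~1700 for a heated layer)."""
    return g * beta * delta_t * length ** 3 / (nu * alpha)

# Approximate air properties near room temperature, for a 10 cm cell
# with a 10 K temperature difference (all values illustrative).
props = dict(beta=3.4e-3, delta_t=10.0, length=0.1, nu=1.6e-5, alpha=2.2e-5)
ra_ground = rayleigh(g=9.81, **props)        # vigorous convection on Earth
ra_orbit = rayleigh(g=9.81e-6, **props)      # residual ~1e-6 g on orbit
```

    Since Ra scales linearly with g, a millionfold drop in effective gravity takes the same cell from strongly convecting to essentially diffusive, which is exactly the regime the accelerometer-plus-convection-cell system is meant to probe.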

  20. Palaeo sea-level and ice-sheet databases: problems, strategies and perspectives

    NASA Astrophysics Data System (ADS)

    Rovere, Alessio; Düsterhus, André; Carlson, Anders; Barlow, Natasha; Bradwell, Tom; Dutton, Andrea; Gehrels, Roland; Hibbert, Fiona; Hijma, Marc; Horton, Benjamin; Klemann, Volker; Kopp, Robert; Sivan, Dorit; Tarasov, Lev; Törnqvist, Torbjorn

    2016-04-01

    Databases of palaeoclimate data have driven many major developments in understanding the Earth system. The measurement and interpretation of the palaeo sea-level and ice-sheet data that form such databases pose considerable challenges to the scientific communities that use them for further analyses. In this paper, we build on the experience of the PALSEA (PALeo constraints on SEA level rise) community, a working group within the PAGES (Past Global Changes) project, to describe the challenges and the best strategies that can be adopted to build a self-consistent and standardised database of geological and geochemical data related to palaeo sea levels and ice sheets. Our aim is to identify key points that need attention and subsequent funding when undertaking the task of database creation. We conclude that any sea-level or ice-sheet database must be divided into three instances: i) measurement; ii) interpretation; iii) database creation. Measurement should include position, age, description of geological features, and quantification of uncertainties, all described as objectively as possible. Interpretation can be subjective, but it should always include uncertainties and all possible interpretations, without unjustified a priori exclusions. We propose that the creation of a database adopt an approach based on Accessibility, Transparency, Trust, Availability, Continued updating, Completeness and Communication of content (ATTAC3). It is also essential to consider the community structure that creates and benefits from a database. We conclude that funding sources should consider addressing not only the creation of original data in specific, research-question-oriented projects, but also allowing part of the funding to be used for IT-related and database-creation tasks, which are essential to guarantee the accessibility and maintenance of the collected data.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ranganathan, V; Kumar, P; Bzdusek, K

Purpose: We propose a novel data-driven method to predict the achievability of clinical objectives upfront, before invoking the IMRT optimization. Methods: A new metric called “Geometric Complexity (GC)” is used to estimate the achievability of clinical objectives. Here, GC is a measure of the number of “unmodulated” beamlets or rays that intersect the region-of-interest (ROI) and the target volume. We first compute the geometric complexity ratio (GCratio) between the GC of an ROI (say, parotid) in a reference plan and the GC of the same ROI in a given plan. The GCratio of an ROI indicates the relative geometric complexity of the ROI as compared to the same ROI in the reference plan, and can therefore be used to predict whether a defined clinical objective associated with the ROI can be met by the optimizer for a given case. A higher GCratio indicates a lower likelihood that the optimizer will achieve the clinical objective defined for a given ROI; conversely, a lower GCratio indicates a higher likelihood. We evaluated the proposed method on four head-and-neck cases using the Pinnacle3 (version 9.10.0) Treatment Planning System (TPS). Results: Of the 28 clinical objectives from the four head-and-neck cases included in the study, 25 were in agreement with the prediction, an agreement of about 85% between predicted and obtained results. The Pearson correlation test shows a positive correlation between predicted and obtained results (correlation = 0.82, r2 = 0.64, p < 0.005). Conclusion: The study demonstrates the feasibility of the proposed method in head-and-neck cases for predicting the achievability of clinical objectives with reasonable accuracy.
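The GC metric described in the abstract can be sketched in a few lines. The voxel-set representation of beamlets, the ratio orientation (given plan over reference plan), and the decision threshold are assumptions chosen here for illustration, not the authors' implementation.

```python
def geometric_complexity(beamlets, roi, target):
    """Geometric complexity (GC): the number of unmodulated beamlets
    (rays) that intersect both the region-of-interest and the target.
    Beamlets and structures are modeled as sets of voxel ids (an
    illustrative simplification)."""
    return sum(1 for b in beamlets if b & roi and b & target)

def gc_ratio(gc_given, gc_reference):
    """GCratio of an ROI in a given plan relative to a reference plan."""
    if gc_reference == 0:
        raise ValueError("reference GC must be non-zero")
    return gc_given / gc_reference

def predict_achievable(ratio, threshold=1.0):
    """Assumed decision rule: a GCratio above a (hypothetical)
    threshold predicts the objective is unlikely to be met."""
    return ratio <= threshold

# Toy example: three beamlets, a parotid-like ROI, and a target volume.
roi = {1, 2, 3}
target = {10, 11}
beamlets = [{1, 10}, {2, 11}, {4, 10}]   # the third beamlet misses the ROI
gc = geometric_complexity(beamlets, roi, target)
print(gc)                 # 2 beamlets cross both structures
print(gc_ratio(gc, 4))    # 0.5: geometrically simpler than the reference
```

With a ratio of 0.5 relative to the reference, this toy ROI would be predicted achievable under the assumed threshold.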

  2. Abuse and Misuse of Selected Dietary Supplements Among Adolescents: a Look at Poison Center Data

    PubMed Central

    Morgan, Jill A.; Lardieri, Allison B.; Kishk, Omayma A.; Klein-Schwartz, Wendy

    2017-01-01

OBJECTIVE The use of dietary supplements has increased and is associated with adverse effects. Indications for use include recreation, body image concerns, mood enhancement, or control of medical conditions. The risk of adverse effects may be enhanced if agents are used improperly. The objective of this study was to determine the frequency of abuse and misuse of 4 dietary substances among adolescents reported nationally to poison centers. Secondary outcomes included an assessment of medical outcomes, clinical effects, location of treatments provided, and treatments administered. METHODS This descriptive retrospective review assessed data concerning the use of garcinia (Garcinia cambogia), guarana (Paullinia cupana), salvia (Salvia divinorum), and St John's wort (Hypericum perforatum) among adolescents reported nationally to poison centers from 2003 to 2014. Adolescents with a single-substance exposure to one of the substances of interest coded as intentional abuse or misuse were included. Poison center calls for drug information or those with unrelated clinical effects were excluded. Data were collected from the National Poison Data System. RESULTS There were 84 cases: 7 cases of Garcinia cambogia, 28 Paullinia cupana, 23 Salvia divinorum, and 26 Hypericum perforatum. Garcinia cambogia was used more frequently by females (100% versus 0%), and Paullinia cupana and Salvia divinorum were used more frequently by males (61% versus 36% and 91% versus 9%, respectively). Abuse, driven by Salvia divinorum, was more common overall than misuse. Abuse was also more common among males than females (p < 0.001). Use of these agents fluctuated over time. Overall, use trended down since 2010, except for Garcinia cambogia use. In 62 cases (73.8%), the medical outcome was minor, had no effect, or was judged nontoxic or minimally toxic. Clinical effects were most common with Paullinia cupana and Salvia divinorum.
Treatment sites included emergency department (n = 33; 39.3%), non-healthcare facility (n = 24; 28.6%), admission to a health care facility (n = 8; 9.5%), and other/unknown (n = 19; 22.6%). CONCLUSIONS Abuse and misuse of these dietary supplements was uncommon, and outcomes were mild. Further research should be performed to determine use and outcomes of abuse/misuse of other dietary supplements in this population. PMID:29290737

  3. Shindigs, brunches, and rodeos: the neural basis of event words.

    PubMed

    Bedny, Marina; Dravida, Swethasri; Saxe, Rebecca

    2014-09-01

    Events (e.g., "running" or "eating") constitute a basic type within human cognition and human language. We asked whether thinking about events, as compared to other conceptual categories, depends on partially independent neural circuits. Indirect evidence for this hypothesis comes from previous studies showing elevated posterior temporal responses to verbs, which typically label events. Neural responses to verbs could, however, be driven either by their grammatical or by their semantic properties. In the present experiment, we separated the effects of grammatical class (verb vs. noun) and semantic category (event vs. object) by measuring neural responses to event nouns (e.g., "the hurricane"). Participants rated the semantic relatedness of event nouns, as well as of two categories of object nouns-animals (e.g., "the alligator") and plants (e.g., "the acorn")-and three categories of verbs-manner of motion (e.g., "to roll"), emission (e.g., "to sparkle"), and perception (e.g., "to gaze"). As has previously been observed, we found larger responses to verbs than to object nouns in the left posterior middle (LMTG) and superior (LSTG) temporal gyri. Crucially, we also found that the LMTG responds more to event than to object nouns. These data suggest that part of the posterior lateral temporal response to verbs is driven by their semantic properties. By contrast, a more superior region, at the junction of the temporal and parietal cortices, responded more to verbs than to all nouns, irrespective of their semantic category. We concluded that the neural mechanisms engaged when thinking about event and object categories are partially dissociable.

  4. Prototype Development: Context-Driven Dynamic XML Ophthalmologic Data Capture Application.

    PubMed

    Peissig, Peggy; Schwei, Kelsey M; Kadolph, Christopher; Finamore, Joseph; Cancel, Efrain; McCarty, Catherine A; Okorie, Asha; Thomas, Kate L; Allen Pacheco, Jennifer; Pathak, Jyotishman; Ellis, Stephen B; Denny, Joshua C; Rasmussen, Luke V; Tromp, Gerard; Williams, Marc S; Vrabec, Tamara R; Brilliant, Murray H

    2017-09-13

    The capture and integration of structured ophthalmologic data into electronic health records (EHRs) has historically been a challenge. However, the importance of this activity for patient care and research is critical. The purpose of this study was to develop a prototype of a context-driven dynamic extensible markup language (XML) ophthalmologic data capture application for research and clinical care that could be easily integrated into an EHR system. Stakeholders in the medical, research, and informatics fields were interviewed and surveyed to determine data and system requirements for ophthalmologic data capture. On the basis of these requirements, an ophthalmology data capture application was developed to collect and store discrete data elements with important graphical information. The context-driven data entry application supports several features, including ink-over drawing capability for documenting eye abnormalities, context-based Web controls that guide data entry based on preestablished dependencies, and an adaptable database or XML schema that stores Web form specifications and allows for immediate changes in form layout or content. The application utilizes Web services to enable data integration with a variety of EHRs for retrieval and storage of patient data. This paper describes the development process used to create a context-driven dynamic XML data capture application for optometry and ophthalmology. The list of ophthalmologic data elements identified as important for care and research can be used as a baseline list for future ophthalmologic data collection activities. ©Peggy Peissig, Kelsey M Schwei, Christopher Kadolph, Joseph Finamore, Efrain Cancel, Catherine A McCarty, Asha Okorie, Kate L Thomas, Jennifer Allen Pacheco, Jyotishman Pathak, Stephen B Ellis, Joshua C Denny, Luke V Rasmussen, Gerard Tromp, Marc S Williams, Tamara R Vrabec, Murray H Brilliant. 
Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 13.09.2017.

  5. Contrasting analytical and data-driven frameworks for radiogenomic modeling of normal tissue toxicities in prostate cancer.

    PubMed

    Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam

    2015-04-01

We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade⩾3) and ED (Grade⩾1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) modeling and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED. As a proof of concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
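The paper's generalized LKB model extends the standard Lyman-Kutcher-Burman NTCP formula with genetic variables; a minimal sketch of the standard (non-generalized) formula, reducing a differential dose-volume histogram to a gEUD, might look like the following. The parameter values shown are placeholders, not fitted values from the study.

```python
import math

def geud(dose_bins, volume_fracs, n):
    """Generalized equivalent uniform dose (gEUD) from a differential
    DVH: (sum_i v_i * D_i^(1/n))^n, with volume fractions summing to 1."""
    return sum(v * d ** (1.0 / n) for d, v in zip(dose_bins, volume_fracs)) ** n

def lkb_ntcp(dose_bins, volume_fracs, td50, m, n):
    """LKB normal tissue complication probability: the standard normal
    CDF evaluated at t = (gEUD - TD50) / (m * TD50)."""
    t = (geud(dose_bins, volume_fracs, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# A uniform dose equal to TD50 yields NTCP = 0.5 by construction.
print(lkb_ntcp([50.0], [1.0], td50=50.0, m=0.15, n=0.1))  # 0.5
```

The "generalized" variants in the paper make TD50 (and potentially m) a function of the biological covariates; the probit form above stays the same.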

  6. Social comparison modulates reward-driven attentional capture.

    PubMed

    Jiao, Jun; Du, Feng; He, Xiaosong; Zhang, Kan

    2015-10-01

    It is well established that attention can be captured by task irrelevant and non-salient objects associated with value through reward learning. However, it is unknown whether social comparison influences reward-driven attentional capture. The present study created four social contexts to examine whether different social comparisons modulate the reward-driven capture of attention. The results showed that reward-driven attentional capture varied with different social comparison conditions. Most prominently, reward-driven attentional capture is dramatically reduced in the disadvantageous social comparison context, in which an individual is informed that the other participant is earning more monetary reward for performing the same task. These findings suggest that social comparison can affect the reward-driven capture of attention.

  7. Georgia concrete pavement performance and longevity.

    DOT National Transportation Integrated Search

    2012-02-01

    The Georgia Department of Transportation (GDOT) has effectively utilized its pavement management system (PMS) to make informed, data-driven pavement maintenance decisions, including project selection, project prioritization, and funding allocation. C...

  8. Driving Ms. Data: Creating Data-Driven Possibilities

    ERIC Educational Resources Information Center

    Hoffman, Richard

    2005-01-01

This article describes how data-driven Web sites help schools and districts maximize their IT resources by making online content more "self-service" for users. It shows how to set up the capacity to create data-driven sites. By definition, a data-driven Web site is one in which the content comes from some back-end data source, such as a…

  9. Financial Summary, Nanofiltration Data, and Lithium Uptake Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jay Renew

    Integrated testing of nanofiltration and lithium uptake subsystems using synthetic geothermal brine. Also includes a financial summary (Pro Forma) of the proposed 'Geothermal Thermoelectric Generation (G-TEG) with Integrated Temperature Driven Membrane Distillation and Novel Manganese Oxide for Lithium Extraction' (first pass 500 gpm).

  10. Data-Driven Surface Traversability Analysis for Mars 2020 Landing Site Selection

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Rothrock, Brandon; Almeida, Eduardo; Ansar, Adnan; Otero, Richard; Huertas, Andres; Heverly, Matthew

    2015-01-01

The objective of this paper is three-fold: 1) to describe the engineering challenges in the surface mobility of the Mars 2020 Rover mission that are considered in the landing site selection process, 2) to introduce new automated traversability analysis capabilities, and 3) to present the preliminary analysis results for top candidate landing sites. The analysis capabilities presented in this paper include automated terrain classification, automated rock detection, digital elevation model (DEM) generation, and multi-ROI (region of interest) route planning. These capabilities make it possible to fully utilize the vast volume of high-resolution orbiter imagery, quantitatively evaluate surface mobility requirements for each candidate site, and remove subjectivity from the comparison between sites in terms of engineering considerations. The analysis results supported the discussion in the Second Landing Site Workshop held in August 2015, which resulted in the selection of eight candidate sites to be considered in the third workshop.

  11. A knowledge-based support system for mechanical ventilation of the lungs. The KUSIVAR concept and prototype.

    PubMed

    Rudowski, R; Frostell, C; Gill, H

    1989-09-01

KUSIVAR is an expert system for mechanical ventilation of adult patients suffering from respiratory insufficiency. Its main objective is to provide guidance in respirator management. The knowledge base includes both qualitative, rule-based knowledge and quantitative knowledge expressed in the form of mathematical models (expert control), used for prediction of arterial gas tensions and for optimization. The system is data driven and uses a forward-chaining mechanism for rule invocation. Interaction with the user will be performed in advisory, critiquing, semi-automatic, and automatic modes. The system is at present in an advanced prototype stage. Prototyping is performed using KEE (Knowledge Engineering Environment) on a Sperry Explorer workstation. For further development and clinical use, the expert system will be downloaded to an advanced PC. The system is intended to support therapy with a Siemens-Elema Servoventilator 900 C.

  12. Algebraic reasoning for the enhancement of data-driven building reconstructions

    NASA Astrophysics Data System (ADS)

    Meidow, Jochen; Hammer, Horst

    2016-04-01

    Data-driven approaches for the reconstruction of buildings feature the flexibility needed to capture objects of arbitrary shape. To recognize man-made structures, geometric relations such as orthogonality or parallelism have to be detected. These constraints are typically formulated as sets of multivariate polynomials. For the enforcement of the constraints within an adjustment process, a set of independent and consistent geometric constraints has to be determined. Gröbner bases are an ideal tool to identify such sets exactly. A complete workflow for geometric reasoning is presented to obtain boundary representations of solids based on given point clouds. The constraints are formulated in homogeneous coordinates, which results in simple polynomials suitable for the successful derivation of Gröbner bases for algebraic reasoning. Strategies for the reduction of the algebraical complexity are presented. To enforce the constraints, an adjustment model is introduced, which is able to cope with homogeneous coordinates along with their singular covariance matrices. The feasibility and the potential of the approach are demonstrated by the analysis of a real data set.
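The multivariate polynomial constraints mentioned in the abstract can be made concrete. For two planes in homogeneous coordinates, the usual orthogonality and parallelism relations are the bilinear polynomials below; the symbol names are chosen here for illustration, consistent with (but not taken from) the paper's formulation. A Gröbner basis of the ideal generated by all such constraint polynomials then exposes which of them are algebraically dependent, yielding the independent, consistent subset enforced in the adjustment.

```latex
% Two planes in homogeneous coordinates,
% A = (a_1, b_1, c_1, d_1)^T and B = (a_2, b_2, c_2, d_2)^T,
% where a point x = (x, y, z, w)^T lies on A iff A^T x = 0.
\begin{align*}
  \text{orthogonality } A \perp B:\quad
    & a_1 a_2 + b_1 b_2 + c_1 c_2 = 0 \\
  \text{parallelism } A \parallel B
  \text{ (vanishing } 2\times 2 \text{ minors of the normals)}:\quad
    & a_1 b_2 - a_2 b_1 = 0,\quad
      b_1 c_2 - b_2 c_1 = 0,\quad
      c_1 a_2 - c_2 a_1 = 0
\end{align*}
```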

  13. Modeling radiation belt electron dynamics during GEM challenge intervals with the DREAM3D diffusion model

    NASA Astrophysics Data System (ADS)

    Tu, Weichao; Cunningham, G. S.; Chen, Y.; Henderson, M. G.; Camporeale, E.; Reeves, G. D.

    2013-10-01

In response to the Geospace Environment Modeling (GEM) "Global Radiation Belt Modeling Challenge," a 3D diffusion model is used to simulate the radiation belt electron dynamics during two intervals of the Combined Release and Radiation Effects Satellite (CRRES) mission, 15 August to 15 October 1990 and 1 February to 31 July 1991. The 3D diffusion model, developed as part of the Dynamic Radiation Environment Assimilation Model (DREAM) project, includes radial, pitch angle, and momentum diffusion and mixed pitch angle-momentum diffusion, driven by dynamic wave databases from the statistical CRRES wave data, including plasmaspheric hiss, lower-band, and upper-band chorus. By comparing the DREAM3D model outputs to the CRRES electron phase space density (PSD) data, we find that, with a data-driven boundary condition at Lmax = 5.5, the electron enhancements can generally be explained by radial diffusion, though additional local heating from chorus waves is required. Because the PSD reductions are included in the boundary condition at Lmax = 5.5, our model captures the fast electron dropouts over a large L range, producing better model performance compared to previously published results. Plasmaspheric hiss produces electron losses inside the plasmasphere, but the model still sometimes overestimates the PSD there. Test simulations using reduced radial diffusion coefficients or increased pitch angle diffusion coefficients inside the plasmasphere suggest that better wave models and more realistic radial diffusion coefficients, both inside and outside the plasmasphere, are needed to improve the model performance. Statistically, the results show that, with the data-driven outer boundary condition, including radial diffusion and plasmaspheric hiss is sufficient to model the electrons during geomagnetically quiet times, but to best capture the radiation belt variations during active times, pitch angle and momentum diffusion from chorus waves are required.

  14. Big Data in Medicine is Driving Big Changes

    PubMed Central

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  15. Teachers' Intentions to Use National Literacy and Numeracy Assessment Data: A Pilot Study

    ERIC Educational Resources Information Center

    Pierce, Robyn; Chick, Helen

    2011-01-01

    In recent years the educational policy environment has emphasised data-driven change. This has increased the expectation for school personnel to use statistical information to inform their programs and to improve teaching practices. Such data include system reports of student achievement tests and socio-economic profiles provided to schools by…

  16. Between Oais and Agile a Dynamic Data Management Approach

    NASA Astrophysics Data System (ADS)

    Bennett, V. L.; Conway, E. A.; Waterfall, A. M.; Pepler, S.

    2015-12-01

In this paper we describe an approach to the integration of existing archival activities which lies between compliance with the more rigid OAIS/TRAC standards and a more flexible "Agile" approach to the curation and preservation of Earth Observation data. We provide a high-level overview of existing practice and discuss how these procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. While processes are considered, they are not statically defined but rather driven by human interactions in the form of risk management/review procedures that produce actionable plans, which are responsive to change. We then proceed by describing the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 Petabytes of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the format-based risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We continue by presenting a dynamic data management information model and discuss the following core model entities and their relationships: Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives. Risk entities, which act as drivers for change within the data lifecycle; these include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks. Plan entities, which detail the actions to bring about change within an archive.
These include Acquisition Plans, Preservation Plans and Monitoring Plans, which support responsive interactions with the community. The Result entities describe the outcomes of the plans; these include Acquisitions, Mitigations and Accepted Risks, with risk acceptance permitting imperfect but functional solutions that can be realistically supported within an archive's resource levels.

  17. Data-driven Science in Geochemistry & Petrology: Vision & Reality

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Ghiorso, M. S.; Spear, F. S.

    2013-12-01

Science in many fields is increasingly 'data-driven'. Though referred to as a 'new' Fourth Paradigm (Hey, 2009), data-driven science is not new, and examples are cited in the Geochemical Society's data policy, including the compilation of Dziewonski & Anderson (1981) that led to PREM, and Zindler & Hart (1986), who compiled mantle isotope data to present for the first time a comprehensive view of the Earth's mantle. Today, rapidly growing data volumes, ubiquity of data access, and new computational and information management technologies enable data-driven science at a radically advanced scale of speed, extent, flexibility, and inclusiveness, with the ability to seamlessly synthesize observations, experiments, theory, and computation, and to statistically mine data across disciplines, leading to more comprehensive, well-informed, and high-impact scientific advances. Are geochemists, petrologists, and volcanologists ready to participate in this revolution of the scientific process? In the past year, researchers from the VGP community and related disciplines have come together at several cyberinfrastructure-related workshops, in part prompted by the EarthCube initiative of the US NSF, to evaluate the status of cyberinfrastructure in their field, to put forth key scientific challenges, and to identify primary data and software needs to address these. Science scenarios developed by workshop participants, ranging from non-equilibrium experiments focusing on mass transport, chemical reactions, and phase transformations (J. Hammer) to defining the abundance of elements and isotopes in every voxel in the Earth (W. McDonough), demonstrate the potential of cyberinfrastructure-enabled science, and define the vision of how data access, visualization, analysis, computation, and cross-domain interoperability can and should support future research in VGP.
The primary obstacle for data-driven science in VGP remains the dearth of accessible, integrated data from lab and sensor measurements, experiments, and models, both from past and from present studies, and their poor discoverability, interoperability, and standardization. Other deficiencies include the lack of widespread sample curation and online sample catalogs, of broad community support and enforcement of open data sharing policies, and of a strategy for sustained funding and operation of the cyberinfrastructure. In order to achieve true data-driven science in geochemistry and petrology, one of the primary requirements is to change the way data and models are managed and shared, to dramatically improve their access and re-usability. Adoption of new data publication practices, new ways of citing data that ensure attribution and credit to authors, tools that help investigators seamlessly manage their data throughout the data life cycle, from the point of acquisition to upload to repositories, and population of databases with historical data are among the most urgent needs. The community, especially early career scientists, must work together to produce the cultural shift within the discipline toward sharing of data and knowledge, virtual collaboration, and social networking. Dziewonski, A M, & Anderson, D L: Physics of the Earth and Planetary Interiors 25 (4), 297 (1981) Hey, T, Tansley, S, Tolle, K (Eds.): Redmond, WA: Microsoft Research (2009) Zindler, A, & Hart, S R: Ann. Rev. Earth Plan. Sci. 14, 493 (1986)

  18. Task-based data-acquisition optimization for sparse image reconstruction systems

    NASA Astrophysics Data System (ADS)

    Chen, Yujia; Lou, Yang; Kupinski, Matthew A.; Anastasio, Mark A.

    2017-03-01

    Conventional wisdom dictates that imaging hardware should be optimized by use of an ideal observer (IO) that exploits full statistical knowledge of the class of objects to be imaged, without consideration of the reconstruction method to be employed. However, accurate and tractable models of the complete object statistics are often difficult to determine in practice. Moreover, in imaging systems that employ compressive sensing concepts, imaging hardware and (sparse) image reconstruction are innately coupled technologies. We have previously proposed a sparsity-driven ideal observer (SDIO) that can be employed to optimize hardware by use of a stochastic object model that describes object sparsity. The SDIO and sparse reconstruction method can therefore be "matched" in the sense that they both utilize the same statistical information regarding the class of objects to be imaged. To efficiently compute SDIO performance, the posterior distribution is estimated by use of computational tools developed recently for variational Bayesian inference. Subsequently, the SDIO test statistic can be computed semi-analytically. The advantages of employing the SDIO instead of a Hotelling observer are systematically demonstrated in case studies in which magnetic resonance imaging (MRI) data acquisition schemes are optimized for signal detection tasks.

  19. Quantum correlations from a room-temperature optomechanical cavity

    NASA Astrophysics Data System (ADS)

    Purdy, T. P.; Grutter, K. E.; Srinivasan, K.; Taylor, J. M.

    2017-06-01

    The act of position measurement alters the motion of an object being measured. This quantum measurement backaction is typically much smaller than the thermal motion of a room-temperature object and thus difficult to observe. By shining laser light through a nanomechanical beam, we measure the beam’s thermally driven vibrations and perturb its motion with optical force fluctuations at a level dictated by the Heisenberg measurement-disturbance uncertainty relation. We demonstrate a cross-correlation technique to distinguish optically driven motion from thermally driven motion, observing this quantum backaction signature up to room temperature. We use the scale of the quantum correlations, which is determined by fundamental constants, to gauge the size of thermal motion, demonstrating a path toward absolute thermometry with quantum mechanically calibrated ticks.
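The cross-correlation idea can be illustrated with a generic normalized cross-correlation estimator at a given lag. This is a textbook sketch of the statistic, not the authors' calibrated quantum-backaction estimator; the function name and interface are chosen here for illustration.

```python
import math

def xcorr(x, y, lag=0):
    """Normalized cross-correlation of two equal-length records at an
    integer lag; returns a value in [-1, 1]."""
    n = len(x)
    xs = x[:n - lag] if lag >= 0 else x[-lag:]
    ys = y[lag:] if lag >= 0 else y[:n + lag]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = math.sqrt(sum((a - mx) ** 2 for a in xs) *
                    sum((b - my) ** 2 for b in ys))
    return num / den

# A record correlated with itself at zero lag gives ~1.0.
print(xcorr([0.1, 0.5, -0.2, 0.4], [0.1, 0.5, -0.2, 0.4]))
```

In the experiment, correlating two measurement records of the same mechanical motion lets a common optically driven component stand out against uncorrelated detection noise.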

  20. Data-Driven Identification of Risk Factors of Patient Satisfaction at a Large Urban Academic Medical Center.

    PubMed

    Li, Li; Lee, Nathan J; Glicksberg, Benjamin S; Radbill, Brian D; Dudley, Joel T

    2016-01-01

    The Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey is the first publicly reported nationwide survey to evaluate and compare hospitals. Increasing patient satisfaction is an important goal as it aims to achieve a more effective and efficient healthcare delivery system. In this study, we develop and apply an integrative, data-driven approach to identify clinical risk factors that associate with patient satisfaction outcomes. We included 1,771 unique adult patients who completed the HCAHPS survey and were discharged from the inpatient Medicine service from 2010 to 2012. We collected 266 clinical features including patient demographics, lab measurements, medications, disease categories, and procedures. We developed and applied a data-driven approach to identify risk factors that associate with patient satisfaction outcomes. We identify 102 significant risk factors associating with 18 surveyed questions. The most significantly recurrent clinical risk factors were: self-evaluation of health, education level, Asian, White, treatment in BMT oncology division, being prescribed a new medication. Patients who were prescribed pregabalin were less satisfied particularly in relation to communication with nurses and pain management. Explanation of medication usage was associated with communication with nurses (q = 0.001); however, explanation of medication side effects was associated with communication with doctors (q = 0.003). Overall hospital rating was associated with hospital environment, communication with doctors, and communication about medicines. However, patient likelihood to recommend hospital was associated with hospital environment, communication about medicines, pain management, and communication with nurse. Our study identified a number of putatively novel clinical risk factors for patient satisfaction that suggest new opportunities to better understand and manage patient satisfaction. 
Hospitals can use a data-driven approach to identify clinical risk factors for poor patient satisfaction to support development of specific interventions to improve patients' experience of care.
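The kind of feature-by-feature screen with FDR-adjusted q-values described above can be sketched as follows. This is an illustrative reconstruction with synthetic data and a plain two-sample t-test, not the authors' actual features or models:

```python
# Hypothetical risk-factor screen: test each clinical feature against a
# binary satisfaction outcome, then control the false discovery rate so
# results can be reported as q-values, as in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_patients, n_features = 200, 20
X = rng.normal(size=(n_patients, n_features))     # synthetic clinical features
y = (X[:, 0] + rng.normal(size=n_patients)) > 0   # outcome; feature 0 is truly associated

# Per-feature p-values from a two-sample t-test (satisfied vs. not satisfied)
pvals = np.array([stats.ttest_ind(X[y, j], X[~y, j]).pvalue
                  for j in range(n_features)])

# Benjamini-Hochberg q-values
order = np.argsort(pvals)
ranked = pvals[order] * n_features / (np.arange(n_features) + 1)
qvals = np.empty_like(pvals)
qvals[order] = np.minimum.accumulate(ranked[::-1])[::-1]
qvals = np.minimum(qvals, 1.0)

significant = np.flatnonzero(qvals < 0.05)        # indices of significant risk factors
```

In this toy setup the truly associated feature (index 0) survives the correction; the real study would substitute the actual clinical features and appropriate per-feature tests.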

  1. Action-Driven Visual Object Tracking With Deep Reinforcement Learning.

    PubMed

    Yun, Sangdoo; Choi, Jongwon; Yoo, Youngjoon; Yun, Kimin; Choi, Jin Young

    2018-06-01

In this paper, we propose an efficient visual tracker, which directly captures a bounding box containing the target object in a video by means of sequential actions learned using deep neural networks. The proposed deep neural network to control tracking actions is pretrained using various training video sequences and fine-tuned during actual tracking for online adaptation to changes of target and background. The pretraining is done by utilizing deep reinforcement learning (RL) as well as supervised learning. The use of RL enables even partially labeled data to be successfully utilized for semisupervised learning. Through evaluation on the object tracking benchmark data set, the proposed tracker is validated to achieve competitive performance at three times the speed of existing deep network-based trackers. The fast version of the proposed method, which operates in real time on a graphics processing unit, outperforms state-of-the-art real-time trackers with an accuracy improvement of more than 8%.

  2. Perceptual basis of evolving Western musical styles

    PubMed Central

    Rodriguez Zivic, Pablo H.; Shifres, Favio; Cecchi, Guillermo A.

    2013-01-01

The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of a melodic interval, namely, the distance between two consecutive notes within its melodic context. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation. PMID:23716669

  3. Efficiency disparities among community hospitals in Tennessee: do size, location, ownership, and network matter?

    PubMed

    Roh, Chul-Young; Moon, M Jae; Jung, Kwangho

    2013-11-01

This study examined the impact of ownership, size, location, and network on the relative technical efficiency of community hospitals in Tennessee for the 2002-2006 period, by applying data envelopment analysis (DEA) to measure technical efficiency (decomposed into scale efficiency and pure technical efficiency). Data envelopment analysis results indicate that medium-size hospitals (126-250 beds) are more efficient than their counterparts. Interestingly, public hospitals are significantly more efficient than private and nonprofit hospitals in Tennessee, and rural hospitals are more efficient than urban hospitals. This is the first study to investigate whether hospital networks with other health care providers affect hospital efficiency. Results indicate that community hospitals with networks are more efficient than non-network hospitals. From a management and policy perspective, this study suggests that public policies should induce hospitals to downsize or upsize toward their optimal size, and that private and nonprofit hospitals should change their organizational objectives from profit-driven to quality-driven.
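The DEA technical-efficiency score underlying this comparison can be illustrated with a minimal input-oriented CCR model solved as a linear program. The three toy "hospitals" below are assumptions for illustration, not the study's Tennessee data:

```python
# Minimal input-oriented CCR DEA sketch (not the authors' code):
# each DMU's efficiency is the smallest theta such that some convex-cone
# mix of peers uses at most theta times its inputs and matches its outputs.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (e.g. hospitals); columns = inputs (e.g. beds, staff) / outputs
X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0]])   # inputs
Y = np.array([[1.0], [1.0], [1.0]])                  # outputs

def ccr_efficiency(o):
    """Solve: min theta  s.t.  sum_j lam_j x_ij <= theta x_io,
    sum_j lam_j y_rj >= y_ro, lam >= 0. Variables are [theta, lam_1..lam_n]."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                           # minimize theta
    A_in = np.hstack([-X[o][:, None], X.T])               # input constraints
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])  # output constraints
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[1]), -Y[o]]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

scores = [ccr_efficiency(o) for o in range(X.shape[0])]
```

Here DMUs 0 and 1 lie on the efficient frontier (score 1), while DMU 2 is dominated by a mix of its peers and scores below 1.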

  4. The Real Time Display Builder (RTDB)

    NASA Technical Reports Server (NTRS)

    Kindred, Erick D.; Bailey, Samuel A., Jr.

    1989-01-01

    The Real Time Display Builder (RTDB) is a prototype interactive graphics tool that builds logic-driven displays. These displays reflect current system status, implement fault detection algorithms in real time, and incorporate the operational knowledge of experienced flight controllers. RTDB utilizes an object-oriented approach that integrates the display symbols with the underlying operational logic. This approach allows the user to specify the screen layout and the driving logic as the display is being built. RTDB is being developed under UNIX in C utilizing the MASSCOMP graphics environment with appropriate functional separation to ease portability to other graphics environments. RTDB grew from the need to develop customized real-time data-driven Space Shuttle systems displays. One display, using initial functionality of the tool, was operational during the orbit phase of STS-26 Discovery. RTDB is being used to produce subsequent displays for the Real Time Data System project currently under development within the Mission Operations Directorate at NASA/JSC. The features of the tool, its current state of development, and its applications are discussed.

  5. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.

  6. Intrinsic Bayesian Active Contours for Extraction of Object Boundaries in Images

    PubMed Central

    Srivastava, Anuj

    2010-01-01

    We present a framework for incorporating prior information about high-probability shapes in the process of contour extraction and object recognition in images. Here one studies shapes as elements of an infinite-dimensional, non-linear quotient space, and statistics of shapes are defined and computed intrinsically using differential geometry of this shape space. Prior models on shapes are constructed using probability distributions on tangent bundles of shape spaces. Similar to the past work on active contours, where curves are driven by vector fields based on image gradients and roughness penalties, we incorporate the prior shape knowledge in the form of vector fields on curves. Through experimental results, we demonstrate the use of prior shape models in the estimation of object boundaries, and their success in handling partial obscuration and missing data. Furthermore, we describe the use of this framework in shape-based object recognition or classification. PMID:21076692

  7. Balancing the popularity bias of object similarities for personalised recommendation

    NASA Astrophysics Data System (ADS)

    Hou, Lei; Pan, Xue; Liu, Kecheng

    2018-03-01

Network-based similarity measures have found wide applications in recommendation algorithms and made significant contributions to uncovering users' potential interests. However, existing measures are generally biased in terms of popularity, in that popular objects tend to have more common neighbours with other objects and are thus considered more similar to them. This popularity bias in similarity quantification results in biased recommendations, with either poor accuracy or poor diversity. Based on the bipartite network modelling of the user-object interactions, this paper first calculates the expected number of common neighbours of two objects with given popularities in random networks. A Balanced Common Neighbour similarity index is accordingly developed by removing the random-driven common neighbours, estimated as the expected number, from the total number. Recommendation experiments on three data sets show that balancing the popularity bias to a certain degree can significantly improve the recommendations' accuracy and diversity simultaneously.
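The core of the balanced index can be sketched from the description above: in a random bipartite network, two objects with popularities k_a and k_b among M users share roughly k_a * k_b / M neighbours by chance, and this expected count is subtracted from the observed one. The toy data and the exact chance-level formula are assumptions for illustration:

```python
# Popularity-corrected common-neighbour similarity on a toy bipartite network.
import numpy as np

# user-object adjacency matrix: A[u, o] = 1 if user u collected object o
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)
M = A.shape[0]                      # number of users
k = A.sum(axis=0)                   # object popularities (degrees)

common = A.T @ A                    # observed common-neighbour counts
expected = np.outer(k, k) / M       # chance level under random attachment
balanced = common - expected        # popularity-corrected similarity
np.fill_diagonal(balanced, 0.0)
```

Objects whose co-occurrence merely reflects their popularity end up near (or below) zero, while genuinely related objects keep a positive score.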

  8. Object localization in handheld thermal images for fireground understanding

    NASA Astrophysics Data System (ADS)

    Vandecasteele, Florian; Merci, Bart; Jalalvand, Azarakhsh; Verstockt, Steven

    2017-05-01

Despite the broad application of handheld thermal imaging cameras in firefighting, their usage is mostly limited to subjective interpretation by the person carrying the device. As a remedy for this limitation, object localization and classification mechanisms could assist fireground understanding and help with the automated localization, characterization and spatio-temporal (spreading) analysis of the fire. An automated understanding of thermal images can enrich conventional knowledge-based firefighting techniques by providing information from data- and sensing-driven approaches. In this work, transfer learning is applied to multi-labeling convolutional neural network architectures for object localization and recognition in monocular visual, infrared and multispectral dynamic images. Furthermore, the possibility of analyzing fire scene images is studied and the current limitations are discussed. Finally, the understanding of room configuration (i.e., object locations) for indoor localization in reduced-visibility environments and the linking with Building Information Models (BIM) are investigated.

  9. Data-driven modelling of social forces and collective behaviour in zebrafish.

    PubMed

    Zienkiewicz, Adam K; Ladu, Fabrizio; Barton, David A W; Porfiri, Maurizio; Bernardo, Mario Di

    2018-04-14

    Zebrafish are rapidly emerging as a powerful model organism in hypothesis-driven studies targeting a number of functional and dysfunctional processes. Mathematical models of zebrafish behaviour can inform the design of experiments, through the unprecedented ability to perform pilot trials on a computer. At the same time, in-silico experiments could help refining the analysis of real data, by enabling the systematic investigation of key neurobehavioural factors. Here, we establish a data-driven model of zebrafish social interaction. Specifically, we derive a set of interaction rules to capture the primary response mechanisms which have been observed experimentally. Contrary to previous studies, we include dynamic speed regulation in addition to turning responses, which together provide attractive, repulsive and alignment interactions between individuals. The resulting multi-agent model provides a novel, bottom-up framework to describe both the spontaneous motion and individual-level interaction dynamics of zebrafish, inferred directly from experimental observations. Copyright © 2018 Elsevier Ltd. All rights reserved.
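A toy version of the attraction, repulsion and alignment interactions described above can be sketched as a simple multi-agent update. The parameters and functional forms below are illustrative guesses, not the calibrated zebrafish model (which also regulates speed dynamically from data):

```python
# Generic zonal interaction sketch: repel from very close neighbours,
# attract to and align with mid-range neighbours. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
pos = rng.uniform(0.0, 10.0, size=(5, 2))   # agent positions in the tank plane
vel = rng.normal(size=(5, 2))               # agent velocities

def step(pos, vel, dt=0.1, r_rep=1.0, r_att=5.0):
    """One Euler step of the zonal rules for every agent."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        force = np.zeros(2)
        for j in range(len(pos)):
            if i == j:
                continue
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            if dist < 1e-9:
                continue
            if dist < r_rep:                    # repulsion zone
                force -= d / dist
            elif dist < r_att:                  # attraction + alignment zone
                force += 0.1 * d / dist + 0.1 * (vel[j] - vel[i])
        new_vel[i] += dt * force
    return pos + dt * new_vel, new_vel

pos2, vel2 = step(pos, vel)
```

The data-driven step in the paper is, in effect, inferring the response functions inside `step` from tracked trajectories rather than postulating them as done here.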

  10. Real World Data Driven Evolution of Volvo Cars’ Side Impact Protection Systems and their Effectiveness

    PubMed Central

    Jakobsson, Lotta; Lindman, Magdalena; Svanberg, Bo; Carlsson, Henrik

    2010-01-01

This study analyses the outcome of continuously improved occupant protection over the last two decades for front-seat near-side occupants in side impacts, based on a real-world-driven working process. The effectiveness of four generations of improved side impact protection is calculated based on data from Volvo Cars' statistical accident database in Sweden. Generation I includes vehicles with a new structural and interior concept (SIPS). Generation II includes vehicles with structural improvements and a new chest airbag (SIPSbag). Generation III includes vehicles with further improved SIPS and SIPSbag as well as the new concept of a head-protecting Inflatable Curtain (IC). Generation IV includes the most recent vehicles with further improvements of all the systems plus advanced sensors and seat belt pretensioner activation. Compared to baseline vehicles, vehicles of generation I reduce MAIS2+ injuries by 54%, generation II by 61% and generation III by 72%. For generation IV, effectiveness figures cannot be calculated because of the lack of MAIS2+ injuries. A continuously improved performance is also seen when studying the AIS2+ pelvis, abdomen, chest and head injuries separately. By using the same real-world-driven working process, future improvements and possibly new passive as well as active safety systems will be developed with the aim of further improving protection for near-side occupants in side impacts. PMID:21050597

  11. General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark

    2010-01-01

    Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
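The IMS approach of characterizing nominal behaviour from archived data and flagging deviations can be caricatured in a few lines. The clustering and scoring below are a generic sketch under that idea, not NASA's implementation:

```python
# Data-driven monitoring sketch: learn "nominal" clusters from archived
# sensor vectors, then score new samples by distance to the nearest cluster.
import numpy as np

rng = np.random.default_rng(1)
nominal = rng.normal(0.0, 1.0, size=(500, 3))        # archived nominal sensor vectors

def kmeans(data, k=4, iters=50):
    """Plain Lloyd's algorithm, enough to characterize nominal behaviour."""
    centroids = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((data[:, None] - centroids) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    return centroids

centroids = kmeans(nominal)

def anomaly_score(x):
    """Distance from sample x to the nearest nominal cluster centre."""
    return np.sqrt(((centroids - x) ** 2).sum(-1)).min()

ok = anomaly_score(np.zeros(3))          # resembles nominal behaviour
fault = anomaly_score(np.full(3, 8.0))   # far from anything seen in training
```

In operation, a threshold on this score (tuned on held-out nominal data) would trigger the anomaly alerts described above.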

  12. Framework for developing hybrid process-driven, artificial neural network and regression models for salinity prediction in river systems

    NASA Astrophysics Data System (ADS)

    Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.

    2018-05-01

    Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. 
Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used to account for the flushing of the different floodplain storages. The resulting hybrid model performs very well on approximately 3 years of daily validation data, with a Nash-Sutcliffe efficiency (NSE) of 0.89 and a root mean squared error (RMSE) of 12.62 mg L-1 (over a range from approximately 50 to 250 mg L-1). Each component of the hybrid model results in noticeable improvements in model performance corresponding to the range of flows for which they are developed. The predictive performance of the hybrid model is significantly better than that of a benchmark process-driven model (NSE = -0.14, RMSE = 41.10 mg L-1, Gbench index = 0.90) and slightly better than that of a benchmark data-driven (ANN) model (NSE = 0.83, RMSE = 15.93 mg L-1, Gbench index = 0.36). Apart from improved predictive performance, the hybrid model also has advantages over the ANN benchmark model in terms of increased capacity for improving system understanding and greater ability to support management decisions.
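The skill scores quoted above (NSE, RMSE, and the Gbench index, which measures improvement over a benchmark model) are standard and easy to reproduce. The short series below is hypothetical:

```python
# Goodness-of-fit metrics used in the hybrid-model evaluation.
import numpy as np

def rmse(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 matches the mean benchmark."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

def gbench(obs, sim, bench):
    """Gbench index: 1 minus the model's squared error relative to a benchmark's."""
    obs, sim, bench = map(np.asarray, (obs, sim, bench))
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - bench) ** 2))

obs = np.array([50.0, 120.0, 200.0, 250.0])   # hypothetical salinity series (mg/L)
sim = np.array([55.0, 110.0, 205.0, 240.0])
```

Taking the observed mean as the benchmark makes Gbench reduce exactly to NSE, a convenient consistency check on the implementation.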

  13. Eyes On the Ground: Path Forward Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Brost, Randolph; Little, Charles Q.; Peter-Stein, Natacha

A previous report assesses our progress to date on the Eyes On the Ground project, and reviews lessons learned [1]. In this report, we address the implications of those lessons in defining the most productive path forward for the remainder of the project. We propose two main concepts: Interactive Diagnosis and Model-Driven Assistance. Among these, the Model-Driven Assistance concept appears the most promising. The Model-Driven Assistance concept is based on an approximate but useful model of a facility, which provides a unified representation for storing, viewing, and analyzing data that is known about the facility. This representation provides value to both inspectors and IAEA headquarters, and facilitates communication between the two. The concept further includes a lightweight, portable field tool to aid the inspector in executing a variety of inspection tasks, including capture of images and 3-d scan data. We develop a detailed description of this concept, including its system components, functionality, and example use cases. The envisioned tool would provide value by reducing inspector cognitive load, streamlining inspection tasks, and facilitating communication between the inspector and teams at IAEA headquarters. We conclude by enumerating the top implementation priorities to pursue in the remaining limited time of the project. Approved for public release; further dissemination unlimited.

  14. Implementing RDA Data Citation Recommendations: Case Study in South Africa

    NASA Astrophysics Data System (ADS)

    Hugo, Wim

    2016-04-01

SAEON operates a shared research data infrastructure for its own data sets and for clients and end users in the Earth and Environmental Sciences domain. SAEON has a license to issue Digital Object Identifiers via DataCite on behalf of third parties, and has recently concluded development work to make a universal data deposit, description, and DOI minting facility available. This facility will be used to develop a number of end user gateways, including DataCite South Africa (in collaboration with National Research Foundation and addressing all grant-funded research in the country), DIRISA (Data-intensive Research Infrastructure for South Africa - in collaboration with Meraka Institute and Department of Science and Technology), and SASDI (South African Spatial Data Infrastructure). The RDA recently published Data Citation Recommendations [1], and this was used as a basis for specification of Digital Object Identifier implementation, raising two significant challenges: 1. Synchronisation of frequently harvested meta-data sets where version management practice did not align with the RDA recommendations, and 2. Handling sub-sets of and queries on large, continuously updated data sets. In the first case, we have developed a set of tests that determine the logical course of action when discrepancies are found during synchronization, and we have incorporated these into meta-data harvester configurations. Additionally, we have developed a state diagram and attendant workflow for meta-data that includes problem states emanating from DOI management, reporting services for data depositors, and feedback to end users in respect of synchronisation issues. In the second case, in the absence of firm guidelines from DataCite, we are seeking community consensus and feedback on an approach that caches all queries performed and subsets derived from data, and provide these with anchor-style extensions linked to the dataset's original DOI. 
This allows extended DOIs to resolve to a meta-data page on which the cached data set is available as an anchored download link. All cached datasets are provided with checksum values to verify the contents against such copies as may exist. The paper reviews recent service-driven portal interface developments, both services and graphical user interfaces, including wizard-style, configurable applications for meta-data management and DOI minting, discovery, download, visualization, and reporting. It showcases examples of the two permanent identifier problem areas and how these were addressed. The paper concludes with contributions to open research questions, including (1) determining optimal meta-data granularity and (2) proposing an implementation guideline for extended DOIs. [1] A. Rauber, D. van Uytvanck, A. Asmi, S. Pröll, "Data Citation Recommendations", November 2015, RDA. https://rd-alliance.org/group/data-citation-wg/outcomes/data-citation-recommendation.htm

  15. Data driven approaches vs. qualitative approaches in climate change impact and vulnerability assessment.

    NASA Astrophysics Data System (ADS)

    Zebisch, Marc; Schneiderbauer, Stefan; Petitta, Marcello

    2015-04-01

In the last decade the scope of climate change science has broadened significantly. Fifteen years ago the focus was mainly on understanding climate change, providing climate change scenarios and giving ideas about potential climate change impacts. Today, adaptation to climate change has become an increasingly important field of politics, and one role of science is to inform and consult this process. Climate change science therefore no longer focuses on data-driven approaches only (such as climate or climate impact models) but progressively applies and relies on qualitative approaches, including opinion and expertise acquired through interactive processes with local stakeholders and decision makers. Furthermore, climate change science is facing the challenge of normative questions, such as 'how important is a decrease of yield in a developed country where agriculture only represents 3% of the GDP and the supply of agricultural products is strongly linked to global markets and less dependent on local production?'. In this talk we will present examples from various applied research and consultancy projects on climate change vulnerabilities, ranging from data-driven methods (e.g. remote sensing and modelling) to semi-quantitative and qualitative assessment approaches. Furthermore, we will discuss bottlenecks, pitfalls and opportunities in transferring climate change science to policy- and decision-maker-oriented climate services.

  16. Pharmacist-driven antimicrobial stewardship in intensive care units in East China: A multicenter prospective cohort study.

    PubMed

    Li, Zhongwang; Cheng, Baoli; Zhang, Kai; Xie, Guohao; Wang, Yan; Hou, Jinchao; Chu, Lihua; Zhao, Jialian; Xu, Zhijun; Lu, Zhongqiu; Sun, Huaqin; Zhang, Jian; Wang, Zhiyi; Wu, Haiya; Fang, Xiangming

    2017-09-01

    Antimicrobial stewardship programs, particularly pharmacist-driven programs, help reduce the unnecessary use of antimicrobial agents. The objective of this study was to assess the influence of pharmacist-driven antimicrobial stewardship on antimicrobial use, multidrug resistance, and patient outcomes in adult intensive care units in China. We conducted a multicenter prospective cohort study with a sample of 577 patients. A total of 353 patients were included under a pharmacist-driven antimicrobial stewardship program, whereas the remaining 224 patients served as controls. The primary outcome was all-cause hospital mortality. The pharmacist-driven antimicrobial stewardship program had a lower hospital mortality rate compared with the nonpharmacist program (19.3% vs 29.0%; P = .007). Furthermore, logistic regression analysis indicated that the pharmacist-driven program independently predicted hospital mortality (odds ratio, 0.57; 95% confidence interval, 0.36-0.91; P = .017) after adjustment. Meanwhile, this strategy had a lower rate of multidrug resistance (23.8% vs 31.7%; P = .037). Moreover, the strategy optimized antimicrobial use, such as having a shorter duration of empirical antimicrobial therapy (2.7 days; interquartile range [IQR], 1.7-4.6 vs 3.0; IQR, 1.9-6.2; P = .002) and accumulated duration of antimicrobial treatment (4.0; IQR, 2.0-7.0 vs 5.0; IQR, 3.0-9.5; P = .030). Pharmacist-driven antimicrobial stewardship in an intensive care unit decreased patient mortality and the emergence of multidrug resistance, and optimized antimicrobial agent use. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  17. Nano-ADEPT Aeroloads Wind Tunnel Test

    NASA Technical Reports Server (NTRS)

    Smith, Brandon; Cassell, A.; Yount, B.; Kruger, C.; Brivkalns, C.; Makino, A.; Zarchi, K.; McDaniel, R.; Venkatapathy, E.; Swanson, G.

    2015-01-01

Analysis completed since the test suggests that all test objectives were met. This claim will be verified in the coming weeks as the data is examined further, and the final disposition of test objective success will be documented in a final report submitted to NASA stakeholders (early August 2015), with a conference paper expected in early 2016. Data products and observations made during testing will be used to refine computational models of Nano-ADEPT. The carbon fabric relaxed from its pre-test state during the test; system-level tolerance for relaxation will be driven by destination-specific and mission-specific aerothermal and aerodynamic requirements. A bonus experiment with an asymmetric shape demonstrates that an asymmetric deployable blunt body can be used to generate measurable lift; with a strut actuation system and a robust GN&C algorithm, this effect could be used to steer a blunt body at hypersonic speeds to aid precision landing.

  18. Data-Driven Zero-Sum Neuro-Optimal Control for a Class of Continuous-Time Unknown Nonlinear Systems With Disturbance Using ADP.

    PubMed

    Wei, Qinglai; Song, Ruizhuo; Yan, Pengfei

    2016-02-01

    This paper is concerned with a new data-driven zero-sum neuro-optimal control problem for continuous-time unknown nonlinear systems with disturbance. According to the input-output data of the nonlinear system, an effective recurrent neural network is introduced to reconstruct the dynamics of the nonlinear system. Considering the system disturbance as a control input, a two-player zero-sum optimal control problem is established. Adaptive dynamic programming (ADP) is developed to obtain the optimal control under the worst case of the disturbance. Three single-layer neural networks, including one critic and two action networks, are employed to approximate the performance index function, the optimal control law, and the disturbance, respectively, for facilitating the implementation of the ADP method. Convergence properties of the ADP method are developed to show that the system state will converge to a finite neighborhood of the equilibrium. The weight matrices of the critic and the two action networks are also convergent to finite neighborhoods of their optimal ones. Finally, the simulation results will show the effectiveness of the developed data-driven ADP methods.

  19. Utilization of wireless structural health monitoring as decision making tools for a condition and reliability-based assessment of railroad bridges

    NASA Astrophysics Data System (ADS)

    Flanigan, Katherine A.; Johnson, Nephi R.; Hou, Rui; Ettouney, Mohammed; Lynch, Jerome P.

    2017-04-01

The ability to quantitatively assess the condition of railroad bridges facilitates objective evaluation of their robustness in the face of hazard events. Of particular importance is the need to assess the condition of railroad bridges in networks that are exposed to multiple hazards. Data collected from structural health monitoring (SHM) can be used to better maintain a structure by prompting preventative (rather than reactive) maintenance strategies and supplying quantitative information to aid in recovery. To that end, a wireless monitoring system is validated and installed on the Harahan Bridge, a hundred-year-old long-span railroad truss bridge that crosses the Mississippi River near Memphis, TN. This bridge is exposed to multiple hazards including scour, vehicle/barge impact, seismic activity, and aging. The instrumented sensing system targets non-redundant structural components and areas of the truss and floor system that bridge managers are most concerned about based on previous inspections and structural analysis. This paper details the monitoring system and the analytical method for the assessment of bridge condition based on automated data-driven analyses. Two primary objectives of monitoring the system performance are discussed: 1) monitoring fatigue accumulation in critical tensile truss elements; and 2) monitoring the reliability index values associated with sub-system limit states of these members. Moreover, since the reliability index is a scalar indicator of the safety of components, quantifiable condition assessment can be used as an objective metric so that bridge owners can pursue informed damage mitigation strategies and optimize resource management at the single-bridge or network level.
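For the simplest case of a limit state g = R - S with independent normal resistance and load effect, the scalar reliability index mentioned above has a closed form. The numbers below are illustrative, not Harahan Bridge values:

```python
# Closed-form reliability index for g = R - S with independent normal R and S:
# beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), failure probability Phi(-beta).
from math import sqrt, erf

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    return (mu_r - mu_s) / sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    """Standard normal tail P(Z < -beta), via the error function."""
    return 0.5 * (1.0 + erf(-beta / sqrt(2.0)))

beta = reliability_index(mu_r=500.0, sigma_r=50.0, mu_s=300.0, sigma_s=40.0)
pf = failure_probability(beta)
```

In a monitored bridge, the load-effect statistics would be updated from measured strains, so beta tracks the member's current margin rather than a design-time estimate.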

  20. Health Literacy and Women's Reproductive Health: A Systematic Review

    PubMed Central

    Vitko, Michelle; O'Conor, Rachel; Bailey, Stacy Cooper

    2016-01-01

    Abstract Background: Health literacy is thought to impact women's reproductive health, yet no comprehensive systematic reviews have been conducted on the topic. Our objective was to systematically identify, investigate, and summarize research on the relationship between health literacy and women's reproductive health knowledge, behaviors, and outcomes. Methods: PRISMA guidelines were used to guide this review. English language, peer-reviewed research articles indexed in MEDLINE as of February 2015 were searched, along with study results posted on Clinicaltrials.gov. Articles were included if they (1) described original data-driven research conducted in developed countries, (2) were published in a peer-reviewed journal, (3) measured health literacy using a validated assessment, (4) reported on the relationship between health literacy and reproductive health outcomes, related knowledge, or behaviors, and (5) consisted of a study population that included reproductive age women. Results: A total of 34 articles met eligibility criteria and were included in this review. Data were abstracted from articles by two study authors using a standardized form. Abstracted data were then reviewed and summarized in table format. Overall, health literacy was associated with reproductive health knowledge across a spectrum of topics. It was also related to certain health behaviors, such as prenatal vitamin use and breastfeeding. Its relationship with other reproductive behaviors and outcomes remains unclear. Conclusions: Health literacy plays an important role in reproductive knowledge and may impact behaviors and outcomes. While further research is necessary, healthcare providers should utilize health literacy best practices now to promote high-quality care for patients. PMID:27564780

  1. Representational similarity analysis reveals commonalities and differences in the semantic processing of words and objects.

    PubMed

    Devereux, Barry J; Clarke, Alex; Marouchos, Andreas; Tyler, Lorraine K

    2013-11-27

    Understanding the meanings of words and objects requires the activation of underlying conceptual representations. Semantic representations are often assumed to be coded such that meaning is evoked regardless of the input modality. However, the extent to which meaning is coded in modality-independent or amodal systems remains controversial. We address this issue in a human fMRI study investigating the neural processing of concepts, presented separately as written words and pictures. Activation maps for each individual word and picture were used as input for searchlight-based multivoxel pattern analyses. Representational similarity analysis was used to identify regions correlating with low-level visual models of the words and objects and the semantic category structure common to both. Common semantic category effects for both modalities were found in a left-lateralized network, including left posterior middle temporal gyrus (LpMTG), left angular gyrus, and left intraparietal sulcus (LIPS), in addition to object- and word-specific semantic processing in ventral temporal cortex and more anterior MTG, respectively. To explore differences in representational content across regions and modalities, we developed novel data-driven analyses, based on k-means clustering of searchlight dissimilarity matrices and seeded correlation analysis. These revealed subtle differences in the representations in semantic-sensitive regions, with representations in LIPS being relatively invariant to stimulus modality and representations in LpMTG being uncorrelated across modality. These results suggest that, although both LpMTG and LIPS are involved in semantic processing, only the functional role of LIPS is the same regardless of the visual input, whereas the functional role of LpMTG differs for words and objects.
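    The core representational similarity computation described above can be sketched as follows. This is a generic, NumPy-based illustration with toy activation patterns, not the authors' actual searchlight pipeline; function and variable names are assumptions for exposition:

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between each pair of condition patterns (one row per condition)."""
    return 1.0 - np.corrcoef(patterns)

def rsa_score(rdm_a, rdm_b):
    """Correlate the upper triangles of two RDMs -- the usual way to ask
    whether two regions (or a region and a model) share structure."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]
```

    Searchlight RSA repeats this comparison within every local voxel neighborhood; the k-means step described in the paper would then cluster the resulting searchlight dissimilarity matrices.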

  2. Data-Driven School Administrator Behaviors and State Report Card Results

    ERIC Educational Resources Information Center

    Spencer, James A., Jr.

    2014-01-01

    The purpose of this study was to identify the principal behaviors that would define an instructional leader as being a data-driven school administrator and to assess current school administrators' levels of being data-driven. This research attempted to examine the relationship between the degree to which a principal was data-driven and the…

  3. Direction of Magnetoencephalography Sources Associated with Feedback and Feedforward Contributions in a Visual Object Recognition Task

    PubMed Central

    Ahlfors, Seppo P.; Jones, Stephanie R.; Ahveninen, Jyrki; Hämäläinen, Matti S.; Belliveau, John W.; Bar, Moshe

    2014-01-01

    Identifying inter-area communication in terms of the hierarchical organization of functional brain areas is of considerable interest in human neuroimaging. Previous studies have suggested that the direction of magneto- and electroencephalography (MEG, EEG) source currents depends on the layer-specific input patterns into a cortical area. We examined the direction in MEG source currents in a visual object recognition experiment in which there were specific expectations of activation in the fusiform region being driven by either feedforward or feedback inputs. The source for the early non-specific visual evoked response, presumably corresponding to feedforward driven activity, pointed outward, i.e., away from the white matter. In contrast, the source for the later, object-recognition related signals, expected to be driven by feedback inputs, pointed inward, toward the white matter. Associating specific features of the MEG/EEG source waveforms to feedforward and feedback inputs could provide unique information about the activation patterns within hierarchically organized cortical areas. PMID:25445356

  4. Light scattering by ultrasonically-controlled small particles: system design, calibration, and measurement results

    NASA Astrophysics Data System (ADS)

    Kassamakov, Ivan; Maconi, Göran; Penttilä, Antti; Helander, Petteri; Gritsevich, Maria; Puranen, Tuomas; Salmi, Ari; Hæggström, Edward; Muinonen, Karri

    2018-02-01

    We present the design of a novel scatterometer for precise measurement of the angular Mueller matrix profile of a mm- to µm-sized sample held in place by sound. The scatterometer comprises a tunable multimode Argon-krypton laser (with the possibility of selecting 1 of 12 wavelengths in the visible range), linear polarizers, a reference photomultiplier tube (PMT) for monitoring the beam intensity, and a micro-PMT module mounted radially towards the sample at an adjustable radius. The measurement angle is controlled by a motor-driven rotation stage with an accuracy of 15'. The system is fully automated using LabVIEW, including the FPGA-based data acquisition and the instrument's user interface. The calibration protocol ensures accurate measurements by using a control sphere sample (diameter 3 mm, refractive index of 1.5) fixed first on a static holder, followed by accurate multi-wavelength measurements of the same sample levitated ultrasonically. To demonstrate the performance of the scatterometer, we conducted detailed measurements of light scattered by a particle derived from the Chelyabinsk meteorite, as well as planetary analogue materials. The measurements are the first of their kind, since they are obtained using controlled spectral angular scattering, including linear polarization effects, for arbitrarily shaped objects. Thus, our novel approach permits a non-destructive, disturbance-free measurement with control of the orientation and location of the scattering object.

  5. Multi-registration of software library resources

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-04-05

    Data communications, including issuing, by an application program to a high level data communications library, a request for initialization of a data communications service; issuing to a low level data communications library a request for registration of data communications functions; registering the data communications functions, including instantiating a factory object for each of the one or more data communications functions; issuing by the application program an instruction to execute a designated data communications function; issuing, to the low level data communications library, an instruction to execute the designated data communications function, including passing to the low level data communications library a call parameter that identifies a factory object; creating with the identified factory object the data communications object that implements the data communications function according to the protocol; and executing by the low level data communications library the designated data communications function.
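    The factory-object registration scheme described in the claim can be sketched in outline as follows. All class, method, and function names here are illustrative assumptions, not taken from the patent:

```python
class FactoryRegistry:
    """Low-level library side: maps registered function names to factory objects."""

    def __init__(self):
        self._factories = {}

    def register(self, name, factory):
        # Registration step: one factory object per data communications function
        self._factories[name] = factory

    def execute(self, name, *args):
        # The call parameter ('name') identifies the factory; the factory then
        # creates the object that implements the designated function, which is executed
        return self._factories[name].create()(*args)


class BroadcastFactory:
    """Factory for a toy 'broadcast' communications function."""

    def create(self):
        # Return a callable that pairs the payload with every peer
        return lambda payload, peers: [(peer, payload) for peer in peers]
```

    An application would call `register("broadcast", BroadcastFactory())` once at initialization and later `execute("broadcast", payload, peers)`, leaving protocol-specific object creation to the factory.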

  6. Tracking with occlusions via graph cuts.

    PubMed

    Papadakis, Nicolas; Bugeau, Aurélie

    2011-01-01

    This work presents a new method for tracking and segmenting, over time, interacting objects within an image sequence. One major contribution of the paper is the formalization of the notion of visible and occluded parts. For each object, we aim at tracking these two parts. Assuming that the velocity of each object is driven by a dynamical law, predictions can be used to guide the successive estimations. Separating these predicted areas into good and bad parts with respect to the final segmentation, and representing the objects with their visible and occluded parts, permits handling partial and complete occlusions. To achieve this tracking, a label is assigned to each object and an energy function representing the multilabel problem is minimized via a graph cuts optimization. This energy contains terms based on image intensities which enable segmenting and regularizing the visible parts of the objects. It also includes terms dedicated to the management of the occluded and disappearing areas, which are defined on the areas of prediction of the objects. The results on several challenging sequences prove the strength of the proposed approach.

  7. Environmental Data-Driven Inquiry and Exploration (EDDIE)- Water Focused Modules for interacting with Big Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Meixner, T.; Gougis, R.; O'Reilly, C.; Klug, J.; Richardson, D.; Castendyk, D.; Carey, C.; Bader, N.; Stomberg, J.; Soule, D. C.

    2016-12-01

    High-frequency sensor data are driving a shift in the Earth and environmental sciences. The availability of high-frequency data creates an engagement opportunity for undergraduate students in primary research by using large, long-term, sensor-based data directly in the scientific curriculum. Project EDDIE (Environmental Data-Driven Inquiry & Exploration) has developed flexible classroom activity modules designed to meet a series of pedagogical goals that include (1) developing skills required to manipulate large datasets at different scales to conduct inquiry-based investigations; (2) developing students' reasoning about statistical variation; and (3) fostering accurate student conceptions about the nature of environmental science. The modules cover a wide range of topics, including lake physics and metabolism, stream discharge, water quality, soil respiration, seismology, and climate change. In this presentation we will focus on a sequence of modules of particular interest to hydrologists - stream discharge, water quality, and nutrient loading. Assessment results show that our modules make students more comfortable analyzing data, improve their understanding of statistical concepts, and strengthen their data analysis capabilities. This project is funded by an NSF TUES grant (NSF DEB 1245707).
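    A representative calculation from the stream discharge and nutrient-loading topics looks like the following. This is an illustrative sketch of the standard load formula, not code from Project EDDIE itself, and the function names are made up:

```python
def nutrient_load_kg(discharge_m3_per_s, concentration_mg_per_l, interval_s):
    """Nutrient load over one interval: discharge x concentration x time.
    1 mg/L equals 1 g/m^3, so dividing by 1000 converts grams to kilograms."""
    return discharge_m3_per_s * concentration_mg_per_l * interval_s / 1000.0

def total_load_kg(records):
    """Sum interval loads over a series of (discharge, concentration, seconds)
    tuples, as a student might do with high-frequency sensor data."""
    return sum(nutrient_load_kg(q, c, dt) for q, c, dt in records)
```

    With hourly sensor records, summing interval loads like this makes the effect of storm-event discharge spikes on annual nutrient export directly visible to students.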

  8. Data Analysis and Data Mining: Current Issues in Biomedical Informatics

    PubMed Central

    Bellazzi, Riccardo; Diomidous, Marianna; Sarkar, Indra Neil; Takabayashi, Katsuhiko; Ziegler, Andreas; McCray, Alexa T.

    2011-01-01

    Summary Background Medicine and biomedical sciences have become data-intensive fields, which, at the same time, enable the application of data-driven approaches and require sophisticated data analysis and data mining methods. Biomedical informatics provides a proper interdisciplinary context to integrate data and knowledge when processing available information, with the aim of giving effective decision-making support in clinics and translational research. Objectives To reflect on different perspectives related to the role of data analysis and data mining in biomedical informatics. Methods On the occasion of the 50th year of Methods of Information in Medicine a symposium was organized, that reflected on opportunities, challenges and priorities of organizing, representing and analysing data, information and knowledge in biomedicine and health care. The contributions of experts with a variety of backgrounds in the area of biomedical data analysis have been collected as one outcome of this symposium, in order to provide a broad, though coherent, overview of some of the most interesting aspects of the field. Results The paper presents sections on data accumulation and data-driven approaches in medical informatics, data and knowledge integration, statistical issues for the evaluation of data mining models, translational bioinformatics and bioinformatics aspects of genetic epidemiology. Conclusions Biomedical informatics represents a natural framework to properly and effectively apply data analysis and data mining methods in a decision-making context. In the future, it will be necessary to preserve the inclusive nature of the field and to foster an increasing sharing of data and methods between researchers. PMID:22146916

  9. Solder Joint Health Monitoring Testbed

    NASA Technical Reports Server (NTRS)

    Delaney, Michael M.; Flynn, James; Browder, Mark

    2009-01-01

    A method of monitoring the health of selected solder joints, called SJ-BIST, has been developed by Ridgetop Group Inc. under a Small Business Innovative Research (SBIR) contract. The primary goal of this research program is to test and validate this method in a flight environment using realistically seeded faults in selected solder joints. An additional objective is to gather environmental data for future development of physics-based and data-driven prognostics algorithms. A test board is being designed using a Xilinx FPGA. These boards will be tested both in flight and on the ground using a shaker table and an altitude chamber.

  10. Research Driven Policy: Is Financial Capacity Related to Dangerousness?

    PubMed

    DeLeon, Patrick H; Paxton, Maegan M; Spencer, Tonya; Bajjani-Gebara, Jouhayna E

    2018-05-22

    Current Veterans Administration policy directly links a Veteran's adjudged capacity to manage personal financial resources with their ability to purchase or possess a firearm, pursuant to the regulatory authority of the National Instant Criminal Background Check System (NICS). Preventing Veterans' suicide is a highly laudable public health objective. Effectively utilizing scientific research to "inform" public policy is equally important. The authors should be congratulated for their efforts. However, when utilizing large population-based data sets, especially social science data, to evaluate policy alternatives, it is important that there be substantial face (i.e., clinical) validity. Correlation does not necessarily represent causation.

  11. Framework for a clinical information system.

    PubMed

    Van De Velde, R; Lansiers, R; Antonissen, G

    2002-01-01

    The design and implementation of a Clinical Information System architecture are presented. This architecture has been developed and implemented based on components following a strong underlying conceptual and technological model. Common Object Request Broker Architecture (CORBA) and n-tier technology are used, featuring centralised and departmental clinical information systems as the back-end store for all clinical data. Servers located in the "middle" tier apply the clinical (business) model and application rules. The main characteristics are the focus on modelling and reuse of both data and business logic. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for that approach.

  12. Insulator-to-conducting transition in dense fluid helium.

    PubMed

    Celliers, P M; Loubeyre, P; Eggert, J H; Brygoo, S; McWilliams, R S; Hicks, D G; Boehly, T R; Jeanloz, R; Collins, G W

    2010-05-07

    By combining diamond-anvil-cell and laser-driven shock wave techniques, we produced dense He samples up to 1.5 g/cm(3) at temperatures reaching 60 kK. Optical measurements of reflectivity and temperature show that electronic conduction in He at these conditions is temperature-activated (semiconducting). A fit to the data suggests that the mobility gap closes with increasing density, and that hot dense He becomes metallic above approximately 1.9 g/cm(3). These data provide a benchmark to test models that describe He ionization at conditions found in astrophysical objects, such as cold white dwarf atmospheres.

  13. USACM Thematic Workshop On Uncertainty Quantification And Data-Driven Modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, James R.

    The USACM Thematic Workshop on Uncertainty Quantification and Data-Driven Modeling was held on March 23-24, 2017, in Austin, TX. The organizers of the technical program were James R. Stewart of Sandia National Laboratories and Krishna Garikipati of University of Michigan. The administrative organizer was Ruth Hengst, who serves as Program Coordinator for the USACM. The organization of this workshop was coordinated through the USACM Technical Thrust Area on Uncertainty Quantification and Probabilistic Analysis. The workshop website (http://uqpm2017.usacm.org) includes the presentation agenda as well as links to several of the presentation slides (permission to access the presentations was granted by each of those speakers, respectively). Herein, this final report contains the complete workshop program that includes the presentation agenda, the presentation abstracts, and the list of posters.

  14. Cruella: developing a scalable tissue microarray data management system.

    PubMed

    Cowan, James D; Rimm, David L; Tuck, David P

    2006-06-01

    Compared with DNA microarray technology, relatively little information is available concerning the special requirements, design influences, and implementation strategies of data systems for tissue microarray technology. These issues include the requirement to accommodate new and different data elements for each new project as well as the need to interact with pre-existing models for clinical, biological, and specimen-related data. Our objective was to design and implement a flexible, scalable tissue microarray data storage and management system that could accommodate information regarding different disease types, different clinical investigators, and different clinical investigation questions, all of which could potentially contribute unforeseen data types that require dynamic integration with existing data. The unpredictability of the data elements, combined with the novelty of automated analysis algorithms and controlled vocabulary standards in this area, requires flexible designs and practical decisions. Our design includes a custom Java-based persistence layer to mediate and facilitate interaction with an object-relational database model and a novel database schema. User interaction is provided through a Java Servlet-based Web interface. Cruella has become an indispensable resource and is used by dozens of researchers every day. The system stores millions of experimental values covering more than 300 biological markers and more than 30 disease types. The experimental data are merged with clinical data that has been aggregated from multiple sources and is available to the researchers for management, analysis, and export. Cruella addresses many of the special considerations for managing tissue microarray experimental data and the associated clinical information. A metadata-driven approach provides a practical solution to many of the unique issues inherent in tissue microarray research, and allows relatively straightforward interoperability with and accommodation of new data models.

  15. Personalized mortality prediction driven by electronic medical data and a patient similarity metric.

    PubMed

    Lee, Joon; Maslove, David M; Dubin, Joel A

    2015-01-01

    Clinical outcome prediction normally employs static, one-size-fits-all models that perform well for the average patient but are sub-optimal for individual patients with unique characteristics. In the era of digital healthcare, it is feasible to dynamically personalize decision support by identifying and analyzing similar past patients, in a way that is analogous to personalized product recommendation in e-commerce. Our objectives were: 1) to prove that analyzing only similar patients leads to better outcome prediction performance than analyzing all available patients, and 2) to characterize the trade-off between training data size and the degree of similarity between the training data and the index patient for whom prediction is to be made. We deployed a cosine-similarity-based patient similarity metric (PSM) to an intensive care unit (ICU) database to identify patients that are most similar to each patient and subsequently to custom-build 30-day mortality prediction models. Rich clinical and administrative data from the first day in the ICU from 17,152 adult ICU admissions were analyzed. The results confirmed that using data from only a small subset of most similar patients for training improves predictive performance in comparison with using data from all available patients. The results also showed that when too few similar patients are used for training, predictive performance degrades due to the effects of small sample sizes. Our PSM-based approach outperformed well-known ICU severity of illness scores. Although the improved prediction performance is achieved at the cost of increased computational burden, Big Data technologies can help realize personalized data-driven decision support at the point of care. The present study provides crucial empirical evidence for the promising potential of personalized data-driven decision support systems. 
With the increasing adoption of electronic medical record (EMR) systems, our novel medical data analytics contributes to meaningful use of EMR data.
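    The patient similarity metric at the core of this approach can be sketched as follows. The feature encoding and all names here are illustrative assumptions for exposition, not the authors' implementation:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length patient feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def most_similar(index_features, cohort, k):
    """Rank the cohort by similarity to the index patient and keep the top k,
    which would then form the custom training set for that patient's model."""
    ranked = sorted(cohort,
                    key=lambda p: cosine_similarity(index_features, p["features"]),
                    reverse=True)
    return ranked[:k]
```

    The paper's trade-off then corresponds to the choice of k: too large dilutes similarity to the index patient, too small leaves the per-patient model with an unreliably small training sample.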

  16. Personalized Mortality Prediction Driven by Electronic Medical Data and a Patient Similarity Metric

    PubMed Central

    Lee, Joon; Maslove, David M.; Dubin, Joel A.

    2015-01-01

    Background Clinical outcome prediction normally employs static, one-size-fits-all models that perform well for the average patient but are sub-optimal for individual patients with unique characteristics. In the era of digital healthcare, it is feasible to dynamically personalize decision support by identifying and analyzing similar past patients, in a way that is analogous to personalized product recommendation in e-commerce. Our objectives were: 1) to prove that analyzing only similar patients leads to better outcome prediction performance than analyzing all available patients, and 2) to characterize the trade-off between training data size and the degree of similarity between the training data and the index patient for whom prediction is to be made. Methods and Findings We deployed a cosine-similarity-based patient similarity metric (PSM) to an intensive care unit (ICU) database to identify patients that are most similar to each patient and subsequently to custom-build 30-day mortality prediction models. Rich clinical and administrative data from the first day in the ICU from 17,152 adult ICU admissions were analyzed. The results confirmed that using data from only a small subset of most similar patients for training improves predictive performance in comparison with using data from all available patients. The results also showed that when too few similar patients are used for training, predictive performance degrades due to the effects of small sample sizes. Our PSM-based approach outperformed well-known ICU severity of illness scores. Although the improved prediction performance is achieved at the cost of increased computational burden, Big Data technologies can help realize personalized data-driven decision support at the point of care. Conclusions The present study provides crucial empirical evidence for the promising potential of personalized data-driven decision support systems. 
With the increasing adoption of electronic medical record (EMR) systems, our novel medical data analytics contributes to meaningful use of EMR data. PMID:25978419

  17. Teaching Note--"By Any Means Necessary!" Infusing Socioeconomic Justice Content into Quantitative Research Course Work

    ERIC Educational Resources Information Center

    Slayter, Elspeth M.

    2017-01-01

    Existing research suggests a majority of faculty include social justice content in research courses but not through the use of existing quantitative data for in-class activities that foster mastery of data analysis and interpretation and curiosity about social justice-related topics. By modeling data-driven dialogue and the deconstruction of…

  18. Guidelines for Datacenter Energy Information System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Reshma; Mahdavi, Rod; Mathew, Paul

    2013-12-01

    The purpose of this document is to provide structured guidance to data center owners, operators, and designers, to empower them with information on how to specify and procure data center energy information systems (EIS) for managing the energy utilization of their data centers. Data centers are typically energy-intensive facilities that can consume up to 100 times more energy per unit area than a standard office building (FEMP 2013). This guidance facilitates “data-driven decision making,” which will be enabled by following the approach outlined in the guide. This will bring speed, clarity, and objectivity to any energy or asset management decisions because of the ability to monitor and track an energy management project’s performance.

  19. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    PubMed

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our objective was to guide the design of multiplier-method population size estimation studies that use respondent-driven sampling surveys, so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
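    The multiplier estimator N = M / P, and the way a small P or a large design effect inflates its uncertainty, can be sketched as follows. The delta-method confidence interval here is a simplification for illustration, not the authors' exact variance formula, and the numbers in the test are made up:

```python
import math

def multiplier_estimate(m_distributed, p_hat, n_survey, design_effect=1.0):
    """Population size N = M / P, with an approximate 95% CI for the
    uncertainty contributed by the survey proportion P (delta method)."""
    n_hat = m_distributed / p_hat
    # Standard error of P, inflated by the RDS design effect
    se_p = math.sqrt(design_effect * p_hat * (1.0 - p_hat) / n_survey)
    # Delta method: Var(M / P) is approximately (M / P^2)^2 * Var(P)
    se_n = (m_distributed / p_hat ** 2) * se_p
    return n_hat, (n_hat - 1.96 * se_n, n_hat + 1.96 * se_n)
```

    The 1/P^2 factor in the standard error shows why the paper advises distributing more unique objects: pushing P upward shrinks the interval faster than simply enlarging the survey.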

  20. How Evolution May Work Through Curiosity-Driven Developmental Process.

    PubMed

    Oudeyer, Pierre-Yves; Smith, Linda B

    2016-04-01

    Infants' own activities create and actively select their learning experiences. Here we review recent models of embodied information seeking and curiosity-driven learning and show that these mechanisms have deep implications for development and evolution. We discuss how these mechanisms yield self-organized epigenesis with emergent ordered behavioral and cognitive developmental stages. We describe a robotic experiment that explored the hypothesis that progress in learning, in and for itself, generates intrinsic rewards: The robot learners probabilistically selected experiences according to their potential for reducing uncertainty. In these experiments, curiosity-driven learning led the robot learner to successively discover object affordances and vocal interaction with its peers. We explain how a learning curriculum adapted to the current constraints of the learning system automatically formed, constraining learning and shaping the developmental trajectory. The observed trajectories in the robot experiment share many properties with those in infant development, including a mixture of regularities and diversities in the developmental patterns. Finally, we argue that such emergent developmental structures can guide and constrain evolution, in particular with regard to the origins of language. Copyright © 2016 Cognitive Science Society, Inc.

  1. Immigrants and Employer-Sponsored Health Insurance

    PubMed Central

    Buchmueller, Thomas C; Lo Sasso, Anthony T; Lurie, Ithai; Dolfin, Sarah

    2007-01-01

    Objective To investigate the factors underlying the lower rate of employer-sponsored health insurance coverage for foreign-born workers. Data Sources 2001 Survey of Income and Program Participation. Study Design We estimate probit regressions to determine the effect of immigrant status on employer-sponsored health insurance coverage, including the probabilities of working for a firm that offers coverage, being eligible for coverage, and taking up coverage. Data Extraction Methods We identified native born citizens, naturalized citizens, and noncitizen residents between the ages of 18 and 65, in the year 2002. Principal Findings First, we find that the large difference in coverage rates for immigrants and native-born Americans is driven by the very low rates of coverage for noncitizen immigrants. Differences between native-born and naturalized citizens are quite small and for some outcomes are statistically insignificant when we control for observable characteristics. Second, our results indicate that the gap between natives and noncitizens is explained mainly by differences in the probability of working for a firm that offers insurance. Conditional on working for such a firm, noncitizens are only slightly less likely to be eligible for coverage and, when eligible, are only slightly less likely to take up coverage. Third, roughly two-thirds of the native/noncitizen gap in coverage overall and in the probability of working for an insurance-providing employer is explained by characteristics of the individual and differences in the types of jobs they hold. Conclusions The substantially higher rate of uninsurance among immigrants is driven by the lower rate of health insurance offers by the employers of immigrants. PMID:17355593

  2. A data-driven approach to quality risk management

    PubMed Central

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-01-01

    Aim: An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real-time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Materials and Methods: Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on quality of clinical trials sponsored by Pfizer. Results: Only a subset of the risk factors had a significant association with quality issues, and included: Whether study used Placebo, whether an agent was a biologic, unusual packaging label, complex dosing, and over 25 planned procedures. Conclusion: Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety. PMID:24312890
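    The risk-factor screening step can be illustrated with a hand-rolled Mann-Whitney U (rank-sum) statistic. This is a minimal sketch with made-up data; a real analysis like the one described would use a statistics package (e.g. scipy.stats.mannwhitneyu) and logistic regression:

```python
def rank_sum_u(with_issue, without_issue):
    """Mann-Whitney U statistic comparing a risk-factor measure between
    trials with and without quality issues (ties count one half)."""
    u = 0.0
    for a in with_issue:
        for b in without_issue:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u
```

    A U near 0 or near len(with_issue) * len(without_issue) suggests separation between the two groups; the hypothesis test then compares U against its null distribution to decide whether the risk factor is associated with quality issues.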

  3. Automated Measurement and Verification and Innovative Occupancy Detection Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Phillip; Nordman, Bruce; Piette, Mary Ann

    In support of DOE’s sensors and controls research, the goal of this project is to move toward integrated building-to-grid systems by building on previous work to develop and demonstrate a set of load characterization measurement and evaluation tools, envisioned as part of a suite of applications for transactive efficient buildings built upon data-driven load characterization and prediction models. This includes the ability to incorporate occupancy data into the models, plus data collection and archival methods to accommodate different types of occupancy data with existing networks, and a taxonomy for naming these data within a Volttron agent platform.

  4. Design and Deployment of a Pediatric Cardiac Arrest Surveillance System.

    PubMed

    Duval-Arnould, Jordan Michel; Newton, Heather Marie; McNamara, Leann; Engorn, Branden Michael; Jones, Kareen; Bernier, Meghan; Dodge, Pamela; Salamone, Cheryl; Bhalala, Utpal; Jeffers, Justin M; Engineer, Lilly; Diener-West, Marie; Hunt, Elizabeth Anne

    2018-01-01

    We aimed to increase detection of pediatric cardiopulmonary resuscitation (CPR) events and collection of physiologic and performance data for use in quality improvement (QI) efforts. We developed a workflow-driven surveillance system that leveraged organizational information technology systems to trigger CPR detection and analysis processes. We characterized detection by notification source, type, location, and year, and compared it to previous methods of detection. From 1/1/2013 through 12/31/2015, there were 2,986 unique notifications associated with 2,145 events, 317 requiring CPR. The PICU and PEDS-ED accounted for 65% of CPR events, whereas floor care areas were responsible for only 3% of events. 100% of PEDS-OR and >70% of PICU CPR events would not have been included in QI efforts. Performance data from both defibrillator and bedside monitor increased annually (2013: 1%; 2014: 18%; 2015: 27%). After deployment of this system, detection increased ∼9-fold and performance data collection increased annually. Had the system not been deployed, 100% of PEDS-OR and 50-70% of PICU, NICU, and PEDS-ED events would have been missed. By leveraging hospital information technology and medical device data, identification of pediatric cardiac arrest, with an associated increase in the proportion of objective performance data captured, is possible.

  5. System and method for embedding emotion in logic systems

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2012-01-01

    A system, method, and computer readable-media for creating a stable synthetic neural system. The method includes training an intellectual choice-driven synthetic neural system (SNS), training an emotional rule-driven SNS by generating emotions from rules, incorporating the rule-driven SNS into the choice-driven SNS through an evolvable interface, and balancing the emotional SNS and the intellectual SNS to achieve stability in a nontrivial autonomous environment with a Stability Algorithm for Neural Entities (SANE). Generating emotions from rules can include coding the rules into the rule-driven SNS in a self-consistent way. Training the emotional rule-driven SNS can occur during a training stage in parallel with training the choice-driven SNS. The training stage can include a self assessment loop which measures performance characteristics of the rule-driven SNS against core genetic code. The method uses a stability threshold to measure stability of the incorporated rule-driven SNS and choice-driven SNS using SANE.

  6. The Evolution of System Safety at NASA

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Everett, Chris; Groen, Frank

    2014-01-01

    The NASA system safety framework is in the process of change, motivated by the desire to promote an objectives-driven approach to system safety that explicitly focuses system safety efforts on system-level safety performance, and serves to unify, in a purposeful manner, safety-related activities that otherwise might be done in a way that results in gaps, redundancies, or unnecessary work. An objectives-driven approach to system safety affords more flexibility to determine, on a system-specific basis, the means by which adequate safety is achieved and verified. Such flexibility and efficiency are becoming increasingly important in the face of evolving engineering modalities and acquisition models, where, for example, NASA will increasingly rely on commercial providers for transportation services to low-Earth orbit. A key element of this objectives-driven approach is the use of the risk-informed safety case (RISC): a structured argument, supported by a body of evidence, that provides a compelling, comprehensible, and valid case that a system is or will be adequately safe for a given application in a given environment. The RISC addresses each of the objectives defined for the system, providing a rational basis for making informed risk acceptance decisions at relevant decision points in the system life cycle.

  7. Epilepsy informatics and an ontology-driven infrastructure for large database research and patient care in epilepsy.

    PubMed

    Sahoo, Satya S; Zhang, Guo-Qiang; Lhatoo, Samden D

    2013-08-01

    The epilepsy community increasingly recognizes the need for a modern classification system that can also be easily integrated with effective informatics tools. The 2010 reports by the United States President's Council of Advisors on Science and Technology (PCAST) identified informatics as a critical resource to improve quality of patient care, drive clinical research, and reduce the cost of health services. An effective informatics infrastructure for epilepsy, which is underpinned by a formal knowledge model or ontology, can leverage an ever-increasing amount of multimodal data to improve (1) clinical decision support, (2) access to information for patients and their families, (3) data sharing, and (4) secondary use of clinical data. Modeling the recommendations of the International League Against Epilepsy (ILAE) classification system in the form of an epilepsy domain ontology is essential for consistent use of terminology in a variety of applications, including electronic health records systems and clinical applications. In this review, we discuss the data management issues in epilepsy and explore the benefits of an ontology-driven informatics infrastructure and its role in adoption of a "data-driven" paradigm in epilepsy research. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.

  8. Target volume and artifact evaluation of a new data-driven 4D CT.

    PubMed

    Martin, Rachael; Pan, Tinsu

    Four-dimensional computed tomography (4D CT) is often used to define the internal gross target volume (IGTV) for radiation therapy of lung cancer. Traditionally, this technique requires the use of an external motion surrogate; however, a new, image-data-driven 4D CT has become available. This study aims to describe this data-driven 4D CT and compare target contours created with it to those created using standard 4D CT. Cine CT data of 35 patients undergoing stereotactic body radiation therapy were collected and sorted into phases using standard and data-driven 4D CT. IGTV contours were drawn using a semiautomated method on maximum intensity projection images of both 4D CT methods. Errors resulting from reproducibility of the method were characterized. A comparison of phase image artifacts was made using a normalized cross-correlation method that assigned a score from +1 (data-driven "better") to -1 (standard "better"). The volume difference between the data-driven and standard IGTVs was not significant (data-driven was 2.1 ± 1.0% smaller, P = .08). The Dice similarity coefficient showed good similarity between the contours (0.949 ± 0.006). The mean surface separation was 0.4 ± 0.1 mm and the Hausdorff distance was 3.1 ± 0.4 mm. An average artifact score of +0.37 indicated that the data-driven method had significantly fewer and/or less severe artifacts than the standard method (P = 1.5 × 10⁻⁵ for difference from 0). On average, the difference between IGTVs derived from data-driven and standard 4D CT was not clinically relevant or statistically significant, suggesting data-driven 4D CT can be used in place of standard 4D CT without adjustments to IGTVs. The relatively large differences in some patients were usually attributed to limitations in automatic contouring or differences in artifacts. Artifact reduction and setup simplicity suggest a clinical advantage to data-driven 4D CT. Published by Elsevier Inc.
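    A minimal sketch of the Dice similarity coefficient used in this record to compare IGTV contours, computed here on small synthetic binary masks rather than patient data:

```python
# Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|) for binary masks.
# The masks below are toy 2-D examples, not clinical contours.
import numpy as np

def dice(mask_a, mask_b):
    """Return the Dice coefficient between two binary masks (1.0 = identical)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

a = np.zeros((10, 10), dtype=bool); a[2:8, 2:8] = True   # 36 "voxels"
b = np.zeros((10, 10), dtype=bool); b[3:9, 3:9] = True   # same size, shifted by 1
score = dice(a, b)                                       # 2*25/72 ≈ 0.694
```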

  9. Combustion Sensors: Gas Turbine Applications

    NASA Technical Reports Server (NTRS)

    Human, Mel

    2002-01-01

    This report documents efforts to survey the current research directions in sensor technology for gas turbine systems. The work is driven by the current and future requirements on system performance and optimization. Accurate real time measurements of velocities, pressure, temperatures, and species concentrations will be required for objectives such as combustion instability attenuation, pollutant reduction, engine health management, exhaust profile control via active control, etc. Changing combustor conditions - engine aging, flow path slagging, or rapid maneuvering - will require adaptive responses; the effectiveness of such will be only as good as the dynamic information available for processing. All of these issues point toward the importance of continued sensor development. For adequate control of the combustion process, sensor data must include information about the above mentioned quantities along with equivalence ratios and radical concentrations, and also include both temporal and spatial velocity resolution. Ultimately these devices must transfer from the laboratory to field installations, and thus must become low weight and cost, reliable and maintainable. A primary conclusion from this study is that the optics-based sensor science will be the primary diagnostic in future gas turbine technologies.

  10. Big Data Ecosystems Enable Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, Terence J.; Kleese van Dam, Kerstin

    Over the past 5 years, advances in experimental, sensor, and computational technologies have driven the exponential growth in the volumes, acquisition rates, variety, and complexity of scientific data. As noted by Hey et al. in their 2009 e-book The Fourth Paradigm, this availability of large quantities of scientifically meaningful data has given rise to a new scientific methodology: data-intensive science. Data-intensive science is the ability to formulate and evaluate hypotheses using data and analysis to extend, complement and, at times, replace experimentation, theory, or simulation. This new approach to science no longer requires scientists to interact directly with the objects of their research; instead they can utilize digitally captured, reduced, calibrated, analyzed, synthesized, and visualized results, allowing them to carry out 'experiments' in data.

  11. Using connectome-based predictive modeling to predict individual behavior from brain connectivity

    PubMed Central

    Shen, Xilin; Finn, Emily S.; Scheinost, Dustin; Rosenberg, Monica D.; Chun, Marvin M.; Papademetris, Xenophon; Constable, R Todd

    2017-01-01

    Neuroimaging is a fast-developing research area where anatomical and functional images of human brains are collected using techniques such as functional magnetic resonance imaging (fMRI), diffusion tensor imaging (DTI), and electroencephalography (EEG). Technical advances and large-scale datasets have allowed for the development of models capable of predicting individual differences in traits and behavior using brain connectivity measures derived from neuroimaging data. Here, we present connectome-based predictive modeling (CPM), a data-driven protocol for developing predictive models of brain-behavior relationships from connectivity data using cross-validation. This protocol includes the following steps: 1) feature selection, 2) feature summarization, 3) model building, and 4) assessment of prediction significance. We also include suggestions for visualizing the most predictive features (i.e., brain connections). The final result should be a generalizable model that takes brain connectivity data as input and generates predictions of behavioral measures in novel subjects, accounting for a significant amount of the variance in these measures. It has been demonstrated that the CPM protocol performs equivalently or better than most of the existing approaches in brain-behavior prediction. However, because CPM focuses on linear modeling and a purely data-driven approach, neuroscientists with limited or no experience in machine learning or optimization would find it easy to implement the protocols. Depending on the volume of data to be processed, the protocol can take 10–100 minutes for model building, 1–48 hours for permutation testing, and 10–20 minutes for visualization of results. PMID:28182017
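    The first three CPM steps listed in this record can be sketched on synthetic connectivity data; the correlation threshold, data sizes, and summarization choice below are illustrative simplifications, not values prescribed by the protocol:

```python
# Hedged sketch of CPM steps 1-3 on synthetic data: select edges correlated with
# behavior, summarize them into one score per subject, and fit a linear model.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_edges = 60, 200
X = rng.normal(size=(n_subjects, n_edges))                 # connectivity "edges"
behavior = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_subjects)

# 1) Feature selection: keep edges whose correlation with behavior passes a threshold.
r = np.array([np.corrcoef(X[:, j], behavior)[0, 1] for j in range(n_edges)])
selected = np.abs(r) > 0.3

# 2) Feature summarization: collapse selected edges into a single score per subject.
summary = X[:, selected].sum(axis=1)

# 3) Model building: one-parameter linear fit from summary score to behavior.
slope, intercept = np.polyfit(summary, behavior, 1)
predicted = slope * summary + intercept
fit_r = np.corrcoef(predicted, behavior)[0, 1]
```

Step 4 (prediction significance) would add cross-validation and permutation testing, omitted here for brevity.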

  12. The Potential of Knowing More: A Review of Data-Driven Urban Water Management.

    PubMed

    Eggimann, Sven; Mutzner, Lena; Wani, Omar; Schneider, Mariane Yvonne; Spuhler, Dorothee; Moy de Vitry, Matthew; Beutler, Philipp; Maurer, Max

    2017-03-07

    The promise of collecting and utilizing large amounts of data has never been greater in the history of urban water management (UWM). This paper reviews several data-driven approaches which play a key role in bringing forward a sea change. It critically investigates whether data-driven UWM offers a promising foundation for addressing current challenges and supporting fundamental changes in UWM. We discuss the examples of better rain-data management, urban pluvial flood-risk management and forecasting, drinking water and sewer network operation and management, integrated design and management, increasing water productivity, wastewater-based epidemiology and on-site water and wastewater treatment. The accumulated evidence from literature points toward a future UWM that offers significant potential benefits thanks to increased collection and utilization of data. The findings show that data-driven UWM allows us to develop and apply novel methods, to optimize the efficiency of the current network-based approach, and to extend functionality of today's systems. However, generic challenges related to data-driven approaches (e.g., data processing, data availability, data quality, data costs) and the specific challenges of data-driven UWM need to be addressed, namely data access and ownership, current engineering practices and the difficulty of assessing the cost benefits of data-driven UWM.

  13. Crowd Sourcing to Improve Urban Stormwater Management

    NASA Astrophysics Data System (ADS)

    Minsker, B. S.; Band, L. E.; Heidari Haratmeh, B.; Law, N. L.; Leonard, L. N.; Rai, A.

    2017-12-01

    Over half of the world's population currently lives in urban areas, a number predicted to grow to 60 percent by 2030. Urban areas face unprecedented and growing challenges that threaten society's long-term wellbeing, including poverty; chronic health problems; widespread pollution and resource degradation; and increased natural disasters. These are "wicked" problems involving "systems of systems" that require unprecedented information sharing and collaboration across disciplines and organizational boundaries. Cities are recognizing that the increasing stream of data and information ("Big Data"), informatics, and modeling can support rapid advances on these challenges. Nonetheless, information technology solutions can only be effective in addressing these challenges through deeply human and systems perspectives. A stakeholder-driven approach ("crowd sourcing") is needed to develop urban systems that address multiple needs, such as parks that capture and treat stormwater while improving human and ecosystem health and wellbeing. We have developed informatics- and Cloud-based collaborative methods that enable crowd sourcing of green stormwater infrastructure (GSI: rain gardens, bioswales, trees, etc.) design and management. The methods use machine learning, social media data, and interactive design tools (called IDEAS-GI) to identify locations and features of GSI that perform best on a suite of objectives, including life cycle cost, stormwater volume reduction, and air pollution reduction. Insights will be presented on GI features that best meet stakeholder needs and are therefore most likely to improve human wellbeing and be well maintained.

  14. An ISA-TAB-Nano based data collection framework to support data-driven modelling of nanotoxicology.

    PubMed

    Marchese Robinson, Richard L; Cronin, Mark T D; Richarz, Andrea-Nicole; Rallo, Robert

    2015-01-01

    Analysis of trends in nanotoxicology data and the development of data-driven models for nanotoxicity is facilitated by the reporting of data using a standardised electronic format. ISA-TAB-Nano has been proposed as such a format. However, in order to build useful datasets according to this format, a variety of issues have to be addressed. These issues include questions regarding exactly which (meta)data to report and how to report them. The current article discusses some of the challenges associated with the use of ISA-TAB-Nano and presents a set of resources designed to facilitate the manual creation of ISA-TAB-Nano datasets from the nanotoxicology literature. These resources were developed within the context of the NanoPUZZLES EU project and include data collection templates, corresponding business rules that extend the generic ISA-TAB-Nano specification as well as Python code to facilitate parsing and integration of these datasets within other nanoinformatics resources. The use of these resources is illustrated by a "Toy Dataset" presented in the Supporting Information. The strengths and weaknesses of the resources are discussed along with possible future developments.
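    ISA-TAB files are tab-separated tables, so the parsing this record mentions can be sketched with the standard csv module; the project's actual parsing code is not reproduced here, and the column names below are hypothetical:

```python
# Hedged sketch: reading an ISA-TAB-Nano-style tab-separated table.
# The file content and column names here are invented for illustration.
import csv
import io

tsv = "Sample Name\tMaterial\tCore Size [nm]\nNP-1\tTiO2\t21\nNP-2\tZnO\t35\n"
rows = list(csv.DictReader(io.StringIO(tsv), delimiter="\t"))
materials = [r["Material"] for r in rows]   # -> ["TiO2", "ZnO"]
```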

  15. An ISA-TAB-Nano based data collection framework to support data-driven modelling of nanotoxicology

    PubMed Central

    Marchese Robinson, Richard L; Richarz, Andrea-Nicole; Rallo, Robert

    2015-01-01

    Summary Analysis of trends in nanotoxicology data and the development of data-driven models for nanotoxicity is facilitated by the reporting of data using a standardised electronic format. ISA-TAB-Nano has been proposed as such a format. However, in order to build useful datasets according to this format, a variety of issues have to be addressed. These issues include questions regarding exactly which (meta)data to report and how to report them. The current article discusses some of the challenges associated with the use of ISA-TAB-Nano and presents a set of resources designed to facilitate the manual creation of ISA-TAB-Nano datasets from the nanotoxicology literature. These resources were developed within the context of the NanoPUZZLES EU project and include data collection templates, corresponding business rules that extend the generic ISA-TAB-Nano specification as well as Python code to facilitate parsing and integration of these datasets within other nanoinformatics resources. The use of these resources is illustrated by a “Toy Dataset” presented in the Supporting Information. The strengths and weaknesses of the resources are discussed along with possible future developments. PMID:26665069

  16. A Low-Cost Method for Multiple Disease Prediction.

    PubMed

    Bayati, Mohsen; Bhaskar, Sonia; Montanari, Andrea

    Recently, in response to the rising costs of healthcare services, employers that are financially responsible for the healthcare costs of their workforce have been investing in health improvement programs for their employees. A main objective of these so-called "wellness programs" is to reduce the incidence of chronic illnesses such as cardiovascular disease, cancer, diabetes, and obesity, with the goal of reducing future medical costs. The majority of these wellness programs include an annual screening to detect individuals with the highest risk of developing chronic disease. Once these individuals are identified, the company can invest in interventions to reduce the risk of those individuals. However, capturing many biomarkers per employee creates a costly screening procedure. We propose a statistical data-driven method to address this challenge by minimizing the number of biomarkers in the screening procedure while maximizing the predictive power over a broad spectrum of diseases. Our solution uses multi-task learning and group dimensionality reduction from machine learning and statistics. We provide empirical validation of the proposed solution using data from two different electronic medical records systems, with comparisons to a statistical benchmark.
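    The screening idea in this record — a small biomarker panel that predicts several diseases at once — can be sketched with a greedy joint ranking; this is a crude stand-in for the multi-task learning and group dimensionality reduction the paper actually uses, and all data below are synthetic:

```python
# Hedged sketch: rank biomarkers by their joint predictive value across several
# disease outcomes and keep only the top few as the screening panel.
import numpy as np

rng = np.random.default_rng(7)
n, n_biomarkers, n_diseases = 500, 30, 4
X = rng.normal(size=(n, n_biomarkers))
# Synthetic outcomes: every disease depends only on the first three biomarkers.
Y = X[:, :3] @ np.ones((3, n_diseases)) + rng.normal(size=(n, n_diseases))

# Joint score: sum of squared correlations of each biomarker with every disease.
corr = np.array([[np.corrcoef(X[:, j], Y[:, d])[0, 1] for d in range(n_diseases)]
                 for j in range(n_biomarkers)])
joint_score = (corr ** 2).sum(axis=1)
panel = np.argsort(joint_score)[::-1][:3]   # the cheapest screening panel
```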

  17. Role of regulatory agencies in translating pharmacogenetics to the clinics

    PubMed Central

    Prasad, Krishna

    2009-01-01

    Overall, the regulators (here the term is used in the broad sense including competent authorities, the national departments of health and the European commission) have a significant role in translating pharmacogenomics into clinical practice. The first objective is to establish the role of the genomic information that is available, and this should be data driven. Conduct of robust clinical trials that are sound both scientifically and from a regulatory perspective should be encouraged. Significant interaction between Academia, Pharma industry and the regulator is essential with the overall aim of improving public health. Conceptually, this would involve the triumvirate (Academia, industry and regulators) as an orchestra with the regulators perhaps taking the role of the conductor while the significant players would be those that generate data (Academia and industry). The regulators also need to ensure that clear guidance is available for use of the information and the tests with a significant level of uniformity between the ICH regions. The commercial availability of the test will have considerable impact on the use of pharmacogenomics, but this is currently beyond the scope of this paper. PMID:22461095

  18. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    PubMed Central

    Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin

    2016-01-01

    The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and Kriging method is realized to generate an anomaly indicator. The experiment results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035
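    The spatial anomaly indicator in this record rests on predicting one sensor's reading from its neighbours; the sketch below uses inverse-distance weighting as a simple stand-in for the Kriging method the paper employs, with made-up sensor positions and readings:

```python
# Hedged sketch: estimate a sensor's reading from spatially correlated neighbours
# and flag an anomaly when the residual exceeds a threshold. IDW replaces Kriging
# here for brevity; positions, values, and the threshold are illustrative.
import numpy as np

def idw_estimate(positions, values, target_pos, power=2):
    """Inverse-distance-weighted estimate at target_pos from neighbouring sensors."""
    d = np.linalg.norm(positions - target_pos, axis=1)
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

# Sensors along a tunnel ring; the reading at x = 3 looks suspicious.
positions = np.array([[0.0], [1.0], [2.0], [4.0]])
values = np.array([10.0, 10.2, 10.1, 9.9])
reading = 14.0
estimate = idw_estimate(positions, values, np.array([3.0]))
anomaly = abs(reading - estimate) > 1.0     # anomaly indicator
```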

  19. Status of Goldstone solar energy system study of the first Goldstone energy project

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.

    1977-01-01

    The results reached by the DSN engineering section and private consultants in the review of the initial plan of the Goldstone Energy Project are summarized. The main objectives were in the areas of energy conservation and the application of solar-driven systems for power and hydrogen generation. This summary will provide background data for management planning decisions both to the DSN engineering section and to other organizations planning a similar program. The review showed that an add-on solar-driven absorption refrigeration unit with its associated changes to the existing system was not cost-effective, having a payback period of 29 years. Similar economically unattractive results were found for both a solar-hydrogen and a wind-hydrogen generation plant. However, cutting the hydrogen generation linkage from this plant improved its economic feasibility.

  20. Asteroids in the High Cadence Transient Survey

    NASA Astrophysics Data System (ADS)

    Peña, J.; Fuentes, C.; Förster, F.; Maureira, J. C.; San Martín, J.; Littín, J.; Huijse, P.; Cabrera-Vives, G.; Estévez, P. A.; Galbany, L.; González-Gaitán, S.; Martínez, J.; de Jaeger, Th.; Hamuy, M.

    2018-03-01

    We report on the serendipitous observations of solar system objects imaged during the High cadence Transient Survey 2014 observation campaign. Data from this high-cadence wide-field survey were originally analyzed for finding variable static sources using machine learning to select the most likely candidates. In this work, we search for moving transients consistent with solar system objects and derive their orbital parameters. We use a simple, custom motion detection algorithm to link trajectories and assume Keplerian motion to derive the asteroids' orbital parameters. We use known asteroids from the Minor Planet Center database to assess the detection efficiency of the survey and our search algorithm. Trajectories have an average of nine detections spread over two days, and our fit yields typical errors of σ_a ∼ 0.07 au, σ_e ∼ 0.07, and σ_i ∼ 0.5° in semimajor axis, eccentricity, and inclination, respectively, for known asteroids in our sample. We extract 7700 orbits from our trajectories, identifying 19 near-Earth objects, 6687 asteroids, 14 Centaurs, and 15 trans-Neptunian objects. This highlights the complementarity of supernova wide-field surveys for solar system research and the significance of machine learning to clean data of false detections. It is a good example of the data-driven science that the Large Synoptic Survey Telescope will deliver.

  1. Vibrotactile feedback for conveying object shape information as perceived by artificial sensing of robotic arm.

    PubMed

    Khasnobish, Anwesha; Pal, Monalisa; Sardar, Dwaipayan; Tibarewala, D N; Konar, Amit

    2016-08-01

    This work is a preliminary study towards developing an alternative communication channel for conveying shape information to aid in recognition of items when tactile perception is hindered. Tactile data, acquired during object exploration by a sensor-fitted robot arm, are processed to recognize four basic geometric shapes. Patterns representing each shape, classified from tactile data, are generated using micro-controller-driven vibration motors which vibrotactually stimulate users to convey the particular shape information. These motors are attached to the subject's arm and their psychological (verbal) responses are recorded to assess the competence of the system to convey shape information to the user in the form of vibrotactile stimulation. Object shapes are classified from tactile data with an average accuracy of 95.21%. Over three successive sessions of shape recognition from vibrotactile patterns, subjects' recognition of the stimuli, as measured by their psychological responses, increased from 75% to 95%. This observation substantiates learning of the vibrotactile stimulation over the sessions, which in turn increases system efficacy. The tactile sensing module and vibrotactile pattern generating module are integrated to complete the system, whose operation is analysed in real time. Thus, the work demonstrates a successful implementation of the complete schema of an artificial tactile sensing system for object-shape recognition through vibrotactile stimulation.

  2. Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods

    NASA Astrophysics Data System (ADS)

    Rogers, Adam; Safi-Harb, Samar; Fiege, Jason

    2015-08-01

    The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
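    This record's tool is written in MATLAB and drives XSPEC models with the Ferret genetic algorithm and Locust particle swarm; as a hedged stand-in, the same global-fitting idea can be illustrated with SciPy's differential evolution on a toy Gaussian-line model and synthetic counts:

```python
# Hedged illustration of global spectral fitting: SciPy's differential_evolution
# plays the role of the paper's genetic/swarm optimizers; the Gaussian-line model
# and synthetic "spectrum" below are toys, not XSPEC models.
import numpy as np
from scipy.optimize import differential_evolution

energy = np.linspace(1.0, 10.0, 200)                       # keV grid
true = (5.0, 6.4, 0.3)                                     # amplitude, centre, width
model = lambda p, e: p[0] * np.exp(-0.5 * ((e - p[1]) / p[2]) ** 2)
rng = np.random.default_rng(3)
data = model(true, energy) + rng.normal(scale=0.1, size=energy.size)

chi2 = lambda p: np.sum((data - model(p, energy)) ** 2)    # objective to minimise
result = differential_evolution(chi2, bounds=[(0, 10), (1, 10), (0.05, 2)], seed=3)
amp, centre, width = result.x
```

Like the GA/PSO approach described above, the global search needs only parameter bounds, not a good starting guess.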

  3. Virtualization of event sources in wireless sensor networks for the internet of things.

    PubMed

    Lucas Martínez, Néstor; Martínez, José-Fernán; Hernández Díaz, Vicente

    2014-12-01

    Wireless Sensor Networks (WSNs) are generally used to collect information from the environment. The gathered data are delivered mainly to sinks or gateways that become the endpoints where applications can retrieve and process such data. However, applications would also expect from a WSN an event-driven operational model, so that they can be notified whenever specific environmental changes occur, instead of continuously analyzing the data provided periodically. In either operational model, WSNs represent a collection of interconnected objects, as outlined by the Internet of Things. Additionally, in order to fulfill the Internet of Things principles, Wireless Sensor Networks must have a virtual representation that allows indirect access to their resources, a model that should also include the virtualization of event sources in a WSN. Thus, in this paper a model for a virtual representation of event sources in a WSN is proposed. They are modeled as internet resources that are accessible by any internet application, following an Internet of Things approach. The model has been tested in a real implementation where a WSN has been deployed in an open neighborhood environment. Different event sources have been identified in the proposed scenario, and they have been represented following the proposed model.
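    The event-driven operational model contrasted with polling in this record can be sketched as a simple observer pattern; the class and the temperature condition below are illustrative, not part of the paper's model:

```python
# Hedged sketch: an event source notifies subscribed applications only when a
# sensed value satisfies a condition, instead of delivering every periodic reading.
class EventSource:
    def __init__(self, condition):
        self.condition = condition      # predicate over sensed values
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def push_reading(self, value):
        if self.condition(value):       # notify only on the event of interest
            for cb in self.subscribers:
                cb(value)

alerts = []
temp_event = EventSource(lambda t: t > 30.0)   # "temperature above 30 °C" event
temp_event.subscribe(alerts.append)
for reading in [22.5, 28.0, 31.2, 29.9]:
    temp_event.push_reading(reading)
# Only the 31.2 reading triggers a notification.
```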

  4. Physical Samples Linked Data in Action

    NASA Astrophysics Data System (ADS)

    Ji, P.; Arko, R. A.; Lehnert, K.; Bristol, S.

    2017-12-01

    Most data and metadata related to physical samples currently reside in isolated relational databases driven by diverse data models. How to approach the challenge for sharing, interchanging and integrating data from these difference relational databases motivated us to publish Linked Open Data for collections of physical samples, using Semantic Web technologies including the Resource Description Framework (RDF), RDF Query Language (SPARQL), and Web Ontology Language (OWL). In last few years, we have released four knowledge graphs concentrated on physical samples, including System for Earth Sample Registration (SESAR), USGS National Geochemical Database (NGDC), Ocean Biogeographic Information System (OBIS), and Earthchem Database. Currently the four knowledge graphs contain over 12 million facets (triples) about objects of interest to the geoscience domain. Choosing appropriate domain ontologies for representing context of data is the core of the whole work. Geolink ontology developed by Earthcube Geolink project was used as top level to represent common concepts like person, organization, cruise, etc. Physical sample ontology developed by Interdisciplinary Earth Data Alliance (IEDA) and Darwin Core vocabulary were used as second level to describe details about geological samples and biological diversity. We also focused on finding and building best tool chains to support the whole life cycle of publishing linked data we have, including information retrieval, linked data browsing and data visualization. Currently, Morph, Virtuoso Server, LodView, LodLive, and YASGUI were employed for converting, storing, representing, and querying data in a knowledge base (RDF triplestore). Persistent digital identifier is another main point we concentrated on. 
Open Researcher & Contributor IDs (ORCIDs), International Geo Sample Numbers (IGSNs), Global Research Identifier Database (GRID) and other persistent identifiers were used to link different resources from various graphs with person, sample, organization, cruise, etc. This work is supported by the EarthCube "GeoLink" project (NSF# ICER14-40221 and others) and the "USGS-IEDA Partnership to Support a Data Lifecycle Framework and Tools" project (USGS# G13AC00381).
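    As a rough illustration of how such a triplestore is queried, the snippet below performs SPARQL-style triple-pattern matching in plain Python; the sample IGSN/ORCID identifiers and predicate names are invented, not taken from the published graphs.

```python
# Minimal illustration of triple-pattern matching over a small knowledge
# graph, in the spirit of a SPARQL basic graph pattern. All identifiers
# below are invented for the example.

triples = {
    ("igsn:SAMPLE1", "collectedBy", "orcid:0000-0001"),
    ("igsn:SAMPLE1", "rdf:type", "PhysicalSample"),
    ("igsn:SAMPLE2", "collectedBy", "orcid:0000-0002"),
    ("orcid:0000-0001", "rdf:type", "Person"),
}

def match(pattern):
    """Return the triples matching a (s, p, o) pattern; None acts as a variable."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which resources were collected by orcid:0000-0001?"
hits = match((None, "collectedBy", "orcid:0000-0001"))
print(hits)
```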

  5. The Datafication of Everything - Even Toilets.

    PubMed

    Lun, Kwok-Chan

    2018-04-22

    Health informatics has benefitted from the development of Info-Communications Technology (ICT) over the last fifty years. Advances in ICT in healthcare have now started to spur advances in Data Technology as hospital information systems, electronic health and medical records, mobile devices, social media and the Internet of Things (IoT) are making a substantial impact on the generation of data. It is timely for healthcare institutions to recognize data as a corporate asset and promote a data-driven culture within the institution. It is both strategic and timely for IMIA, as an international organization in health informatics, to take the lead in promoting a data-driven culture in healthcare organizations. This can be achieved by expanding the terms of reference of its existing Working Group on Data Mining and Big Data Analysis to include (1) data analytics with special reference to healthcare, (2) big data tools and solutions, (3) bridging information technology and data technology and (4) data quality issues and challenges. Georg Thieme Verlag KG Stuttgart.

  6. Traditional Medicine Collection Tracking System (TM-CTS): a database for ethnobotanically driven drug-discovery programs.

    PubMed

    Harris, Eric S J; Erickson, Sean D; Tolopko, Andrew N; Cao, Shugeng; Craycroft, Jane A; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E; Eisenberg, David M

    2011-05-17

    Ethnobotanically driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically driven natural product collection and drug-discovery programs. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Traditional Medicine Collection Tracking System (TM-CTS): A Database for Ethnobotanically-Driven Drug-Discovery Programs

    PubMed Central

    Harris, Eric S. J.; Erickson, Sean D.; Tolopko, Andrew N.; Cao, Shugeng; Craycroft, Jane A.; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E.; Eisenberg, David M.

    2011-01-01

    Aim of the study. Ethnobotanically-driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine-Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. Materials and Methods. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. Results. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. Conclusions. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically-driven natural product collection and drug-discovery programs. PMID:21420479

  8. Universal depinning transition of domain walls in ultrathin ferromagnets

    NASA Astrophysics Data System (ADS)

    Diaz Pardo, R.; Savero Torres, W.; Kolton, A. B.; Bustingorry, S.; Jeudy, V.

    2017-05-01

    We present a quantitative and comparative study of magnetic-field-driven domain-wall depinning transition in different ferromagnetic ultrathin films over a wide range of temperature. We reveal a universal scaling function accounting for both drive and thermal effects on the depinning transition, including critical exponents. The consistent description we obtain for both the depinning and subthreshold thermally activated creep motion should shed light on the universal glassy dynamics of thermally fluctuating elastic objects pinned by disordered energy landscapes.

  9. Harnessing Orbital Debris to Sense the Space Environment

    NASA Astrophysics Data System (ADS)

    Mutschler, S.; Axelrad, P.; Matsuo, T.

    A key requirement for accurate space situational awareness (SSA) is knowledge of the non-conservative forces that act on space objects. These effects vary temporally and spatially, driven by the dynamical behavior of space weather. Existing SSA algorithms adjust space weather models based on observations of calibration satellites. However, lack of sufficient data and mismodeling of non-conservative forces cause inaccuracies in space object motion prediction. The uncontrolled nature of debris makes it particularly sensitive to the variations in space weather. Our research takes advantage of this behavior by inverting observations of debris objects to infer the space environment parameters causing their motion. In addition, this research will produce more accurate predictions of the motion of debris objects. The hypothesis of this research is that it is possible to utilize a "cluster" of debris objects, objects within relatively close proximity of each other, to sense their local environment. We focus on deriving parameters of an atmospheric density model to more precisely predict the drag force on LEO objects. An Ensemble Kalman Filter (EnKF) is used for assimilation; the prior ensemble is transformed into the posterior ensemble during the measurement update in a manner that does not require inversion of large matrices. A prior ensemble is utilized to empirically determine the nonlinear relationship between measurements and density parameters. The filter estimates an extended state that includes position and velocity of the debris object, and atmospheric density parameters. The density is parameterized as a grid of values, distributed by latitude and local sidereal time over a spherical shell encompassing Earth. This research focuses on LEO object motion, but it can also be extended to additional orbital regimes for observation and refinement of magnetic field and solar radiation models. 
An observability analysis of the proposed approach is presented in terms of the measurement cadence necessary to estimate the local space environment.
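    A minimal sketch of the EnKF measurement update described above, reduced to a single scalar density parameter so that no large-matrix inversion even arises; the ensemble size, observation values, and variances are invented for illustration.

```python
import random
random.seed(0)

# Illustrative scalar ensemble Kalman measurement update: each ensemble
# member of an atmospheric-density parameter is nudged toward a perturbed
# observation. All numerical values below are invented.

def enkf_update(ensemble, obs, obs_var):
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_var)          # scalar Kalman gain
    # Each member sees the observation plus its own perturbation.
    return [x + gain * (obs + random.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

prior = [random.gauss(4.0e-12, 1.0e-12) for _ in range(50)]   # kg/m^3
posterior = enkf_update(prior, obs=3.0e-12, obs_var=(2.0e-13) ** 2)
post_mean = sum(posterior) / len(posterior)
# the posterior mean moves toward the observation
print(post_mean)
```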

  10. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    DOE PAGES

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; ...

    2013-01-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches; a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.
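    The combination of a data-driven score with expert knowledge can be caricatured as follows; the protein names, weight, and cutoff are hypothetical, and the real ISIC pipeline uses distance-based clustering rather than this simple boost-and-threshold rule.

```python
# Toy sketch of blending a data-driven score with expert knowledge in
# marker selection; the protein names and all numbers are invented.

data_scores = {"ProtA": 0.91, "ProtB": 0.40, "ProtC": 0.75, "ProtD": 0.15}
expert_relevant = {"ProtB", "ProtC"}   # e.g. proteins an expert flags as COPD-relevant

def select_markers(scores, expert, weight=0.3, cutoff=0.6):
    """Boost data-driven scores for expert-flagged features, then threshold."""
    combined = {f: s + (weight if f in expert else 0.0)
                for f, s in scores.items()}
    return sorted(f for f, s in combined.items() if s >= cutoff)

markers = select_markers(data_scores, expert_relevant)
print(markers)  # ProtB's 0.40 + 0.3 boost passes the cutoff; ProtD does not
```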

  11. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    PubMed Central

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Varnum, Susan M.; Brown, Joseph N.; Riensche, Roderick M.; Adkins, Joshua N.; Jacobs, Jon M.; Hoidal, John R.; Scholand, Mary Beth; Pounds, Joel G.; Blackburn, Michael R.; Rodland, Karin D.; McDermott, Jason E.

    2013-01-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches; a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification. PMID:24223463

  12. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.

    2013-10-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches; a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.

  13. Risk Factors Associated with Injury and Mortality from Paediatric Low Speed Vehicle Incidents: A Systematic Review

    PubMed Central

    Paul Anthikkat, Anne; Page, Andrew; Barker, Ruth

    2013-01-01

    Objective. This study reviews modifiable risk factors associated with fatal and nonfatal injury from low-speed vehicle runover (LSVRO) incidents involving children aged 0–15 years. Data Sources. Electronic searches for child pedestrian and driveway injuries from the peer-reviewed literature and transport-related websites from 1955 to 2012. Study Selection. 41 studies met the study inclusion criteria. Data Extraction. A systematic narrative summary was conducted that included study design, methodology, risk factors, and other study variables. Results. The most commonly reported risk factors for LSVRO incidents included age under 5 years, male gender, and reversing vehicles. The majority of reported incidents involved residential driveways, but several studies identified other traffic and nontraffic locations. Low socioeconomic status and rental accommodation were also associated with LSVRO injury. Vehicles were most commonly driven by a family member, predominantly a parent. Conclusion. There are a number of modifiable vehicular, environmental, and behavioural factors associated with LSVRO injuries in young children that have been identified in the literature to date. Strategies relating to vehicle design (devices for increased rearward visibility and crash avoidance systems), housing design (physical separation of driveway and play areas), and behaviour (driver behaviour, supervision of young children) are discussed. PMID:23781251

  14. Laser driven plasmas based incoherent x-ray sources at PALS and ELI Beamlines (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kozlová, Michaela

    2017-05-01

    We will present data on various X-ray production schemes from laser-driven plasmas at the PALS Research Center and discuss the plan for the ELI Beamlines project. One approach to generating ultrashort pulses of incoherent X-ray radiation is based on the interaction of femtosecond laser pulses with solid or liquid targets. This so-called K-alpha source emits, depending on the target used, in the hard X-ray region from a micrometric source size. The source exhibits sufficient spatial coherence to observe phase contrast. Detailed characterization of various sources, including the X-ray spectrum and the average X-ray yield, along with phase-contrast images of test objects, will be presented. Another method, known as laser wakefield electron acceleration (LWFA), can produce electron beams of up to GeV energy emitting radiation in a collimated beam with a femtosecond pulse duration. This approach was theoretically and experimentally examined at the PALS Center. The parameters of the PALS Ti:S laser interaction were studied by extensive particle-in-cell simulations with radiation post-processors in order to evaluate the capabilities of our system in this field. Extensions of these methods at the ELI Beamlines facility will enable the generation of either higher X-ray energies or higher repetition rates. The architecture of such sources and their envisaged applications will be proposed.

  15. Priming of Reach and Grasp Actions by Handled Objects

    ERIC Educational Resources Information Center

    Masson, Michael E. J.; Bub, Daniel N.; Breuer, Andreas T.

    2011-01-01

    Pictures of handled objects such as a beer mug or frying pan are shown to prime speeded reach and grasp actions that are compatible with the object. To determine whether the evocation of motor affordances implied by this result is driven merely by the physical orientation of the object's handle as opposed to higher-level properties of the object,…

  16. Zooming into local active galactic nuclei: the power of combining SDSS-IV MaNGA with higher resolution integral field unit observations

    NASA Astrophysics Data System (ADS)

    Wylezalek, Dominika; Schnorr Müller, Allan; Zakamska, Nadia L.; Storchi-Bergmann, Thaisa; Greene, Jenny E.; Müller-Sánchez, Francisco; Kelly, Michael; Liu, Guilin; Law, David R.; Barrera-Ballesteros, Jorge K.; Riffel, Rogemar A.; Thomas, Daniel

    2017-05-01

    Ionized gas outflows driven by active galactic nuclei (AGN) are ubiquitous in high-luminosity AGN with outflow speeds apparently correlated with the total bolometric luminosity of the AGN. This empirical relation and theoretical work suggest that in the range Lbol ~ 10^43-45 erg s^-1 there must exist a threshold luminosity above which the AGN becomes powerful enough to launch winds that will be able to escape the galaxy potential. In this paper, we present pilot observations of two AGN in this transitional range that were taken with the Gemini North Multi-Object Spectrograph integral field unit (IFU). Both sources have also previously been observed within the Sloan Digital Sky Survey-IV (SDSS) Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey. While the MaNGA IFU maps probe the gas fields on galaxy-wide scales and show that some regions are dominated by AGN ionization, the new Gemini IFU data zoom into the centre with four times better spatial resolution. In the object with the lower Lbol we find evidence of a young or stalled biconical AGN-driven outflow where none was obvious at the MaNGA resolution. In the object with the higher Lbol we trace the large-scale biconical outflow into the nuclear region and connect the outflow from small to large scales. These observations suggest that AGN luminosity and galaxy potential are crucial in shaping wind launching and propagation in low-luminosity AGN. The transition from small and young outflows to galaxy-wide feedback can only be understood by combining large-scale IFU data that trace the galaxy velocity field with higher resolution, small-scale IFU maps.

  17. DATA MANAGEMENT SYSTEM FOR MOBILE SATELLITE PROPAGATION DATA

    NASA Technical Reports Server (NTRS)

    Kantak, A. V.

    1994-01-01

    The "Data Management System for Mobile Satellite Propagation" package is a collection of FORTRAN programs and UNIX shell scripts designed to handle the huge amounts of data resulting from Mobile Satellite propagation experiments. These experiments are designed to assist in defining channels for mobile satellite systems. By understanding the multipath fading characteristics of the channel, Doppler effects, and blockage due to manmade objects as well as natural surroundings, characterization of the channel can be realized. Propagation experiments, then, are performed using a prototype of the system simulating the ultimate product environment. After the data from these experiments is generated, the researcher must access this data with a minimum of effort and derive some standard results. The programs included in this package manipulate the data files generated by the NASA/JPL Mobile Satellite propagation experiment on an interactive basis. In the experiment, a transmitter operating at 869 MHz was carried to an altitude of 32 km by a stratospheric balloon. A vehicle within the line-of-sight of the transmitter was then driven around, splitting the incoming signal into I and Q channels, and sampling the resulting signal strength at 1000 samples per second. The data was collected at various antenna elevation angles and different times of day, generating the ancillary data for the experiment. This package contains a program to convert the binary format of the data generated into standard ASCII format suitable for use with a wide variety of machine architectures. Also included is a UNIX shell script designed to parse this ASCII file into those records of data that match the researcher's desired values for the ancillary data parameters. In addition, four FORTRAN programs are included to obtain standard quantities from the data. 
Quantities such as the probability of the signal level being greater than or equal to a specified level, the probability density of the signal levels, the frequency of fade durations, and Fourier transforms of the sampled data can be generated from the propagation experiment data. All programs in this package are written in either FORTRAN 77 or UNIX shell scripts. The package does not include test data. The programs were developed in 1987 for use with a UNIX operating system on a DEC MicroVAX computer.
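    Two of the standard quantities named above, the exceedance probability of the signal level and the durations of fades below a threshold, can be sketched as follows (in Python rather than the package's FORTRAN 77; the sample values and thresholds are invented, assuming the experiment's 1 kHz sampling rate):

```python
# Illustrative computation of exceedance probability and fade durations
# from a sampled signal-level record. The sample values are invented;
# dt = 0.001 s reflects the 1000 samples-per-second rate described above.

samples = [-2.0, -1.0, -9.0, -8.5, -1.5, -0.5, -7.0, -1.0]  # dB, invented

def exceedance_probability(levels, threshold):
    """Fraction of samples at or above the threshold level."""
    return sum(1 for x in levels if x >= threshold) / len(levels)

def fade_durations(levels, threshold, dt=0.001):
    """Lengths (seconds) of consecutive runs below the fade threshold."""
    runs, current = [], 0
    for x in levels:
        if x < threshold:
            current += 1
        elif current:
            runs.append(current * dt)
            current = 0
    if current:
        runs.append(current * dt)
    return runs

p = exceedance_probability(samples, -3.0)
fades = fade_durations(samples, -3.0)
print(p, fades)  # 5 of 8 samples are at or above -3 dB; two fades occur
```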

  18. Design of a surface deformation measuring instrument for the Surface Tension Driven Convection Experiment (STDCE-2)

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    1993-01-01

    This final technical report covers the work accomplished (under NAG3-1300) from 1 October 1991 to 1 October 1993. The grant is a direct result of Dr. H. Philip Stahl's (of Rose-Hulman Institute of Technology) participation in the NASA/ASEE Summer Faculty Fellowship Program at NASA Lewis Research Center sponsored by Case Western Reserve University and the Ohio Aerospace Institute. The Surface Tension Driven Convection Experiment (STDCE) is a fundamental fluid physics experiment designed to provide quantitative data on the thermocapillary flow of fluid under the influence of an increased localized surface temperature. STDCE flew on the Space Shuttle Columbia in the First United States Microgravity Laboratory (USML-1) in June 1992. The second flight of this experiment (STDCE-2) is scheduled for 1995. The specific science objectives of STDCE-2 are to determine the extent and nature of thermocapillary flows, the effect of heating mode and level, the effect of the liquid free-surface shape, and the onset conditions for and nature of oscillatory flows. In order to satisfy one of these objectives, an instrument for measuring the shape of an air/oil free surface must be developed.

  19. Uncovering hidden nodes in complex networks in the presence of noise

    PubMed Central

    Su, Ri-Qi; Lai, Ying-Cheng; Wang, Xiao; Do, Younghae

    2014-01-01

    Ascertaining the existence of hidden objects in a complex system, objects that cannot be observed from the external world, not only is curiosity-driven but also has significant practical applications. Generally, uncovering a hidden node in a complex network requires successful identification of its neighboring nodes, but a challenge is to differentiate its effects from those of noise. We develop a completely data-driven, compressive-sensing based method to address this issue by utilizing complex weighted networks with continuous-time oscillatory or discrete-time evolutionary-game dynamics. For any node, compressive sensing enables accurate reconstruction of the dynamical equations and coupling functions, provided that time series from this node and all its neighbors are available. For a neighboring node of the hidden node, this condition cannot be met, resulting in abnormally large prediction errors that, counterintuitively, can be used to infer the existence of the hidden node. Based on the principle of differential signal, we demonstrate that, when strong noise is present, insofar as at least two neighboring nodes of the hidden node are subject to weak background noise only, unequivocal identification of the hidden node can be achieved. PMID:24487720
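    The abnormal-prediction-error signature can be illustrated on a toy linear network: fitting each visible node's dynamics from visible time series alone leaves a much larger residual at the node coupled to the hidden node. The network, weights, and ordinary least-squares fit below are illustrative assumptions; the paper itself uses compressive sensing on oscillator and evolutionary-game dynamics.

```python
import random
random.seed(1)

# Toy 4-node linear network: nodes 0-2 are visible, node 3 is hidden and
# drives node 0 only. All weights and noise levels are invented.
A = [[0.5, 0.2, 0.0, 0.4],   # node 0 is driven by hidden node 3
     [0.1, 0.4, 0.3, 0.0],   # node 1 is not
     [0.0, 0.3, 0.5, 0.0],
     [0.0, 0.0, 0.0, 0.6]]
T, n = 400, 4
x = [random.random() for _ in range(n)]
series = [x[:]]
for _ in range(T):
    x = [sum(A[i][j] * x[j] for j in range(n)) + random.gauss(0, 0.01)
         for i in range(n)]
    x[3] += random.gauss(0, 0.3)   # keep the hidden node fluctuating
    series.append(x[:])

def solve3(M, b):
    """Gaussian elimination for a 3x3 system (no pivoting; toy data)."""
    M = [row[:] + [bi] for row, bi in zip(M, b)]
    for k in range(3):
        for r in range(k + 1, 3):
            f = M[r][k] / M[k][k]
            M[r] = [a - f * c for a, c in zip(M[r], M[k])]
    out = [0.0] * 3
    for k in (2, 1, 0):
        out[k] = (M[k][3] - sum(M[k][j] * out[j]
                                for j in range(k + 1, 3))) / M[k][k]
    return out

def fit_error(i):
    """Mean squared residual of a least-squares fit of node i's dynamics
    using the visible nodes 0..2 only."""
    X = [row[:3] for row in series[:-1]]
    y = [row[i] for row in series[1:]]
    G = [[sum(r[a] * r[b] for r in X) for b in range(3)] for a in range(3)]
    c = solve3(G, [sum(r[a] * yi for r, yi in zip(X, y)) for a in range(3)])
    return sum((yi - sum(ca * ra for ca, ra in zip(c, r))) ** 2
               for r, yi in zip(X, y)) / len(y)

print(fit_error(0), fit_error(1))  # node 0's residual is far larger
```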

  20. Data presentation techniques for rotating machinery malfunction diagnosis

    NASA Technical Reports Server (NTRS)

    Spettel, T.

    1985-01-01

    Baseline steady state data is excellent for documentation of vibration signals at normal operating conditions. Assuming that a set of initial data was acquired with the machinery in a good state of repair, any future changes or deterioration in mechanical condition can be easily compared to the baseline information. Often this type of comparison will yield sufficient information for evaluation of the problem. However, many malfunctions require the analysis of transient data in order to identify the malfunction. Steady-state data formats consist of: Time Base Waveform, Orbit, Spectrum. Transient data formats consist of: Polar, Bode, Cascade. Our objective is to demonstrate the use of the above formats to diagnose a machine malfunction. A turbine-driven compressor train is chosen as an example. The machine train outline drawing is shown.

  1. Model Driven Engineering

    NASA Astrophysics Data System (ADS)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.

  2. One year on VESPA, a community-driven Virtual Observatory in Planetary Science

    NASA Astrophysics Data System (ADS)

    Erard, S.; Cecconi, B.; Le Sidaner, P.; Rossi, A. P.; Capria, M. T.; Schmitt, B.; Andre, N.; Vandaele, A. C.; Scherf, M.; Hueso, R.; Maattanen, A. E.; Thuillot, W.; Achilleos, N.; Marmo, C.; Santolik, O.; Benson, K.

    2016-12-01

    The Europlanet H2020 program started on 1/9/2015 for 4 years. It includes an activity to adapt Virtual Observatory (VO) techniques to Planetary Science data called VESPA. The objective is to facilitate searches in big archives as well as sparse databases, to provide simple data access and on-line visualization, and to allow small data providers to make their data available in an interoperable environment with minimum effort. The VESPA system, based on a prototype developed in a previous program [1], has been hugely improved during the first year of Europlanet H2020: the infrastructure has been upgraded to describe data in many fields more accurately; the main user search interface (http://vespa.obspm.fr) has been redesigned to provide more flexibility; alternative ways to access Planetary Science data services from VO tools are being implemented in addition to receiving data from the main interface; VO tools are being improved to handle specificities of Solar System data, e.g. measurements in reflected light, coordinate systems, etc. Existing data services have been updated, and new ones have been designed. The overall objective (50 data services) has already been exceeded, with 54 services open or being finalized. A procedure to install data services has been documented, and hands-on sessions are organized twice a year at EGU and EPSC; this is intended to favour the installation of services by individual research teams, e.g. to distribute derived data related to a published study. In complement, regular discussions are held with big data providers, starting with space agencies (IPDA). Common projects with ESA and NASA's PDS have been engaged, which should lead to a connection between PDS4 and EPN-TAP. 
In parallel, a Solar System Interest Group has been established within the IVOA; the goal here is to adapt existing astronomy standards to Planetary Science. Future steps will include the development of a connection between the VO world and GIS tools, and the integration of Heliophysics, planetary plasma, and mineral spectroscopy data. The Europlanet 2020 Research Infrastructure project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208. [1] Erard et al 2014, Astronomy & Computing 7-8, 71-80. http://arxiv.org/abs/1407.4886

  3. IRON OPTIMIZATION FOR FENTON-DRIVEN OXIDATION OF MTBE-SPENT GRANULAR ACTIVATED CARBON

    EPA Science Inventory

    Fenton-driven chemical regeneration of granular activated carbon (GAC) is accomplished through the addition of H2O2 and iron (Fe) to spent GAC. The overall objective of this treatment process is to transform target contaminants into less toxic byproducts, re-establish the sorpti...

  4. The Development of Teaching and Learning in Bright-Field Microscopy Technique

    ERIC Educational Resources Information Center

    Iskandar, Yulita Hanum P.; Mahmud, Nurul Ethika; Wahab, Wan Nor Amilah Wan Abdul; Jamil, Noor Izani Noor; Basir, Nurlida

    2013-01-01

    E-learning should be pedagogically-driven rather than technologically-driven. The objectives of this study are to develop an interactive learning system in bright-field microscopy technique in order to support students' achievement of their intended learning outcomes. An interactive learning system on bright-field microscopy technique was…

  5. Field instrumentation and testing to study set-up phenomenon of piles driven into Louisiana clayey soils.

    DOT National Transportation Integrated Search

    2011-02-01

    The main objective of this research study is to evaluate the time-dependent increase in pile capacity (or pile setup phenomenon) for piles driven into Louisiana soils through conducting repeated static and dynamic field testing with time on full-scal...

  6. 3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands.

    PubMed

    Mateo, Carlos M; Gil, Pablo; Torres, Fernando

    2016-05-05

    Sensing techniques are important for solving problems of uncertainty inherent to intelligent grasping tasks. The main goal here is to present a visual sensing system based on range imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system for complex grasping tasks, supporting a robot controller when other sensor systems, such as tactile and force, are not able to obtain useful data relevant to the grasping manipulation task. In particular, a new visual approach based on RGBD data was implemented to help a robot controller carry out intelligent manipulation tasks with flexible objects. The proposed method supervises the interaction between the grasped object and the robot hand in order to avoid poor contact between the fingertips and the object when neither force nor pressure data are available. This new approach is also used to measure changes to the shape of an object's surfaces, allowing us to find deformations caused by inappropriate pressure applied by the hand's fingers. Tests were carried out on grasping tasks involving several flexible household objects with a multi-fingered robot hand working in real time. Our approach generates pulses from the deformation detection method and sends an event message to the robot controller when surface deformation is detected. In comparison with other methods, the results reveal that our visual pipeline does not require deformation models of objects and materials, and that the approach works well for both planar and 3D household objects in real time. In addition, our method does not depend on the pose of the robot hand, because the location of the reference system is computed by recognizing a pattern located at the robot forearm. The presented experiments demonstrate that the proposed method accomplishes good monitoring of grasping tasks with several objects and different grasping configurations in indoor environments.
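    The deformation cue can be caricatured as a depth-map comparison; the grids, tolerance, and fraction below are invented, and the actual system operates on RGBD point clouds rather than toy arrays.

```python
# Simplified sketch of the deformation cue: compare a reference depth map of
# the grasped object's surface with the current one and flag a deformation
# event when enough points deviate. All values below are invented.

def deformation_event(reference, current, depth_tol=0.005, frac=0.10):
    """True when more than `frac` of surface points moved by over `depth_tol` m."""
    total, moved = 0, 0
    for ref_row, cur_row in zip(reference, current):
        for r, c in zip(ref_row, cur_row):
            total += 1
            if abs(r - c) > depth_tol:
                moved += 1
    return moved / total > frac

flat = [[0.50] * 4 for _ in range(4)]
dented = [row[:] for row in flat]
dented[1][1] = dented[1][2] = dented[2][1] = 0.48   # a finger pressing in 2 cm
print(deformation_event(flat, flat), deformation_event(flat, dented))
```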

  7. Understanding and Improving Modifiable Cardiovascular Risks within the Air Force

    DTIC Science & Technology

    2013-10-04

    Promotion Model (HPM). Findings: The definition of health included exercise, proper eating, sleep, and a spiritual connection, as well as the absence of...to health behaviors, including what it takes to be healthy, knowing oneself, and existing Air Force policies. The HPM did not fully address all of the...was used to arrange the data into data-driven themes. These themes were then compared to the elements of the Health Promotion Model (HPM

  8. Object-oriented model-driven control

    NASA Technical Reports Server (NTRS)

    Drysdale, A.; Mcroberts, M.; Sager, J.; Wheeler, R.

    1994-01-01

    A monitoring and control subsystem architecture has been developed that capitalizes on the use of model-driven monitoring and predictive control, knowledge-based data representation, and artificial reasoning in an operator support mode. We have developed an object-oriented model of a Controlled Ecological Life Support System (CELSS). The model, based on the NASA Kennedy Space Center CELSS breadboard data, tracks carbon, hydrogen, oxygen, carbon dioxide, and water. It estimates and tracks resource-related parameters such as mass, energy, and manpower measurements such as growing area required for balance. We are developing an interface with the breadboard systems that is compatible with artificial reasoning. Initial work is being done on the use of expert systems and user interface development. This paper presents an approach to defining universally applicable CELSS monitor and control issues, and implementing appropriate monitor and control capability for a particular instance: the KSC CELSS Breadboard Facility.

  9. Data Center Energy Practitioner (DCEP) Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Traber, Kim; Salim, Munther; Sartor, Dale A.

    2016-02-02

    The main objective for the DCEP program is to raise the standards of those involved in energy assessments of data centers to accelerate energy savings. The program is driven by the fact that significant knowledge, training, and skills are required to perform accurate energy assessments. The program will raise the confidence level in energy assessments in data centers. For those who pass the exam, the program will recognize them as Data Center Energy Practitioners (DCEPs) by issuing a certificate. Hardware req.: PC, Mac; software req.: Windows; related/auxiliary software: MS Office; type of files: executable modules, user guide; documentation: e-user manual, http://www.1.eere.energy.gov/industry/datacenters/ (12/10/15, new documentation URL: https://datacenters.lbl.gov/dcep)

  10. Topical video object discovery from key frames by modeling word co-occurrence prior.

    PubMed

    Zhao, Gangqiang; Yuan, Junsong; Hua, Gang; Yang, Jiong

    2015-12-01

    A topical video object refers to an object that is frequently highlighted in a video. It could be, e.g., the product logo or the leading actor/actress in a TV commercial. We propose a topic model that incorporates a word co-occurrence prior for efficient discovery of topical video objects from a set of key frames. Previous work using topic models, such as latent Dirichlet allocation (LDA), for video object discovery often takes a bag-of-visual-words representation, which ignores important co-occurrence information among the local features. We show that such data-driven, bottom-up co-occurrence information can conveniently be incorporated in LDA with a Gaussian Markov prior, which combines top-down probabilistic topic modeling with bottom-up priors in a unified model. Our experiments on challenging videos demonstrate that the proposed approach can discover different types of topical objects despite variations in scale, viewpoint, color and lighting changes, or even partial occlusions. The efficacy of the co-occurrence prior is clearly demonstrated when compared with topic models without such priors.
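
    The co-occurrence statistics that such a prior builds on can be gathered with a simple count over key frames. A minimal sketch, not the paper's model: the function name and the frame representation (a list of visual-word IDs per key frame) are assumptions made here for illustration.

    ```python
    from itertools import combinations

    import numpy as np

    def cooccurrence_matrix(frames, vocab_size):
        """Count how often each pair of visual words appears in the same
        key frame. `frames` is a list of per-frame visual-word ID lists;
        the result is a symmetric (vocab_size x vocab_size) count matrix."""
        C = np.zeros((vocab_size, vocab_size), dtype=int)
        for words in frames:
            # Use the set of words so repeats within one frame count once.
            for i, j in combinations(sorted(set(words)), 2):
                C[i, j] += 1
                C[j, i] += 1
        return C
    ```

    A matrix like this could then be normalized and used to inform pairwise smoothing between word topic assignments, in the spirit of the Gaussian Markov prior the abstract describes.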

  11. Depletion force induced collective motion of microtubules driven by kinesin

    NASA Astrophysics Data System (ADS)

    Inoue, Daisuke; Mahmot, Bulbul; Kabir, Arif Md. Rashedul; Farhana, Tamanna Ishrat; Tokuraku, Kiyotaka; Sada, Kazuki; Konagaya, Akihiko; Kakugo, Akira

    2015-10-01

    Collective motion is a fascinating example of coordinated behavior of self-propelled objects, which is often associated with the formation of large scale patterns. Nowadays, the in vitro gliding assay is considered a model system for experimentally investigating various aspects of group behavior and pattern formation by self-propelled objects. In the in vitro gliding assay, cytoskeletal filaments F-actin or microtubules are driven by the surface-immobilized associated biomolecular motors myosin or dynein, respectively. Although the F-actin/myosin and microtubule/dynein systems were found to be promising in understanding the collective motion and pattern formation by self-propelled objects, the most widely used biomolecular motor system, microtubule/kinesin, could not be successfully employed so far in this regard. Failure in exhibiting collective motion by kinesin-driven microtubules is attributed to the intrinsic properties of kinesin, which were speculated to affect the behavior of individual gliding microtubules and the mutual interactions among them. In this work, for the first time, we have demonstrated the collective motion of kinesin-driven microtubules by regulating the mutual interaction among the gliding microtubules, by employing a depletion force among them. Proper regulation of the mutual interaction among the gliding microtubules through the employment of the depletion force was found to allow the exhibition of collective motion and stream pattern formation by the microtubules. This work offers a universal means for demonstrating collective motion using the in vitro gliding assay of biomolecular motor systems and will help obtain a meticulous understanding of the fascinating coordinated behavior and pattern formation by self-propelled objects. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr02213d

  12. From the Users' Perspective-The UCSD Libraries User Survey Project.

    ERIC Educational Resources Information Center

    Talbot, Dawn E.; Lowell, Gerald R.; Martin, Kerry

    1998-01-01

    Discussion of a user-driven survey conducted at the University of California, San Diego libraries focuses on the methodology that resulted in a high response rate. Highlights goals for the survey, including acceptance of data by groups outside the library and for benchmarking data; planning; user population; and questionnaire development. (LRW)

  13. Materials discovery guided by data-driven insights

    NASA Astrophysics Data System (ADS)

    Klintenberg, Mattias

    As computational power continues to grow, systematic computational exploration has become an important tool for materials discovery. In this presentation the Electronic Structure Project (ESP/ELSA) will be discussed and a number of examples presented that show some of the capabilities of a data-driven methodology for guiding materials discovery. These examples include topological insulators, detector materials, and 2D materials. ESP/ELSA is an initiative that dates back to 2001 and today contains many tens of thousands of materials that have been investigated using a robust, high-accuracy electronic structure method (all-electron FP-LMTO), thus providing basic first-principles materials data for most inorganic compounds that have been structurally characterized. The website containing the ESP/ELSA data has as of today been accessed from more than 4,000 unique computers from all around the world.

  14. Challenges in Developing XML-Based Learning Repositories

    NASA Astrophysics Data System (ADS)

    Auksztol, Jerzy; Przechlewski, Tomasz

    There is no doubt that modular design has many advantages, including the most important ones: reusability and cost-effectiveness. In e-learning community parlance the modules are termed Learning Objects (LOs) [11]. An increasing number of learning objects has been created and published online, several standards have been established, and multiple repositories have been developed for them. For example Cisco Systems, Inc., "recognizes a need to move from creating and delivering large inflexible training courses, to database-driven objects that can be reused, searched, and modified independent of their delivery media" [6]. The learning object paradigm of education resources authoring is promoted mainly to reduce the cost of content development and to increase its quality. A frequently used metaphor of the Learning Objects paradigm compares them to Lego blocks or objects in object-oriented program design [25]. However, a metaphor is only an abstract idea, which should be turned into something more concrete to be usable. The problem is that many papers on LOs end up solely in metaphors. In our opinion the Lego and OO metaphors are gross oversimplifications of the problem, as it is much easier to assemble a Lego set or design objects in an OO program than to develop truly interoperable, context-free learning content.

  15. Key design elements of a data utility for national biosurveillance: event-driven architecture, caching, and Web service model.

    PubMed

    Tsui, Fu-Chiang; Espino, Jeremy U; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M

    2005-01-01

    The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include an event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks, and a Web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance: systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions.
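
    The caching element described above (answering repeated aggregate queries from memory instead of recomputing them) can be sketched as a small time-to-live cache. This is an illustrative stand-in, not the NRDM implementation; the class name, key format, and eviction policy are assumptions.

    ```python
    import time

    class QueryCache:
        """Minimal time-based cache for aggregate query results, in the
        spirit of the multi-level caching the architecture describes.
        Entries older than `ttl_seconds` are treated as misses."""

        def __init__(self, ttl_seconds=60):
            self.ttl = ttl_seconds
            self._store = {}  # key -> (value, insertion timestamp)

        def get(self, key):
            entry = self._store.get(key)
            if entry is None:
                return None
            value, stamp = entry
            if time.time() - stamp > self.ttl:
                del self._store[key]  # expired: drop and report a miss
                return None
            return value

        def put(self, key, value):
            self._store[key] = (value, time.time())
    ```

    In a layered design, a miss here would fall through to a slower store (e.g. the database backed by a storage area network), and event-driven ingest could invalidate or refresh hot keys as new sales records arrive.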

  16. REDLetr: Workflow and tools to support the migration of legacy clinical data capture systems to REDCap

    PubMed Central

    Dunn, William D; Cobb, Jake; Levey, Allan I; Gutman, David A

    2017-01-01

    Objective A memory clinic at an academic medical center has relied on several ad hoc data capture systems, including Microsoft Access and Excel, for cognitive assessments over the last several years. However, these solutions are challenging to maintain and limit the potential of hypothesis-driven or longitudinal research. REDCap, a secure web application based on PHP and MySQL, is a practical solution for improving data capture and organization. Here, we present a workflow and toolset to facilitate legacy data migration and real-time clinical research data collection into REDCap, as well as challenges encountered. Materials and Methods Legacy data consisted of neuropsychological tests stored in over 4,000 Excel workbooks. Functions for data extraction, norm scoring, converting to REDCap-compatible formats, accessing the REDCap API, and clinical report generation were developed and executed in Python. Results Over 400 unique data points for each workbook were migrated and integrated into our REDCap database. Moving forward, our REDCap-based system replaces the Excel-based data collection method and eases integration with the Electronic Health Record. Conclusion In the age of growing data, efficient organization and storage of clinical and research data are critical for advancing research and providing efficient patient care. We believe that the tools and workflow described in this work to promote legacy data integration as well as real-time data collection into REDCap ultimately facilitate these goals. PMID:27396629
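
    The API step of such a migration can be sketched as assembling the form payload for a REDCap record-import call. The parameter names (`token`, `content`, `format`, `type`, `data`) follow the standard REDCap API; the helper function, token, and field names below are hypothetical placeholders, and the payload would be POSTed to the project's `/api/` endpoint (e.g. with `requests.post`).

    ```python
    import json

    def build_redcap_payload(records, api_token):
        """Assemble the form fields for a REDCap 'import records' API call.

        records: list of dicts, one per REDCap record, with field names
                 matching the project's data dictionary (hypothetical here).
        api_token: project-specific API token (placeholder value in tests).
        """
        return {
            "token": api_token,
            "content": "record",      # import into the record store
            "format": "json",         # records are serialized as JSON
            "type": "flat",           # one row per record
            "data": json.dumps(records),
        }
    ```

    In a full pipeline, each Excel workbook would first be parsed and norm-scored, then converted into such record dicts before upload; batching uploads keeps individual API calls small.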

  17. Dr Google and the consumer: a qualitative study exploring the navigational needs and online health information-seeking behaviors of consumers with chronic health conditions.

    PubMed

    Lee, Kenneth; Hoti, Kreshnik; Hughes, Jeffery David; Emmerton, Lynne

    2014-12-02

    The abundance of health information available online provides consumers with greater access to information pertinent to the management of health conditions. This is particularly important given an increasing drive for consumer-focused health care models globally, especially in the management of chronic health conditions, and in recognition of challenges faced by lay consumers with finding, understanding, and acting on health information sourced online. There is a paucity of literature exploring the navigational needs of consumers with regards to accessing online health information. Further, existing interventions appear to be didactic in nature, and it is unclear whether such interventions appeal to consumers' needs. Our goal was to explore the navigational needs of consumers with chronic health conditions in finding online health information within the broader context of consumers' online health information-seeking behaviors. Potential barriers to online navigation were also identified. Semistructured interviews were conducted with adult consumers who reported using the Internet for health information and had at least one chronic health condition. Participants were recruited from nine metropolitan community pharmacies within Western Australia, as well as through various media channels. Interviews were audio-recorded, transcribed verbatim, and then imported into QSR NVivo 10. Two established approaches to thematic analysis were adopted. First, a data-driven approach was used to minimize potential bias in analysis and improve construct and criterion validity. A theory-driven approach was subsequently used to confirm themes identified by the former approach and to ensure identified themes were relevant to the objectives. Two levels of analysis were conducted for both data-driven and theory-driven approaches: manifest-level analysis, whereby face-value themes were identified, and latent-level analysis, whereby underlying concepts were identified. 
We conducted 17 interviews, with data saturation achieved by the 14th interview. While we identified a broad range of online health information-seeking behaviors, most related to information discussed during consumer-health professional consultations such as looking for information about medication side effects. The barriers we identified included intrinsic barriers, such as limited eHealth literacy, and extrinsic barriers, such as the inconsistency of information between different online sources. The navigational needs of our participants were extrinsic in nature and included health professionals directing consumers to appropriate online resources and better filtering of online health information. Our participants' online health information-seeking behaviors, reported barriers, and navigational needs were underpinned by the themes of trust, patient activation, and relevance. This study suggests that existing interventions aimed to assist consumers with navigating online health information may not be what consumers want or perceive they need. eHealth literacy and patient activation appear to be prevalent concepts in the context of consumers' online health information-seeking behaviors. Furthermore, the role for health professionals in guiding consumers to quality online health information is highlighted.

  18. Microstrip Yagi Antenna with Dual Aperture-Coupled Feed

    NASA Technical Reports Server (NTRS)

    Pogorzelski, Ronald; Venkatesan, Jaikrishna

    2008-01-01

    A proposed microstrip Yagi antenna would operate at a frequency of 8.4 GHz (which is in the X band) and would feature a mechanically simpler, more elegant design, relative to a prior L-band microstrip Yagi antenna. In general, the purpose of designing a microstrip Yagi antenna is to combine features of a Yagi antenna with those of a microstrip patch to obtain an antenna that can be manufactured at low cost, has a low profile, and radiates a directive beam that, as plotted on an elevation plane perpendicular to the antenna plane, appears tilted away from the broadside. Such antennas are suitable for flush mounting on surfaces of diverse objects, including spacecraft, aircraft, land vehicles, and computers. Stated somewhat more precisely, what has been proposed is a microstrip antenna comprising an array of three Yagi elements. Each element would include four microstrip-patch Yagi subelements: one reflector patch, one driven patch, and two director patches. To obtain circular polarization, each driven patch would be fed by use of a dual offset aperture-coupled feed featuring bow-tie-shaped apertures. The selection of the dual offset bow-tie aperture geometry is supported by results found in published literature that show that this geometry would enable matching of the impedances of the driven patches to the 50-Omega impedance of the microstrip feedline while maintaining a desirably large front-to-back lobe ratio.

  19. Joint Optimization of Fluence Field Modulation and Regularization in Task-Driven Computed Tomography.

    PubMed

    Gang, G J; Siewerdsen, J H; Stayman, J W

    2017-02-11

    This work presents a task-driven joint optimization of fluence field modulation (FFM) and regularization in quadratic penalized-likelihood (PL) reconstruction. Conventional FFM strategies proposed for filtered-backprojection (FBP) are evaluated in the context of PL reconstruction for comparison. We present a task-driven framework that leverages prior knowledge of the patient anatomy and imaging task to identify FFM and regularization. We adopted a maxi-min objective that ensures a minimum level of detectability index (d') across sample locations in the image volume. The FFM designs were parameterized by 2D Gaussian basis functions to reduce the dimensionality of the optimization, and basis function coefficients were estimated using the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. The FFM was jointly optimized with both space-invariant and spatially varying regularization strength (β): the former via an exhaustive search through discrete values and the latter using an alternating optimization in which β was exhaustively optimized locally and interpolated to form a spatially varying map. The optimal FFM inverts as β increases, demonstrating the importance of a joint optimization. For the task and object investigated, the optimal FFM assigns more fluence through less attenuating views, counter to conventional FFM schemes proposed for FBP. The maxi-min objective homogenizes detectability throughout the image and achieves a higher minimum detectability than conventional FFM strategies. The task-driven FFM designs found in this work are counter to conventional patterns for FBP and yield better performance in terms of the maxi-min objective, suggesting opportunities for improved image quality and/or dose reduction when model-based reconstructions are applied in conjunction with FFM.
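
    The maxi-min objective itself is simple: evaluate d' at every sample location and score a design by its worst case, so the optimizer raises the floor rather than the average. A toy sketch under stated assumptions: a naive random search stands in for CMA-ES, and the one-dimensional test function is invented purely for illustration.

    ```python
    import numpy as np

    def maximin_objective(d_prime_values):
        """Maxi-min score: the worst-case detectability across locations."""
        return np.min(d_prime_values)

    def random_search(evaluate, dim, iters=200, seed=0):
        """Toy stand-in for CMA-ES: maximize `evaluate` by uniform random
        sampling of design parameters in [0, 1]^dim."""
        rng = np.random.default_rng(seed)
        best_x, best_f = None, -np.inf
        for _ in range(iters):
            x = rng.uniform(0.0, 1.0, dim)
            f = evaluate(x)
            if f > best_f:
                best_x, best_f = x, f
        return best_x, best_f
    ```

    In the paper's setting, `evaluate` would map Gaussian-basis FFM coefficients (and β) to d' at each sample location through the imaging model; CMA-ES replaces the random sampling with an adapted search distribution.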

  20. Lower extremity EMG-driven modeling of walking with automated adjustment of musculoskeletal geometry

    PubMed Central

    Meyer, Andrew J.; Patten, Carolynn

    2017-01-01

    Neuromusculoskeletal disorders affecting walking ability are often difficult to manage, in part due to limited understanding of how a patient’s lower extremity muscle excitations contribute to the patient’s lower extremity joint moments. To assist in the study of these disorders, researchers have developed electromyography (EMG) driven neuromusculoskeletal models utilizing scaled generic musculoskeletal geometry. While these models can predict individual muscle contributions to lower extremity joint moments during walking, the accuracy of the predictions can be hindered by errors in the scaled geometry. This study presents a novel EMG-driven modeling method that automatically adjusts surrogate representations of the patient’s musculoskeletal geometry to improve prediction of lower extremity joint moments during walking. In addition to commonly adjusted neuromusculoskeletal model parameters, the proposed method adjusts model parameters defining muscle-tendon lengths, velocities, and moment arms. We evaluated our EMG-driven modeling method using data collected from a high-functioning hemiparetic subject walking on an instrumented treadmill at speeds ranging from 0.4 to 0.8 m/s. EMG-driven model parameter values were calibrated to match inverse dynamic moments for five degrees of freedom in each leg while keeping musculoskeletal geometry close to that of an initial scaled musculoskeletal model. We found that our EMG-driven modeling method incorporating automated adjustment of musculoskeletal geometry predicted net joint moments during walking more accurately than did the same method without geometric adjustments. Geometric adjustments improved moment prediction errors by 25% on average and up to 52%, with the largest improvements occurring at the hip. Predicted adjustments to musculoskeletal geometry were comparable to errors reported in the literature between scaled generic geometric models and measurements made from imaging data. 
Our results demonstrate that with appropriate experimental data, joint moment predictions for walking generated by an EMG-driven model can be improved significantly when automated adjustment of musculoskeletal geometry is included in the model calibration process. PMID:28700708
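
    The calibration idea (adjust model parameters so predicted joint moments match inverse-dynamics moments) can be illustrated in its simplest linear form: fitting per-muscle gain factors by least squares. This is a didactic sketch, not the authors' EMG-driven model, which also adjusts nonlinear activation and musculoskeletal-geometry parameters.

    ```python
    import numpy as np

    def calibrate_gains(muscle_moments, target_moments):
        """Least-squares calibration of per-muscle gain factors so that the
        summed, gain-weighted muscle contributions reproduce the
        inverse-dynamics joint moments.

        muscle_moments: (n_samples, n_muscles) moment contributions at unit gain.
        target_moments: (n_samples,) inverse-dynamics joint moments.
        """
        gains, *_ = np.linalg.lstsq(muscle_moments, target_moments, rcond=None)
        return gains
    ```

    Real EMG-driven calibrations replace this linear fit with nonlinear optimization over activation dynamics and geometry parameters, but the matching criterion (minimize the gap to inverse-dynamics moments) is the same.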

  1. The South/Southeast Asia Research Initiative (SARI) Update and Meeting Objectives

    NASA Technical Reports Server (NTRS)

    Vadrevu, Krishna Prasad

    2017-01-01

    Land Use/Cover Change (LU/CC) is one of the most important types of environmental change in South and Southeast Asian countries. Several studies suggest that LU/CC in these countries is in large part driven by population growth and economic development. In the region, changes that are most common include urban expansion, agricultural land loss, land abandonment, deforestation, logging, reforestation, etc. To address the research needs and priorities in the region, a regional initiative entitled South Southeast Asia Regional Initiative (SARI) has been developed involving US and regional scientists. The initiative is funded by NASA Land Cover, Land Use Change program. The goal of SARI is to integrate state-of-the-art remote sensing, natural sciences, engineering and social sciences to enrich LU/CC science in South Southeast Asian countries. In the presentation, LU/CC change research in SARI countries will be highlighted including the drivers of change. For example, in South Asia, forest cover has been increasing in countries like India, Nepal and Bhutan due to sustainable afforestation measures; whereas, large-scale deforestation in Southeast Asian countries is still continuing, due to oil palm plantation expansion driven by the international market demand in Malaysia and Indonesia. With respect to urbanization, South and Southeast Asian countries contain 23 megacities, each with more than 10 million people. Rapid urbanization is driving agricultural land loss and agricultural intensification has been increasing due to less availability of land for growing food crops such as in India, Vietnam, and Thailand. The drivers of LUCC vary widely in the region and include such factors as land tenure, local economic development, government policies, inappropriate land management, land speculation, improved road networks, etc. 
In addition, variability in the weather, climate, and socioeconomic factors also drive LU/CC resulting in disruptions of biogeochemical cycles, radiation and the surface energy balance of the atmosphere. The presentation will also highlight SARI collaborative activities with space agencies, universities and non-government organizations including data sharing mechanisms in the region.

  2. Revising the `Henry Problem' of density-driven groundwater flow: A review of historic Biscayne aquifer data.

    NASA Astrophysics Data System (ADS)

    Weyer, K. U.

    2016-12-01

    Coastal groundwater flow investigations at the Cutler site on Biscayne Bay, south of Miami, Florida, gave rise to the dominant concept of density-driven flow of sea water into coastal aquifers, indicated as a saltwater wedge. Within that wedge, convection-type return flow of seawater and a dispersion zone were concluded by Cooper et al. (1964, USGS Water Supply Paper 1613-C) to be the cause of the Biscayne aquifer `sea water wedge'. This conclusion was based merely on the chloride distribution within the aquifer and on an analytical model concept assuming convection flow within a confined aquifer, without taking non-chemical field data into consideration. This concept was later labelled the `Henry Problem', which any numerical variable-density flow program has to be able to simulate to be considered acceptable. Revisiting the above summarizing publication, with its record of piezometric field data (heads), showed that the so-called sea water wedge was actually caused by discharging deep saline groundwater driven by gravitational flow, not by denser sea water. Density-driven flow of seawater into the aquifer was not reflected in the head measurements for low- and high-tide conditions, which had been taken contemporaneously with the chloride measurements. These head measurements had not been included in the flow interpretation. The very same head measurements indicated a clear dividing line between shallow local fresh groundwater flow and saline deep groundwater flow, without the existence of a dispersion zone or a convection cell. The Biscayne situation emphasizes the need for any chemical interpretation of flow patterns to be backed up by head data as energy indicators of flow fields. At the Biscayne site density-driven flow of seawater did and does not exist. Instead, this site and the Florida coastline in general are the end points of local fresh and regional saline groundwater flow systems driven by gravity forces, not by density differences.

  3. Geology of -30247, -35247, and -40247 Quadrangles, Southern Hesperia Planum, Mars

    NASA Technical Reports Server (NTRS)

    Mest, S. C.; Crown, D. A.

    2010-01-01

    Geologic mapping of MTM -30247, -35247, and -40247 quadrangles is being used to characterize Reull Vallis (RV) and examine the roles and timing of volatile-driven erosional and depositional processes. This study complements earlier investigations of the eastern Hellas region, including regional analyses [1-6], mapping studies of circum-Hellas canyons [7-10], and volcanic studies of Hadriaca and Tyrrhena Paterae [11-13]. Key scientific objectives include 1) characterizing RV in its "fluvial zone," and evaluating its history of formation, 2) analyzing channels in the surrounding plains and potential connections to RV, and 3) examining young, possibly sedimentary plains along RV.

  4. Leveraging information technology to drive improvement in patient satisfaction.

    PubMed

    Nash, Mary; Pestrue, Justin; Geier, Peter; Sharp, Karen; Helder, Amy; McAlearney, Ann Scheck

    2010-01-01

    A healthcare organization's commitment to quality and the patient experience requires senior leader involvement in improvement strategies, and accountability for goals. Further, improvement strategies are most effective when driven by data, and in the world of patient satisfaction, evidence is growing that nurse leader rounding and discharge calls are strategic tactics that can improve patient satisfaction. This article describes how The Ohio State University Medical Center (OSUMC) leveraged health information technology (IT) to apply a data-driven strategy execution to improve the patient experience. Specifically, two IT-driven approaches were used: (1) business intelligence reporting tools were used to create a meaningful reporting system including dashboards, scorecards, and tracking reports and (2) an improvement plan was implemented that focused on two high-impact tactics and data to hardwire accountability. Targeted information from the IT systems enabled clinicians and administrators to execute these strategic tactics, and senior leaders to monitor achievement of strategic goals. As a result, OSUMC's inpatient satisfaction scores on the Hospital Consumer Assessment of Healthcare Providers and Systems survey improved from 56% nines and tens in 2006 to 71% in 2009. © 2010 National Association for Healthcare Quality.

  5. Materials Data Science: Current Status and Future Outlook

    NASA Astrophysics Data System (ADS)

    Kalidindi, Surya R.; De Graef, Marc

    2015-07-01

    The field of materials science and engineering is on the cusp of a digital data revolution. After reviewing the nature of data science and Big Data, we discuss the features of materials data that distinguish them from data in other fields. We introduce the concept of process-structure-property (PSP) linkages and illustrate how the determination of PSPs is one of the main objectives of materials data science. Then we review a selection of materials databases, as well as important aspects of materials data management, such as storage hardware, archiving strategies, and data access strategies. We introduce the emerging field of materials data analytics, which focuses on data-driven approaches to extract and curate materials knowledge from available data sets. The critical need for materials e-collaboration platforms is highlighted, and we conclude the article with a number of suggestions regarding the near-term future of the materials data science field.

  6. Pre-Exposure Prophylaxis (PrEP) as an Additional Tool for HIV Prevention Among Men Who Have Sex With Men in Belgium: The Be-PrEP-ared Study Protocol.

    PubMed

    De Baetselier, Irith; Reyniers, Thijs; Nöstlinger, Christiana; Wouters, Kristien; Fransen, Katrien; Crucitti, Tania; Kenyon, Chris; Buyze, Jozefien; Schurmans, Céline; Laga, Marie; Vuylsteke, Bea

    2017-01-30

    Pre-exposure prophylaxis (PrEP) is a promising and effective tool to prevent HIV. With the approval of Truvada as daily PrEP by the European Commission in August 2016, individual European Member states prepare themselves for PrEP implementation following the examples of France and Norway. However, context-specific data to guide optimal implementation is currently lacking. With this demonstration project we evaluate whether daily and event-driven PrEP, provided within a comprehensive prevention package, is a feasible and acceptable additional prevention tool for men who have sex with men (MSM) at high risk of acquiring HIV in Belgium. The study's primary objective is to document the uptake, acceptability, and adherence to both daily and event-driven PrEP, while several secondary objectives have been formulated including impact of PrEP use on sexual behavior. The Be-PrEP-ared study is a phase 3, single-site, open-label prospective cohort study with a large social science component embedded in the trial. A total of 200 participants choose between daily or event-driven PrEP use and may switch, discontinue, or restart their regimen at the 3-monthly visits for a duration of 18 months. Data are collected on several platforms: an electronic case report form, a Web-based tool where participants register their sexual behavior and pill use, a more detailed electronic self-administered questionnaire completed during study visits on a tablet computer, and in-depth interviews among a selected sample of participants. To answer the primary objective, the recruitment rate, (un)safe sex behavior during the last 6 months, percentage of reported intention to use PrEP in the future, retention rates in different regimens, and attitudes towards PrEP use will be analyzed. Adherence will be monitored using self-reported adherence, pill count, tenofovir drug levels in blood samples, and the perceived skills to adhere. 
All participants are currently enrolled, and the last study visit is planned to take place around Q3 2018. As PrEP is not yet available for use in Belgium, this study will provide insights into how to optimally implement PrEP within the current health care provision and will shape national and European guidelines with regard to the place of PrEP in HIV prevention strategies. EU Clinical Trial 2015-000054-37; https://www.clinicaltrialsregister.eu/ctr-search/trial/2015-000054-37/BE (Archived by WebCite at http://www.webcitation.org/6nacjSdmM). ©Irith De Baetselier, Thijs Reyniers, Christiana Nöstlinger, Kristien Wouters, Katrien Fransen, Tania Crucitti, Chris Kenyon, Jozefien Buyze, Céline Schurmans, Marie Laga, Bea Vuylsteke, Be-PrEP-Ared Study Group. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 30.01.2017.

  7. Pivotal role of computers and software in mass spectrometry - SEQUEST and 20 years of tandem MS database searching.

    PubMed

    Yates, John R

    2015-11-01

    Advances in computer technology and software have driven developments in mass spectrometry over the last 50 years. Computers and software have been impactful in three areas: the automation of difficult calculations to aid interpretation, the collection of data and control of instruments, and data interpretation. As the power of computers has grown, so too has the utility and impact on mass spectrometers and their capabilities. This has been particularly evident in the use of tandem mass spectrometry data to search protein and nucleotide sequence databases to identify peptide and protein sequences. This capability has driven the development of many new approaches to study biological systems, including the use of "bottom-up shotgun proteomics" to directly analyze protein mixtures.

  8. Global Comparators Project: International Comparison of Hospital Outcomes Using Administrative Data

    PubMed Central

    Bottle, Alex; Middleton, Steven; Kalkman, Cor J; Livingston, Edward H; Aylin, Paul

    2013-01-01

    Objective. To produce comparable risk-adjusted outcome rates for an international sample of hospitals in a collaborative project to share outcomes and learning. Data Sources. Administrative data varying in scope, format, and coding systems were pooled from each participating hospital for the years 2005–2010. Study Design. Following reconciliation of the different coding systems in the various countries, in-hospital mortality, unplanned readmission within 30 days, and “prolonged” hospital stay (>75th percentile) were risk-adjusted via logistic regression. A web-based interface was created to facilitate outcomes analysis for individual medical centers and enable peer comparisons. Small groups of clinicians are now exploring the potential reasons for variations in outcomes in their specialty. Principal Findings. There were 6,737,211 inpatient records, including 214,622 in-hospital deaths. Although diagnostic coding depth varied appreciably by country, comorbidity weights were broadly comparable. U.S. hospitals generally had the lowest mortality rates, shortest stays, and highest readmission rates. Conclusions. Intercountry differences in outcomes may result from differences in the quality of care or in practice patterns driven by socio-economic factors. Carefully managed administrative data can be an effective resource for initiating dialog between hospitals within and across countries. Inclusion of important outcomes beyond hospital discharge would increase the value of these analyses. PMID:23742025
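The risk adjustment the study describes (logistic regression producing an expected outcome count against which each hospital's observed count is compared) reduces to an observed-over-expected ratio. A minimal sketch, with an entirely hypothetical toy model and case-mix; the coefficients and patients are invented, not taken from the study:

```python
import math

def mortality_risk(age, comorbidity_score, intercept=-4.0, b_age=0.03, b_cmb=0.5):
    """Toy logistic model for in-hospital death probability.
    Coefficients are illustrative only, not fitted to any real data."""
    z = intercept + b_age * age + b_cmb * comorbidity_score
    return 1.0 / (1.0 + math.exp(-z))

def standardized_mortality_ratio(patients):
    """O/E ratio: observed deaths over the sum of model-predicted risks."""
    observed = sum(p["died"] for p in patients)
    expected = sum(mortality_risk(p["age"], p["cmb"]) for p in patients)
    return observed / expected

# hypothetical case-mix for one hospital
cohort = [
    {"age": 80, "cmb": 2, "died": 1},
    {"age": 65, "cmb": 0, "died": 0},
    {"age": 72, "cmb": 1, "died": 0},
]
smr = standardized_mortality_ratio(cohort)
```

A ratio above 1 flags more deaths than the case-mix predicts; comparing such ratios across hospitals is what drives the peer-comparison dialog described above.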

  9. Charge control microcomputer device for vehicle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morishita, M.; Kouge, S.

    1986-08-19

    A charge control microcomputer device for a vehicle is described which consists of: a clutch device for transmitting the rotary output of an engine; a charging generator driven by the clutch device; a battery charged by an output of the charging generator; a voltage regulator for controlling an output voltage of the charging generator to a predetermined value; an engine controlling microcomputer for receiving engine data, to control the engine; and a charge control microcomputer for processing the engine data from the engine controlling microcomputer and charge system data including terminal voltage data from the battery and generated voltage data from the charging generator, to determine a reference voltage for the voltage regulator in accordance with the engine data and the charge system data, and for processing an engine rotation signal to generate and apply an operating instruction to the clutch device in accordance with the engine data and the charge system data, such that the charging generator is driven within a predetermined range of revolutions per minute at all times.

  10. Objective Bayesian analysis of neutrino masses and hierarchy

    NASA Astrophysics Data System (ADS)

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still noticeably impact the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured but with inconclusive posterior odds of 5.1:1. Better data are hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct, and if the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.
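The quoted posterior odds of 5.1:1 map directly to a posterior probability (assuming, as is conventional, the odds already fold in equal prior model probabilities):

```python
def odds_to_probability(odds):
    """Convert posterior odds in favour of a hypothesis to its posterior probability."""
    return odds / (1.0 + odds)

# posterior odds 5.1:1 for the normal hierarchy, as quoted in the abstract
p_normal = odds_to_probability(5.1)
```

This gives roughly 0.84 for the normal hierarchy, well short of conventional thresholds for strong evidence, which is why the abstract calls the odds inconclusive.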

  11. Bringing Geoscience Research into Undergraduate Education in the Classroom and Online

    NASA Astrophysics Data System (ADS)

    Reed, D. L.

    2008-12-01

    The growth of the cyberinfrastructure provides new opportunities for students and instructors to place data-driven, classroom and laboratory exercises in the context of an integrated research project. Undergraduate majors in a classroom section of the applied geophysics course at SJSU use Google Earth to first visualize the geomorphic expression of the Silver Creek fault in the foothills of the eastern Santa Clara Valley in order to identify key research questions regarding the northward projection of the fault beneath the valley floor, near downtown San Jose. The 3-D visualization, both regionally and locally, plays a key element in establishing the overall framework of the research. Students then plan a seismic hazards study in an urban environment, which is the primary focus of the class, using satellite imagery to locate specific stations along a geophysical transect crossing the inferred location of the fault. Geophysical modeling along the transect combines field-based data acquisition by members of the class with regional geophysical data, downloaded from an online USGS database. Students carry out all aspects of the research from project planning, to data acquisition and analysis, report writing, and an oral presentation of the results. In contrast, online courses present special challenges as students may become frustrated navigating complex user interfaces, sometimes employed in research-driven online databases, and not achieve the desired learning outcomes. Consequently, an alternate approach, implemented in an online oceanography course, is for the instructor to first extract research data from online databases, build visualizations, and then place the learning objects in the context of a virtual oceanographic research expedition. 
Several examples of this approach, to engage students in the experience of oceanographic research, will be presented, including seafloor mapping studies around the Golden Gate and across the major ocean basins, using data obtained in part through the use of the Marine Geoscience Data System and GeoMapApp. Students also locate and undertake submersible dives inside hydrothermal vents using visualizations provided by the OceanExplorer program and New Millennium Observatory of NOAA/PMEL. Other learning activities include participation, at least virtually, in an iron fertilization experiment in the Southern Ocean (SOFeX) and the development of a model of surface circulation using data from the Global Drifter Program and the National Data Buoy Center. One factor contributing to student learning is to establish a research context for the class early on, so that students become engaged in a sense of exploration, testing and discovery.

  12. Haptic Classification of Common Objects: Knowledge-Driven Exploration.

    ERIC Educational Resources Information Center

    Lederman, Susan J.; Klatzky, Roberta L.

    1990-01-01

    Theoretical and empirical issues relating to haptic exploration and the representation of common objects during haptic classification were investigated in 3 experiments involving a total of 112 college students. Results are discussed in terms of a computational model of human haptic object classification with implications for dextrous robot…

  13. Radiation hydrodynamics of triggered star formation: the effect of the diffuse radiation field

    NASA Astrophysics Data System (ADS)

    Haworth, Thomas J.; Harries, Tim J.

    2012-02-01

    We investigate the effect of including diffuse field radiation when modelling the radiatively driven implosion of a Bonnor-Ebert sphere (BES). Radiation-hydrodynamical calculations are performed by using operator splitting to combine Monte Carlo photoionization with grid-based Eulerian hydrodynamics that includes self-gravity. It is found that the diffuse field has a significant effect on the nature of radiatively driven collapse which is strongly coupled to the strength of the driving shock that is established before impacting the BES. This can result in either slower or more rapid star formation than expected using the on-the-spot approximation depending on the distance of the BES from the source object. As well as directly compressing the BES, stronger shocks increase the thickness and density in the shell of accumulated material, which leads to short, strong, photoevaporative ejections that reinforce the compression whenever it slows. This happens particularly effectively when the diffuse field is included as rocket motion is induced over a larger area of the shell surface. The formation and evolution of 'elephant trunks' via instability is also found to vary significantly when the diffuse field is included. Since the perturbations that seed instabilities are smeared out, elephant trunks form less readily and, once formed, are exposed to enhanced thermal compression.
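The operator splitting mentioned above (alternating a Monte Carlo photoionization step with an Eulerian hydrodynamics step each time step) can be illustrated on a toy linear problem, du/dt = -(a+b)u, where each substep advances one operator over the full time step. This is a generic sketch of the splitting idea, not the actual radiation-hydrodynamics code:

```python
import math

def split_step(u, a, b, dt):
    """One operator-split update: apply operator A (standing in for the
    radiation/ionization step), then operator B (the hydrodynamics step),
    each over the full time step dt."""
    u = u * math.exp(-a * dt)   # substep 1: operator A alone
    u = u * math.exp(-b * dt)   # substep 2: operator B alone
    return u

def evolve(u0, a, b, dt, n_steps):
    u = u0
    for _ in range(n_steps):
        u = split_step(u, a, b, dt)
    return u

# for commuting linear operators like these, splitting is exact:
u_split = evolve(1.0, a=0.3, b=0.7, dt=0.1, n_steps=10)
u_exact = math.exp(-(0.3 + 0.7) * 1.0)
```

For the nonlinear, non-commuting operators of a real radiation-hydrodynamics code the split solution is only accurate to first or second order in dt, which is why the time step must be kept small.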

  14. Progress on VESPA, a community-driven Virtual Observatory in Planetary Science

    NASA Astrophysics Data System (ADS)

    Erard, S.; Cecconi, B.; Le Sidaner, P.; Rossi, A. P.; Capria, M. T.; Schmitt, B.; Genot, V. N.; André, N.; Vandaele, A. C.; Scherf, M.; Hueso, R.; Maattanen, A. E.; Carry, B.; Achilleos, N.; Marmo, C.; Santolik, O.; Benson, K.; Fernique, P.

    2017-12-01

    The Europlanet H2020 program started on 1/9/2015 for 4 years. It includes an activity to adapt Virtual Observatory (VO) techniques to Planetary Science data called VESPA. The objective is to facilitate searches in big archives as well as sparse databases, to provide simple data access and on-line visualization, and to allow small data providers to make their data available in an interoperable environment with minimum effort. The VESPA system, based on a prototype developed in a previous program [1], has been hugely improved during the first two years of Europlanet H2020: the infrastructure has been upgraded to describe data in many fields more accurately; the main user search interface (http://vespa.obspm.fr) has been redesigned to provide more flexibility; alternative ways to access Planetary Science data services from VO tools have been implemented; VO tools are being improved to handle specificities of Solar System data, e.g. measurements in reflected light, coordinate systems, etc. Current steps include the development of a connection between the VO world and GIS tools, and integration of Heliophysics, planetary plasmas, and mineral spectroscopy data to support the analysis of observations. Existing data services have been updated, and new ones have been designed. The global objective has already been exceeded, with 34 services open and 20 more being finalized. A procedure to install data services has been documented, and hands-on sessions are organized twice a year at EGU and EPSC; this is intended to favour the installation of services by individual research teams, e.g. to distribute derived data related to a published study. In complement, regular discussions are held with big data providers, starting with space agencies (IPDA). Common projects with ESA and NASA's PDS have been initiated, with the goal of connecting PDS4 and EPN-TAP. 
In parallel, a Solar System Interest Group has just been started in IVOA; the goal is here to adapt existing astronomy standards to Planetary Science. The Europlanet 2020 Research Infrastructure project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208. [1] Erard et al 2014, Astronomy & Computing 7-8, 71-80. http://arxiv.org/abs/1407.4886

  15. Task driven optimal leg trajectories in insect-scale legged microrobots

    NASA Astrophysics Data System (ADS)

    Doshi, Neel; Goldberg, Benjamin; Jayaram, Kaushik; Wood, Robert

    Origami-inspired layered manufacturing techniques and 3D-printing have enabled the development of highly articulated legged robots at the insect-scale, including the 1.43g Harvard Ambulatory MicroRobot (HAMR). Research on these platforms has expanded its focus from manufacturing aspects to include design optimization and control for application-driven tasks. Consequently, the choice of gait selection, body morphology, leg trajectory, foot design, etc. have become areas of active research. HAMR has two controlled degrees-of-freedom per leg, making it an ideal candidate for exploring leg trajectory. We will discuss our work towards optimizing HAMR's leg trajectories for two different tasks: climbing using electroadhesives and level ground running (5-10 BL/s). These tasks demonstrate the ability of a single platform to adapt to vastly different locomotive scenarios: quasi-static climbing with controlled ground contact, and dynamic running with uncontrolled ground contact. We will utilize trajectory optimization methods informed by existing models and experimental studies to determine leg trajectories for each task. We also plan to discuss how task specifications and choice of objective function have contributed to the shape of these optimal leg trajectories.

  16. CFD Simulations to Improve Ventilation in Low-Income Housing

    NASA Astrophysics Data System (ADS)

    Ho, Rosemond; Gorle, Catherine

    2017-11-01

    Quality of housing plays an important role in public health. In Dhaka, Bangladesh, the leading causes of death include tuberculosis, lower respiratory infections, and chronic obstructive pulmonary disease, so improving home ventilation could potentially mitigate these negative health effects. The goal of this project is to use computational fluid dynamics (CFD) to predict the relative effectiveness of different ventilation strategies for Dhaka homes. A Reynolds-averaged Navier-Stokes CFD model of a standard Dhaka home with apertures of different sizes and locations was developed to predict air exchange rates. Our initial focus is on simulating ventilation driven by buoyancy-alone conditions, which is often considered the limiting case in natural ventilation design. We explore the relationship between ventilation rate and aperture area to determine the most promising configurations for optimal ventilation solutions. Future research will include the modeling of wind-driven conditions, and extensive uncertainty quantification studies to investigate the effect of variability in the layout of homes and neighborhoods, and in local wind and temperature conditions. The ultimate objective is to formulate robust design recommendations that can reduce risks of respiratory illness in low-income housing.
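The buoyancy-only limiting case mentioned above is commonly estimated with the standard stack-ventilation envelope-flow formula, Q = Cd·A·√(2·g·H·ΔT/T). A minimal sketch with a typical discharge coefficient and hypothetical home dimensions; none of these numbers come from the study:

```python
import math

def stack_ventilation_rate(area, height, t_in, t_out, cd=0.6, g=9.81):
    """Buoyancy-driven volume flow rate (m^3/s) through an opening of
    effective area `area` (m^2) with stack height `height` (m).
    Temperatures in kelvin; cd is a typical discharge coefficient."""
    dT = t_in - t_out
    return cd * area * math.sqrt(2.0 * g * height * dT / t_out)

# hypothetical single-room home: 0.5 m^2 effective opening, 2 m stack,
# 30 m^3 volume, indoors 5 K warmer than outdoors
q = stack_ventilation_rate(area=0.5, height=2.0, t_in=303.0, t_out=298.0)
ach = q * 3600.0 / 30.0   # air changes per hour
```

The square-root dependence on aperture parameters is why the relationship between ventilation rate and aperture area and placement, explored with CFD above, has diminishing returns at large openings.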

  17. Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color.

    PubMed

    Bannert, Michael M; Bartels, Andreas

    2018-04-11

    Color is special among basic visual features in that it can form a defining part of objects that are engrained in our memory. Whereas most neuroimaging research on human color vision has focused on responses related to external stimulation, the present study investigated how sensory-driven color vision is linked to subjective color perception induced by object imagery. We recorded fMRI activity in male and female volunteers during viewing of abstract color stimuli that were red, green, or yellow in half of the runs. In the other half we asked them to produce mental images of colored, meaningful objects (such as tomato, grapes, banana) corresponding to the same three color categories. Although physically presented color could be decoded from all retinotopically mapped visual areas, only hV4 allowed predicting colors of imagined objects when classifiers were trained on responses to physical colors. Importantly, only neural signal in hV4 was predictive of behavioral performance in the color judgment task on a trial-by-trial basis. The commonality between neural representations of sensory-driven and imagined object color and the behavioral link to neural representations in hV4 identifies area hV4 as a perceptual hub linking externally triggered color vision with color in self-generated object imagery. SIGNIFICANCE STATEMENT Humans experience color not only when visually exploring the outside world, but also in the absence of visual input, for example when remembering, dreaming, and during imagery. It is not known where neural codes for sensory-driven and internally generated hue converge. In the current study we evoked matching subjective color percepts, one driven by physically presented color stimuli, the other by internally generated color imagery. This allowed us to identify area hV4 as the only site where neural codes of corresponding subjective color perception converged regardless of its origin. 
Color codes in hV4 also predicted behavioral performance in an imagery task, suggesting it forms a perceptual hub for color perception. Copyright © 2018 the authors.
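The cross-decoding logic at the core of the study (train a classifier on responses to physical color, then test it on imagery trials) can be sketched with a simple nearest-centroid classifier on synthetic "voxel" patterns; all data below are invented for illustration:

```python
def centroid(patterns):
    """Mean pattern across a list of equal-length vectors."""
    n = len(patterns[0])
    return [sum(p[i] for p in patterns) / len(patterns) for i in range(n)]

def nearest_centroid_predict(centroids, pattern):
    """Return the label whose class centroid is closest in Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], pattern))

# training set: synthetic voxel responses to physically presented colors
train = {
    "red":    [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]],
    "green":  [[0.1, 1.0, 0.1], [0.0, 0.9, 0.2]],
    "yellow": [[0.5, 0.5, 0.0], [0.6, 0.4, 0.1]],
}
centroids = {label: centroid(pats) for label, pats in train.items()}

# cross-decoding: noisier "imagery" patterns never seen during training
imagery = {"red": [0.8, 0.2, 0.1], "green": [0.2, 0.8, 0.1], "yellow": [0.55, 0.45, 0.05]}
predictions = {label: nearest_centroid_predict(centroids, p) for label, p in imagery.items()}
```

Above-chance accuracy under this train-on-perception/test-on-imagery scheme is what licenses the claim that the two conditions share a neural code in hV4 (the actual study used fMRI pattern classifiers, not this toy).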

  18. Share2Quit: Web-Based Peer-Driven Referrals for Smoking Cessation

    PubMed Central

    2013-01-01

    Background Smoking is the number one preventable cause of death in the United States. Effective Web-assisted tobacco interventions are often underutilized and require new and innovative engagement approaches. Web-based peer-driven chain referrals successfully used outside health care have the potential for increasing the reach of Internet interventions. Objective The objective of our study was to describe the protocol for the development and testing of proactive Web-based chain-referral tools for increasing the access to Decide2Quit.org, a Web-assisted tobacco intervention system. Methods We will build and refine proactive chain-referral tools, including email and Facebook referrals. In addition, we will implement respondent-driven sampling (RDS), a controlled chain-referral sampling technique designed to remove inherent biases in chain referrals and obtain a representative sample. We will begin our chain referrals with an initial recruitment of former and current smokers as seeds (initial participants) who will be trained to refer current smokers from their social network using the developed tools. In turn, these newly referred smokers will also be provided the tools to refer other smokers from their social networks. We will model predictors of referral success using sample weights from the RDS to estimate the success of the system in the targeted population. Results This protocol describes the evaluation of proactive Web-based chain-referral tools, which can be used in tobacco interventions to increase the access to hard-to-reach populations, for promoting smoking cessation. Conclusions Share2Quit represents an innovative advancement by capitalizing on naturally occurring technology trends to recruit smokers to Web-assisted tobacco interventions. PMID:24067329
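Respondent-driven sampling corrects for the oversampling of highly connected people by weighting each respondent by the inverse of their self-reported network degree; the widely used Volz-Heckathorn estimator of a population proportion can be sketched as follows (sample data hypothetical):

```python
def vh_proportion(respondents):
    """Volz-Heckathorn estimate of a population proportion from an RDS sample.
    Each respondent is (has_trait: bool, degree: int); weights are 1/degree,
    reflecting that high-degree people are more likely to be recruited."""
    w_total = sum(1.0 / d for _, d in respondents)
    w_trait = sum(1.0 / d for trait, d in respondents if trait)
    return w_trait / w_total

# hypothetical sample: (currently smokes, self-reported network degree)
sample = [(True, 2), (True, 10), (False, 5), (False, 20), (True, 4)]
p_hat = vh_proportion(sample)
```

Here the naive sample proportion would be 3/5 = 0.6, while the degree-weighted estimate is higher because the smokers in this invented sample tend to have smaller networks and are therefore upweighted.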

  19. Ion-driven wind: Aerodynamics, performance limits, and optimization

    NASA Astrophysics Data System (ADS)

    Rickard, Matthew James Alan

    When a strong electric field is generated between a sharp, charged object and a grounded electrode in a gas medium, ions that are generated via a corona discharge near the tip of the sharp object migrate to the electrical ground, setting the neutral bulk gas in motion. The strength of the flow generated from such a process, known as a "corona", "ionic", or "ion-driven" wind, increases with electric field until electrical breakdown is reached. Previous studies have found an upper bound on the velocity of the ion-driven wind, even when a series of electrode stages are aggregated. With the intent of maximizing the gas flow from such devices, this dissertation describes a series of experiments that have been conducted and a numerical model that has been employed. Although typical hardware configurations include a wire parallel to a plate, a wire placed concentrically within a cylinder, or a needle facing a perpendicular plate or mesh, the chosen setup for this study is a needle facing a concentric ring. Using multiple experimental techniques and numerical simulation, velocity profiles have been observed at the ring exit and are sensitive to the design of the mounting hardware. The numerical model predicts the ideal electrode geometry for maximizing flow through a single unit. A modular, multi-staged system has been constructed and, when loaded with an exit nozzle, the exit velocity can be substantially increased. Further, if a small-scale (sub-millimeter) system is created, it is expected that the velocity will increase with multi-staging, even in the absence of an exit nozzle.
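The performance limits discussed are often analyzed with the standard one-dimensional momentum balance for electrohydrodynamic thrusters, F = I·d/μ: thrust grows with drift current I and electrode gap d and falls with ion mobility μ. A sketch with illustrative values; this textbook relation and the numbers below are not taken from the dissertation:

```python
def ehd_thrust(current, gap, mobility):
    """One-dimensional electrohydrodynamic thrust estimate F = I*d/mu.
    current in A, gap in m, ion mobility in m^2/(V*s); returns newtons."""
    return current * gap / mobility

# illustrative corona-discharge values (assumed, for scale only)
F = ehd_thrust(current=1.0e-3, gap=0.02, mobility=2.0e-4)
```

Since the electrical power is P = I·V, the same balance gives a thrust-to-power ratio of d/(μ·V), so efficiency falls as the drive voltage rises toward breakdown, one reason single-stage ion winds saturate.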

  20. System and method for authentication

    DOEpatents

    Duerksen, Gary L.; Miller, Seth A.

    2015-12-29

    Described are methods and systems for determining authenticity. For example, the method may include providing an object of authentication, capturing characteristic data from the object of authentication, deriving authentication data from the characteristic data of the object of authentication, and comparing the authentication data with an electronic database comprising reference authentication data to provide an authenticity score for the object of authentication. The reference authentication data may correspond to one or more reference objects of authentication other than the object of authentication.

  1. Magnetism in curved geometries

    NASA Astrophysics Data System (ADS)

    Streubel, Robert; Fischer, Peter; Kronast, Florian; Kravchuk, Volodymyr P.; Sheka, Denis D.; Gaididei, Yuri; Schmidt, Oliver G.; Makarov, Denys

    2016-09-01

    Extending planar two-dimensional structures into the three-dimensional space has become a general trend in multiple disciplines, including electronics, photonics, plasmonics and magnetics. This approach provides means to modify conventional or to launch novel functionalities by tailoring the geometry of an object, e.g. its local curvature. In a generic electronic system, curvature results in the appearance of scalar and vector geometric potentials inducing anisotropic and chiral effects. In the specific case of magnetism, even in the simplest case of a curved anisotropic Heisenberg magnet, the curvilinear geometry manifests two exchange-driven interactions, namely effective anisotropy and antisymmetric exchange, i.e. Dzyaloshinskii-Moriya-like interaction. As a consequence, a family of novel curvature-driven effects emerges, which includes magnetochiral effects and topologically induced magnetization patterning, resulting in theoretically predicted unlimited domain wall velocities, chirality symmetry breaking and Cherenkov-like effects for magnons. The broad range of altered physical properties makes these curved architectures appealing in view of fundamental research on e.g. skyrmionic systems, magnonic crystals or exotic spin configurations. In addition to these rich physics, the application potential of three-dimensionally shaped objects is currently being explored as magnetic field sensorics for magnetofluidic applications, spin-wave filters, advanced magneto-encephalography devices for diagnosis of epilepsy or for energy-efficient racetrack memory devices. These recent developments ranging from theoretical predictions over fabrication of three-dimensionally curved magnetic thin films, hollow cylinders or wires, to their characterization using integral means as well as the development of advanced tomography approaches are in the focus of this review.

  2. Revisiting the hypothesis-driven interview in a contemporary context.

    PubMed

    Holmes, Alex; Singh, Bruce; McColl, Geoff

    2011-12-01

    The "hypothesis-driven interview" was articulated by George Engel as a method of raising and testing hypotheses in the process of building a biopsychosocial formulation and determining the most likely diagnosis. This interview was a forerunner of the modern medical interview as well as the contemporary psychiatric assessment. The objective of this article is to describe the hypothesis-driven interview and to explore its relationship with the contemporary medical interview. The literature on the medical and hypothesis-driven interview was reviewed. Key features of each were identified. The hypothesis-driven interview shares much with the contemporary medical interview. In addition, it enhances the application of communication skills and allows the interviewer to develop a formulation during the course of the assessment. The hypothesis-driven interview is well suited to the aims of a contemporary psychiatric assessment.

  3. Development and implementation of a balanced scorecard in an academic hospitalist group.

    PubMed

    Hwa, Michael; Sharpe, Bradley A; Wachter, Robert M

    2013-03-01

    Academic hospitalist groups (AHGs) are often expected to excel in multiple domains: quality improvement, patient safety, education, research, administration, and clinical care. To be successful, AHGs must develop strategies to balance their energies, resources, and performance. The balanced scorecard (BSC) is a strategic management system that enables organizations to translate their mission and vision into specific objectives and metrics across multiple domains. To date, no hospitalist group has reported on BSC implementation. We set out to develop a BSC as part of a strategic planning initiative. Based on a needs assessment of the University of California, San Francisco, Division of Hospital Medicine, mission and vision statements were developed. We engaged representative faculty to develop strategic objectives and determine performance metrics across 4 BSC perspectives. There were 41 metrics identified, and 16 were chosen for the initial BSC. It allowed us to achieve several goals: 1) present a broad view of performance, 2) create transparency and accountability, 3) communicate goals and engage faculty, and 4) ensure we use data to guide strategic decisions. Several lessons were learned, including the need to build faculty consensus, the importance of establishing metrics with reliable, measurable data, and the power of the BSC to drive goals across the division. We successfully developed and implemented a BSC in an AHG as part of a strategic planning initiative. The BSC has been instrumental in allowing us to achieve balanced success in multiple domains. Academic groups should consider employing the BSC as it allows for a data-driven strategic planning and assessment process. Copyright © 2013 Society of Hospital Medicine.

  4. Computer Series, 70.

    ERIC Educational Resources Information Center

    Moore, John W.

    1986-01-01

    Describes: (1) spreadsheet programs (including VisiCalc) for experiments; (2) event-driven data acquisition (using ADALAB with an Acculab Infrared Spectrometer); (3) microcomputer-controlled cyclic voltammetry; (4) inexpensive computerized experiments; (5) the "KC? Discoverer" program; and (6) MOLDOT (space-filling perspective diagrams of…

  5. 3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands

    PubMed Central

    Mateo, Carlos M.; Gil, Pablo; Torres, Fernando

    2016-01-01

    Sensing techniques are important for solving problems of uncertainty inherent to intelligent grasping tasks. The main goal here is to present a visual sensing system based on range imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system of complex grasping tasks to support a robot controller when other sensor systems, such as tactile and force, are not able to obtain useful data relevant to the grasping manipulation task. In particular, a new visual approach based on RGBD data was implemented to help a robot controller carry out intelligent manipulation tasks with flexible objects. The proposed method supervises the interaction between the grasped object and the robot hand in order to avoid poor contact between the fingertips and an object when there is neither force nor pressure data. This new approach is also used to measure changes to the shape of an object's surfaces and so allows us to find deformations caused by inappropriate pressure being applied by the hand's fingers. Tests were carried out for grasping tasks involving several flexible household objects with a multi-fingered robot hand working in real time. Our approach generates pulses from the deformation detection method and sends an event message to the robot controller when surface deformation is detected. In comparison with other methods, the obtained results reveal that our visual pipeline does not require deformation models of objects and materials, and that the approach works well with both planar and 3D household objects in real time. In addition, our method does not depend on the pose of the robot hand because the location of the reference system is computed by recognizing a pattern located at the robot forearm. The presented experiments demonstrate that the proposed method accomplishes good monitoring of grasping tasks with several objects and different grasping configurations in indoor environments. 
PMID:27164102
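The deformation-monitoring idea in the record above can be sketched as a per-pixel comparison of registered depth maps. This is a minimal illustration, not the authors' pipeline: the threshold, the minimum changed area, and the assumption of pre-registered depth maps are all hypothetical.

```python
import numpy as np

def detect_deformation(depth_before, depth_after, threshold_mm=5.0, min_area=50):
    """Flag a surface deformation when enough registered depth pixels change
    by more than a threshold (a simplified stand-in for an RGBD
    surface-change measurement)."""
    diff = np.abs(depth_after - depth_before)
    changed = diff > threshold_mm
    # event condition: enough pixels changed to send a message to the controller
    return bool(changed.sum() >= min_area)

# toy example: a flat surface with a dented 10x10 patch (invented numbers)
before = np.full((100, 100), 400.0)       # depth in mm
after = before.copy()
after[40:50, 40:50] -= 8.0                # 8 mm indentation
print(detect_deformation(before, after))  # True: deformation event detected
```

In a real system the two depth maps would come from consecutive RGBD frames expressed in the same reference frame, which is what the pattern on the robot forearm provides in the paper.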

  6. PLRP-3: Operational Perspectives of Conducting Science-Driven Extravehicular Activity with Communications Latency

    NASA Technical Reports Server (NTRS)

    Miller, Matthew J.; Lim, Darlene S. S.; Brady, Allyson; Cardman, Zena; Bell, Ernest; Garry, Brent; Reid, Donnie; Chappell, Steve; Abercromby, Andrew F. J.

    2016-01-01

    The Pavilion Lake Research Project (PLRP) is a unique platform where the combination of scientific research and human space exploration concepts can be tested in an underwater spaceflight analog environment. The 2015 PLRP field season was conducted at Pavilion Lake, Canada, where science-driven exploration techniques focusing on microbialite characterization and acquisition were evaluated within the context of crew and robotic extravehicular activity (EVA) operations. The primary objectives of this analog study were to detail the capabilities, decision-making process, and operational concepts required to meet non-simulated scientific objectives under a 5-minute one-way communication latency utilizing crew and robotic assets. Furthermore, this field study served as an opportunity to build upon previous tests at PLRP, NASA Desert Research and Technology Studies (DRATS), and NASA Extreme Environment Mission Operations (NEEMO) to characterize the functional roles and responsibilities of the personnel involved in the distributed flight control team and identify operational constraints imposed by science-driven EVA operations. The relationship and interaction between ground and flight crew was found to be dependent on the specific scientific activities being addressed. Furthermore, the addition of a second intravehicular operator was found to be highly enabling when conducting science-driven EVAs. Future human spaceflight activities will need to cope with the added complexity of dynamic and rapid execution of scientific priorities both during and between EVAs to ensure scientific objectives are achieved.

  7. Identification of copy number variation-driven genes for liver cancer via bioinformatics analysis.

    PubMed

    Lu, Xiaojie; Ye, Kun; Zou, Kailin; Chen, Jinlian

    2014-11-01

    To screen out copy number variation (CNV)-driven differentially expressed genes (DEGs) in liver cancer and advance our understanding of the pathogenesis, an integrated analysis of liver cancer-related CNV data from The Cancer Genome Atlas (TCGA) and gene expression data from the EBI ArrayExpress database was performed. The DEGs were identified by package limma based on the cut-off of |log2 (fold-change)|>0.585 and adjusted p-value<0.05. Using hg19 annotation information provided by UCSC, liver cancer-related CNVs were then screened out. TF-target gene interactions were also predicted with information from UCSC using DAVID online tools. As a result, 25 CNV-driven genes were obtained, including tripartite motif containing 28 (TRIM28) and RanBP-type and C3HC4-type zinc finger containing 1 (RBCK1). In the transcriptional regulatory network, 8 known cancer-related transcription factors (TFs) interacted with 21 CNV-driven genes, suggesting that the other 8 TFs may be involved in liver cancer. These genes may be potential biomarkers for early detection and prevention of liver cancer. These findings may improve our knowledge of the pathogenesis of liver cancer. Nevertheless, further experiments are still needed to confirm our findings.
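The DEG selection rule quoted above (|log2(fold-change)| > 0.585, i.e. about 1.5-fold, with adjusted p < 0.05) is simple to state directly. A minimal sketch with made-up values, not the limma workflow itself:

```python
import numpy as np

def select_degs(log2_fc, adj_p, fc_cutoff=0.585, p_cutoff=0.05):
    """Return indices of genes passing |log2(fold-change)| > 0.585
    (roughly 1.5-fold change) and adjusted p-value < 0.05."""
    log2_fc = np.asarray(log2_fc)
    adj_p = np.asarray(adj_p)
    mask = (np.abs(log2_fc) > fc_cutoff) & (adj_p < p_cutoff)
    return np.flatnonzero(mask)

# toy data for 4 genes (invented numbers)
fc = [0.9, -0.7, 0.3, 1.2]
p = [0.01, 0.20, 0.01, 0.04]
print(select_degs(fc, p))  # [0 3]: only genes 0 and 3 pass both cutoffs
```

Gene 1 has a large fold-change but fails the p-value cutoff, and gene 2 fails the fold-change cutoff, illustrating why both criteria are applied jointly.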

  8. An evaluation of data-driven motion estimation in comparison to the usage of external-surrogates in cardiac SPECT imaging

    PubMed Central

    Mukherjee, Joyeeta Mitra; Hutton, Brian F; Johnson, Karen L; Pretorius, P Hendrik; King, Michael A

    2014-01-01

    Motion estimation methods in single photon emission computed tomography (SPECT) can be classified into methods which depend on just the emission data (data-driven), or those that use some other source of information such as an external surrogate. The surrogate-based methods estimate the motion exhibited externally, which may not correlate exactly with the movement of organs inside the body. The accuracy of data-driven strategies, on the other hand, is affected by the type and timing of motion occurrence during acquisition, the source distribution, and various degrading factors such as attenuation, scatter, and system spatial resolution. The goal of this paper is to investigate the performance of two data-driven motion estimation schemes based on the rigid-body registration of projections of motion-transformed source distributions to the acquired projection data for cardiac SPECT studies. Six intensity-based registration metrics are also compared with an external surrogate-based method. In the data-driven schemes, a partially reconstructed heart is used as the initial source distribution. The partially-reconstructed heart has inaccuracies due to limited angle artifacts resulting from using only a part of the SPECT projections acquired while the patient maintained the same pose. The performance of different cost functions in quantifying consistency with the SPECT projection data in the data-driven schemes was compared for clinically realistic patient motion occurring as discrete pose changes, one or two times during acquisition. The six intensity-based metrics studied were mean-squared difference (MSD), mutual information (MI), normalized mutual information (NMI), pattern intensity (PI), normalized cross-correlation (NCC) and entropy of the difference (EDI).
Quantitative and qualitative analysis of the performance is reported using Monte-Carlo simulations of a realistic heart phantom including degradation factors such as attenuation, scatter and system spatial resolution. Further, the visual appearance of motion-corrected images using data-driven motion estimates was compared to that obtained using the external motion-tracking system in patient studies. Pattern intensity and normalized mutual information cost functions were observed to have the best performance in terms of lowest average position error and stability with degradation of image quality of the partial reconstruction in simulations. In all patients, the visual quality of PI-based estimation was either significantly better or comparable to NMI-based estimation. Best visual quality was obtained with PI-based estimation in 1 of the 5 patient studies, and with external-surrogate-based correction in 3 out of 5 patients. In the remaining patient study, there was little motion and all methods yielded similar visual image quality. PMID:24107647
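Two of the six registration metrics named above, mean-squared difference (MSD) and normalized cross-correlation (NCC), can be stated in a few lines. This is a generic sketch on toy arrays, not the study's implementation:

```python
import numpy as np

def msd(a, b):
    """Mean-squared difference: lower is a better match."""
    return np.mean((a - b) ** 2)

def ncc(a, b):
    """Normalized cross-correlation: ~1.0 is a perfect linear match."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.mean(a * b)

rng = np.random.default_rng(0)
ref = rng.random((64, 64))           # stand-in "acquired projection"
shifted = np.roll(ref, 3, axis=0)    # stand-in "motion-transformed" projection

print(msd(ref, ref), ncc(ref, ref))        # 0.0 and ~1.0 for identical images
print(msd(ref, shifted) > msd(ref, ref))   # misalignment raises the MSD cost
```

In the paper's setting, a motion estimate is accepted when it minimizes such a cost between the reprojected, motion-transformed source distribution and the measured projections.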

  9. EarthCube: Advancing Partnerships, Collaborative Platforms and Knowledge Networks in the Ocean Sciences

    NASA Astrophysics Data System (ADS)

    Stephen, Diggs; Lee, Allison

    2014-05-01

    The National Science Foundation's EarthCube initiative aims to create a community-driven data and knowledge management system that will allow for unprecedented data sharing across the geosciences. More than 2,500 individuals have participated through forums, work groups, EarthCube events, and virtual and in-person meetings. Those engaged represent the core earth-system sciences of solid Earth, Atmosphere, Oceans, and Polar Sciences. EarthCube is a cornerstone of NSF's Cyberinfrastructure for the 21st Century (CIF21) initiative, whose chief objective is to develop a U.S. nationwide, sustainable, and community-based cyberinfrastructure for researchers and educators. Increasingly effective community-driven cyberinfrastructure allows global data discovery and knowledge management and achieves interoperability and data integration across scientific disciplines. There is growing convergence across scientific and technical communities on creating a networked, knowledge management system and scientific data cyberinfrastructure that integrates Earth system and human dimensions data in an open, transparent, and inclusive manner. EarthCube does not intend to replicate these efforts, but to build upon them. An agile process is underway for the development and governance of EarthCube. The agile approach was deliberately selected due to its iterative and incremental nature while promoting adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness.

  10. Identifying Measures Used for Assessing Quality of YouTube Videos with Patient Health Information: A Review of Current Literature.

    PubMed

    Gabarron, Elia; Fernandez-Luque, Luis; Armayones, Manuel; Lau, Annie Ys

    2013-02-28

    Recent publications on YouTube have advocated its potential for patient education. However, a reliable description of what could be considered quality information for patient education on YouTube is missing. To identify topics associated with the concept of quality information for patient education on YouTube in the scientific literature. A literature review was performed in MEDLINE, ISI Web of Knowledge, Scopus, and PsycINFO. Abstract selection was first conducted by two independent reviewers; discrepancies were discussed in a second abstract review with two additional independent reviewers. The full texts of selected papers were analyzed for concepts, definitions, and topics used by their authors relating to the quality of information on YouTube for patient education. In total, 456 abstracts were extracted and 13 papers meeting eligibility criteria were analyzed. Concepts identified related to quality of information for patient education are categorized as expert-driven, popularity-driven, or heuristic-driven measures. These include (in descending order): (1) quality of content in 10/13 (77%), (2) view count in 9/13 (69%), (3) health professional opinion in 8/13 (62%), (4) adequate length or duration in 6/13 (46%), (5) public ratings in 5/13 (39%), (6) adequate title, tags, and description in 5/13 (39%), (7) good description or a comprehensive narrative in 4/13 (31%), (8) evidence-based practices included in video in 4/13 (31%), (9) suitability as a teaching tool in 4/13 (31%), (10) technical quality in 4/13 (31%), (11) credentials provided in video in 4/13 (31%), (12) enough amount of content to identify its objective in 3/13 (23%), and (13) viewership share in 2/13 (15%). Our review confirms that the current topics linked to quality of information for patient education on YouTube are unclear and not standardized.
Although expert-driven, popularity-driven, or heuristic-driven measures are used as proxies to estimate the quality of video information, caution should be applied when using YouTube for health promotion and patient educational material.

  11. Threshold-driven optimization for reference-based auto-planning

    NASA Astrophysics Data System (ADS)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.
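The two-component voxel penalty described above can be written out explicitly: each voxel is penalized quadratically for falling below an under-dose threshold or exceeding an over-dose threshold. A toy sketch (doses, thresholds, and weights are illustrative, not from the paper):

```python
import numpy as np

def quadratic_penalty(dose, t_under, t_over, w_under=1.0, w_over=1.0):
    """Voxel-based quadratic penalty with separate under- and over-dose
    thresholds (t_under, t_over) and penalty weights."""
    under = np.maximum(t_under - dose, 0.0)
    over = np.maximum(dose - t_over, 0.0)
    return np.sum(w_under * under**2 + w_over * over**2)

dose = np.array([58.0, 60.0, 63.0])   # Gy, a 3-voxel toy structure
# conventional planning tunes the weights with fixed thresholds;
# threshold-driven planning moves the thresholds instead:
print(quadratic_penalty(dose, t_under=60.0, t_over=62.0))  # 5.0: penalizes 58 and 63 Gy voxels
print(quadratic_penalty(dose, t_under=58.0, t_over=63.0))  # 0.0: relaxed thresholds, no penalty
```

The second call illustrates the paper's point: shifting thresholds reshapes the objective directly, whereas reweighting only rescales the same penalty terms.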

  12. Representational Similarity Analysis Reveals Commonalities and Differences in the Semantic Processing of Words and Objects

    PubMed Central

    Devereux, Barry J.; Clarke, Alex; Marouchos, Andreas; Tyler, Lorraine K.

    2013-01-01

    Understanding the meanings of words and objects requires the activation of underlying conceptual representations. Semantic representations are often assumed to be coded such that meaning is evoked regardless of the input modality. However, the extent to which meaning is coded in modality-independent or amodal systems remains controversial. We address this issue in a human fMRI study investigating the neural processing of concepts, presented separately as written words and pictures. Activation maps for each individual word and picture were used as input for searchlight-based multivoxel pattern analyses. Representational similarity analysis was used to identify regions correlating with low-level visual models of the words and objects and the semantic category structure common to both. Common semantic category effects for both modalities were found in a left-lateralized network, including left posterior middle temporal gyrus (LpMTG), left angular gyrus, and left intraparietal sulcus (LIPS), in addition to object- and word-specific semantic processing in ventral temporal cortex and more anterior MTG, respectively. To explore differences in representational content across regions and modalities, we developed novel data-driven analyses, based on k-means clustering of searchlight dissimilarity matrices and seeded correlation analysis. These revealed subtle differences in the representations in semantic-sensitive regions, with representations in LIPS being relatively invariant to stimulus modality and representations in LpMTG being uncorrelated across modality. These results suggest that, although both LpMTG and LIPS are involved in semantic processing, only the functional role of LIPS is the same regardless of the visual input, whereas the functional role of LpMTG differs for words and objects. PMID:24285896
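Representational similarity analysis as used above compares dissimilarity structure rather than raw activation patterns. A minimal generic sketch in which random patterns stand in for fMRI data (the stimulus and voxel counts are invented):

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - correlation between the
    response patterns of each pair of stimuli (rows = stimuli)."""
    return 1.0 - np.corrcoef(patterns)

def rsa_score(rdm_a, rdm_b):
    """Compare two RDMs by correlating their upper triangles."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

rng = np.random.default_rng(2)
patterns = rng.standard_normal((12, 100))     # 12 stimuli x 100 voxels (toy)
neural = rdm(patterns)
print(round(rsa_score(neural, neural), 3))    # 1.0: identical structure
```

In the study, one RDM comes from searchlight fMRI patterns and the other from a model (low-level visual features or semantic category structure); a high score means the region's representational geometry matches the model.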

  13. Pedagogical Distance: Explaining Misalignment in Student-Driven Online Learning Activities Using Activity Theory

    ERIC Educational Resources Information Center

    Westberry, Nicola; Franken, Margaret

    2015-01-01

    This paper provides an Activity Theory analysis of two online student-driven interactive learning activities to interrogate assumptions that such groups can effectively learn in the absence of the teacher. Such an analysis conceptualises learning tasks as constructed objects that drive pedagogical activity. The analysis shows a disconnect between…

  14. A predictive estimation method for carbon dioxide transport by data-driven modeling with a physically-based data model

    NASA Astrophysics Data System (ADS)

    Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun

    2017-11-01

    In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems.
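The Ogata-Banks solution adopted as the data model above has a closed form for continuous injection at x = 0: C(x,t) = (C0/2)[erfc((x - vt)/(2*sqrt(D*t))) + exp(v*x/D) * erfc((x + vt)/(2*sqrt(D*t)))]. A minimal sketch (the parameter values are illustrative, not fitted to the EIT site data):

```python
import math

def ogata_banks(x, t, v, D, c0=1.0):
    """One-dimensional Ogata-Banks solution of the advection-dispersion
    equation for steady flow with continuous injection at x = 0.
    x: distance, t: time, v: seepage velocity, D: dispersion coefficient."""
    s = 2.0 * math.sqrt(D * t)
    term1 = math.erfc((x - v * t) / s)
    term2 = math.exp(v * x / D) * math.erfc((x + v * t) / s)
    return 0.5 * c0 * (term1 + term2)

# the concentration front: ~c0 well behind it, ~0 well ahead of it
print(ogata_banks(x=1.0, t=100.0, v=0.1, D=0.01))   # well behind front: ~1.0
print(ogata_banks(x=50.0, t=100.0, v=0.1, D=0.01))  # ahead of front: ~0.0
```

In the paper's scheme, a curve of this form serves as the physically-based data model whose parameters are repeatedly re-fitted to the training portion of the monitoring time series.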

  15. External radioactive markers for PET data-driven respiratory gating in positron emission tomography.

    PubMed

    Büther, Florian; Ernst, Iris; Hamill, James; Eich, Hans T; Schober, Otmar; Schäfers, Michael; Schäfers, Klaus P

    2013-04-01

    Respiratory gating is an established approach to overcoming respiration-induced image artefacts in PET. Of special interest in this respect are raw PET data-driven gating methods which do not require additional hardware to acquire respiratory signals during the scan. However, these methods rely heavily on the quality of the acquired PET data (statistical properties, data contrast, etc.). We therefore combined external radioactive markers with data-driven respiratory gating in PET/CT. The feasibility and accuracy of this approach was studied for [(18)F]FDG PET/CT imaging in patients with malignant liver and lung lesions. PET data from 30 patients with abdominal or thoracic [(18)F]FDG-positive lesions (primary tumours or metastases) were included in this prospective study. The patients underwent a 10-min list-mode PET scan with a single bed position following a standard clinical whole-body [(18)F]FDG PET/CT scan. During this scan, one to three radioactive point sources (either (22)Na or (18)F, 50-100 kBq) in a dedicated holder were attached to the patient's abdomen. The list mode data acquired were retrospectively analysed for respiratory signals using established data-driven gating approaches and additionally by tracking the motion of the point sources in sinogram space. Gated reconstructions were examined qualitatively, in terms of the amount of respiratory displacement and in respect of changes in local image intensity in the gated images. The presence of the external markers did not affect whole-body PET/CT image quality. Tracking of the markers led to characteristic respiratory curves in all patients. Applying these curves for gated reconstructions resulted in images in which motion was well resolved. Quantitatively, the performance of the external marker-based approach was similar to that of the best intrinsic data-driven methods.
Overall, the gain in measured tumour uptake from the nongated to the gated images indicating successful removal of respiratory motion was correlated with the magnitude of the respiratory displacement of the respective tumour lesion, but not with lesion size. Respiratory information can be assessed from list-mode PET/CT through PET data-derived tracking of external radioactive markers. This information can be successfully applied to respiratory gating to reduce motion-related image blurring. In contrast to other previously described PET data-driven approaches, the external marker approach is independent of tumour uptake and thereby applicable even in patients with poor uptake and small tumours.
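A sketch of how a tracked marker trajectory could be turned into a gating signal. The amplitude-quantile binning below is a generic choice for illustration, not necessarily the authors' gating scheme, and the breathing trace is synthetic:

```python
import numpy as np

def respiratory_signal(marker_positions):
    """Reduce a tracked marker trajectory (one axial position per time bin)
    to a zero-mean respiratory curve."""
    z = np.asarray(marker_positions, dtype=float)
    return z - z.mean()

def amplitude_gates(signal, n_gates=4):
    """Assign each time bin to an amplitude gate (0 = lowest amplitudes)."""
    edges = np.quantile(signal, np.linspace(0.0, 1.0, n_gates + 1))
    return np.clip(np.digitize(signal, edges[1:-1]), 0, n_gates - 1)

# toy trajectory: ~0.25 Hz breathing sampled at 2 Hz (invented numbers)
t = np.arange(0, 60, 0.5)
z = 10.0 * np.sin(2 * np.pi * 0.25 * t)   # mm, marker on the abdomen
gates = amplitude_gates(respiratory_signal(z))
print(sorted(set(gates.tolist())))        # [0, 1, 2, 3]: all gates populated
```

Each gate then collects the list-mode events acquired in its time bins, and a separate reconstruction per gate yields the motion-resolved images described above.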

  16. An Inversion Analysis of Recent Variability in Natural CO2 Fluxes Using GOSAT and In Situ Observations

    NASA Astrophysics Data System (ADS)

    Wang, J. S.; Kawa, S. R.; Baker, D. F.; Collatz, G. J.; Ott, L. E.

    2015-12-01

    About one-half of the global CO2 emissions from fossil fuel combustion and deforestation accumulates in the atmosphere, where it contributes to global warming. The rest is taken up by vegetation and the ocean. The precise contribution of the two sinks, and their location and year-to-year variability are, however, not well understood. We use two different approaches, batch Bayesian synthesis inversion and variational data assimilation, to deduce the global spatiotemporal distributions of CO2 fluxes during 2009-2010. One of our objectives is to assess different sources of uncertainties in inferred fluxes, including uncertainties in prior flux estimates and observations, and differences in inversion techniques. For prior constraints, we utilize fluxes and uncertainties from the CASA-GFED model of the terrestrial biosphere and biomass burning driven by satellite observations and interannually varying meteorology. We also use measurement-based ocean flux estimates and two sets of fixed fossil CO2 emissions. Here, our inversions incorporate column CO2 measurements from the GOSAT satellite (ACOS retrieval, filtered and bias-corrected) and in situ observations (individual flask and afternoon-average continuous observations) to estimate fluxes in 108 regions over 8-day intervals for the batch inversion and at 3° x 3.75° weekly for the variational system. Relationships between fluxes and atmospheric concentrations are derived consistently for the two inversion systems using the PCTM atmospheric transport model driven by meteorology from the MERRA reanalysis. We compare the posterior fluxes and uncertainties derived using different data sets and the two inversion approaches, and evaluate the posterior atmospheric concentrations against independent data including aircraft measurements. The optimized fluxes generally resemble those from other studies. 
For example, the results indicate that the terrestrial biosphere is a net CO2 sink, and a GOSAT-only inversion suggests a shift in the global sink from the tropics/south to the north relative to the prior and to an in-situ-only inversion. We also find a smaller terrestrial sink in higher-latitude northern regions in boreal summer of 2010 relative to 2009.
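The batch Bayesian synthesis inversion named above has a standard closed form for a linear transport operator. A toy sketch with invented dimensions and matrices, not the 108-region setup of the study:

```python
import numpy as np

def bayesian_inversion(H, y, x_prior, B, R):
    """Batch Bayesian synthesis inversion for a linear observation operator
    (concentrations = H @ fluxes): returns the posterior flux estimate and
    covariance given prior covariance B and observation-error covariance R."""
    Binv = np.linalg.inv(B)
    Rinv = np.linalg.inv(R)
    post_cov = np.linalg.inv(H.T @ Rinv @ H + Binv)
    x_post = x_prior + post_cov @ H.T @ Rinv @ (y - H @ x_prior)
    return x_post, post_cov

# toy problem: 2 flux regions observed through 3 "stations" (invented numbers)
H = np.array([[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]])
truth = np.array([2.0, -1.0])
y = H @ truth                                # noise-free observations
x_prior = np.zeros(2)
x_post, P = bayesian_inversion(H, y, x_prior, B=np.eye(2), R=1e-6 * np.eye(3))
print(np.round(x_post, 3))                   # recovers the true fluxes [2, -1]
```

In practice H encodes the transport model (PCTM driven by MERRA meteorology here), R is far from negligible, and the posterior covariance P quantifies how much the observations reduce the prior flux uncertainty.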

  17. MO-G-304-01: FEATURED PRESENTATION: Expanding the Knowledge Base for Data-Driven Treatment Planning: Incorporating Patient Outcome Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, SP; Quon, H; Cheng, Z

    2015-06-15

    Purpose: To extend the capabilities of knowledge-based treatment planning beyond simple dose queries by incorporating validated patient outcome models. Methods: From an analytic, relational database of 684 head and neck cancer patients, 372 patients were identified having dose data for both left and right parotid glands as well as baseline and follow-up xerostomia assessments. For each existing patient, knowledge-based treatment planning was simulated by querying the dose-volume histograms and geometric shape relationships (overlap volume histograms) for all other patients. Dose predictions were captured at normalized volume thresholds (NVT) of 0%, 10%, 20%, 30%, 40%, 50%, and 85% and were compared with the actual achieved doses using the Wilcoxon signed-rank test. Next, a logistic regression model was used to predict the maximum severity of xerostomia up to three months following radiotherapy. Baseline xerostomia scores were subtracted from follow-up assessments and were also included in the model. The relative risks from predicted doses and actual doses were computed and compared. Results: The predicted doses for both parotid glands were significantly less than the achieved doses (p < 0.0001), with differences ranging from 830 cGy ± 1270 cGy (0% NVT) to 1673 cGy ± 1197 cGy (30% NVT). The modelled risk of xerostomia ranged from 54% to 64% for achieved doses and from 33% to 51% for the dose predictions. Relative risks varied from 1.24 to 1.87, with maximum relative risk occurring at 85% NVT. Conclusions: Data-driven generation of treatment planning objectives without consideration of the underlying normal tissue complication probability may result in inferior plans, even if quality metrics indicate otherwise. Inclusion of complication models in knowledge-based treatment planning is necessary in order to close the feedback loop between radiotherapy treatments and patient outcomes. 
Future work includes advancing and validating complication models in the context of knowledge-based treatment planning. This work is supported by Philips Radiation Oncology Systems.
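The relative-risk comparison described above can be illustrated with a toy logistic dose-response. The d50, slope, and doses below are invented for illustration, not the fitted xerostomia model from the study:

```python
import math

def logistic_ntcp(dose, d50=30.0, k=0.15):
    """Toy logistic dose-response: probability of complication given mean
    parotid dose in Gy. d50 (50% risk dose) and k (slope) are illustrative."""
    return 1.0 / (1.0 + math.exp(-k * (dose - d50)))

achieved_dose = 32.0    # Gy: what the clinical plan delivered (invented)
predicted_dose = 24.0   # Gy: what the knowledge-based query suggested (invented)

p_achieved = logistic_ntcp(achieved_dose)
p_predicted = logistic_ntcp(predicted_dose)
print(round(p_achieved / p_predicted, 2))   # relative risk of the complication
```

A relative risk above 1 means the achievable (predicted) dose would have carried lower modeled complication risk than the delivered plan, which is the feedback signal the authors argue should enter the planning objective.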

  18. An Inversion Analysis of Recent Variability in Natural CO2 Fluxes Using GOSAT and In Situ Observations

    NASA Technical Reports Server (NTRS)

    Wang, James S.; Kawa, S. Randolph; Collatz, G. James; Baker, David F.; Ott, Lesley

    2015-01-01

    About one-half of the global CO2 emissions from fossil fuel combustion and deforestation accumulates in the atmosphere, where it contributes to global warming. The rest is taken up by vegetation and the ocean. The precise contribution of the two sinks, and their location and year-to-year variability are, however, not well understood. We use two different approaches, batch Bayesian synthesis inversion and variational data assimilation, to deduce the global spatiotemporal distributions of CO2 fluxes during 2009-2010. One of our objectives is to assess different sources of uncertainties in inferred fluxes, including uncertainties in prior flux estimates and observations, and differences in inversion techniques. For prior constraints, we utilize fluxes and uncertainties from the CASA-GFED model of the terrestrial biosphere and biomass burning driven by satellite observations and interannually varying meteorology. We also use measurement-based ocean flux estimates and two sets of fixed fossil CO2 emissions. Here, our inversions incorporate column CO2 measurements from the GOSAT satellite (ACOS retrieval, filtered and bias-corrected) and in situ observations (individual flask and afternoon-average continuous observations) to estimate fluxes in 108 regions over 8-day intervals for the batch inversion and at 3 x 3.75 weekly for the variational system. Relationships between fluxes and atmospheric concentrations are derived consistently for the two inversion systems using the PCTM atmospheric transport model driven by meteorology from the MERRA reanalysis. We compare the posterior fluxes and uncertainties derived using different data sets and the two inversion approaches, and evaluate the posterior atmospheric concentrations against independent data including aircraft measurements. The optimized fluxes generally resemble those from other studies. 
For example, the results indicate that the terrestrial biosphere is a net CO2 sink, and a GOSAT-only inversion suggests a shift in the global sink from the tropics/south to the north relative to the prior and to an in-situ-only inversion. We also find a smaller terrestrial sink in higher-latitude northern regions in boreal summer of 2010 relative to 2009.

  19. Using scattering theory to compute invariant manifolds and numerical results for the laser-driven Hénon-Heiles system.

    PubMed

    Blazevski, Daniel; Franklin, Jennifer

    2012-12-01

    Scattering theory is a convenient way to describe systems that are subject to time-dependent perturbations which are localized in time. Using scattering theory, one can compute time-dependent invariant objects for the perturbed system knowing the invariant objects of the unperturbed system. In this paper, we use scattering theory to give numerical computations of invariant manifolds appearing in laser-driven reactions. In this setting, invariant manifolds separate regions of phase space that lead to different outcomes of the reaction and can be used to compute reaction rates.
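A numerical sketch of a driven Hénon-Heiles system like the one referenced above: the potential is V = (x^2 + y^2)/2 + x^2*y - y^3/3, and a time-localized pulse perturbs the dynamics. The Gaussian pulse shape, its coupling to the x coordinate, and the integrator are illustrative assumptions, not the paper's setup:

```python
import math

def henon_heiles_force(x, y):
    """Forces from V = (x^2 + y^2)/2 + x^2*y - y^3/3."""
    return -(x + 2.0 * x * y), -(y + x * x - y * y)

def laser(t, e0=0.05, t0=20.0, sigma=3.0):
    """Illustrative time-localized driving pulse (Gaussian envelope)."""
    return e0 * math.exp(-((t - t0) / sigma) ** 2)

def step(state, t, dt=1e-3):
    """One symplectic-Euler step; the pulse here drives the x coordinate."""
    x, y, px, py = state
    fx, fy = henon_heiles_force(x, y)
    px += dt * (fx + laser(t))
    py += dt * fy
    return (x + dt * px, y + dt * py, px, py)

# integrate through the pulse: long before and after it the dynamics reduce
# to the unperturbed system, which is the structure scattering theory exploits
state, t = (0.1, 0.0, 0.0, 0.05), 0.0
for _ in range(40000):          # t from 0 to 40; pulse centered at t = 20
    state = step(state, t)
    t += 1e-3
print(all(math.isfinite(v) for v in state))   # trajectory stays bounded here
```

Because the perturbation vanishes outside a finite time window, asymptotic states of trajectories like this one can be matched to the unperturbed system's invariant objects, which is how the scattering-theory construction of time-dependent invariant manifolds proceeds.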

  20. Chicago Residents’ Perceptions of Air Quality: Objective Pollution, the Built Environment, and Neighborhood Stigma Theory

    PubMed Central

    King, Katherine E.

    2014-01-01

    Substantial research documents higher pollution levels in minority neighborhoods, but little research evaluates how residents perceive their own communities’ pollution risks. According to “Neighborhood stigma” theory, survey respondents share a cultural bias that minorities cause social dysfunction, leading to over-reports of dysfunction in minority communities. This study investigates perceptions of residential outdoor air quality by linking objective data on built and social environments with multiple measures of pollution and a representative survey of Chicago residents. Consistent with the scholarly narrative, results show air quality is rated worse where minorities and poverty are concentrated, even after extensive adjustment for objective pollution and built environment measures. Perceptions of air pollution may thus be driven by neighborhood socioeconomic position far more than by respondents’ ability to perceive pollution. The finding that 63.5% of the sample reported excellent or good air quality helps to explain current challenges in promoting environmental action. PMID:26527847

  1. Decomposing delta, theta, and alpha time–frequency ERP activity from a visual oddball task using PCA

    PubMed Central

    Bernat, Edward M.; Malone, Stephen M.; Williams, William J.; Patrick, Christopher J.; Iacono, William G.

    2008-01-01

    Objective Time–frequency (TF) analysis has become an important tool for assessing electrical and magnetic brain activity from event-related paradigms. In electrical potential data, theta and delta activities have been shown to underlie P300 activity, and alpha has been shown to be inhibited during P300 activity. Measures of delta, theta, and alpha activity are commonly taken from TF surfaces. However, methods for extracting relevant activity do not commonly go beyond taking means of windows on the surface, analogous to measuring activity within a defined P300 window in time-only signal representations. The current objective was to use a data-driven method to derive relevant TF components from event-related potential data from a large number of participants in an oddball paradigm. Methods A recently developed PCA approach was employed to extract TF components [Bernat, E. M., Williams, W. J., and Gehring, W. J. (2005). Decomposing ERP time-frequency energy using PCA. Clin Neurophysiol, 116(6), 1314–1334] from an ERP dataset of 2068 17-year-olds (979 males). TF activity was taken from both individual trials and condition averages. Activity including frequencies ranging from 0 to 14 Hz and time ranging from stimulus onset to 1312.5 ms was decomposed. Results A coordinated set of time–frequency events was apparent across the decompositions. Similar TF components representing earlier theta followed by delta were extracted from both individual trials and averaged data. Alpha activity, as predicted, was apparent only when time–frequency surfaces were generated from trial-level data, and was characterized by a reduction during the P300. Conclusions Theta, delta, and alpha activities were extracted with predictable time-courses. Notably, this approach was effective at characterizing data from a single electrode. Finally, decomposition of TF data generated from individual trials and condition averages produced similar results, but with predictable differences. Specifically, trial-level data evidenced more numerous and more varied theta measures, and accounted for less overall variance. PMID:17027110
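The PCA decomposition of time-frequency surfaces described above can be sketched via the SVD. This is a generic illustration on synthetic data, not the Bernat et al. algorithm, which uses a specific TF transform and component rotation:

```python
import numpy as np

def tf_pca(tf_surfaces, n_components=3):
    """Generic PCA of a stack of time-frequency surfaces
    (trials x freqs x times): flatten each surface to a vector,
    mean-center across trials, and return per-trial component weights
    plus component surfaces."""
    n_trials, n_f, n_t = tf_surfaces.shape
    X = tf_surfaces.reshape(n_trials, n_f * n_t)
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    weights = U[:, :n_components] * s[:n_components]      # per-trial scores
    components = Vt[:n_components].reshape(n_components, n_f, n_t)
    return weights, components

# toy data: 50 "trials" with a variable-amplitude low-frequency burst
rng = np.random.default_rng(1)
burst = np.zeros((8, 20)); burst[0:2, 5:10] = 1.0         # delta-like region
amps = rng.uniform(0.5, 1.5, size=(50, 1, 1))
trials = amps * burst + 0.05 * rng.standard_normal((50, 8, 20))
w, comps = tf_pca(trials)
print(w.shape, comps.shape)          # (50, 3) (3, 8, 20)
```

With this construction the first component surface concentrates on the burst's time-frequency region, mirroring how the paper's components isolate theta and delta activity from the full TF surface.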

  2. Flow enhancement of deformable self-driven objects by countercurrent

    NASA Astrophysics Data System (ADS)

    Mashiko, Takashi; Fujiwara, Takashi

    2016-10-01

    We report numerical simulations of the mixed flows of two groups of deformable self-driven objects. The objects belonging to group A (B) have drift coefficient D = DA (DB), where a positive (negative) value of D denotes a rightward (leftward) driving force. For co-current flows (DA, DB > 0), the result is rather intuitive: the net flow of one group (QA) increases if the driving force of the other group is stronger than its own (i.e., DB > DA), and decreases otherwise (DB < DA).

  3. Enhancing the T-shaped learning profile when teaching hydrology using data, modeling, and visualization activities

    NASA Astrophysics Data System (ADS)

    Sanchez, Christopher A.; Ruddell, Benjamin L.; Schiesser, Roy; Merwade, Venkatesh

    2016-03-01

    Previous research has suggested that the use of more authentic learning activities can produce more robust and durable knowledge gains. This is consistent with calls within civil engineering education, specifically hydrology, that suggest that curricula should more often include professional perspective and data analysis skills to better develop the "T-shaped" knowledge profile of a professional hydrologist (i.e., professional breadth combined with technical depth). It was expected that the inclusion of a data-driven simulation lab exercise, contextualized within a real-world situation and more consistent with the job duties of a professional in the field, would provide enhanced learning and appreciation of job duties beyond more conventional paper-and-pencil exercises in a lower-division undergraduate course. Results indicate that while students learned in both conditions, learning was enhanced for the data-driven simulation group in nearly every content area. This pattern of results suggests that the use of data-driven modeling and visualization activities can have a significant positive impact on instruction. This increase in learning likely facilitates the development of student perspective and conceptual mastery, enabling students to make better choices about their studies, while also better preparing them for work as a professional in the field.

  4. Enhancing the T-shaped learning profile when teaching hydrology using data, modeling, and visualization activities

    NASA Astrophysics Data System (ADS)

    Sanchez, C. A.; Ruddell, B. L.; Schiesser, R.; Merwade, V.

    2015-07-01

    Previous research has suggested that the use of more authentic learning activities can produce more robust and durable knowledge gains. This is consistent with calls within civil engineering education, specifically hydrology, that suggest that curricula should more often include professional perspective and data analysis skills to better develop the "T-shaped" knowledge profile of a professional hydrologist (i.e., professional breadth combined with technical depth). It was expected that the inclusion of a data-driven simulation lab exercise, contextualized within a real-world situation and more consistent with the job duties of a professional in the field, would provide enhanced learning and appreciation of job duties beyond more conventional paper-and-pencil exercises in a lower-division undergraduate course. Results indicate that while students learned in both conditions, learning was enhanced for the data-driven simulation group in nearly every content area. This pattern of results suggests that the use of data-driven modeling and visualization activities can have a significant positive impact on instruction. This increase in learning likely facilitates the development of student perspective and conceptual mastery, enabling students to make better choices about their studies, while also better preparing them for work as a professional in the field.

  5. Empowering America's Communities to Prepare for the Effects of Climate Change: Developing Actionable Climate Science Under the President's Climate Action Plan

    NASA Astrophysics Data System (ADS)

    Duffy, P. B.; Colohan, P.; Driggers, R.; Herring, D.; Laurier, F.; Petes, L.; Ruffo, S.; Tilmes, C.; Venkataraman, B.; Weaver, C. P.

    2014-12-01

    Effective adaptation to impacts of climate change requires best-available information. To be most useful, this information should be easily found, well-documented, and translated into tools that decision-makers use and trust. To meet these needs, the President's Climate Action Plan includes efforts to develop "actionable climate science". The Climate Data Initiative (CDI) leverages the Federal Government's extensive, open data resources to stimulate innovation and private-sector entrepreneurship in support of actions to prepare for climate change. The Initiative forges commitments and partnerships from the private, NGO, academic, and public sectors to create data-driven tools. Open data from Federal agencies to support this innovation is available on Climate.Data.gov, initially focusing on coastal flooding but soon to expand to topics including food, energy, water, transportation, and health. The Climate Resilience Toolkit (CRT) will facilitate access to data-driven resilience tools, services, and best practices, including those accessible through the CDI. The CRT will also include access to training and tutorials, case studies, engagement forums, and other information sources. The Climate Action Plan also calls for a public-private partnership on extreme weather risk, with the goal of generating improved assessments of risk from different types of extreme weather events, using methods and data that are transparent and accessible. Finally, the U.S. Global Change Research Program and associated agencies work to advance the science necessary to inform decisions and sustain assessments. Collectively, these efforts represent increased emphasis across the Federal Government on the importance of information to support climate resilience.

  6. Understanding the Voice of the Customer: Practical, Data-Driven Planning and Decision Making for Access Services

    ERIC Educational Resources Information Center

    Huff-Eibl, Robyn; Miller-Wells, John; Begay, Wendy

    2014-01-01

    This article describes the process and role frontline access and public service staff play in needs assessment and evaluation of user services, specifically in understanding the voice of the customer. Information includes how the University of Arizona Libraries have incorporated daily data collection into the strategic planning process, resources…

  7. Re-Defining Language Teacher Cognition through a Data-Driven Model: The Case of Three EFL Teachers

    ERIC Educational Resources Information Center

    Öztürk, Gökhan; Gürbüz, Nurdan

    2017-01-01

    This study examined the main sources of the participant English as a foreign language (EFL) teachers' cognitions, their classroom practices and the impact of institutional context on these practices. The participants included three Turkish EFL instructors working at English preparatory programs at university level. The data were collected through…

  8. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as on the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects that are accessible to numerical models. We present an approach that brings the human expert's knowledge about the scene, the objects inside it, their representation in the data, and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand the possibilities and limitations of algorithms and to take these into account within the processing chain. This not only assists researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from advances in knowledge technologies within the Semantic Web framework, which have provided a strong base for applications based on knowledge management. In the article we present and describe the knowledge technologies used in our approach, such as the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds with specialists' knowledge of the scene and algorithmic processing.

  9. Object-oriented approach to fast display of electrophysiological data under MS-windows.

    PubMed

    Marion-Poll, F

    1995-12-01

    Microcomputers provide neuroscientists with an alternative to a host of laboratory equipment for recording and analyzing electrophysiological data. Object-oriented programming tools provide an essential link between custom data acquisition and analysis needs and general software packages. In this paper, we outline the layout of basic objects that display and manipulate electrophysiological data files. Visual inspection of the recordings is a basic requirement of any data analysis software. We present an approach that allows flexible and fast display of large data sets. It involves constructing an intermediate representation of the data that lowers the number of points actually displayed while preserving the visual appearance of the data. The second group of objects is related to the management of lists of data files. Typical experiments designed to test the biological activity of pharmacological products include scores of files; data manipulation and analysis are facilitated by creating multi-document objects that hold the names of all experiment files. Implementation steps for both groups of objects are described for an MS-Windows hosted application.
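    The paper does not spell out its intermediate representation; a common way to achieve this kind of fast display is min-max decimation, which keeps only the per-pixel envelope of the trace. A minimal sketch under that assumption (not the author's actual algorithm):

```python
def minmax_decimate(samples, n_pixels):
    """Reduce a long signal to at most 2 points per display pixel,
    preserving the visual envelope of the trace."""
    n = len(samples)
    if n <= 2 * n_pixels:
        return list(samples)        # already small enough to draw directly
    out = []
    for p in range(n_pixels):
        lo = p * n // n_pixels
        hi = (p + 1) * n // n_pixels
        chunk = samples[lo:hi]
        out.extend([min(chunk), max(chunk)])  # envelope of this pixel column
    return out

# 100,000 samples drawn into an 800-pixel-wide window -> 1600 points
import math
signal = [math.sin(i / 50.0) for i in range(100_000)]
reduced = minmax_decimate(signal, 800)
print(len(reduced))  # 1600
```

    Because each pixel column keeps its local minimum and maximum, spikes survive the decimation, which is exactly what visual inspection of electrophysiological recordings requires.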

  10. Origin of the pulse-like signature of shallow long-period volcano seismicity

    USGS Publications Warehouse

    Chouet, Bernard A.; Dawson, Phillip B.

    2016-01-01

    Short-duration, pulse-like long-period (LP) events are a characteristic type of seismicity accompanying eruptive activity at Mount Etna in Italy in 2004 and 2008 and at Turrialba Volcano in Costa Rica and Ubinas Volcano in Peru in 2009. We use the discrete wave number method to compute the free-surface response in the near field of a rectangular tensile crack embedded in a homogeneous elastic half-space and to gain insights into the origin of the LP pulses. Two source models are considered: (1) a vertical fluid-driven crack and (2) a unilateral tensile rupture growing at a fixed sub-Rayleigh velocity with constant opening on a vertical crack. We apply cross correlation to the synthetics and data to demonstrate that a fluid-driven crack provides a natural explanation for these data with realistic source sizes and fluid properties. Our modeling points to shallow sources (<1 km depth), whose signatures are representative of the Rayleigh pulse sampled at epicentral distances >∼1 km. While a slow-rupture failure provides another potential model for these events, the synthetics and resulting fits to the data are poorer than for a fluid-driven source. We infer that pulse-like LP signatures are part of the continuum of responses produced by shallow fluid-driven sources in volcanoes.

  11. Modelling Tradeoffs Evolution in Multipurpose Water Systems Operation in Response to Extreme Events

    NASA Astrophysics Data System (ADS)

    Mason, E.; Gazzotti, P.; Amigoni, F.; Giuliani, M.; Castelletti, A.

    2015-12-01

    Multipurpose water resource systems are usually operated on a tradeoff of the operating objectives which, under steady-state climatic and socio-economic boundary conditions, is supposed to ensure a fair and/or efficient balance among the conflicting interests. Extreme variability in the system's drivers might affect operators' risk aversion and force a change in the tradeoff. Properly accounting for these shifts is key to any rigorous retrospective assessment of operators' behavior and the associated system performance. In this study, we explore how the selection of different optimal tradeoffs among the operating objectives is linked to variations in the boundary conditions, such as a drifting rainfall season or remarkable changes in crop and energy prices. We argue that tradeoff selection is driven by recent, extreme variations in system performance: underperforming with respect to one operating objective's target should push the tradeoff toward that disadvantaged objective. To test this assumption, we developed a rational procedure to simulate the operators' tradeoff selection process. We map the selection onto a multilateral negotiation process, in which multiple virtual agents each optimize a different operating objective. The agents periodically negotiate a compromise on the operating policy. Each agent's rigidity in a negotiation round is determined by recent system performance on the specific objective it represents. The negotiation follows a set-based, egocentric, monotonic concession protocol: at each negotiation step an agent incrementally adds options to its set of acceptable compromises and (possibly) accepts progressively less satisfying policies until an agreement is reached. We apply this reiterated negotiation framework to the regulated Lake Como, Italy, simulating the lake dam operation and its recurrent updates over the last 50 years. The operation aims to balance shoreline flood prevention and irrigation deficit control in the downstream irrigated areas. Our simulated negotiations accurately capture the operator's risk-aversion changes driven by extreme wet and dry situations, and reproduce the observed release data well.
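    The concession protocol described above can be sketched as a simple loop: each agent starts with only its top-ranked policies and concedes by admitting lower-ranked ones each round, with an agreement declared once some policy is acceptable to all. The agents, policies, and fixed per-round concession steps below are illustrative assumptions, not the study's actual setup.

```python
def negotiate(preferences, concession_steps):
    """Set-based monotonic concession: agents expand their acceptable
    sets each round; agreement = first non-empty intersection.

    preferences: {agent: [policies ordered best-to-worst]}
    concession_steps: {agent: options added per round} -- a stand-in for
    the performance-driven rigidity described in the abstract.
    """
    acceptable = {a: set(prefs[:1]) for a, prefs in preferences.items()}
    cursor = {a: 1 for a in preferences}
    for round_ in range(max(len(p) for p in preferences.values())):
        common = set.intersection(*acceptable.values())
        if common:
            return common, round_
        for a, prefs in preferences.items():
            step = concession_steps[a]               # low step = rigid agent
            acceptable[a].update(prefs[cursor[a]:cursor[a] + step])
            cursor[a] += step
    return set.intersection(*acceptable.values()), round_

# two agents with opposed rankings over policies A..E; the less rigid
# agent ("irrigation", larger step) concedes faster
prefs = {"flood": ["A", "B", "C", "D", "E"],
         "irrigation": ["E", "D", "C", "B", "A"]}
agreement, rounds = negotiate(prefs, {"flood": 1, "irrigation": 2})
print(sorted(agreement), rounds)  # ['A', 'B', 'C'] 2
```

    Tying each agent's step size to its recent performance shortfall would reproduce the paper's idea that an underperforming objective becomes more rigid and pulls the compromise toward itself.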

  12. Veterans’ Preferences for Exchanging Information Using Veterans Affairs Health Information Technologies: Focus Group Results and Modeling Simulations

    PubMed Central

    Chavez, Margeaux; Nazi, Kim; Antinori, Nicole; Melillo, Christine; Cotner, Bridget A; Hathaway, Wendy; Cook, Ashley; Wilck, Nancy; Noonan, Abigail

    2017-01-01

    Background The Department of Veterans Affairs (VA) has multiple health information technology (HIT) resources for veterans to support their health care management. These include a patient portal, VetLink Kiosks, mobile apps, and telehealth services. The veteran patient population has a variety of needs and preferences that can inform current VA HIT redesign efforts to meet consumer needs. Objective This study aimed to describe veterans’ experiences using the current VA HIT and identify their vision for the future of an integrated VA HIT system. Methods Two rounds of focus group interviews were conducted with a single cohort of 47 veterans and one female caregiver recruited from Bedford, Massachusetts, and Tampa, Florida. Focus group interviews included simulation modeling activities and a self-administered survey. This study also used an expert panel group to provide data and input throughout the study process. High-fidelity, interactive simulations were created and used to facilitate collection of qualitative data. The simulations were developed based on system requirements, data collected through operational efforts, and participants' reported preferences for using VA HIT. Pairwise comparison activities of HIT resources were conducted with both focus groups and the expert panel. Rapid iterative content analysis was used to analyze qualitative data. Descriptive statistics summarized quantitative data. Results Data themes included (1) current use of VA HIT, (2) non-VA HIT use, and (3) preferences for future use of VA HIT. Data indicated that, although the Secure Messaging feature was often preferred, a full range of HIT options are needed. These data were then used to develop veteran-driven simulations that illustrate user needs and expectations when using a HIT system and services to access VA health care services. Conclusions Patient participant redesign processes present critical opportunities for creating a human-centered design. 
Veterans value virtual health care options and prefer standardized, integrated, and synchronized user-friendly interface designs. PMID:29061553

  13. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huerta, Gabriel

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that can then be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While no bias is desirable, only those biases that affect feedbacks contribute to scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes, and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, a set of python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development over HPC resources. This report reflects the main activities at the University of New Mexico, where the PI (Huerta) and the postdocs (Nosedal, Hattab, and Karki) worked on the project.

  14. Estimating Setup of Driven Piles into Louisiana Clayey Soils

    DOT National Transportation Integrated Search

    2009-11-15

    Two types of mathematical models for pile setup prediction, the Skov-Denver model and the newly developed rate-based model, have been established from all the dynamic and static testing data, including restrikes of the production piles, restrikes, st...

  15. Estimating setup of driven piles into Louisiana clayey soils.

    DOT National Transportation Integrated Search

    2010-11-15

    Two types of mathematical models for pile setup prediction, the Skov-Denver model and the newly developed rate-based model, have been established from all the dynamic and static testing data, including restrikes of the production piles, restrikes, st...

  16. On Lack of Robustness in Hydrological Model Development Due to Absence of Guidelines for Selecting Calibration and Evaluation Data: Demonstration for Data-Driven Models

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Maier, Holger R.; Wu, Wenyan; Dandy, Graeme C.; Gupta, Hoshin V.; Zhang, Tuqiao

    2018-02-01

    Hydrological models are used for a wide variety of engineering purposes, including streamflow forecasting and flood-risk estimation. To develop such models, it is common to allocate the available data to calibration and evaluation data subsets. Surprisingly, the issue of how this allocation can affect model evaluation performance has been largely ignored in the research literature. This paper discusses the evaluation performance bias that can arise from how available data are allocated to calibration and evaluation subsets. As a first step to assessing this issue in a statistically rigorous fashion, we present a comprehensive investigation of the influence of data allocation on the development of data-driven artificial neural network (ANN) models of streamflow. Four well-known formal data splitting methods are applied to 754 catchments from Australia and the U.S. to develop 902,483 ANN models. Results clearly show that the choice of the method used for data allocation has a significant impact on model performance, particularly for runoff data that are more highly skewed, highlighting the importance of considering the impact of data splitting when developing hydrological models. The statistical behavior of the data splitting methods investigated is discussed and guidance is offered on the selection of the most appropriate data splitting methods to achieve representative evaluation performance for streamflow data with different statistical properties. Although our results are obtained for data-driven models, they highlight the fact that this issue is likely to have a significant impact on all types of hydrological models, especially conceptual rainfall-runoff models.
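    The allocation effect described above can be illustrated with a toy experiment: for skewed flow data, a chronological split can leave the evaluation subset statistically unlike the calibration subset, while a range-spanning split keeps them comparable. The two split schemes and the synthetic flows below are illustrative stand-ins, not the four formal methods used in the study.

```python
import random
import statistics

def allocate(data, method, frac=0.7):
    """Allocate data to calibration/evaluation subsets by two simple schemes."""
    if method == "chronological":        # first 70% calibrate, rest evaluate
        k = int(len(data) * frac)
        return data[:k], data[k:]
    if method == "stratified":           # sort, then sample systematically so
        s = sorted(data)                 # both subsets span the full flow range
        cal = [v for i, v in enumerate(s) if i % 10 < 7]
        ev = [v for i, v in enumerate(s) if i % 10 >= 7]
        return cal, ev
    raise ValueError(method)

# skewed synthetic 'streamflow': mostly low flows, a few large events late on
random.seed(1)
flows = [random.expovariate(1.0) for _ in range(300)]
flows += [20 + random.random() for _ in range(10)]

diffs = {}
for method in ("chronological", "stratified"):
    cal, ev = allocate(flows, method)
    # bias proxy: how different the evaluation mean is from the calibration mean
    diffs[method] = statistics.mean(ev) - statistics.mean(cal)
    print(method, round(diffs[method], 2))
```

    The chronological split concentrates the late large events in the evaluation subset, producing a large mean discrepancy; the range-spanning split keeps the subsets representative of each other, which is the property the paper's guidance aims for.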

  17. Quantum correlations from a room-temperature optomechanical cavity.

    PubMed

    Purdy, T P; Grutter, K E; Srinivasan, K; Taylor, J M

    2017-06-23

    The act of position measurement alters the motion of an object being measured. This quantum measurement backaction is typically much smaller than the thermal motion of a room-temperature object and thus difficult to observe. By shining laser light through a nanomechanical beam, we measure the beam's thermally driven vibrations and perturb its motion with optical force fluctuations at a level dictated by the Heisenberg measurement-disturbance uncertainty relation. We demonstrate a cross-correlation technique to distinguish optically driven motion from thermally driven motion, observing this quantum backaction signature up to room temperature. We use the scale of the quantum correlations, which is determined by fundamental constants, to gauge the size of thermal motion, demonstrating a path toward absolute thermometry with quantum mechanically calibrated ticks. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
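    The cross-correlation idea, that two measurements share the driven component while their detection noises are independent, can be sketched with toy signals (this illustrates the general principle only, not the experiment's actual data processing):

```python
import math
import random

def cross_correlation(x, y):
    """Zero-lag cross-correlation estimate: uncorrelated noise averages
    toward zero while a common component survives."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / n

random.seed(0)
n = 20000
common = [math.sin(2 * math.pi * 0.01 * i) for i in range(n)]  # shared 'driven' motion
x = [c + random.gauss(0, 3) for c in common]   # record 1: large independent noise
y = [c + random.gauss(0, 3) for c in common]   # record 2: large independent noise
print(round(cross_correlation(x, y), 2))       # close to 0.5, the shared variance
```

    Even though the noise in each record dwarfs the common signal, the correlation estimate converges to the variance of the shared component, which is how a weak backaction signature can be pulled out of dominant thermal motion.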

  18. Common and differential electrophysiological mechanisms underlying semantic object memory retrieval probed by features presented in different stimulus types.

    PubMed

    Chiang, Hsueh-Sheng; Eroh, Justin; Spence, Jeffrey S; Motes, Michael A; Maguire, Mandy J; Krawczyk, Daniel C; Brier, Matthew R; Hart, John; Kraut, Michael A

    2016-08-01

    How the brain combines the neural representations of features that comprise an object in order to activate a coherent object memory is poorly understood, especially when the features are presented in different modalities (visual vs. auditory) and domains (verbal vs. nonverbal). We examined this question using three versions of a modified Semantic Object Retrieval Test, where object memory was probed by a feature presented as a written word, a spoken word, or a picture, followed by a second feature always presented as a visual word. Participants indicated whether each feature pair elicited retrieval of the memory of a particular object. Sixteen subjects completed each of the three versions (N=48 in total) while their EEG was recorded simultaneously. We analyzed the EEG data in four separate frequency bands (delta: 1-4 Hz; theta: 4-7 Hz; alpha: 8-12 Hz; beta: 13-19 Hz) using a multivariate data-driven approach. We found that alpha power time-locked to response was modulated by both cross-modality (visual vs. auditory) and cross-domain (verbal vs. nonverbal) probing of semantic object memory. In addition, retrieval trials showed greater changes in all frequency bands compared to non-retrieval trials across all stimulus types in both response-locked and stimulus-locked analyses, suggesting dissociable neural subcomponents involved in binding object features to retrieve a memory. We conclude that these findings support both modality/domain-dependent and modality/domain-independent mechanisms during semantic object memory retrieval. Copyright © 2016 Elsevier B.V. All rights reserved.
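    Band-limited power of the kind analyzed here can be estimated with a simple periodogram; the sketch below uses the band edges quoted above but is a generic single-channel estimate, not the authors' multivariate pipeline.

```python
import numpy as np

BANDS = {"delta": (1, 4), "theta": (4, 7), "alpha": (8, 12), "beta": (13, 19)}

def band_power(signal, fs):
    """Mean spectral power of an epoch within each canonical EEG band."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs <= hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# toy epoch: a 10 Hz (alpha) oscillation plus weak noise, 1 s at 256 Hz
rng = np.random.default_rng(0)
t = np.arange(256) / 256
epoch = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(256)
powers = band_power(epoch, fs=256)
print(max(powers, key=powers.get))  # alpha
```

    Comparing such band powers between retrieval and non-retrieval trials, time-locked to either stimulus or response, is the basic measurement underlying the analyses described in the abstract.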

  19. HIV risk practices by female sex workers according to workplace.

    PubMed

    Damacena, Giseli Nogueira; Szwarcwald, Célia Landmann; Souza Júnior, Paulo Roberto Borges de

    2014-06-01

    OBJECTIVE To investigate differences in HIV infection- related risk practices by Female Sex Workers according to workplace and the effects of homophily on estimating HIV prevalence. METHODS Data from 2,523 women, recruited using Respondent-Driven Sampling, were used for the study carried out in 10 Brazilian cities in 2008-2009. The study included female sex workers aged 18 and over. The questionnaire was completed by the subjects and included questions on characteristics of professional activity, sexual practices, use of drugs, HIV testing, and access to health services. HIV quick tests were conducted. The participants were classified in two groups according to place of work: on the street or indoor venues, like nightclubs and saunas. To compare variable distributions by place of work, we used Chi-square homogeneity tests, taking into consideration unequal selection probabilities as well as the structure of dependence between observations. We tested the effect of homophily by workplace on estimated HIV prevalence. RESULTS The highest HIV risk practices were associated with: working on the streets, lower socioeconomic status, low regular smear test coverage, higher levels of crack use and higher levels of syphilis serological scars as well as higher prevalence of HIV infection. The effect of homophily was higher among sex workers in indoor venues. However, it did not affect the estimated prevalence of HIV, even after using a post-stratification by workplace procedure. CONCLUSIONS The findings suggest that strategies should focus on extending access to, and utilization of, health services. Prevention policies should be specifically aimed at street workers. Regarding the application of Respondent-Driven Sampling, the sample should be sufficient to estimate transition probabilities, as the network develops more quickly among sex workers in indoor venues.
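    The unequal selection probabilities inherent to Respondent-Driven Sampling are commonly corrected by weighting each respondent by the inverse of her network degree (an RDS-II / Volz-Heckathorn-style estimator). A minimal sketch with invented toy data, not the study's exact weighting:

```python
def rds_prevalence(samples):
    """Inverse-degree-weighted prevalence estimate: respondents with many
    network contacts are more likely to be recruited, so they are
    down-weighted relative to low-degree respondents."""
    wsum = sum(1 / d for _, d in samples)
    return sum(pos / d for pos, d in samples) / wsum

# (hiv_positive?, network degree): positives concentrated among
# high-degree respondents are down-weighted by the estimator
data = [(1, 10), (1, 10), (0, 2), (0, 2), (0, 2), (1, 5)]
naive = sum(p for p, _ in data) / len(data)
print(round(naive, 2), round(rds_prevalence(data), 2))  # 0.5 0.21
```

    The gap between the naive and weighted estimates illustrates why the abstract stresses accounting for unequal selection probabilities, and why the sample must be large enough to estimate the transition probabilities of the recruitment network.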

  20. HIV risk practices by female sex workers according to workplace

    PubMed Central

    Damacena, Giseli Nogueira; Szwarcwald, Célia Landmann; de Souza, Paulo Roberto Borges

    2014-01-01

    OBJECTIVE To investigate differences in HIV infection- related risk practices by Female Sex Workers according to workplace and the effects of homophily on estimating HIV prevalence. METHODS Data from 2,523 women, recruited using Respondent-Driven Sampling, were used for the study carried out in 10 Brazilian cities in 2008-2009. The study included female sex workers aged 18 and over. The questionnaire was completed by the subjects and included questions on characteristics of professional activity, sexual practices, use of drugs, HIV testing, and access to health services. HIV quick tests were conducted. The participants were classified in two groups according to place of work: on the street or indoor venues, like nightclubs and saunas. To compare variable distributions by place of work, we used Chi-square homogeneity tests, taking into consideration unequal selection probabilities as well as the structure of dependence between observations. We tested the effect of homophily by workplace on estimated HIV prevalence. RESULTS The highest HIV risk practices were associated with: working on the streets, lower socioeconomic status, low regular smear test coverage, higher levels of crack use and higher levels of syphilis serological scars as well as higher prevalence of HIV infection. The effect of homophily was higher among sex workers in indoor venues. However, it did not affect the estimated prevalence of HIV, even after using a post-stratification by workplace procedure. CONCLUSIONS The findings suggest that strategies should focus on extending access to, and utilization of, health services. Prevention policies should be specifically aimed at street workers. Regarding the application of Respondent-Driven Sampling, the sample should be sufficient to estimate transition probabilities, as the network develops more quickly among sex workers in indoor venues. PMID:25119937

  1. Data driven innovations in structural health monitoring

    NASA Astrophysics Data System (ADS)

    Rosales, M. J.; Liyanapathirana, R.

    2017-05-01

    At present, substantial investments are being allocated to civil infrastructure, which is considered a valuable asset at a national and global scale. Structural Health Monitoring (SHM) is an indispensable tool for ensuring the performance and safety of these structures based on measured response parameters. Research on damage assessment has tended to focus on wireless sensor networks (WSNs), which have proven to be the best alternative to traditional visual inspections and their tethered or wired counterparts. Over the last decade, the structural health and behaviour of innumerable infrastructures have been measured and evaluated owing to several successful deployments of these sensor networks. Modern monitoring systems can rapidly transmit, measure, and store large volumes of data. The amount of data collected from these networks has become unmanageable, raising further issues such as data quality, relevance, re-use, and decision support. There is an increasing need to integrate new technologies in order to automate the evaluation processes and to enhance the objectivity of data assessment routines. This paper aims to identify feasible methodologies for applying time-series analysis techniques to judiciously exploit the vast amount of readily available and upcoming data resources. It continues the momentum of a greater effort to collect and archive SHM approaches that will serve as data-driven innovations for the assessment of damage through efficient algorithms and data analytics.

  2. An Experimental Investigation of Unsteady Thrust Augmentation Using a Speaker-Driven Jet

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Wernet, Mark P.; John, Wentworth T.

    2004-01-01

    An experimental investigation is described in which a simple speaker-driven jet was used as a pulsed thrust source (driver) for an ejector configuration. The objectives of the investigation were twofold: first, to add to the experimental body of evidence showing that an unsteady thrust source, combined with a properly sized ejector, generally yields higher thrust augmentation values than a similarly sized, steady driver of equivalent thrust; second, to identify characteristics of the unsteady driver that may be useful for sizing ejectors and predicting the thrust augmentation values that may be achieved. The speaker-driven jet provided a convenient source for the investigation because it is entirely unsteady (having no mean component) and because relevant parameters such as frequency, time-averaged thrust, and diameter are easily varied. The experimental setup is described, as are the various measurements made, which include both thrust and Digital Particle Image Velocimetry measurements of the driver. It is shown that thrust augmentation values as high as 1.8 were obtained, that the diameter of the best ejector scaled with the dimensions of the emitted vortex, and that the so-called Formation Number serves as a useful dimensionless number by which to characterize the jet and predict performance.
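    The Formation Number referred to above is commonly defined as the stroke ratio L/D of the emitted fluid slug, with vortex-ring pinch-off observed near L/D ≈ 4. A minimal sketch of computing it from a measured velocity pulse; the pulse shape and dimensions below are invented for illustration:

```python
import math

def formation_number(velocity_samples, dt, diameter):
    """Stroke ratio L/D for one pulse: time-integrated slug velocity
    over the orifice diameter (L = integral of u(t) dt)."""
    stroke_length = sum(v * dt for v in velocity_samples)
    return stroke_length / diameter

# toy half-sine velocity pulse: 10 m/s peak over 5 ms, 2 cm orifice
dt = 1e-4
pulse = [10 * math.sin(math.pi * i / 50) for i in range(50)]
print(round(formation_number(pulse, dt, 0.02), 2))  # 1.59
```

    Comparing this per-pulse stroke ratio against the pinch-off threshold is one way such a dimensionless number can be used to characterize an unsteady driver and anticipate ejector performance.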

  3. Evaluation of global water quality - the potential of a data- and model-driven analysis

    NASA Astrophysics Data System (ADS)

    Bärlund, Ilona; Flörke, Martina; Alcamo, Joseph; Völker, Jeanette; Malsy, Marcus; Kaus, Andrew; Reder, Klara; Büttner, Olaf; Katterfeld, Christiane; Dietrich, Désirée; Borchardt, Dietrich

    2016-04-01

    The ongoing socio-economic development presents a new challenge for water quality worldwide, especially in developing and emerging countries. It is estimated that, owing to population growth and the extension of water supply networks, the amount of wastewater will rise sharply. This can lead to an increased risk of surface water quality degradation if the wastewater is not sufficiently treated, with impacts on ecosystems, human health, and food security. The United Nations Member States have adopted targets for sustainable development, which include, inter alia, sustainable protection of water quality and sustainable use of water resources. Achieving these goals requires appropriate monitoring strategies and the development of indicators for water quality. Within the pre-study for a 'World Water Quality Assessment' (WWQA) led by the United Nations Environment Programme (UNEP), a methodology for assessing water quality that takes the above-mentioned objectives into account has been developed. The novelty of this methodology is its linked model- and data-driven approach. The focus is on parameters reflecting the key water quality issues, such as increased wastewater pollution, salinization, or eutrophication. The results from the pre-study indicate, for example, that about one seventh of all watercourses in Latin America, Africa, and Asia already show high organic pollution, which is of central importance for inland fisheries and the associated food security. In addition, it could be demonstrated that global water quality databases have large gaps. These must be closed in the future in order to obtain an overall picture of global water quality and to target measures more efficiently. The aim of this presentation is to introduce the methodology developed within the WWQA pre-study and to show selected examples of its application in Latin America, Africa, and Asia.

  4. Data for Renewable Energy Planning, Policy, and Investment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L

    Reliable, robust, and validated data are critical for informed planning, policy development, and investment in the clean energy sector. The Renewable Energy (RE) Explorer was developed to support data-driven renewable energy analysis that can inform key renewable energy decisions globally. This document presents the types of geospatial and other data at the core of renewable energy analysis and decision making. Individual data sets used to inform decisions vary in spatial and temporal resolution, quality, and overall usefulness. From Data to Decisions, a complementary geospatial data and analysis decision guide, provides an in-depth view of these and other considerations to enable data-driven planning, policymaking, and investment. Data support a wide variety of renewable energy analyses and decisions, including technical and economic potential assessment, renewable energy zone analysis, grid integration, risk and resiliency identification, electrification, and distributed solar photovoltaic potential. This fact sheet provides information on the types of data that are important for renewable energy decision making using the RE Data Explorer or similar geospatial analysis tools.

  5. Exploring Techniques of Developing Writing Skill in IELTS Preparatory Courses: A Data-Driven Study

    ERIC Educational Resources Information Center

    Ostovar-Namaghi, Seyyed Ali; Safaee, Seyyed Esmail

    2017-01-01

    Being driven by the hypothetico-deductive mode of inquiry, previous studies have tested the effectiveness of theory-driven interventions under controlled experimental conditions to come up with universally applicable generalizations. To make a case in the opposite direction, this data-driven study aims at uncovering techniques and strategies…

  6. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: accepting the data package from the data providers and ensuring the full integrity of the data files; identifying and addressing data quality issues; assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; setting up data access mechanisms; setting up the data in data tools and services for improved data dissemination and user experience; registering the dataset in online search and discovery catalogues; and preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
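    The publication steps above can be sketched as a tracked pipeline. The step names, record fields, and DOI string below are hypothetical placeholders, not the DAAC implementation:

    ```python
    # Minimal sketch of a tracked data-publication pipeline (hypothetical
    # step names and fields; real DAAC workflows are far more involved).
    def check_integrity(pkg):
        pkg["checksums_ok"] = True          # stand-in for real file checks
        return pkg

    def assemble_metadata(pkg):
        pkg["metadata"] = {"title": pkg["name"]}
        return pkg

    def register_doi(pkg):
        pkg["doi"] = "10.0000/example"      # placeholder identifier
        return pkg

    STEPS = [check_integrity, assemble_metadata, register_doi]

    def publish(pkg):
        log = []                            # tracks the dataset through curation
        for step in STEPS:
            pkg = step(pkg)
            log.append(step.__name__)
        return pkg, log

    dataset, log = publish({"name": "field-campaign-2017"})
    print(log)  # every step ran, in order
    ```

    Recording the per-step log is what lets a workflow system report where each dataset sits in the curation process, one of the stated goals above.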

  7. Linking diet, physical activity, cardiorespiratory fitness and obesity to serum metabolite networks: findings from a population-based study

    PubMed Central

    Floegel, A; Wientzek, A; Bachlechner, U; Jacobs, S; Drogan, D; Prehn, C; Adamski, J; Krumsiek, J; Schulze, M B; Pischon, T; Boeing, H

    2014-01-01

    Objective: It is not yet resolved how lifestyle factors and intermediate phenotypes interrelate with metabolic pathways. We aimed to investigate the associations of diet, physical activity, cardiorespiratory fitness and obesity with serum metabolite networks in a population-based study. Methods: The present study included 2380 participants of a randomly drawn subcohort of the European Prospective Investigation into Cancer and Nutrition-Potsdam. Targeted metabolomics was used to measure 127 serum metabolites. Additional data were available, including anthropometric measurements, dietary assessment (including intake of whole-grain bread, coffee, and cake and cookies) by food frequency questionnaire, and objectively measured physical activity energy expenditure and cardiorespiratory fitness in a subsample of 100 participants. In a data-driven approach, Gaussian graphical modeling was used to draw metabolite networks and depict relevant associations between exposures and serum metabolites. In addition, the relationships among the different exposure metabolite networks were estimated. Results: In the serum metabolite network, the different metabolite classes could be separated: a large group of phospholipids and acylcarnitines, and a group of amino acids and C6-sugar. Amino acids were particularly positively associated with cardiorespiratory fitness and physical activity. C6-sugar and acylcarnitines were positively associated with obesity and inversely with intake of whole-grain bread. Phospholipids showed opposite associations with obesity and coffee intake. Metabolite networks of coffee intake and obesity were strongly inversely correlated (body mass index (BMI): r=−0.57 and waist circumference: r=−0.59). A strong positive correlation was observed between the metabolite networks of BMI and waist circumference (r=0.99), as well as between the metabolite networks of cake and cookie intake and those of cardiorespiratory fitness and intake of whole-grain bread (r=0.52 and r=0.50, respectively). Conclusions: Lifestyle factors and phenotypes seem to interrelate in various metabolic pathways. A possible protective effect of coffee could be mediated via counterbalancing pathways of obesity involving hepatic phospholipids. Experimental studies should validate the biological mechanisms. PMID:24608922
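    The Gaussian graphical modeling step used in the study can be illustrated on synthetic data (not the study's metabolites): edges in such a network correspond to nonzero partial correlations, read off the inverse covariance (precision) matrix.

    ```python
    import numpy as np

    # Synthetic chain x -> y -> z: x and z are marginally correlated, but
    # their partial correlation (conditioning on y) shows no direct x-z edge.
    rng = np.random.default_rng(1)
    n = 5000
    x = rng.standard_normal(n)
    y = 0.8 * x + 0.6 * rng.standard_normal(n)
    z = 0.8 * y + 0.6 * rng.standard_normal(n)
    data = np.column_stack([x, y, z])

    prec = np.linalg.inv(np.cov(data, rowvar=False))   # precision matrix
    d = np.sqrt(np.diag(prec))
    partial = -prec / np.outer(d, d)                   # partial correlations
    np.fill_diagonal(partial, 1.0)

    print(abs(partial[0, 2]) < 0.1)  # no direct x-z edge in the graphical model
    ```

    This is why a Gaussian graphical model separates metabolite classes more cleanly than a plain correlation network: indirect associations are conditioned away.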

  8. Data Driven Professional Development Design for Out-of-School Time Educators Using Planetary Science and Engineering Educational Materials

    NASA Astrophysics Data System (ADS)

    Clark, J.; Bloom, N.

    2017-12-01

    Data-driven design practices should be the basis for any effective educational product, particularly those used to support STEM learning and literacy. Planetary Learning that Advances the Nexus of Engineering, Technology, and Science (PLANETS) is a five-year NASA-funded (NNX16AC53A) interdisciplinary and cross-institutional partnership to develop and disseminate STEM out-of-school time (OST) curricular and professional development units that integrate planetary science, technology, and engineering. The Center for Science Teaching and Learning at Northern Arizona University, the U.S. Geological Survey Astrogeology Science Center, and the Museum of Science Boston are partners in developing, piloting, and researching the impact of three OST units: two for middle-grades youth and one for upper-elementary-aged youth. The presentation will highlight the data-driven development process of the educational products used to support educators teaching these curriculum units, including how data from the project needs assessment, curriculum pilot testing, and professional support product field tests are used in the design of products for OST educators. Based on data analysis, the project is developing and testing four tiers of professional support for OST educators. Tier 1 meets the immediate needs of OST educators to teach the curriculum and includes how-to videos and other direct support materials. Tier 2 provides additional content and pedagogical knowledge and includes short content videos designed specifically to address the content of the curriculum. Tier 3 elaborates on best practices in education and gives guidance on methods, for example, to develop cultural relevancy for underrepresented students. Tier 4 helps make connections to other NASA or educational products that support STEM learning in out-of-school settings. Examples of the tiers of support will be provided.

  9. Measurement and Analysis of Extreme Wave and Ice Actions in the Great Lakes for Offshore Wind Platform Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    England, Tony; van Nieuwstadt, Lin; De Roo, Roger

    This project, funded by the Department of Energy as DE-EE0005376, successfully measured wind-driven lake ice forces on an offshore structure in Lake Superior through one of the coldest winters in recent history. While offshore regions of the Great Lakes offer promising opportunities for harvesting wind energy, these massive bodies of freshwater also pose extreme and unique challenges. Among these challenges is the need to anticipate the forces exerted on offshore structures by lake ice. The parameters of interest include the frequency, extent, and movement of lake ice, parameters that are routinely monitored via satellite, and ice thickness, a parameter that has been monitored at discrete locations over many years and is routinely modeled. Essential to making these data useful in the design of offshore structures, and the primary objective of this project, are measurements of the maximum forces that lake ice of known thickness might exert on an offshore structure.

  10. Pharmacy Residency School-wide Match Rates and Modifiable Predictors in ACPE-accredited Colleges and Schools of Pharmacy

    PubMed Central

    Whittaker, Alana; Shan, Guogen

    2017-01-01

    Objective. To analyze the modifiable predictors of institution-wide residency match rates. Methods. This was a retrospective analysis of colleges and schools of pharmacy data and school-wide PGY-1 pharmacy residency match rates for 2013 through 2015. Independent variables included NAPLEX passing rates, history of ACPE probation, NIH funding, academic health center affiliation, dual-degree availability, program length, admit-to-applicant ratio, class size, tuition, student-driven research, clinically focused academic tracks, residency affiliation, U.S. News & World Report rankings, and minority enrollment. Results. In a repeated measures model, predictors of match results were NAPLEX pass rate, class size, academic health center affiliation, admit-to-applicant ratio, U.S. News & World Report rankings, and minority enrollment. Conclusion. Indicators of student achievement, college/school reputation, affiliations, and class demographics were significant predictors of institution-wide residency match rates. Further research is needed to understand how changes in these factors may influence overall match rates. PMID:29367773

  11. NASA Tech Briefs, July 2005

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Topics include: Thin-Film Resistance Heat-Flux Sensors; Circuit Indicates that Voice-Recording Disks are Nearly Full; Optical Sensing of Combustion Instabilities in Gas Turbines; Crane-Load Contact Sensor; Hexagonal and Pentagonal Fractal Multiband Antennas; Multifunctional Logic Gate Controlled by Temperature; Multifunctional Logic Gate Controlled by Supply Voltage; Power Divider for Waveforms Rich in Harmonics; SCB Quantum Computers Using iSWAP and 1-Qubit Rotations; CSAM Metrology Software Tool; Update on Rover Sequencing and Visualization Program; Selecting Data from a Star Catalog; Rotating Desk for Collaboration by Two Computer Programmers; Variable-Pressure Washer; Magnetically Attached Multifunction Maintenance Rover; Improvements in Fabrication of Sand/Binder Cores for Casting; Solid Freeform Fabrication of Composite-Material Objects; Efficient Computational Model of Hysteresis; Gauges for Highly Precise Metrology of a Compound Mirror; Improved Electrolytic Hydrogen Peroxide Generator; High-Power Fiber Lasers Using Photonic Band Gap Materials; Ontology-Driven Information Integration; Quantifying Traversability of Terrain for a Mobile Robot; More About Arc-Welding Process for Making Carbon Nanotubes; Controlling Laser Spot Size in Outer Space; and Software-Reconfigurable Processors for Spacecraft.

  12. Contrasting beliefs about screening for mental disorders among UK military personnel returning from deployment to Afghanistan.

    PubMed

    Keeling, M; Knight, T; Sharp, D; Fertout, M; Greenberg, N; Chesnokov, M; Rona, R J

    2012-12-01

    The objective of the study was to elicit beliefs and experiences of the value of a screening programme for mental illness among UK military personnel. Three months after returning from Afghanistan 21 army personnel participated in a qualitative study about mental health screening. One-to-one interviews were conducted and recorded. Data-driven thematic analysis was used. Researchers identified master themes represented by extracts of text from the 21 complete transcripts. Participants made positive remarks on the advantages of screening. Noted barriers to seeking help included: unwillingness to receive advice, a wish to deal with any problems themselves and a belief that military personnel should be strong enough to cope with any difficulties. Participants believed that overcoming barriers to participating in screening and seeking help would be best achieved by making screening compulsory. Although respondents were positive about a screening programme for mental illness, the barriers to seeking help for mental illness appear deep rooted and reinforced by the value ascribed to hardiness.

  13. Pharmacy Residency School-wide Match Rates and Modifiable Predictors in ACPE-accredited Colleges and Schools of Pharmacy.

    PubMed

    Whittaker, Alana; Smith, Katherine P; Shan, Guogen

    2017-12-01

    Objective. To analyze the modifiable predictors of institution-wide residency match rates. Methods. This was a retrospective analysis of colleges and schools of pharmacy data and school-wide PGY-1 pharmacy residency match rates for 2013 through 2015. Independent variables included NAPLEX passing rates, history of ACPE probation, NIH funding, academic health center affiliation, dual-degree availability, program length, admit-to-applicant ratio, class size, tuition, student-driven research, clinically focused academic tracks, residency affiliation, U.S. News & World Report rankings, and minority enrollment. Results. In a repeated measures model, predictors of match results were NAPLEX pass rate, class size, academic health center affiliation, admit-to-applicant ratio, U.S. News & World Report rankings, and minority enrollment. Conclusion. Indicators of student achievement, college/school reputation, affiliations, and class demographics were significant predictors of institution-wide residency match rates. Further research is needed to understand how changes in these factors may influence overall match rates.

  14. The CREST Simulation Development Process: Training the Next Generation.

    PubMed

    Sweet, Robert M

    2017-04-01

    The challenges of training and assessing endourologic skill have driven the development of new training systems. The Center for Research in Education and Simulation Technologies (CREST) has developed a team and a methodology to facilitate this development process. Backwards design principles were applied. A panel of experts first defined desired clinical and educational outcomes. Outcomes were subsequently linked to learning objectives. Gross task deconstruction was performed, and the primary domain was classified as primarily involving decision-making, psychomotor skill, or communication. A more detailed cognitive task analysis was performed to elicit and prioritize relevant anatomy/tissues, metrics, and errors. Reference anatomy was created using a digital anatomist and clinician working from a clinical data set; three-dimensional printing can facilitate this process. When possible, synthetic or virtual tissue behavior and textures were recreated using data derived from human tissue. Embedded sensors/markers and/or computer-based systems were used to facilitate the collection of objective metrics. Verification and validation occurred throughout the engineering development process. Nine endourology-relevant training systems were created by CREST with this approach. Systems include basic laparoscopic skills (BLUS), vesicourethral anastomosis, pyeloplasty, cystoscopic procedures, stent placement, rigid and flexible ureteroscopy, GreenLight PVP (GL Sim), Percutaneous access with C-arm (CAT), Nephrolithotomy (NLM), and a vascular injury model. Mixed modalities have been used, including "smart" physical models, virtual reality, augmented reality, and video. Substantial validity evidence for training and assessment has been collected on the systems. An open source manikin-based modular platform is under development by CREST with the Department of Defense that will unify these and other commercial task trainers through a common physiology engine, learning management system, standard data connectors, and standards. Using the CREST process has ensured, and will continue to ensure, that the systems created meet the needs of training and assessing endourologic skills.

  15. Joint Optimization of Fluence Field Modulation and Regularization in Task-Driven Computed Tomography

    PubMed Central

    Gang, G. J.; Siewerdsen, J. H.; Stayman, J. W.

    2017-01-01

    Purpose This work presents a task-driven joint optimization of fluence field modulation (FFM) and regularization in quadratic penalized-likelihood (PL) reconstruction. Conventional FFM strategies proposed for filtered-backprojection (FBP) are evaluated in the context of PL reconstruction for comparison. Methods We present a task-driven framework that leverages prior knowledge of the patient anatomy and imaging task to identify FFM and regularization. We adopted a maxi-min objective that ensures a minimum level of detectability index (d′) across sample locations in the image volume. The FFM designs were parameterized by 2D Gaussian basis functions to reduce dimensionality of the optimization and basis function coefficients were estimated using the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. The FFM was jointly optimized with both space-invariant and spatially-varying regularization strength (β) - the former via an exhaustive search through discrete values and the latter using an alternating optimization where β was exhaustively optimized locally and interpolated to form a spatially-varying map. Results The optimal FFM inverts as β increases, demonstrating the importance of a joint optimization. For the task and object investigated, the optimal FFM assigns more fluence through less attenuating views, counter to conventional FFM schemes proposed for FBP. The maxi-min objective homogenizes detectability throughout the image and achieves a higher minimum detectability than conventional FFM strategies. Conclusions The task-driven FFM designs found in this work are counter to conventional patterns for FBP and yield better performance in terms of the maxi-min objective, suggesting opportunities for improved image quality and/or dose reduction when model-based reconstructions are applied in conjunction with FFM. PMID:28626290
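    The maxi-min idea in the record above can be sketched on a toy problem. The linear "detectability" surrogate, the Gaussian-basis parameterization over a normalized view axis, and the random search standing in for CMA-ES are all illustrative assumptions, not the paper's model:

    ```python
    import numpy as np

    # Toy maxi-min fluence design: choose Gaussian-basis coefficients for a
    # fluence profile over views so the worst-case "detectability" across
    # locations is maximized under a fixed total-fluence budget.
    rng = np.random.default_rng(2)
    views = np.linspace(0.0, 1.0, 60)                  # normalized view axis
    centers = np.linspace(0.0, 1.0, 6)                 # Gaussian basis centers
    basis = np.exp(-0.5 * ((views[:, None] - centers) / 0.12) ** 2)
    sens = rng.uniform(0.1, 1.0, size=(60, 8))         # view-to-location surrogate

    def min_detectability(coef):
        fluence = basis @ np.abs(coef)                 # nonnegative fluence profile
        fluence *= 60.0 / fluence.sum()                # fixed total fluence budget
        return (fluence @ sens).min()                  # worst-case location

    ref = min_detectability(np.ones(6))                # flat-coefficient reference
    best_coef, best = np.ones(6), ref
    for _ in range(2000):                              # crude random search (CMA-ES stand-in)
        cand = np.abs(best_coef + 0.3 * rng.standard_normal(6))
        val = min_detectability(cand)
        if val > best:
            best, best_coef = val, cand

    print(best >= ref)  # maxi-min design is never worse than the flat profile
    ```

    The maxi-min objective is what homogenizes detectability: fluence is steered toward whichever views lift the currently worst-off location.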

  16. Joint optimization of fluence field modulation and regularization in task-driven computed tomography

    NASA Astrophysics Data System (ADS)

    Gang, G. J.; Siewerdsen, J. H.; Stayman, J. W.

    2017-03-01

    Purpose: This work presents a task-driven joint optimization of fluence field modulation (FFM) and regularization in quadratic penalized-likelihood (PL) reconstruction. Conventional FFM strategies proposed for filtered-backprojection (FBP) are evaluated in the context of PL reconstruction for comparison. Methods: We present a task-driven framework that leverages prior knowledge of the patient anatomy and imaging task to identify FFM and regularization. We adopted a maxi-min objective that ensures a minimum level of detectability index (d') across sample locations in the image volume. The FFM designs were parameterized by 2D Gaussian basis functions to reduce dimensionality of the optimization and basis function coefficients were estimated using the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. The FFM was jointly optimized with both space-invariant and spatially-varying regularization strength (β) - the former via an exhaustive search through discrete values and the latter using an alternating optimization where β was exhaustively optimized locally and interpolated to form a spatially-varying map. Results: The optimal FFM inverts as β increases, demonstrating the importance of a joint optimization. For the task and object investigated, the optimal FFM assigns more fluence through less attenuating views, counter to conventional FFM schemes proposed for FBP. The maxi-min objective homogenizes detectability throughout the image and achieves a higher minimum detectability than conventional FFM strategies. Conclusions: The task-driven FFM designs found in this work are counter to conventional patterns for FBP and yield better performance in terms of the maxi-min objective, suggesting opportunities for improved image quality and/or dose reduction when model-based reconstructions are applied in conjunction with FFM.

  17. Re-evaluation of heat flow data near Parkfield, CA: Evidence for a weak San Andreas Fault

    USGS Publications Warehouse

    Fulton, P.M.; Saffer, D.M.; Harris, Reid N.; Bekins, B.A.

    2004-01-01

    Improved interpretations of the strength of the San Andreas Fault near Parkfield, CA based on thermal data require quantification of processes causing significant scatter and uncertainty in existing heat flow data. These effects include topographic refraction, heat advection by topographically-driven groundwater flow, and uncertainty in thermal conductivity. Here, we re-evaluate the heat flow data in this area by correcting for full 3-D terrain effects. We then investigate the potential role of groundwater flow in redistributing fault-generated heat, using numerical models of coupled heat and fluid flow for a wide range of hydrologic scenarios. We find that a large degree of the scatter in the data can be accounted for by 3-D terrain effects, and that for plausible groundwater flow scenarios frictional heat generated along a strong fault is unlikely to be redistributed by topographically-driven groundwater flow in a manner consistent with the 3-D corrected data. Copyright 2004 by the American Geophysical Union.

  18. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.

    PubMed

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data and event driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in that the data driven learning identifier is combined with the event driven design formulation, in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event driven optimal control law and the time driven worst case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue of integrating data-based control and event-triggering mechanism into establishing advanced adaptive critic systems.
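    The event-driven ingredient can be illustrated with a minimal sketch (a scalar linear system and a hand-picked gain, not the paper's adaptive critic design): the feedback is recomputed only when the state drifts sufficiently far from its last sampled value.

    ```python
    # Event-triggered feedback sketch: u = -K * x_sampled is held constant
    # between events; an event fires when the state drifts from the last
    # sample. System, gain, and threshold are illustrative assumptions.
    a, b, K = 0.5, 1.0, 1.0                  # unstable plant x' = a*x + b*u
    dt, threshold = 0.01, 0.05

    x, x_sampled, events = 1.0, 1.0, 0
    for _ in range(1000):
        if abs(x - x_sampled) > threshold:   # event-triggering condition
            x_sampled = x                    # resample state, update control
            events += 1
        u = -K * x_sampled                   # control held between events
        x += dt * (a * x + b * u)            # forward-Euler step

    print(abs(x) < 0.2, events < 100)        # stabilized with sparse updates
    ```

    Compared with updating the control at all 1000 steps, only a few dozen events are needed, which is the resource saving that motivates event-triggered designs.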

  19. An Analysis of Category Management of Service Contracts

    DTIC Science & Technology

    2017-12-01

    management teams a way to make informed, data-driven decisions. Data-driven decisions derived from clustering not only align with Category...savings. Furthermore, this methodology provides a data-driven visualization to inform sound business decisions on potential Category Management...Category Management initiatives. The Maptitude software will allow future research to collect data and develop visualizations to inform Category

  20. Auditory salience using natural soundscapes.

    PubMed

    Huang, Nicholas; Elhilali, Mounya

    2017-03-01

    Salience describes the phenomenon by which an object stands out from a scene. While its underlying processes are extensively studied in vision, mechanisms of auditory salience remain largely unknown. Previous studies have used well-controlled auditory scenes to shed light on some of the acoustic attributes that drive the salience of sound events. Unfortunately, the use of constrained stimuli in addition to a lack of well-established benchmarks of salience judgments hampers the development of comprehensive theories of sensory-driven auditory attention. The present study explores auditory salience in a set of dynamic natural scenes. A behavioral measure of salience is collected by having human volunteers listen to two concurrent scenes and indicate continuously which one attracts their attention. By using natural scenes, the study takes a data-driven rather than experimenter-driven approach to exploring the parameters of auditory salience. The findings indicate that the space of auditory salience is multidimensional (spanning loudness, pitch, spectral shape, as well as other acoustic attributes), nonlinear and highly context-dependent. Importantly, the results indicate that contextual information about the entire scene over both short and long scales needs to be considered in order to properly account for perceptual judgments of salience.

  1. MEA/A-1 experiment 81F01 conducted on STS-7 flight, June 1983. Containerless processing of glass forming melts

    NASA Technical Reports Server (NTRS)

    Day, D. E.; Ray, C. S.

    1983-01-01

    The space processing of containerless, glass-forming melts on board the space shuttle flight STS-7 is investigated. Objectives include: (1) obtaining quantitative evidence for the suppression of heterogeneous nucleation/crystallization; (2) studying melt homogenization without gravity-driven convection; (3) developing procedures for bubble-free, high-purity homogeneous melts in micro-g; (4) comparative analysis of melts on Earth and in micro-g; and (5) assessing the apparatus for processing multicomponent, glass-forming melts in a low-gravity environment.

  2. An Item-Driven Adaptive Design for Calibrating Pretest Items. Research Report. ETS RR-14-38

    ERIC Educational Resources Information Center

    Ali, Usama S.; Chang, Hua-Hua

    2014-01-01

    Adaptive testing is advantageous in that it provides more efficient ability estimates with fewer items than linear testing does. Item-driven adaptive pretesting may also offer similar advantages, and verification of such a hypothesis about item calibration was the main objective of this study. A suitability index (SI) was introduced to adaptively…

  3. Topographically driven groundwater flow and the San Andreas heat flow paradox revisited

    USGS Publications Warehouse

    Saffer, D.M.; Bekins, B.A.; Hickman, S.

    2003-01-01

    Evidence for a weak San Andreas Fault includes (1) borehole heat flow measurements that show no evidence for a frictionally generated heat flow anomaly and (2) the inferred orientation of σ1 nearly perpendicular to the fault trace. Interpretations of the stress orientation data remain controversial, at least in close proximity to the fault, leading some researchers to hypothesize that the San Andreas Fault is, in fact, strong and that its thermal signature may be removed or redistributed by topographically driven groundwater flow in areas of rugged topography, such as typify the San Andreas Fault system. To evaluate this scenario, we use a steady state, two-dimensional model of coupled heat and fluid flow within cross sections oriented perpendicular to the fault and to the primary regional topography. Our results show that existing heat flow data near Parkfield, California, do not readily discriminate between the expected thermal signature of a strong fault and that of a weak fault. In contrast, for a wide range of groundwater flow scenarios in the Mojave Desert, models that include frictional heat generation along a strong fault are inconsistent with existing heat flow data, suggesting that the San Andreas Fault at this location is indeed weak. In both areas, comparison of modeling results and heat flow data suggest that advective redistribution of heat is minimal. The robust results for the Mojave region demonstrate that topographically driven groundwater flow, at least in two dimensions, is inadequate to obscure the frictionally generated heat flow anomaly from a strong fault. However, our results do not preclude the possibility of transient advective heat transport associated with earthquakes.
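    The arithmetic behind this heat flow paradox is worth a quick sketch (round illustrative values: a ~35 mm/yr slip rate and 100 MPa versus 10 MPa average shear stress, not the paper's coupled model): frictional dissipation per unit fault area is τ·v, so a strong fault should add heat comparable to the regional background.

    ```python
    # Order-of-magnitude check on frictional heating along a strike-slip
    # fault. Slip rate and shear stresses are illustrative round numbers.
    SECONDS_PER_YEAR = 3.15e7
    v = 0.035 / SECONDS_PER_YEAR       # ~35 mm/yr slip rate, in m/s
    tau_strong = 100e6                 # "strong" fault: ~100 MPa shear stress
    tau_weak = 10e6                    # "weak" fault: ~10 MPa

    q_strong = tau_strong * v * 1e3    # heat production, mW per m^2 of fault
    q_weak = tau_weak * v * 1e3
    background = 80.0                  # typical regional heat flow, mW/m^2

    # A strong fault would add heat comparable to the background; a weak
    # fault's contribution is marginal and easily lost in scatter.
    print(round(q_strong), round(q_weak))  # ~111 vs ~11 mW/m^2
    ```

    With these round numbers the strong-fault case yields on the order of 100 mW/m² against an ~80 mW/m² background, which is why the absence of a measurable anomaly argues for a weak fault unless groundwater flow has redistributed the heat.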

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jewitt, David, E-mail: jewitt@ucla.edu; Department of Physics and Astronomy, UCLA, Los Angeles, CA 90095-1567

    Some asteroids eject dust, unexpectedly producing transient, comet-like comae and tails. First ascribed to the sublimation of near-surface water ice, mass-losing asteroids (also called 'main-belt comets') can in fact be driven by a surprising diversity of mechanisms. In this paper, we consider 11 dynamical asteroids losing mass, in nine of which the ejected material is spatially resolved. We address mechanisms for producing mass loss including rotational instability, impact ejection, electrostatic repulsion, radiation pressure sweeping, dehydration stresses, and thermal fracture, in addition to the sublimation of ice. In two objects (133P and 238P) the repetitive nature of the observed activity leaves ice sublimation as the only reasonable explanation, while in a third ((596) Scheila), a recent impact is the cause. Another impact may account for activity in P/2010 A2, but this tiny object can also be explained as having shed mass after reaching rotational instability. Mass loss from (3200) Phaethon is probably due to cracking or dehydration at extreme (~1000 K) perihelion temperatures, perhaps aided by radiation pressure sweeping. For the other bodies, the mass-loss mechanisms remain unidentified, pending the acquisition of more and better data. While the active asteroid sample size remains small, the evidence for an astonishing diversity of mass-loss processes in these bodies is clear.

  5. Historical analysis of riparian vegetation change in response to shifting management objectives on the Middle Rio Grande

    USGS Publications Warehouse

    Petrakis, Roy; van Leeuwen, Willem J.D.; Villarreal, Miguel; Tashjian, Paul; Dello Russo, Regina; Scott, Christopher A.

    2017-01-01

    Riparian ecosystems are valuable to the ecological and human communities that depend on them. Over the past century, they have been subject to shifting management practices to maximize human use and ecosystem services, creating a complex relationship between water policy, management, and the natural ecosystem. This has necessitated research on the spatial and temporal dynamics of riparian vegetation change. The San Acacia Reach of the Middle Rio Grande has experienced multiple management and river flow fluctuations, resulting in threats to its riparian and aquatic ecosystems. This research uses remote sensing data, GIS, a review of management decisions, and an assessment of climate to both quantify how riparian vegetation has been altered over time and provide interpretations of the relationships between riparian change and shifting climate and management objectives. This research focused on four management phases from 1935 to 2014, each highlighting different management practices and climate-driven river patterns, providing unique opportunities to observe a direct relationship between river management, climate, and riparian response. Overall, we believe that management practices, coupled with reduced surface river flows and limited overbank flooding, influenced the compositional and spatial patterns of vegetation, including possibly increasing non-native vegetation coverage. However, recent restoration efforts have begun to reduce non-native vegetation coverage.

  6. An open source framework for tracking and state estimation ('Stone Soup')

    NASA Astrophysics Data System (ADS)

    Thomas, Paul A.; Barr, Jordi; Balaji, Bhashyam; White, Kruger

    2017-05-01

    The ability to detect and unambiguously follow all moving entities in a state-space is important in multiple domains both in defence (e.g. air surveillance, maritime situational awareness, ground moving target indication) and the civil sphere (e.g. astronomy, biology, epidemiology, dispersion modelling). However, tracking and state estimation researchers and practitioners have difficulties recreating state-of-the-art algorithms in order to benchmark their own work. Furthermore, system developers need to assess which algorithms meet operational requirements objectively and exhaustively rather than intuitively or driven by personal favourites. We have therefore commenced the development of a collaborative initiative to create an open source framework for production, demonstration and evaluation of Tracking and State Estimation algorithms. The initiative will develop a (MIT-licensed) software platform for researchers and practitioners to test, verify and benchmark a variety of multi-sensor and multi-object state estimation algorithms. The initiative is supported by four defence laboratories, who will contribute to the development effort for the framework. The tracking and state estimation community will derive significant benefits from this work, including: access to repositories of verified and validated tracking and state estimation algorithms, a framework for the evaluation of multiple algorithms, standardisation of interfaces and access to challenging data sets.

  7. Strategic directions of computing at Fermilab

    NASA Astrophysics Data System (ADS)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  8. A high-speed, large-capacity, 'jukebox' optical disk system

    NASA Technical Reports Server (NTRS)

    Ammon, G. J.; Calabria, J. A.; Thomas, D. T.

    1985-01-01

    Two optical disk 'jukebox' mass storage systems which provide access to any data in a store of 10 to the 13th bits (1250G bytes) within six seconds have been developed. The optical disk jukebox system is divided into two units, including a hardware/software controller and a disk drive. The controller provides flexibility and adaptability, through a ROM-based microcode-driven data processor and a ROM-based software-driven control processor. The cartridge storage module contains 125 optical disks housed in protective cartridges. Attention is given to a conceptual view of the disk drive unit, the NASA optical disk system, the NASA database management system configuration, the NASA optical disk system interface, and an open systems interconnect reference model.

  9. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

    The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation would undermine its calculation efficiency. An alternative method, the multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations so that specific mesh structures can be formulated by a surrogate element (or surrogate elements) to replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into an MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation result shows that using the Legendre polynomial regression model in building surrogate elements can largely cut down the simulation time without sacrificing accuracy.
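    As a toy illustration of the surrogate idea, the sketch below fits a Legendre polynomial series to synthetic training data standing in for FE simulation outputs; the deflection-force law, polynomial degree, and noise-free data are all assumptions of this sketch, not details from the paper:

```python
import numpy as np
from numpy.polynomial import legendre as leg

# Hypothetical training data standing in for FE simulation outputs:
# a nonlinear deflection-force law on a normalised input range [-1, 1].
def fe_response(x):                      # stand-in for an expensive FE run
    return np.tanh(3.0 * x) + 0.2 * x**3

x_train = np.linspace(-1.0, 1.0, 50)
y_train = fe_response(x_train)

# Fit a degree-9 Legendre series as the surrogate element.
coeffs = leg.legfit(x_train, y_train, deg=9)

def surrogate(x):
    """Cheap replacement for the FE element inside an MB co-simulation step."""
    return leg.legval(x, coeffs)

# Accuracy check on unseen inputs.
x_test = np.linspace(-0.95, 0.95, 17)
max_err = float(np.max(np.abs(surrogate(x_test) - fe_response(x_test))))
```

    Once fitted, evaluating the surrogate is a few polynomial operations per step, which is where the reported speed-up over repeated FE solves would come from.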

  10. Learning and exploration in action-perception loops.

    PubMed

    Little, Daniel Y; Sommer, Friedrich T

    2013-01-01

    Discovering the structure underlying observed data is a recurring problem in machine learning with important applications in neuroscience. It is also a primary function of the brain. When data can be actively collected in the context of a closed action-perception loop, behavior becomes a critical determinant of learning efficiency. Psychologists studying exploration and curiosity in humans and animals have long argued that learning itself is a primary motivator of behavior. However, the theoretical basis of learning-driven behavior is not well understood. Previous computational studies of behavior have largely focused on the control problem of maximizing acquisition of rewards and have treated learning the structure of data as a secondary objective. Here, we study exploration in the absence of external reward feedback. Instead, we take the quality of an agent's learned internal model to be the primary objective. In a simple probabilistic framework, we derive a Bayesian estimate for the amount of information about the environment an agent can expect to receive by taking an action, a measure we term the predicted information gain (PIG). We develop exploration strategies that approximately maximize PIG. One strategy based on value-iteration consistently learns faster than previously developed reward-free exploration strategies across a diverse range of environments. Psychologists believe the evolutionary advantage of learning-driven exploration lies in the generalized utility of an accurate internal model. Consistent with this hypothesis, we demonstrate that agents which learn more efficiently during exploration are later better able to accomplish a range of goal-directed tasks. We will conclude by discussing how our work elucidates the explorative behaviors of animals and humans, its relationship to other computational models of behavior, and its potential application to experimental design, such as in closed-loop neurophysiology studies.
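    The core quantity can be sketched in a few lines. The version below uses a simplified point-estimate form of PIG for a single action whose outcomes are tracked by Dirichlet pseudo-counts; this is an illustrative reduction of the paper's fully Bayesian derivation, not its exact formula:

```python
import numpy as np

def kl(p, q):
    """KL divergence between two discrete distributions (all entries > 0)."""
    return float(np.sum(p * np.log(p / q)))

def predicted_information_gain(alpha):
    """Simplified PIG: the KL divergence between the updated and current
    outcome estimates, averaged over the outcomes the agent predicts."""
    alpha = np.asarray(alpha, dtype=float)
    theta = alpha / alpha.sum()              # current model estimate
    pig = 0.0
    for j in range(len(alpha)):
        updated = alpha.copy()
        updated[j] += 1.0                    # hypothetical observation j
        theta_new = updated / updated.sum()
        pig += theta[j] * kl(theta_new, theta)
    return pig

# An uncertain action (few counts) promises more information than a
# well-characterised one (many counts), so a PIG-maximiser explores it.
novel = predicted_information_gain([1.0, 1.0, 1.0])
familiar = predicted_information_gain([100.0, 100.0, 100.0])
```

    A PIG-driven explorer then simply prefers the action with the larger expected gain, which pushes it toward poorly characterised parts of the environment.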

  11. Data Driven Math Intervention: What the Numbers Say

    ERIC Educational Resources Information Center

    Martin, Anthony W.

    2013-01-01

    This study was designed to determine whether or not data driven math skills groups would be effective in increasing student academic achievement. From this topic three key questions arose: "Would the implementation of data driven math skills groups improve student academic achievement more than standard instruction as measured by the…

  12. Cognitive Processing Therapy for Spanish-speaking Latinos: A Formative Study of a Model-Driven Cultural Adaptation of the Manual to Enhance Implementation in a Usual Care Setting.

    PubMed

    Valentine, Sarah E; Borba, Christina P C; Dixon, Louise; Vaewsorn, Adin S; Guajardo, Julia Gallegos; Resick, Patricia A; Wiltsey Stirman, Shannon; Marques, Luana

    2017-03-01

    As part of a larger implementation trial for cognitive processing therapy (CPT) for posttraumatic stress disorder (PTSD) in a community health center, we used formative evaluation to assess relations between iterative cultural adaption (for Spanish-speaking clients) and implementation outcomes (appropriateness and acceptability) for CPT. Qualitative data for the current study were gathered through multiple sources (providers: N = 6; clients: N = 22), including CPT therapy sessions, provider fieldnotes, weekly consultation team meetings, and researcher fieldnotes. Findings from conventional and directed content analysis of the data informed refinements to the CPT manual. Data-driven refinements included adaptations related to cultural context (i.e., language, regional variation in wording), urban context (e.g., crime/violence), and literacy level. Qualitative findings suggest improved appropriateness and acceptability of CPT for Spanish-speaking clients. Our study reinforces the need for dual application of cultural adaptation and implementation science to address the PTSD treatment needs of Spanish-speaking clients. © 2016 Wiley Periodicals, Inc.

  13. Computer Aided Teaching of Digital Signal Processing.

    ERIC Educational Resources Information Center

    Castro, Ian P.

    1990-01-01

    Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…

  14. MOPED 2.5—An Integrated Multi-Omics Resource: Multi-Omics Profiling Expression Database Now Includes Transcriptomics Data

    PubMed Central

    Montague, Elizabeth; Stanberry, Larissa; Higdon, Roger; Janko, Imre; Lee, Elaine; Anderson, Nathaniel; Choiniere, John; Stewart, Elizabeth; Yandl, Gregory; Broomall, William; Kolker, Natali

    2014-01-01

    Multi-omics data-driven scientific discovery crucially rests on high-throughput technologies and data sharing. Currently, data are scattered across single omics repositories, stored in varying raw and processed formats, and are often accompanied by limited or no metadata. The Multi-Omics Profiling Expression Database (MOPED, http://moped.proteinspire.org) version 2.5 is a freely accessible multi-omics expression database. Continual improvement and expansion of MOPED is driven by feedback from the Life Sciences Community. In order to meet the emergent need for an integrated multi-omics data resource, MOPED 2.5 now includes gene relative expression data in addition to protein absolute and relative expression data from over 250 large-scale experiments. To facilitate accurate integration of experiments and increase reproducibility, MOPED provides extensive metadata through the Data-Enabled Life Sciences Alliance (DELSA Global, http://delsaglobal.org) metadata checklist. MOPED 2.5 has greatly increased the number of proteomics absolute and relative expression records to over 500,000, in addition to adding more than four million transcriptomics relative expression records. MOPED has an intuitive user interface with tabs for querying different types of omics expression data and new tools for data visualization. Summary information including expression data, pathway mappings, and direct connection between proteins and genes can be viewed on Protein and Gene Details pages. These connections in MOPED provide a context for multi-omics expression data exploration. Researchers are encouraged to submit omics data which will be consistently processed into expression summaries. MOPED as a multi-omics data resource is a pivotal public database, interdisciplinary knowledge resource, and platform for multi-omics understanding. PMID:24910945

  15. General Purpose Data-Driven Monitoring for Space Operations

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Martin, Rodney A.; Schwabacher, Mark A.; Spirkovska, Liljana; Taylor, William McCaa; Castle, Joseph P.; Mackey, Ryan M.

    2009-01-01

    As modern space propulsion and exploration systems improve in capability and efficiency, their designs are becoming increasingly sophisticated and complex. Determining the health state of these systems, using traditional parameter limit checking, model-based, or rule-based methods, is becoming more difficult as the number of sensors and component interactions grow. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults or failures. The Inductive Monitoring System (IMS) is a data-driven system health monitoring software tool that has been successfully applied to several aerospace applications. IMS uses a data mining technique called clustering to analyze archived system data and characterize normal interactions between parameters. The scope of IMS based data-driven monitoring applications continues to expand with current development activities. Successful IMS deployment in the International Space Station (ISS) flight control room to monitor ISS attitude control systems has led to applications in other ISS flight control disciplines, such as thermal control. It has also generated interest in data-driven monitoring capability for Constellation, NASA's program to replace the Space Shuttle with new launch vehicles and spacecraft capable of returning astronauts to the moon, and then on to Mars. Several projects are currently underway to evaluate and mature the IMS technology and complementary tools for use in the Constellation program. These include an experiment on board the Air Force TacSat-3 satellite, and ground systems monitoring for NASA's Ares I-X and Ares I launch vehicles. 
    The TacSat-3 Vehicle System Management (TVSM) project is a software experiment to integrate fault and anomaly detection algorithms and diagnosis tools with executive and adaptive planning functions contained in the flight software on-board the Air Force Research Laboratory TacSat-3 satellite. The TVSM software package will be uploaded after launch to monitor spacecraft subsystems such as power and guidance, navigation, and control (GN&C). It will analyze data in real-time to demonstrate detection of faults and unusual conditions, diagnose problems, and react to threats to spacecraft health and mission goals. The experiment will demonstrate the feasibility and effectiveness of integrated system health management (ISHM) technologies with both ground and on-board experiments.
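    The clustering-based monitoring idea behind IMS can be sketched as follows: cluster archived nominal data, then score live vectors by their distance to the nearest cluster. The telemetry, cluster count, and threshold rule below are illustrative assumptions; IMS itself uses its own clustering and distance definitions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic archived "nominal" telemetry: two operating modes in 2-D,
# standing in for real attitude-control or thermal parameters.
mode_a = rng.normal([0.0, 0.0], 0.1, size=(200, 2))
mode_b = rng.normal([1.0, 1.0], 0.1, size=(200, 2))
nominal = np.vstack([mode_a, mode_b])

def kmeans(X, seeds, iters=20):
    """Tiny k-means: cluster archived data to characterise normal behaviour."""
    centroids = X[seeds].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(-1), axis=1)
        centroids = np.array([X[labels == j].mean(0)
                              for j in range(len(seeds))])
    return centroids

# Deterministic seeding (one point from each mode) keeps the sketch simple.
centroids = kmeans(nominal, seeds=[0, 200])

def anomaly_score(x):
    """Distance from a live telemetry vector to the nearest nominal cluster."""
    return float(np.min(np.linalg.norm(centroids - x, axis=1)))

# Alarm threshold: worst score observed on the nominal training data.
threshold = max(anomaly_score(x) for x in nominal)

normal_point = np.array([0.95, 1.05])    # near mode B: no alarm
fault_point = np.array([0.0, 1.0])       # between modes: flagged
```

    The appeal of this scheme for operations is that the nominal characterisation is learned from archived data rather than hand-written limits or models.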

  16. An intelligent service matching method for mechanical equipment condition monitoring using the fibre Bragg grating sensor network

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Zhou, Zude; Liu, Quan; Xu, Wenjun

    2017-02-01

    Due to the advantages of being able to function under harsh environmental conditions and serving as a distributed condition information source in a networked monitoring system, the fibre Bragg grating (FBG) sensor network has attracted considerable attention for equipment online condition monitoring. To provide an overall conditional view of the mechanical equipment operation, a networked service-oriented condition monitoring framework based on FBG sensing is proposed, together with an intelligent matching method for supporting monitoring service management. In the novel framework, three classes of progressive service matching approaches, including service-chain knowledge database service matching, multi-objective constrained service matching and workflow-driven human-interactive service matching, are developed and integrated with an enhanced particle swarm optimisation (PSO) algorithm as well as a workflow-driven mechanism. Moreover, the manufacturing domain ontology, FBG sensor network structure and monitoring object are considered to facilitate the automatic matching of condition monitoring services to overcome the limitations of traditional service processing methods. The experimental results demonstrate that FBG monitoring services can be selected intelligently, and the developed condition monitoring system can be re-built rapidly as new equipment joins the framework. The effectiveness of the service matching method is also verified by implementing a prototype system together with its performance analysis.
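    The swarm-optimisation step at the heart of the service matching stage can be illustrated with a minimal single-objective PSO on a toy cost function; the cost surface, bounds, and parameter values below are assumptions of this sketch, not the paper's enhanced multi-objective PSO:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "service matching" cost over a 2-D configuration in [0, 1]^2:
# a hypothetical trade-off whose optimum sits at (0.3, 0.7).
def cost(x):
    return (x[..., 0] - 0.3) ** 2 + 2.0 * (x[..., 1] - 0.7) ** 2

n_particles, dim, iters = 30, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration weights

pos = rng.uniform(0.0, 1.0, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()                            # personal bests
pbest_cost = cost(pos)
gbest = pbest[np.argmin(pbest_cost)].copy()   # global best

for _ in range(iters):
    r1, r2 = rng.uniform(size=(2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)        # respect the box constraints
    c = cost(pos)
    improved = c < pbest_cost
    pbest[improved] = pos[improved]
    pbest_cost[improved] = c[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()
```

    A multi-objective, constrained variant of this loop (plus domain knowledge from the service-chain database) is what the framework uses to pick monitoring services automatically.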

  17. Polarization characteristics of an altazimuth sky scanner

    NASA Technical Reports Server (NTRS)

    Garrison, L. M.; Blaszczak, Z.; Green, A. E. S.

    1980-01-01

    A theoretical description of the polarization characteristics of an altazimuth sky scanner optical system based on Mueller-Stokes calculus is presented. This computer-driven optical system was designed to perform laboratory studies of skylight and of celestial objects during day or night, and has no space limitations; however, the two parallel 45 deg tilt mirrors introduce some intrinsic polarization. Therefore, proper data interpretation requires a theoretical understanding of the polarization features of the instrument and accurate experimental determination of the Mueller-Stokes matrix elements describing the polarizing and depolarizing action of the system.
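    Mueller-Stokes calculus of this kind can be sketched numerically: each mirror is a 4x4 Mueller matrix, the two-mirror system is their product, and the intrinsic polarization appears as a nonzero degree of polarization for unpolarized input. The mirror model, reflectances, and retardance below are illustrative assumptions, not the instrument's measured matrix:

```python
import numpy as np

def mirror_mueller(rs, rp, delta):
    """Mueller matrix of a tilted mirror in its own s-p frame, modelled as a
    diattenuator/retarder with intensity reflectances rs, rp and retardance
    delta (an illustrative textbook form)."""
    c = np.sqrt(rs * rp)
    return 0.5 * np.array([
        [rs + rp, rs - rp, 0.0, 0.0],
        [rs - rp, rs + rp, 0.0, 0.0],
        [0.0, 0.0, 2 * c * np.cos(delta), 2 * c * np.sin(delta)],
        [0.0, 0.0, -2 * c * np.sin(delta), 2 * c * np.cos(delta)],
    ])

# Two parallel 45-degree mirrors share an s-p frame, so the system matrix
# is a plain product (illustrative reflectances and retardance).
m = mirror_mueller(0.95, 0.90, 0.98 * np.pi)
system = m @ m

s_in = np.array([1.0, 0.0, 0.0, 0.0])        # unpolarized input Stokes vector
s_out = system @ s_in

# The nonzero Q component is the instrument's intrinsic polarization.
degree_of_polarization = float(np.linalg.norm(s_out[1:]) / s_out[0])
```

    In practice the matrix elements would be determined experimentally, as the abstract notes, and then inverted out of sky measurements.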

  18. Argos: Design and Development of Object-Oriented, Event-Driven Multimedia Data Base Technology in Support of the Paperless Ship

    DTIC Science & Technology

    1988-12-01

    on openStack
      global mode -- mode may be any of the following types:
      -- navigate - traverse through the graphical hierarchy
      -- order - for ... ordering an item via graphics
      put "NAVIGATE" into MODE
      hide message box
      hide menubar
      set userlevel to 5
    end openStack

    on closeStack -- this handler will ...

    on openStack
      hide menuBar
      hide message box
    end openStack

    * BKGND #1, BUTTON #1: Next *

    on mouseUp
      visual effect wipe left
      go to next card of

  19. Learning Data Driven Representations from Large Collections of Multidimensional Patterns with Minimal Supervision

    DTIC Science & Technology

    2008-08-04

    Army Research Office (ARO) grants DAAD 19-02-1-0383 and W911NF-06-1-0076. Stéphane Coulombe, Sharon Core, Amar Mukherjee and David Chester played a...the goodness of the final outcome of the joint alignment is critically dependent on the appropriate choice of penalty in the objective function...C. Dahlke, LB. Davenport, P. Davies, B. de Pablos, A. Delcher, Z. Deng, AD. Mays, I. Dew, SM. Dietz, K. Dodson, LE. Doup, M. Downes, S. Dugan-Rocha

  20. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
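    The serial-dependency problem is easy to demonstrate on synthetic data: fit an ordinary least-squares trend to a series with AR(1) disturbances and inspect the residual autocorrelation. The counts, trend, and AR coefficient below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic annual accident counts: a downward trend plus AR(1)
# disturbances (trend, noise scale, and phi = 0.8 are invented values).
n = 60
t = np.arange(n)
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.8 * eps[i - 1] + rng.normal(0.0, 10.0)
y = 1000.0 - 5.0 * t + eps

# "Traditional regression": an ordinary least-squares trend fit.
slope, intercept = np.polyfit(t, y, 1)
resid = y - (slope * t + intercept)

# Strong lag-1 residual autocorrelation violates the independence
# assumption behind standard OLS standard errors.
r1 = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])
```

    The ARMA-type and state space models the paper recommends absorb exactly this residual dependency instead of leaving it to distort the inference.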
