Science.gov

Sample records for generation tool decider

  1. E-DECIDER: Earthquake Disaster Decision Support and Response Tools - Development and Experiences

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Blom, R. G.; Bawden, G. W.; Fox, G.; Pierce, M.; Rundle, J. B.; Wang, J.; Ma, Y.; Yoder, M. R.; Sachs, M. K.; Parker, J. W.

    2011-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision-making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. The overall goal of the project is to deliver these capabilities as standards-compliant Geographical Information System (GIS) data products through a web portal/web services infrastructure that will allow easy use by decision-makers; this design ensures that the system will be readily supportable and extensible in the future. E-DECIDER is incorporating the earthquake forecasting methodology developed through NASA's QuakeSim project, as well as other QuakeSim geophysical modeling tools. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, will allow us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). We are also working to provide a catalog of HAZUS input files and models for scenario earthquakes based on the QuakeSim forecast models, as well as designing an automated workflow for generating HAZUS models in the event of an earthquake (triggered from the USGS earthquake feed). Initially, E-DECIDER's focus was to deliver rapid and readily accessible InSAR products following earthquake disasters. Following our experiences with recent past events, such as the Baja Mexico earthquake and the Tohoku-oki Japan earthquake, we found that in many instances, radar data is not readily available following the event, whereas optical imagery can be provided fairly quickly as a result of the invocation of the International Charter. This led us to re-evaluate the type of data we would need to process and the products we could deliver

  2. Disaster Response Tools for Decision Support and Data Discovery - E-DECIDER and GeoGateway

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Granat, R. A.; Lyzenga, G. A.; Pierce, M. E.; Wang, J.; Grant Ludwig, L.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2015-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers in order to improve their ability to anticipate requirements and provide appropriate resources for response. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a decision support system producing remote sensing and geophysical modeling products that are relevant to the emergency preparedness and response communities and serves as a gateway to enable the delivery of actionable information to these communities. GeoGateway is a data product search and analysis gateway for scientific discovery, field use, and disaster response focused on NASA UAVSAR and GPS data that integrates with fault data, seismicity and models. Key information on the nature, magnitude and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness are often generated from a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises like the recent May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation. We also provided a number of products, services, and consultation to the NASA agency-wide response to the April 2015 Gorkha, Nepal earthquake. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse and for the Nepal earthquake. Products delivered included map layers as part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration, enabling users to create merged datasets from multiple providers. For the Nepal response effort, products included models

  3. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for the Dallas-Fort Worth area, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and climb to cruise altitude along the most efficient routes.

  4. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second-generation tools, help districts save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which fall under 21st-century skills. The second-generation tools are growing in popularity…

  5. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen

    2015-08-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). This in turn is delivered through standards-compliant web services for desktop and hand-held devices.

  6. Sense, decide, act, communicate (SDAC): next generation of smart sensor systems

    NASA Astrophysics Data System (ADS)

    Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian

    2004-09-01

    The recent war on terrorism and increased urban warfare have been a major catalyst for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has generally been governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality related to sensor systems. This functionality includes a sensor's ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communicate - internal to the system and external to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the usage of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collective) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper will also discuss and evaluate the demonstration system developed to test the concepts related to SDAC systems.

  7. Deciding about hormone therapy

    MedlinePlus

    HRT - deciding; Estrogen replacement therapy - deciding; ERT - deciding; Hormone replacement therapy - deciding; Menopause - deciding; HT - deciding; Menopausal hormone therapy - deciding; MHT - deciding

  8. Quantitative versus Qualitative Evaluation: A Tool to Decide Which to Use

    ERIC Educational Resources Information Center

    Dobrovolny, Jackie L.; Fuentes, Stephanie Christine G.

    2008-01-01

    Evaluation is often avoided in human performance technology (HPT), but it is an essential and frequently catalytic activity that adds significant value to projects. Knowing how to approach an evaluation and whether to use qualitative, quantitative, or both methods makes evaluation much easier. In this article, we provide tools to help determine…

  9. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Stough, T. M.; Burl, M. C.; Pierce, M.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.; Bawden, G. W.

    2012-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision-making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. Geodetic imaging data, including from interferometric synthetic aperture radar (InSAR) and GPS, have a rich scientific heritage for use in earthquake research. Survey-grade GPS was developed in the 1980s and the first InSAR image of an earthquake was produced for the 1992 Landers event. As more of these types of data have become increasingly available they have also shown great utility for providing key information for disaster response. Work has been done to translate these data into useful and actionable information for decision makers in the event of an earthquake disaster. In addition to observed data, modeling tools provide essential preliminary estimates while data are still being collected and/or processed, which can be refined as data products become available. Now, with more data and better models, we are able to apply these for responders who need easy tools and routinely produced data products. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER has taken advantage of the legacy of Earth science data, including MODIS, Landsat, SCIGN, PBO, UAVSAR, and modeling tools such as the ones developed by QuakeSim, in order to deliver successful decision support products for earthquake disaster response. The project has

  10. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which will be discussed later.

  11. Maraviroc Clinical Test (MCT) as an alternative tool to decide CCR5-antagonists prescription in naïve HIV-infected patients.

    PubMed

    Genebat, Miguel; de Pablo-Bernal, Rebeca S; Pulido, Ildefonso; Jiménez-Mejías, Manuel E; Martínez, Onofre; Pacheco, Yolanda M; Raffi-El-Idrissi Benhia, Mohammed; Abad, María Antonia; Ruiz-Mateos, Ezequiel; Leal, Manuel

    2015-09-01

    Our aim was to analyze the virological response to combined antiretroviral therapy started after the Maraviroc Clinical Test (MCT) in naïve HIV-infected patients. Forty-one patients were exposed to MCT, based on an 8-day MVC monotherapy. If undetectability or a viral load reduction >1 log10 HIV-RNA copies/ml was achieved, an MVC-containing cART was prescribed. Forty patients showed a positive MCT; undetectability after 48 weeks on cART was achieved in 34/41 (82.9%) patients. The result of MCT was compared with a genotypic tropism method and with Trofile®, showing 10.7% and 18.75% discordance rates, respectively. MCT is a reliable tool to decide CCR5-antagonist prescription, also in the naïve scenario, where most patients show a virological response to MVC independently of the tropism result reported by genotypic or phenotypic methods.

  12. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime ones.
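The core inference step behind Bayesian decision-generation tools of this kind can be sketched as a simple posterior update over competing hypotheses about a network node. The hypothesis names and probabilities below are hypothetical illustrations, not values from the paper:

```python
# Minimal sketch of Bayesian updating for hypothesis assessment.
# All hypotheses and numbers are illustrative assumptions.

def bayes_update(prior, likelihood):
    """Return posterior P(H|E) for each hypothesis H given evidence E.

    prior: dict hypothesis -> P(H)
    likelihood: dict hypothesis -> P(E|H)
    """
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Two competing hypotheses about a node in an adverse network.
prior = {"hostile": 0.2, "benign": 0.8}
# The observed evidence (e.g. a communication pattern) is three times
# likelier under the hostile hypothesis than under the benign one.
likelihood = {"hostile": 0.6, "benign": 0.2}

posterior = bayes_update(prior, likelihood)
print(posterior)  # hostile probability rises from 0.20 to ~0.43
```

Chaining such updates over successive pieces of evidence is what lets a decision-generation tool rank hypotheses as observations accumulate.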

  13. Automatic tool path generation for finish machining

    SciTech Connect

    Kwok, Kwan S.; Loucks, C.S.; Driessen, B.J.

    1997-03-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch diameter CBN grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm.

  14. Groundwater Monitoring Report Generation Tools - 12005

    SciTech Connect

    Lopez, Natalie

    2012-07-01

    Compliance with National and State environmental regulations (e.g. Resource Conservation and Recovery Act (RCRA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), aka Superfund) requires Savannah River Site (SRS) to extensively collect and report groundwater monitoring data, with potential fines for missed reporting deadlines. Several utilities have been developed at SRS to facilitate production of the regulatory reports which include maps, data tables, charts and statistics. Components of each report are generated in accordance with complex sets of regulatory requirements specific to each site monitored. SRS developed a relational database to incorporate the detailed reporting rules with the groundwater data, and created a set of automation tools to interface with the information and generate the report components. These process improvements enhanced quality and consistency by centralizing the information, and have reduced manpower and production time through automated efficiencies. (author)

  15. GROUNDWATER MONITORING REPORT GENERATION TOOLS - 12005

    SciTech Connect

    Lopez, N.

    2011-11-21

    Compliance with National and State environmental regulations (e.g. Resource Conservation and Recovery Act (RCRA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), aka Superfund) requires Savannah River Site (SRS) to extensively collect and report groundwater monitoring data, with potential fines for missed reporting deadlines. Several utilities have been developed at SRS to facilitate production of the regulatory reports which include maps, data tables, charts and statistics. Components of each report are generated in accordance with complex sets of regulatory requirements specific to each site monitored. SRS developed a relational database to incorporate the detailed reporting rules with the groundwater data, and created a set of automation tools to interface with the information and generate the report components. These process improvements enhanced quality and consistency by centralizing the information, and have reduced manpower and production time through automated efficiencies.

  16. Projectile-generating explosive access tool

    SciTech Connect

    Jakaboski, Juan-Carlos; Hughs, Chance G; Todd, Steven N

    2013-06-11

    A method for generating a projectile using an explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  17. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  18. Dewarless Logging Tool - 1st Generation

    SciTech Connect

    HENFLING,JOSEPH A.; NORMANN,RANDY A.

    2000-08-01

    This report focuses on Sandia National Laboratories' effort to create high-temperature logging tools for geothermal applications without the need for heat shielding. One of the mechanisms for failure in conventional downhole tools is temperature. They can only survive a limited number of hours in high temperature environments. For the first time since the evolution of integrated circuits, components are now commercially available that are qualified to 225 C with many continuing to work up to 300 C. These components are primarily based on Silicon-On-Insulator (SOI) technology. Sandia has developed and tested a simple data logger based on this technology that operates up to 300 C with a few limiting components operating to only 250 C without thermal protection. An actual well log to 240 C without shielding is discussed. The first prototype high-temperature tool measures pressure and temperature using a wire-line for power and communication. The tool is based around the HT83C51 microcontroller. A brief discussion of the background and status of the High Temperature Instrumentation program at Sandia, objectives, data logger development, and future project plans are given.

  19. Test Generators: Teacher's Tool or Teacher's Headache?

    ERIC Educational Resources Information Center

    Eiser, Leslie

    1988-01-01

    Discusses the advantages and disadvantages of test generation programs. Includes setting up, printing exams and "bells and whistles." Reviews eight computer packages for Apple and IBM personal computers. Compares features, costs, and usage. (CW)

  20. Projectile-generating explosive access tool

    DOEpatents

    Jakaboski, Juan-Carlos; Todd, Steven N.

    2011-10-18

    An explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  1. Next-Generation Ion Thruster Design Tool

    NASA Technical Reports Server (NTRS)

    Stolz, Peter

    2015-01-01

    Computational tools that accurately predict the performance of electric propulsion devices are highly desirable and beneficial to NASA and the broader electric propulsion community. The current state of the art in electric propulsion modeling relies heavily on empirical data and numerous computational "knobs." In Phase I of this project, Tech-X Corporation developed the most detailed ion engine discharge chamber model that currently exists. This kinetic model simulates all particles in the discharge chamber along with a physically correct simulation of the electric fields. In addition, kinetic erosion models are included for modeling the ion-impingement effects on thruster component erosion. In Phase II, Tech-X developed a user-friendly computer program for NASA and other governmental and industry customers. Tech-X has implemented a number of advanced numerical routines to bring the computational time down to a commercially acceptable level. NASA now has a highly sophisticated, user-friendly ion engine discharge chamber modeling tool.

  2. Community Resources. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Community Resources, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  3. BGen: A UML Behavior Network Generator Tool

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Reder, Leonard J.; Balian, Harry

    2010-01-01

    BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling automatic vehicles. A common format used for describing a behavior network, such as that used in the JPL-developed behavior-based control system, CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40] includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program to translate Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of Behavior Network has been developed in order to simplify the development of C-code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a Behavior Network is functioning as a decision tree.

  4. Estimating the Horizon of articles to decide when to stop searching in systematic reviews: an example using a systematic review of RCTs evaluating osteoporosis clinical decision support tools.

    PubMed

    Kastner, Monika; Straus, Sharon; Goldsmith, Charlie H

    2007-10-11

    Researchers conducting systematic reviews need to search multiple bibliographic databases such as MEDLINE and EMBASE. However, researchers have no rational search stopping rule when looking for potentially relevant articles. We empirically tested a stopping rule based on the concept of capture-mark-recapture (CMR), which was pioneered in ecology. The principles of CMR can be adapted to systematic reviews and meta-analyses to estimate the Horizon of articles in the literature with its confidence interval. We retrospectively tested this Horizon Estimation using a systematic review of randomized controlled trials (RCTs) that evaluated clinical decision support tools for osteoporosis disease management. The Horizon Estimation was calculated based on 4 bibliographic databases that were included as the main data sources for the review, in the following order: MEDLINE, EMBASE, CINAHL, and EBM Reviews. The systematic review captured 68% of known articles from the 4 data sources, which represented 592 articles that were estimated as missing from the Horizon.
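The capture-mark-recapture idea can be illustrated with its simplest two-sample form, the Lincoln-Petersen estimator: treat one database search as the "marking" pass and a second as the "recapture" pass, and estimate the total literature from the overlap. The counts below are hypothetical, not the review's actual numbers:

```python
# Two-sample capture-mark-recapture (Lincoln-Petersen) estimate of the
# total "Horizon" of relevant articles. Counts are illustrative.

def lincoln_petersen(n1, n2, m):
    """Estimate total population size N from:
    n1 -- articles found by search 1 (the "marked" set)
    n2 -- articles found by search 2
    m  -- articles found by both searches ("recaptured")
    """
    if m == 0:
        raise ValueError("no overlap between searches; estimator undefined")
    return n1 * n2 / m

# Suppose one database search finds 120 relevant articles, a second
# finds 90, and 60 articles appear in both.
horizon = lincoln_petersen(120, 90, 60)
print(horizon)  # estimated Horizon of 180 relevant articles
```

The actual study uses four databases rather than two, but the stopping logic is the same: when the estimated Horizon stops growing relative to the articles already captured, further searching yields little.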

  5. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
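The raster comparison the study describes reduces to per-cell elevation differences and summary statistics. A minimal pure-Python sketch, assuming both DEMs are already aligned grids of equal shape (the elevation values are hypothetical):

```python
# Compare a generated DEM against a reference DEM, computing the same
# statistics the study reports: minimum, maximum and mean difference,
# plus the Root Mean Square Error. Sample elevations are illustrative.
import math

def dem_difference_stats(reference, generated):
    """Compare two DEMs given as equal-shape 2D lists of elevations (m)."""
    diffs = [g - r
             for ref_row, gen_row in zip(reference, generated)
             for r, g in zip(ref_row, gen_row)]
    n = len(diffs)
    return {
        "min": min(diffs),
        "max": max(diffs),
        "mean": sum(diffs) / n,
        "rmse": math.sqrt(sum(d * d for d in diffs) / n),
    }

reference = [[100.0, 101.0], [102.0, 103.0]]   # manually filtered ground surface
generated = [[100.5, 100.8], [102.2, 102.9]]   # DEM from automatic classification

stats = dem_difference_stats(reference, generated)
print(stats)
```

In practice the grids come from raster files and the comparison is done with GIS raster tools, but the statistics are exactly these.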

  6. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general-purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
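A hand-written stand-in for what a parser generator would emit can show the idea at menu-system scale. The grammar below is illustrative, not one of the paper's examples:

```python
# A tiny command language and the kind of parser a generator would
# produce for it. Grammar (hypothetical):
#
#   command -> "open" NAME | "list" | "quit"

def parse_command(tokens):
    """Parse one command; return an (action, argument) tuple or raise."""
    if not tokens:
        raise SyntaxError("empty input")
    head, rest = tokens[0], tokens[1:]
    if head == "open":
        if len(rest) != 1:
            raise SyntaxError("'open' takes exactly one name")
        return ("open", rest[0])
    if head in ("list", "quit") and not rest:
        return (head, None)
    raise SyntaxError(f"unrecognized command: {' '.join(tokens)}")

print(parse_command("open report.txt".split()))  # ('open', 'report.txt')
print(parse_command(["quit"]))                   # ('quit', None)
```

A parser generator's value is that this dispatch logic is produced mechanically from the grammar, so extending the language means editing the grammar, not the code.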

  7. MEAT: An Authoring Tool for Generating Adaptable Learning Resources

    ERIC Educational Resources Information Center

    Kuo, Yen-Hung; Huang, Yueh-Min

    2009-01-01

    Mobile learning (m-learning) is a new trend in the e-learning field. The learning services in m-learning environments are supported by fundamental functions, especially the content and assessment services, which need an authoring tool to rapidly generate adaptable learning resources. To fulfill the imperious demand, this study proposes an…

  8. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC) which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work the best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be distinguished into the components ``Population Generator'' and ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: The existing socalled ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation) which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropiate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions. 
Users can define sensor systems using ground- or space-based locations as well as
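    The workflow described above (generate a population, then export plot files for ``gnuplot'') can be sketched in a few lines. This is an illustrative mock-up only: the attribute ranges below are invented for the example and do not correspond to the Bottke or Granvik distribution models, and the real tool is a Fortran console program, not Python.

```python
import random

def generate_fictitious_neo(rng):
    """Draw one fictitious (random) NEO with a few orbital/size attributes.
    The ranges here are illustrative only, not a real NEO model."""
    return {
        "a": rng.uniform(0.5, 4.0),     # semi-major axis [AU]
        "e": rng.uniform(0.0, 0.9),     # eccentricity
        "i": rng.uniform(0.0, 40.0),    # inclination [deg]
        "H": rng.uniform(15.0, 28.0),   # absolute magnitude
    }

def write_gnuplot_scatter(population, path):
    """Emit a whitespace-separated data file that gnuplot can read with
    commands such as: plot "neo.dat" using 1:2"""
    with open(path, "w") as f:
        f.write("# a[AU] e i[deg] H\n")
        for neo in population:
            f.write(f'{neo["a"]:.4f} {neo["e"]:.4f} '
                    f'{neo["i"]:.2f} {neo["H"]:.2f}\n')

rng = random.Random(42)                 # seeded for reproducibility
population = [generate_fictitious_neo(rng) for _ in range(1000)]
write_gnuplot_scatter(population, "neo.dat")
```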

  9. A computational tool to design and generate crystal structures

    NASA Astrophysics Data System (ADS)

    Ferreira, R. C.; Vieira, M. B.; Dantas, S. O.; Lobosco, M.

    2014-03-01

    The evolution of computers, more specifically the increased storage and data processing capacity, has allowed the construction of computational tools for the simulation of physical and chemical phenomena. Thus, practical experiments are being replaced, in some cases, by computational ones. In this context, we can highlight models used to simulate different phenomena on the atomic scale. The construction of these simulators requires developers to study and define accurate and reliable models. This complexity is often reflected in the construction of complex simulators, which simulate only a limited group of structures. Such structures are sometimes expressed in a fixed manner using a limited set of geometric shapes. This work proposes a computational tool that aims to generate a set of crystal structures. The proposed tool consists of a) a programming language, which is used to describe the structures by means of their characteristic functions and CSG (Constructive Solid Geometry) operators, and b) a compiler/interpreter that examines the source code written in the proposed language and generates the objects accordingly. This tool enables the generation of an unrestricted number of structures, which can be incorporated in simulators such as the Monte Carlo Spin Engine, developed by our group at UFJF.
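    The core idea of the abstract, combining characteristic functions with CSG operators to carve a structure out of a lattice, can be sketched generically. This is not the authors' language or compiler; it is a minimal illustration of how characteristic functions (point-membership predicates) compose under union, intersection, and difference.

```python
# A characteristic function maps a point p = (x, y, z) to True iff
# the point lies inside the solid.
def sphere(cx, cy, cz, r):
    return lambda p: (p[0]-cx)**2 + (p[1]-cy)**2 + (p[2]-cz)**2 <= r*r

# CSG operators are just boolean combinations of characteristic functions.
def union(f, g):        return lambda p: f(p) or g(p)
def intersection(f, g): return lambda p: f(p) and g(p)
def difference(f, g):   return lambda p: f(p) and not g(p)

def lattice_sites(shape, spacing=1.0, extent=5):
    """Sample a simple cubic lattice and keep the sites inside the shape."""
    pts = []
    axis = range(-extent, extent + 1)
    for i in axis:
        for j in axis:
            for k in axis:
                p = (i * spacing, j * spacing, k * spacing)
                if shape(p):
                    pts.append(p)
    return pts

# A spherical shell: outer sphere minus inner sphere.
shell = difference(sphere(0, 0, 0, 4.0), sphere(0, 0, 0, 2.0))
sites = lattice_sites(shell)
```

    Composed shapes remain ordinary functions, so arbitrarily nested CSG expressions need no special machinery.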

  10. Tools for Simulation and Benchmark Generation at Exascale

    SciTech Connect

    Lagadapati, Mahesh; Mueller, Frank; Engelmann, Christian

    2013-01-01

    The path to exascale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer an insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events and memory profiles, but can be extended to other areas, such as I/O, control flow, and data flow. It further focuses on extreme-scale simulation of millions of Message Passing Interface (MPI) ranks using a lightweight parallel discrete event simulation (PDES) toolkit for performance evaluation. Instead of simply replaying a trace within a simulation, the approach is to generate a benchmark from it and to run this benchmark within a simulation using models to reflect the performance characteristics of future-generation HPC systems. This provides a number of benefits, such as eliminating the data intensive trace replay and enabling simulations at different scales. The presented work utilizes the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to run the benchmark within a simulation.

  11. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  12. Automatic Tool Path Generation for Robot Integrated Surface Sculpturing System

    NASA Astrophysics Data System (ADS)

    Zhu, Jiang; Suzuki, Ryo; Tanaka, Tomohisa; Saito, Yoshio

    In this paper, a surface sculpturing system based on an 8-axis robot is proposed, and the CAD/CAM software and tool path generation algorithm for this sculpturing system are presented. The 8-axis robot is composed of a 6-axis manipulator and a 2-axis worktable; it carves blocks of polystyrene foam with heated cutting tools. A multi-DOF (Degree of Freedom) robot offers faster fabrication than traditional RP (Rapid Prototyping) methods and more flexibility than CNC machining. With its flexibility derived from an 8-axis configuration, as well as efficient custom-developed software for rough cutting and finish cutting, this surface sculpturing system can carve sculptured surfaces accurately and efficiently.

  13. The Requirements Generation System: A tool for managing mission requirements

    NASA Technical Reports Server (NTRS)

    Sheppard, Sylvia B.

    1994-01-01

    Historically, NASA's cost for developing mission requirements has been a significant part of a mission's budget. Large amounts of time have been allocated in mission schedules for the development and review of requirements by the many groups who are associated with a mission. Additionally, tracing requirements from a current document to a parent document has been time-consuming and costly. The Requirements Generation System (RGS) is a computer-supported cooperative-work tool that assists mission developers in the online creation, review, editing, tracing, and approval of mission requirements as well as in the production of requirements documents. This paper describes the RGS and discusses some lessons learned during its development.

  14. ALPAL: A tool to generate simulation codes from natural descriptions

    SciTech Connect

    Cook, G.O. Jr.; Painter, J.F.

    1991-01-01

    ALPAL is a tool that automatically generates code to solve nonlinear integro-differential equations, given a very high-level specification of the equations to be solved and the numerical methods to be used. ALPAL is designed to handle the sort of complicated mathematical models used in very large scientific simulation codes. Other features of ALPAL include an interactive graphical front end, the ability to symbolically compute exact Jacobians for implicit methods, and a high degree of code optimization. 14 refs., 9 figs.

  15. Deciding to quit drinking alcohol

    MedlinePlus

    ... Alcohol abuse - quitting drinking; Quitting drinking; Quitting alcohol; Alcoholism - deciding to quit ... pubmed/23698791 . National Institute on Alcohol Abuse and Alcoholism. Alcohol and health. www.niaaa.nih.gov/alcohol- ...

  16. MCM generator: a Java-based tool for generating medical metadata.

    PubMed

    Munoz, F; Hersh, W

    1998-01-01

    In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, help towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server, and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as images, movie, and sound files.
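    The tag-generation step described above can be illustrated generically. This sketch is not the MCM generator itself (which is a Java tool backed by the SAPHIRE server); it merely shows what emitting HTML META tags for Dublin Core elements looks like. The element names follow the common "DC." prefix convention, and the MeSH subject value is a hypothetical example.

```python
from html import escape

def dublin_core_meta(fields):
    """Render a mapping of Dublin Core element name -> value as HTML
    <meta> tags, escaping values so they are safe inside attributes."""
    lines = []
    for element, value in fields.items():
        lines.append(
            f'<meta name="DC.{escape(element)}" '
            f'content="{escape(value, quote=True)}">'
        )
    return "\n".join(lines)

tags = dublin_core_meta({
    "Title": "Chest Radiograph Atlas",
    "Creator": "Example Author",
    "Subject": "Radiography, Thoracic",   # an illustrative MeSH heading
    "Type": "Image",
})
```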

  18. Benchmarking the next generation of homology inference tools

    PubMed Central

    Saripella, Ganapathi Varma; Sonnhammer, Erik L. L.; Forslund, Kristoffer

    2016-01-01

    Motivation: Over the last decades, vast numbers of sequences were deposited in public databases. Bioinformatics tools allow homology and consequently functional inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate the ‘next generation’ of profile-based approaches, including CS-BLAST, HHSEARCH and PHMMER, in comparison with the non-profile based search tools NCBI-BLAST, USEARCH, UBLAST and FASTA. Method: We generated challenging benchmark datasets based on protein domain architectures within either the PFAM + Clan, SCOP/Superfamily or CATH/Gene3D domain definition schemes. From each dataset, homologous and non-homologous protein pairs were aligned using each tool, and standard performance metrics calculated. We further measured congruence of domain architecture assignments in the three domain databases. Results: CSBLAST and PHMMER had overall highest accuracy. FASTA, UBLAST and USEARCH showed large trade-offs of accuracy for speed optimization. Conclusion: Profile methods are superior at inferring remote homologs but the difference in accuracy between methods is relatively small. PHMMER and CSBLAST stand out with the highest accuracy, yet still at a reasonable computational cost. Additionally, we show that less than 0.1% of Swiss-Prot protein pairs considered homologous by one database are considered non-homologous by another, implying that these classifications represent equivalent underlying biological phenomena, differing mostly in coverage and granularity. Availability and Implementation: Benchmark datasets and all scripts are placed at (http://sonnhammer.org/download/Homology_benchmark). Contact: forslund@embl.de Supplementary information: Supplementary data are available at
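    The "standard performance metrics" computed over homologous and non-homologous pairs in benchmarks like this reduce to thresholding a similarity score and tallying a confusion matrix. The following sketch uses invented scores and a hypothetical threshold; it is not the benchmark's actual scoring pipeline.

```python
def classification_metrics(scores, labels, threshold):
    """Standard metrics for a homology benchmark: each protein pair has a
    similarity score; pairs scoring >= threshold are called homologous.
    labels: True = the pair is truly homologous."""
    tp = sum(1 for s, l in zip(scores, labels) if s >= threshold and l)
    fp = sum(1 for s, l in zip(scores, labels) if s >= threshold and not l)
    fn = sum(1 for s, l in zip(scores, labels) if s < threshold and l)
    tn = sum(1 for s, l in zip(scores, labels) if s < threshold and not l)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    accuracy = (tp + tn) / len(labels)
    return {"precision": precision, "recall": recall, "accuracy": accuracy}

# Illustrative scores for six pairs, three of them truly homologous.
scores = [50.1, 42.0, 12.3, 8.7, 30.5, 5.0]
labels = [True, True, False, False, True, False]
m = classification_metrics(scores, labels, threshold=35.0)
```

    Sweeping the threshold over the score range yields the ROC-style trade-off curves typically reported in such comparisons.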

  19. Tool for Generating Realistic Residential Hot Water Event Schedules: Preprint

    SciTech Connect

    Hendron, B.; Burch, J.; Barker, G.

    2010-08-01

    The installed energy savings for advanced residential hot water systems can depend greatly on detailed occupant use patterns. Quantifying these patterns is essential for analyzing measures such as tankless water heaters, solar hot water systems with demand-side heat exchangers, distribution system improvements, and recirculation loops. This paper describes the development of an advanced spreadsheet tool that can generate a series of year-long hot water event schedules consistent with realistic probability distributions of start time, duration and flow rate variability, clustering, fixture assignment, vacation periods, and seasonality. This paper also presents the application of the hot water event schedules in the context of an integral-collector-storage solar water heating system in a moderate climate.
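    A schedule generator of the kind described, drawing event start times, durations, and flow rates from probability distributions, can be sketched as follows. The distribution shapes and parameters below (bimodal morning/evening start times, exponential durations, Gaussian flow rates) are invented for illustration and are not the calibrated distributions in the NREL spreadsheet tool.

```python
import random

def sample_day(rng, n_events=8):
    """Draw one day of hot water draw events as (start_hour, duration_min,
    flow_gpm) tuples, with start times clustered around 7:00 and 19:00."""
    events = []
    for _ in range(n_events):
        peak = 7.0 if rng.random() < 0.5 else 19.0      # morning or evening
        start_hr = min(23.99, max(0.0, rng.gauss(peak, 1.5)))
        duration_min = rng.expovariate(1 / 4.0)          # mean 4 minutes
        flow_gpm = max(0.2, rng.gauss(1.5, 0.5))         # gallons per minute
        events.append((start_hr, duration_min, flow_gpm))
    return sorted(events)                                # chronological order

rng = random.Random(0)                                   # reproducible year
year_schedule = [sample_day(rng) for _ in range(365)]
```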

  20. Sandia Generated Matrix Tool (SGMT) v. 1.0

    2010-03-24

    Provides a tool with which to create and characterize a very large set of matrix-based visual analogy problems that have properties similar to Raven's Progressive Matrices (RPMs). The software uses the same underlying patterns found in RPMs to generate large numbers of unique matrix problems using parameters chosen by the researcher. Specifically, the software is designed so that researchers can choose the type, direction, and number of relations in a problem and then create any number of unique matrices that share the same underlying structure (e.g. changes in numerosity in a diagonal pattern) but have different surface features (e.g. shapes, colors). Raven's Progressive Matrices (RPMs) are a widely-used test for assessing intelligence and reasoning ability. Since the test is non-verbal, it can be applied to many different populations and has been used all over the world. However, there are relatively few matrices in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. This tool creates a matrix set in a systematic way that allows researchers to have a great deal of control over the underlying structure, surface features, and difficulty of the matrix problems while providing a large set of novel matrices with which to conduct experiments.
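    The separation of underlying structure from surface features can be sketched as follows. This is a simplified illustration of the idea, not SGMT's actual parameterization: one fixed relation (numerosity increasing left to right) is crossed with varying surface features (shape, color) to yield many unique items.

```python
import itertools

# Surface features: varying these leaves the underlying relation intact.
SHAPES = ["circle", "square", "triangle"]
COLORS = ["black", "white", "gray"]

def make_problem(shape, color):
    """3x3 matrix whose cell (r, c) holds c+1 copies of the same figure,
    i.e. numerosity increases left to right in every row."""
    return [[{"shape": shape, "color": color, "count": c + 1}
             for c in range(3)] for r in range(3)]

# One unique problem per combination of surface features.
problems = [make_problem(s, c) for s, c in itertools.product(SHAPES, COLORS)]
```

    Adding more relations (e.g. diagonal numerosity change) and more feature dimensions multiplies the number of structurally equivalent but visually distinct items.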

  1. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  3. Next generation sequencing (NGS): a golden tool in forensic toolkit.

    PubMed

    Aly, S M; Sabri, D M

    2015-01-01

    DNA analysis is a cornerstone of contemporary forensic sciences. DNA sequencing technologies are powerful tools that enriched the molecular sciences in the past through Sanger sequencing and continue to advance these sciences through next generation sequencing (NGS). Next generation sequencing has excellent potential to flourish and increase molecular applications in forensic sciences by overcoming the pitfalls of the conventional method of sequencing. The main advantages of NGS compared to the conventional method are that it utilizes a large number of genetic markers simultaneously, with high resolution of the genetic data. These advantages will help in solving several challenges, such as mixture analysis and dealing with minute degraded samples. Based on these new technologies, many markers could be examined to obtain important biological data such as age, geographical origin, tissue type, external visible traits, and monozygotic twin identification. It could also yield data related to microbes, insects, plants and soil, which are of great medico-legal importance. Despite the dozens of forensic research studies involving NGS, there are requirements to be met before using this technology routinely in forensic cases. Thus, there is a great need for more studies that address the robustness of these techniques. Therefore, this work highlights the applications of forensic sciences in the era of massively parallel sequencing.

  4. Next generation sequencing: new tools in immunology and hematology

    PubMed Central

    Mori, Antonio; Deola, Sara; Xumerle, Luciano; Mijatovic, Vladan; Malerba, Giovanni

    2013-01-01

    One of the hallmarks of the adaptive immune system is the specificity of B and T cell receptors. Thanks to somatic recombination, a large repertoire of receptors can be generated within an individual that guarantee the recognition of a vast number of antigens. Monoclonal antibodies have limited applicability, given the high degree of diversity among these receptors, in BCR and TCR monitoring. Furthermore, with regard to cancer, better characterization of complex genomes and the ability to monitor tumor-specific cryptic mutations or translocations are needed to develop better tailored therapies. Novel technologies, by enhancing the ability of BCR and TCR monitoring, can help in the search for minimal residual disease during hematological malignancy diagnosis and follow-up, and can aid in improving bone marrow transplantation techniques. Recently, a novel technology known as next generation sequencing has been developed; this allows the recognition of unique sequences and provides depth of coverage, heterogeneity, and accuracy of sequencing. This provides a powerful tool that, along with microarray analysis for gene expression, may become integral in resolving the remaining key problems in hematology. This review describes the state of the art of this novel technology, its application in the immunological and hematological fields, and the possible benefits it will provide for the hematology and immunology community. PMID:24466547

  5. Deciding where to Stop Speaking

    ERIC Educational Resources Information Center

    Tydgat, Ilse; Stevens, Michael; Hartsuiker, Robert J.; Pickering, Martin J.

    2011-01-01

    This study investigated whether speakers strategically decide where to interrupt their speech once they need to stop. We conducted four naming experiments in which pictures of colored shapes occasionally changed in color or shape. Participants then merely had to stop (Experiment 1); or they had to stop and resume speech (Experiments 2-4). They…

  6. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Cem Sarica; Holden Zhang

    2006-05-31

    The developments of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep-waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, either at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow pattern and flow behavior (pressure gradient and phase fractions) prediction modeling are separated. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate design that leads to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The

  7. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K

  8. Performance Evaluation Tools for Next Generation Scalable Computing Platforms

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Sarukkai, Sekhar; Craw, James (Technical Monitor)

    1995-01-01

    The Federal High Performance Computing and Communications (HPCC) Program continues to focus on R&D in a wide range of high performance computing and communications technologies. Using its accomplishments in the past four years as building blocks towards a Global Information Infrastructure (GII), an Implementation Plan that identifies six Strategic Focus Areas for R&D has been proposed. This white paper argues that a new generation of system software and programming tools must be developed to support these focus areas, so that the R&D we invest in today can lead to technology pay-off a decade from now. The Global Computing Infrastructure (GCI) in the Year 2000 and Beyond would consist of thousands of powerful computing nodes connected via high-speed networks across the globe. Users will be able to obtain computing and information services from the GCI as easily as plugging a toaster into the electrical outlet on the wall anywhere in the country. Developing and managing the GCI requires performance prediction and monitoring capabilities that do not exist. Various accomplishments in this field today must be integrated and expanded to support this vision.

  9. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). For any multiphase separation technique employed at topside, seabed or bottom-hole, it is crucial to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas-oil-water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict the gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and

  10. Developing the Next Generation of Tools for Simulating Galaxy Outflows

    NASA Astrophysics Data System (ADS)

    Scannapieco, Evan

    Outflows are observed in starbursting galaxies of all masses and at all cosmological epochs. They play a key role throughout the history of the Universe: shaping the galaxy mass-metallicity relation, drastically affecting the content and number density of dwarf galaxies, and transforming the chemical composition of the intergalactic medium. Yet, a complete model of galaxy outflows has proven to be elusive, as it requires both a better understanding of the evolution of the turbulent, multiphase gas in and around starbursting galaxies, and better tools to reproduce this evolution in galaxy-scale simulations. Here we propose to conduct a detailed series of numerical simulations designed to help develop such next-generation tools for the simulation of galaxy outflows. The program will consist of three types of direct numerical simulations, each of which will be targeted to allow galaxy-scale simulations to more accurately model key microphysical processes and their observational consequences. Our first set of simulations will be targeted at better modeling the starbursting interstellar medium (ISM) from which galaxy outflows are driven. The surface densities in starbursting galaxies are much larger than those in the Milky Way, resulting in larger gravitational accelerations and random velocities exceeding 30 or even 100 km/s. Under these conditions, the thermal stability of the ISM is changed dramatically, due to the sharp peak in gas cooling efficiency at around 200,000 K. Our simulations will carefully quantify the key ways in which this medium differs from the local ISM, and the consequences of these differences for when, where, and how outflows are driven. A second set of simulations will be targeted at better modeling the observed properties of rapidly cooling, highly turbulent gas.
Because gas cooling in and around starbursts is extremely efficient, turbulent motions are often supersonic, which leads to a distribution of ionization states that is vastly different than

  11. Fine-Tuning Next-Generation Genome Editing Tools.

    PubMed

    Kanchiswamy, Chidananda Nagamangala; Maffei, Massimo; Malnoy, Mickael; Velasco, Riccardo; Kim, Jin-Soo

    2016-07-01

    The availability of genome sequences of numerous organisms and the revolution brought about by genome editing tools (e.g., ZFNs, TALENs, and CRISPR/Cas9 or RGENs) has provided a breakthrough in introducing targeted genetic changes both to explore emergent phenotypes and to introduce new functionalities. However, the wider application of these tools in biology, agriculture, medicine, and biotechnology is limited by off-target mutation effects. In this review, we compare available methods for detecting, measuring, and analyzing off-target mutations. Furthermore, we particularly focus on CRISPR/Cas9 regarding various methods, tweaks, and software tools available to nullify off-target effects. PMID:27167723

  12. Bioethics in America: Who decides?

    SciTech Connect

    Yesley, M.S.

    1992-05-01

    This paper is concerned with the process by which bioethics decisions are made as well as the actual decisions that are reached. The process commonly is one of "shared decision-making," that is, decisionmaking at several levels, beginning with the government and ending with the individual. After the government has defined a scope of permissible activity, the research or health care institution may further limit what activities are permitted. Finally, the individual patient, or, if the patient is incompetent, the patient's legal representative decides whether or not to participate in the activity. Because bioethics in general, and bioethics related to genetics in particular, evolves through this process of decisionmaking at several levels, this paper briefly traces the process, to see how it works in several areas of bioethics, in order to provide a perspective on the way in which ethical decisions related to genetics are or will be made.

  14. Virtual tool mark generation for efficient striation analysis.

    PubMed

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-07-01

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
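
The projection-and-compare idea in this abstract can be sketched in a few lines. This is a toy illustration, not the authors' code: the tip point cloud is reduced to a single rotation plus a per-bin extreme, and plain normalized cross-correlation stands in for the Chumbley et al. likelihood statistic. All function and parameter names here are invented for the example.

```python
import numpy as np

def virtual_mark_profile(tip_points, angle_deg, n_bins=200):
    """Project a 3D tip point cloud (N x 3 array) along the travel
    direction at a given marking angle and keep the deepest point per
    lateral bin, yielding a 1-D 'virtual mark' profile (toy version)."""
    a = np.radians(angle_deg)
    # rotate about the lateral (x) axis so travel lies along y
    z = tip_points[:, 1] * np.sin(a) + tip_points[:, 2] * np.cos(a)
    x = tip_points[:, 0]
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, bins) - 1, 0, n_bins - 1)
    profile = np.full(n_bins, np.nan)
    for i in range(n_bins):
        sel = idx == i
        if sel.any():
            profile[i] = z[sel].min()  # projection edge = deepest extent
    return profile

def similarity(p, q):
    """Normalized cross-correlation at zero lag, a simple stand-in for
    the statistical likelihood comparison used in the paper."""
    p, q = p - np.nanmean(p), q - np.nanmean(q)
    m = ~np.isnan(p) & ~np.isnan(q)
    return float(np.dot(p[m], q[m]) /
                 (np.linalg.norm(p[m]) * np.linalg.norm(q[m])))
```

A matching-angle search in the spirit of the paper would simply evaluate `similarity` over a grid of `angle_deg` values and report the maximizer.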

  16. An Application of the Mesh Generation and Refinement Tool to Mobile Bay, Alabama, USA

    NASA Astrophysics Data System (ADS)

    Aziz, Wali; Alarcon, Vladimir J.; McAnally, William; Martin, James; Cartwright, John

    2009-08-01

    A grid generation tool, called the Mesh Generation and Refinement Tool (MGRT), has been developed using Qt4. Qt4 is a comprehensive C++ application framework which includes GUI and container class-libraries and tools for cross-platform development. MGRT is capable of using several types of algorithms for grid generation. This paper presents an application of the MGRT grid generation tool for creating an unstructured grid of Mobile Bay (Alabama, USA) that will be used for hydrodynamics modeling. The algorithm used in this particular application is the Advancing-Front/Local-Reconnection (AFLR) [1] [2]. This research shows results of grids created with MGRT and compares them to grids covering the same geographic domain created using other grid generation tools. The superior quality of the grids generated by MGRT is shown.

  17. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  18. E-DECIDER Decision Support Gateway For Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.

    2013-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that
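
The standards-compliant service protocols listed in this abstract (WMS in particular) have a well-defined request shape, so a client-side request to a GeoServer instance like E-DECIDER's can be sketched as below. The base URL and layer name are placeholders, not E-DECIDER's real endpoints; only the WMS 1.1.1 parameter names are standard.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(1024, 768), fmt="image/png"):
    """Build a standard WMS 1.1.1 GetMap request URL. `bbox` is
    (min_lon, min_lat, max_lon, max_lat) in EPSG:4326."""
    params = {
        "service": "WMS", "version": "1.1.1", "request": "GetMap",
        "layers": layer, "srs": "EPSG:4326",
        "bbox": ",".join(str(v) for v in bbox),
        "width": size[0], "height": size[1], "format": fmt,
    }
    return base + "?" + urlencode(params)

# Hypothetical request for a deformation map over southern California:
url = wms_getmap_url("http://example.org/geoserver/wms",
                     "deformation", (-121.0, 32.0, -114.0, 37.0))
```

A WFS or WCS request follows the same pattern with `request=GetFeature` or `request=GetCoverage` and the corresponding parameter set.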

  19. JVM: Java Visual Mapping tool for next generation sequencing read.

    PubMed

    Yang, Ye; Liu, Juan

    2015-01-01

    We developed a program JVM (Java Visual Mapping) for mapping next generation sequencing reads to a reference sequence. The program is implemented in Java and is designed to deal with the millions of short reads generated by Illumina sequencing technology. It employs a seed-index strategy and octal encoding operations for sequence alignment. JVM is useful for DNA-Seq and RNA-Seq when dealing with single-end resequencing. JVM is a desktop application that supports read volumes from 1 MB to 10 GB.
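
The seed-index idea mentioned in this abstract is the core of most short-read mappers and can be illustrated with a minimal seed-and-extend sketch. This is a generic version of the technique, not JVM's actual implementation (its octal-encoding details are not described in the abstract), and all identifiers are invented for the example.

```python
from collections import defaultdict

def build_seed_index(ref, k=12):
    """Hash every k-mer of the reference to its start positions."""
    index = defaultdict(list)
    for i in range(len(ref) - k + 1):
        index[ref[i:i + k]].append(i)
    return index

def map_read(read, ref, index, k=12, max_mismatch=2):
    """Seed with the read's first k-mer, then verify the full read
    against the reference by Hamming distance at each seed hit."""
    hits = []
    for pos in index.get(read[:k], []):
        window = ref[pos:pos + len(read)]
        if len(window) == len(read):
            mm = sum(a != b for a, b in zip(read, window))
            if mm <= max_mismatch:
                hits.append((pos, mm))
    return hits
```

Real mappers additionally seed from several offsets (so a mismatch inside the first k-mer is not fatal) and pack bases into a few bits each, which is where an encoding scheme like JVM's comes in.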

  20. Computer Generated Optical Illusions: A Teaching and Research Tool.

    ERIC Educational Resources Information Center

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  1. Next generation tools to accelerate the synthetic biology process.

    PubMed

    Shih, Steve C C; Moraes, Christopher

    2016-05-16

    Synthetic biology follows the traditional engineering paradigm of designing, building, testing and learning to create new biological systems. While such approaches have enormous potential, major challenges still exist in this field including increasing the speed at which this workflow can be performed. Here, we present recently developed microfluidic tools that can be used to automate the synthetic biology workflow with the goal of advancing the likelihood of producing desired functionalities. With the potential for programmability, automation, and robustness, the integration of microfluidics and synthetic biology has the potential to accelerate advances in areas such as bioenergy, health, and biomaterials. PMID:27146265

  2. UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Degani, Asaf; Heymann, Michael

    2004-01-01

    In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.
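
A minimal version of the verification component's check can be sketched as follows, under the assumption (based on the Degani-Heymann line of work this poster builds on) that an interface is adequate when machine states that look identical on the display always behave identically under every user action. The data layout and function name are invented for the example.

```python
def interface_adequate(machine, display):
    """Toy interface-adequacy check. `machine` maps (state, action) ->
    next state; `display` maps state -> what the interface shows. The
    interface is inadequate if two states that look the same either
    disagree on which actions are possible, or can be driven by the
    same action to states that look different."""
    states = {s for s, _ in machine} | set(machine.values())
    actions = {a for _, a in machine}
    for s in states:
        for t in states:
            if display[s] != display[t]:
                continue  # user can already tell these apart
            for a in actions:
                ns, nt = machine.get((s, a)), machine.get((t, a))
                if ns is None or nt is None:
                    if ns is not nt:   # action possible in one, not the other
                        return False
                elif display[ns] != display[nt]:
                    return False       # same-looking states diverge
    return True
```

Merging machine states on the display is exactly how a succinct interface is obtained, so generation amounts to finding the coarsest display map that still passes this check.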

  3. HALOGEN: a tool for fast generation of mock halo catalogues

    NASA Astrophysics Data System (ADS)

    Avila, Santiago; Murray, Steven G.; Knebe, Alexander; Power, Chris; Robotham, Aaron S. G.; Garcia-Bellido, Juan

    2015-06-01

    We present a simple method of generating approximate synthetic halo catalogues: HALOGEN. This method uses a combination of second-order Lagrangian Perturbation Theory (2LPT) in order to generate the large-scale matter distribution, analytical mass functions to generate halo masses, and a single-parameter stochastic model for halo bias to position haloes. HALOGEN represents a simplification of similar recently published methods. Our method is constrained to recover the two-point function at intermediate (10 h-1 Mpc < r < 50 h-1 Mpc) scales, which we show is successful to within 2 per cent. Larger scales (˜100 h-1 Mpc) are reproduced to within 15 per cent. We compare several other statistics (e.g. power spectrum, point distribution function, redshift space distortions) with results from N-body simulations to determine the validity of our method for different purposes. One of the benefits of HALOGEN is its flexibility, and we demonstrate this by showing how it can be adapted to varying cosmologies and simulation specifications. A driving motivation for the development of such approximate schemes is the need to compute covariance matrices and study the systematic errors for large galaxy surveys, which requires thousands of simulated realizations. We discuss the applicability of our method in this context, and conclude that it is well suited to mass production of appropriate halo catalogues. The code is publicly available at https://github.com/savila/halogen.
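
Two of HALOGEN's ingredients, drawing halo masses from an analytical mass function and placing haloes with a single-parameter stochastic bias, can be sketched as follows. This is a toy stand-in: a pure power law replaces a calibrated mass function, and a bare density grid replaces the 2LPT field; parameter values and names are illustrative only.

```python
import numpy as np

def sample_halo_masses(n, m_min=1e11, m_max=1e15, alpha=1.9, seed=None):
    """Draw n halo masses (solar masses) from dn/dM ~ M^-alpha between
    m_min and m_max via inverse-CDF sampling."""
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    lo, hi = m_min ** (1.0 - alpha), m_max ** (1.0 - alpha)
    return (lo + u * (hi - lo)) ** (1.0 / (1.0 - alpha))

def place_haloes(density, n_haloes, alpha_bias=1.5, seed=None):
    """Assign haloes to grid cells with probability proportional to
    density^alpha_bias, mimicking a one-parameter stochastic bias:
    larger alpha_bias clusters haloes more strongly into dense cells."""
    rng = np.random.default_rng(seed)
    p = density.ravel() ** alpha_bias
    p /= p.sum()
    return rng.choice(p.size, size=n_haloes, p=p)
```

In the real code the bias parameter is tuned so that the two-point function of the placed haloes matches N-body results at intermediate scales.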

  4. Skateboards or Wildlife? Kids Decide!

    ERIC Educational Resources Information Center

    Thomas, Julie; Cooper, Sandra; Haukos, David

    2004-01-01

    How can teachers make science learning relevant to today's technology savvy students? They can incorporate the Internet and use it as a tool to help solve real-life problems. A group of university professors, a field biologist, and classroom teachers teamed up to create an exciting, interactive Web-based learning environment for students and…

  5. Functional viral metagenomics and the next generation of molecular tools

    PubMed Central

    Schoenfeld, Thomas; Liles, Mark; Wommack, K. Eric; Polson, Shawn W.; Godiska, Ronald; Mead, David

    2009-01-01

    The enzymes of bacteriophages and other viruses have been essential research tools since the first days of molecular biology. However, the current repertoire of viral enzymes only hints at their overall potential. The most commonly used enzymes are derived from a surprisingly small number of cultivated viruses, which is remarkable considering the extreme abundance and diversity of viruses revealed over the past decade by metagenomic analysis. To access the treasure trove of enzymes hidden in the global virosphere and develop them for research, therapeutic and diagnostic uses, improvements are needed in our ability to rapidly and efficiently discover, express and characterize viral genes to produce useful proteins. We discuss improvements to sampling and cloning methods, functional and genomics-based screens and expression systems that should accelerate discovery of new enzymes and other viral proteins for use in research and medicine. PMID:19896852

  6. Automated tools for the generation of performance-based training

    SciTech Connect

    Trainor, M.S.; Fries, J.

    1990-01-01

    The field of educational technology is not a new one, but the emphasis in the past has been on the use of technologies for the delivery of instruction and tests. This paper explores the application of technology to the development of performance-based instruction and to the analyses leading up to the development of the instruction. Several technologies are discussed, with specific software packages described. The purpose of these technologies is to streamline the instructional analysis and design process, using the computer for its strengths to aid the human-in-the-loop. Currently, the process is all accomplished manually. Applying automated tools to the process frees the humans from some of the tedium involved so that they can be dedicated to the more complex aspects of the process. 12 refs.

  7. Conceptual Design of Electron-Beam Generated Plasma Tools

    NASA Astrophysics Data System (ADS)

    Agarwal, Ankur; Rauf, Shahid; Dorf, Leonid; Collins, Ken; Boris, David; Walton, Scott

    2015-09-01

    Realization of the next generation of high-density nanostructured devices is predicated on etching features with atomic layer resolution, no damage and high selectivity. High energy electron beams generate plasmas with unique features that make them attractive for applications requiring monolayer precision. In these plasmas, high energy beam electrons ionize the background gas and the resultant daughter electrons cool to low temperatures via collisions with gas molecules and lack of any accelerating fields. For example, an electron temperature of <0.6 eV with densities comparable to conventional plasma sources can be obtained in molecular gases. The chemistry in such plasmas can significantly differ from RF plasmas as the ions/radicals are produced primarily by beam electrons rather than those in the tail of a low energy distribution. In this work, we will discuss the conceptual design of an electron beam based plasma processing system. Plasma properties will be discussed for Ar, Ar/N2, and O2 plasmas using a computational plasma model, and comparisons made to experiments. The fluid plasma model is coupled to a Monte Carlo kinetic model for beam electrons which considers gas phase collisions and the effect of electric and magnetic fields on electron motion. The impact of critical operating parameters such as magnetic field, beam energy, and gas pressure on plasma characteristics in electron-beam plasma processing systems will be discussed. Partially supported by the NRL base program.

  8. Next generation chemical proteomic tools for rapid enzyme profiling.

    PubMed

    Uttamchandani, Mahesh; Lu, Candy H S; Yao, Shao Q

    2009-08-18

    Sequencing of the human genome provided a wealth of information about the genomic blueprint of a cell. But genes do not tell the entire story of life and living processes; identifying the roles of enzymes and mapping out their interactions is also crucial. Enzymes catalyze virtually every cellular process and metabolic exchange. They not only are instrumental in sustaining life but also are required for its regulation and diversification. Diseases such as cancer can be caused by minor changes in enzyme activities. In addition, the unique enzymes of pathogenic organisms are ripe targets for combating infections. Consequently, nearly one-third of all current drug targets are enzymes. An estimated 18-29% of eukaryotic genes encode enzymes, but only a limited proportion of enzymes have thus far been characterized. Therefore, little is understood about the physiological roles, substrate specificity, and downstream targets of the vast majority of these important proteins. A key step toward the biological characterization of enzymes, as well as their adoption as drug targets, is the development of global solutions that bridge the gap in understanding these proteins and their interactions. We herein present technological advances that facilitate the study of enzymes and their properties in a high-throughput manner. Over the years, our group has introduced and developed a variety of such enabling platforms for many classes of enzymes, including kinases, phosphatases, and proteases. For each of these different types of enzymes, specific design considerations are required to develop the appropriate chemical tools to characterize each class. These tools include activity-based probes and chemical compound libraries, which are rapidly assembled using efficient combinatorial synthesis or "click chemistry" strategies. The resulting molecular assortments may then be screened against the target enzymes in high-throughput using microplates or microarrays. These techniques offer

  9. A distributed computing tool for generating neural simulation databases.

    PubMed

    Calin-Jageman, Robert J; Katz, Paul S

    2006-12-01

    After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a "screen-saver" cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net . It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.
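
The client's idle-time work cycle described above follows a standard volunteer-computing pattern that can be sketched without any networking. Here an in-memory queue stands in for the NeuronPM Apache/PHP/MySQL server and a callback stands in for invoking NEURON on a model file; all names are invented for the example.

```python
import queue

def worker_loop(jobs, run_simulation, results):
    """One client's cycle in a screen-saver cluster: fetch a work unit,
    run the simulation, report the result, repeat until no work is
    left. `jobs` holds parameter assignments; `run_simulation` maps a
    parameter dict to a simulation outcome."""
    while True:
        try:
            params = jobs.get_nowait()   # "retrieve work assignment"
        except queue.Empty:
            break                        # no work left: resume idling
        results.append((params, run_simulation(params)))

# Hypothetical parameter sweep over a conductance value:
jobs = queue.Queue()
for gbar in [0.1, 0.2, 0.3]:
    jobs.put({"gbar": gbar})
results = []
worker_loop(jobs, lambda p: p["gbar"] * 2, results)
```

The server-side half of the pattern is then just bookkeeping: marking assignments in progress, collecting returned results, and re-issuing work units whose clients went away.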

  10. Multiscale Toxicology - Building the Next Generation Tools for Toxicology

    SciTech Connect

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.; Waters, Katrina M.

    2012-09-01

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus), to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  11. Generating mammalian sirtuin tools for protein-interaction analysis.

    PubMed

    Hershberger, Kathleen A; Motley, Jonathan; Hirschey, Matthew D; Anderson, Kristin A

    2013-01-01

    The sirtuins are a family of NAD(+)-dependent deacylases with important effects on aging, cancer, and metabolism. Sirtuins exert their biological effects by catalyzing deacetylation and/or deacylation reactions in which acyl groups are removed from lysine residues of specific proteins. A current challenge is to identify specific sirtuin target proteins against the high background of acetylated proteins recently identified by proteomic surveys. New evidence indicates that bona fide sirtuin substrate proteins form stable physical associations with their sirtuin regulator. Therefore, identification of sirtuin interacting proteins could be a useful aid in focusing the search for substrates. Described here is a method for identifying sirtuin protein interactors. Employing basic techniques of molecular cloning and immunochemistry, the method describes the generation of mammalian sirtuin protein expression plasmids and their use to overexpress and immunoprecipitate sirtuins with their interacting partners. Also described is the use of the Database for Annotation, Visualization, and Integrated Discovery for interpreting the sirtuin protein-interaction data obtained. PMID:24014400

  12. Next generation sequencing technologies: tool to study avian virus diversity.

    PubMed

    Kapgate, S S; Barbuddhe, S B; Kumanan, K

    2015-03-01

    Increased globalisation, climatic changes, and the wildlife-livestock interface have led to the emergence of novel viral pathogens and zoonoses that have become a serious concern to avian, animal, and human health. High biodiversity and bird migration facilitate the spread of pathogens and provide reservoirs for emerging infectious diseases. Current classical diagnostic methods, designed to be virus-specific or limited to a group of viral agents, hinder the identification of novel viruses or viral variants. Recently developed approaches of next-generation sequencing (NGS) provide culture-independent methods that are useful for understanding viral diversity and discovering novel viruses, thereby enabling better diagnosis and disease control. This review discusses the possible steps of an NGS study utilizing sequence-independent amplification, high-throughput sequencing, and bioinformatics approaches to identify novel avian viruses and their diversity. NGS has led to the identification of a wide range of new viruses, such as picobirnavirus, picornavirus, orthoreovirus, and an avian gamma coronavirus associated with fulminating disease in guinea fowl, and is also used in describing viral diversity among avian species. The review also briefly discusses viral-host interactions and disease-associated causalities involving newly identified avian viruses. PMID:25790045

  14. Profiting from competition: Financial tools for electric generation companies

    NASA Astrophysics Data System (ADS)

    Richter, Charles William, Jr.

    Regulations governing the operation of electric power systems in North America and many other areas of the world are undergoing major changes designed to promote competition. This process of change is often referred to as deregulation. Participants in deregulated electricity systems may find that their profits will greatly benefit from the implementation of successful bidding strategies. While the goal of the regulators may be to create rules which balance reliable power system operation with maximization of the total benefit to society, the goal of generation companies is to maximize their profit, i.e., return to their shareholders. The majority of the research described here is conducted from the point of view of generation companies (GENCOs) wishing to maximize their expected utility function, which is generally comprised of expected profit and risk. Strategies that help a GENCO to maximize its objective function must consider the impact of (and aid in making) operating decisions that may occur within a few seconds to multiple years. The work described here assumes an environment in which energy service companies (ESCOs) buy and GENCOs sell power via double auctions in regional commodity exchanges. Power is transported on wires owned by transmission companies (TRANSCOs) and distribution companies (DISTCOs). The proposed market framework allows participants to trade electrical energy contracts via the spot, futures, options, planning, and swap markets. An important method of studying these proposed markets and the behavior of participating agents is the field of experimental/computational economics. For much of the research reported here, the market simulator developed by Kumar and Sheble and similar simulators has been adapted to allow computerized agents to trade energy. Creating computerized agents that can react as rationally or irrationally as a human trader is a difficult problem for which we have turned to the field of artificial intelligence. 
Some of our

  15. Multiscale Toxicology- Building the Next Generation Tools for Toxicology

    SciTech Connect

    Retterer, S. T.; Holsapple, M. P.

    2013-10-31

    A Cooperative Research and Development Agreement (CRADA) was established between Battelle Memorial Institute (BMI), Pacific Northwest National Laboratory (PNNL), Oak Ridge National Laboratory (ORNL), Brookhaven National Laboratory (BNL), and Lawrence Livermore National Laboratory (LLNL), with the goal of combining the analytical and synthetic strengths of the National Laboratories with BMI's expertise in basic and translational medical research to develop a collaborative pipeline and suite of high throughput and imaging technologies that could be used to provide a more comprehensive understanding of material and drug toxicology in humans. The Multi-Scale Toxicity Initiative (MSTI), consisting of the team members above, was established to coordinate cellular scale, high-throughput in vitro testing, computational modeling and whole animal in vivo toxicology studies between MSTI team members. Development of a common, well-characterized set of materials for testing was identified as a crucial need for the initiative. Two research tracks were established by BMI during the course of the CRADA. The first research track focused on the development of tools and techniques for understanding the toxicity of nanomaterials, specifically inorganic nanoparticles (NPs). ORNL's work focused primarily on the synthesis, functionalization and characterization of a common set of NPs for dissemination to the participating laboratories. These particles were synthesized to retain the same surface characteristics and size, but to allow visualization using the variety of imaging technologies present across the team. Characterization included the quantitative analysis of physical and chemical properties of the materials as well as the preliminary assessment of NP toxicity using commercially available toxicity screens and emerging optical imaging strategies. Additional efforts examined the development of high-throughput microfluidic and imaging assays for measuring NP uptake, localization, and

  16. PC graphics generation and management tool for real-time applications

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1992-01-01

    A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.

  17. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.

  18. Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST): A web tool for generating protein sequence similarity networks

    PubMed Central

    Gerlt, John A.; Bouvier, Jason T.; Davidson, Daniel B.; Imker, Heidi J.; Sadkhin, Boris; Slater, David R.; Whalen, Katie L.

    2015-01-01

    The Enzyme Function Initiative, an NIH/NIGMS-supported Large-Scale Collaborative Project (EFI; U54GM093342; http://enzymefunction.org/), is focused on devising and disseminating bioinformatics and computational tools as well as experimental strategies for the prediction and assignment of functions (in vitro activities and in vivo physiological/metabolic roles) to uncharacterized enzymes discovered in genome projects. Protein sequence similarity networks (SSNs) are visually powerful tools for analyzing sequence relationships in protein families (H.J. Atkinson, J.H. Morris, T.E. Ferrin, and P.C. Babbitt, PLoS One 2009, 4, e4345). However, the members of the biological/biomedical community have not had access to the capability to generate SSNs for their “favorite” protein families. In this article we announce the EFI-EST (Enzyme Function Initiative-Enzyme Similarity Tool) web tool (http://efi.igb.illinois.edu/efi-est/) that is available without cost for the automated generation of SSNs by the community. The tool can create SSNs for the “closest neighbors” of a user-supplied protein sequence from the UniProt database (Option A) or of members of any user-supplied Pfam and/or InterPro family (Option B). We provide an introduction to SSNs, a description of EFI-EST, and a demonstration of the use of EFI-EST to explore sequence-function space in the OMP decarboxylase superfamily (PF00215). This article is designed as a tutorial that will allow members of the community to use the EFI-EST web tool for exploring sequence/function space in protein families. PMID:25900361
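    The idea behind an SSN can be sketched in a few lines: treat sequences as nodes and connect pairs whose all-vs-all similarity score clears a user-chosen threshold. The scores and sequence names below are hypothetical, and EFI-EST itself operates on BLAST E-values and far larger inputs:

```python
def build_ssn(scores, threshold):
    """Build a sequence similarity network as an adjacency dict.

    scores: dict mapping frozenset({seq_a, seq_b}) -> similarity score
            (e.g. a -log10 E-value from an all-vs-all comparison).
    threshold: minimum score for an edge to be kept.
    """
    network = {}
    for pair, score in scores.items():
        a, b = tuple(pair)
        network.setdefault(a, set())
        network.setdefault(b, set())
        if score >= threshold:
            network[a].add(b)
            network[b].add(a)
    return network

# Hypothetical pairwise scores for four sequences
scores = {
    frozenset({"seqA", "seqB"}): 85.0,
    frozenset({"seqA", "seqC"}): 20.0,
    frozenset({"seqB", "seqC"}): 18.0,
    frozenset({"seqC", "seqD"}): 90.0,
}
ssn = build_ssn(scores, threshold=50.0)
```

    At threshold 50 the toy family splits into two clusters, {seqA, seqB} and {seqC, seqD}; raising or lowering the threshold merges or separates clusters, which is exactly the exploration EFI-EST supports interactively.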

  19. Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST): A web tool for generating protein sequence similarity networks.

    PubMed

    Gerlt, John A; Bouvier, Jason T; Davidson, Daniel B; Imker, Heidi J; Sadkhin, Boris; Slater, David R; Whalen, Katie L

    2015-08-01

    The Enzyme Function Initiative, an NIH/NIGMS-supported Large-Scale Collaborative Project (EFI; U54GM093342; http://enzymefunction.org/), is focused on devising and disseminating bioinformatics and computational tools as well as experimental strategies for the prediction and assignment of functions (in vitro activities and in vivo physiological/metabolic roles) to uncharacterized enzymes discovered in genome projects. Protein sequence similarity networks (SSNs) are visually powerful tools for analyzing sequence relationships in protein families (H.J. Atkinson, J.H. Morris, T.E. Ferrin, and P.C. Babbitt, PLoS One 2009, 4, e4345). However, the members of the biological/biomedical community have not had access to the capability to generate SSNs for their "favorite" protein families. In this article we announce the EFI-EST (Enzyme Function Initiative-Enzyme Similarity Tool) web tool (http://efi.igb.illinois.edu/efi-est/) that is available without cost for the automated generation of SSNs by the community. The tool can create SSNs for the "closest neighbors" of a user-supplied protein sequence from the UniProt database (Option A) or of members of any user-supplied Pfam and/or InterPro family (Option B). We provide an introduction to SSNs, a description of EFI-EST, and a demonstration of the use of EFI-EST to explore sequence-function space in the OMP decarboxylase superfamily (PF00215). This article is designed as a tutorial that will allow members of the community to use the EFI-EST web tool for exploring sequence/function space in protein families.

  20. Deciding as Intentional Action: Control over Decisions

    PubMed Central

    Shepherd, Joshua

    2015-01-01

    Common-sense folk psychology and mainstream philosophy of action agree about decisions: these are under an agent's direct control, and are thus intentional actions for which agents can be held responsible. I begin this paper by presenting a problem for this view. In short, since the content of the motivational attitudes that drive deliberation and decision remains open-ended until the moment of decision, it is unclear how agents can be thought to exercise control over what they decide at the moment of deciding. I note that this problem might motivate a non-actional view of deciding—a view that decisions are not actions, but are instead passive events of intention acquisition. For without an understanding of how an agent might exercise control over what is decided at the moment of deciding, we lack a good reason for maintaining commitment to an actional view of deciding. However, I then offer the required account of how agents exercise control over decisions at the moment of deciding. Crucial to this account is an understanding of the relation of practical deliberation to deciding, an understanding of skilled deliberative activity, and the role of attention in the mental action of deciding. PMID:26321765

  1. Generation X Teaches College: Generation Construction as Pedagogical Tool in the Writing Classroom.

    ERIC Educational Resources Information Center

    Hassel, Holly; Epp, Dawn Vernooy

    In the 1996 book "Generation X Goes to College: An Eye-Opening Account of Teaching in Post-Modern America," Peter Sacks probes the "decay" of higher education in the United States; a decay he attributes to listless, entitled students. This paper interrogates the paradigm of Boomers and Generation Xers poised in opposition to one another,…

  2. E-DECIDER Rapid Response to the M 6.0 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    E-DECIDER initiated rapid response mode when the California Earthquake Clearinghouse was activated the morning following the M6 Napa earthquake. Data products, including: 1) rapid damage and loss estimates, 2) deformation magnitude and slope change maps, and 3) aftershock forecasts were provided to the Clearinghouse partners within 24 hours of the event via XchangeCore Web Service Data Orchestration sharing. NASA data products were provided to end-users via XchangeCore, EERI and Clearinghouse websites, and ArcGIS online for Napa response, reaching a wide response audience. The E-DECIDER team helped facilitate rapid delivery of NASA products to stakeholders and participated in Clearinghouse Napa earthquake briefings to update stakeholders on product information. Rapid response products from E-DECIDER can be used to help prioritize response efforts shortly after the event has occurred. InLET (Internet Loss Estimation Tool) post-event damage and casualty estimates were generated quickly after the Napa earthquake. InLET provides immediate post-event estimates of casualties and building damage by performing loss/impact simulations using USGS ground motion data and FEMA HAZUS damage estimation technology. These results were provided to E-DECIDER by their collaborators, ImageCat, Inc. and the Community Stakeholder Network (CSN). Strain magnitude and slope change maps were automatically generated when the Napa earthquake appeared on the USGS feed. These maps provide an early estimate of where the deformation has occurred and where damage may be localized. Using E-DECIDER critical infrastructure overlays with damage estimates, decision makers can direct response efforts that can be verified later with field reconnaissance and remote sensing-based observations. Earthquake aftershock forecast maps were produced within hours of the event. 
These maps highlight areas where aftershocks are likely to occur and can also be coupled with infrastructure overlays to help direct response
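    The abstract does not detail the forecast models behind the aftershock maps; as a generic illustration of how an aftershock rate can be forecast over a time window, the modified Omori (Omori-Utsu) decay law can be numerically integrated. The parameter values below are illustrative only, not those used by E-DECIDER:

```python
def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori (Omori-Utsu) aftershock rate n(t) = K / (t + c)^p,
    with t in days after the mainshock. K, c, p here are illustrative,
    not values calibrated to any real sequence."""
    return K / (t + c) ** p

def expected_aftershocks(t1, t2, steps=10_000, **params):
    """Expected aftershock count between t1 and t2 days (trapezoidal rule)."""
    dt = (t2 - t1) / steps
    total = 0.0
    for i in range(steps):
        a, b = t1 + i * dt, t1 + (i + 1) * dt
        total += 0.5 * (omori_rate(a, **params) + omori_rate(b, **params)) * dt
    return total

day1 = expected_aftershocks(0.0, 1.0)   # first day after the mainshock
week1 = expected_aftershocks(0.0, 7.0)  # first week
```

    Because the rate decays roughly as 1/t, most of the expected aftershocks fall in the first day, which is why delivering forecast maps within hours matters for response planning.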

  3. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  4. The Sequence of Events generator: A powerful tool for mission operations

    NASA Technical Reports Server (NTRS)

    Wobbe, Hubertus; Braun, Armin

    1994-01-01

    The functions and features of the sequence of events (SOE) and flight operations procedures (FOP) generator developed and used at DLR/GSOC for the positioning of EUTELSAT 2 satellites are presented. The SOE and FOP are the main operational documents that are prepared for nominal as well as for non-nominal mission execution. Their structure and application are described. Both of these documents are generated, validated, and maintained by a common software tool. Its main features and advantages are demonstrated. The tool has been improved continuously over the last 5 years. Due to its flexibility, it can easily be applied to other projects, and new features may be added.

  5. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. 
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the
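    As a generic illustration of the operator-overloading approach the study benchmarks, forward-mode differentiation can be implemented by overloading arithmetic on value/derivative pairs ("dual numbers"). This is a minimal sketch of the technique, not the mechanism of any specific AD tool named above:

```python
import math

class Dual:
    """Minimal forward-mode AD via operator overloading: each object
    carries a value and the derivative of that value together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x*sin(x) + 3x] at x = 2.0; seed the input derivative with 1
x = Dual(2.0, 1.0)
y = x * sin(x) + 3 * x
# y.der now holds sin(2) + 2*cos(2) + 3
```

    Every arithmetic operation allocates an object and dispatches dynamically, which is one reason operator-overloading AD can lag source transformation (where a tool like TAF emits plain differentiated Fortran) by large factors on big models.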

  6. A decision support tool for landfill methane generation and gas collection.

    PubMed

    Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart

    2015-09-01

    This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date there is no such tool available to provide landfill decision makers with clear and simplified information to evaluate biochemical processes within a landfill site, to assess performance of gas production and to identify potential remedies to any issues. The current lack in understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) a methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first-order decay model LandGEM; and (2) a landfill gas indicators score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is provided using multi-criteria analysis to calculate the sum of weighted scores for each indicator. The weights for each indicator are calculated using an analytical hierarchical process. The tool is tested against five real scenarios for landfill sites in the UK with a range of good, average and poor landfill methane generation over a one year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for methane output rate enhancement. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site. PMID:26168873
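    The first scoring set compares measured methane output against a first-order decay prediction. A simplified, annual-resolution sketch of a LandGEM-style model is shown below; the rate constant k and methane generation potential L0 are placeholder values, not site-calibrated parameters:

```python
import math

def landgem_annual(acceptance, k=0.05, L0=170.0):
    """Simplified annual first-order decay model in the spirit of LandGEM.

    acceptance: waste mass (Mg) accepted in each of years 0..N-1
    k: decay rate constant (1/yr); L0: methane potential (m^3 CH4 / Mg)
    Returns predicted CH4 generation rate (m^3/yr) for each year,
    summing the decayed contribution of every prior year's waste.
    """
    rates = []
    for year in range(len(acceptance)):
        q = sum(k * L0 * m * math.exp(-k * (year - i))
                for i, m in enumerate(acceptance[:year + 1]))
        rates.append(q)
    return rates

# 100,000 Mg/yr accepted for three years, then site closure
rates = landgem_annual([100_000, 100_000, 100_000, 0, 0])
```

    A methane output score along the lines the abstract describes would then be the relative deviation of the measured rate from the predicted `rates[year]` for the year in question.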

  7. A decision support tool for landfill methane generation and gas collection.

    PubMed

    Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart

    2015-09-01

    This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date there is no such tool available to provide landfill decision makers with clear and simplified information to evaluate biochemical processes within a landfill site, to assess performance of gas production and to identify potential remedies to any issues. The current lack in understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) a methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first-order decay model LandGEM; and (2) a landfill gas indicators score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is provided using multi-criteria analysis to calculate the sum of weighted scores for each indicator. The weights for each indicator are calculated using an analytical hierarchical process. The tool is tested against five real scenarios for landfill sites in the UK with a range of good, average and poor landfill methane generation over a one year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for methane output rate enhancement. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site.

  8. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a
    numerical study was conducted using both the finite-difference, time-domain method and a frequency- wavenumber method. When the propagation velocity in the borehole was greater than th...

  9. Case-Based Pedagogy Using Student-Generated Vignettes: A Pre-Service Intercultural Awareness Tool

    ERIC Educational Resources Information Center

    Cournoyer, Amy

    2010-01-01

    This qualitative study investigated the effectiveness of case-based pedagogy as an instructional tool aimed at increasing cultural awareness and competence in the preparation of 18 pre-service and in-service students enrolled in an Intercultural Education course. Each participant generated a vignette based on an instructional challenge identified…

  10. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning

    ERIC Educational Resources Information Center

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa

    2011-01-01

    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  11. Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations

    NASA Technical Reports Server (NTRS)

    Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy

    2011-01-01

    This paper reviews three Next Generation Air Transportation System (NextGen)-based high fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high-altitude environment utilizing high altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.

  12. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    2014-03-03

    RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.
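    The probabilistic assessment described above can be illustrated with a much-simplified Monte Carlo sketch: sample independent, zero-mean Gaussian forecast errors for load, wind, and solar, and take the quantile at the user-specified confidence level as the additional balancing requirement. The error model and all parameter values are assumptions for illustration, not the tool's actual algorithm:

```python
import random

def balancing_requirement(load_sigma, wind_sigma, solar_sigma,
                          confidence=0.95, n_samples=100_000, seed=1):
    """Monte Carlo estimate of the extra balancing capacity (MW) needed
    to cover combined forecast errors at a given confidence level.
    Assumes independent, zero-mean Gaussian errors - a deliberate
    simplification of the tool's probabilistic algorithm."""
    rng = random.Random(seed)
    totals = sorted(
        rng.gauss(0, load_sigma) + rng.gauss(0, wind_sigma)
        + rng.gauss(0, solar_sigma)
        for _ in range(n_samples))
    idx = min(int(confidence * n_samples), n_samples - 1)
    return totals[idx]

# hypothetical forecast-error standard deviations, in MW
req = balancing_requirement(load_sigma=100, wind_sigma=150, solar_sigma=50)
```

    For independent Gaussians the combined sigma is sqrt(100^2 + 150^2 + 50^2), roughly 187 MW, so the 95th percentile lands near 1.645 x 187, about 308 MW; the sampled estimate should agree closely.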

  13. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.

  14. General application of rapid 3-D digitizing and tool path generation for complex shapes

    SciTech Connect

    Kwok, K.S.; Loucks, C.S.; Driessen, B.J.

    1997-09-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia-developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation and experimental results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm in simulation studies. In actual experiments, a nose cone and a turbine blade were successfully scanned. A complex-shaped turbine blade was successfully scanned and finish-machined using these algorithms.
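    The fuzzified digital filter itself is not reproduced in the abstract; as a crude stand-in for the same job, a sliding-window median filter can reject scan points corrupted by burrs, holes, and overspray before a tool path is fit to the surface:

```python
import statistics

def reject_outliers(points, window=5, tol=2.0):
    """Sliding-window median filter: drop scan points deviating from
    the local median by more than `tol` (same units as the points).
    A simple stand-in for the paper's fuzzified digital filter."""
    half = window // 2
    kept = []
    for i, p in enumerate(points):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        med = statistics.median(points[lo:hi])
        if abs(p - med) <= tol:
            kept.append(p)
    return kept

# smooth surface profile (mm) with two spikes from a burr and a hole
profile = [10.0, 10.1, 10.2, 25.0, 10.3, 10.4, 3.0, 10.5, 10.6]
clean = reject_outliers(profile)
```

    The median is robust to isolated spikes, so both defects are dropped while the smooth profile survives untouched; the fuzzified version reportedly achieves the same effect far more cheaply on dense scan data.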

  15. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  16. Child Custody in Divorce: How Parents Decide.

    ERIC Educational Resources Information Center

    Lowery, Carol R.

    One of the most important (and frequently most difficult) decisions faced by divorcing parents is determining who will have custody of their children. To investigate parental beliefs about the standards used in deciding custody, 12 sets of parents completed a questionnaire and were interviewed. Results showed considerable agreement with the…

  17. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    PubMed

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For an adequate planning of waste management systems, the accurate forecast of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools that provide reliable support for decision-making processes. In this paper, indicators such as number of residents, population age, urban life expectancy, and total municipal solid waste were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition, considering the Iasi, Romania case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction. PMID:27454099
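    An S-curve trend model of the kind found most suitable here can be fit, when the saturation level is assumed known, by linearizing the logistic function and applying ordinary least squares. The data below are synthetic, not the Iasi measurements:

```python
import math

def fit_s_curve(t, y, capacity):
    """Fit a logistic S-curve  y = L / (1 + exp(-k (t - t0)))  with a
    known saturation level L ('capacity') by linearizing:
        ln(L / y - 1) = -k * t + k * t0
    and solving the resulting straight line by ordinary least squares."""
    z = [math.log(capacity / yi - 1) for yi in y]
    n = len(t)
    t_mean = sum(t) / n
    z_mean = sum(z) / n
    slope = (sum((ti - t_mean) * (zi - z_mean) for ti, zi in zip(t, z))
             / sum((ti - t_mean) ** 2 for ti in t))
    intercept = z_mean - slope * t_mean
    k = -slope
    t0 = intercept / k
    return k, t0

# synthetic MSW trend (kt/yr) approaching a 300 kt/yr saturation level
years = [0, 1, 2, 3, 4, 5, 6]
waste = [300 / (1 + math.exp(-0.8 * (t - 3))) for t in years]
k, t0 = fit_s_curve(years, waste, capacity=300)
```

    Because the synthetic series is exactly logistic, the fit recovers the generating parameters (k = 0.8, inflection at year 3); on real data the residuals of this line feed the accuracy measures used to rank trend models.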

  18. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    PubMed

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For an adequate planning of waste management systems, the accurate forecast of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools that provide reliable support for decision-making processes. In this paper, indicators such as number of residents, population age, urban life expectancy, and total municipal solid waste were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition, considering the Iasi, Romania case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction.

  19. Developing Next-Generation Telehealth Tools and Technologies: Patients, Systems, and Data Perspectives

    PubMed Central

    Filart, Rosemarie; Burgess, Lawrence P.; Lee, Insup; Poropatich, Ronald K.

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth? PMID:20043711

  20. Developing next-generation telehealth tools and technologies: patients, systems, and data perspectives.

    PubMed

    Ackerman, Michael J; Filart, Rosemarie; Burgess, Lawrence P; Lee, Insup; Poropatich, Ronald K

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth? PMID:20043711

  1. Method and tool for generating and managing image quality allocations through the design and development process

    NASA Astrophysics Data System (ADS)

    Sparks, Andrew W.; Olson, Craig; Theisen, Michael J.; Addiego, Chris J.; Hutchins, Tiffany G.; Goodman, Timothy D.

    2016-05-01

    Performance models for infrared imaging systems require image quality parameters; optical design engineers need image quality design goals; systems engineers develop image quality allocations to test imaging systems against. It is a challenge to maintain consistency and traceability amongst the various expressions of image quality. We present a method and parametric tool for generating and managing expressions of image quality during the system modeling, requirements specification, design, and testing phases of an imaging system design and development project.
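Allocations like these are commonly rolled up from component contributions to a system-level value by root-sum-square. A minimal Python sketch of that budgeting convention; the function name and the treatment of blur contributions are illustrative assumptions, not the paper's actual tool:

```python
from math import sqrt

def rss_allocation(component_blurs):
    """Roll per-component blur allocations (e.g. in microradians) up to a
    system-level value via root-sum-square, a common image-quality
    budgeting convention (illustrative only; the paper's tool manages
    richer expressions of image quality)."""
    return sqrt(sum(b * b for b in component_blurs))

# Two components allocated 3.0 and 4.0 units of blur roll up to 5.0.
system_blur = rss_allocation([3.0, 4.0])
```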

  2. CHOPCHOP v2: a web tool for the next generation of CRISPR genome engineering.

    PubMed

    Labun, Kornel; Montague, Tessa G; Gagnon, James A; Thyme, Summer B; Valen, Eivind

    2016-07-01

    In just 3 years CRISPR genome editing has transformed biology, and its popularity and potency continue to grow. New CRISPR effectors and rules for locating optimum targets continue to be reported, highlighting the need for computational CRISPR targeting tools to compile these rules and facilitate target selection and design. CHOPCHOP is one of the most widely used web tools for CRISPR- and TALEN-based genome editing. Its overarching principle is to provide an intuitive and powerful tool that can serve both novice and experienced users. In this major update we introduce tools for the next generation of CRISPR advances, including Cpf1 and Cas9 nickases. We support a number of new features that improve the targeting power, usability and efficiency of CHOPCHOP. To increase targeting range and specificity we provide support for custom length sgRNAs, and we evaluate the sequence composition of the whole sgRNA and its surrounding region using models compiled from multiple large-scale studies. These and other new features, coupled with an updated interface for increased usability and support for a continually growing list of organisms, maintain CHOPCHOP as one of the leading tools for CRISPR genome editing. CHOPCHOP v2 can be found at http://chopchop.cbu.uib.no. PMID:27185894
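At the core of target-selection tools like CHOPCHOP is a scan for protospacers adjacent to a PAM. A minimal Python sketch of forward-strand SpCas9 site finding (20-nt protospacer followed by an NGG PAM); this assumes nothing about CHOPCHOP's actual scoring models or implementation:

```python
import re

def find_cas9_targets(seq, guide_len=20):
    """Return (position, protospacer, PAM) tuples for forward-strand
    SpCas9 sites: guide_len bases immediately followed by an NGG PAM.
    Overlapping sites are found via a zero-width lookahead."""
    seq = seq.upper()
    pattern = r'(?=([ACGT]{%d})([ACGT]GG))' % guide_len
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(pattern, seq)]

hits = find_cas9_targets("TTTACCGGTACCGGTTACGTACGTTGCAATGGAA")
```

A full tool would also scan the reverse complement and score each candidate; this sketch only locates PAM-adjacent sites.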

  3. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  4. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  5. Automatic generation of bioinformatics tools for predicting protein–ligand binding sites

    PubMed Central

    Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-01-01

    Motivation: Predictive tools that model protein–ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein–ligand binding predictive tools would be useful. Results: We developed a system for automatically generating protein–ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5–1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. Availability and implementation: The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26545824

  6. CHOPCHOP v2: a web tool for the next generation of CRISPR genome engineering

    PubMed Central

    Labun, Kornel; Montague, Tessa G.; Gagnon, James A.; Thyme, Summer B.; Valen, Eivind

    2016-01-01

    In just 3 years CRISPR genome editing has transformed biology, and its popularity and potency continue to grow. New CRISPR effectors and rules for locating optimum targets continue to be reported, highlighting the need for computational CRISPR targeting tools to compile these rules and facilitate target selection and design. CHOPCHOP is one of the most widely used web tools for CRISPR- and TALEN-based genome editing. Its overarching principle is to provide an intuitive and powerful tool that can serve both novice and experienced users. In this major update we introduce tools for the next generation of CRISPR advances, including Cpf1 and Cas9 nickases. We support a number of new features that improve the targeting power, usability and efficiency of CHOPCHOP. To increase targeting range and specificity we provide support for custom length sgRNAs, and we evaluate the sequence composition of the whole sgRNA and its surrounding region using models compiled from multiple large-scale studies. These and other new features, coupled with an updated interface for increased usability and support for a continually growing list of organisms, maintain CHOPCHOP as one of the leading tools for CRISPR genome editing. CHOPCHOP v2 can be found at http://chopchop.cbu.uib.no PMID:27185894

  7. Automatic generation of conceptual database design tools from data model specifications

    SciTech Connect

    Hong, Shuguang.

    1989-01-01

    The problems faced in the design and implementation of database software systems based on object-oriented data models are similar to those of other software design: difficult, complex, yet redundant effort. Automatic generation of database software systems has been proposed as a solution to these problems. In order to generate database software systems for a variety of object-oriented data models, two critical issues must be addressed: data model specification and software generation. SeaWeed is a software system that automatically generates conceptual database design tools from data model specifications. A meta model has been defined for the specification of a class of object-oriented data models. This meta model provides a set of primitive modeling constructs that can be used to express the semantics, or unique characteristics, of specific data models. Software reusability has been adopted for the software generation. The technique of design reuse is utilized to derive the requirement specification of the software to be generated from data model specifications. The mechanism of code reuse is used to produce the necessary reusable software components. This dissertation presents the research results of SeaWeed, including the meta model, data model specification, a formal representation of design reuse and code reuse, and the software generation paradigm.

  8. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    SciTech Connect

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel; Peake, Tom; Hayes, Colin

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates are then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
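A first-order estimate of this kind can be sketched as building stock times a debris factor. The coefficients and field names below are illustrative assumptions for the sketch, not WEST's actual model:

```python
def estimate_debris_volume(buildings, demolition_fraction, debris_factor=0.33):
    """First-order debris estimate: for each building class,
    gross volume = count * footprint_m2 * stories * storey height,
    reduced by a debris factor and by the fraction slated for
    demolition. All coefficients here are illustrative."""
    total = 0.0
    for b in buildings:
        gross = (b["count"] * b["footprint_m2"] * b["stories"]
                 * b.get("storey_height_m", 3.0))
        total += gross * debris_factor
    return total * demolition_fraction

# 10 two-storey buildings of 200 m2 footprint, half demolished.
vol = estimate_debris_volume(
    [{"count": 10, "footprint_m2": 200, "stories": 2}],
    demolition_fraction=0.5)
```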

  9. Geological applications of automatic grid generation tools for finite elements applied to porous flow modeling

    SciTech Connect

    Gable, C.W.; Trease, H.E.; Cherry, T.A.

    1996-04-01

    The construction of grids that accurately reflect geologic structure and stratigraphy for computational flow and transport models poses a formidable task. Even with a complete understanding of stratigraphy, material properties, boundary and initial conditions, the task of incorporating data into a numerical model can be difficult and time consuming. Furthermore, most tools available for representing complex geologic surfaces and volumes are not designed for producing optimal grids for flow and transport computation. We have developed a modeling tool, GEOMESH, for automating finite element grid generation that maintains the geometric integrity of geologic structure and stratigraphy. The method produces an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. The process of developing a flow and transport model can be divided into three parts: (1) Developing accurate conceptual models inclusive of geologic interpretation, material characterization and construction of a stratigraphic and hydrostratigraphic framework model, (2) Building and initializing computational frameworks; grid generation, boundary and initial conditions, (3) Computational physics models of flow and transport. Process (1) and (3) have received considerable attention whereas (2) has not. This work concentrates on grid generation and its connections to geologic characterization and process modeling. Applications of GEOMESH illustrate grid generation for two dimensional cross sections, three dimensional regional models, and adaptive grid refinement in three dimensions. Examples of grid representation of wells and tunnels with GEOMESH can be found in Cherry et al. The resulting grid can be utilized by unstructured finite element or integrated finite difference models.
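The "optimal (Delaunay)" property mentioned above is the empty-circumcircle criterion: no mesh vertex lies inside the circumcircle of any triangle. A small pure-Python sketch of the classic incircle test used to pick the Delaunay-legal diagonal of a convex quadrilateral (the function names are hypothetical; GEOMESH itself works in 3D with tetrahedra):

```python
def in_circumcircle(a, b, c, d):
    """True if point d lies strictly inside the circumcircle of
    counter-clockwise triangle (a, b, c), via the incircle determinant."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = ((ax * ax + ay * ay) * (bx * cy - cx * by)
           - (bx * bx + by * by) * (ax * cy - cx * ay)
           + (cx * cx + cy * cy) * (ax * by - bx * ay))
    return det > 0

def delaunay_diagonal(quad):
    """Pick the Delaunay-legal diagonal of a convex quad (p0..p3, CCW):
    diagonal (0, 2) is legal iff p3 is outside the circumcircle of
    triangle (p0, p1, p2); otherwise flip to (1, 3)."""
    p0, p1, p2, p3 = quad
    return (0, 2) if not in_circumcircle(p0, p1, p2, p3) else (1, 3)
```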

  10. Simulation Tool to Assess Mechanical and Electrical Stresses on Wind Turbine Generators: Preprint

    SciTech Connect

    Singh, M.; Muljadi, E.; Gevorgian, V.; Jonkman, J.

    2013-10-01

    Wind turbine generators (WTGs) consist of many different components to convert kinetic energy of the wind into electrical energy for end users. Wind energy is harnessed to provide mechanical torque for driving the shaft of the electrical generator. The conversion from wind power to mechanical power is governed by the aerodynamic conversion. The aerodynamic-electrical-conversion efficiency of a WTG is influenced by the efficiency of the blades, the gearbox, the generator, and the power converter. This paper describes the use of MATLAB/Simulink to simulate the electrical and grid-related aspects of a WTG, coupled with the FAST aero-elastic wind turbine computer-aided engineering tool to simulate the aerodynamic and mechanical aspects of a WTG. The combination of the two enables studies involving both electrical and mechanical aspects of a WTG. This digest includes some examples of the capabilities of the FAST and MATLAB coupling, namely the effects of electrical faults on the blade moments.

  11. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.

  12. Mutation based treatment recommendations from next generation sequencing data: a comparison of web tools

    PubMed Central

    Patel, Jaymin M.; Knopf, Joshua; Reiner, Eric; Bossuyt, Veerle; Epstein, Lianne; DiGiovanna, Michael; Chung, Gina; Silber, Andrea; Sanft, Tara; Hofstatter, Erin; Mougalian, Sarah; Abu-Khalaf, Maysa; Platt, James; Shi, Weiwei; Gershkovich, Peter; Hatzis, Christos; Pusztai, Lajos

    2016-01-01

    Interpretation of complex cancer genome data, generated by tumor target profiling platforms, is key for the success of personalized cancer therapy. How to draw therapeutic conclusions from tumor profiling results is not standardized and may vary among commercial and academically-affiliated recommendation tools. We performed targeted sequencing of 315 genes from 75 metastatic breast cancer biopsies using the FoundationOne assay. Results were run through 4 different web tools including the Drug-Gene Interaction Database (DGidb), My Cancer Genome (MCG), Personalized Cancer Therapy (PCT), and cBioPortal, for drug and clinical trial recommendations. These recommendations were compared amongst each other and to those provided by FoundationOne. The identification of a gene as targetable varied across the different recommendation sources. Only 33% of cases had 4 or more sources recommend the same drug for at least one of the usually several altered genes found in tumor biopsies. These results indicate further development and standardization of broadly applicable software tools that assist in our therapeutic interpretation of genomic data is needed. Existing algorithms for data acquisition, integration and interpretation will likely need to incorporate artificial intelligence tools to improve both content and real-time status. PMID:26980737

  13. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  14. Metagenomics: Tools and Insights for Analyzing Next-Generation Sequencing Data Derived from Biodiversity Studies

    PubMed Central

    Oulas, Anastasis; Pavloudi, Christina; Polymenakou, Paraskevi; Pavlopoulos, Georgios A; Papanikolaou, Nikolas; Kotoulas, Georgios; Arvanitidis, Christos; Iliopoulos, Ioannis

    2015-01-01

    Advances in next-generation sequencing (NGS) have allowed significant breakthroughs in microbial ecology studies. This has led to the rapid expansion of research in the field and the establishment of “metagenomics”, often defined as the analysis of DNA from microbial communities in environmental samples without prior need for culturing. Many metagenomics statistical/computational tools and databases have been developed in order to allow the exploitation of the huge influx of data. In this review article, we provide an overview of the sequencing technologies and how they are uniquely suited to various types of metagenomic studies. We focus on the currently available bioinformatics techniques, tools, and methodologies for performing each individual step of a typical metagenomic dataset analysis. We also provide future trends in the field with respect to tools and technologies currently under development. Moreover, we discuss data management, distribution, and integration tools that are capable of performing comparative metagenomic analyses of multiple datasets using well-established databases, as well as commonly used annotation standards. PMID:25983555

  15. Systems biology and "omics" tools: a cooperation for next-generation mycorrhizal studies.

    PubMed

    Salvioli, Alessandra; Bonfante, Paola

    2013-04-01

    Omics tools constitute a powerful means of describing the complexity of plants and soil-borne microorganisms. Next generation sequencing technologies, coupled with emerging systems biology approaches, seem promising to represent a new strategy in the study of plant-microbe interactions. Arbuscular mycorrhizal fungi (AMF) are ubiquitous symbionts of plant roots, that provide their host with many benefits. However, as obligate biotrophs, AMF show a genetic, cellular and physiological complexity that makes the study of their biology as well as their effective agronomical exploitation rather difficult. Here, we speculate that the increasing availability of omics data on mycorrhiza and of computational tools that allow systems biology approaches represents a step forward in the understanding of arbuscular mycorrhizal symbiosis. Furthermore, the application of this study-perspective to agriculturally relevant model plants, such as tomato and rice, will lead to a better in-field exploitation of this beneficial symbiosis in the frame of low-input agriculture.

  16. The Creative task Creator: a tool for the generation of customized, Web-based creativity tasks.

    PubMed

    Pretz, Jean E; Link, John A

    2008-11-01

    This article presents a Web-based tool for the creation of divergent-thinking and open-ended creativity tasks. A Java program generates HTML forms with PHP scripting that run an Alternate Uses Task and/or open-ended response items. Researchers may specify their own instructions, objects, and time limits, or use default settings. Participants can also be prompted to select their best responses to the Alternate Uses Task (Silvia et al., 2008). Minimal programming knowledge is required. The program runs on any server, and responses are recorded in a standard MySQL database. Responses can be scored using the consensual assessment technique (Amabile, 1996) or Torrance's (1998) traditional scoring method. Adoption of this Web-based tool should facilitate creativity research across cultures and access to eminent creators. The Creative Task Creator may be downloaded from the Psychonomic Society's Archive of Norms, Stimuli, and Data, www.psychonomic.org/archive.

  17. ModelMage: a tool for automatic model generation, selection and management.

    PubMed

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML-model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating models the software can automatically fit all these models to the data and provides a ranking for model selection, in case data is available. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine. Thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software. PMID:19425122
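The "leaving out" enumeration described above amounts to generating one candidate model per subset of the user-flagged optional elements. A small Python sketch of that combinatorial step (real ModelMage operates on SBML species, modifiers, and reactions; the names here are illustrative):

```python
from itertools import chain, combinations

def candidate_models(master_reactions, optional):
    """Yield candidate models derived from a master model by removing
    every subset of the user-flagged optional reactions, including the
    empty subset (the master model itself)."""
    optional = list(optional)
    subsets = chain.from_iterable(
        combinations(optional, k) for k in range(len(optional) + 1))
    for removed in subsets:
        yield [r for r in master_reactions if r not in removed]

# A master model with one mandatory and two optional reactions
# yields 2**2 = 4 candidate models.
models = list(candidate_models(["R1", "R2", "R3"], optional=["R2", "R3"]))
```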

  18. Automated protein motif generation in the structure-based protein function prediction tool ProMOL.

    PubMed

    Osipovitch, Mikhail; Lambrecht, Mitchell; Baker, Cameron; Madha, Shariq; Mills, Jeffrey L; Craig, Paul A; Bernstein, Herbert J

    2015-12-01

    ProMOL, a plugin for the PyMOL molecular graphics system, is a structure-based protein function prediction tool. ProMOL includes a set of routines for building motif templates that are used for screening query structures for enzyme active sites. Previously, each motif template was generated manually and required supervision in the optimization of parameters for sensitivity and selectivity. We developed an algorithm and workflow for the automation of motif building and testing routines in ProMOL. The algorithm uses a set of empirically derived parameters for optimization and requires little user intervention. The automated motif generation algorithm was first tested in a performance comparison with a set of manually generated motifs based on identical active sites from the same 112 PDB entries. The two sets of motifs were equally effective in identifying alignments with homologs and in rejecting alignments with unrelated structures. A second set of 296 active site motifs were generated automatically, based on Catalytic Site Atlas entries with literature citations, as an expansion of the library of existing manually generated motif templates. The new motif templates exhibited comparable performance to the existing ones in terms of hit rates against native structures, homologs with the same EC and Pfam designations, and randomly selected unrelated structures with a different EC designation at the first EC digit, as well as in terms of RMSD values obtained from local structural alignments of motifs and query structures. This research is supported by NIH grant GM078077. PMID:26573864
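The RMSD values reported from local structural alignments are the usual root-mean-square deviation over paired atom coordinates. A minimal Python sketch of that metric, assuming the two coordinate sets are already superposed (ProMOL's alignments handle the superposition; no Kabsch fit is shown here):

```python
from math import sqrt

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equal-length lists of
    (x, y, z) atom coordinates, assumed already superposed."""
    if len(coords_a) != len(coords_b):
        raise ValueError("coordinate lists must match in length")
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return sqrt(sq / len(coords_a))

# Two atoms, one displaced by 1.0 along z: RMSD = sqrt(1/2).
r = rmsd([(0, 0, 0), (1, 0, 0)], [(0, 0, 0), (1, 0, 1)])
```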

  19. Face acquisition camera design using the NV-IPM image generation tool

    NASA Astrophysics Data System (ADS)

    Howell, Christopher L.; Choi, Hee-Sue; Reynolds, Joseph P.

    2015-05-01

    In this paper, we demonstrate the utility of the Night Vision Integrated Performance Model (NV-IPM) image generation tool by using it to create a database of face images with controlled degradations. Available face recognition algorithms can then be used to directly evaluate camera designs using these degraded images. By controlling camera effects such as blur, noise, and sampling, we can analyze algorithm performance and establish a more complete performance standard for face acquisition cameras. The ability to accurately simulate imagery and directly test with algorithms not only improves the system design process but greatly reduces development cost.
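The controlled blur and noise degradations described above can be illustrated with a toy filter. The sketch below applies a 3x3 box blur plus additive Gaussian noise to a 2D intensity array; this is purely illustrative, as NV-IPM uses physics-based MTF and noise models rather than this simplification:

```python
import random

def degrade(image, noise_sigma=0.0, seed=0):
    """Toy image degradation: 3x3 box blur (mean filter, edge-clamped
    neighborhoods) followed by additive Gaussian noise. Illustrative
    only; not NV-IPM's actual image generation."""
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [image[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals) + rng.gauss(0.0, noise_sigma)
    return out

# A single bright pixel spreads into its neighborhood.
blurred = degrade([[0, 0, 0], [0, 9, 0], [0, 0, 0]])
```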

  20. Synthetic biology in mammalian cells: next generation research tools and therapeutics.

    PubMed

    Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A

    2014-02-01

    Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies.

  1. In vivo recombination as a tool to generate molecular diversity in phage antibody libraries.

    PubMed

    Sblattero, D; Lou, J; Marzari, R; Bradbury, A

    2001-06-01

    The creation of diversity in populations of polypeptides has become an important tool in the derivation of polypeptides with useful characteristics. This requires efficient methods to create diversity coupled with methods to select polypeptides with desired properties. In this review we describe the use of in vivo recombination as a powerful way to generate diversity. The novel principles for the recombination process and several applications of this process for the creation of phage antibody libraries are described. The advantage and disadvantages are discussed and possible future exploitation presented.

  2. Customised 3D Printing: An Innovative Training Tool for the Next Generation of Orbital Surgeons.

    PubMed

    Scawn, Richard L; Foster, Alex; Lee, Bradford W; Kikkawa, Don O; Korn, Bobby S

    2015-01-01

    Additive manufacturing or 3D printing is the process by which three dimensional data fields are translated into real-life physical representations. 3D printers create physical printouts using heated plastics in a layered fashion resulting in a three-dimensional object. We present a technique for creating customised, inexpensive 3D orbit models for use in orbital surgical training using 3D printing technology. These models allow trainee surgeons to perform 'wet-lab' orbital decompressions and simulate upcoming surgeries on orbital models that replicate a patient's bony anatomy. We believe this represents an innovative training tool for the next generation of orbital surgeons.

  3. Synthetic biology in mammalian cells: Next generation research tools and therapeutics

    PubMed Central

    Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A

    2014-01-01

    Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884

  4. The Nuclear Network Generator NETGEN v10.0: A Tool for Nuclear Astrophysics

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Goriely, S.; Jorissen, A.; Takahashi, K.; Arnould, M.

    2011-09-01

    We present an updated release of the Brussels Nuclear Network Generator. NETGEN is a tool to help astrophysicists build nuclear reaction networks by generating tables of rates of light-particle (mostly n, p, α) induced reactions, nucleus-nucleus fusion reactions, and photodisintegrations, as well as β-decays and electron captures on temperature grids specified by the user. Nuclear reaction networks relevant to a large variety of astrophysical situations can be constructed, including Big-Bang nucleosynthesis, stellar hydrostatic and explosive hydrogen-, helium- and later burning phases, as well as the synthesis of heavy nuclides (s-, r-, p-, rp-, α-processes). The latest version, NETGEN v10.0, is available on the ULB-IAA website www.astro.ulb.ac.be/Netgen/form.html.
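A NETGEN-style table gives reaction rates on a user-specified temperature grid; a network code then typically interpolates between grid points in log-log space. A minimal sketch of that interpolation step, with entirely invented placeholder values (not NETGEN data):

```python
import numpy as np

# Hypothetical tabulated rate on a temperature grid, as NETGEN might emit.
# T9 is temperature in units of 10^9 K; rate values below are illustrative only.
T9_grid = np.array([0.01, 0.1, 1.0, 10.0])
rate_grid = np.array([1e-20, 1e-8, 1e-2, 1e2])  # cm^3 s^-1 mol^-1 (made up)

def rate_at(T9):
    """Log-log interpolation of a tabulated rate, as a network code might do."""
    return 10 ** np.interp(np.log10(T9), np.log10(T9_grid), np.log10(rate_grid))

print(rate_at(0.3))  # a value between the 0.1 and 1.0 grid points
```

Log-log interpolation is the usual choice because thermonuclear rates span many orders of magnitude between grid points.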

  5. Decidability problems for meta-R-functions

    SciTech Connect

    Losovik, L.P.; Drobyshev, P.V.

    1995-09-01

    In this article we consider classes of functions defined by R-transformers, which are machines that sequentially process real numbers represented in the binary number system. The class of real functions defined by R-transformers includes all continuous and some discontinuous functions. The closure of this class under superposition produces the wider class of real meta-R-functions. Such functions are defined by finite sequences of R-transformers. Here we examine the class of meta-R-functions and specifically the class of finite meta-R-functions. The latter are defined by finite sequences of finite R-transformers. Decidability of the equivalence problem in the class of functions defined by finite R-transformers, i.e., the class of finite R-functions, is proved elsewhere. Here we generalize this result to the class of finite meta-R-functions. We investigate not only the equivalence problem, but also the monotonicity and continuity problems. The proof is by reduction to the decidable nonemptiness problem for nondeterministic bounded-mode finite transformers with finite-turnaround counters on labeled trees. We also consider the general properties of the class of meta-R-functions.

  6. Gravity Waves Generated by Convection: A New Idealized Model Tool and Direct Validation with Satellite Observations

    NASA Astrophysics Data System (ADS)

    Alexander, M. Joan; Stephan, Claudia

    2015-04-01

    In climate models, gravity waves remain too poorly resolved to be directly modelled. Instead, simplified parameterizations are used to include gravity wave effects on model winds. A few climate models link some of the parameterized waves to convective sources, providing a mechanism for feedback between changes in convection and gravity wave-driven changes in circulation in the tropics and above high-latitude storms. These convective wave parameterizations are based on limited case studies with cloud-resolving models, but they are poorly constrained by observational validation, and tuning parameters have large uncertainties. Our new work distills results from complex, full-physics cloud-resolving model studies to essential variables for gravity wave generation. We use the Weather Research and Forecasting (WRF) model to study relationships between precipitation, latent heating/cooling and other cloud properties and the spectrum of gravity wave momentum flux above midlatitude storm systems. Results show the gravity wave spectrum is surprisingly insensitive to the representation of microphysics in WRF. This is good news for use of these models for gravity wave parameterization development since microphysical properties are a key uncertainty. We further use the full-physics cloud-resolving model as a tool to directly link observed precipitation variability to gravity wave generation. We show that waves in an idealized model forced with radar-observed precipitation can quantitatively reproduce instantaneous satellite-observed features of the gravity wave field above storms, which is a powerful validation of our understanding of waves generated by convection. The idealized model directly links observations of surface precipitation to observed waves in the stratosphere, and the simplicity of the model permits deep/large-area domains for studies of wave-mean flow interactions. This unique validated model tool permits quantitative studies of gravity wave driving of regional

  7. Developing a tool to estimate water withdrawal and consumption in electricity generation in the United States.

    SciTech Connect

    Wu, M.; Peng, J.

    2011-02-24

    Freshwater consumption for electricity generation is projected to increase dramatically in the next couple of decades in the United States. The increased demand is likely to further strain freshwater resources in regions where water has already become scarce. Meanwhile, the automotive industry has stepped up its research, development, and deployment efforts on electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs). Large-scale, escalated production of EVs and PHEVs nationwide would require increased electricity production, and so meeting the water demand becomes an even greater challenge. The goal of this study is to provide a baseline assessment of freshwater use in electricity generation in the United States and at the state level. Freshwater withdrawal and consumption requirements for power generated from fossil, nonfossil, and renewable sources via various technologies and by use of different cooling systems are examined. A data inventory has been developed that compiles data from government statistics, reports, and literature issued by major research institutes. A spreadsheet-based model has been developed to conduct the estimates by means of a transparent and interactive process. The model further allows us to project future water withdrawal and consumption in electricity production under the forecasted increases in demand. This tool is intended to provide decision makers with the means to make a quick comparison among various fuel, technology, and cooling system options. The model output can be used to address water resource sustainability when considering new projects or expansion of existing plants.
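The core of such a spreadsheet model is a multiplication of generation by a withdrawal (or consumption) factor for each fuel/cooling-system combination, summed over the generation mix. A minimal sketch of that calculation; the factors below are hypothetical placeholders, not values from the study's data inventory:

```python
# Illustrative withdrawal factors in gallons per MWh, keyed by
# (fuel, cooling system). All numbers are made-up placeholders.
WITHDRAWAL_GAL_PER_MWH = {
    ("coal", "once-through"): 27000.0,
    ("coal", "recirculating"): 600.0,
    ("nuclear", "once-through"): 44000.0,
    ("wind", "none"): 0.0,
}

def total_withdrawal(generation_mwh):
    """Sum water withdrawal over a generation mix keyed by (fuel, cooling)."""
    return sum(WITHDRAWAL_GAL_PER_MWH[key] * mwh
               for key, mwh in generation_mwh.items())

mix = {("coal", "recirculating"): 1000.0, ("wind", "none"): 500.0}
print(total_withdrawal(mix))  # 600000.0 gallons
```

The same structure extends to consumption factors and to state-level mixes, which is what makes quick fuel/technology/cooling comparisons possible.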

  8. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    NASA Astrophysics Data System (ADS)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunami, because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of the population and infrastructures on the water body shores and downstream valleys could lead to catastrophic consequences. In order to assess this phenomenon comprehensively together with the induced risks, we have developed a tool which allows the construction of the landslide geometry, and which is able to simulate its propagation, the generation and the propagation of the wave and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is performed by the combination of the two latter sets of equations. The intensity map is based on the criterion of flooding in Switzerland provided by the OFEG and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less-known cases, various failure plane geometries can be automatically built within a given range and thus a multi-scenario approach is used. In any case, less-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi
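The wave-propagation step can be illustrated with a minimal 1-D sketch: the shallow water equations advanced with the Lax-Friedrichs scheme, followed by the depth-times-velocity intensity criterion. This is only a schematic under invented initial conditions; the actual tool is 2-D and includes wet/dry transition handling:

```python
import numpy as np

g = 9.81  # gravitational acceleration, m/s^2

def flux(h, hu):
    """Shallow-water flux F(U) for conserved variables U = (h, hu)."""
    u = np.where(h > 0, hu / np.maximum(h, 1e-12), 0.0)
    return hu, hu * u + 0.5 * g * h**2

def lax_friedrichs_step(h, hu, dx, dt):
    """One Lax-Friedrichs update of the 1-D shallow water equations."""
    f1, f2 = flux(h, hu)
    h_new, hu_new = h.copy(), hu.copy()
    h_new[1:-1] = 0.5 * (h[2:] + h[:-2]) - dt / (2 * dx) * (f1[2:] - f1[:-2])
    hu_new[1:-1] = 0.5 * (hu[2:] + hu[:-2]) - dt / (2 * dx) * (f2[2:] - f2[:-2])
    return h_new, hu_new

# Illustrative dam-break initial condition (a crude stand-in for the
# landslide-generated wave): deeper water on the left half of the domain.
x = np.linspace(0.0, 10.0, 201)
h = np.where(x < 5.0, 2.0, 1.0)
hu = np.zeros_like(x)
for _ in range(50):  # CFL number ~0.44 for these dt, dx
    h, hu = lax_friedrichs_step(h, hu, dx=0.05, dt=0.005)

# Intensity criterion: product of flow depth and velocity magnitude.
intensity = h * np.abs(np.where(h > 0, hu / h, 0.0))
```

The Lax-Friedrichs averaging of neighbouring cells is what stabilizes the scheme near the sharp wave front, at the cost of some numerical diffusion.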

  9. Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools

    PubMed Central

    Ozcan, Aydogan

    2014-01-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550

  10. Tool for Generation of MAC/GMC Representative Unit Cell for CMC/PMC Analysis

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Pineda, Evan J.

    2016-01-01

    This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) 4.0. This tool is especially useful in analyzing ceramic matrix composites (CMCs), where higher fidelity with improved accuracy of local response is needed. The tool, however, can be used for analyzing polymer matrix composites (PMCs) as well. MAC/GMC 4.0 is a composite material and laminate analysis software developed at NASA Glenn Research Center. The software package has been built around the concept of the generalized method of cells (GMC). The computer code is developed with a user friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermomechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that generates a number of different user-defined repeating unit cells (RUCs). In addition, the code has provisions for generation of a MAC/GMC-compatible input text file that can be merged with any MAC/GMC input file tailored to analyze composite materials. Although the primary intention was to address the three different constituents and phases that are usually present in CMCs-namely, fibers, matrix, and interphase-it can be easily modified to address two-phase polymer matrix composite (PMC) materials where an interphase is absent. Currently, the

  11. New generation of medium wattage metal halide lamps and spectroscopic tools for their diagnostics

    NASA Astrophysics Data System (ADS)

    Dunaevsky, A.; Tu, J.; Gibson, R.; Steere, T.; Graham, K.; van der Eyden, J.

    2010-11-01

    A new generation of ceramic metal halide high intensity discharge (HID) lamps has achieved high efficiencies by implementing new design concepts. The shape of the ceramic burner is optimized to withstand high temperatures with minimal thermal stress. Corrosion of the ceramic walls is slowed down via adoption of non-aggressive metal halide chemistry. Light losses over life due to tungsten deposition on the walls are minimized by maintaining a self-cleaning chemical process, known as the tungsten cycle. All these advancements have made the new ceramic metal halide lamps comparable to high pressure sodium lamps for luminous efficacy, life, and maintenance while providing white light with high color rendering. Direct replacement of quartz metal halide lamps and systems results in energy savings of 18 up to 50%. High resolution spectroscopy remains the major non-destructive diagnostic tool for ceramic metal halide lamps. Approaches to reliable measurements of relative partial pressures of the arc species are discussed.

  12. Mobile phones democratize and cultivate next-generation imaging, diagnostics and measurement tools.

    PubMed

    Ozcan, Aydogan

    2014-09-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run, e.g., biomedical tests, and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally.

  13. Numerical study of electromagnetic waves generated by a prototype dielectric logging tool

    USGS Publications Warehouse

    Ellefsen, K.J.; Abraham, J.D.; Wright, D.L.; Mazzella, A.T.

    2004-01-01

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than that in the formation (e.g., an air-filled borehole in the unsaturated zone), only a guided wave propagated along the borehole. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave radiated electromagnetic energy into the formation, causing its amplitude to decrease. When the propagation velocity in the borehole was less than that in the formation (e.g., a water-filled borehole in the saturated zone), both a refracted wave and a guided wave propagated along the borehole. The velocity of the refracted wave equaled the phase velocity of a plane wave in the formation, and the refracted wave preceded the guided wave. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave did not radiate electromagnetic energy into the formation. To analyze traces recorded by the prototype tool during laboratory tests, they were compared to traces calculated with the finite-difference method. The first parts of both the recorded and the calculated traces were similar, indicating that guided and refracted waves indeed propagated along the prototype tool. ?? 2004 Society of Exploration Geophysicists. All rights reserved.

  14. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    SciTech Connect

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan; Saha, Pradip; Loewen, Eric

    2015-10-29

    This report provides a broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.

  15. Recombineering strategies for developing next generation BAC transgenic tools for optogenetics and beyond

    PubMed Central

    Ting, Jonathan T.; Feng, Guoping

    2014-01-01

    The development and application of diverse BAC transgenic rodent lines has enabled rapid progress for precise molecular targeting of genetically-defined cell types in the mammalian central nervous system. These transgenic tools have played a central role in the optogenetic revolution in neuroscience. Indeed, an overwhelming proportion of studies in this field have made use of BAC transgenic Cre driver lines to achieve targeted expression of optogenetic probes in the brain. In addition, several BAC transgenic mouse lines have been established for direct cell-type specific expression of Channelrhodopsin-2 (ChR2). While the benefits of these new tools largely outweigh any accompanying challenges, many available BAC transgenic lines may suffer from confounds due in part to increased gene dosage of one or more “extra” genes contained within the large BAC DNA sequences. Here we discuss this under-appreciated issue and propose strategies for developing the next generation of BAC transgenic lines that are devoid of extra genes. Furthermore, we provide evidence that these strategies are simple, reproducible, and do not disrupt the intended cell-type specific transgene expression patterns for several distinct BAC clones. These strategies may be widely implemented for improved BAC transgenesis across diverse disciplines. PMID:24772073

  16. 34 CFR 85.942 - ED Deciding Official.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false ED Deciding Official. 85.942 Section 85.942 Education Office of the Secretary, Department of Education GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 85.942 ED Deciding Official. The ED Deciding Official is an ED officer who has...

  17. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in management of systems. For monitoring, diagnosis, and/or controlling purposes, circuit or systems diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus status of the system itself can be displayed. The Living Color Frame Maker is user friendly with graphical interfaces, and provides on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette. The contents of the diskette are compressed using the PKWARE archiving tools.

  18. ImageParser: a tool for finite element generation from three-dimensional medical images

    PubMed Central

    Yin, HM; Sun, LZ; Wang, G; Yamada, T; Wang, J; Vannier, MW

    2004-01-01

    Background The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image, partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. Results The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with user-defined segmentation information. PMID:15461787

  19. gLAB - A Fully Software Tool to Generate, Process and Analyze GNSS Signals

    NASA Astrophysics Data System (ADS)

    Dionisio, Cesare; Citterico, Dario; Pirazzi, Gabriele; De Quattro, Nicola; Marracci, Riccardo; Cucchi, Luca; Valdambrini, Nicola; Formaioni, Irene

    2010-08-01

    In this paper the concept of Software Defined Radio (SDR) and its use in modern GNSS receivers is highlighted, demonstrating how software receivers are important in many situations, especially for verification and validation. After a brief introduction to gLab, a fully software, highly modular tool to generate, process and analyze current and future GNSS signals, the different software modules will be described. Demonstrating the wide range of uses of gLab, different practical examples will be briefly overviewed: from the analysis of real data over the experimental GIOVE-B satellite, to the antenna group delay determination or the CN0 estimation under wide dynamic range, etc. gLab is the result of different projects led by Intecs in GNSS SW Radio: the signal generator is the result of the SWAN (Sistemi softWare per Applicazioni di Navigazione) project under Italian Space Agency (ASI) contract, while the analyzer and the processing module have been developed for ESA to verify and validate (V&V) the IOV (In Orbit Validation) Galileo Phase. In this case the GNSS SW RX works in parallel with Test User Receivers (TUR) in order to validate the Signal In Space (SiS). It is remarkable that gLab is the result of over three years of development and approximately one year of test and validation under ESA (European Space Agency) supervision.

  20. Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool

    NASA Astrophysics Data System (ADS)

    Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin

    2016-02-01

    The majority of existing scheduling techniques are based on static demand and deterministic processing time, while most job shop scheduling problems are concerned with dynamic demand and stochastic processing time. As a consequence, the solutions obtained from traditional scheduling techniques are ineffective wherever changes occur to the system. Therefore, this research intends to develop a decision support tool (DST) based on promising artificial intelligence that is able to accommodate the dynamics that regularly occur in job shop scheduling problems. The DST was designed through three phases, i.e. (i) look-up table generation, (ii) inverse model development and (iii) integration of DST components. This paper reports the generation of look-up tables for various scenarios as a part of the development of the DST. A discrete event simulation model was used to compare the performance among SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax Problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measures of various scheduling scenarios and the job order requirements will be mapped using an ANN inverse model.
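One look-up table entry boils down to: simulate a scenario under each dispatching rule, then record which rule wins on a given performance measure. A minimal single-machine sketch (the paper uses a full discrete-event job shop; the jobs and the two rules below are illustrative):

```python
def mean_flow_time(jobs, rule):
    """Mean flow time on one machine; jobs are (arrival, processing) pairs."""
    pending = sorted(jobs)
    queue, flows, t, i = [], [], 0.0, 0
    while queue or i < len(pending):
        while i < len(pending) and pending[i][0] <= t:
            queue.append(pending[i]); i += 1   # admit jobs that have arrived
        if not queue:
            t = pending[i][0]                   # machine idle: jump ahead
            continue
        # SPT picks the shortest processing time, FCFS the earliest arrival
        key = (lambda j: j[1]) if rule == "SPT" else (lambda j: j[0])
        job = min(queue, key=key)
        queue.remove(job)
        t += job[1]                             # process the selected job
        flows.append(t - job[0])                # flow time = completion - arrival
    return sum(flows) / len(flows)

jobs = [(0.0, 5.0), (1.0, 3.0), (2.0, 1.0)]
# One look-up table row: scenario -> best rule by mean flow time.
best = min(("SPT", "FCFS"), key=lambda r: mean_flow_time(jobs, r))
print(best)  # SPT
```

Repeating this over grids of inter-arrival times, due-date tightness and setup ratios, and over all five rules and three measures, yields the kind of look-up tables the paper compiles.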

  1. Humans and Insects Decide in Similar Ways

    PubMed Central

    Louâpre, Philippe; van Alphen, Jacques J. M.; Pierre, Jean-Sébastien

    2010-01-01

    Behavioral ecologists assume that animals use a motivational mechanism for decisions such as action selection and time allocation, allowing the maximization of their fitness. They consider both the proximate and ultimate causes of behavior in order to understand this type of decision-making in animals. Experimental psychologists and neuroeconomists also study how agents make decisions but they consider the proximate causes of the behavior. In the case of patch-leaving, motivation-based decision-making remains simple speculation. In contrast to other animals, human beings can assess and evaluate their own motivation by an introspection process. It is then possible to study the declared motivation of humans during decision-making and discuss the mechanism used as well as its evolutionary significance. In this study, we combine both the proximate and ultimate causes of behavior for a better understanding of the human decision-making process. We show for the first time ever that human subjects use a motivational mechanism similar to small insects such as parasitoids [1] and bumblebees [2] to decide when to leave a patch. This result is relevant for behavioral ecologists as it supports the biological realism of this mechanism. Humans seem to use a motivational mechanism of decision making known to be adaptive to a heterogeneously distributed resource. As hypothesized by Hutchinson et al. [3] and Wilke and Todd [4], our results are consistent with the evolutionary shaping of decision making because hominoids were hunters and gatherers on food patches for more than two million years. We discuss the plausibility of a neural basis for the motivation mechanism highlighted here, bridging the gap between behavioral ecology and neuroeconomy. Thus, both the motivational mechanism observed here and the neuroeconomy findings are most likely adaptations that were selected for during ancestral times. PMID:21170378

  2. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damages depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take into account different biological conditions. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in the Monte Carlo simulation. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels representing heterochromatin and euchromatin. The fibres were imported in a Geant4 application where computations were performed to estimate the influence of the DNA compaction on the amount of calculated DNA damage. The relative difference of the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, then the relative difference is more important for low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular levels.

  3. Recent advances in i-Gene tools and analysis: microarrays, next generation sequencing and mass spectrometry.

    PubMed

    Moorhouse, Michael J; Sharma, Hari S

    2011-08-01

    Recent advances in technology and associated methodology have made the current period one of the most exciting in molecular biology and medicine. Underlying these is an appreciation that modern research is driven by increasingly large amounts of data being interpreted by interdisciplinary collaborative teams which are often geographically dispersed. The availability of cheap computing power, high speed informatics networks and high quality analysis software has been essential to this, as has the application of modern quality assurance methodologies. In this review, we discuss the application of modern 'High-Throughput' molecular biological technologies such as 'Microarrays' and 'Next Generation Sequencing' to scientific and biomedical research as we have observed. Furthermore, we offer guidance that enables the reader to understand certain features of these technologies as well as new strategies, and helps them to apply these i-Gene tools successfully in their endeavours. Collectively, we term this 'i-Gene Analysis'. We also offer predictions as to the developments that are anticipated in the near and more distant future.

  4. Click chemistry generated model DNA-peptide heteroconjugates as tools for mass spectrometry.

    PubMed

    Flett, Fiona J; Walton, Jeffrey G A; Mackay, C Logan; Interthal, Heidrun

    2015-10-01

    UV cross-linking of nucleic acids to proteins in combination with mass spectrometry is a powerful technique to identify proteins, peptides, and the amino acids involved in intermolecular interactions within nucleic acid-protein complexes. However, the mass spectrometric identification of cross-linked nucleic acid-protein heteroconjugates in complex mixtures and MS/MS characterization of the specific sites of cross-linking is extremely challenging. As a tool for the optimization of sample preparation, ionization, fragmentation, and detection by mass spectrometry, novel synthetic DNA-peptide heteroconjugates were generated to act as mimics of UV cross-linked heteroconjugates. Click chemistry was employed to cross-link peptides to DNA oligonucleotides. These heteroconjugates were fully characterized by high resolution FTICR mass spectrometry and by collision-induced dissociation (CID) following nuclease P1 digestion of the DNA moiety to a single nucleotide monophosphate. This allowed the exact site of the cross-linking within the peptide to be unambiguously assigned. These synthetic DNA-peptide heteroconjugates have the potential to be of use for a variety of applications that involve DNA-peptide heteroconjugates.

  5. An accurate tool for the fast generation of dark matter halo catalogues

    NASA Astrophysics Data System (ADS)

    Monaco, P.; Sefusatti, E.; Borgani, S.; Crocce, M.; Fosalba, P.; Sheth, R. K.; Theuns, T.

    2013-08-01

    We present a new parallel implementation of the PINpointing Orbit Crossing-Collapsed HIerarchical Objects (PINOCCHIO) algorithm, a quick tool, based on Lagrangian Perturbation Theory, for the hierarchical build-up of dark matter (DM) haloes in cosmological volumes. To assess its ability to predict halo correlations on large scales, we compare its results with those of an N-body simulation of a 3 h⁻¹ Gpc box sampled with 2048³ particles taken from the MICE suite, matching the same seeds for the initial conditions. Thanks to the Fastest Fourier Transforms in the West (FFTW) libraries and to the relatively simple design, the code shows very good scaling properties. The CPU time required by PINOCCHIO is a tiny fraction (~1/2000) of that required by the MICE simulation. Varying some of PINOCCHIO's numerical parameters allows one to produce a universal mass function that lies in the range allowed by published fits, although it underestimates the MICE mass function of Friends-of-Friends (FoF) haloes in the high-mass tail. We compare the matter-halo and the halo-halo power spectra with those of the MICE simulation and find that these two-point statistics are well recovered on large scales. In particular, when catalogues are matched in number density, agreement within 10 per cent is achieved for the halo power spectrum. At scales k > 0.1 h Mpc⁻¹, the inaccuracy of the Zel'dovich approximation in locating halo positions causes an underestimate of the power spectrum that can be modelled as a Gaussian factor with a damping scale of d = 3 h⁻¹ Mpc at z = 0, decreasing at higher redshift. Finally, a remarkable match is obtained for the reduced halo bispectrum, showing a good description of non-linear halo bias. Our results demonstrate the potential of PINOCCHIO as an accurate and flexible tool for generating large ensembles of mock galaxy surveys, with interesting applications for the analysis of large galaxy redshift surveys.

  6. Use of a Best Estimate Power Monitoring Tool to Maximize Power Plant Generation

    SciTech Connect

    Dziuba, Lindsey L.

    2006-07-01

    The Best Estimate Power Monitor (BEPM) is a tool that was developed to maximize nuclear power plant generation, while ensuring regulatory compliance in the face of venturi fouling, industry ultra-sonic flowmeter issues and other technical challenges. The BEPM uses ASME-approved 'best estimate' methodology described in PTC 19.1-1985, 'Measurement Uncertainty', Section 3.8, 'Weighting Method'. The BEPM method utilizes many different and independent indicators of core thermal power and independently computes the core thermal power (CTP) from each parameter. The uncertainty of each measurement is used to weight the results of the best estimate computation of CTP such that those with lower uncertainties are weighted more heavily in the computed result. The independence of these measurements is used to minimize the uncertainty of the aggregate result, and the overall uncertainty can be much lower than the uncertainties of any of the individual measured parameters. Examples of the Balance of Plant parameters used in the BEPM are turbine first stage pressure, venturi feedwater flow, condensate flow, main steam flow, high pressure turbine exhaust pressure, low pressure turbine inlet pressure, the two highest pressure feedwater heater extraction pressures, and final feedwater temperature. The BEPM typically makes use of installed plant instrumentation that provides data to the plant computer. Therefore, little or no plant modification is required. In order to compute core thermal power from the independent indicators, a set of baseline data is used for comparison. These baseline conditions are taken from a day when confidence in the value of core thermal power is high (i.e., immediately post outage when venturi fouling is not an issue or from a formal tracer test). This provides the reference point on which to base the core thermal power calculations for each of the independent parameters.
The BEPM is effective only at the upper end of the power range, where the independent
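
    The weighting scheme described above is standard inverse-variance combination: each independent estimate is weighted by the reciprocal of its squared uncertainty, and the aggregate uncertainty is smaller than any individual one. A minimal sketch, assuming independent Gaussian errors; the power values and uncertainties below are hypothetical:

```python
import math

def best_estimate(values, uncertainties):
    """Inverse-variance weighted combination of independent estimates.

    Estimates with lower uncertainty get higher weight; the combined
    uncertainty 1/sqrt(sum of weights) is below every individual one.
    """
    weights = [1.0 / u ** 2 for u in uncertainties]
    total = sum(weights)
    estimate = sum(w * v for w, v in zip(weights, values)) / total
    combined_uncertainty = 1.0 / math.sqrt(total)
    return estimate, combined_uncertainty

# Three hypothetical, independent core-thermal-power indicators (MWt)
# with their one-sigma uncertainties:
power, sigma = best_estimate([3410.0, 3402.0, 3398.0], [10.0, 5.0, 8.0])
```

Note how the combined sigma (about 3.9 MWt here) is smaller than the best individual uncertainty (5 MWt), which is the point of the method.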

  7. Tools for Generating Useful Time-series Data from PhenoCam Images

    NASA Astrophysics Data System (ADS)

    Milliman, T. E.; Friedl, M. A.; Frolking, S.; Hufkens, K.; Klosterman, S.; Richardson, A. D.; Toomey, M. P.

    2012-12-01

    The PhenoCam project (http://phenocam.unh.edu/) is tasked with acquiring, processing, and archiving digital repeat photography to be used for scientific studies of vegetation phenological processes. Over the past 5 years the PhenoCam project has collected over 2 million time series images, for a total of over 700 GB of image data. Several papers have been published describing derived "vegetation indices" (such as green-chromatic-coordinate or gcc) which can be compared to standard measures such as NDVI or EVI. Imagery from our archive is available for download, but converting series of images for a particular camera into useful scientific data, while simple in principle, is complicated by a variety of factors. Cameras are often exposed to harsh weather conditions (high wind, rain, ice, snow pile up), which result in images where the field of view (FOV) is partially obscured or completely blocked for periods of time. The FOV can also change for other reasons (mount failures, tower maintenance, etc.). Some of the relatively inexpensive cameras that are being used can also temporarily lose color balance or exposure controls, resulting in loss of imagery. All these factors negatively influence the automated analysis of the image time series, making this a non-trivial task. Here we discuss the challenges of processing PhenoCam image time-series for vegetation monitoring and the associated data management tasks. We describe our current processing framework and a simple standardized output format for the resulting time-series data. The time-series data in this format will be generated for specific "regions of interest" (ROI's) for each of the cameras in the PhenoCam network. This standardized output (which will be updated daily) can be considered 'the pulse' of a particular camera and will provide a default phenological dynamic for said camera. The time-series data can also be viewed as a higher level product which can be used to generate "vegetation indices", like gcc, for
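
    The gcc index mentioned above has a simple standard definition: the green digital number divided by the sum of the red, green, and blue digital numbers, averaged over a region of interest. A minimal sketch; the `pixels` mapping and `roi` list are hypothetical data structures, not the PhenoCam processing framework itself:

```python
def roi_gcc(pixels, roi):
    """Mean green chromatic coordinate, gcc = G / (R + G + B), over an ROI.

    `pixels` maps (row, col) -> (R, G, B) digital numbers; `roi` is an
    iterable of (row, col) positions to include.  Fully dark pixels
    (e.g. an obscured field of view) are skipped.
    """
    vals = []
    for pos in roi:
        r, g, b = pixels[pos]
        total = r + g + b
        if total:
            vals.append(g / total)
    return sum(vals) / len(vals) if vals else None

# Hypothetical 2x2 image; the ROI covers the two left-hand pixels.
pixels = {(0, 0): (60, 120, 60), (0, 1): (0, 0, 0),
          (1, 0): (50, 100, 50), (1, 1): (80, 90, 70)}
mean_gcc = roi_gcc(pixels, [(0, 0), (1, 0)])
```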

  8. Re-Imagining Specialized STEM Academies: Igniting and Nurturing "Decidedly Different Minds", by Design

    ERIC Educational Resources Information Center

    Marshall, Stephanie Pace

    2010-01-01

    This article offers a personal vision and conceptual design for reimagining specialized science, technology, engineering, and mathematics (STEM) academies designed to nurture "decidedly different" STEM minds and ignite a new generation of global STEM talent, innovation, and entrepreneurial leadership. This design enables students to engage…

  9. HAPCAD: An open-source tool to detect PCR crossovers in next-generation sequencing generated HLA data.

    PubMed

    McDevitt, Shana L; Bredeson, Jessen V; Roy, Scott W; Lane, Julie A; Noble, Janelle A

    2016-03-01

    Next-generation sequencing (NGS) based HLA genotyping can generate PCR artifacts corresponding to IMGT/HLA Database alleles, for which multiple examples have been observed, including sequence corresponding to the HLA-DRB1(∗)03:42 allele. Repeat genotyping of 131 samples, previously genotyped as DRB1(∗)03:01 homozygotes using probe-based methods, resulted in the heterozygous call DRB1(∗)03:01+DRB1(∗)03:42. The apparent rare DRB1(∗)03:42 allele is hypothesized to be a "hybrid amplicon" generated by PCR crossover, a process in which a partial PCR product denatures from its template, anneals to a different allele template, and extends to completion. Unlike most PCR crossover products, "hybrid amplicons" always correspond to an IMGT/HLA Database allele, necessitating a case-by-case analysis of whether their occurrence reflects the actual allele or is simply the result of PCR crossover. The Hybrid Amplicon/PCR Crossover Artifact Detector (HAPCAD) program mimics jumping PCR in silico and flags allele sequences that may also be generated as hybrid amplicons.
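
    The jumping-PCR idea can be sketched in a few lines: a single template switch joins a prefix of one allele to the suffix of the other, and any database allele identical to such a chimera is a potential artifact. This is a toy illustration of the concept, not HAPCAD's actual algorithm; the function names and the equal-length amplicon assumption are ours:

```python
def hybrid_amplicons(allele_a, allele_b):
    """All sequences a single template switch could generate: a prefix
    of one allele joined to the suffix of the other, at every position."""
    n = min(len(allele_a), len(allele_b))
    hybrids = set()
    for i in range(1, n):
        hybrids.add(allele_a[:i] + allele_b[i:])
        hybrids.add(allele_b[:i] + allele_a[i:])
    return hybrids

def flag_possible_artifacts(genotype, database):
    """Database alleles that PCR crossover between the genotyped pair
    could mimic; such calls need case-by-case review."""
    a, b = genotype
    chimeras = hybrid_amplicons(a, b)
    return {name for name, seq in database.items() if seq in chimeras}
```

For example, with toy alleles "AAAA" and "TTTT", a database entry "AATT" would be flagged, while "GGGG" would not.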

  10. Future generations of horizontal tools will make tighter turns and last longer

    SciTech Connect

    Lyle, D.

    1995-10-01

    Operators want horizontal tools that turn tighter and last longer, and manufacturers are working to meet the need. An operator needs control of tools in the hole to drill a good horizontal well, and service and supply companies are trying to improve that control.

  11. 45 CFR 681.33 - How is the case decided?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 3 2013-10-01 2013-10-01 false How is the case decided? 681.33 Section 681.33... FRAUD CIVIL REMEDIES ACT REGULATIONS Decisions and Appeals § 681.33 How is the case decided? (a) The ALJ... or statements identified in the complaint violate this part; and (2) If the defendant is liable...

  12. 45 CFR 681.33 - How is the case decided?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false How is the case decided? 681.33 Section 681.33... FRAUD CIVIL REMEDIES ACT REGULATIONS Decisions and Appeals § 681.33 How is the case decided? (a) The ALJ... or statements identified in the complaint violate this part; and (2) If the defendant is liable...

  13. Deciding not to decide: computational and neural evidence for hidden behavior in sequential choice.

    PubMed

    Gluth, Sebastian; Rieskamp, Jörg; Büchel, Christian

    2013-10-01

    Understanding the cognitive and neural processes that underlie human decision making requires the successful prediction of how, but also of when, people choose. Sequential sampling models (SSMs) have greatly advanced the decision sciences by assuming decisions to emerge from a bounded evidence accumulation process so that response times (RTs) become predictable. Here, we demonstrate a difficulty of SSMs that occurs when people are not forced to respond at once but are allowed to sample information sequentially: The decision maker might decide to delay the choice and terminate the accumulation process temporarily, a scenario not accounted for by the standard SSM approach. We developed several SSMs for predicting RTs from two independent samples of an electroencephalography (EEG) and a functional magnetic resonance imaging (fMRI) study. In these studies, participants bought or rejected fictitious stocks based on sequentially presented cues and were free to respond at any time. Standard SSM implementations did not describe RT distributions adequately. However, by adding a mechanism for postponing decisions to the model we obtained an accurate fit to the data. Time-frequency analysis of EEG data revealed alternating states of decreasing and increasing oscillatory power in beta-band frequencies (14-30 Hz), indicating that responses were repeatedly prepared and inhibited, lending further support to the existence of a decision not to decide. Finally, the extended model accounted for the results of an adapted version of our paradigm in which participants had to press a button for sampling more information. Our results show how computational modeling of decisions and RTs support a deeper understanding of the hidden dynamics in cognition.
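
    A bounded accumulator with an explicit postponement mechanism can be sketched as follows. This is a generic illustration of the idea (a random walk to a bound, where each time step may instead be spent postponing); the parameter values and names are ours, not the authors' fitted model:

```python
import random

def simulate_trial(drift=0.2, bound=3.0, p_postpone=0.3, noise=1.0, seed=None):
    """Bounded evidence accumulation with a 'decide not to decide' state.

    On each time step the agent may postpone (time passes, no evidence
    is sampled); otherwise a noisy evidence increment is accumulated
    until either bound is reached.  Returns (choice, response_time).
    """
    rng = random.Random(seed)
    evidence, rt = 0.0, 0
    while abs(evidence) < bound:
        rt += 1
        if rng.random() < p_postpone:
            continue  # postponement: RT grows without accumulation
        evidence += drift + rng.gauss(0.0, noise)
    return ("buy" if evidence >= bound else "reject"), rt
```

Raising `p_postpone` stretches the RT distribution without changing which bound is eventually hit, which is the qualitative effect the extended model captures.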

  14. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-01

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available.

  17. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation, known as Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is segmentation of CT medical images (DICOM) to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. This tool uses the PAK solver, open source software implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone conduction sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  18. A surface data generation method of optical micro-structure and analysis system for Fast Tool Servo fabricating

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Dai, Yi-fan; Wan, Fei; Wang, Gui-lin

    2010-10-01

    High-precision optical micro-structured components are now widely used in military and civilian applications. Ultraprecision machining with a fast tool servo (FTS) is one of the leading methodologies for fabrication of such surfaces. The first important issue faced in ultra-precision, high-efficiency fabrication is how to properly describe the complex shapes based on the principle of FTS. In order to meet the demands of FTS machining, which requires high-frequency tool response, high data throughput and large memory space, an off-line discrete data point generation method for microstructure surfaces is presented which avoids on-line shape calculation during fabrication. A new analysis software package is developed to compute the speed, acceleration and spectrum over the generated data points, which helps to analyze the tool-tracking characteristics needed in fabricating. Also, a new mechanism for FTS machining data transmission based on a huge-capacity storage device is proposed. Experiments show that the off-line surface data generation method and data transfer mechanism can effectively improve FTS fabricating efficiency, and that the surface analysis software can help to determine the machining ability of the tool-holder and to guide and optimize processing parameters such as spindle speed, feed rate, etc.

  19. Generated spiral bevel gears: Optimal machine-tool settings and tooth contact analysis

    NASA Technical Reports Server (NTRS)

    Litvin, F. L.; Tsung, W. J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed and used to confirm the new generation process. The new process is governed by the requirement that, during generation, the common normal of the contacting generator and generated surfaces of pinion and gear remains constant in direction.

  20. A Study on Tooling and Its Effect on Heat Generation and Mechanical Properties of Welded Joints in Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Tikader, Sujoy; Biswas, Pankaj; Puri, Asit Baran

    2016-06-01

    Friction stir welding (FSW) has become the most attractive solid state welding process, as it offers numerous advantages such as good mechanical and metallurgical properties. Aluminium alloys considered non-weldable by fusion processes, such as the 5XXX and 7XXX series, can be readily joined by this process. In the present study a mathematical model was developed and experiments were successfully performed to evaluate the mechanical properties of FSW on similar aluminium alloys, i.e. AA1100, for different process parameters and mainly two kinds of tool geometry (straight cylindrical, and conical or tapered cylindrical pin with flat shoulder). Tensile strength and micro-hardness of the welded plate samples are reported for different process parameters. It was noticed that in FSW of similar alloys with a tool made of SS-310 tool steel, friction is the major contributor to heat generation. Tool geometry, tool rotational speed, plunging force and traverse speed were seen to have a significant effect on the tensile strength and hardness of friction stir welded joints.

  1. BEST: Next-Generation Biomedical Entity Search Tool for Knowledge Discovery from Biomedical Literature

    PubMed Central

    Lee, Kyubum; Choi, Jaehoon; Kim, Seongsoon; Jeon, Minji; Lim, Sangrak; Choi, Donghee; Kim, Sunkyu; Tan, Aik-Choon

    2016-01-01

    As the volume of publications rapidly increases, searching for relevant information from the literature becomes more challenging. To complement standard search engines such as PubMed, it is desirable to have an advanced search tool that directly returns relevant biomedical entities such as targets, drugs, and mutations rather than a long list of articles. Some existing tools submit a query to PubMed and process retrieved abstracts to extract information at query time, resulting in a slow response time and limited coverage of only a fraction of the PubMed corpus. Other tools preprocess the PubMed corpus to speed up the response time; however, they are not constantly updated, and thus produce outdated results. Further, most existing tools cannot process sophisticated queries such as searches for mutations that co-occur with query terms in the literature. To address these problems, we introduce BEST, a biomedical entity search tool. BEST returns, as a result, a list of 10 different types of biomedical entities including genes, diseases, drugs, targets, transcription factors, miRNAs, and mutations that are relevant to a user’s query. To the best of our knowledge, BEST is the only system that processes free text queries and returns up-to-date results in real time including mutation information in the results. BEST is freely accessible at http://best.korea.ac.kr. PMID:27760149

  2. EDCATS: An Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Heard, Pamala D.

    1998-01-01

    The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system is equipped to provide on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data that are tailored to Marshall Space Flight Center Unique Programs. It also attempts to acquire knowledge on how, why, and where information is derived. As a result, a user will be better prepared to decide which available tool is most feasible for their reports.

  3. BEAT: A Web-Based Boolean Expression Fault-Based Test Case Generation Tool

    ERIC Educational Resources Information Center

    Chen, T. Y.; Grant, D. D.; Lau, M. F.; Ng, S. P.; Vasa, V. R.

    2006-01-01

    BEAT is a Web-based system that generates fault-based test cases from Boolean expressions. It is based on the integration of our several fault-based test case selection strategies. The generated test cases are considered to be fault-based, because they are aiming at the detection of particular faults. For example, when the Boolean expression is in…

  4. ArcCN-Runoff: An ArcGIS tool for generating curve number and runoff maps

    USGS Publications Warehouse

    Zhan, X.; Huang, M.-L.

    2004-01-01

    The development and application of the ArcCN-Runoff tool, an extension of ESRI® ArcGIS software, are reported. This tool can be applied to determine curve numbers and to calculate runoff or infiltration for a rainfall event in a watershed. Implementation of GIS techniques such as dissolving, intersecting, and a curve-number reference table improves efficiency. Technical processing time may be reduced from days, if not weeks, to hours for producing spatially varied curve number and runoff maps. An application example for a watershed in Lyon County and Osage County, Kansas, USA, is presented. © 2004 Elsevier Ltd. All rights reserved.
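
    The curve-number method behind the tool is the standard SCS relation: potential retention S = 1000/CN − 10 (inches), initial abstraction Ia = 0.2 S, and event runoff Q = (P − Ia)² / (P − Ia + S) when rainfall P exceeds Ia. A minimal sketch of the per-polygon calculation (the spatial dissolve/intersect steps are GIS-specific and omitted):

```python
def scs_runoff(p_inches, cn):
    """SCS curve-number runoff depth (inches) for event rainfall P.

    S  = 1000/CN - 10   potential maximum retention
    Ia = 0.2 * S        initial abstraction
    Q  = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0
    """
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s
    if p_inches <= ia:
        return 0.0
    return (p_inches - ia) ** 2 / (p_inches - ia + s)

# For CN = 100 (impervious, S = 0) all rainfall becomes runoff;
# for CN = 80 and P = 4 in, Q = 3.5^2 / (3.5 + 2.5) ≈ 2.04 in.
```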

  5. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. In this context, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point, as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
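
    Step (i) of the pipeline — assigning unique-read counts to annotated microRNAs by exact match, leaving the rest as candidates for prediction — can be sketched as follows. This is our illustration of the idea, not miRanalyzer's implementation; the dictionary formats are assumptions:

```python
def match_reads(read_counts, mirbase):
    """Assign unique-read expression counts to annotated microRNAs.

    read_counts : {sequence: copy_number} from the input file
    mirbase     : {miRNA_name: sequence} of annotated microRNAs
    Returns (known, unmatched); unmatched reads would go on to the
    library-matching and novel-miRNA prediction steps.
    """
    seq_to_name = {seq: name for name, seq in mirbase.items()}
    known, unmatched = {}, {}
    for read, count in read_counts.items():
        name = seq_to_name.get(read)
        if name is None:
            unmatched[read] = count
        else:
            known[name] = known.get(name, 0) + count
    return known, unmatched
```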

  6. Virtual tool mark generation for efficient striation analysis in forensic science

    SciTech Connect

    Ekstrand, Laura

    2012-01-01

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance.
The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles
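
    The core geometric idea — rotate the tip to a candidate angle of attack, project along the direction of tool travel, and keep the leading edge — can be illustrated with a toy point-cloud version. This is our simplification of the concept, not the thesis pipeline (which uses a 3D graphics package on microscope scans); the coordinate convention is an assumption:

```python
import math

def virtual_mark(tip_points, angle_deg):
    """Leading-edge profile of a tip projected along the travel direction.

    tip_points : iterable of (x, y, z), with x the lateral position
    across the mark, y along travel, z depth.  The tip is rotated about
    the lateral axis by angle_deg, then for each lateral position the
    deepest (minimum) rotated z is kept — the edge that scores the plate.
    """
    a = math.radians(angle_deg)
    profile = {}
    for x, y, z in tip_points:
        z_rot = y * math.sin(a) + z * math.cos(a)  # rotation about x-axis
        key = round(x, 3)
        profile[key] = min(profile.get(key, float("inf")), z_rot)
    return [profile[k] for k in sorted(profile)]

# Sweeping angle_deg in 5-degree steps yields the family of virtual
# marks that would then be compared against the scanned evidence mark.
```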

  7. Generating and Analyzing Visual Representations of Conic Sections with the Use of Technological Tools

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Espinosa-Perez, Hugo; Reyes-Rodriguez, Aaron

    2006-01-01

    Technological tools have the potential to offer students the possibility to represent information and relationships embedded in problems and concepts in ways that involve numerical, algebraic, geometric, and visual approaches. In this paper, the authors present and discuss an example in which an initial representation of a mathematical object…

  8. Making Sense of Conceptual Tools in Student-Generated Cases: Student Teachers' Problem-Solving Processes

    ERIC Educational Resources Information Center

    Jahreie, Cecilie Flo

    2010-01-01

    This article examines the way student teachers make sense of conceptual tools when writing cases. In order to understand the problem-solving process, an analysis of the interactions is conducted. The findings show that transforming practical experiences into theoretical reflection is not a straightforward matter. To be able to elaborate on the…

  9. WORD STATISTICS IN THE GENERATION OF SEMANTIC TOOLS FOR INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    STONE, DON C.

    One of the problems in information storage and retrieval systems of technical documents is the interpretation of words used to index documents. Semantic tools, defined as channels for the communication of word meanings between technical experts, document indexers, and searchers, provide one method of dealing with the problem of multiple…

  10. DRIVE Analysis Tool Generates Custom Vehicle Drive Cycles Based on Real-World Data (Fact Sheet)

    SciTech Connect

    Not Available

    2013-04-01

    This fact sheet from the National Renewable Energy Laboratory describes the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) analysis tool, which uses GPS and controller area network data to characterize vehicle operation and produce custom vehicle drive cycles, analyzing thousands of hours of data in a matter of minutes.

  11. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing

    PubMed Central

    2014-01-01

    Background The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. Results Here, we provide a cost effective method for the production of 96-well plate compatible zebrafish orientation tools using a desktop 3D printer. The printed tools enable the positioning and orientation of zebrafish embryos within cavities formed in agarose. Their applicability is demonstrated by acquiring lateral and dorsal views of zebrafish embryos arrayed within microtiter plates using an automated screening microscope. This enables the consistent visualization of morphological phenotypes and reporter gene expression patterns. Conclusions The designs are refined versions of previously demonstrated devices with added functionality and strongly reduced production costs. All corresponding 3D models are freely available and digital design can be easily shared electronically. In combination with the increasingly widespread usage of 3D printers, this provides access to the developed tools to a wide range of zebrafish users. Finally, the design files can serve as templates for other additive and subtractive fabrication methods. PMID:24886511

  12. MAKER2: an annotation pipeline and genome-database management tool for second-generation genome projects

    PubMed Central

    2011-01-01

    Background Second-generation sequencing technologies are precipitating major shifts with regards to what kinds of genomes are being sequenced and how they are annotated. While the first generation of genome projects focused on well-studied model organisms, many of today's projects involve exotic organisms whose genomes are largely terra incognita. This complicates their annotation, because unlike first-generation projects, there are no pre-existing 'gold-standard' gene-models with which to train gene-finders. Improvements in genome assembly and the wide availability of mRNA-seq data are also creating opportunities to update and re-annotate previously published genome annotations. Today's genome projects are thus in need of new genome annotation tools that can meet the challenges and opportunities presented by second-generation sequencing technologies. Results We present MAKER2, a genome annotation and data management tool designed for second-generation genome projects. MAKER2 is a multi-threaded, parallelized application that can process second-generation datasets of virtually any size. We show that MAKER2 can produce accurate annotations for novel genomes where training-data are limited, of low quality or even non-existent. MAKER2 also provides an easy means to use mRNA-seq data to improve annotation quality; and it can use these data to update legacy annotations, significantly improving their quality. We also show that MAKER2 can evaluate the quality of genome annotations, and identify and prioritize problematic annotations for manual review. Conclusions MAKER2 is the first annotation engine specifically designed for second-generation genome projects. MAKER2 scales to datasets of any size, requires little in the way of training data, and can use mRNA-seq data to improve annotation quality. It can also update and manage legacy genome annotation datasets. PMID:22192575

  13. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Astrophysics Data System (ADS)

    Boyer, Jeffrey S.

    1994-11-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs was developed under technological, political, and fiscal constraints which limited their adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or ongoing observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video presentation demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  14. The Three-Generation Pedigree: A Critical Tool in Cancer Genetics Care.

    PubMed

    Mahon, Suzanne M

    2016-09-01

    The family history, a rather low-tech tool, is the backbone of genetic assessment and guides risk assessment and genetic testing decisions. The importance of the pedigree and its application to genetic practice is often overlooked and underestimated. Unfortunately, particularly with electronic health records, standard pedigrees are not routinely constructed. A clear understanding of how pedigrees are employed in clinical oncology practice may lead to improved collection and use of family history data. PMID:27541558

  15. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Technical Reports Server (NTRS)

    Boyer, Jeffrey S.

    1994-01-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs was developed under technological, political, and fiscal constraints which limited their adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or ongoing observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video presentation demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  16. Free Tools and Strategies for the Generation of 3D Finite Element Meshes: Modeling of the Cardiac Structures

    PubMed Central

    Pavarino, E.; Neves, L. A.; Machado, J. M.; de Godoy, M. F.; Shiyou, Y.; Momente, J. C.; Zafalon, G. F. D.; Pinto, A. R.; Valêncio, C. R.

    2013-01-01

    The Finite Element Method (FEM) is a well-known technique that is extensively applied in different areas. Studies using FEM are targeted at improving cardiac ablation procedures. For such simulations, the finite element meshes should account for the size and histological features of the target structures. However, some methods and tools used to generate meshes of human body structures remain limited by insufficiently detailed models, nontrivial preprocessing, or, above all, restrictive conditions of use. In this paper, alternatives are demonstrated for solid modeling and automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context. PMID:23762031

  17. Free Tools and Strategies for the Generation of 3D Finite Element Meshes: Modeling of the Cardiac Structures.

    PubMed

    Pavarino, E; Neves, L A; Machado, J M; de Godoy, M F; Shiyou, Y; Momente, J C; Zafalon, G F D; Pinto, A R; Valêncio, C R

    2013-01-01

    The Finite Element Method (FEM) is a well-known technique that is extensively applied in different areas. Studies using FEM are targeted at improving cardiac ablation procedures. For such simulations, the finite element meshes should account for the size and histological features of the target structures. However, some methods and tools used to generate meshes of human body structures remain limited by insufficiently detailed models, nontrivial preprocessing, or, above all, restrictive conditions of use. In this paper, alternatives are demonstrated for solid modeling and automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context. PMID:23762031

  18. Photography as a Data Generation Tool for Qualitative Inquiry in Education.

    ERIC Educational Resources Information Center

    Cappello, Marva

    This paper discusses the ways in which photography was used for data generation in a 9-month qualitative study on a mixed-age elementary school classroom. Through a review of the research literature in anthropology, sociology, and education, and an analysis of the research data, the usefulness of photography for educational research with young…

  19. Arkose: A Prototype Mechanism and Tool for Collaborative Information Generation and Distillation

    ERIC Educational Resources Information Center

    Nam, Kevin Kyung

    2010-01-01

    The goals of this thesis have been to gain a better understanding of collaborative knowledge sharing and distilling and to build a prototype collaborative system that supports flexible knowledge generation and distillation. To reach these goals, I have conducted two user studies and built two systems. The first system, Arkose 1.0, is a…

  20. Rational protein design: developing next-generation biological therapeutics and nanobiotechnological tools.

    PubMed

    Wilson, Corey J

    2015-01-01

    Proteins are the most functionally diverse macromolecules observed in nature, participating in a broad array of catalytic, biosensing, transport, scaffolding, and regulatory functions. Fittingly, proteins have become one of the most promising nanobiotechnological tools to date, and through the use of recombinant DNA and other laboratory methods we have produced a vast number of biological therapeutics derived from human genes. Our emerging ability to rationally design proteins (e.g., via computational methods) holds the promise of significantly expanding the number and diversity of protein therapies and has opened the gateway to realizing true and uncompromised personalized medicine. In the last decade computational protein design has been transformed from a set of fundamental strategies to stringently test our understanding of the protein structure-function relationship, to practical tools for developing useful biological processes, nano-devices, and novel therapeutics. As protein design strategies improve (i.e., in terms of accuracy and efficiency) clinicians will be able to leverage individual genetic data and biological metrics to develop and deliver personalized protein therapeutics with minimal delay.

  1. Environmental epigenetics: A promising venue for developing next-generation pollution biomonitoring tools in marine invertebrates.

    PubMed

    Suarez-Ulloa, Victoria; Gonzalez-Romero, Rodrigo; Eirin-Lopez, Jose M

    2015-09-15

    Environmental epigenetics investigates the cause-effect relationships between specific environmental factors and the subsequent epigenetic modifications triggering adaptive responses in the cell. Given the dynamic and potentially reversible nature of the different types of epigenetic marks, environmental epigenetics constitutes a promising venue for developing fast and sensitive biomonitoring programs. Indeed, several epigenetic biomarkers have been successfully developed and applied in traditional model organisms (e.g., human and mouse). Nevertheless, the lack of epigenetic knowledge in other ecologically and environmentally relevant organisms has hampered the application of these tools in a broader range of ecosystems, most notably in the marine environment. Fortunately, that scenario is now changing thanks to the growing availability of complete reference genome sequences along with the development of high-throughput DNA sequencing and bioinformatic methods. Altogether, these resources make the epigenetic study of marine organisms (and more specifically marine invertebrates) a reality. By building on this knowledge, the present work provides a timely perspective highlighting the extraordinary potential of environmental epigenetic analyses as a promising source of rapid and sensitive tools for pollution biomonitoring, using marine invertebrates as sentinel organisms. This strategy represents an innovative, groundbreaking approach, improving the conservation and management of natural resources in the oceans.

  2. Protein engineering for metabolic engineering: Current and next-generation tools

    SciTech Connect

    Marcheschi, RJ; Gronenberg, LS; Liao, JC

    2013-04-16

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. We review advances in selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use; produce non-natural amino acids, alcohols, and carboxylic acids; and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes.

  3. Protein engineering for metabolic engineering: current and next-generation tools

    PubMed Central

    Marcheschi, Ryan J.; Gronenberg, Luisa S.; Liao, James C.

    2014-01-01

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. This article reviews advances in selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use, produce non-natural amino acids, alcohols, and carboxylic acids, and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes. PMID:23589443

  4. Generation of histo-anatomically representative models of the individual heart: tools and application

    PubMed Central

    Plank, Gernot; Burton, Rebecca A. B.; Hales, Patrick; Bishop, Martin; Mansoori, Tahir; Bernabeu, Miguel; Garny, Alan; Prassl, Anton J.; Bollensdorff, Christian; Mason, Fleur; Mahmood, Fahd; Rodriguez, Blanca; Grau, Vicente; Schneider, Jürgen E.; Gavaghan, David; Kohl, Peter

    2010-01-01

    This paper presents methods to build histo-anatomically detailed individualised cardiac models. The models are based on high-resolution 3D anatomical and/or diffusion tensor magnetic resonance images, combined with serial histological sectioning data, and are used to investigate individualised cardiac function. The current state-of-the-art is reviewed, and its limitations are discussed. We assess the challenges associated with the generation of histo-anatomically representative individualised in-silico models of the heart. The entire processing pipeline including image acquisition, image processing, mesh generation, model set-up and execution of computer simulations, and the underlying methods are described. The multi-faceted challenges associated with these goals are highlighted, suitable solutions are proposed, and an important application of developed high-resolution structure-function models in elucidating the effect of individual structural heterogeneity upon wavefront dynamics is demonstrated. PMID:19414455

  5. Generation of mouse mutants as tools in dissecting the molecular clock.

    PubMed

    Anand, Sneha N; Edwards, Jessica K; Nolan, Patrick M

    2012-01-01

    Elucidation of the molecular basis of mammalian circadian rhythms has progressed dramatically in recent years through the characterization of mouse mutants. With the implementation of numerous mouse genetics programs, comprehensive sets of mutations in genes affecting circadian output measures have been generated. Although incomplete, existing arrays of mutants have been instrumental in our understanding of how the internal suprachiasmatic nucleus (SCN) clock interacts with the environment and how it conveys its rhythm to remote oscillators. The use of ENU mutagenesis has proven to be a significant contributor, generating mutations leading to subtle and distinct alterations in circadian protein function. In parallel, progress with mouse gene targeting allows one to study gene function in depth by ablating it entirely, in specific tissues at specific times, or by targeting specific functional domains. This has culminated in worldwide efforts to target every gene in the mouse genome allowing researchers to study multiple gene targeting effects systematically.

  6. Next-Generation Sequencing: A Review of Technologies and Tools for Wound Microbiome Research

    PubMed Central

    Hodkinson, Brendan P.; Grice, Elizabeth A.

    2015-01-01

    Significance: The colonization of wounds by specific microbes or communities of microbes may delay healing and/or lead to infection-related complications. Studies of wound-associated microbial communities (microbiomes) to date have primarily relied upon culture-based methods, which are known to have extreme biases and are not reliable for the characterization of microbiomes. Biofilms are very resistant to culture and are therefore especially difficult to study with techniques that remain standard in clinical settings. Recent Advances: Culture-independent approaches employing next-generation DNA sequencing have provided researchers and clinicians a window into wound-associated microbiomes that could not be achieved before and has begun to transform our view of wound-associated biodiversity. Within the past decade, many platforms have arisen for performing this type of sequencing, with various types of applications for microbiome research being possible on each. Critical Issues: Wound care incorporating knowledge of microbiomes gained from next-generation sequencing could guide clinical management and treatments. The purpose of this review is to outline the current platforms, their applications, and the steps necessary to undertake microbiome studies using next-generation sequencing. Future Directions: As DNA sequencing technology progresses, platforms will continue to produce longer reads and more reads per run at lower costs. A major future challenge is to implement these technologies in clinical settings for more precise and rapid identification of wound bioburden. PMID:25566414

  7. Cognitive avionics and watching spaceflight crews think: generation-after-next research tools in functional neuroimaging.

    PubMed

    Genik, Richard J; Green, Christopher C; Graydon, Francis X; Armstrong, Robert E

    2005-06-01

    Confinement and isolation have always confounded the extraordinary endeavor of human spaceflight. Psychosocial health is at the forefront in considering risk factors that imperil missions of 1- to 2-yr duration. Current crewmember selection metrics, restricted to behavioral observation, by definition observe rather than prevent performance degradation and are thus inadequate when preflight training cannot simulate an entire journey. Nascent techniques to monitor functional and task-related cortical neural activity show promise and can be extended to include whole-brain monitoring. Watching spaceflight crews think can reveal the efficiency of training procedures. Moreover, observing subcortical emotion centers may provide early detection of developing neuropsychiatric disorders. The non-invasive functional neuroimaging modalities electroencephalography (EEG), magnetoencephalography (MEG), magnetic resonance imaging (MRI), and near-infrared spectroscopy (NIRS) are detailed, along with highlights of how they may be engineered for spacecraft. Preflight and in-flight applications to crewmember behavioral health from current-generation, next-generation, and generation-after-next neuroscience research studies are also described. The emphasis is on preventing the onset of neuropsychiatric dysfunctions, thus reducing the risk of mission failure due to human error.

  8. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    NASA Astrophysics Data System (ADS)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars now has the ability to map the water column along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real-time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data has the potential to address a number of research areas including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals, and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water-column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter, and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The development of the mid-water tools on this project addressed this problem by using a unified way of storing the water-column data in a generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water-column packets with time-based navigation and attitude, such that downstream in the workflow, the tools will have access to all relevant data of any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize throughput.
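
    The per-file index described above can be sketched in a few lines. The five-byte record header assumed here (one byte of packet type, a four-byte big-endian payload length) is purely illustrative and is not the actual GWC layout; the point is the single sequential scan that records byte offsets by packet type, so that later lookups can seek directly to any packet without replaying the file:

```python
import io
import struct
from collections import defaultdict

def build_index(stream):
    """Scan a packet stream once, recording the byte offset of every packet
    grouped by packet type, to enable fast non-linear lookup later.
    Assumed (hypothetical) record layout: 1-byte type, 4-byte big-endian
    payload length, then the payload itself."""
    index = defaultdict(list)
    while True:
        offset = stream.tell()
        header = stream.read(5)
        if len(header) < 5:          # end of file (or truncated record)
            break
        ptype, length = struct.unpack(">BI", header)
        index[ptype].append(offset)
        stream.seek(length, 1)       # skip payload without reading it
    return index
```

To extract, say, all water-column packets of one type, a reader would then iterate `index[ptype]` and seek straight to each offset, rather than scanning the whole file linearly.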

  9. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    NASA Astrophysics Data System (ADS)

    Rőszer, Tamás; Pintye, Éva; Benkő, Ilona

    2008-12-01

    Transgenic mice can be used either as models of known inherited human diseases or to perform phenotypic tests of genes with unknown function. Some special applications of gene modification require creating a tissue-specific mutation of a given gene. In some cases, however, the gene modification can be lethal in intrauterine life, so the mutated cells must be engrafted in the postnatal period. After total body irradiation, transplantation of bone marrow cells can be a way to introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes, and many metabolic alterations recently linked to leukocyte functions.

  10. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    SciTech Connect

    Rőszer, Tamás; Pintye, Éva; Benkő, Ilona

    2008-12-08

    Transgenic mice can be used either as models of known inherited human diseases or to perform phenotypic tests of genes with unknown function. Some special applications of gene modification require creating a tissue-specific mutation of a given gene. In some cases, however, the gene modification can be lethal in intrauterine life, so the mutated cells must be engrafted in the postnatal period. After total body irradiation, transplantation of bone marrow cells can be a way to introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes, and many metabolic alterations recently linked to leukocyte functions.

  11. PIPEMicroDB: microsatellite database and primer generation tool for pigeonpea genome.

    PubMed

    Sarika; Arora, Vasu; Iquebal, M A; Rai, Anil; Kumar, Dinesh

    2013-01-01

    Molecular markers play a significant role in crop improvement for desirable characteristics, such as high yield and disease resistance, that will benefit the crop in the long term. Pigeonpea (Cajanus cajan L.) is a legume recently sequenced by a global consortium led by ICRISAT (Hyderabad, India) and has been analysed for gene prediction, synteny maps, markers, etc. We present the PIgeonPEa Microsatellite DataBase (PIPEMicroDB) with an automated primer designing tool for the pigeonpea genome, based on chromosome-wise as well as location-wise searches for primers. A total of 123,387 Short Tandem Repeats (STRs) were extracted from the pigeonpea genome, available in the public domain, using the MIcroSAtellite tool (MISA). The database is an online relational database based on a 'three-tier architecture' that catalogues information on microsatellites in MySQL, and a user-friendly interface is developed using PHP. Searches for STRs may be customized by limiting their location on a chromosome as well as the number of markers in that range. This is a novel approach that has not been implemented in any existing marker database. The database has been further appended with Primer3 for primer design of selected markers with left and right flanking regions of up to 500 bp. This enables researchers to select markers of choice at desired intervals over the chromosome. Furthermore, one can use individual STRs of a targeted region of a chromosome to narrow down the location of a gene of interest or linked Quantitative Trait Loci (QTLs). Although this is an in silico approach, marker search based on the characteristics and location of STRs is expected to benefit researchers. Database URL: http://cabindb.iasri.res.in/pigeonpea/ PMID:23396298
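
    As an illustration of the kind of extraction a tool like MISA performs, a minimal perfect-repeat scanner can be written with a back-referencing regular expression. The thresholds below are simplified assumptions (MISA uses per-motif-length minima), and equivalent motifs are not deduplicated as MISA would:

```python
import re

def find_strs(seq, min_repeats=5, max_motif=6):
    """Report perfect short tandem repeats as (start, motif, copy_count)
    for motifs of 1..max_motif bases repeated >= min_repeats times.
    Unlike MISA, overlapping reports for equivalent motifs (e.g. an 'A'
    run also detectable as 'AA') are not merged."""
    hits = []
    for motif_len in range(1, max_motif + 1):
        # Group 1 is the whole repeat tract, group 2 the repeating motif.
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(2), len(m.group(1)) // motif_len))
    return hits
```

The reported tracts (plus flanking sequence) are exactly what a primer-design step such as Primer3 would then take as input.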

  12. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation (Electricore, Inc.)

    SciTech Connect

    Daye, Tony

    2013-09-30

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids with the emphasis on potential applications in day ahead (market) and real time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high penetration solar PV on utility operations is not only limited to control centers, but across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  13. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
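
    The inductive step — turning expert-classified observations into rules — can be illustrated with a deliberately tiny majority-vote rule inducer. The feature name and telemetry values below are hypothetical, and the SDB itself employs a more sophisticated machine learning technique; this sketch only shows the shape of the data flow (classified examples in, rules out):

```python
from collections import Counter, defaultdict

def induce_rules(records, feature):
    """Induce 'if <feature> == <value> then <label>' rules by majority vote
    over expert-classified examples: a minimal stand-in for inductive
    rule generation from a classified data set."""
    by_value = defaultdict(Counter)
    for feats, label in records:
        by_value[feats[feature]][label] += 1
    # For each observed value, keep the most frequent expert label.
    return {value: counts.most_common(1)[0][0] for value, counts in by_value.items()}

# Hypothetical simulator telemetry, classified by a subject matter expert.
classified = [
    ({"pump_temp": "high"}, "fault"),
    ({"pump_temp": "high"}, "fault"),
    ({"pump_temp": "high"}, "nominal"),
    ({"pump_temp": "normal"}, "nominal"),
]
rules = induce_rules(classified, "pump_temp")
```

The resulting rule base can then be consulted like a black-box model of the subject system's observable behavior.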

  14. GelClust: a software tool for gel electrophoresis images analysis and dendrogram generation.

    PubMed

    Khakabimamaghani, Sahand; Najafi, Ali; Ranjbar, Reza; Raam, Monireh

    2013-08-01

    This paper presents GelClust, new software designed for processing gel electrophoresis images and generating the corresponding phylogenetic trees. Unlike most related commercial and non-commercial software, GelClust is very user-friendly and guides the user from image to dendrogram through seven simple steps. Furthermore, the software, which is implemented in the C# programming language under the Windows operating system, is more accurate than similar software with regard to image processing and is the only software able to detect and correct gel 'smile' effects completely automatically. These claims are supported with experiments. PMID:23727299
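
    The dendrogram-generation step in tools of this kind typically rests on agglomerative clustering of a lane-to-lane distance matrix. The following is a minimal single-linkage sketch; the distance measure and linkage method GelClust actually uses are not assumed here, and the merge history returned is the information from which a dendrogram drawing is built:

```python
def single_linkage(dist):
    """Naive single-linkage agglomerative clustering over a symmetric
    distance matrix (list of lists). Returns the merge history as
    (cluster_a, cluster_b, height) tuples in merge order."""
    clusters = {i: {i} for i in range(len(dist))}
    merges = []
    while len(clusters) > 1:
        # Pick the pair of clusters with the smallest single-linkage
        # distance (minimum over all cross-cluster member pairs).
        (a, b), h = min(
            (((a, b), min(dist[i][j] for i in clusters[a] for j in clusters[b]))
             for a in clusters for b in clusters if a < b),
            key=lambda t: t[1],
        )
        clusters[a] |= clusters.pop(b)   # merge b into a
        merges.append((a, b, h))
    return merges
```

For gel lanes, `dist[i][j]` would be a band-pattern dissimilarity between lanes i and j; the heights in the merge history become the junction heights of the dendrogram.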

  16. NASA's Learning Technology Project: Developing Educational Tools for the Next Generation of Explorers

    NASA Astrophysics Data System (ADS)

    Federman, A. N.; Hogan, P. J.

    2003-12-01

    Since 1996, NASA's Learning Technology has pioneered the use of innovative technology to inspire students to pursue careers in STEM (Science, Technology, Engineering and Math). In the past this has included Web sites like Quest and the Observatorium, webcasts and distance learning courses, and even interactive television broadcasts. Our current focus is on development of several mission-oriented software packages, targeted primarily at the middle-school population, but flexible enough to be used by elementary to graduate students. These products include contributions to an open source solar system simulator, a 3D planetary encyclopedia, development of a planetary surface viewer (atlas), and others. Whenever possible these software products are written to be 'open source' and multi-platform, for the widest use and easiest access for developers. Along with the software products, we are developing activities and lesson plans that are tested and used by educators in the classroom. The products are reviewed by professional educators. Together these products constitute the NASA Experiential Platform for learning, in which the tools used by the public are similar to (and in some respects the same as) those used by professional investigators. Efforts are now underway to incorporate actual MODIS and other real-time data uplink capabilities.

  17. DynamiX, numerical tool for design of next-generation x-ray telescopes.

    PubMed

    Chauvin, Maxime; Roques, Jean-Pierre

    2010-07-20

    We present a new code aimed at the simulation of grazing-incidence x-ray telescopes subject to deformations and demonstrate its ability with two test cases: the Simbol-X and the International X-ray Observatory (IXO) missions. The code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, accounting for the x-ray interactions and for the telescope motion and deformation. The simulation produces images and spectra for any telescope configuration using Wolter I mirrors and semiconductor detectors. This numerical tool allows us to study the telescope performance in terms of angular resolution, effective area, and detector efficiency, accounting for the telescope behavior. We have implemented an image reconstruction method based on the measurement of the detector drifts by an optical sensor metrology. Using an accurate metrology, this method allows us to recover the loss of angular resolution induced by the telescope instability. In the framework of the Simbol-X mission, this code was used to study the impacts of the parameters on the telescope performance. In this paper we present detailed performance analysis of Simbol-X, taking into account the satellite motions and the image reconstruction. To illustrate the versatility of the code, we present an additional performance analysis with a particular configuration of IXO.

  18. Mitochondrial DNA methylation as a next-generation biomarker and diagnostic tool.

    PubMed

    Iacobazzi, Vito; Castegna, Alessandra; Infantino, Vittoria; Andria, Generoso

    2013-01-01

    Recent expansion of our knowledge on epigenetic changes strongly suggests that not only nuclear DNA (nDNA), but also mitochondrial DNA (mtDNA) may be subjected to epigenetic modifications related to disease development, environmental exposure, drug treatment and aging. Thus, mtDNA methylation is attracting increasing attention as a potential biomarker for the detection and diagnosis of diseases and the understanding of cellular behavior in particular conditions. In this paper we review current advances in mtDNA methylation studies, with particular attention to the evidence of mtDNA methylation changes in the diseases and physiological conditions investigated so far. Technological advances for the analysis of epigenetic variations are promising tools to provide insights into methylation of mtDNA with similar resolution levels as those reached for nDNA. However, many aspects related to mtDNA methylation are still unclear. More studies are needed to understand whether and how changes in mtDNA methylation patterns, global and gene specific, are associated with diseases or risk factors.

  19. Nematode.net update 2011: addition of data sets and tools featuring next-generation sequencing data.

    PubMed

    Martin, John; Abubucker, Sahar; Heizer, Esley; Taylor, Christina M; Mitreva, Makedonka

    2012-01-01

    Nematode.net (http://nematode.net) has been a publicly available resource for studying nematodes for over a decade. In the past 3 years, we reorganized Nematode.net to provide more user-friendly navigation through the site, a necessity due to the explosion of data from next-generation sequencing platforms. Organism-centric portals containing dynamically generated data are available for over 56 different nematode species. Next-generation data have been added to the various data-mining portals hosted, including NemaBLAST and NemaBrowse. The NemaPath metabolic pathway viewer builds associations using KEGG Orthology (KO) identifiers rather than Enzyme Commission (EC) numbers, providing more accurate and fine-grained descriptions of proteins. Two new features for data analysis and comparative genomics have been added to the site. NemaSNP enables the user to perform population genetics studies in various nematode populations using next-generation sequencing data. HelmCoP (Helminth Control and Prevention), an independent component of Nematode.net, provides an integrated resource for storage, annotation and comparative genomics of helminth genomes to aid in learning more about nematode genomes, as well as drug, pesticide, vaccine and drug target discovery. With this update, Nematode.net will continue to realize its original goal to disseminate diverse bioinformatic data sets and provide analysis tools to the broad scientific community in a useful and user-friendly manner.

  20. Comparison of functional MRI image realignment tools using a computer-generated phantom.

    PubMed

    Morgan, V L; Pickens, D R; Hartmann, S L; Price, R R

    2001-09-01

    This study discusses the development of a computer-generated phantom to compare the effects of image realignment programs on functional MRI (fMRI) pixel activation. The phantom is a whole-head MRI volume with added random noise, activation, and motion. It allows simulation of realistic head motions with controlled areas of activation. Without motion, the phantom shows the effects of realignment on motion-free data sets. Prior to realignment, the phantom illustrates some activation corruption due to motion. Finally, three widely used realignment packages are examined. The results showed that the most accurate algorithms are able to increase specificity through accurate realignment while maintaining sensitivity through effective resampling techniques. In fact, accurate realignment alone is not a powerful indicator of the most effective algorithm in terms of true activation.

  1. PLAN-IT-2: The next generation planning and scheduling tool

    NASA Technical Reports Server (NTRS)

    Eggemeyer, William C.; Cruz, Jennifer W.

    1990-01-01

    PLAN-IT is a scheduling program that has been demonstrated and evaluated in a variety of scheduling domains. The capability enhancements being made for the next generation of PLAN-IT, called PLAN-IT-2, are discussed. PLAN-IT-2 represents a complete rewrite of the original PLAN-IT, incorporating major changes suggested by the application experiences with the original. Among the enhancements described are: additional types of constraints, such as states and resettable-depletables (batteries); dependencies between constraints; multiple levels of activity planning during the scheduling process; pattern constraint searching for opportunities, as opposed to just minimizing the number of conflicts; additional customization features for display and for handling diverse multiple time systems; and a reduction in both the size and the complexity of the knowledge base needed to address different problem domains.
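A "resettable-depletable" resource constraint of the kind the batteries example describes can be sketched as a simple level check: activities deplete the resource, recharge events reset it toward capacity, and the schedule conflicts wherever the level would go negative. The data model here is hypothetical, not PLAN-IT-2's actual representation:

```python
# Minimal sketch of a resettable-depletable resource (battery) check:
# negative deltas are activity draws, positive deltas are recharges that
# saturate at capacity; any time the level drops below zero is a conflict.

def battery_conflicts(capacity, events):
    """events: list of (time, delta). Returns times where the level goes negative."""
    level = capacity
    conflicts = []
    for time, delta in sorted(events):
        level = min(capacity, level + delta)  # recharge cannot exceed capacity
        if level < 0:
            conflicts.append(time)
    return conflicts

# Hypothetical schedule: two draws, a full recharge, then two more draws
schedule = [(1, -30), (2, -40), (3, +100), (4, -80), (5, -30)]
print(battery_conflicts(100, schedule))  # the draw at t=5 overdepletes
```

A scheduler would use such a check inside its search loop, moving or deferring activities until the conflict list is empty.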

  2. Generative Topographic Mapping (GTM): Universal Tool for Data Visualization, Structure-Activity Modeling and Dataset Comparison.

    PubMed

    Kireeva, N; Baskin, I I; Gaspar, H A; Horvath, D; Marcou, G; Varnek, A

    2012-04-01

    Here, the utility of Generative Topographic Maps (GTM) for data visualization, structure-activity modeling and database comparison is evaluated using subsets of the Database of Useful Decoys (DUD). Unlike other popular dimensionality reduction approaches such as Principal Component Analysis, Sammon Mapping or Self-Organizing Maps, GTMs have the great advantage of providing data probability distribution functions (PDF), both in the high-dimensional space defined by molecular descriptors and in the 2D latent space. PDFs for the molecules of different activity classes were successfully used to build classification models in the framework of the Bayesian approach. Because the PDFs are represented by a mixture of Gaussian functions, the Bhattacharyya kernel has been proposed as a measure of the overlap of datasets, which leads to an elegant method for the global comparison of chemical libraries.
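For two single Gaussians the Bhattacharyya overlap has a closed form, which is the basic building block behind comparing the Gaussian-mixture PDFs a GTM produces. A univariate sketch (the paper's kernel operates on full multivariate mixtures, so this only illustrates the ingredient):

```python
import math

def bhattacharyya_coefficient(mu1, s1, mu2, s2):
    """Closed-form Bhattacharyya coefficient BC = exp(-DB) between two 1-D
    Gaussians N(mu1, s1^2) and N(mu2, s2^2); BC = 1 for identical PDFs
    and decays toward 0 as the distributions separate."""
    var_sum = s1**2 + s2**2
    db = 0.25 * (mu1 - mu2)**2 / var_sum + 0.5 * math.log(var_sum / (2 * s1 * s2))
    return math.exp(-db)

print(bhattacharyya_coefficient(0.0, 1.0, 0.0, 1.0))  # identical PDFs -> 1.0
print(bhattacharyya_coefficient(0.0, 1.0, 3.0, 1.0))  # well separated -> ~0.32
```

Interpreting BC as a kernel between dataset PDFs is what lets two chemical libraries be compared globally by a single overlap score.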

  3. Next-generation sequencing and metagenomic analysis: a universal diagnostic tool in plant virology.

    PubMed

    Adams, Ian P; Glover, Rachel H; Monger, Wendy A; Mumford, Rick; Jackeviciene, Elena; Navalinskiene, Meletele; Samuitiene, Marija; Boonham, Neil

    2009-07-01

    A novel, unbiased approach to plant viral disease diagnosis has been developed which requires no a priori knowledge of the host or pathogen. Next-generation sequencing coupled with metagenomic analysis was used to produce large quantities of cDNA sequence in a model system of tomato infected with Pepino mosaic virus. The method was then applied to a sample of Gomphrena globosa infected with an unknown pathogen originally isolated from the flowering plant Liatris spicata. This plant was found to contain a new cucumovirus, for which we suggest the name 'Gayfeather mild mottle virus'. In both cases, the full viral genome was sequenced. This method expedites the entire process of novel virus discovery, identification, viral genome sequencing and, subsequently, the development of more routine assays for new viral pathogens.

  4. Fractal analysis of experimentally generated pyroclasts: A tool for volcanic hazard assessment

    NASA Astrophysics Data System (ADS)

    Perugini, Diego; Kueppers, Ulrich

    2012-06-01

    Rapid decompression experiments on natural volcanic rocks mimic explosive eruptions. Fragment size distributions (FSD) of such experimentally generated pyroclasts are investigated using fractal geometry. The fractal dimension of fragmentation, D, of FSD is measured for samples from Unzen (Japan) and Popocatépetl (Mexico) volcanoes. Results show that: (i) FSD are fractal and can be quantified by measuring D values; (ii) D increases linearly with potential energy for fragmentation (PEF) and, thus, with increasing applied pressure; (iii) the rate of increase of D with PEF depends on open porosity: the higher the open porosity, the lower the increase of D with PEF; (iv) at comparable open porosity, samples display a similar behavior for any rock composition. The method proposed here has the potential to become a standard routine to estimate eruptive energy of past and recent eruptions using values of D and open porosity, providing an important step towards volcanic hazard assessment.
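A fractal dimension of fragmentation can be estimated from a fragment size distribution by fitting the power law N(>r) ~ r^(-D) on log-log axes. This is an illustrative sketch of that standard fitting procedure, not the authors' exact routine:

```python
import math

def fractal_dimension(sizes):
    """Estimate D from fragment sizes: least-squares slope of
    log N(>=r) against log r, with D = -slope."""
    sizes = sorted(sizes)
    n = len(sizes)
    xs, ys = [], []
    for i, r in enumerate(sizes):
        count = n - i  # number of fragments with size >= r
        xs.append(math.log(r))
        ys.append(math.log(count))
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic power-law fragment sizes constructed so that D = 2 exactly
frags = [(1000 / i) ** 0.5 for i in range(1, 1001)]
print(round(fractal_dimension(frags), 2))  # recovers D = 2.0
```

Real FSD data would typically also require choosing a size range over which the power law holds before fitting.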

  5. Third Harmonic Generation microscopy as a diagnostic tool for the investigation of microglia BV-2 and breast cancer cells activation

    NASA Astrophysics Data System (ADS)

    Gavgiotaki, E.; Filippidis, G.; Psilodimitrakopoulos, S.; Markomanolaki, H.; Kalognomou, M.; Agelaki, S.; Georgoulias, V.; Athanassakis, I.

    2015-07-01

    Nonlinear optical imaging techniques have created new opportunities for research in the biomedical field. Specifically, Third Harmonic Generation (THG) appears to be a suitable noninvasive imaging tool for the delineation and quantification of biological structures at the microscopic level. The aim of this study was to extract information on the activation state of different cell types by using THG imaging microscopy as a diagnostic tool. The BV-2 microglia cell line was used as a representative biological model enabling the study of the resting and activated states of cells linked to various pathological conditions. Third Harmonic Generation (THG) and Two Photon Excitation Fluorescence (TPEF) measurements were collected simultaneously from stained breast cancer cells using a single home-made experimental apparatus, and it was shown that high THG signals arise mostly from lipid bodies. Subsequently, BV-2 microglia cells were examined with or without activation by lipopolysaccharide (LPS) in order to discriminate between control and activated cells based on the quantification of THG signals. Statistical quantification was performed on both the mean area and the mean intensity of the THG signal; mean total area and mean THG intensity both increased in activated versus non-activated cells. Similar quantification studies are underway in breast cancer cells for the exact discrimination of different cell lines. Furthermore, the laser polarization dependence of the SHG and THG signals in unstained biological samples is being investigated.

  6. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronized excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  7. Snooker: a structure-based pharmacophore generation tool applied to class A GPCRs.

    PubMed

    Sanders, Marijn P A; Verhoeven, Stefan; de Graaf, Chris; Roumen, Luc; Vroling, Bas; Nabuurs, Sander B; de Vlieg, Jacob; Klomp, Jan P G

    2011-09-26

    G-protein coupled receptors (GPCRs) are important drug targets for various diseases and of major interest to pharmaceutical companies. The function of individual members of this protein family can be modulated by the binding of small molecules at the extracellular side of the structurally conserved transmembrane (TM) domain. Here, we present Snooker, a structure-based approach to generate pharmacophore hypotheses for compounds binding to this extracellular side of the TM domain. Snooker does not require knowledge of ligands, is therefore suitable for apo-proteins, and can be applied to all receptors of the GPCR protein family. The method comprises the construction of a homology model of the TM domains and prioritization of residues by their probability of being involved in ligand binding. Subsequently, protein properties are converted to ligand space, and pharmacophore features are generated at positions where protein-ligand interactions are likely. Using this semiautomated knowledge-driven bioinformatics approach we have created pharmacophore hypotheses for 15 different GPCRs from several different subfamilies. For the beta-2-adrenergic receptor we show that ligand poses predicted by Snooker pharmacophore hypotheses reproduce literature-supported binding modes for ~75% of compounds fulfilling pharmacophore constraints. All 15 pharmacophore hypotheses represent interactions with essential residues for ligand binding as observed in mutagenesis experiments, and compound selections based on these hypotheses are shown to be target specific. For 8 out of 15 targets, enrichment factors above 10-fold are observed in the top 0.5% ranked compounds in a virtual screen. Additionally, prospectively predicted ligand binding poses in the human dopamine D3 receptor based on Snooker pharmacophores were ranked among the best models in the community-wide GPCR Dock 2010.

  8. An automated graphics tool for comparative genomics: the Coulson plot generator

    PubMed Central

    2013-01-01

    Background Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. Results We devised the Coulson Plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is indicated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable portable document format (PDF) or SVG file. Conclusions CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format making comparisons and identification of patterns significantly clearer. 
While the Coulson plot format is highly useful in

  9. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 2 2012-07-01 2012-07-01 false Appeal Deciding Officer. 215.8 Section 215.8 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES § 215.8 Appeal...

  10. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 2 2011-07-01 2011-07-01 false Appeal Deciding Officer. 215.8 Section 215.8 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES § 215.8 Appeal...

  11. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Appeal Deciding Officer. 215.8 Section 215.8 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES § 215.8 Appeal...

  12. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 2 2014-07-01 2014-07-01 false Appeal Deciding Officer. 215.8 Section 215.8 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES § 215.8 Appeal...

  13. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 2 2013-07-01 2013-07-01 false Appeal Deciding Officer. 215.8 Section 215.8 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES § 215.8 Appeal...

  14. Deciding when It's Time to Buy a New PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    How to best decide when it's time to replace your PC, whether at home or at work, is always tricky. Spending on computers can make you more productive, but it's money you otherwise cannot spend, invest or save, and faster systems always await you in the future. What is clear is that the computer industry really wants you to buy, and the computer…

  15. Deciding in Democracies: A Role for Thinking Skills?

    ERIC Educational Resources Information Center

    Gardner, Peter

    2014-01-01

    In societies that respect our right to decide many things for ourselves, exercising that right can be a source of anxiety. We want to make the right decisions, which is difficult when we are confronted with complex issues that are usually the preserve of specialists. But is help at hand? Are thinking skills the very things that non-specialists…

  16. Project DECIDE. Business Enterprise Approach to Career Exploration. Implementation Handbook.

    ERIC Educational Resources Information Center

    Post, John O., Jr.; And Others

    The purpose of this document is to describe project DECIDE, a business enterprise career exploration program, in the form of an implementation handbook. Chapter 1 presents the major characteristics of the model, which focuses on providing special needs students and regular junior high students the opportunity to improve their personal, social, and…

  17. Consumer Economics, Book I [and] Book II. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Consumer Economics, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  18. Transposon assisted gene insertion technology (TAGIT): a tool for generating fluorescent fusion proteins.

    PubMed

    Gregory, James A; Becker, Eric C; Jung, James; Tuwatananurak, Ida; Pogliano, Kit

    2010-01-01

    We constructed a transposon (transposon assisted gene insertion technology, or TAGIT) that allows the random insertion of gfp (or other genes) into chromosomal loci without disrupting operon structure or regulation. TAGIT is a modified Tn5 transposon that uses Kan(R) to select for insertions on the chromosome or plasmid, beta-galactosidase to identify in-frame gene fusions, and Cre recombinase to excise the kan and lacZ genes in vivo. The resulting gfp insertions maintain target gene reading frame (to the 5' and 3' of gfp) and are integrated at the native chromosomal locus, thereby maintaining native expression signals. Libraries can be screened to identify GFP insertions that maintain target protein function at native expression levels, allowing more trustworthy localization studies. Here we use TAGIT to generate a library of GFP insertions in the Escherichia coli lactose repressor (LacI). We identified fully functional GFP insertions and partially functional insertions that bind DNA but fail to repress the lacZ operon. Several of these latter GFP insertions localize to lacO arrays integrated in the E. coli chromosome without producing the elongated cells frequently observed when functional LacI-GFP fusions are used in chromosome tagging experiments. TAGIT thereby facilitates the isolation of fully functional insertions of fluorescent proteins into target proteins expressed from the native chromosomal locus, as well as potentially useful partially functional proteins. PMID:20090956

  19. Second harmonic generation imaging as a potential tool for staging pregnancy and predicting preterm birth

    PubMed Central

    Akins, Meredith L.; Luby-Phelps, Katherine; Mahendroo, Mala

    2010-01-01

    We use second harmonic generation (SHG) microscopy to assess changes in collagen structure of murine cervix during cervical remodeling of normal pregnancy and in a preterm birth model. Visual inspection of SHG images revealed substantial changes in collagen morphology throughout normal gestation. SHG images collected in both the forward and backward directions were analyzed quantitatively for changes in overall mean intensity, forward to backward intensity ratio, collagen fiber size, and porosity. Changes in mean SHG intensity and intensity ratio take place in early pregnancy, suggesting that submicroscopic changes in collagen fibril size and arrangement occur before macroscopic changes become evident. Fiber size progressively increased from early to late pregnancy, while pores between collagen fibers became larger and farther apart. Analysis of collagen features in premature cervical remodeling show that changes in collagen structure are dissimilar from normal remodeling. The ability to quantify multiple morphological features of collagen that characterize normal cervical remodeling and distinguish abnormal remodeling in preterm birth models supports future studies aimed at development of SHG endoscopic devices for clinical assessment of collagen changes during pregnancy in women and for predicting risk of preterm labor which occurs in 12.5% of all pregnancies. PMID:20459265

  20. Development and implementation of an electronic health record generated surgical handoff and rounding tool.

    PubMed

    Raval, Mehul V; Rust, Laura; Thakkar, Rajan K; Kurtovic, Kelli J; Nwomeh, Benedict C; Besner, Gail E; Kenney, Brian D

    2015-02-01

    Electronic health records (EHR) have been adopted across the nation at tremendous effort and expense. The purpose of this study was to assess improvements in accuracy, efficiency, and patient safety for a high-volume pediatric surgical service with adoption of an EHR-generated handoff and rounding list. The quality and quantity of errors were compared pre- and post-EHR-based list implementation. A survey was used to determine time spent by team members using the two versions of the list. Perceived utility, safety, and quality of the list were reported. Serious safety events determined by the hospital were also compared for the two periods. The EHR-based list eliminated clerical errors while improving efficiency by automatically providing data such as vital signs. Survey respondents reported 43 min saved per week per team member, translating to 372 work hours of time saved annually for a single service. EHR-based list users reported higher satisfaction and perceived improvement in efficiency, accuracy, and safety. Serious safety events remained unchanged. In conclusion, creation of an EHR-based list to assist with daily handoffs, rounding, and patient management demonstrated improved accuracy, increased efficiency, and assisted in maintaining a high level of safety. PMID:25631842

  1. Second harmonic generation imaging as a potential tool for staging pregnancy and predicting preterm birth

    NASA Astrophysics Data System (ADS)

    Akins, Meredith L.; Luby-Phelps, Katherine; Mahendroo, Mala

    2010-03-01

    We use second harmonic generation (SHG) microscopy to assess changes in collagen structure of murine cervix during cervical remodeling of normal pregnancy and in a preterm birth model. Visual inspection of SHG images revealed substantial changes in collagen morphology throughout normal gestation. SHG images collected in both the forward and backward directions were analyzed quantitatively for changes in overall mean intensity, forward to backward intensity ratio, collagen fiber size, and porosity. Changes in mean SHG intensity and intensity ratio take place in early pregnancy, suggesting that submicroscopic changes in collagen fibril size and arrangement occur before macroscopic changes become evident. Fiber size progressively increased from early to late pregnancy, while pores between collagen fibers became larger and farther apart. Analysis of collagen features in premature cervical remodeling show that changes in collagen structure are dissimilar from normal remodeling. The ability to quantify multiple morphological features of collagen that characterize normal cervical remodeling and distinguish abnormal remodeling in preterm birth models supports future studies aimed at development of SHG endoscopic devices for clinical assessment of collagen changes during pregnancy in women and for predicting risk of preterm labor which occurs in 12.5% of all pregnancies.

  2. Configurational Entropy in Ice Nanosystems: Tools for Structure Generation and Screening.

    PubMed

    Parkkinen, P; Riikonen, S; Halonen, L

    2014-03-11

    Recently, a number of experimental and theoretical studies of low-temperature ice and water in nanoscale systems have emerged. Any theoretical study trying to model such systems will encounter the proton-disorder problem, i.e., there exist many configurations differing by water-molecule rotations for a fixed oxygen atom structure. An extensive search within the allowed proton-disorder space should always be performed to ensure a reasonable low-energy isomer and to address the effect of proton-configurational entropy that may affect experimental observables. In the present work, an efficient general-purpose program for finite, semiperiodic, and periodic systems of hydrogen-bonded molecules is presented, which can be used in searching and enumerating the proton-configurational ensemble. Benchmarking tests are performed for ice nanotubes and finite slabs. Finally, the program is applied to experimentally appropriate ice nanosystems. A boron nitride film supported ice nanodot is studied in detail. Using a systematic generation of its proton-configurational ensemble, we find an isomer that is ~1 eV lower in total energy than one previously studied. The present isomer features a considerable dipole moment and implies that ice nanodots are inherently ferroelectric parallel to the surface. We conclude by demonstrating how the so-called hydrogen-bond connectivity parameters can be used to screen low-energy isomers.

  3. repgenHMM: a dynamic programming tool to infer the rules of immune receptor generation from sequence data

    PubMed Central

    Elhanati, Yuval; Marcou, Quentin; Mora, Thierry; Walczak, Aleksandra M.

    2016-01-01

    Motivation: The diversity of the immune repertoire is initially generated by random rearrangements of the receptor gene during early T and B cell development. Rearrangement scenarios are composed of random events—choices of gene templates, base pair deletions and insertions—described by probability distributions. Not all scenarios are equally likely, and the same receptor sequence may be obtained in several different ways. Quantifying the distribution of these rearrangements is an essential baseline for studying the immune system diversity. Inferring the properties of the distributions from receptor sequences is a computationally hard problem, requiring enumerating every possible scenario for every sampled receptor sequence. Results: We present a Hidden Markov model, which accounts for all plausible scenarios that can generate the receptor sequences. We developed and implemented a method based on the Baum–Welch algorithm that can efficiently infer the parameters for the different events of the rearrangement process. We tested our software tool on sequence data for both the alpha and beta chains of the T cell receptor. To test the validity of our algorithm, we also generated synthetic sequences produced by a known model, and confirmed that its parameters could be accurately inferred back from the sequences. The inferred model can be used to generate synthetic sequences, to calculate the probability of generation of any receptor sequence, as well as the theoretical diversity of the repertoire. We estimate this diversity to be ≈10²³ for human T cells. The model gives a baseline to investigate the selection and dynamics of immune repertoires. Availability and implementation: Source code and sample sequence files are available at https://bitbucket.org/yuvalel/repgenhmm/downloads. Contact: elhanati@lpt.ens.fr or tmora@lps.ens.fr or awalczak@lpt.ens.fr PMID:27153709
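
    The core quantity such a tool must evaluate, the probability of a sequence summed over all hidden rearrangement scenarios, is the standard HMM forward computation. A minimal sketch on a toy two-state HMM follows; the numbers are illustrative and the model is generic, not repgenHMM's receptor-generation model:

```python
import numpy as np

def forward_log_prob(obs, start, trans, emit):
    """Log-probability of an observation sequence under an HMM,
    summed over all hidden-state paths (the forward algorithm)."""
    alpha = np.log(start) + np.log(emit[:, obs[0]])
    for o in obs[1:]:
        # log-sum-exp over the previous state for each next state
        alpha = np.logaddexp.reduce(alpha[:, None] + np.log(trans), axis=0)
        alpha += np.log(emit[:, o])
    return np.logaddexp.reduce(alpha)

# Toy two-state model emitting symbols {0, 1}; numbers are illustrative.
start = np.array([0.6, 0.4])                # initial state distribution
trans = np.array([[0.7, 0.3], [0.4, 0.6]])  # trans[i, j] = P(j | i)
emit = np.array([[0.9, 0.1], [0.2, 0.8]])   # emit[i, o]  = P(o | i)
logp = forward_log_prob([0, 1, 0], start, trans, emit)
```

    The Baum–Welch procedure the authors use iterates this forward pass (plus a backward pass) to re-estimate the event probabilities until convergence.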

  4. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing next-generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode, during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Arora, N. S., S. Russell, and E. Sudderth (2013), Bulletin of the Seismological Society of America (BSSA), vol. 103, no. 2A, pp. 709-729.

  5. Generation of Fluorogen-Activating Designed Ankyrin Repeat Proteins (FADAs) as Versatile Sensor Tools.

    PubMed

    Schütz, Marco; Batyuk, Alexander; Klenk, Christoph; Kummer, Lutz; de Picciotto, Seymour; Gülbakan, Basri; Wu, Yufan; Newby, Gregory A; Zosel, Franziska; Schöppe, Jendrik; Sedlák, Erik; Mittl, Peer R E; Zenobi, Renato; Wittrup, K Dane; Plückthun, Andreas

    2016-03-27

    Fluorescent probes constitute a valuable toolbox to address a variety of biological questions and they have become irreplaceable for imaging methods. Commonly, such probes consist of fluorescent proteins or small organic fluorophores coupled to biological molecules of interest. Recently, a novel class of fluorescence-based probes, fluorogen-activating proteins (FAPs), has been reported. These binding proteins are based on antibody single-chain variable fragments and activate fluorogenic dyes, which only become fluorescent upon activation and do not fluoresce when free in solution. Here we present a novel class of fluorogen activators, termed FADAs, based on the very robust designed ankyrin repeat protein scaffold, which also readily folds in the reducing environment of the cytoplasm. The FADA generated in this study was obtained by combined selections with ribosome display and yeast surface display. It enhances the fluorescence of malachite green (MG) dyes by a factor of more than 11,000 and thus activates MG to a similar extent as FAPs based on single-chain variable fragments. As shown by structure determination and in vitro measurements, this FADA was evolved to form a homodimer for the activation of MG dyes. Exploiting the favorable properties of the designed ankyrin repeat protein scaffold, we created a FADA biosensor suitable for imaging of proteins on the cell surface, as well as in the cytosol. Moreover, based on the requirement of dimerization for strong fluorogen activation, a prototype FADA biosensor for in situ detection of a target protein and protein-protein interactions was developed. Therefore, FADAs are versatile fluorescent probes that are easily produced and suitable for diverse applications and thus extend the FAP technology.

  7. Astronomy as a Tool for Training the Next Generation Technical Workforce

    NASA Astrophysics Data System (ADS)

    Romero, V.; Walsh, G.; Ryan, W.; Ryan, E.

    A major challenge for today's institutes of higher learning is training the next generation of scientists, engineers, and optical specialists to be proficient in the latest technologies they will encounter when they enter the workforce. Although research facilities can offer excellent hands-on instructional opportunities, integrating such experiential learning into academic coursework without disrupting normal operations at such facilities can be difficult. Also, motivating entry level students to increase their skill levels by undertaking and successfully completing difficult coursework can require more creative instructional approaches, including fostering a fun, non-threatening environment for enhancing basic abilities. Astronomy is a universally appealing subject area, and can be very effective as a foundation for cultivating advanced competencies. We report on a project underway at the New Mexico Institute of Mining and Technology (NM Tech), a science and engineering school in Socorro, NM, to incorporate a state-of-the-art optical telescope and laboratory experiments into an entry-level course in basic engineering. Students enrolled in an explosive engineering course were given a topical problem in Planetary Astronomy: they were asked to develop a method to energetically mitigate a potentially hazardous impact between our planet and a Near-Earth asteroid to occur sometime in the future. They were first exposed to basic engineering training in the areas of fracture and material response to failure under different environmental conditions through lectures and traditional laboratory exercises. The students were then given access to NM Tech's Magdalena Ridge Observatory's (MRO) 2.4-meter telescope to collect physical characterization data (specifically shape information) on two potentially hazardous asteroids (one roughly spherical, the other an elongated ellipsoid). Finally, the students used NM Tech's Energetic Materials Research and Testing Center (EMRTC) to

  8. Deciding How To Decide: Decision Making in Schools. A Presenter's Guide. Research Based Training for School Administrators. Revised.

    ERIC Educational Resources Information Center

    Oregon Univ., Eugene. Center for Educational Policy and Management.

    This workshop presenter's guide is intended for use by administrators in training one another in the Project Leadership program developed by the Association of California School Administrators (ACSA). The purposes of the guide are: to provide administrators with a framework for deciding when others (particularly subordinates) should participate in…

  9. Generation of a Knockout Mouse Embryonic Stem Cell Line Using a Paired CRISPR/Cas9 Genome Engineering Tool.

    PubMed

    Wettstein, Rahel; Bodak, Maxime; Ciaudo, Constance

    2016-01-01

    CRISPR/Cas9, originally discovered as a bacterial immune system, has recently been engineered into the latest tool to successfully introduce site-specific mutations in a variety of different organisms. Composed only of the Cas9 protein as well as one engineered guide RNA for its functionality, this system is much less complex in its setup and easier to handle than other guided nucleases such as Zinc-finger nucleases or TALENs. Here, we describe the simultaneous transfection of two paired CRISPR sgRNA-Cas9 plasmids in mouse embryonic stem cells (mESCs), resulting in the knockout of the selected target gene. Together with a four-primer evaluation system, it poses an efficient way to generate new independent knockout mouse embryonic stem cell lines.

  10. Generation of an ABCG2{sup GFPn-puro} transgenic line - A tool to study ABCG2 expression in mice

    SciTech Connect

    Orford, Michael; Mean, Richard; Lapathitis, George; Genethliou, Nicholas; Panayiotou, Elena; Panayi, Helen; Malas, Stavros

    2009-06-26

    The ATP-binding cassette (ABC) transporter 2 (ABCG2) is expressed by stem cells in many organs and in stem cells of solid tumors. These cells are isolated based on the side population (SP) phenotype, a Hoechst 33342 dye efflux property believed to be conferred by ABCG2. Because of the limitations of this approach, we generated transgenic mice that express nuclear GFP (GFPn) coupled to the puromycin-resistance gene, under the control of ABCG2 promoter/enhancer sequences. We show that ABCG2 is expressed in neural progenitors of the developing forebrain and spinal cord and in embryonic and adult endothelial cells of the brain. Using the neurosphere assay, we isolated tripotent ABCG2-expressing neural stem cells from embryonic mouse brain. This transgenic line is a powerful tool for studying the expression of ABCG2 in many tissues and for performing functional studies in different experimental settings.

  11. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Abstract Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics
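
    The scoring that gene set enrichment tools of this kind perform can be illustrated with the standard one-sided hypergeometric test. GeneAnalytics' own algorithm is proprietary and more elaborate, so the following is only a generic baseline; the universe, pathway, and query sizes are made-up numbers:

```python
from math import comb

def enrichment_p(N, K, n, k):
    """One-sided hypergeometric p-value: probability of observing k or
    more genes from an annotated set of size K when n query genes are
    drawn at random from a universe of N genes."""
    upper = min(K, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, upper + 1)) / comb(N, n)

# 20,000-gene universe, a 100-gene pathway, and a 50-gene query list
# sharing 5 genes with the pathway: far above chance, tiny p-value.
p = enrichment_p(20000, 100, 50, 5)
```

    Enrichment tools compute such an overlap score per gene set, then rank and correct across the thousands of sets tested.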

  12. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®, the human gene database; the MalaCards, the human diseases database; and the PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  14. A Scalable and Accurate Targeted Gene Assembly Tool (SAT-Assembler) for Next-Generation Sequencing Data

    PubMed Central

    Zhang, Yuan; Sun, Yanni; Cole, James R.

    2014-01-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209

  15. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    PubMed

    Zhang, Yuan; Sun, Yanni; Cole, James R

    2014-08-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209
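
    The overlap-graph construction and traversal that assemblers of this kind rely on can be sketched generically. SAT-Assembler's graph is homology-guided and family-specific, so the toy reads and greedy merging below are purely illustrative:

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of read a that equals a prefix of
    read b, provided it is at least min_len; otherwise 0."""
    for l in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:l]):
            return l
    return 0

def greedy_assemble(reads, min_len=3):
    """Repeatedly merge the pair of reads with the largest overlap.
    A generic toy assembler, not SAT-Assembler's actual method."""
    reads = list(reads)
    while len(reads) > 1:
        a, b, l = max(((x, y, overlap(x, y, min_len))
                       for x in reads for y in reads if x != y),
                      key=lambda t: t[2])
        if l == 0:
            break  # no remaining overlaps: contigs stay fragmented
        reads.remove(a)
        reads.remove(b)
        reads.append(a + b[l:])
    return reads

# Three overlapping toy reads reassemble into a single contig.
contigs = greedy_assemble(["ATGGCGT", "GCGTACC", "TACCTTA"])
```

    The challenges the abstract lists (shared k-mers between homologous genes, uneven coverage) show up here as spurious or missing overlap edges, which is what the family-specific homology search is designed to mitigate.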

  16. MATURE: A Model Driven bAsed Tool to Automatically Generate a langUage That suppoRts CMMI Process Areas spEcification

    NASA Astrophysics Data System (ADS)

    Musat, David; Castaño, Víctor; Calvo-Manzano, Jose A.; Garbajosa, Juan

    Many companies have achieved a higher quality in their processes by using CMMI. Process definition may be efficiently supported by software tools, and a higher level of automation makes it easier to adapt process improvement and assessment activities to customer needs. At present, automation of CMMI is based on tools that support practice definition in a textual way; these tools are often enhanced spreadsheets. In this paper, following the Model Driven Development (MDD) paradigm, we present a tool that supports the automatic generation of a language for specifying the practices of process areas. The generation is performed from a metamodel that represents CMMI. Unlike other available tools, this one can be customized according to user needs. Guidelines to specify the CMMI metamodel are also provided. The paper also shows how this approach can support other assessment methods.

  17. The Society-Deciders Model and Fairness in Nations

    NASA Astrophysics Data System (ADS)

    Flomenbom, Ophir

    2015-05-01

    Modeling the dynamics of nations from economic and sociological perspectives is a central theme in economics and sociology. Accurate models can predict and therefore help all the world's citizens. Yet recent years have shown that current models fall short. Here, we develop a dynamical society-deciders model that can explain the stability of a nation, based on concepts from dynamics, ecology, and socio-econo-physics; a nation has two groups that interconnect, the deciders and the society. We show that a nation is either stable or it collapses, depending on just two coefficients, which we relate to sociological and economic indicators. We define a new socio-economic indicator, fairness. Fairness can measure the stability of a nation and how probable a change favoring the society is. We compute fairness among all the world's nations. Interestingly, in comparison with other indicators, fairness shows that the USA loses its rank among Western democracies, India is the best among the 15 most populated nations, and Egypt, Libya and Tunisia have significantly improved their rankings as a result of recent revolutions, further increasing the probability of additional positive changes. Within the model, long-lasting crises are solved not with increased government spending or cuts, but with regulations that reduce the stability of the deciders, that is, by increasing fairness, for example by shifting wealth toward the people, thereby creating further opportunities.
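
    The stable-versus-collapse dichotomy governed by two coefficients is characteristic of coupled linear dynamics. The sketch below is a generic two-group linear system with assumed unit self-damping, not Flomenbom's actual equations; the coefficients a and b merely stand in for the model's two parameters:

```python
import numpy as np

def nation_is_stable(a, b):
    """Toy two-group dynamics d/dt [society, deciders] = A @ state.
    The state decays back to equilibrium (stable) iff every eigenvalue
    of A has a negative real part; otherwise perturbations grow
    (collapse).  Purely illustrative coupling structure."""
    A = np.array([[-1.0, a], [b, -1.0]])
    return bool(np.all(np.linalg.eigvals(A).real < 0.0))

# Weak mutual coupling is stable; strong coupling destabilises.
print(nation_is_stable(0.5, 0.5), nation_is_stable(2.0, 2.0))  # -> True False
```

    In this toy form the eigenvalues are -1 ± sqrt(ab), so stability holds exactly when the product of the two coupling coefficients is below 1.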

  18. 30 CFR 585.429 - What criteria will BOEM consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criteria in deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee... and fair return considerations. (f) Effects of the lease or grant on generation capacity...

  19. 30 CFR 585.429 - What criteria will BOEM consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... criteria in deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee... and fair return considerations. (f) Effects of the lease or grant on generation capacity...

  20. 30 CFR 585.429 - What criteria will BOEM consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criteria in deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee... and fair return considerations. (f) Effects of the lease or grant on generation capacity...

  1. Evaluation of pulsed laser ablation in liquids generated gold nanoparticles as novel transfection tools: efficiency and cytotoxicity

    NASA Astrophysics Data System (ADS)

    Willenbrock, Saskia; Durán, María. Carolina; Barchanski, Annette; Barcikowski, Stephan; Feige, Karsten; Nolte, Ingo; Murua Escobar, Hugo

    2014-03-01

    Varying transfection efficiencies and cytotoxicity are crucial aspects in cell manipulation. The utilization of gold nanoparticles (AuNP) has lately attracted special interest to enhance transfection efficiency. Conventional AuNP are usually generated by chemical reactions or gas pyrolysis, often requiring cell-toxic stabilizers or coatings to conserve their characteristics. Alternatively, stabilizer- and coating-free, highly pure, colloidal AuNP can be generated by pulsed laser ablation in liquids (PLAL). Mammalian cells were transfected efficiently by addition of PLAL-AuNP, but data systematically evaluating the cell-toxic potential are lacking. Herein, the transfection efficiency and cytotoxicity of PLAL-AuNP were evaluated by transfection of a mammalian cell line with a recombinant HMGB1/GFP DNA expression vector. Different methods were compared using two sizes of PLAL-AuNP, commercialized AuNP, two magnetic NP-based protocols and a conventional transfection reagent (FuGENE HD; FHD). PLAL-AuNP were generated using a Spitfire Pro femtosecond laser system delivering 120 fs laser pulses at a wavelength of 800 nm, focusing the fs-laser beam on a 99.99% pure gold target placed in ddH2O. Transfection efficiencies were analyzed after 24 h using fluorescence microscopy and flow cytometry. Toxicity was assessed by measuring cell proliferation and the percentage of necrotic, propidium iodide-positive cells (PI %). The addition of PLAL-AuNP significantly enhanced transfection efficiencies (FHD: 31 %; PLAL-AuNP size-1: 46 %; size-2: 50 %) with increased PI % but no reduction in cell proliferation. Commercial AuNP transfection showed significantly lower efficiency (23 %), slightly increased PI % and reduced cell proliferation. Magnetic NP-based methods were less effective but also showed the lowest cytotoxicity. In conclusion, addition of PLAL-AuNP provides a novel tool for transfection efficiency enhancement with acceptable cytotoxic side-effects.

  2. The abridged patient-generated subjective global assessment is a useful tool for early detection and characterization of cancer cachexia.

    PubMed

    Vigano, Antonio L; di Tomasso, Jonathan; Kilgour, Robert D; Trutschnigg, Barbara; Lucar, Enriqueta; Morais, José A; Borod, Manuel

    2014-07-01

    Cancer cachexia (CC) is a syndrome characterized by wasting of lean body mass and fat, often driven by decreased food intake, hypermetabolism, and inflammation, resulting in decreased lifespan and quality of life. Classification of cancer cachexia has improved, but few clinically relevant diagnostic tools exist for its early identification and characterization. The abridged Patient-Generated Subjective Global Assessment (aPG-SGA) is a modification of the original Patient-Generated Subjective Global Assessment, and consists of a four-part questionnaire that scores patients' weight history, food intake, appetite, and performance status. The purpose of this study was to determine whether the aPG-SGA is associated with both features and clinical sequelae of cancer cachexia. In this prospective cohort study, 207 advanced lung and gastrointestinal cancer patients completed the following tests: aPG-SGA, Edmonton Symptom Assessment System, handgrip strength, a complete blood count, albumin, apolipoprotein A and B, and C-reactive protein. Ninety-four participants with good performance status as assessed by the Eastern Cooperative Oncology Group Performance Status completed additional questionnaires and underwent body composition testing. Of these, 68 patients tested for quadriceps strength and completed a 3-day food recall. Multivariable regression models revealed that higher aPG-SGA scores (≥9 vs 0 to 1) are significantly associated (P<0.05) with the following: unfavorable biological markers of cancer cachexia, such as higher white blood cell counts (10.0 vs 6.7×10⁹/L), lower hemoglobin (115.6 vs 127.7 g/L), and elevated C-reactive protein (42.7 vs 18.2 mg/L [406.7 vs 173.3 nmol/L]); decreased anthropometric and physical measures, such as body mass index (22.5 vs 27.1), fat mass (14.4 vs 26.0 kg), handgrip strength (24.7 vs 34.9 kg), and leg strength; an average 12% greater length of hospital stay; a dose reduction in chemotherapy; and increased mortality. Given its association with

  3. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes.

    PubMed

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A; Albrectsen, Benedicte R; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high-throughput assay for characterizing and monitoring microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of the PM data is challenging. For the best results, we recommend that the growth conditions for the fungi be carefully standardized. In addition, rigorous replication and control strategies should be employed, whether using pre-configured commercial microwell plates or in-house designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool to characterize the metabolic capabilities of individual endophyte isolates, or of successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees.
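    The substrate-utilization screening described above can be sketched in miniature: average blank-corrected optical-density readings over replicate plates and call a substrate "utilized" when the mean signal clears a threshold. The data layout, threshold, and substrate names below are illustrative assumptions, not values from the paper's protocols.

```python
# Hypothetical PM-style substrate utilization calls: average blank-corrected
# growth readings over replicates, flag substrates whose mean signal exceeds
# a threshold. Layout and threshold are illustrative assumptions.
from statistics import mean

def call_substrate_use(readings, negative_control, threshold=0.2):
    """readings: {substrate: [OD values, one per replicate plate]}.
    negative_control: OD values for the no-substrate well."""
    blank = mean(negative_control)
    calls = {}
    for substrate, values in readings.items():
        corrected = [v - blank for v in values]   # subtract the blank signal
        calls[substrate] = mean(corrected) > threshold
    return calls

replicates = {
    "D-glucose":   [0.95, 1.02, 0.98],
    "L-arabinose": [0.21, 0.25, 0.19],
    "citrate":     [0.08, 0.06, 0.07],
}
calls = call_substrate_use(replicates, negative_control=[0.05, 0.06, 0.04])
```

    Rigorous replication matters here exactly as the abstract warns: a single noisy well near the threshold would flip the call, while the replicate mean is stable.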

  4. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes

    PubMed Central

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A.; Albrectsen, Benedicte R.; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high-throughput assay for characterizing and monitoring microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of the PM data is challenging. For the best results, we recommend that the growth conditions for the fungi be carefully standardized. In addition, rigorous replication and control strategies should be employed, whether using pre-configured commercial microwell plates or in-house designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool to characterize the metabolic capabilities of individual endophyte isolates, or of successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees. PMID:26441951

  5. Dynamic combinatorial/covalent chemistry: a tool to read, generate and modulate the bioactivity of compounds and compound mixtures.

    PubMed

    Herrmann, Andreas

    2014-03-21

    Reversible covalent bond formation under thermodynamic control adds reactivity to self-assembled supramolecular systems, and is therefore an ideal tool to assess complexity of chemical and biological systems. Dynamic combinatorial/covalent chemistry (DCC) has been used to read structural information by selectively assembling receptors with the optimum molecular fit around a given template from a mixture of reversibly reacting building blocks. This technique allows access to efficient sensing devices and the generation of new biomolecules, such as small molecule receptor binders for drug discovery, but also larger biomimetic polymers and macromolecules with particular three-dimensional structural architectures. Adding a kinetic factor to a thermodynamically controlled equilibrium results in dynamic resolution and in self-sorting and self-replicating systems, all of which are of major importance in biological systems. Furthermore, the temporary modification of bioactive compounds by reversible combinatorial/covalent derivatisation allows control of their release and facilitates their transport across amphiphilic self-assembled systems such as artificial membranes or cell walls. The goal of this review is to give a conceptual overview of how the impact of DCC on supramolecular assemblies at different levels can allow us to understand, predict and modulate the complexity of biological systems.

  6. Next generation genome-wide association tool: Design and coverage of a high-throughput European-optimized SNP array

    PubMed Central

    Hoffmann, Thomas J.; Kvale, Mark N.; Hesselson, Stephanie E.; Zhan, Yiping; Aquino, Christine; Cao, Yang; Cawley, Simon; Chung, Elaine; Connell, Sheryl; Eshragh, Jasmin; Ewing, Marcia; Gollub, Jeremy; Henderson, Mary; Hubbell, Earl; Iribarren, Carlos; Kaufman, Jay; Lao, Richard Z.; Lu, Yontao; Ludwig, Dana; Mathauda, Gurpreet K.; McGuire, William; Mei, Gangwu; Miles, Sunita; Purdy, Matthew M.; Quesenberry, Charles; Ranatunga, Dilrini; Rowell, Sarah; Sadler, Marianne; Shapero, Michael H.; Shen, Ling; Shenoy, Tanushree R.; Smethurst, David; Van den Eeden, Stephen K.; Walter, Larry; Wan, Eunice; Wearley, Reid; Webster, Teresa; Wen, Christopher C.; Weng, Li; Whitmer, Rachel A.; Williams, Alan; Wong, Simon C.; Zau, Chia; Finn, Andrea; Schaefer, Catherine; Kwok, Pui-Yan; Risch, Neil

    2011-01-01

    The success of genome-wide association studies has paralleled the development of efficient genotyping technologies. We describe the development of a next-generation microarray based on the new highly efficient Affymetrix Axiom genotyping technology, which we are using to genotype individuals of European ancestry from the Kaiser Permanente Research Program on Genes, Environment and Health (RPGEH). The array contains 674,517 SNPs and provides excellent genome-wide as well as gene-based and candidate-SNP coverage. Coverage was calculated using an approach based on imputation and cross-validation. Preliminary results for the first 80,301 saliva-derived DNA samples from the RPGEH demonstrate very high quality genotypes, with sample success rates above 94% and over 98% of successful samples having SNP call rates exceeding 98%. At steady state, we have produced 462 million genotypes per week for each Axiom system. The new array provides a valuable addition to the repertoire of tools for large-scale genome-wide association studies. PMID:21565264
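    The two quality figures quoted above (sample success rate, and the share of successful samples with high SNP call rates) reduce to simple call-rate arithmetic over a genotype matrix. A minimal sketch, with an assumed encoding where `None` marks a no-call and made-up cutoffs mirroring the reported thresholds:

```python
# Toy genotype QC: a sample "succeeds" if its overall call rate clears a
# cutoff; among successful samples we then count those with call rates
# above 98%. Matrix, cutoffs, and encoding are illustrative assumptions.
def call_rate(genotypes):
    """Fraction of markers with a genotype call (None = no-call)."""
    return sum(g is not None for g in genotypes) / len(genotypes)

samples = {
    "s1": [0, 1, 2, 1, None, 2, 0, 1, 2, 1],   # 90% call rate
    "s2": [0, 1, 2, 1, 0, 2, 0, 1, 2, 1],      # 100% call rate
    "s3": [None] * 6 + [0, 1, 2, 1],           # 40% call rate -> failed
}
successful = {s: g for s, g in samples.items() if call_rate(g) >= 0.90}
high_quality = [s for s, g in successful.items() if call_rate(g) > 0.98]
```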

  8. Wireless sensor systems for sense/decide/act/communicate.

    SciTech Connect

    Berry, Nina M.; Cushner, Adam; Baker, James A.; Davis, Jesse Zehring; Stark, Douglas P.; Ko, Teresa H.; Kyker, Ronald D.; Stinnett, Regan White; Pate, Ronald C.; Van Dyke, Colin; Kyckelhahn, Brian

    2003-12-01

    After 9/11, the United States (U.S.) was suddenly pushed into challenging situations that it could no longer ignore as a simple spectator. The War on Terrorism (WoT) was suddenly ignited, and no one knows when this war will end. While the government is exploring many existing and potential technologies, the area of wireless sensor networks (WSN) has emerged as a foundation for establishing future national security. Unlike other technologies, WSN could provide the virtual presence capabilities needed for precision awareness and response in military, intelligence, and homeland security applications. The Advanced Concepts Group (ACG) vision of a Sense/Decide/Act/Communicate (SDAC) sensor system is an instantiation of the WSN concept that takes a 'system of systems' view. Each sensing node will exhibit the ability to: Sense the environment around it, Decide as a collective what the situation of the environment is, Act in an intelligent and coordinated manner in response to this situational determination, and Communicate its actions amongst its peers and to a human command. This LDRD report provides a review of the research and development done to bring the SDAC vision closer to reality.

  9. Monte Carlo event generators in atomic collisions: A new tool to tackle the few-body dynamics

    NASA Astrophysics Data System (ADS)

    Ciappina, M. F.; Kirchner, T.; Schulz, M.

    2010-04-01

    We present a set of routines to produce theoretical event files, for both single and double ionization of atoms by ion impact, based on a Monte Carlo event generator (MCEG) scheme. Such event files are the theoretical counterpart of the data obtained from a kinematically complete experiment; i.e., they contain the momentum components of all collision fragments for a large number of ionization events. Among the advantages of working with theoretical event files is the possibility to incorporate the conditions present in a real experiment, such as the uncertainties in the measured quantities. Additionally, by manipulating them it is possible to generate any type of cross section, especially those that are usually too complicated to compute with conventional methods due to a lack of symmetry. Consequently, the numerical effort of such calculations is dramatically reduced. We show examples for both single and double ionization, with special emphasis on a new data analysis tool, called four-body Dalitz plots, developed very recently.
    Program summary
    Program title: MCEG
    Catalogue identifier: AEFV_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFV_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2695
    No. of bytes in distributed program, including test data, etc.: 18 501
    Distribution format: tar.gz
    Programming language: FORTRAN 77 with parallelization directives using scripting
    Computer: Single machines using Linux and Linux servers/clusters (with cores with any clock speed, cache memory and bits in a word)
    Operating system: Linux (any version and flavor) and FORTRAN 77 compilers
    Has the code been vectorised or parallelized?: Yes
    RAM: 64-128 kBytes (the codes are very cpu intensive)
    Classification: 2.6
    Nature of problem: The code deals with single and double
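    The core idea of a theoretical event file can be illustrated with a toy generator: draw fragment momenta from an assumed distribution, apply Gaussian smearing to mimic experimental resolution, and store one event per row; any cross section can then be built by histogramming. The physics below is a placeholder, not the paper's ionization theory.

```python
# Toy Monte Carlo event generator: each "event" is the smeared momentum
# components of an electron and its recoil ion. The isotropic Gaussian
# momentum model and the resolution value are illustrative assumptions.
import random

random.seed(1)

def generate_events(n, resolution=0.05):
    events = []
    for _ in range(n):
        # toy model: Gaussian electron momentum, recoil ion balances it
        pe = [random.gauss(0.0, 1.0) for _ in range(3)]
        ion = [-c for c in pe]
        # detector-like Gaussian smearing of every measured component
        measured = [c + random.gauss(0.0, resolution) for c in pe + ion]
        events.append(measured)
    return events

events = generate_events(1000)
# a "cross section" is then just a histogram over the event file, e.g. the
# longitudinal electron momentum distribution:
pz_values = [ev[2] for ev in events]
```

    Experimental conditions enter naturally: tightening or loosening `resolution`, or cutting events outside an assumed detector acceptance, reshapes every derived histogram at once.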

  10. Methylation status of IGFBP-3 as a useful clinical tool for deciding on a concomitant radiotherapy

    PubMed Central

    Pernía, Olga; Belda-Iniesta, Cristobal; Pulido, Veronica; Cortes-Sempere, María; Rodriguez, Carlos; Vera, Olga; Soto, Javier; Jiménez, Julia; Taus, Alvaro; Rojo, Federico; Arriola, Edurne; Rovira, Ana; Albanell, Joan; Macías, M Teresa; de Castro, Javier; Perona, Rosario; Ibañez de Caceres, Inmaculada

    2014-01-01

    The methylation status of the IGFBP-3 gene is strongly associated with cisplatin sensitivity in patients with non-small cell lung cancer (NSCLC). In this study, we found in vitro evidence linking the presence of an unmethylated promoter with a poor response to radiation. Our data also indicate that radiation might sensitize chemotherapy-resistant cells by reactivating IGFBP-3 expression through promoter demethylation, inactivating the PI3K/AKT pathway. We also explored the effect of IGFBP-3 methylation on overall survival (OS) in a population of 40 NSCLC patients who received adjuvant therapy after R0 surgery. Our results indicate that patients harboring an unmethylated promoter could benefit more from a chemotherapy schedule alone than from a multimodality therapy involving radiotherapy and platinum-based treatments, increasing their OS by 2.5 y (p = .03). Our findings rule out this epi-marker as a prognostic factor in a patient population without adjuvant therapy, indicating that radiotherapy does not improve survival for patients harboring an unmethylated IGFBP-3 promoter. PMID:25482372

  11. Profiling biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets using gastrointestinal simulation technology.

    PubMed

    Wu, Chunnuan; Sun, Le; Sun, Jin; Yang, Yajun; Ren, Congcong; Ai, Xiaoyu; Lian, He; He, Zhonggui

    2013-09-10

    The aim of the present study was to correlate the in vitro properties of a drug formulation with its in vivo performance, and to elucidate the deciding properties of oral absorption. Gastrointestinal simulation technology (GST), implemented in the GastroPlus™ software, was used to simulate the in vivo plasma concentration-time curve. Lansoprazole, a typical BCS class II drug, was chosen as the model drug. First, physicochemical and pharmacokinetic parameters of lansoprazole were determined or collected from the literature to construct the model. Validation of the developed model was performed by comparing the predicted and experimental plasma concentration data; the predicted curve was in good agreement with the experimental data. Then, parameter sensitivity analysis (PSA) was performed to find the key parameters of oral absorption. The absorption was particularly sensitive to dose, solubility and particle size for lansoprazole enteric-coated tablets. With a single dose of 30 mg and a solubility of 0.04 mg/ml, the absorption was complete. Good absorption could be achieved with the lansoprazole particle radius down to about 25 μm. In summary, GST is a useful tool for profiling the biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets and for guiding formulation optimization.
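    The parameter sensitivity analysis described above can be mimicked with a deliberately simple model: Euler-step a Noyes-Whitney-type dissolution rate (surface area scaling as 1/radius, driven by the solubility gap) coupled to first-order absorption from solution, then sweep the particle radius. All constants and the model itself are illustrative assumptions, not GastroPlus parameters.

```python
# Toy dissolution + first-order absorption model for a radius/solubility
# sensitivity sweep. k, ka, and the dosing conditions are assumed values.
def fraction_absorbed(radius_um, solubility_mg_ml, dose_mg=30.0,
                      volume_ml=250.0, ka=0.03, t_end_min=360.0, dt=0.1):
    k = 2.0  # lumped dissolution constant (assumed)
    solid, solution, absorbed = dose_mg, 0.0, 0.0
    t = 0.0
    while t < t_end_min:
        conc = solution / volume_ml
        # Noyes-Whitney-like rate: faster for small particles (1/radius),
        # throttled as the solution approaches saturation
        dissolution = min(k * solid / radius_um
                          * max(solubility_mg_ml - conc, 0.0), solid / dt)
        uptake = ka * solution            # first-order absorption sink
        solid -= dissolution * dt
        solution += (dissolution - uptake) * dt
        absorbed += uptake * dt
        t += dt
    return absorbed / dose_mg

# sensitivity of the absorbed fraction to particle radius at fixed solubility
sweep = {r: fraction_absorbed(r, 0.04) for r in (10, 25, 100)}
```

    Even this crude model reproduces the qualitative finding: shrinking the particle radius raises the absorbed fraction, while coarse particles leave absorption dissolution-limited.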

  12. FANSe2: a robust and cost-efficient alignment tool for quantitative next-generation sequencing applications.

    PubMed

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-Jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of deep sequencing data inevitably depends on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To combine both advantages, we developed FANSe2, an algorithm with an iterative mapping strategy based on the statistics of real-world sequencing error distributions, which substantially accelerates the mapping without compromising accuracy. Its sensitivity and accuracy are higher than those of the BWT-based algorithms in tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms produced false positives and false negatives. FANSe2 showed remarkably better consistency with the microarray than most other algorithms in terms of gene expression quantification. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours, with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/. PMID:24743329
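    The seed-based strategy that the abstract contrasts with BWT methods can be sketched as a toy seed-and-verify mapper: index every k-mer of the reference, use a read's first k-mer as the seed, and verify candidate loci by counting mismatches. FANSe2's iterative, error-model-driven refinement is not reproduced here; all names and parameters are illustrative.

```python
# Toy seed-and-verify read mapper: k-mer index + mismatch-tolerant
# verification. Parameters (k, max_mismatches) are illustrative.
from collections import defaultdict

def build_index(reference, k=4):
    """Map every k-mer of the reference to its start positions."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def map_read(read, reference, index, k=4, max_mismatches=1):
    hits = []
    for pos in index.get(read[:k], []):      # seed: the read's first k-mer
        window = reference[pos:pos + len(read)]
        if len(window) < len(read):          # seed too close to the 3' end
            continue
        mismatches = sum(a != b for a, b in zip(read, window))
        if mismatches <= max_mismatches:     # verify the full alignment
            hits.append(pos)
    return hits

ref = "ACGTACGTTGCAACGT"
idx = build_index(ref)
hits = map_read("ACGTTGCA", ref, idx)   # maps uniquely at position 4
```

    The robustness/speed trade-off is visible even here: seeding tolerates mismatches in the verified tail but misses reads whose errors fall inside the seed, which is the kind of gap an error-distribution-aware iteration is meant to close.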

  13. FANSe2: A Robust and Cost-Efficient Alignment Tool for Quantitative Next-Generation Sequencing Applications

    PubMed Central

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of deep sequencing data inevitably depends on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To combine both advantages, we developed FANSe2, an algorithm with an iterative mapping strategy based on the statistics of real-world sequencing error distributions, which substantially accelerates the mapping without compromising accuracy. Its sensitivity and accuracy are higher than those of the BWT-based algorithms in tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms produced false positives and false negatives. FANSe2 showed remarkably better consistency with the microarray than most other algorithms in terms of gene expression quantification. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours, with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/. PMID:24743329

  15. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    Kyriacou, S.; Kontoleontos, E.; Weissenberger, S.; Mangani, L.; Casartelli, E.; Skouteropoulou, I.; Gattringer, M.; Gehrer, A.; Buchmayr, M.

    2014-03-01

    An efficient hydraulic optimization procedure suitable for industrial use requires an advanced optimization tool (the EASY software), a fast solver (block-coupled CFD) and a flexible geometry generation tool. The EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA(PCA)) that can be used in both single-objective (SOO) and multi-objective optimization (MOO) problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For an additional reduction of the optimization CPU cost, the PCA technique is used to identify dependencies among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. Apart from being robust and fast, this method also provides a large gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail, and optimization results for hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
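    The metamodel-assisted screening idea reads, in miniature, as follows: keep an archive of exactly evaluated designs, rank each generation's offspring with a cheap surrogate fitted to that archive, and spend the expensive evaluation only on the most promising few. The sphere test function and inverse-distance surrogate below stand in for the CFD solve and EASY's PCA-driven metamodel; everything here is an illustrative assumption.

```python
# Minimal metamodel-assisted evolutionary loop: surrogate pre-screening
# limits "expensive" evaluations to a few offspring per generation.
import random

random.seed(7)

def expensive(x):               # stand-in for the Navier-Stokes solve
    return sum(v * v for v in x)

def surrogate(x, archive):      # inverse-distance-weighted prediction
    num = den = 0.0
    for xa, fa in archive:
        d = sum((a - b) ** 2 for a, b in zip(x, xa)) ** 0.5
        if d < 1e-12:
            return fa
        num += fa / d
        den += 1.0 / d
    return num / den

def maea(dim=3, pop=8, offspring=24, exact_per_gen=6, generations=15):
    parents = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(pop)]
    archive = [(p, expensive(p)) for p in parents]
    for _ in range(generations):
        kids = [[v + random.gauss(0, 0.3) for v in random.choice(parents)]
                for _ in range(offspring)]
        # surrogate pre-screening: rank cheaply, evaluate only the best few
        kids.sort(key=lambda x: surrogate(x, archive))
        for kid in kids[:exact_per_gen]:
            archive.append((kid, expensive(kid)))
        archive.sort(key=lambda t: t[1])
        parents = [x for x, _ in archive[:pop]]     # elitist selection
    return archive[0]

best_x, best_f = maea()
```

    The ratio `exact_per_gen / offspring` is the cost lever: here three quarters of each generation never reaches the expensive evaluator, mirroring how MAEAs cut solver calls in the real procedure.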

  16. Generations.

    PubMed

    Chambers, David W

    2005-01-01

    Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession. PMID:16623137

  17. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. This kind of termination criterion has been criticized for its lack of automation. In an earlier paper we showed how to construct an automated termination criterion if the recursion is aligned with the rewrite relation, and we demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.
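    Termination is undecidable for string rewriting in general, which is what makes decidable subclasses such as the ancestor match-bounded systems above valuable. For contrast, the naive approach is only a semi-procedure: enumerate rewrite paths breadth-first and report non-termination when a string re-derives itself. The rules and search budget below are illustrative.

```python
# Naive loop search for string rewriting: BFS over rewrite successors,
# reporting a loop (s ->+ s) as a witness of non-termination. This is a
# semi-procedure with an arbitrary budget, not a decision procedure.
from collections import deque

def successors(s, rules):
    """All one-step rewrites of s under rules [(lhs, rhs), ...]."""
    out = []
    for lhs, rhs in rules:
        start = 0
        while (i := s.find(lhs, start)) != -1:
            out.append(s[:i] + rhs + s[i + len(lhs):])
            start = i + 1
    return out

def find_loop(start, rules, max_steps=1000):
    """True if some rewrite path revisits a string (=> non-terminating)."""
    queue = deque([(start, {start})])
    steps = 0
    while queue and steps < max_steps:
        s, path = queue.popleft()
        for t in successors(s, rules):
            if t in path:
                return True
            queue.append((t, path | {t}))
        steps += 1
    return False  # no loop found within the budget: inconclusive

looping = find_loop("ab", [("ab", "ba"), ("ba", "ab")])   # ab -> ba -> ab
terminating = find_loop("aab", [("ab", "b")])             # shrinks each step
```

    The budget is the whole point: when `find_loop` returns False the question stays open, whereas membership in a class like ancestor match-bounded systems settles it outright.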

  18. A Graphic Symbol Tool for the Evaluation of Communication, Satisfaction and Priorities of Individuals with Intellectual Disability Who Use a Speech Generating Device

    ERIC Educational Resources Information Center

    Valiquette, Christine; Sutton, Ann; Ska, Bernadette

    2010-01-01

    This article reports on the views of individuals with learning disability (LD) on their use of their speech generating devices (SGDs), their satisfaction with their communication, and their priorities. The development of an interview tool made of graphic symbols and entitled Communication, Satisfaction and Priorities of SGD Users (CSPU) is…

  19. E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making, and then directly to the mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain a situational awareness

  20. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school boards decide...

  1. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... ADMINISTRATION GOVERNMENT CONTRACTING PROGRAMS Contracting with SDVO SBCs § 125.17 Who decides if a contract opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a...

  2. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... ADMINISTRATION GOVERNMENT CONTRACTING PROGRAMS Contracting with SDVO SBCs § 125.17 Who decides if a contract opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a...

  3. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ADMINISTRATION GOVERNMENT CONTRACTING PROGRAMS Contracting with SDVO SBCs § 125.17 Who decides if a contract opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a...

  4. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ADMINISTRATION GOVERNMENT CONTRACTING PROGRAMS Contracting with SDVO SBCs § 125.17 Who decides if a contract opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a...

  5. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... ADMINISTRATION GOVERNMENT CONTRACTING PROGRAMS Contracting with SDVO SBCs § 125.17 Who decides if a contract opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a...

  6. Dormancy and germination: How does the crop seed decide?

    PubMed

    Shu, K; Meng, Y J; Shuai, H W; Liu, W G; Du, J B; Liu, J; Yang, W Y

    2015-11-01

    Whether seeds germinate or maintain dormancy is decided through very intricate physiological processes, and correct timing of these processes is most important for the plant's life cycle. If moist conditions are encountered, a low dormancy level causes pre-harvest sprouting in various crop species, such as wheat, corn and rice; this decreases crop yield and negatively impacts downstream industrial processing. In contrast, a deep level of seed dormancy prevents normal germination even under favourable conditions, resulting in a low emergence rate during agricultural production. Therefore, an optimal seed dormancy level is valuable for modern mechanised agricultural systems. Over the past several years, numerous studies have demonstrated that diverse endogenous and environmental factors regulate the balance between dormancy and germination, including light, temperature, water status and bacteria in soil, and phytohormones such as ABA (abscisic acid) and GA (gibberellic acid). In this updated review, we highlight recent advances regarding the molecular mechanisms underlying the regulation of seed dormancy and germination processes, covering both external environmental and internal hormonal cues, and focusing primarily on the staple crop species. Furthermore, future challenges and research directions for developing a full understanding of crop seed dormancy and germination are also discussed.

  7. Point of decision: when do pigeons decide to head home?

    NASA Astrophysics Data System (ADS)

    Schiffner, Ingo; Wiltschko, Roswitha

    2009-02-01

    Pigeons released away from their loft usually fly around at the release site for a while before they finally leave. Visual observations had suggested that the moment when the birds decide to head home is associated with a certain change in flying style. To see whether this change is also reflected in GPS-recorded tracks, a group of pigeons equipped with flight recorders was released at two sites about 10 km from their home loft. The initial part of their flight paths was analyzed in order to find objective criteria indicating the point of decision. We selected the highest increase in steadiness as the best estimate for the moment of decision. This criterion allows us to divide the pigeons' paths into two distinct phases, an initial phase and the homing phase, with the moment of decision, on average, 2 min after release. The moment of decision marks a change in behavior, with a significant increase in steadiness and flying speed and headings significantly closer to the home direction. The behavior of the individual birds at the two sites was not correlated, suggesting no pronounced individual traits for the length of the initial phase. The behavior during this phase seems to be controlled by flight preparation, exploration, and non-navigational motivations rather than by navigational necessities alone.
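
    The "highest increase in steadiness" criterion is simple to compute from a GPS track. The sketch below is not the authors' code: it assumes fixes as planar (x, y) coordinates, measures steadiness as the mean resultant vector length of headings in a sliding window, and takes the decision point to be the largest single-step jump in that measure. The window size and function names are illustrative.

```python
import math

def headings(track):
    """Heading (radians) between consecutive (x, y) fixes of a GPS track."""
    return [math.atan2(y2 - y1, x2 - x1)
            for (x1, y1), (x2, y2) in zip(track, track[1:])]

def steadiness(hs, window=10):
    """Mean resultant vector length of headings in a sliding window.

    Ranges from ~0 (headings scattered in all directions) to 1 (perfectly
    straight flight).
    """
    out = []
    for i in range(len(hs) - window + 1):
        w = hs[i:i + window]
        c = sum(math.cos(h) for h in w) / window
        s = sum(math.sin(h) for h in w) / window
        out.append(math.hypot(c, s))
    return out

def decision_index(track, window=10):
    """Index (into the steadiness series) of the largest single-step
    increase in steadiness -- the estimated moment of decision."""
    st = steadiness(headings(track), window)
    diffs = [b - a for a, b in zip(st, st[1:])]
    return max(range(len(diffs)), key=diffs.__getitem__) + 1
```

    On a synthetic track that circles for a while and then flies straight, the detected index falls at the transition between the two phases.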

  8. Dormancy and germination: How does the crop seed decide?

    PubMed

    Shu, K; Meng, Y J; Shuai, H W; Liu, W G; Du, J B; Liu, J; Yang, W Y

    2015-11-01

    Whether seeds germinate or maintain dormancy is decided through very intricate physiological processes. Correct timing of these processes is most important for the plant's life cycle. If moist conditions are encountered, a low dormancy level causes pre-harvest sprouting in various crop species, such as wheat, corn and rice; this decreases crop yield and negatively impacts downstream industrial processing. In contrast, a deep level of seed dormancy prevents normal germination even under favourable conditions, resulting in a low emergence rate during agricultural production. Therefore, an optimal seed dormancy level is valuable for modern mechanised agricultural systems. Over the past several years, numerous studies have demonstrated that diverse endogenous and environmental factors regulate the balance between dormancy and germination, such as light, temperature, water status and bacteria in soil, and phytohormones such as ABA (abscisic acid) and GA (gibberellic acid). In this updated review, we highlight recent advances regarding the molecular mechanisms underlying the regulation of seed dormancy and germination processes, including external environmental and internal hormonal cues, focusing primarily on staple crop species. Furthermore, future challenges and research directions for developing a full understanding of crop seed dormancy and germination are also discussed. PMID:26095078

  9. Development of a 2nd Generation Decision Support Tool to Optimize Resource and Energy Recovery for Municipal Solid Waste

    EPA Science Inventory

    In 2012, EPA’s Office of Research and Development released the MSW decision support tool (MSW-DST) to help identify strategies for more sustainable MSW management. Depending upon local infrastructure, energy grid mix, population density, and waste composition and quantity, the m...

  10. Canute Rules the Waves?: Hope for E-Library Tools Facing the Challenge of the "Google Generation"

    ERIC Educational Resources Information Center

    Myhill, Martin

    2007-01-01

    Purpose: To consider the findings of a recent e-resources survey at the University of Exeter Library in the context of the dominance of web search engines in academia, balanced by the development of e-library tools such as the library OPAC, OpenURL resolvers, metasearch engines, LDAP and proxy servers, and electronic resource management modules.…

  11. Development of a 2nd Generation Decision Support Tool to Optimize Resource and Energy Recovery for Municipal Solid Waste

    EPA Science Inventory

    In 2012, EPA’s Office of Research and Development released the MSW decision support tool (MSW-DST) to help identify strategies for more sustainable MSW management. Depending upon local infrastructure, energy grid mix, population density, and waste composition and quantity,...

  12. The Circuit of Culture as a Generative Tool of Contemporary Analysis: Examining the Construction of an Education Commodity

    ERIC Educational Resources Information Center

    Leve, Annabelle M.

    2012-01-01

    Contemporary studies in the field of education cannot afford to neglect the ever present interrelationships between power and politics, economics and consumption, representation and identity. In studying a recent cultural phenomenon in government schools, it became clear that a methodological tool that made sense of these interlinked processes was…

  13. Generation Y, Learner Autonomy and the Potential of Web 2.0 Tools for Language Learning and Teaching

    ERIC Educational Resources Information Center

    Morgan, Liam

    2012-01-01

    Purpose: The purpose of this paper is to examine the relationship between the development of learner autonomy and the application of Web 2.0 tools in the language classroom. Design/methodology/approach: The approach taken is that of qualitative action research within an explicit theoretical framework and the data were collected via surveys and…

  14. The Effectiveness of Virtual Learning Tools for Millennial Generation Students in a Community College Criminal Justice Degree Program

    ERIC Educational Resources Information Center

    Snyder, Lawrence

    2013-01-01

    An analysis of data from the Community College Survey of Student Engagement and multiyear analysis of pretest/posttest scores in introductory criminal justice courses revealed there was a systemic decline in student engagement and achievement. Because of this analysis, a commercial virtual learning tool (CJI) that purported great success in…

  15. Evaluating Student-Generated Film as a Learning Tool for Qualitative Methods: Geographical "Drifts" and the City

    ERIC Educational Resources Information Center

    Anderson, Jon

    2013-01-01

    Film as a tool for learning offers considerable opportunity for enhancing student understanding. This paper reflects on the experiences of a project that required students to make a short film demonstrating their practical understanding of qualitative methods. In the psychogeographical tradition, students were asked to "drift" across the…

  16. PARTNERSHIP FOR THE DEVELOPMENT OF NEXT GENERATION SIMULATION TOOLS TO EVALUATE CEMENTITIOUS BARRIERS AND MATERIALS USED IN NUCLEAR APPLICATION - 8388

    SciTech Connect

    Langton, C; Richard Dimenna, R

    2008-01-29

    The US DOE has initiated a multidisciplinary, cross-cutting project to develop a reasonable and credible set of tools to predict the structural, hydraulic and chemical performance of cement barriers used in nuclear applications over extended time frames (e.g., > 100 years for operating facilities and > 1000 years for waste management). A partnership that combines DOE, NRC, academia, private sector, and international expertise has been formed to accomplish the project objectives by integrating existing information and realizing advancements where necessary. The set of simulation tools and data developed under this project will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems, e.g., waste forms, containment structures, entombments and environmental remediation, including decontamination and decommissioning (D&D) activities. The simulation tools will also support analysis of structural concrete components of nuclear facilities (spent fuel pools, dry spent fuel storage units, and recycling facilities, e.g., fuel fabrication, separations processes). Simulation parameters will be obtained from prior literature and will be experimentally measured under this project, as necessary, to demonstrate application of the simulation tools for three prototype applications (waste form in concrete vault, high level waste tank grouting, and spent fuel pool). Test methods and data needs to support use of the simulation tools for future applications will be defined. This is a national issue that affects all waste disposal sites that use cementitious waste forms and structures, decontamination and decommissioning activities, service life determination of existing structures, and design of future public and private nuclear facilities. The problem is difficult because it requires projecting conditions and responses over extremely long times.
Current performance assessment analyses show that engineered barriers are

  17. The determination of waste generation and composition as an essential tool to improve the waste management plan of a university.

    PubMed

    Gallardo, A; Edo-Alcón, N; Carlos, M; Renau, M

    2016-07-01

    When many people work in organized institutions or enterprises, those institutions become large meeting places that also have energy, water and resource needs. One of these needs is the correct management of the waste produced daily by these communities. Universities are a good example of institutions where a great number of people go to work or to study every day. Independently of their task, they use the different services at the university, such as cafeterias, canteens and photocopying, and as a result of this activity a cleaning service is also needed. All these activities generate an environmental impact. Nowadays, many universities have accepted the challenge of minimizing this impact by applying several measures. One of the impacts to be reduced is waste generation. The first step in implementing a waste management plan at a university is to know the composition, the amount and the distribution of the waste generated in its facilities. As waste composition and generation depend, among other things, on the climate, these variables should be analysed over one year. This research work estimates the waste generation and composition of a Spanish university, the Universitat Jaume I, during a school year. To achieve this, all the waste streams generated at the university have been identified and quantified, with emphasis on those which are not controlled. Furthermore, several statistical analyses have been carried out to determine whether the season of the year or the day of the week affects waste generation and composition. All this information will allow the university authorities to propose a set of minimization measures to enhance the current management. PMID:27107706
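
    Testing whether season or weekday affects generation rates is commonly done with an analysis of variance. The paper does not publish its code; the following is a minimal stdlib-only sketch of the one-way ANOVA F statistic, with groups (e.g. daily waste weights per season) as plain lists of numbers.

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across groups of measurements.

    F = (between-group mean square) / (within-group mean square).
    Assumes at least two groups and some within-group variation
    (otherwise the denominator is zero).
    """
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total observations
    grand = sum(sum(g) for g in groups) / n          # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that the group means differ more than within-group noise would explain; in practice one would use `scipy.stats.f_oneway` to also obtain the p-value.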

  18. The determination of waste generation and composition as an essential tool to improve the waste management plan of a university.

    PubMed

    Gallardo, A; Edo-Alcón, N; Carlos, M; Renau, M

    2016-07-01

    When many people work in organized institutions or enterprises, those institutions become large meeting places that also have energy, water and resource needs. One of these needs is the correct management of the waste produced daily by these communities. Universities are a good example of institutions where a great number of people go to work or to study every day. Independently of their task, they use the different services at the university, such as cafeterias, canteens and photocopying, and as a result of this activity a cleaning service is also needed. All these activities generate an environmental impact. Nowadays, many universities have accepted the challenge of minimizing this impact by applying several measures. One of the impacts to be reduced is waste generation. The first step in implementing a waste management plan at a university is to know the composition, the amount and the distribution of the waste generated in its facilities. As waste composition and generation depend, among other things, on the climate, these variables should be analysed over one year. This research work estimates the waste generation and composition of a Spanish university, the Universitat Jaume I, during a school year. To achieve this, all the waste streams generated at the university have been identified and quantified, with emphasis on those which are not controlled. Furthermore, several statistical analyses have been carried out to determine whether the season of the year or the day of the week affects waste generation and composition. All this information will allow the university authorities to propose a set of minimization measures to enhance the current management.

  19. ESO Council Decides to Continue VLT Project at Paranal

    NASA Astrophysics Data System (ADS)

    1994-08-01

    The Council [1] of the European Southern Observatory has met in extraordinary session at the ESO Headquarters in Garching near Munich on August 8 and 9, 1994. The main agenda items were concerned with the recent developments around ESO's relations with the host state, the Republic of Chile, as well as the status of the organisation's main project, the 16-metre equivalent Very Large Telescope (VLT) which will become the world's largest optical telescope. Council had decided to hold this special meeting [2] because of various uncertainties that have arisen in connection with the implementation of the VLT Project at Cerro Paranal, approx. 130 kilometres south of Antofagasta, capital of the II Region in Chile. Following continued consultations at different levels within the ESO member states and after careful consideration of all aspects of the current situation - including various supportive actions by the Chilean Government as well as the incessant attacks against this international organisation from certain sides reported in the media in that country - Council took the important decision to continue the construction of the VLT Observatory at Paranal, while at the same time requesting the ESO Management to pursue the ongoing studies of alternative solutions. THE COUNCIL DECISIONS In particular, the ESO Council took note of recent positive developments which have occurred since the May 1994 round of discussions with the Chilean authorities in Santiago. The confirmation of ESO's immunities as an International Organization in Chile, contained in a number of important statements and documents, is considered a significant step by the Chilean Government to guarantee ESO the unhindered erection and later operation of the VLT on Paranal. Under these circumstances and in order to maintain progress on the VLT project, the ESO Council authorized the ESO Management to continue the on-site work at Paranal.
Council also took note of the desire expressed by the Chilean Government

  20. A Useful Laboratory Tool

    ERIC Educational Resources Information Center

    Johnson, Samuel A.; Tutt, Tye

    2008-01-01

    Recently, a high school Science Club generated a large number of questions involving temperature. Therefore, they decided to construct a thermal gradient apparatus in order to conduct a wide range of experiments beyond the standard "cookbook" labs. They felt that this apparatus could be especially useful in future ninth-grade biology classes, in…

  1. Creating User-Friendly Tools for Data Analysis and Visualization in K-12 Classrooms: A Fortran Dinosaur Meets Generation Y

    NASA Technical Reports Server (NTRS)

    Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.

    2008-01-01

    During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), and the brief addition of a local high school student fulfilling a mentorship requirement. The Clouds group was given the task of exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, and the corresponding satellite data. This project began in 1997. The primary analysis tools developed for it were in FORTRAN, a computer language none of the students were familiar with. While they persevered through computer challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google(tm) Earth. The result of their efforts, so far, is two different Excel spreadsheets and a Google(tm) Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses and comparisons of the ground cloud observation reports and their correspondence with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. This provides a statistical counterpart to the existing S'COOL data visualization tool

  2. Cost and availability are deciding factors for new technology

    SciTech Connect

    Kimura, S.G.

    1996-10-01

    Higher gas turbine efficiencies, more gasification plants and an ever-increasing focus on the environment worldwide are among the issues that will shape the power generation industry in the next few years and into the next century. For fossil energy systems, gas turbines have emerged as the technology of choice because of their high efficiencies. Combined-cycle plants are already operating at efficiencies of 55% and above, and advanced gas turbines will be achieving 60% thermal efficiency by the end of this decade. With the next generation of gas turbines, combined-cycle efficiencies could be a few points higher by the year 2010.

  3. Using next-generation sequencing as a genetic diagnostic tool in rare autosomal recessive neurologic Mendelian disorders.

    PubMed

    Chen, Zhao; Wang, Jun-Ling; Tang, Bei-Sha; Sun, Zhan-Fang; Shi, Yu-Ting; Shen, Lu; Lei, Li-Fang; Wei, Xiao-Ming; Xiao, Jing-Jing; Hu, Zheng-Mao; Pan, Qian; Xia, Kun; Zhang, Qing-Yan; Dai, Mei-Zhi; Liu, Yu; Ashizawa, Tetsuo; Jiang, Hong

    2013-10-01

    Next-generation sequencing was used to investigate 9 Chinese pedigrees with rare autosomal recessive neurologic Mendelian disorders. Five probands with ataxia-telangiectasia and 1 proband with chorea-acanthocytosis were analyzed by targeted gene sequencing. Whole-exome sequencing was used to investigate 3 affected individuals with Joubert syndrome, nemaline myopathy, or spastic ataxia of Charlevoix-Saguenay. A list of known and novel candidate variants was identified for each causative gene. All variants were genetically verified by Sanger sequencing or quantitative polymerase chain reaction with the strategy of disease segregation in related pedigrees and healthy controls. The advantages of using next-generation sequencing to diagnose rare autosomal recessive neurologic Mendelian disorders characterized by genetic and phenotypic heterogeneity are demonstrated. A genetic diagnostic strategy combining the use of targeted gene sequencing and whole-exome sequencing with the aid of next-generation sequencing platforms has shown great promise for improving the diagnosis of neurologic Mendelian disorders. PMID:23726790

  4. Changing genetic paradigms: creating next-generation genetic databases as tools to understand the emerging complexities of genotype/phenotype relationships

    PubMed Central

    2014-01-01

    Understanding genotype/phenotype relationships has become more complicated as increasing amounts of inter- and intra-tissue genetic heterogeneity have been revealed through next-generation sequencing and evidence showing that factors such as epigenetic modifications, non-coding RNAs and RNA editing can play an important role in determining phenotype. Such findings have challenged a number of classic genetic assumptions including (i) analysis of genomic sequence obtained from blood is an accurate reflection of the genotype responsible for phenotype expression in an individual; (ii) that significant genetic alterations will be found only in diseased individuals, in germline tissues in inherited diseases, or in specific diseased tissues in somatic diseases such as cancer; and (iii) that mutation rates in putative disease-associated genes solely determine disease phenotypes. With the breakdown of our traditional understanding of genotype to phenotype relationships, it is becoming increasingly apparent that new analytical tools will be required to determine the relationship between genotype and phenotypic expression. To this end, we are proposing that next-generation genetic database (NGDB) platforms be created that include new bioinformatics tools based on algorithms that can evaluate genetic heterogeneity, as well as powerful systems biology analysis tools to actively process and evaluate the vast amounts of both genomic and genomic-modifying information required to reveal the true relationships between genotype and phenotype. PMID:24885908

  5. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
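
    The kind of coverage- and frequency-based filtering such a tool applies to reduce false positive calls can be sketched as follows. Field names and default thresholds here are illustrative assumptions, not the ICO tool's actual parameters.

```python
def filter_variants(variants, min_coverage=20, min_freq=0.2):
    """Keep variant calls meeting user-set coverage and frequency thresholds.

    `variants` is a list of dicts with 'depth' (total reads covering the
    position) and 'alt_reads' (reads supporting the variant allele).
    """
    kept = []
    for v in variants:
        if v["depth"] < min_coverage:
            continue                      # insufficient coverage: call unreliable
        freq = v["alt_reads"] / v["depth"]
        if freq >= min_freq:              # drop low-frequency calls (likely noise)
            kept.append({**v, "freq": round(freq, 3)})
    return kept
```

    Exposing `min_coverage` and `min_freq` as user-customizable parameters mirrors the kind of basic tuning the Web tool offers; real pipelines additionally apply strand-bias and quality filters.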

  6. Effectiveness of Student-Generated Video as a Teaching Tool for an Instrumental Technique in the Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Jordan, Jeremy T.; Box, Melinda C.; Eguren, Kristen E.; Parker, Thomas A.; Saraldi-Gallardo, Victoria M.; Wolfe, Michael I.; Gallardo-Williams, Maria T.

    2016-01-01

    Multimedia instruction has been shown to serve as an effective learning aid for chemistry students. In this study, the viability of student-generated video instruction for organic chemistry laboratory techniques and procedure was examined and its effectiveness compared to instruction provided by a teaching assistant (TA) was evaluated. After…

  7. Next generation sequencing technology: a powerful tool for the genome characterization of sugarcane mosaic virus from Sorghum almum

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Next generation sequencing (NGS) technology was used to analyze the occurrence of viruses in Sorghum almum plants in Florida exhibiting mosaic symptoms. Total RNA was extracted from symptomatic leaves and used as a template for cDNA library preparation. The resulting library was sequenced on an Illu...

  8. The C3 Framework: A Powerful Tool for Preparing Future Generations for Informed and Engaged Civic Life

    ERIC Educational Resources Information Center

    Croddy, Marshall; Levine, Peter

    2014-01-01

    As the C3 Framework for the social studies rolls out, it is hoped that its influence will grow, offering a vision and guidance for the development of a new generation of state social studies standards that promote deeper student learning and the acquisition of essentials skills for college, career, and civic life. In the interim, it can be an…

  9. Using Tablets as Tools for Learner-Generated Drawings in the Context of Teaching the Kinetic Theory of Gases

    ERIC Educational Resources Information Center

    Lehtinen, A.; Viiri, J.

    2014-01-01

    Even though research suggests that the use of drawings could be an important part of learning science, learner-generated drawings have not received much attention in physics classrooms. This paper presents a method for recording students' drawings and group discussions using tablets. Compared to pen and paper, tablets offer unique benefits,…

  10. Using a resource effect study pre-pilot to inform a large randomized trial: the Decide2Quit.Org Web-assisted tobacco intervention.

    PubMed

    Sadasivam, Rajani S; Allison, Jeroan J; Ray, Midge N; Ford, Daniel E; Houston, Thomas K

    2012-01-01

    Resource effect studies can be useful in highlighting areas of improvement in informatics tools. Before a large randomized trial, we tested the functions of the Decide2Quit.org Web-assisted tobacco intervention using smokers (N=204) recruited via Google advertisements. These smokers were given access to Decide2Quit.org for six months and we tracked their usage and assessed their six months cessation using a rigorous follow-up. Multiple, interesting findings were identified: we found the use of tailored emails to dramatically increase participation for a short period. We also found varied effects of the different functions. Functions supporting "seeking social support" (Your Online Community and Family Tools), Healthcare Provider Tools, and the Library had positive effects on quit outcomes. One surprising finding, which needs further investigation, was that writing to our Tobacco Treatment Specialists was negatively associated with quit outcomes.

  11. Restoring the ON Switch in Blind Retinas: Opto-mGluR6, a Next-Generation, Cell-Tailored Optogenetic Tool

    PubMed Central

    van Wyk, Michiel; Pielecka-Fortuna, Justyna; Löwel, Siegrid; Kleinlogel, Sonja

    2015-01-01

    Photoreceptor degeneration is one of the most prevalent causes of blindness. Despite photoreceptor loss, the inner retina and central visual pathways remain intact over an extended time period, which has led to creative optogenetic approaches to restore light sensitivity in the surviving inner retina. The major drawbacks of all optogenetic tools recently developed and tested in mouse models are their low light sensitivity and lack of physiological compatibility. Here we introduce a next-generation optogenetic tool, Opto-mGluR6, designed for retinal ON-bipolar cells, which overcomes these limitations. We show that Opto-mGluR6, a chimeric protein consisting of the intracellular domains of the ON-bipolar cell–specific metabotropic glutamate receptor mGluR6 and the light-sensing domains of melanopsin, reliably recovers vision at the retinal, cortical, and behavioral levels under moderate daylight illumination. PMID:25950461

  12. A comprehensive tool for image-based generation of fetus and pregnant women mesh models for numerical dosimetry studies

    NASA Astrophysics Data System (ADS)

    Dahdouh, S.; Varsier, N.; Serrurier, A.; De la Plata, J.-P.; Anquez, J.; Angelini, E. D.; Wiart, J.; Bloch, I.

    2014-08-01

    Fetal dosimetry studies require the development of accurate numerical 3D models of the pregnant woman and the fetus. This paper proposes a 3D articulated fetal growth model covering the main phases of pregnancy and a pregnant woman model combining the utero-fetal structures and a deformable non-pregnant woman body envelope. The structures of interest were automatically or semi-automatically (depending on the stage of pregnancy) segmented from a database of images and surface meshes were generated. By interpolating linearly between fetal structures, each one can be generated at any age and in any position. A method is also described to insert the utero-fetal structures in the maternal body. A validation of the fetal models is proposed, comparing a set of biometric measurements to medical reference charts. The usability of the pregnant woman model in dosimetry studies is also investigated, with respect to the influence of the abdominal fat layer.
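
    The linear interpolation between reference fetal structures can be sketched as follows (an illustrative stand-in for the paper's method, assuming the reference meshes share vertex count and ordering):

```python
def interpolate_mesh(verts_a, age_a, verts_b, age_b, age):
    """Linearly interpolate mesh vertex positions between two reference
    gestational ages.

    `verts_a` and `verts_b` are lists of (x, y, z) tuples with identical
    vertex count and ordering; `age` must lie between `age_a` and `age_b`.
    """
    if not age_a <= age <= age_b:
        raise ValueError("requested age outside reference range")
    t = (age - age_a) / (age_b - age_a)   # interpolation weight in [0, 1]
    return [tuple(a + t * (b - a) for a, b in zip(va, vb))
            for va, vb in zip(verts_a, verts_b)]
```

    With meshes segmented at a set of reference stages, this yields a model at any intermediate gestational age; non-uniform growth would call for per-structure or spline interpolation instead.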

  13. electronic Ligand Builder and Optimisation Workbench (eLBOW): A tool for ligand coordinate and restraint generation

    SciTech Connect

    Moriarty, Nigel; Grosse-Kunstleve, Ralf; Adams, Paul

    2009-07-01

    The electronic Ligand Builder and Optimisation Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed as a flexible procedure that uses simple and fast quantum chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow for the attainment of a number of diverse goals, including geometry optimisation and generation of restraints.

  14. Functional connectivity associated with hand shape generation: Imitating novel hand postures and pantomiming tool grips challenge different nodes of a shared neural network.

    PubMed

    Vingerhoets, Guy; Clauwaert, Amanda

    2015-09-01

    Clinical research suggests that imitating meaningless hand postures and pantomiming tool-related hand shapes rely on different neuroanatomical substrates. We investigated the BOLD responses to different tasks of hand posture generation in 14 right-handed volunteers. Conjunction and contrast analyses were applied to select regions that were either common or sensitive to imitation and/or pantomime tasks. The selection included bilateral areas of medial and lateral extrastriate cortex, superior and inferior regions of the lateral and medial parietal lobe, primary motor and somatosensory cortex, and left dorsolateral prefrontal, and ventral and dorsal premotor cortices. Functional connectivity analysis revealed that during hand shape generation the BOLD response of every region correlated significantly with every other area regardless of the hand posture task performed, although some regions were more involved in some hand posture tasks than others. Based on between-task differences in functional connectivity we predict that imitation of novel hand postures would suffer most from left superior parietal disruption and that pantomiming hand postures for tools would be impaired following left frontal damage, whereas both tasks would be sensitive to inferior parietal dysfunction. We also found that posterior temporal cortex is committed to pantomiming tool grips, but that the involvement of this region in the execution of hand postures in general appears limited. We conclude that the generation of hand postures is subserved by a highly interconnected task-general neural network. Depending on task requirements some nodes/connections will be more engaged than others, and these task-sensitive findings are in general agreement with recent lesion studies. PMID:26095674

  16. Combining experimental evolution with next-generation sequencing: a powerful tool to study adaptation from standing genetic variation.

    PubMed

    Schlötterer, C; Kofler, R; Versace, E; Tobler, R; Franssen, S U

    2015-05-01

    Evolve and resequence (E&R) is a new approach to investigate the genomic responses to selection during experimental evolution. By using whole genome sequencing of pools of individuals (Pool-Seq), this method can identify selected variants in controlled and replicable experimental settings. Reviewing the current state of the field, we show that E&R can be powerful enough to identify causative genes and possibly even single-nucleotide polymorphisms. We also discuss how the experimental design and the complexity of the trait could result in a large number of false positive candidates. We suggest experimental and analytical strategies to maximize the power of E&R to uncover the genotype-phenotype link and serve as an important research tool for a broad range of evolutionary questions.
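The E&R logic sketched in this abstract can be illustrated with a toy simulation (not the authors' pipeline; population size, selection coefficient, and locus count below are invented for illustration): replicate populations evolve by Wright-Fisher drift, one locus carries a selective advantage, and the consistent allele-frequency rise at that locus across replicates is the signal a Pool-Seq E&R analysis would detect.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500              # diploid population size (2N chromosomes)
n_loci, n_reps, n_gens = 50, 5, 60
s = 0.1              # selection coefficient at locus 0; all other loci neutral

# Standing genetic variation: every locus starts at frequency 0.2 in every replicate.
freq = np.full((n_reps, n_loci), 0.2)
for _ in range(n_gens):
    p_sel = freq.copy()
    # Deterministic genic selection at locus 0: p' = p(1+s) / (1 + s*p)
    p_sel[:, 0] = freq[:, 0] * (1 + s) / (1 + s * freq[:, 0])
    # Binomial sampling of 2N gametes models genetic drift at every locus.
    freq = rng.binomial(2 * N, p_sel) / (2 * N)

delta = freq - 0.2
print("selected locus mean frequency rise:", round(delta[:, 0].mean(), 2))
print("neutral loci mean frequency rise:  ", round(delta[:, 1:].mean(), 3))
```

In a real E&R study the drift distribution of the neutral loci (and the replication across populations) is what separates true selection targets from the false positives the review warns about.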

  17. The GEISA system in 1996: towards an operational tool for the second generation vertical sounders radiance simulation.

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, N.; Scott, N. A.; Chedin, A.; Bonnet, B.; Barbe, A.; Tyuterev, V. G.; Champion, J. P.; Winnewisser, M.; Brown, L. R.; Gamache, R.; Golovko, V. F.; Chursin, A. A.

    1998-05-01

Since its creation in 1974, the GEISA (Gestion et Etude des Informations Spectroscopiques Atmospheriques: Management and Study of Atmospheric Spectroscopic Information) database system (more than 730,000 entries between 0 and 22,656 cm-1, corresponding to 40 molecules and 86 isotopic species in its 1992 edition) and its associated software have been widely used for forward atmospheric radiative transfer modelling with maximum reliability, tractability and efficiency. For upcoming high-spectral-resolution sounders such as IASI (Infrared Atmospheric Sounding Interferometer) and AIRS (Atmospheric InfraRed Sounder), more complete and accurate laboratory measurements of the spectroscopic parameters presently included in the database are required, and more sophisticated theoretical radiative transfer modelling should be developed. Consequently, the GEISA database is being developed into an interactive tool, named GEISA/IASI, designed to provide spectroscopic information tailored to IASI sounding radiative transfer modelling.

  18. NOTE: A software tool for 2D/3D visualization and analysis of phase-space data generated by Monte Carlo modelling of medical linear accelerators

    NASA Astrophysics Data System (ADS)

    Neicu, Toni; Aljarrah, Khaled M.; Jiang, Steve B.

    2005-10-01

    A computer program has been developed for novel 2D/3D visualization and analysis of the phase-space parameters of Monte Carlo simulations of medical accelerator radiation beams. The software is written in the IDL language and reads the phase-space data generated in the BEAMnrc/BEAM Monte Carlo code format. Contour and colour-wash plots of the fluence, mean energy, energy fluence, mean angle, spectra distribution, energy fluence distribution, angular distribution, and slices and projections of the 3D ZLAST distribution can be calculated and displayed. Based on our experience of using it at Massachusetts General Hospital, the software has proven to be a useful tool for analysis and verification of the Monte Carlo generated phase-space files. The software is in the public domain.

  19. InfiniCharges: A tool for generating partial charges via the simultaneous fit of multiframe electrostatic potential (ESP) and total dipole fluctuations (TDF)

    NASA Astrophysics Data System (ADS)

    Sant, Marco; Gabrieli, Andrea; Demontis, Pierfranco; Suffritti, Giuseppe B.

    2016-03-01

The InfiniCharges computer program, for generating reliable partial charges for molecular simulations in periodic systems, is presented here. This tool is an efficient implementation of the recently developed DM-REPEAT method, in which the stability of the resulting charges over a large set of fitting regions is obtained through the simultaneous fit of multiple electrostatic potential (ESP) configurations together with the total dipole fluctuations (TDF). Besides DM-REPEAT, the program can also perform a standard REPEAT fit and its multiframe extension (M-REPEAT), with the possibility of restraining the charges to an arbitrary value. Finally, the code is employed to generate partial charges for ZIF-90, a microporous material of the metal-organic framework (MOF) family, and an extensive analysis of the results is carried out.
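The core idea behind ESP-based charge fitting can be sketched in a few lines (an illustrative reconstruction with synthetic data, not the InfiniCharges code): choose atomic point charges that best reproduce a reference electrostatic potential on a set of grid points, subject to a total-charge constraint, by solving a constrained least-squares problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n_atoms, n_grid = 4, 200
atoms = rng.uniform(-1.0, 1.0, size=(n_atoms, 3))   # synthetic atomic positions
grid = rng.uniform(-4.0, 4.0, size=(n_grid, 3))     # synthetic ESP sampling points

# Design matrix of the Coulomb kernel (atomic units): A[g, a] = 1 / |r_g - R_a|
A = 1.0 / np.linalg.norm(grid[:, None, :] - atoms[None, :, :], axis=2)

q_true = np.array([0.4, -0.4, 0.25, -0.25])
v_ref = A @ q_true                                   # synthetic reference ESP

# Minimize |A q - v_ref|^2 subject to sum(q) = Q_total, via a Lagrange
# multiplier. The KKT system is:
#   [2 A^T A   1] [q     ]   [2 A^T v]
#   [1^T       0] [lambda] = [Q_total]
Q_total = 0.0
ones = np.ones(n_atoms)
kkt = np.block([[2 * A.T @ A, ones[:, None]],
                [ones[None, :], np.zeros((1, 1))]])
rhs = np.concatenate([2 * A.T @ v_ref, [Q_total]])
q_fit = np.linalg.solve(kkt, rhs)[:n_atoms]
print(np.round(q_fit, 3))
```

Methods such as REPEAT and DM-REPEAT address the ill-conditioning of this fit for buried atoms in periodic frameworks, which is why multiple ESP frames and dipole fluctuations are brought in; the sketch above shows only the single-frame constrained fit.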

  20. Hydroquinone-quinone oxidation by molecular oxygen: a simple tool for signal amplification through auto-generation of hydrogen peroxide.

    PubMed

    Sella, Eran; Shabat, Doron

    2013-08-21

    Signal amplification methods are of obvious importance for various diagnostic assays. We have developed a new small-molecule-based probe that, upon activation with sub-stoichiometric amounts of hydrogen peroxide, produces an auto-inductive amplification reaction. The signal is produced through the oxidation reaction of hydroquinone to the corresponding quinone derivative by molecular oxygen. This oxidation is accompanied by the formation of hydrogen peroxide, which can enter the amplification sequence and initiate a new diagnostic cycle. The generated quinone is composed of a donor-acceptor conjugated pair and fluoresces at a distinct wavelength, allowing the formation to be monitored by a convenient fluorescence assay.
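The auto-inductive cycle described here can be mimicked with a toy iteration (all quantities and the per-cycle gain are invented for illustration; this is not a kinetic model from the paper): peroxide activates probe molecules, the resulting hydroquinone-to-quinone oxidation both produces fluorescent product and regenerates peroxide, and with an effective gain above one the signal grows exponentially until the probe is exhausted.

```python
probe = 1.0e6     # intact probe molecules (arbitrary units)
h2o2 = 10.0       # sub-stoichiometric initiator
signal = 0.0      # fluorescent quinone produced so far
gain = 2.0        # assumed effective H2O2 regenerated per activation (> 1)

history = []
for cycle in range(30):
    activated = min(h2o2, probe)   # toy rule: one H2O2 activates one probe
    probe -= activated
    signal += activated            # quinone formation -> fluorescence readout
    h2o2 = activated * gain        # regenerated peroxide feeds the next cycle
    history.append(signal)

print("final signal:", signal)
```

The qualitative point survives the crude assumptions: any cycle whose effective gain exceeds one turns a sub-stoichiometric trigger into a full-scale, plateauing signal, which is what makes such probes useful for diagnostic amplification.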

  1. Using tablets as tools for learner-generated drawings in the context of teaching the kinetic theory of gases

    NASA Astrophysics Data System (ADS)

    Lehtinen, A.; Viiri, J.

    2014-05-01

    Even though research suggests that the use of drawings could be an important part of learning science, learner-generated drawings have not received much attention in physics classrooms. This paper presents a method for recording students’ drawings and group discussions using tablets. Compared to pen and paper, tablets offer unique benefits, which include the recording of the whole drawing process and of the discussion associated with the drawing. A study, which investigated the use of drawings and the need for guidance among Finnish upper secondary school students, is presented alongside ideas for teachers on how to see drawing in a new light.

  2. Appendix 2. Guide for Running AgMIP Climate Scenario Generation Tools with R in Windows, Version 2.3

    NASA Technical Reports Server (NTRS)

    Hudson, Nicholas; Ruane, Alexander Clark

    2013-01-01

This Guide explains how to create climate series and climate change scenarios by using the AgMIP Climate team's methodology as outlined in the AgMIP Guide for Regional Assessment: Handbook of Methods and Procedures. It details how to install R and the required packages to run the AgMIP Climate Scenario Generation scripts, and how to create climate scenarios from CMIP5 GCMs using a 30-year baseline daily weather dataset. The Guide also outlines a workflow that can be modified for application to your own climate data.

  3. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881 Deciding whether someone is your parent or stepparent. (a) We consider your parent to be— (1) Your...

  4. 34 CFR 644.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 644.20 How does the Secretary decide which new grants to make? (a) The Secretary... not make a new grant to an applicant if the applicant's prior project involved the fraudulent use...

  5. 34 CFR 646.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 646.20 How does the Secretary decide which new grants to make? (a) The Secretary... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  6. 34 CFR 645.30 - How does the Secretary decide which grants to make?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary decide which grants to make? 645... Make a Grant? § 645.30 How does the Secretary decide which grants to make? (a) The Secretary evaluates... Program. (d) The Secretary may decline to make a grant to an applicant that carried out a project...

  7. 34 CFR 647.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary decide which new grants to make... PROGRAM How Does the Secretary Make a Grant? § 647.20 How does the Secretary decide which new grants to... Secretary considers only the locations of new projects. (d) The Secretary does not make a new grant to...

  8. 34 CFR 647.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary decide which new grants to make... PROGRAM How Does the Secretary Make a Grant? § 647.20 How does the Secretary decide which new grants to... Secretary considers only the locations of new projects. (d) The Secretary does not make a new grant to...

  9. 34 CFR 645.30 - How does the Secretary decide which grants to make?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary decide which grants to make? 645... Make a Grant? § 645.30 How does the Secretary decide which grants to make? (a) The Secretary evaluates... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  10. 34 CFR 644.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 644.20 How does the Secretary decide which new grants to make? (a) The Secretary... not make a new grant to an applicant if the applicant's prior project involved the fraudulent use...

  11. 34 CFR 644.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 644.20 How does the Secretary decide which new grants to make? (a) The Secretary... not make a new grant to an applicant if the applicant's prior project involved the fraudulent use...

  12. 34 CFR 647.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary decide which new grants to make... PROGRAM How Does the Secretary Make a Grant? § 647.20 How does the Secretary decide which new grants to... decline to make a grant to an applicant that carried out a Federal TRIO Program project that involved...

  13. 34 CFR 644.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 644.20 How does the Secretary decide which new grants to make? (a) The Secretary... not make a new grant to an applicant if the applicant's prior project involved the fraudulent use...

  14. 34 CFR 643.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary decide which new grants to make...) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TALENT SEARCH How Does the Secretary Make a Grant? § 643.20 How does the Secretary decide which new grants to make? (a) The Secretary evaluates...

  15. 34 CFR 643.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary decide which new grants to make...) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TALENT SEARCH How Does the Secretary Make a Grant? § 643.20 How does the Secretary decide which new grants to make? (a) The Secretary evaluates...

  16. 34 CFR 645.30 - How does the Secretary decide which grants to make?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary decide which grants to make? 645... Make a Grant? § 645.30 How does the Secretary decide which grants to make? (a) The Secretary evaluates... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  17. 34 CFR 646.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 646.20 How does the Secretary decide which new grants to make? (a) The Secretary... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  18. 34 CFR 645.30 - How does the Secretary decide which grants to make?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary decide which grants to make? 645... Make a Grant? § 645.30 How does the Secretary decide which grants to make? (a) The Secretary evaluates... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  19. 34 CFR 647.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary decide which new grants to make... PROGRAM How Does the Secretary Make a Grant? § 647.20 How does the Secretary decide which new grants to... Secretary considers only the locations of new projects. (d) The Secretary does not make a new grant to...

  20. 34 CFR 643.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary decide which new grants to make...) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TALENT SEARCH How Does the Secretary Make a Grant? § 643.20 How does the Secretary decide which new grants to make? (a) The Secretary evaluates...

  1. 34 CFR 643.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary decide which new grants to make...) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TALENT SEARCH How Does the Secretary Make a Grant? § 643.20 How does the Secretary decide which new grants to make? (a) The Secretary evaluates...

  2. 34 CFR 646.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 646.20 How does the Secretary decide which new grants to make? (a) The Secretary... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  3. 34 CFR 644.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 644.20 How does the Secretary decide which new grants to make? (a) The Secretary...) The Secretary may decline to make a grant to an applicant that carried out a project that involved...

  4. 34 CFR 646.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 646.20 How does the Secretary decide which new grants to make? (a) The Secretary... Support Services Program. (d) The Secretary does not make grants to applicants that carried out a...

  5. 34 CFR 646.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 646.20 How does the Secretary decide which new grants to make? (a) The Secretary... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  6. 34 CFR 647.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary decide which new grants to make... PROGRAM How Does the Secretary Make a Grant? § 647.20 How does the Secretary decide which new grants to... Secretary considers only the locations of new projects. (d) The Secretary does not make a new grant to...

  7. 34 CFR 645.30 - How does the Secretary decide which grants to make?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary decide which grants to make? 645... Make a Grant? § 645.30 How does the Secretary decide which grants to make? (a) The Secretary evaluates... Secretary does not make a new grant to an applicant if the applicant's prior project involved the...

  8. 49 CFR 40.377 - Who decides whether to issue a PIE?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Who decides whether to issue a PIE? 40.377 Section 40.377 Transportation Office of the Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Public Interest Exclusions § 40.377 Who decides whether to...

  9. 20 CFR 404.708 - How we decide what is enough evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false How we decide what is enough evidence. 404.708 Section 404.708 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Evidence General § 404.708 How we decide what is enough evidence. When you...

  10. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2010-04-01 2010-04-01 false Who decides how Language Development funds can be...

  11. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2013-04-01 2013-04-01 false Who decides how Language Development funds can be...

  12. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2014-04-01 2014-04-01 false Who decides how Language Development funds can be...

  13. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2011-04-01 2011-04-01 false Who decides how Language Development funds can be...

  14. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false Who decides where Job Corps centers will be... LABOR (CONTINUED) THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located?...

  15. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Who decides where Job Corps centers will be... LABOR THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located? (a) The...

  16. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  17. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... suspending or debarring the provider, based on the same facts. Effect of Debarment...

  18. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... suspending or debarring the provider, based on the same facts. Effect of Debarment...

  19. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  20. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  1. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH..., ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding an Appeal before...

  2. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  3. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH..., ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding an Appeal before...

  4. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  5. Decided and Undecided Students: Career Self-Efficacy, Negative Thinking, and Decision-Making Difficulties

    ERIC Educational Resources Information Center

    Bullock-Yowell, Emily; McConnell, Amy E.; Schedin, Emily A.

    2014-01-01

    The career concern differences between undecided and decided college students (N = 223) are examined. Undecided college students (n = 83) reported lower career decision-making self-efficacy, higher incidences of negative career thoughts, and more career decision-making difficulties than their decided peers (n = 143). Results reveal that undecided…

  6. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether...

  7. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether...

  8. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve...

  9. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve...

  10. Radical generating coordination complexes as tools for rapid and effective fragmentation and fluorescent labeling of nucleic acids for microchip hybridization.

    SciTech Connect

    Kelly, J. J.; Chernov, B. N.; Mirzabekov, A. D.; Bavykin, S. G.; Biochip Technology Center; Northwestern Univ.; Engelhardt Inst. of Molecular Biology

    2002-01-01

DNA microchip technology is a rapid, high-throughput method for nucleic acid hybridization reactions. This technology requires random fragmentation and fluorescent labeling of target nucleic acids prior to hybridization. Radical-generating coordination complexes, such as 1,10-phenanthroline-Cu(II) (OP-Cu) and Fe(II)-EDTA (Fe-EDTA), have been commonly used as sequence nonspecific 'chemical nucleases' to introduce single-strand breaks in nucleic acids. Here we describe a new method based on these radical-generating complexes for random fragmentation and labeling of both single- and double-stranded forms of RNA and DNA. Nucleic acids labeled with the OP-Cu and the Fe-EDTA protocols revealed high hybridization specificity in hybridization with DNA microchips containing oligonucleotide probes selected for identification of 16S rRNA sequences of the Bacillus group microorganisms. We also demonstrated cDNA- and cRNA-labeling and fragmentation with this method. Both the OP-Cu and Fe-EDTA fragmentation and labeling procedures are quick and inexpensive compared to other commonly used methods. A column-based version of the described method does not require centrifugation and therefore is promising for the automation of sample preparations in DNA microchip technology as well as in other nucleic acid hybridization studies.

  11. Family presence during cardiopulmonary resuscitation: who should decide?

    PubMed

    Lederman, Zohar; Garasic, Mirko; Piperberg, Michelle

    2014-05-01

Whether to allow the presence of family members during cardiopulmonary resuscitation (CPR) has been a highly contentious topic in recent years. Even though a great deal of evidence and professional guidelines support the option of family presence during resuscitation (FPDR), many healthcare professionals still oppose it. One of the main arguments espoused by the latter is that family members should not be present, for the sake of the patient's best interests, whether to increase his chances of survival, respect his privacy, or leave his family with a last positive impression of him. In this paper, we examine the issue of FPDR from the patient's point of view. Since the patient requires CPR, he is invariably unconscious and therefore incompetent. We discuss the Autonomy Principle and the Three-Tiered process for surrogate decision making, as well as the Beneficence Principle, and show that these are limited in providing an adequate tool for decision making in this particular case. Rather, we rely on a novel principle (or, rather, a novel specification of an existing principle) and a novel integrated model for surrogate decision making. We show that this model is more satisfactory in taking the patient's true wishes into consideration and encourages a joint decision-making process by all parties involved.

  12. Cenp-E inhibitor GSK923295: Novel synthetic route and use as a tool to generate aneuploidy.

    PubMed

    Bennett, Ailsa; Bechi, Beatrice; Tighe, Anthony; Thompson, Sarah; Procter, David J; Taylor, Stephen S

    2015-08-28

    Aneuploidy is a common feature of cancer, with human solid tumour cells typically harbouring abnormal chromosome complements. The aneuploidy observed in cancer is often caused by a chromosome instability phenotype, resulting in genomic heterogeneity. However, the role aneuploidy and chromosome instability play in tumour evolution and chemotherapy response remains poorly understood. In some contexts, aneuploidy has oncogenic effects, whereas in others it is anti-proliferative and tumour-suppressive. Dissecting fully the role aneuploidy plays in tumourigenesis requires tools and facile assays that allow chromosome missegregation to be induced experimentally in cells that are otherwise diploid and chromosomally stable. Here, we describe a chemical biology approach that induces low-level aneuploidy across a large population of cells. Specifically, cells are first exposed to GSK923295, an inhibitor targeting the mitotic kinesin Cenp-E; while the majority of chromosomes align at the cell's equator, a small number cluster near the spindle poles. By then driving these cells into anaphase using AZ3146, an inhibitor targeting the spindle checkpoint kinase Mps1, the polar chromosomes are missegregated. This results in, on average, two chromosome missegregation events per division, and avoids trapping chromosomes in the spindle midzone, which could otherwise lead to DNA damage. We also describe an efficient route for the synthesis of GSK923295 that employs a novel enzymatic resolution. Together, the approaches described here open up new opportunities for studying cellular responses to aneuploidy.

  13. COV2HTML: A Visualization and Analysis Tool of Bacterial Next Generation Sequencing (NGS) Data for Postgenomics Life Scientists

    PubMed Central

    Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-01-01

COV2HTML is an interactive web interface, aimed at biologists, that allows both coverage visualization and analysis of NGS alignments performed on prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts huge NGS mapping or coverage files into light, element-specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of the data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first data set came from an RNA-seq analysis of Campylobacter jejuni, based on a comparison of two conditions with two replicates; using COV2HTML, we were able to recover 26 of the 27 genes highlighted in the publication. The second comprised stranded TSS and RNA-seq data sets from the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and lets biologists visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is that it makes NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/. This website is free and open to users without any login requirement. PMID:24512253

  14. COV2HTML: a visualization and analysis tool of bacterial next generation sequencing (NGS) data for postgenomics life scientists.

    PubMed

    Monot, Marc; Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-03-01

COV2HTML is an interactive web interface, aimed at biologists, that allows both coverage visualization and analysis of NGS alignments performed on prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts huge NGS mapping or coverage files into light, element-specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of the data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first data set came from an RNA-seq analysis of Campylobacter jejuni, based on a comparison of two conditions with two replicates; using COV2HTML, we were able to recover 26 of the 27 genes highlighted in the publication. The second comprised stranded TSS and RNA-seq data sets from the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and lets biologists visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is that it makes NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/. This website is free and open to users without any login requirement.

  15. Cenp-E inhibitor GSK923295: Novel synthetic route and use as a tool to generate aneuploidy

    PubMed Central

    Bennett, Ailsa; Bechi, Beatrice; Tighe, Anthony; Thompson, Sarah; Procter, David J.; Taylor, Stephen S.

    2015-01-01

    Aneuploidy is a common feature of cancer, with human solid tumour cells typically harbouring abnormal chromosome complements. The aneuploidy observed in cancer is often caused by a chromosome instability phenotype, resulting in genomic heterogeneity. However, the role aneuploidy and chromosome instability play in tumour evolution and chemotherapy response remains poorly understood. In some contexts, aneuploidy has oncogenic effects, whereas in others it is anti-proliferative and tumour-suppressive. Dissecting fully the role aneuploidy plays in tumourigenesis requires tools and facile assays that allow chromosome missegregation to be induced experimentally in cells that are otherwise diploid and chromosomally stable. Here, we describe a chemical biology approach that induces low-level aneuploidy across a large population of cells. Specifically, cells are first exposed to GSK923295, an inhibitor targeting the mitotic kinesin Cenp-E; while the majority of chromosomes align at the cell's equator, a small number cluster near the spindle poles. By then driving these cells into anaphase using AZ3146, an inhibitor targeting the spindle checkpoint kinase Mps1, the polar chromosomes are missegregated. This results in, on average, two chromosome missegregation events per division, and avoids trapping chromosomes in the spindle midzone, which could otherwise lead to DNA damage. We also describe an efficient route for the synthesis of GSK923295 that employs a novel enzymatic resolution. Together, the approaches described here open up new opportunities for studying cellular responses to aneuploidy. PMID:26320186

  16. Evolution of the design methodologies for the next generation of RPV Extensive role of the thermal-hydraulics numerical tools

    SciTech Connect

    Goreaud, Nicolas; Nicaise, Norbert; Stoudt, Roger

    2004-07-01

The thermal-hydraulic design of the first PWRs was mainly based on an experimental approach, with a large series of tests on the main equipment (control rod guide tubes, RPV plenums, etc.) to check its performance. The development of CFD codes and computers now allows complex simulations of hydraulic phenomena. Provided adequate qualification, these numerical tools are efficient means to determine the hydraulics of a given design and to perform sensitivity studies for the optimization of new designs. Experiments still play their role, first for qualification, and then for validation at the last stage of the design. The design of the European Pressurized Water Reactor (EPR) is based on both hydraulic calculations and experiments, handled in a complementary approach. This paper describes the effort launched by Framatome-ANP on hydraulic calculations for the Reactor Pressure Vessel (RPV) of the EPR reactor. It concerns 3D calculations of the RPV inlet including cold legs, the RPV downcomer and lower plenum, and the RPV upper plenum up to and including the hot legs. It covers normal operating conditions, but also accident conditions such as PTS (Pressurized Thermal Shock) during a small-break loss-of-coolant accident (SB-LOCA). These hydraulic studies have provided much useful information for the mechanical design of the RPV internals. (authors)

  17. Local stakeholders' acceptance of model-generated data used as a communication tool in water management: The Rönneå Study.

    PubMed

    Alkan Olsson, Johanna; Berg, Karin

    2005-11-01

The objective of this study was to increase knowledge of local stakeholders' acceptance of model-generated data used as a communication tool in water quality management. The Rönneå catchment in southwest Sweden was chosen as the study area. The results indicate that the model-generated data served as a uniting factor. At the same time, the stakeholders had concerns about the presented data, the main problems being pollution sources that were not accounted for, a perceived lack of trustworthiness in pollution measurements, and uncertainty about the impact of natural variation and delayed effects. Four clusters of factors were identified as influencing stakeholders' acceptance of the model-generated data: confidence in its practical applications; confidence in the people involved in or providing material for the dialog (such as experts, decision-makers, and media); the social characteristics of the participants (such as age and profession); and the way the data were communicated (such as the tone of communication, group composition, duration, and geographical scope of the dialog). The perceived fairness of the practical application of the model-generated data was also an important factor for acceptance.

  18. Next-Generation Sequencing: A powerful tool for the discovery of molecular markers in breast ductal carcinoma in situ

    PubMed Central

    Kaur, Hitchintan; Mao, Shihong; Shah, Seema; Gorski, David H.; Krawetz, Stephen A.; Sloane, Bonnie F.; Mattingly, Raymond R.

    2013-01-01

    Mammographic screening leads to frequent biopsies and concomitant overdiagnosis of breast cancer, particularly ductal carcinoma in situ (DCIS). Some DCIS lesions rapidly progress to invasive carcinoma whereas others remain indolent. Because we cannot yet predict which lesions will not progress, all DCIS is regarded as malignant, and many women are overtreated. Thus, there is a pressing need for a panel of molecular markers in addition to the current clinical and pathologic factors to provide prognostic information. Genomic technologies such as microarrays have made major contributions to defining sub-types of breast cancer. Next-generation sequencing (NGS) modalities offer unprecedented depth of expression analysis through revealing transcriptional boundaries, mutations, rare transcripts and alternative splice variants. NGS approaches are just beginning to be applied to DCIS. Here, we review the applications and challenges of NGS in discovering novel potential therapeutic targets and candidate biomarkers in the premalignant progression of breast cancer. PMID:23477556

  19. Developmental dysplasia of the hip: usefulness of next generation genomic tools for characterizing the underlying genes - a mini review.

    PubMed

    Basit, S; Hannan, M A; Khoshhal, K I

    2016-07-01

Developmental dysplasia of the hip (DDH) is one of the most common skeletal anomalies. DDH encompasses a spectrum of the disorder, ranging from minor acetabular dysplasia to irreducible dislocation, which may lead to premature arthritis in later life. Involvement of genetic factors underlying DDH became evident when several studies reported chromosomal loci linked to DDH in families with multiple affected individuals. Moreover, using association studies, variants in genes involved in chondrogenesis and joint formation have been shown to be associated with DDH. At least one study identified a pathogenic variant in a chemokine receptor gene in DDH. No genetic analysis has been reported or carried out in DDH patients from the Middle East. Here, we review the literature related to the genetics of DDH and emphasize the usefulness of next-generation technologies in identifying genetic variants underlying DDH in consanguineous families. PMID:26842108

  20. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo

    2016-04-01

of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic-wave coda duration. Starting from the magnitude estimates, other relevant quantities are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps are produced on a Google map for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or as a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, local earthquakes that occurred within the network are automatically discriminated from regional/teleseismic events that occurred outside it. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available information is stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may represent a relevant tool not only for seismologists, but also for non
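
The abstract does not state which duration-magnitude calibration the system uses. Purely as an illustration of how Md follows from a coda-duration measurement, here is the classic Lee et al. (1972) California formula, with its coefficients standing in for whatever regional calibration the real software applies:

```python
import math

def duration_magnitude(tau_s, dist_km, a=-0.87, b=2.0, c=0.0035):
    """Md = a + b*log10(tau) + c*dist: duration magnitude from coda length.

    tau_s: signal (coda) duration in seconds; dist_km: epicentral distance.
    Coefficients are the Lee et al. (1972) values, used here only as an example.
    """
    return a + b * math.log10(tau_s) + c * dist_km

print(round(duration_magnitude(60.0, 20.0), 2))  # a 60 s coda at 20 km -> Md 2.76
```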

  1. miRMOD: a tool for identification and analysis of 5' and 3' miRNA modifications in Next Generation Sequencing small RNA data.

    PubMed

    Kaushik, Abhinav; Saraf, Shradha; Mukherjee, Sunil K; Gupta, Dinesh

    2015-01-01

In the past decade, microRNAs (miRNAs) have emerged as important regulators of gene expression across various species. Several studies have confirmed different types of post-transcriptional modifications at the terminal ends of miRNAs. The reports indicate that miRNA modifications are conserved and functionally significant, as they may affect miRNA stability and the ability to bind mRNA targets, hence affecting target gene repression. Next Generation Sequencing (NGS) of small RNA (sRNA) provides an efficient and reliable method to explore miRNA modifications. The need for dedicated software, especially for users with little knowledge of computers, to determine and analyze miRNA modifications in sRNA NGS data motivated us to develop miRMOD. miRMOD is a user-friendly, Microsoft Windows-based graphical user interface (GUI) tool for identification and analysis of 5' and 3' miRNA modifications (non-templated nucleotide additions and trimming) in sRNA NGS data. In addition to identifying miRNA modifications, the tool also predicts and compares the targets of query and modified miRNAs. To compare binding affinities for the same target, miRMOD utilizes the minimum free energies of the miRNA:target and modified-miRNA:target interactions. Comparison of the binding energies may guide experimental exploration of miRNA post-transcriptional modifications. The tool is available as a stand-alone package to overcome the large data-transfer problems commonly faced by web-based high-throughput (HT) sequencing data analysis tools. The miRMOD package is freely available at http://bioinfo.icgeb.res.in/miRMOD. PMID:26623179
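
The abstract does not describe miRMOD's matching algorithm, but the core idea of calling terminal modifications, comparing each sRNA read against its reference mature miRNA, can be sketched. The function below is illustrative only (not miRMOD's code) and handles just the simple 3' cases of trimming and non-templated tailing:

```python
def classify_3p_modification(read, ref):
    """Classify a small-RNA read's 3' end against the reference mature miRNA.

    Returns ('match', ''), ('trimmed', <bases removed>) or
    ('addition', <non-templated tail>); mixed and 5' events are ignored here.
    """
    if read == ref:
        return ("match", "")
    if ref.startswith(read):          # read is a 3'-shortened prefix of ref
        return ("trimmed", len(ref) - len(read))
    if read.startswith(ref):          # read carries a non-templated 3' tail
        return ("addition", read[len(ref):])
    return ("other", "")

ref = "UGAGGUAGUAGGUUGUAUAGUU"        # mature hsa-let-7a-5p
print(classify_3p_modification(ref + "U", ref))    # 3' uridylation
print(classify_3p_modification(ref[:-2], ref))     # 2-nt 3' trimming
```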

  2. On resonant ICRF absorption in three-ion component plasmas: a new promising tool for fast ion generation

    NASA Astrophysics Data System (ADS)

    Kazakov, Ye. O.; Van Eester, D.; Dumont, R.; Ongena, J.

    2015-03-01

We report on a very efficient ion-cyclotron-resonance-frequency (ICRF) absorption scheme (Z)-Y-X, which hinges on the presence of three ion species in the plasma. A mode conversion (cutoff-resonance) layer is well known to appear in two-ion-species plasmas. If the location of the L-cutoff in Y-X plasmas, which can be controlled by varying the Y:X density ratio, almost coincides with the fundamental cyclotron resonance of the third ion species Z (the resonant absorber), the latter, albeit present only in trace quantities, is shown to absorb almost all the incoming RF power. A quantitative criterion for the resonant Y:X plasma composition is derived and a few numerical examples are given. Since the absorbed power per resonant particle is much larger than for any other ICRF scheme, the scenarios discussed here are particularly promising for fast-particle generation. Their possible application as a source of high-energy ions for the stellarator W7-X, and to mimic alpha particles during the non-activated phase of the ITER tokamak, is briefly discussed.
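
For orientation only (the paper's exact composition criterion is not reproduced here), the scheme rests on the fundamental ion-cyclotron resonance and on the trace absorber Z having a charge-to-mass ratio lying between those of the two main species:

```latex
\omega_{cZ} = \frac{Z_Z e B}{m_Z}, \qquad
\min\left[\left(\frac{Z}{A}\right)_X, \left(\frac{Z}{A}\right)_Y\right]
< \left(\frac{Z}{A}\right)_Z <
\max\left[\left(\frac{Z}{A}\right)_X, \left(\frac{Z}{A}\right)_Y\right]
```

Varying the Y:X density ratio then moves the L-cutoff of the two-ion mixture onto the Z cyclotron layer, concentrating the incoming RF power on the trace species.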

  3. Generation of growth arrested Leishmania amastigotes: a tool to develop live attenuated vaccine candidates against visceral leishmaniasis.

    PubMed

    Selvapandiyan, Angamuthu; Dey, Ranadhir; Gannavaram, Sreenivas; Solanki, Sumit; Salotra, Poonam; Nakhasi, Hira L

    2014-06-30

Visceral leishmaniasis (VL) is fatal if not treated and is widely prevalent in tropical and sub-tropical regions of the world. VL is caused by the protozoan parasite Leishmania donovani or Leishmania infantum. Although several second-generation vaccines have been licensed to protect dogs against VL, there are no effective vaccines against human VL [1]. Since people cured of leishmaniasis develop lifelong protection, development of live attenuated Leishmania parasites as vaccines, which can produce a controlled infection, may be a close surrogate to leishmanization. This can be achieved by deletion of genes involved in the regulation of growth and/or virulence of the parasite. Such mutant parasites generally do not revert to virulence in animal models, even under conditions of induced immune suppression, owing to the complete deletion of the essential gene(s). In the Leishmania life cycle, the intracellular amastigote form is the virulent form and causes disease in the mammalian host. We developed centrin gene-deleted L. donovani parasites that displayed attenuated growth only in the amastigote stage and were found safe and efficacious against virulent challenge in experimental animal models. Thus, targeting genes differentially expressed in the amastigote stage would potentially attenuate only the amastigote stage, and the resulting controlled infectivity may be effective in developing immunity. This review lays out strategies for attenuating the growth of the amastigote form of Leishmania for use as live vaccines against leishmaniasis, with a focus on visceral leishmaniasis. PMID:24837513

  4. Generation of growth arrested Leishmania amastigotes: a tool to develop live attenuated vaccine candidates against visceral leishmaniasis.

    PubMed

    Selvapandiyan, Angamuthu; Dey, Ranadhir; Gannavaram, Sreenivas; Solanki, Sumit; Salotra, Poonam; Nakhasi, Hira L

    2014-06-30

Visceral leishmaniasis (VL) is fatal if not treated and is widely prevalent in tropical and sub-tropical regions of the world. VL is caused by the protozoan parasite Leishmania donovani or Leishmania infantum. Although several second-generation vaccines have been licensed to protect dogs against VL, there are no effective vaccines against human VL [1]. Since people cured of leishmaniasis develop lifelong protection, development of live attenuated Leishmania parasites as vaccines, which can produce a controlled infection, may be a close surrogate to leishmanization. This can be achieved by deletion of genes involved in the regulation of growth and/or virulence of the parasite. Such mutant parasites generally do not revert to virulence in animal models, even under conditions of induced immune suppression, owing to the complete deletion of the essential gene(s). In the Leishmania life cycle, the intracellular amastigote form is the virulent form and causes disease in the mammalian host. We developed centrin gene-deleted L. donovani parasites that displayed attenuated growth only in the amastigote stage and were found safe and efficacious against virulent challenge in experimental animal models. Thus, targeting genes differentially expressed in the amastigote stage would potentially attenuate only the amastigote stage, and the resulting controlled infectivity may be effective in developing immunity. This review lays out strategies for attenuating the growth of the amastigote form of Leishmania for use as live vaccines against leishmaniasis, with a focus on visceral leishmaniasis.

  5. Bioanalytical tools for the evaluation of organic micropollutants during sewage treatment, water recycling and drinking water generation.

    PubMed

    Macova, Miroslava; Toze, Simon; Hodgers, Leonie; Mueller, Jochen F; Bartkow, Michael; Escher, Beate I

    2011-08-01

    performed that allows direct comparison of different treatment technologies and covers several orders of magnitude of TEQ from highly contaminated sewage to drinking water with TEQ close or below the limit of detection. Detection limits of the bioassays were decreased in comparison to earlier studies by optimizing sample preparation and test protocols, and were comparable to or lower than the quantification limits of the routine chemical analysis, which allowed monitoring of the presence and removal of micropollutants post Barrier 2 and in drinking water. The results obtained by bioanalytical tools were reproducible, robust and consistent with previous studies assessing the effectiveness of the wastewater and advanced water treatment plants. The results of this study indicate that bioanalytical results expressed as TEQ are useful to assess removal efficiency of micropollutants throughout all treatment steps of water recycling. PMID:21704353

  6. Bioanalytical tools for the evaluation of organic micropollutants during sewage treatment, water recycling and drinking water generation.

    PubMed

    Macova, Miroslava; Toze, Simon; Hodgers, Leonie; Mueller, Jochen F; Bartkow, Michael; Escher, Beate I

    2011-08-01

    performed that allows direct comparison of different treatment technologies and covers several orders of magnitude of TEQ from highly contaminated sewage to drinking water with TEQ close or below the limit of detection. Detection limits of the bioassays were decreased in comparison to earlier studies by optimizing sample preparation and test protocols, and were comparable to or lower than the quantification limits of the routine chemical analysis, which allowed monitoring of the presence and removal of micropollutants post Barrier 2 and in drinking water. The results obtained by bioanalytical tools were reproducible, robust and consistent with previous studies assessing the effectiveness of the wastewater and advanced water treatment plants. The results of this study indicate that bioanalytical results expressed as TEQ are useful to assess removal efficiency of micropollutants throughout all treatment steps of water recycling.
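
Expressing results as toxic equivalent concentrations (TEQ) makes the removal arithmetic across treatment barriers straightforward. A trivial sketch (the TEQ values below are hypothetical, not the study's measurements):

```python
def removal_efficiency(teq_in, teq_out):
    """Fractional removal of TEQ across one treatment barrier."""
    return 1.0 - teq_out / teq_in

# Hypothetical TEQ (ng/L) entering/leaving two successive barriers.
stages = [("secondary treatment", 120.0, 15.0), ("advanced barrier", 15.0, 0.3)]
for name, teq_in, teq_out in stages:
    print(f"{name}: {removal_efficiency(teq_in, teq_out):.1%} removed")
```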

  7. Metabolomics as a Hypothesis-Generating Functional Genomics Tool for the Annotation of Arabidopsis thaliana Genes of “Unknown Function”

    PubMed Central

    Quanbeck, Stephanie M.; Brachova, Libuse; Campbell, Alexis A.; Guan, Xin; Perera, Ann; He, Kun; Rhee, Seung Y.; Bais, Preeti; Dickerson, Julie A.; Dixon, Philip; Wohlgemuth, Gert; Fiehn, Oliver; Barkan, Lenore; Lange, Iris; Lange, B. Markus; Lee, Insuk; Cortes, Diego; Salazar, Carolina; Shuman, Joel; Shulaev, Vladimir; Huhman, David V.; Sumner, Lloyd W.; Roth, Mary R.; Welti, Ruth; Ilarslan, Hilal; Wurtele, Eve S.; Nikolau, Basil J.

    2012-01-01

Metabolomics is the methodology that identifies and measures the global pools of small molecules (of less than about 1,000 Da) in a biological sample, which are collectively called the metabolome. Metabolomics can therefore reveal the metabolic outcome of a genetic or environmental perturbation of a metabolic regulatory network, and thus provide insights into the structure and regulation of that network. Because of the chemical complexity of the metabolome and the limitations of individual analytical platforms, it is currently difficult to capture the complete metabolome of an organism or tissue, in contrast to genomics and transcriptomics. This paper describes the analysis of Arabidopsis metabolomics data sets acquired by a consortium that includes five analytical laboratories, bioinformaticists, and biostatisticians, and that aims to develop and validate metabolomics as a hypothesis-generating functional genomics tool. The consortium is determining the metabolomes of Arabidopsis T-DNA mutant stocks, grown in a standardized controlled environment optimized to minimize environmental impacts on the metabolomes. Metabolomics data were generated with seven analytical platforms, and the combined data are being provided to the research community to formulate initial hypotheses about genes of unknown function (GUFs). A public database (www.PlantMetabolomics.org) has been developed to provide the scientific community with access to the data, along with tools for interactive analysis. Exemplary data sets are discussed to validate the approach, illustrating how initial hypotheses can be generated from the consortium-produced metabolomics data and integrated with prior knowledge to provide testable hypotheses concerning the functionality of GUFs. PMID:22645570

  8. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo

    2016-04-01

of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic-wave coda duration. Starting from the magnitude estimates, other relevant quantities are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps are produced on a Google map for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or as a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, local earthquakes that occurred within the network are automatically discriminated from regional/teleseismic events that occurred outside it. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available information is stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may represent a relevant tool not only for seismologists, but also for non

  9. Fine tuning of trehalose biosynthesis and hydrolysis as novel tools for the generation of abiotic stress tolerant plants

    PubMed Central

    Delorge, Ines; Janiak, Michal; Carpentier, Sebastien; Van Dijck, Patrick

    2014-01-01

The impact of abiotic stress on plant growth and development has been and still is a major research topic. An important pathway that has been linked to abiotic stress tolerance is the trehalose biosynthetic pathway. Recent findings showed that trehalose metabolism is also important for normal plant growth and development. The intermediate compound, trehalose-6-phosphate (T6P), is now confirmed to act as a sensor for available sucrose, thereby directly influencing the type of response to changing environmental conditions. This is possible because T6P and/or trehalose, or their biosynthetic enzymes, are part of complex interaction networks with other crucial hormone- and sugar-induced signaling pathways, which may function at different developmental stages. Because of its effect on plant growth and development, modification of trehalose biosynthesis, whether at the level of T6P synthesis, T6P hydrolysis, or trehalose hydrolysis, has been used to try to improve crop yield and biomass. Altering the amounts of T6P and/or trehalose did increase stress tolerance, but also produced many unexpected phenotypic alterations. A main challenge is to characterize the part of the signaling pathway that results in improved stress tolerance without affecting the pathways that produce the unwanted phenotypes. One specific pathway where modification of trehalose metabolism improved stress tolerance, without any side effects, was recently obtained by overexpression of trehalase, which results in a more sensitive reaction of the stomatal guard cells and closing of the stomata under drought stress conditions. We use the data obtained from these different studies to outline the optimal plant that could be constructed based on modifications of trehalose metabolism. PMID:24782885

  10. Constraint-Based Model of Shewanella oneidensis MR-1 Metabolism: A Tool for Data Analysis and Hypothesis Generation

    PubMed Central

    Hill, Eric A.; Geydebrekht, Oleg V.; De Ingeniis, Jessica; Zhang, Xiaolin; Osterman, Andrei; Scott, James H.; Reed, Samantha B.; Romine, Margaret F.; Konopka, Allan E.; Beliaev, Alexander S.; Fredrickson, Jim K.

    2010-01-01

    Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and utilization of electron transfer networks, as well as central and peripheral carbon metabolism. To understand how Shewanella oneidensis MR-1 utilizes its resources, the metabolic network was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify cycles (such as futile cycles and circulations), (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and the glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism at a systems level.
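
The constraint-based (flux-balance) approach used for such reconstructions can be sketched on a hypothetical four-reaction network (not the MR-1 model itself): uptake v1 (A_ext → A), v2 (A → B), v3 (A → C), and a lumped biomass reaction v4 (B + C → biomass). Biomass flux is maximized subject to steady-state mass balance; SciPy's linear-programming solver is assumed to be available.

```python
# Toy flux-balance analysis (FBA): maximize biomass flux v4 subject to
# steady state S.v = 0 and an uptake bound on v1.
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B, C; columns: v1..v4).
S = [[1, -1, -1,  0],   # A: produced by v1, consumed by v2 and v3
     [0,  1,  0, -1],   # B: produced by v2, consumed by v4
     [0,  0,  1, -1]]   # C: produced by v3, consumed by v4

bounds = [(0, 10), (0, None), (0, None), (0, None)]  # uptake capped at 10

# linprog minimizes, so negate the objective to maximize biomass v4.
res = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=[0, 0, 0],
              bounds=bounds, method="highs")
v = res.x  # optimal flux distribution; v[3] is the predicted biomass yield
```

With uptake capped at 10 and one unit each of B and C per unit of biomass, the optimum splits the carbon evenly (v2 = v3 = 5), giving a biomass flux of 5, the kind of yield prediction the model compares to measurements.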

  11. LSG: An External-Memory Tool to Compute String Graphs for Next-Generation Sequencing Data Assembly.

    PubMed

    Bonizzoni, Paola; Vedova, Gianluca Della; Pirola, Yuri; Previtali, Marco; Rizzi, Raffaella

    2016-03-01

    The large amount of short read data that has to be assembled in future applications, such as in metagenomics or cancer genomics, strongly motivates the investigation of disk-based approaches to index next-generation sequencing (NGS) data. Positive results in this direction stimulate the investigation of efficient external memory algorithms for de novo assembly from NGS data. Our article is also motivated by the open problem of designing a space-efficient algorithm to compute a string graph using an indexing procedure based on the Burrows-Wheeler transform (BWT). We have developed a disk-based algorithm for computing string graphs in external memory: the light string graph (LSG). LSG relies on a new representation of the FM-index that is exploited to keep the main-memory requirement independent of the size of the data set. Moreover, we have developed a pipeline for genome assembly from NGS data that integrates LSG with the assembly step of SGA (Simpson and Durbin, 2012), a state-of-the-art string graph-based assembler, and uses BEETL for indexing the input data. LSG is open source software and is available online. We have analyzed our implementation on an 875-million-read whole-genome dataset, on which LSG has built the string graph using only 1 GB of main memory (reducing the memory occupation by a factor of 50 with respect to SGA), while requiring slightly more than twice the time of SGA. The analysis of the entire pipeline shows an important decrease in memory usage, with only a moderate increase in the running time.
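
LSG's actual FM-index is disk-based and far more space-efficient, but the transform underlying all such indexes can be shown with a minimal in-memory sketch:

```python
# Minimal Burrows-Wheeler transform and its inverse (toy, in-memory only).
def bwt(s: str, sentinel: str = "$") -> str:
    """Last column of the sorted rotation matrix of s + sentinel."""
    s += sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def inverse_bwt(r: str, sentinel: str = "$") -> str:
    """Rebuild the original string by repeated prepend-and-sort."""
    table = [""] * len(r)
    for _ in r:
        table = sorted(r[i] + table[i] for i in range(len(r)))
    return next(row for row in table if row.endswith(sentinel))[:-1]
```

For example, `bwt("banana")` yields `"annb$aa"`, and `inverse_bwt` recovers `"banana"`; an FM-index augments this transform with rank structures so that read overlaps can be found without storing the reads uncompressed.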

  12. Constraint-Based Model of Shewanella oneidensis MR-1 Metabolism: a Tool for Data Analysis and Hypothesis Generation

    SciTech Connect

    Pinchuk, Grigoriy E.; Hill, Eric A.; Geydebrekht, Oleg V.; De Ingeniis, Jessica; Zhang, Xiaolin; Osterman, Andrei; Scott, James H.; Reed, Samantha B.; Romine, Margaret F.; Konopka, Allan; Beliaev, Alex S.; Fredrickson, Jim K.; Reed, Jennifer L.

    2010-06-24

    Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and flexibility of the electron transfer networks as well as central and peripheral carbon metabolism pathways. To understand the factors contributing to the ecophysiological success of Shewanellae, the metabolic network of S. oneidensis MR-1 was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify futile cycles, (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and the glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism at a systems level.

  13. 20 CFR 405.340 - Deciding a claim without a hearing before an administrative law judge.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the decision is based. (b) You do not wish to appear. The administrative law judge may decide a claim on the record and not conduct a hearing if— (1) You state in writing that you do not wish to...

  14. Electric pulses: a flexible tool to manipulate cytosolic calcium concentrations and generate spontaneous-like calcium oscillations in mesenchymal stem cells.

    PubMed

    de Menorval, Marie-Amelie; Andre, Franck M; Silve, Aude; Dalmay, Claire; Français, Olivier; Le Pioufle, Bruno; Mir, Lluis M

    2016-01-01

    Human adipose mesenchymal stem cells (haMSCs) are multipotent adult stem cells of great interest in regenerative medicine or oncology. They present spontaneous calcium oscillations related to cell cycle progression or differentiation but the correlation between these events is still unclear. Indeed, it is difficult to mimic haMSCs spontaneous calcium oscillations with chemical means. Pulsed electric fields (PEFs) can permeabilise the plasma and/or organelle membranes depending on the applied pulses and therefore generate cytosolic calcium peaks by recruiting calcium from the external medium or from internal stores. We show that it is possible to mimic haMSCs spontaneous calcium oscillations (same amplitude, duration and shape) using 100 μs PEFs or 10 ns PEFs. We propose a model that explains the experimental situations reported. PEFs can therefore be a flexible tool to manipulate cytosolic calcium concentrations. This tool, which can be switched on and off instantaneously, contrary to chemical agents, can be very useful to investigate the role of calcium oscillations in cell physiology and/or to manipulate cell fate. PMID:27561994

  15. Electric pulses: a flexible tool to manipulate cytosolic calcium concentrations and generate spontaneous-like calcium oscillations in mesenchymal stem cells

    PubMed Central

    de Menorval, Marie-Amelie; Andre, Franck M.; Silve, Aude; Dalmay, Claire; Français, Olivier; Le Pioufle, Bruno; Mir, Lluis M.

    2016-01-01

    Human adipose mesenchymal stem cells (haMSCs) are multipotent adult stem cells of great interest in regenerative medicine or oncology. They present spontaneous calcium oscillations related to cell cycle progression or differentiation but the correlation between these events is still unclear. Indeed, it is difficult to mimic haMSCs spontaneous calcium oscillations with chemical means. Pulsed electric fields (PEFs) can permeabilise the plasma and/or organelle membranes depending on the applied pulses and therefore generate cytosolic calcium peaks by recruiting calcium from the external medium or from internal stores. We show that it is possible to mimic haMSCs spontaneous calcium oscillations (same amplitude, duration and shape) using 100 μs PEFs or 10 ns PEFs. We propose a model that explains the experimental situations reported. PEFs can therefore be a flexible tool to manipulate cytosolic calcium concentrations. This tool, which can be switched on and off instantaneously, contrary to chemical agents, can be very useful to investigate the role of calcium oscillations in cell physiology and/or to manipulate cell fate. PMID:27561994

  16. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
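
The hazard curves described above can be sketched with a small helper in the spirit of PyBetVH (the numbers are illustrative, not OVC data): an empirical exceedance-probability curve from simulated intensities, plus pointwise percentiles across several model runs to express epistemic uncertainty.

```python
# Empirical hazard curve: P(intensity >= t) at each threshold t,
# e.g. t in kg/m^2 of tephra load at one grid point.
def exceedance_curve(samples, thresholds):
    """Fraction of simulated intensities at or above each threshold."""
    n = len(samples)
    return [sum(s >= t for s in samples) / n for t in thresholds]

def pointwise_percentile(curves, p):
    """p-th percentile of several hazard curves, threshold by threshold,
    to quantify epistemic uncertainty (Bayesian percentile curves)."""
    out = []
    for values in zip(*curves):
        v = sorted(values)
        out.append(v[int(round(p / 100 * (len(v) - 1)))])
    return out
```

For instance, simulated loads `[1.0, 2.0, 3.0, 4.0]` evaluated at thresholds `[0.0, 2.5, 5.0]` give the curve `[1.0, 0.5, 0.0]`; taking the 50th percentile across many such curves yields the median hazard map value at that point.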

  17. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: each next-generation station measures all parameters needed for flux computations; a field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; final fluxes, radiation, weather and soil data are merged into a single quality-controlled file; multiple flux stations are linked into an automated time-synchronized network; the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; the PI can assign rights and allow or restrict access to stations and data, so that selected stations can be shared via rights-managed access internally or with external institutions; and researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from
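
At the core of the on-station flux computation mentioned above is an eddy-covariance estimate: the flux is the covariance of vertical wind speed with the scalar of interest over an averaging interval. The rotations, spectral corrections, and quality flags a real station applies are omitted in this minimal sketch:

```python
# Minimal eddy-covariance flux: covariance of vertical wind w (m/s) with a
# scalar c (e.g., CO2 concentration) after removing the interval means.
def eddy_flux(w, c):
    """Mean of w' * c' over the averaging interval."""
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n
```

Applied to high-frequency (typically 10-20 Hz) time series over a 30-minute interval, this covariance, after the corrections above, is the flux value a station reports.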

  18. Modeling of low-temperature plasmas generated using laser-induced breakdown spectroscopy: the ChemCam diagnostic tool on the Mars Science Laboratory Rover

    NASA Astrophysics Data System (ADS)

    Colgan, James

    2016-05-01

    We report on efforts to model the low-temperature plasmas generated using laser-induced breakdown spectroscopy (LIBS). LIBS is a minimally invasive technique that can quickly and efficiently determine the elemental composition of a target and is employed in an extremely wide range of applications due to its ease of use and fast turnaround. In particular, LIBS is the diagnostic tool used by the ChemCam instrument on the Mars Science Laboratory rover Curiosity. In this talk, we report on the use of the Los Alamos plasma modeling code ATOMIC to simulate LIBS plasmas, which are typically at temperatures of order 1 eV and electron densities of order 10¹⁶-10¹⁷ cm⁻³. At such conditions, these plasmas are usually in local-thermodynamic equilibrium (LTE) and normally contain neutral and singly ionized species only, which then requires that modeling must use accurate atomic structure data for the element under investigation. Since LIBS devices are often employed in a very wide range of applications, it is therefore desirable to have accurate data for most of the elements in the periodic table, ideally including actinides. Here, we discuss some recent applications of our modeling using ATOMIC that have explored the plasma physics aspects of LIBS generated plasmas, and in particular discuss the modeling of a plasma formed from a basalt sample used as a ChemCam standard. We also highlight some of the more general atomic physics challenges that are encountered when attempting to model low-temperature plasmas. The Los Alamos National Laboratory is operated by Los Alamos National Security, LLC for the National Nuclear Security Administration of the U.S. Department of Energy under Contract No. DE-AC52-06NA25396. Work performed in conjunction with D. P. Kilcrease, H. M. Johns, E. J. Judge, J. E. Barefield, R. C. Wiens, S. M. Clegg.
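
The quoted conditions can be checked with a back-of-the-envelope Saha calculation: at T ~ 1 eV and n_e ~ 10^17 cm^-3 in LTE, the ionization balance indeed favors neutral and singly ionized species. The 7.9 eV ionization energy (roughly that of neutral iron) and the unit statistical-weight ratio below are illustrative choices, not values from the talk.

```python
import math

M_E = 9.1093837015e-31    # electron mass, kg
H   = 6.62607015e-34      # Planck constant, J s
EV  = 1.602176634e-19     # J per eV

def saha_ratio(T_eV, n_e_cm3, chi_eV, g_ratio=1.0):
    """n_ion / n_neutral from the Saha equation in LTE."""
    kT = T_eV * EV                                   # thermal energy, J
    n_e = n_e_cm3 * 1e6                              # electrons per m^3
    lam3 = (2 * math.pi * M_E * kT / H**2) ** 1.5    # 1 / (de Broglie length)^3
    return 2.0 * g_ratio * (lam3 / n_e) * math.exp(-chi_eV / T_eV)

ratio = saha_ratio(1.0, 1e17, 7.9)   # ~22: mostly singly ionized, few neutrals
```

A ratio of order ten means doubly ionized species (ionization energies near 16 eV and above) are strongly suppressed, consistent with the two-stage plasmas the abstract describes.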

  19. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.
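
The simplest algebraic technique behind surface- and volume-grid-generation tools of this kind is transfinite interpolation (TFI), in which interior grid points are blended from four boundary curves. This sketch is a generic illustration, not CGT's algorithm; the boundary curves are a hypothetical example, and their corner values must agree where the curves meet.

```python
# 2-D transfinite interpolation on four boundary curves, each parameterized
# over [0, 1] and returning an (x, y) point.
def tfi_grid(bottom, top, left, right, ni, nj):
    """Structured ni-by-nj grid blended from the four boundary curves."""
    grid = []
    for i in range(ni):
        xi = i / (ni - 1)
        row = []
        for j in range(nj):
            eta = j / (nj - 1)
            pt = []
            for k in range(2):  # x and y components
                p = ((1 - eta) * bottom(xi)[k] + eta * top(xi)[k]
                     + (1 - xi) * left(eta)[k] + xi * right(eta)[k]
                     - (1 - xi) * (1 - eta) * bottom(0.0)[k]
                     - xi * (1 - eta) * bottom(1.0)[k]
                     - (1 - xi) * eta * top(0.0)[k]
                     - xi * eta * top(1.0)[k])
                pt.append(p)
            row.append(tuple(pt))
        grid.append(row)
    return grid

# Unit square with straight edges: TFI reproduces the bilinear map exactly.
grid = tfi_grid(bottom=lambda s: (s, 0.0), top=lambda s: (s, 1.0),
                left=lambda t: (0.0, t), right=lambda t: (1.0, t),
                ni=5, nj=5)
```

Curved boundary functions (e.g., an airfoil surface on one side) produce a body-fitted grid by the same formula; production tools add smoothing, clustering, and quality checks on top of this algebraic start.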

  20. The Effect of "Career Cruising" on the Self-Efficacy of Students Deciding on Majors

    ERIC Educational Resources Information Center

    Cunningham, Karen; Smothers, Anthony

    2014-01-01

    We analyzed the impact of a self-assessment instrument on the self-efficacy of those deciding on majors in a university setting. Using a pre- and post-test methodology, we employed "Career Cruising" to measure career decision-making self-efficacy. Participants completed the "Career Decision Self-Efficacy-Short Form" (CDSE-SF)…

  1. 49 CFR 40.387 - What matters does the Director decide concerning a proposed PIE?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 1 2013-10-01 2013-10-01 false What matters does the Director decide concerning a... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Public Interest Exclusions § 40.387 What matters does... complete information needed for a decision, the Director may remand the matter to the initiating...

  2. College or Training Programs: How to Decide. PACER Center ACTion Information Sheets. PHP-c115

    ERIC Educational Resources Information Center

    PACER Center, 2006

    2006-01-01

    A high school diploma opens the door to many exciting new options. These might include a first full-time job, or part-time or full-time attendance at a technical school, community college, or university. Students might want to obtain a certificate, an associate degree, or a diploma. With so many choices, it can be a challenge to decide which path…

  3. The Five Stages of Deciding on a Purchase...or a Job.

    ERIC Educational Resources Information Center

    Summey, John H.; Anderson, Carol H.

    1992-01-01

    Describes five stages of deciding on purchase or job: recognition of employment need; career information search; evaluation of career alternatives; identification and acceptance of employment; and postchoice evaluation. Evaluated importance of freedom/significance, growth, and variety in career decisions of 362 college students. Concludes…

  4. 49 CFR 40.387 - What matters does the Director decide concerning a proposed PIE?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false What matters does the Director decide concerning a proposed PIE? 40.387 Section 40.387 Transportation Office of the Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Public Interest Exclusions § 40.387 What matters...

  5. 42 CFR 405.1038 - Deciding a case without a hearing before an ALJ.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... decision is based. (b) Parties do not wish to appear. (1) The ALJ may decide a case on the record and not conduct a hearing if— (i) All the parties indicate in writing that they do not wish to appear before the... wants to appear, and there are no other parties who wish to appear. (2) When a hearing is not held,...

  6. 42 CFR 405.1038 - Deciding a case without a hearing before an ALJ.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... decision is based. (b) Parties do not wish to appear. (1) The ALJ may decide a case on the record and not conduct a hearing if— (i) All the parties indicate in writing that they do not wish to appear before the... wants to appear, and there are no other parties who wish to appear. (2) When a hearing is not held,...

  7. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... debarment. 890.1013 Section 890.1013 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... permissive debarment. (a) Review factors. The factors OPM shall consider in deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of...

  8. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  9. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  10. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  11. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  12. 5 CFR 890.1036 - Information considered in deciding a contest.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... official shall not give such a denial any probative weight. (c) Mandatory disclosures. Any...

  13. 5 CFR 890.1023 - Information considered in deciding a contest.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Information considered in deciding a contest. 890.1023 Section 890.1023 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... give such a denial any probative weight. (c) Mandatory disclosures. Regardless of the basis for...

  14. Withdrawing life support. Do families and physicians decide as patients do?

    PubMed

    Munoz Silva, J E; Kjellstrand, C M

    1988-01-01

    We studied whether families and physicians decided as patients do when discontinuing life-supporting treatment. We did so by comparing 66 competent patients, who themselves decided to stop dialysis to die, and 66 incompetent patients for whom families and physicians decided. We also compared comatose to demented patients and families' to physicians' decision-making. There was no difference in sex, diagnosis, age, time period, decision maker (family or physician), site of residence, duration or type of dialysis, home or in-center dialysis or survival time after discontinuation. More competent than incompetent patients died at home (p < 0.005). All incompetent patients had emerging complications, but such complications were present in only 40/60 competent patients (p < 0.0005). In the early 1970s the physician initiated the termination of dialysis in all cases of incompetent patients; in the 1980s this had decreased to 48% (p < 0.001). No case was decided by court or hospital committee. There was no difference between comatose or demented incompetent patients, nor was there any important difference between family and physician decision-making. We believe our study indicates that substitute judgement is applied appropriately and that the decision can safely and best be left to families and physicians.

  15. 43 CFR 2932.26 - How will BLM decide whether to issue a Special Recreation Permit?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Special Recreation Permit? 2932.26 Section 2932.26 Public Lands: Interior Regulations Relating to Public...) PERMITS FOR RECREATION ON PUBLIC LANDS Special Recreation Permits for Commercial Use, Competitive Events, Organized Groups, and Recreation Use in Special Areas § 2932.26 How will BLM decide whether to issue...

  16. 43 CFR 2932.26 - How will BLM decide whether to issue a Special Recreation Permit?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Special Recreation Permit? 2932.26 Section 2932.26 Public Lands: Interior Regulations Relating to Public...) PERMITS FOR RECREATION ON PUBLIC LANDS Special Recreation Permits for Commercial Use, Competitive Events, Organized Groups, and Recreation Use in Special Areas § 2932.26 How will BLM decide whether to issue...

  17. 43 CFR 2932.26 - How will BLM decide whether to issue a Special Recreation Permit?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Special Recreation Permit? 2932.26 Section 2932.26 Public Lands: Interior Regulations Relating to Public...) PERMITS FOR RECREATION ON PUBLIC LANDS Special Recreation Permits for Commercial Use, Competitive Events, Organized Groups, and Recreation Use in Special Areas § 2932.26 How will BLM decide whether to issue...

  18. 43 CFR 2932.26 - How will BLM decide whether to issue a Special Recreation Permit?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Special Recreation Permit? 2932.26 Section 2932.26 Public Lands: Interior Regulations Relating to Public...) PERMITS FOR RECREATION ON PUBLIC LANDS Special Recreation Permits for Commercial Use, Competitive Events, Organized Groups, and Recreation Use in Special Areas § 2932.26 How will BLM decide whether to issue...

  19. 30 CFR 585.1006 - How will BOEM decide whether to issue an Alternate Use RUE?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Alternate Use RUE? 585.1006 Section 585.1006 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT... Facilities Requesting An Alternate Use Rue § 585.1006 How will BOEM decide whether to issue an Alternate Use RUE? (a) We will consider requests for an Alternate Use RUE on a case-by-case basis. In...

  20. 30 CFR 285.1006 - How will MMS decide whether to issue an Alternate Use RUE?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Alternate Use RUE? 285.1006 Section 285.1006 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION... Activities Using Existing OCS Facilities Requesting An Alternate Use Rue § 285.1006 How will MMS decide whether to issue an Alternate Use RUE? (a) We will consider requests for an Alternate Use RUE on a...

  1. 30 CFR 585.1006 - How will BOEM decide whether to issue an Alternate Use RUE?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Alternate Use RUE? 585.1006 Section 585.1006 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT... Facilities Requesting An Alternate Use Rue § 585.1006 How will BOEM decide whether to issue an Alternate Use RUE? (a) We will consider requests for an Alternate Use RUE on a case-by-case basis. In...

  2. 30 CFR 585.1006 - How will BOEM decide whether to issue an Alternate Use RUE?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Alternate Use RUE? 585.1006 Section 585.1006 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT... Facilities Requesting An Alternate Use Rue § 585.1006 How will BOEM decide whether to issue an Alternate Use RUE? (a) We will consider requests for an Alternate Use RUE on a case-by-case basis. In...

  3. 30 CFR 285.1006 - How will MMS decide whether to issue an Alternate Use RUE?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Alternate Use RUE? 285.1006 Section 285.1006 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF... Requesting An Alternate Use Rue § 285.1006 How will MMS decide whether to issue an Alternate Use RUE? (a) We will consider requests for an Alternate Use RUE on a case-by-case basis. In considering such...

  4. Live and Let Die: CITES--How We Decide the Fate of the World's Species.

    ERIC Educational Resources Information Center

    Beasley, Conger, Jr.

    1992-01-01

    Discusses the significance of the decisions made at the Eighth Convention on the International Trade of Endangered Species (CITES) when governmental delegates and nongovernmental organizations from around the world decided the fate of potentially threatened and endangered species of plants and animals. Particular emphasis is placed on the politics…

  5. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to decide...) The extent to which the project will contribute to the advancement of maternal and child health and/or... mortality rate (relative to the latest average infant mortality rate in the United States or in the State...

  6. 20 CFR 416.1861 - Deciding whether you are a child: Are you a student?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... student? 416.1861 Section 416.1861 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered A Child § 416.1861 Deciding whether you are a child: Are you a student? (a) Are you a student? You are a student...

  7. 20 CFR 416.1861 - Deciding whether you are a child: Are you a student?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... student? 416.1861 Section 416.1861 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered A Child § 416.1861 Deciding whether you are a child: Are you a student? (a) Are you a student? You are a student...

  8. 20 CFR 416.1861 - Deciding whether you are a child: Are you a student?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... student? 416.1861 Section 416.1861 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered A Child § 416.1861 Deciding whether you are a child: Are you a student? (a) Are you a student? You are a student...

  9. 12 CFR 617.7415 - How does a qualified lender decide to restructure a loan?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false How does a qualified lender decide to restructure a loan? 617.7415 Section 617.7415 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM... projections, the lender may use benchmarks to determine the operational input costs and chattel...

  10. 12 CFR 617.7415 - How does a qualified lender decide to restructure a loan?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false How does a qualified lender decide to restructure a loan? 617.7415 Section 617.7415 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM... projections, the lender may use benchmarks to determine the operational input costs and chattel...

  11. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... fact-finding. (a) Written decision. The suspending official shall issue a written decision on...

  12. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... fact-finding. (a) Written decision. The suspending official shall issue a written decision on...

  13. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to decide... Children 2000: National Health Promotion and Disease Prevention Objectives Related to Mothers, Infants, Children, Adolescents, and Youth is a special compendium of health status goals and national...

  14. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to decide... Children 2000: National Health Promotion and Disease Prevention Objectives Related to Mothers, Infants, Children, Adolescents, and Youth is a special compendium of health status goals and national...

  15. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to decide... Children 2000: National Health Promotion and Disease Prevention Objectives Related to Mothers, Infants, Children, Adolescents, and Youth is a special compendium of health status goals and national...

  16. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false When will SBA not decide an SDB protest? 124.1008 Section 124.1008 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification,...

  17. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so, in fact, that the author decided to organise a workshop, open both to undergraduates and…

  18. Examining the impact of larval source management and insecticide-treated nets using a spatial agent-based model of Anopheles gambiae and a landscape generator tool

    PubMed Central

    2013-01-01

    Background Agent-based models (ABMs) have been used to estimate the effects of malaria-control interventions. Early studies have shown the efficacy of larval source management (LSM) and insecticide-treated nets (ITNs) as vector-control interventions, applied both in isolation and in combination. However, the robustness of results can be affected by several important modelling assumptions, including the type of boundary used for landscapes, and the number of replicated simulation runs reported in results. Selection of the ITN coverage definition may also affect the predictive findings. Independent verification of the prior findings of published models, through replication, is therefore especially important. Methods A spatially-explicit entomological ABM of Anopheles gambiae is used to simulate the resource-seeking process of mosquitoes in grid-based landscapes. To explore LSM and replicate results of an earlier LSM study, the original landscapes and scenarios are replicated by using a landscape generator tool, and 1,800 replicated simulations are run using absorbing and non-absorbing boundaries. To explore ITNs and evaluate the relative impacts of the different ITN coverage schemes, the settings of an earlier ITN study are replicated, the coverage schemes are defined and simulated, and 9,000 replicated simulations for three ITN parameters (coverage, repellence and mortality) are run. To evaluate LSM and ITNs in combination, landscapes with varying densities of houses and human populations are generated, and 12,000 simulations are run. Results General agreement with an earlier LSM study is observed when an absorbing boundary is used. However, using a non-absorbing boundary produces significantly different results, which may be attributed to the unrealistic killing effect of an absorbing boundary. Abundance cannot be completely suppressed by removing aquatic habitats within 300 m of houses. Also, with density-dependent oviposition, removal of an insufficient number of aquatic
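
The boundary effect reported in these results can be illustrated with a toy model. The following is a hedged sketch, not the published ABM; every parameter is invented. It runs a 1-D random walk of "mosquitoes" and shows that the choice between an absorbing boundary (walkers leaving the domain are killed) and a reflecting, non-absorbing one changes the surviving abundance by itself:

```python
import random

def survivors(n_agents, n_steps, size, absorbing, seed=0):
    """Count walkers still in [0, size) after n_steps of a +/-1 random walk."""
    rng = random.Random(seed)
    alive = [size // 2] * n_agents          # all walkers start mid-domain
    for _ in range(n_steps):
        nxt = []
        for pos in alive:
            pos += rng.choice((-1, 1))
            if 0 <= pos < size:
                nxt.append(pos)
            elif not absorbing:
                # non-absorbing boundary: reflect the walker back inside
                nxt.append(max(0, min(size - 1, pos)))
            # absorbing boundary: the walker is simply dropped (killed)
        alive = nxt
    return len(alive)

kept_reflect = survivors(500, 200, 21, absorbing=False)
kept_absorb = survivors(500, 200, 21, absorbing=True)
print(kept_reflect, kept_absorb)  # reflecting keeps all 500; absorbing kills some
```

With identical walk parameters, only the boundary rule differs, mirroring the paper's point that an absorbing boundary introduces an unrealistic killing effect.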

  19. An In Vitro Dormancy Model of Estrogen-sensitive Breast Cancer in the Bone Marrow: A Tool for Molecular Mechanism Studies and Hypothesis Generation.

    PubMed

    Tivari, Samir; Korah, Reju; Lindy, Michael; Wieder, Robert

    2015-01-01

    The study of breast cancer dormancy in the bone marrow is an exceptionally difficult undertaking due to the complexity of the interactions of dormant cells with their microenvironment, their rarity and the overwhelming excess of hematopoietic cells. Towards this end, we developed an in vitro 2D clonogenic model of dormancy of estrogen-sensitive breast cancer cells in the bone marrow. The model consists of a few key elements necessary for dormancy. These include 1) the use of estrogen sensitive breast cancer cells, which are the type likely to remain dormant for extended periods, 2) incubation of cells at clonogenic density, where the structural interaction of each cell is primarily with the substratum, 3) fibronectin, a key structural element of the marrow and 4) FGF-2, a growth factor abundantly synthesized by bone marrow stromal cells and heavily deposited in the extracellular matrix. Cells incubated with FGF-2 form dormant clones after 6 days, which consist of 12 or fewer cells that have a distinct flat appearance, are significantly larger and more spread out than growing cells and have large cytoplasm to nucleus ratios. In contrast, cells incubated without FGF-2 form primarily growing colonies consisting of >30 relatively small cells. Perturbations of the system with antibodies, inhibitors, peptides or nucleic acids on day 3 after incubation can significantly affect various phenotypic and molecular aspects of the dormant cells at 6 days and can be used to assess the roles of membrane-localized or intracellular molecules, factors or signaling pathways on the dormant state or survival of dormant cells. While recognizing the in vitro nature of the assay, it can function as a highly useful tool to glean significant information about the molecular mechanisms necessary for establishment and survival of dormant cells. These data can be used to generate hypotheses to be tested in in vivo models. PMID:26168083

  20. Analysis of the Vaginal Microbiome by Next-Generation Sequencing and Evaluation of its Performance as a Clinical Diagnostic Tool in Vaginitis

    PubMed Central

    Hong, Ki Ho; Hong, Sung Kuk; Cho, Sung Im; Ra, Eunkyung; Han, Kyung Hee; Kang, Soon Beom; Kim, Eui-Chong; Park, Sung Sup

    2016-01-01

    Background Next-generation sequencing (NGS) can detect many more microorganisms of a microbiome than traditional methods. This study aimed to analyze the vaginal microbiomes of Korean women by using NGS that included bacteria and other microorganisms. The NGS results were compared with the results of other assays, and NGS was evaluated for its feasibility for predicting vaginitis. Methods In total, 89 vaginal swab specimens were collected. Microscopic examinations of Gram staining and microbiological cultures were conducted on 67 specimens. NGS was performed with the GS Junior system on all of the vaginal specimens for the 16S rRNA, internal transcribed spacer (ITS), and Tvk genes to detect bacteria, fungi, and Trichomonas vaginalis. In addition, DNA probe assays of the Candida spp., Gardnerella vaginalis, and Trichomonas vaginalis were performed. Various predictors of diversity that were obtained from the NGS data were analyzed to predict vaginitis. Results ITS sequences were obtained in most of the specimens (56.2%). The compositions of the intermediate and vaginitis Nugent score groups were similar to each other but differed from the composition of the normal score group. The fraction of the Lactobacillus spp. showed the highest area under the curve value (0.8559) in ROC curve analysis. The NGS and DNA probe assay results showed good agreement (range, 86.2-89.7%). Conclusions Fungi as well as bacteria should be considered for the investigation of the vaginal microbiome. The intermediate and vaginitis Nugent score groups were indistinguishable in NGS. NGS is a promising diagnostic tool for the vaginal microbiome and vaginitis, although some problems need to be resolved. PMID:27374709
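
The ROC analysis mentioned in the results can be reproduced in miniature. This is a hedged sketch with invented data, not the study's measurements; it uses the fact that the area under the ROC curve equals the Mann-Whitney U statistic normalized by the number of positive-negative pairs:

```python
import numpy as np

def roc_auc(labels, scores):
    """AUC via the normalized Mann-Whitney U statistic (ties count 0.5)."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy data: a low Lactobacillus fraction suggests vaginitis, so each
# specimen is scored by 1 - fraction. These six values are made up.
labels = [1, 1, 1, 0, 0, 0]                  # 1 = vaginitis
fractions = [0.1, 0.2, 0.4, 0.6, 0.8, 0.9]   # hypothetical Lactobacillus fractions
auc = roc_auc(labels, [1 - f for f in fractions])
print(auc)  # 1.0: the toy data separate the groups perfectly
```

A value of 0.8559, as reported for the Lactobacillus fraction, sits between this perfect separation (1.0) and chance (0.5).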

  1. Cratylia mollis 1, 4 lectin: a new biotechnological tool in IL-6, IL-17A, IL-22, and IL-23 induction and generation of immunological memory.

    PubMed

    de Oliveira, Priscilla Stela Santana; Rêgo, Moacyr Jesus Barreto de Melo; da Silva, Rafael Ramos; Cavalcanti, Mariana Brayner; Galdino, Suely Lins; Correia, Maria Tereza dos Santos; Coelho, Luana Cassandra Breitenbach Barroso; Pitta, Maira Galdino da Rocha

    2013-01-01

    Cratylia mollis lectin has already established cytokine induction in the Th1 and Th2 pathways. This study therefore aimed to evaluate Cramoll 1, 4 induction of IL-6, IL-17A, IL-22, and IL-23, and to analyze the immunologic memory mechanism by reinducing lymphocyte stimulation. In an initial screening in cultured splenocytes, Cramoll 1, 4 stimulated IL-6 production 5x more than ConA (P < 0.05); the same behavior was observed with IL-22, where the increase was greater than 4x. Nevertheless, IL-17A induction was similar for both lectins. In PBMCs, the same course as in splenocytes was observed for IL-6 and IL-17A. Concerning the stimulation of IL-22 and IL-23, Cramoll 1, 4 was more efficient than ConA, mainly for IL-23 (P < 0.01). On reinduced lymphocyte stimulation, IL-17A production was higher (P < 0.001) when the first stimulus was performed with Cramoll 1, 4 at 1 μg/mL and the second at 5 μg/mL; IL-22 showed significant differences (P < 0.01) under the same condition. Nevertheless, IL-23 revealed the best response when the first stimulus was performed with Cramoll 1, 4 at 100 ng/mL and the second at 5 μg/mL. We conclude that Cramoll 1, 4 is able to induce IL-6, IL-17A, IL-22, and IL-23 cytokines in vitro better than Concanavalin A, besides generating immunologic memory, and is thus a potential biotechnological tool for Th17 pathway studies. PMID:23586026

  2. Downhole tool

    SciTech Connect

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprises an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  3. Deciding Not to Un-Do the "I Do:" Therapy Experiences of Women Who Consider Divorce But Decide to Remain Married.

    PubMed

    Kanewischer, Erica J W; Harris, Steven M

    2015-07-01

    This study explores women's experience of marital therapy while they navigated decision making around divorce. A qualitative method was used to gain a deeper understanding of the participants' therapy and relationship decision-making experiences. How are women's decisions whether or not to exit their marriage affected by therapy? The researchers interviewed 15 women who had considered initiating divorce before they turned 40 and had attended at least five marital therapy sessions but ultimately decided not to divorce. In general, participants reported that the therapy was helpful to them, their decision-making process and their marriages. Five main themes emerged from the interviews: Women Initiated Therapy, Therapist Was Experienced as Unbiased, Therapy was Helpful, Importance of Extra-therapeutic Factors, and Gradual Process.

  4. 5 CFR 890.1069 - Information the debarring official must consider in deciding a provider's contest of proposed...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL... deciding a provider's contest of proposed penalties and assessments. (a) Documentary material and...

  5. Databases and tools for nuclear astrophysics applications. BRUSsels Nuclear LIBrary (BRUSLIB), Nuclear Astrophysics Compilation of REactions II (NACRE II) and Nuclear NETwork GENerator (NETGEN)

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Goriely, S.; Jorissen, A.; Chen, G. L.; Arnould, M.

    2013-01-01

    An update of a previous description of the BRUSLIB + NACRE package of nuclear data for astrophysics and of the web-based nuclear network generator NETGEN is presented. The new version of BRUSLIB contains the latest predictions of a wide variety of nuclear data based on the most recent version of the Brussels-Montreal Skyrme-Hartree-Fock-Bogoliubov model. The nuclear masses, radii, spin/parities, deformations, single-particle schemes, matter densities, nuclear level densities, E1 strength functions, fission properties, and partition functions are provided for all nuclei lying between the proton and neutron drip lines over the 8 ≤ Z ≤ 110 range, whose evaluation is based on a unique microscopic model that ensures a good compromise between accuracy, reliability, and feasibility. In addition, these various ingredients are used to calculate about 100 000 Hauser-Feshbach neutron-, proton-, α-, and γ-induced reaction rates based on the reaction code TALYS. NACRE is superseded by the NACRE II compilation for 15 charged-particle transfer reactions and 19 charged-particle radiative captures on stable targets with mass numbers A < 16. NACRE II features the inclusion of experimental data made available after the publication of NACRE in 1999 and up to 2011. In addition, the extrapolation of the available data to the very low energies of astrophysical relevance is improved through the systematic use of phenomenological potential models. Uncertainties in the rates are also evaluated on this basis. Finally, the latest release v10.0 of the web-based tool NETGEN is presented. In addition to the data already used in the previous NETGEN package, it contains in a fully documented form the new BRUSLIB and NACRE II data, as well as new experiment-based radiative neutron capture cross sections. The full new versions of BRUSLIB, NACRE II, and NETGEN are available electronically from the nuclear database at http://www.astro.ulb.ac.be/NuclearData. The nuclear material is presented in

  6. The Next Generation Virgo Cluster Survey-Infrared (NGVS-IR). I. A New Near-Ultraviolet, Optical, and Near-Infrared Globular Cluster Selection Tool

    NASA Astrophysics Data System (ADS)

    Muñoz, Roberto P.; Puzia, Thomas H.; Lançon, Ariane; Peng, Eric W.; Côté, Patrick; Ferrarese, Laura; Blakeslee, John P.; Mei, Simona; Cuillandre, Jean-Charles; Hudelot, Patrick; Courteau, Stéphane; Duc, Pierre-Alain; Balogh, Michael L.; Boselli, Alessandro; Bournaud, Frédéric; Carlberg, Raymond G.; Chapman, Scott C.; Durrell, Patrick; Eigenthaler, Paul; Emsellem, Eric; Gavazzi, Giuseppe; Gwyn, Stephen; Huertas-Company, Marc; Ilbert, Olivier; Jordán, Andrés; Läsker, Ronald; Licitra, Rossella; Liu, Chengze; MacArthur, Lauren; McConnachie, Alan; McCracken, Henry Joy; Mellier, Yannick; Peng, Chien Y.; Raichoor, Anand; Taylor, Matthew A.; Tonry, John L.; Tully, R. Brent; Zhang, Hongxin

    2014-01-01

    The NGVS-IR project (Next Generation Virgo Cluster Survey-Infrared) is a contiguous, near-infrared imaging survey of the Virgo cluster of galaxies. It complements the optical wide-field survey of Virgo (NGVS). In its current state, NGVS-IR consists of Ks-band imaging of 4 deg² centered on M87 and J- and Ks-band imaging of ~16 deg² covering the region between M49 and M87. We present observations of the central 4 deg² centered on Virgo's core region. The data were acquired with WIRCam on the Canada-France-Hawaii Telescope, and the total integration time was 41 hr distributed over 34 contiguous tiles. A survey-specific strategy was designed to account for extended galaxies while still measuring accurate sky brightness within the survey area. The average 5σ limiting magnitude is Ks = 24.4 AB mag, and the 50% completeness limit is Ks = 23.75 AB mag for point-source detections, when using only images with better than 0.7″ seeing (median seeing 0.54″). Star clusters are marginally resolved in these image stacks, and Virgo galaxies with μ_Ks ≈ 24.4 AB mag arcsec⁻² are detected. Combining the Ks data with optical and ultraviolet data, we build the uiKs color-color diagram, which allows a very clean color-based selection of globular clusters in Virgo. This diagnostic plot will provide reliable globular cluster candidates for spectroscopic follow-up campaigns, needed to continue the exploration of Virgo's photometric and kinematic substructures, and will help the design of future searches for globular clusters in extragalactic systems. We show that the new uiKs diagram displays significantly clearer substructure in the distribution of stars, globular clusters, and galaxies than the gzKs diagram, the NGVS + NGVS-IR equivalent of the BzK diagram that is widely used in cosmological surveys. Equipped with this powerful new tool, future NGVS-IR investigations based on the uiKs diagram will address the mapping and analysis of extended structures and compact

  7. Split views among parents regarding children's right to decide about participation in research: a questionnaire survey.

    PubMed

    Swartling, U; Helgesson, G; Hansson, M G; Ludvigsson, J

    2009-07-01

    Based on extensive questionnaire data, this paper focuses on parents' views about children's right to decide about participation in research. The data originate from 4,000 families participating in a longitudinal prospective screening since 1997. Although current regulations and recommendations underline that children should have influence over their participation, many parents in this study disagree. Most (66%) were positive about providing information to the child about relevant aspects of the study. However, responding parents were split about whether children should at some point be allowed decisional authority when participating in research: 41.6% of the parents reported being against or unsure. Those who responded positively believed that children should be allowed to decide about blood-sampling procedures (70%), but to a lesser extent about participation (48.5%), analyses of samples (19.7%), and biological bank storage (15.4%). Respondents who did not think children should have a right to decide strongly stressed that as many participants as possible should remain in the study and that children do not have the competence to understand the consequences for research. When asked what interests they consider most important in paediatric research, child autonomy and decision-making was ranked lowest. We discuss the implications of these findings.

  8. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Human Services use to decide which family planning services projects to fund and in what amount? 59.7 Section 59.7 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS GRANTS... Department of Health and Human Services use to decide which family planning services projects to fund and...

  9. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1,...

  10. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1,...

  11. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1,...

  12. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1,...

  13. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1,...

  14. 41 CFR 102-37.365 - What steps must a SASP take if the State decides to liquidate the agency?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What steps must a SASP take if the State decides to liquidate the agency? 102-37.365 Section 102-37.365 Public Contracts and...) Liquidating A Sasp § 102-37.365 What steps must a SASP take if the State decides to liquidate the...

  15. 41 CFR 102-37.365 - What steps must a SASP take if the State decides to liquidate the agency?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What steps must a SASP take if the State decides to liquidate the agency? 102-37.365 Section 102-37.365 Public Contracts and...) Liquidating A Sasp § 102-37.365 What steps must a SASP take if the State decides to liquidate the...

  16. 20 CFR 422.435 - What happens when we decide to send an administrative wage garnishment order to your employer?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false What happens when we decide to send an administrative wage garnishment order to your employer? 422.435 Section 422.435 Employees' Benefits SOCIAL SECURITY ADMINISTRATION ORGANIZATION AND PROCEDURES Collection of Debts by Administrative Wage Garnishment § 422.435 What happens when we decide...

  17. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  18. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  19. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  20. 20 CFR 416.1866 - Deciding whether you are a child: Are you the head of a household?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered A Child § 416.1866 Deciding whether you are a child: Are you the head of a household? (a) Meaning of head of... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false Deciding whether you are a child: Are you...

  1. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it...

  2. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  3. 40 CFR 35.4225 - What if my group decides a prospective contractor has a conflict of interest?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....4220, your group decides a prospective contractor has a significant conflict of interest that cannot be... 40 Protection of Environment 1 2010-07-01 2010-07-01 false What if my group decides a prospective contractor has a conflict of interest? 35.4225 Section 35.4225 Protection of Environment...

  4. 25 CFR 103.34 - What if the lender and borrower decide to change the terms of the loan?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... identity or organizational structure of the borrower. (5) Allow any material change in the use of loan... 25 Indians 1 2010-04-01 2010-04-01 false What if the lender and borrower decide to change the... What if the lender and borrower decide to change the terms of the loan? (a) The lender must...

  5. 5 CFR 1201.175 - Judicial review of cases decided under 5 U.S.C. 7702.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Discrimination Special Panel § 1201.175 Judicial review of cases decided under 5 U.S.C. 7702. (a) Place and type... of cases decided under 5 U.S.C. 7702. Those cases include appeals from actions taken under the... reviewable action....

  6. Application of ChemDraw NMR Tool: Correlation of Program-Generated ¹³C Chemical Shifts and pKa Values of Para-Substituted Benzoic Acids

    ERIC Educational Resources Information Center

    Hongyi Wang

    2005-01-01

    A study uses the ChemDraw nuclear magnetic resonance spectroscopy (NMR) tool to process 15 para-substituted benzoic acids and generate ¹³C NMR chemical shifts of C1 through C5. The data were plotted against their pKa values, and a fairly good linear fit was found for pKa versus δC1.
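
The reported correlation amounts to fitting a least-squares line through (pKa, δC1) pairs. A hedged sketch with invented numbers (not the paper's data):

```python
import numpy as np

# Hypothetical pKa values and C1 chemical shifts (ppm) for five acids;
# the trend (lower pKa, higher shift) is illustrative only.
pka = np.array([3.44, 3.97, 4.20, 4.37, 4.47])
d_c1 = np.array([131.8, 130.4, 129.9, 129.1, 128.6])

slope, intercept = np.polyfit(pka, d_c1, 1)   # least-squares line
r = np.corrcoef(pka, d_c1)[0, 1]              # Pearson correlation
print(round(slope, 2), round(r, 3))
```

A correlation coefficient near ±1 is what "a fairly good linear fit" means quantitatively; for these invented points, r is about -0.99.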

  7. The Synthesis Map Is a Multidimensional Educational Tool That Provides Insight into Students' Mental Models and Promotes Students' Synthetic Knowledge Generation

    ERIC Educational Resources Information Center

    Ortega, Ryan A.; Brame, Cynthia J.

    2015-01-01

    Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping,…

  8. Comparison of select reference management tools.

    PubMed

    Zhang, Yingting

    2012-01-01

    Bibliographic management tools have been widely used by researchers to store, organize, and manage their references for research papers, theses, dissertations, journal articles, and other publications. There are a number of reference management tools available. In order for users to decide which tool is best for their needs, it is important to know each tool's strengths and weaknesses. This article compares four reference management tools, one of which is licensed by University of Medicine and Dentistry of New Jersey libraries and the other three are open source and freely available. They were chosen based on their functionality, ease of use, availability to library users, and popularity. These four tools are EndNote/EndNote Web, Zotero, Connotea, and Mendeley Desktop/Mendeley Web. Each tool is analyzed in terms of the following features: accessing, collecting, organizing, collaborating, and citing/formatting. A comparison table is included to summarize the key features of these tools.

  10. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.

  11. Machine tools get smarter

    SciTech Connect

    Valenti, M.

    1995-11-01

    This article describes how, using software, sensors, and controllers, a new generation of intelligent machine tools is optimizing grinding, milling, and molding processes. A paradox of manufacturing parts is that the faster the parts are made, the less accurate they are--and vice versa. However, a combination of software, sensors, controllers, and mechanical innovations is being used to create a new generation of intelligent machine tools capable of optimizing their own grinding, milling, and molding processes. These brainy tools allow manufacturers to machine more-complex, higher-quality parts in shorter cycle times. The technology also lowers scrap rates and reduces or eliminates the need for polishing inadequately finished parts.
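
    The closed-loop idea behind such self-optimizing tools can be sketched as a simple proportional controller; the function, gain, and limits below are illustrative assumptions, not any vendor's implementation:

```python
# Minimal sketch (not a real controller): a proportional feedback loop
# that trims feed rate when measured cutting force drifts from its
# setpoint -- the basic idea behind machine tools that optimize their
# own milling or grinding process.
def adjust_feed(feed_mm_min, measured_force_n, target_force_n, gain=0.05):
    """Return a corrected feed rate; too much force -> slow down."""
    error = target_force_n - measured_force_n
    new_feed = feed_mm_min + gain * error
    # clamp to a safe operating window (assumed limits)
    return max(50.0, min(500.0, new_feed))

feed = 300.0
for force in [180.0, 220.0, 260.0, 240.0]:  # simulated force readings (N)
    feed = adjust_feed(feed, force, target_force_n=200.0)
print(f"final feed: {feed:.1f} mm/min")
```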

  12. DG-AMMOS: A New tool to generate 3D conformation of small molecules using Distance Geometry and Automated Molecular Mechanics Optimization for in silico Screening

    PubMed Central

    2009-01-01

    Background Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches in order to increase the effectiveness of the process and facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection, and most often the 3D structures of the small molecules have to be generated since compounds are usually delivered in 1D SMILES, CANSMILES or in 2D SDF formats. Results Here, we describe the new open source program DG-AMMOS which allows the generation of the 3D conformation of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega, which also generate the 3D structures of small molecules, is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generation engine. Conclusion DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets shows that the generated structures are generally of equal or sometimes better quality than structures obtained by the other tested methods. PMID:19912625

  13. Automatically-Programed Machine Tools

    NASA Technical Reports Server (NTRS)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.
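
    What an APT-like language processor ultimately emits can be sketched as follows; the `cl_records` helper and its output lines are loose illustrations of CL-style FEDRAT/GOTO records, not a real APT implementation:

```python
# Illustrative sketch of the output side of an APT-style processor:
# turning toolpath geometry into cutter-location (CL) records. The
# record syntax is loosely modeled on APT; details are assumptions.
def cl_records(points, feedrate=250.0):
    """Generate CL-style motion records for a list of (x, y, z) points."""
    lines = [f"FEDRAT/{feedrate:.1f}"]
    for x, y, z in points:
        lines.append(f"GOTO/{x:.3f},{y:.3f},{z:.3f}")
    lines.append("FINI")
    return lines

path = [(0.0, 0.0, 5.0), (10.0, 0.0, 5.0), (10.0, 10.0, 5.0)]
for rec in cl_records(path):
    print(rec)
```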

  14. 30 CFR 285.429 - What criteria will MMS consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee. (d)...

  15. 30 CFR 250.1470 - How does BSEE decide what the amount of the penalty should be?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Outer Continental Shelf Civil Penalties General Provisions § 250.1470 How does BSEE decide...

  16. 30 CFR 250.1470 - How does BSEE decide what the amount of the penalty should be?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Outer Continental Shelf Civil Penalties General Provisions § 250.1470 How does BSEE decide...

  17. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated...

  18. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated...

  19. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated...

  20. 30 CFR 250.1470 - How does BSEE decide what the amount of the penalty should be?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Outer Continental Shelf Civil Penalties General Provisions § 250.1470 How does BSEE decide...

  1. Sowing the Seeds for a Bountiful Harvest: Shaping the Rules and Creating the Tools for Wisconsin's Next Generation of Wind Farms

    SciTech Connect

    Vickerman, Michael Jay

    2012-03-29

    Project objectives are twofold: (1) to engage wind industry stakeholders to participate in formulating uniform permitting standards applicable to commercial wind energy installations; and (2) to create and maintain an online Wisconsin Wind Information Center to enable policymakers and the public to increase their knowledge of and support for wind generation in Wisconsin.

  2. Reasons why women who have mastectomy decide to have or not to have breast reconstruction.

    PubMed

    Reaby, L L

    1998-06-01

    Breast reconstruction after mastectomy is chosen by approximately 10 percent of Australian women. Younger women are more likely to have this surgical procedure. This suggests that there may be many factors determining this choice. Sixty-four women who wore an external postmastectomy breast prosthesis and 31 women who had postmastectomy breast reconstruction participated in the present study. The purpose was to gain a greater understanding through semi-structured interviews of why women who had breast reconstruction chose this alternative and why women who wore the external postmastectomy breast prosthesis elected not to have reconstruction. The study also ascertained how difficult it was for the women in both groups to decide their particular breast restoration alternative. The most frequently endorsed reasons for not having breast reconstruction in the prosthesis group included: (1) not essential for physical well being, (2) not essential for emotional well being, (3) not having enough information about the procedure, and (4) not wanting anything unnatural in the body. When each member of the group was asked to identify a major reason for not having reconstruction, two predominant issues emerged: (1) fearing complications and (2) perceiving themselves as being too old for the procedure. Twelve percent of the prosthesis group experienced difficulty in making the decision not to have reconstruction. Three factors accounted for this difficulty: (1) the lack of family support, (2) the inability to have a specific type of reconstruction, and (3) the perception that friends and acquaintances saw the surgery as cosmetic. The most frequently reported reasons given by the reconstruction group for having reconstruction included: (1) to get rid of the external breast prosthesis, (2) to be able to wear many different types of clothing, (3) to regain femininity, and (4) to feel whole again. The least influential factors were to improve marital and sexual relations. 

  3. PSI decides to write off most of its $2.7B Marble Hill investment

    SciTech Connect

    Not Available

    1986-03-01

    After the Indiana Supreme Court ruled last November that the utility may not recover its investment from the cancelled plant, Public Service Indiana (PSI) decided to write off a substantial portion of the $2.7 billion already invested in the cancelled Marble Hill nuclear plant. The board will omit common stock dividends for three years and the preferred stock dividend for the first quarter. It will also accept a negotiated rate settlement of 8.2% increase. A 5% emergency surcharge will become permanent. The settlement calls for the utility to restrict capital expenditures over the next three years to the $285.1 million already budgeted for construction. Opposition from a consumers group argues that ratepayers should not be the risk bearers for PSI, but the utility argues that its long-term financial health depends on attracting and keeping investors.

  4. Neurocomputational account of how the human brain decides when to have a break.

    PubMed

    Meyniel, Florent; Sergent, Claire; Rigoux, Lionel; Daunizeau, Jean; Pessiglione, Mathias

    2013-02-12

    No pain, no gain: cost-benefit trade-off has been formalized in classical decision theory to account for how we choose whether to engage effort. However, how the brain decides when to have breaks in the course of effort production remains poorly understood. We propose that decisions to cease and resume work are triggered by a cost evidence accumulation signal reaching upper and lower bounds, respectively. We developed a task in which participants are free to exert a physical effort knowing that their payoff would be proportional to their effort duration. Functional MRI and magnetoencephalography recordings conjointly revealed that the theoretical cost evidence accumulation signal was expressed in proprioceptive regions (bilateral posterior insula). Furthermore, the slopes and bounds of the accumulation process were adapted to the difficulty of the task and the money at stake. Cost evidence accumulation might therefore provide a dynamical mechanistic account of how the human brain maximizes benefits while preventing exhaustion.
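
    The bounded accumulation model described above can be sketched as a toy simulation; this is one reading of the abstract with made-up parameters, not the authors' code:

```python
# Toy sketch of cost-evidence accumulation: cost rises during effort and
# decays during rest; hitting the upper bound triggers a break, hitting
# the lower bound triggers resumption of work. Rates and bounds are
# illustrative assumptions.
def simulate(n_steps, rise=1.0, decay=1.5, upper=10.0, lower=2.0):
    cost, working, switches = 0.0, True, []
    for t in range(n_steps):
        cost += rise if working else -decay
        if working and cost >= upper:
            working = False
            switches.append(("break", t))
        elif not working and cost <= lower:
            working = True
            switches.append(("resume", t))
    return switches

for event, t in simulate(40):
    print(f"t={t:2d}: {event}")
```

    Steeper rise rates or lower bounds (harder tasks, less at stake) make breaks more frequent, which is the adaptation the fMRI/MEG data supported.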

  5. Sliding versus Deciding in Relationships: Associations with Relationship Quality, Commitment, and Infidelity

    PubMed Central

    Owen, Jesse; Rhoades, Galena K.; Stanley, Scott M.

    2013-01-01

    From choosing a partner to date to deciding to cohabit or marry, individuals are faced with many relationship choices. Given the costs of failed relationships (e.g., personal distress, problems with work, lower well-being for children, lost opportunities to meet other partners), it is important to consider how individuals approach these decisions. The current study tested whether more thoughtful and clear relationship decision-making processes would relate to individuals’ levels of satisfaction with and dedication to their partners as well as their extra-dyadic involvements. In a sample of 252 men and women, the results showed that regardless of relationship status (i.e., dating, cohabiting, or married), those who reported more thoughtful decision-making processes also reported more dedication to their partners, higher satisfaction with the relationship, and fewer extra-dyadic involvements. PMID:23690736

  6. How to decide whether to move species threatened by climate change.

    PubMed

    Rout, Tracy M; McDonald-Madden, Eve; Martin, Tara G; Mitchell, Nicola J; Possingham, Hugh P; Armstrong, Doug P

    2013-01-01

    Introducing species to areas outside their historical range to secure their future under climate change is a controversial strategy for preventing extinction. While the debate over the wisdom of this strategy continues, such introductions are already taking place. Previous frameworks for analysing the decision to introduce have lacked a quantifiable management objective and mathematically rigorous problem formulation. Here we develop the first rigorous quantitative framework for deciding whether or not a particular introduction should go ahead, which species to prioritize for introduction, and where and how to introduce them. It can also be used to compare introduction with alternative management actions, and to prioritize questions for future research. We apply the framework to a case study of tuatara (Sphenodon punctatus) in New Zealand. While simple and accessible, this framework can accommodate uncertainty in predictions and values. It provides essential support for the existing IUCN guidelines by presenting a quantitative process for better decision-making about conservation introductions. PMID:24146778

  7. FDA preemption of drug and device labeling: who should decide what goes on a drug label?

    PubMed

    Valoir, Tamsen; Ghosh, Shubha

    2011-01-01

    The Supreme Court decided an issue that is critical to consumer health and safety last year. In April 2009, the Supreme Court held that extensive FDA regulation of drugs did not preempt a state law claim that an additional warning on the label was necessary to make the drug reasonably safe for use. Thus, states--and even courts and juries--are now free to cast their vote on what a drug label should say. This is in direct contrast to medical devices, where the federal statute regulating medical devices expressly provides that state regulations are preempted. This Article discusses basic preemption principles and drugs, and explores the policy ramifications of pro- and anti-preemption policy in the healthcare industry.

  9. Should I call an interpreter?-How do physicians with second language skills decide?

    PubMed

    Andres, Ellie; Wynia, Matthew; Regenstein, Marsha; Maul, Lauren

    2013-05-01

    Very little is known about how and when clinicians use their second language skills in patient care and when they rely on interpreters. The purpose of this study was to identify the factors most relevant to physicians' decision-making process when confronting the question of whether their language skills suffice to communicate effectively with patients in particular encounters. We conducted 25 in-depth, semi-structured telephone interviews with physicians in different practice settings who, while not native speakers, routinely interact with limited-English-proficiency (LEP) patients using second language skills. Physicians consider a variety of factors in deciding whether to use their own language skills in clinical care, including their own and their patient's language proficiency, costs, convenience, and the clinical risk or complexity of the encounter. This study suggests the need for practical guidance and training for clinicians on the appropriate use of second language skills and interpreters in clinical care.

  10. [DECIDE: developing and evaluating communication strategies to support informed decisions and practice based on evidence].

    PubMed

    Parmelli, Elena; Amato, Laura; Saitto, Carlo; Davoli, Marina

    2013-10-01

    Healthcare systems are offered a wide range of technologies and services, but they have to cope with decreasing resources and uncertainty about what is effective and most appropriate. Making decisions about health care interventions is complex. Decisions should be informed by the best available evidence, be comprehensive enough to take into account all the relevant aspects (e.g. efficacy, safety, equity, costs), and be taken within a limited time period. DECIDE is a project funded by the European Community that, using the GRADE methodology, aims at implementing strategies to enhance dissemination and communication of scientific evidence to support on-time evidence-based decision making in clinical practice and healthcare policies. Communication strategies are developed in order to address different target audiences, trying to meet their information needs. One key target is policy makers and managers who are responsible for coverage decision making. PMID:24326703

  11. An absurd inconsistency in law: Nicklinson's case and deciding to die.

    PubMed

    Douglas, Michael

    2014-03-01

    R (Nicklinson) v Ministry of Justice [2012] EWHC 2381 was a tragic case that considered a perennial question: whether voluntary active euthanasia is murder. The traditional position was affirmed, that is, it is indeed murder. The law's treatment of decisions to refuse treatment resulting in death is a stark contrast to the position in respect of voluntary, active euthanasia. In cases of refusing treatment, principles of individual autonomy are paramount. This article presents an overview of the legal distinction between refusing medical treatment and voluntary, active euthanasia. It questions the purported differences between what are described as acts of "active" or "passive" euthanasia. It also highlights the inconsistency of the law's treatment of different ways that people decide to die. PMID:24804532

  13. Decide now, pay later: Early influences in math and science education

    SciTech Connect

    Malcom, S.

    1995-12-31

    Who are the people deciding to major in science, math or engineering in college? The early interest in science and math that can lead to science and engineering careers is shaped as much by the encompassing world of the child as by formal education experiences. This paper documents what we know and what we need to know about the influences on children from pre-kindergarten through sixth grade, including the home, pre-school groups, science and math programs in churches, community groups, the media, cultural institutions (museums, zoos, botanical gardens), libraries, and schools (curriculum, instruction, policies and assessment). It also covers the nature and quality of curricular and intervention programs, and identifies strategies that appear to be most effective for various groups.

  14. Deciding Optimal Noise Monitoring Sites with Matrix Gray Absolute Relation Degree Theory

    NASA Astrophysics Data System (ADS)

    Gao, Zhihua; Li, Yadan; Zhao, Limin; Wang, Shuangwei

    2015-08-01

    Noise maps are used to assess noise levels in cities around the world. There are two main ways of producing them: theoretical simulation from surrounding conditions such as traffic flow and building distribution, or calculation of noise levels from actual measurement data collected by noise monitors. The current literature focuses mainly on incorporating more of the factors that affect sound propagation into theoretical simulations, and on interpolation methods for producing noise maps from measurements. However many factors are considered during simulation, noise maps still have to be calibrated against actual noise measurements, so the way noise data are obtained is significant to both producing and calibrating a noise map. Little has been written, though, about how to choose monitoring sites given a fixed number of noise sensors, or about the deviation of a noise map produced from their data. In this work, using matrix Gray Absolute Relation Degree Theory, we calculated the relation degrees between the most precise noise surface and surfaces interpolated from different combinations of a specified number of noise measurements. Surfaces plotted from different combinations of noise data produced different relation degrees with the most precise surface. We then identified the least significant site among the total and calculated the corresponding deviation when it was excluded from the noise surface. Processing the remaining noise data in the same way, we singled out the least significant sites one by one. With this method, we optimized the distribution of noise sensors in an area of about 2 km², and calculated the bias of the surfaces produced with the least significant data removed. Our practice offers a practical solution for governments facing limited financial budgets for noise monitoring, especially in
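
    The core quantity, an absolute degree of grey incidence between two sequences, can be sketched in one common textbook form (after Liu and Lin); the paper's matrix variant may differ in detail, and the sequences below are invented:

```python
# Sketch of the absolute degree of grey incidence between two equal-
# length sequences, in one common textbook form. Similar-shaped
# sequences score near 1; dissimilar ones score lower.
def abs_grey_incidence(x0, x1):
    assert len(x0) == len(x1) >= 3
    def s(seq):
        z = [v - seq[0] for v in seq]      # zero-starting image
        return sum(z[1:-1]) + 0.5 * z[-1]
    s0, s1 = s(x0), s(x1)
    return (1 + abs(s0) + abs(s1)) / (1 + abs(s0) + abs(s1) + abs(s1 - s0))

a = [10.0, 12.0, 15.0, 14.0]
b = [10.5, 12.4, 15.6, 14.3]   # similar shape to a
c = [10.0,  9.0,  7.0,  8.0]   # opposite trend
print(round(abs_grey_incidence(a, b), 3), round(abs_grey_incidence(a, c), 3))
```

    Ranking candidate monitoring sites by how little their removal changes such relation degrees is one way to read the elimination procedure the abstract describes.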

  15. Role of serum interleukin-6 in deciding therapy for multidrug resistant oral lichen planus

    PubMed Central

    Marwah, Akanksha; Kaushik, Smita; Garg, Vijay K.; Gupta, Sunita

    2015-01-01

    Background Oral lichen planus (OLP) is a T cell mediated immune response. T cells locally present in the involved tissues release cytokines like interleukin-6 (IL-6), which contributes to the pathogenesis of OLP. IL-6 has also been associated with multidrug resistance protein (MRP) expression by keratinocytes, and correspondingly, upregulation of MRP has been found in OLP. We conducted this study to evaluate the effects of various drugs on serum IL-6 in OLP, and the correlation of these effects with the nature of the clinical response and the resistance pattern seen in OLP lesions under various therapeutic modalities. We thus evaluated the role of serum IL-6 in deciding therapy for multidrug resistant OLP. Material and Methods Serum IL-6 was evaluated in 42 erosive OLP (EOLP) patients, 10 normal mucosa cases, and 10 oral squamous cell carcinoma cases using the ELISA technique. OLP patients were randomly divided into 3 groups of 14 patients each and were subjected to Pimecrolimus local application, oral Mycophenolate Mofetil (MMF), and Methotrexate (MTX) along with Pimecrolimus local application. IL-6 levels were evaluated before and after treatment. Results Serum IL-6 levels were raised above 3 pg/ml in 26.19% of EOLP cases (mean 3.72±8.14). EOLP cases (5%) with IL-6 levels above 5 pg/ml were resistant in the MTX group. However, a significant decrease in serum IL-6 corresponding with clinical resolution was seen in the MMF group. Conclusions Significantly raised IL-6 levels in EOLP reflect the chronic inflammatory nature of the disease. As serum IL-6 levels significantly decreased in the MMF group, correspondingly no resistance to treatment was noted. With MTX, however, there was no significant decrease in IL-6, and resistance to treatment was noted in some, especially plaque type lesions. Thus IL-6 can be a possible biomarker in deciding the best possible therapy for treatment resistant OLP. Key words:Lichen planus, biological markers, cytokines, enzyme-linked immunosorbent assay, immunosuppressive

  16. Non-therapeutic research with minors: how do chairpersons of German research ethics committees decide?

    PubMed Central

    Lenk, C; Radenbach, K; Dahl, M; Wiesemann, C

    2004-01-01

    Objectives: Clinical trials in humans in Germany—as in many other countries—must be approved by local research ethics committees (RECs). The current study has been designed to document and evaluate decisions of chairpersons of RECs in the problematic field of non-therapeutic research with minors. The authors' purpose was to examine whether non-therapeutic research was acceptable for chairpersons at all, and whether there was certainty on how to decide in research trials involving more than minimal risk. Design: In a questionnaire, REC chairpersons had to evaluate five different scenarios with (in parts) non-therapeutic research. The scenarios described realistic potential research projects with minors, involving increasing levels of risk for the research participants. The chairpersons had to decide whether the respective projects should be approved. Methods: A total of 49 German REC chairpersons were sent questionnaires; 29 questionnaires were returned. The main measurements were approval or rejection of research scenarios. Results: Chairpersons of German RECs generally tend to accept non-therapeutic research with minors if the apparent risk for the participating children is low. If the risk is clearly higher than "minimal", the chairpersons' decisions differ widely. Conclusion: The fact that there seem to be different attitudes of chairpersons to non-therapeutic research with minors is problematic from an ethical point of view. It suggests a general uncertainty about the standards of protection for minor research participants in Germany. Therefore, further ethical and legal regulation of non-therapeutic research with minors in Germany seems necessary. PMID:14872082

  17. The Project Manager's Tool Kit

    NASA Technical Reports Server (NTRS)

    Cameron, W. Scott

    2003-01-01

    Project managers are rarely described as being funny. Moreover, a good sense of humor rarely seems to be one of the deciding factors in choosing someone to be a project manager, or something that pops up as a major discussion point at an annual performance review. Perhaps this is because people think you aren't serious about your work if you laugh. I disagree with this assessment, but that's not really my point. As I talk to people either pursuing a career in project management, or broadening their assignment to include project management, I encourage them to consider what tools they need to be successful. I suggest that they consider any strength they have to be part of their Project Management (PM) Tool Kit, and being funny could be one of the tools they need.

  18. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries

    PubMed Central

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Background and objective Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Methods Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. Results The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. Conclusions The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents. PMID:25332357
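
    The F-measures reported above combine precision and recall over extracted terms; a minimal sketch of the computation, with invented counts rather than the paper's confusion matrix:

```python
# F-measure (F1) from term-level counts: true positives (terms the
# pipeline encoded correctly), false positives (spurious extractions),
# and false negatives (gold terms it missed). Counts are illustrative.
def f_measure(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 520 of 588 Observation terms found, with 60 spurious extractions
print(round(f_measure(tp=520, fp=60, fn=68), 2))  # prints 0.89
```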

  19. Machine tool locator

    DOEpatents

    Hanlon, John A.; Gill, Timothy J.

    2001-01-01

    Machine tools can be accurately measured and positioned on manufacturing machines within very small tolerances by using an autocollimator on a 3-axis mount, positioned so as to focus on a reference tooling ball or a machine tool; a digital camera connected to the viewing end of the autocollimator; and a marker-and-measure generator that receives digital images from the camera, displays or measures distances between the projection reticle and the reference reticle on the monitoring screen, and relates those distances to the actual position of the autocollimator relative to the reference tooling ball. The images and measurements are used to set the position of the machine tool, to measure the size and shape of the machine tool tip, and to examine cutting edge wear.
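
    The measurement step, turning a reticle offset in camera pixels into a physical displacement, can be sketched as below; the `mm_per_px` calibration factor and the coordinates are illustrative assumptions, not values from the patent:

```python
# Hedged sketch: converting the pixel offset between the projection
# reticle and the reference reticle in the camera image into a physical
# displacement, given an assumed camera calibration scale.
def reticle_offset_mm(proj_px, ref_px, mm_per_px=0.002):
    """(x, y) reticle centers in pixels -> (dx, dy, distance) in mm."""
    dx = (proj_px[0] - ref_px[0]) * mm_per_px
    dy = (proj_px[1] - ref_px[1]) * mm_per_px
    return dx, dy, (dx * dx + dy * dy) ** 0.5

dx, dy, dist = reticle_offset_mm(proj_px=(512, 400), ref_px=(500, 395))
print(f"offset: dx={dx:.3f} mm, dy={dy:.3f} mm, |d|={dist:.3f} mm")
```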

  20. The synthesis map is a multidimensional educational tool that provides insight into students' mental models and promotes students' synthetic knowledge generation.

    PubMed

    Ortega, Ryan A; Brame, Cynthia J

    2015-01-01

    Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping, which utilizes embedding to layer information within concepts. Prezi's zooming user interface lets the author of the presentation use both depth and distance to show connections between data, ideas, and concepts. Students in the class Biology of Cancer created synthesis maps to illustrate their knowledge of tumorigenesis. Students used multiple organizational schemes to build their maps. We present an analysis of student work, placing special emphasis on organization within student maps and how the organization of knowledge structures in student maps can reveal strengths and weaknesses in student understanding or instruction. We also provide a discussion of best practices for instructors who would like to implement synthesis mapping in their classrooms.

  1. Real-Time Measurement of the Tool-Tissue Interaction in Minimally Invasive Abdominal Surgery: The First Step to Developing the Next Generation of Smart Laparoscopic Instruments.

    PubMed

    Barrie, Jenifer; Jayne, David G; Neville, Anne; Hunter, Louise; Hood, Adrian J; Culmer, Peter R

    2016-10-01

    Introduction Analysis of force application in laparoscopic surgery is critical to understanding the nature of the tool-tissue interaction. The aim of this study is to provide real-time data about manipulations to abdominal organs. Methods An instrumented short fenestrated grasper was used in an in vivo porcine model, measuring force at the grasper handle. Grasping force and duration over 5 small bowel manipulation tasks were analyzed. Forces required to retract gallbladder, bladder, small bowel, large bowel, and rectum were measured over 30 seconds. Four parameters were calculated: T(hold), the grasp time; T(close), time taken for the jaws to close; F(max), maximum force reached; and F(rms), root mean square force (representing the average force across the grasp time). Results Mean F(max) to manipulate the small bowel was 20.5 N (±7.2) and F(rms) was 13.7 N (±5.4). Mean T(close) was 0.52 seconds (±0.26) and T(hold) was 3.87 seconds (±1.5). In individual organs, mean F(max) was 49 N (±15) to manipulate the rectum and 59 N (±13.4) for the colon. The mean F(max) for bladder and gallbladder retraction was 28.8 N (±7.4) and 50.7 N (±3.8), respectively. All organs exhibited force relaxation; the F(rms) reduced to below 25 N for all organs except the small bowel, with a mean F(rms) of less than 10 N. Conclusion This study has commenced the process of quantifying tool-tissue interaction. The static measurements discussed here should evolve to include dynamic measurements such as shear, torque, and retraction forces, and be correlated with evidence of histological damage to tissue.
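    The F(rms) parameter above is the root mean square of the sampled grasp force over the hold time. A minimal sketch of that calculation, using made-up force samples in newtons rather than data from the study:

```python
import math

def f_rms(samples):
    """Root mean square of a force trace: sqrt(mean of squared samples)."""
    return math.sqrt(sum(f * f for f in samples) / len(samples))

# Illustrative grasp-force samples (N); not taken from the porcine experiments.
trace = [18.0, 20.5, 15.2, 12.8, 10.1]
print(round(f_rms(trace), 1))  # → 15.8
```

    Because squaring weights large excursions more heavily, F(rms) sits between the mean and the peak of the trace, which is why it is reported alongside F(max) as a summary of the average loading across the grasp.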

  2. The Xygra gun simulation tool.

    SciTech Connect

    Garasi, Christopher Joseph; Lamppa, Derek C.; Aubuchon, Matthew S.; Shirley, David Noyes; Robinson, Allen Conrad; Russo, Thomas V.

    2008-12-01

    Inductive electromagnetic launchers, or coilguns, use discrete solenoidal coils to accelerate a coaxial conductive armature. To date, Sandia has been using an internally developed code, SLINGSHOT, as a point-mass lumped circuit element simulation tool for modeling coilgun behavior for design and verification purposes. This code has shortcomings in terms of accurately modeling gun performance under stressful electromagnetic propulsion environments. To correct for these limitations, it was decided to attempt to closely couple two Sandia simulation codes, Xyce and ALEGRA, to develop a more rigorous simulation capability for demanding launch applications. This report summarizes the modifications made to each respective code and the path forward to completing interfacing between them.

  3. Percussion tool

    DOEpatents

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing that repeatedly strikes the tool bit; and a reciprocally moveable piston enclosed within the hammer that imparts reciprocal movement to the hammer.

  4. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  5. Radical-generating coordination complexes as tools for rapid and effective fragmentation and fluorescent labeling of nucleic acids for microchip hybridization.

    SciTech Connect

    Kelly, J. J.; Chernov, B. K.; Tovstanovsky, I.; Mirzabekov, A. D.; Bavykin, S. G.; Biochip Technology Center; Northwestern Univ.; Engelhardt Inst. of Molecular Biology

    2002-12-15

    DNA microchip technology is a rapid, high-throughput method for nucleic acid hybridization reactions. This technology requires random fragmentation and fluorescent labeling of target nucleic acids prior to hybridization. Radical-generating coordination complexes, such as 1,10-phenanthroline-Cu(II) (OP-Cu) and Fe(II)-EDTA (Fe-EDTA), have been commonly used as sequence nonspecific 'chemical nucleases' to introduce single-strand breaks in nucleic acids. Here we describe a new method based on these radical-generating complexes for random fragmentation and labeling of both single- and double-stranded forms of RNA and DNA. Nucleic acids labeled with the OP-Cu and the Fe-EDTA protocols revealed high hybridization specificity in hybridization with DNA microchips containing oligonucleotide probes selected for identification of 16S rRNA sequences of the Bacillus group microorganisms. We also demonstrated cDNA- and cRNA-labeling and fragmentation with this method. Both the OP-Cu and Fe-EDTA fragmentation and labeling procedures are quick and inexpensive compared to other commonly used methods. A column-based version of the described method does not require centrifugation and therefore is promising for the automation of sample preparations in DNA microchip technology as well as in other nucleic acid hybridization studies.

  6. Assessment of Targeted Next-Generation Sequencing as a Tool for the Diagnosis of Charcot-Marie-Tooth Disease and Hereditary Motor Neuropathy.

    PubMed

    Lupo, Vincenzo; García-García, Francisco; Sancho, Paula; Tello, Cristina; García-Romero, Mar; Villarreal, Liliana; Alberti, Antonia; Sivera, Rafael; Dopazo, Joaquín; Pascual-Pascual, Samuel I; Márquez-Infante, Celedonio; Casasnovas, Carlos; Sevilla, Teresa; Espinós, Carmen

    2016-03-01

    Charcot-Marie-Tooth disease is characterized by broad genetic heterogeneity with >50 known disease-associated genes. Mutations in some of these genes can cause a pure motor form of hereditary motor neuropathy, the genetics of which are poorly characterized. We designed a panel comprising 56 genes associated with Charcot-Marie-Tooth disease/hereditary motor neuropathy. We validated this diagnostic tool by first testing 11 patients with pathological mutations. A cohort of 33 affected subjects was selected for this study. The DNAJB2 c.352+1G>A mutation was detected in two cases; novel changes and/or variants with low frequency (<1%) were found in 12 cases. There were no candidate variants in 18 cases, and amplification failed for one sample. The DNAJB2 c.352+1G>A mutation was also detected in three additional families. On haplotype analysis, all of the patients from these five families shared the same haplotype; therefore, the DNAJB2 c.352+1G>A mutation may be a founder event. Our gene panel allowed us to perform a very rapid and cost-effective screening of genes involved in Charcot-Marie-Tooth disease/hereditary motor neuropathy. Our diagnostic strategy was robust in terms of both coverage and read depth for all of the genes and patient samples. These findings demonstrate the difficulty in achieving a definitive molecular diagnosis because of the complexity of interpreting new variants and the genetic heterogeneity that is associated with these neuropathies.

  7. HBP Builder: A Tool to Generate Hyperbranched Polymers and Hyperbranched Multi-Arm Copolymers for Coarse-grained and Fully Atomistic Molecular Simulations

    PubMed Central

    Yu, Chunyang; Ma, Li; Li, Shanlong; Tan, Haina; Zhou, Yongfeng; Yan, Deyue

    2016-01-01

    Computer simulation has become a versatile tool for investigating detailed information from the microscopic scale to the mesoscopic scale. However, the crucial first step of molecular simulation is model building, particularly for hyperbranched polymers (HBPs) and hyperbranched multi-arm copolymers (HBMCs) with complex and varied topological structures. Unlike well-defined polymers, HBPs/HBMCs not only have polydisperse molar weights, but HBPs/HBMCs with the same degree of polymerization (DP) and degree of branching (DB) can also adopt many possible topological structures, which makes model building for molecular simulation difficult. To build a bridge between model building and molecular simulation of HBPs and HBMCs, we developed HBP Builder, an open-source HBPs/HBMCs building toolkit written in C. HBP Builder implements an automated protocol to build various coarse-grained and fully atomistic structures of HBPs/HBMCs according to the user's specific requirements. Meanwhile, coarse-grained and fully atomistic output structures can be directly employed in popular simulation packages, including HOOMD, Tinker, and Gromacs. Moreover, HBP Builder has an easy-to-use graphical user interface and a modular architecture, making it easy to extend and to reuse as part of other programs. PMID:27188541
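    The degree of branching (DB) mentioned above can be quantified in more than one way; a minimal sketch using one common definition (Fréchet's DB = (D + T)/(D + T + L), where D, L, and T count dendritic, linear, and terminal repeat units) with hypothetical unit counts, not output from HBP Builder:

```python
def degree_of_branching(dendritic, linear, terminal):
    """Fréchet degree of branching: DB = (D + T) / (D + T + L).

    DB = 1 for a perfect dendrimer (no linear units), and approaches
    0 for a purely linear chain.
    """
    return (dendritic + terminal) / (dendritic + terminal + linear)

# Hypothetical unit counts for a small hyperbranched polymer.
print(round(degree_of_branching(dendritic=12, linear=20, terminal=13), 2))  # → 0.56
```

    Two polymers with identical DP and DB computed this way can still differ in topology (e.g. where the dendritic units sit relative to the core), which is exactly the degeneracy that makes automated model building useful.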

  8. Style Guide: An Interdisciplinary Communication Tool to Support the Process of Generating Tailored Infographics From Electronic Health Data Using EnTICE3

    PubMed Central

    Arcia, Adriana; Velez, Mark; Bakken, Suzanne

    2015-01-01

    Purpose: In this case study we describe key features of the structured communication tool—a style guide—used to support interdisciplinary collaboration, and we propose the use of such a tool for research teams engaged in similar projects. We employ tailored infographics to present patient reported outcome data from a community health survey back, in a comprehensible and actionable manner, to the individuals who provided it. The style guide was developed to bridge the semantic gap between the domain and programming experts engaged in this effort. Innovation: The style guide supports the communication of complex design specifications in a highly structured format that is nevertheless flexible enough to accommodate project growth. Unlike the typical corporate style guide that has a more narrative format, our style guide is innovative in its use of consistent fields across multiple, standalone entries. Credibility: The process of populating the style guide prompted the designer toward greater design efficiency and led to consistent and specific instructions that met the framework architect’s stated information needs. Discussion and Conclusion: The guiding values in the creation of the style guide were consistency, clarity, and flexibility. It serves as a durable reference to the desired look and functionality of the final infographic product without dictating an implementation strategy. The style guide format can be adapted to meet the communication needs of other interdisciplinary teams facing a semantic gap. PMID:25848634

  9. Next-generation sequencing (NGS) as a fast molecular diagnosis tool for left ventricular noncompaction in an infant with compound mutations in the MYBPC3 gene.

    PubMed

    Schaefer, Elise; Helms, Pauline; Marcellin, Luc; Desprez, Philippe; Billaud, Philippe; Chanavat, Valérie; Rousson, Robert; Millat, Gilles

    2014-03-01

    Left ventricular noncompaction (LVNC) is a clinically heterogeneous disorder characterized by a trabecular meshwork and deep intertrabecular myocardial recesses that communicate with the left ventricular cavity. LVNC is classified as a rare genetic cardiomyopathy. Molecular diagnosis is a challenge for the medical community as the condition shares morphologic features of hypertrophic and dilated cardiomyopathies. Several genetic causes of LVNC have been reported, with variable modes of inheritance, including autosomal dominant and X-linked inheritance, but relatively few responsible genes have been identified. In this report, we describe a case of a severe form of LVNC leading to death at 6 months of life. NGS using a custom hypertrophic cardiomyopathy panel design allowed us to identify compound heterozygosity in the MYBPC3 gene (p.Lys505del, p.Pro955fs) in 3 days, confirming NGS as a fast molecular diagnostic tool. Other studies have reported neonatal presentation of cardiomyopathies associated with compound heterozygous or homozygous MYBPC3 mutations. In this family and in families in which parental truncating MYBPC3 mutations are identified, preimplantation or prenatal genetic screening should be considered as these genotypes lead to neonatal mortality and morbidity.

  10. HBP Builder: A Tool to Generate Hyperbranched Polymers and Hyperbranched Multi-Arm Copolymers for Coarse-grained and Fully Atomistic Molecular Simulations

    NASA Astrophysics Data System (ADS)

    Yu, Chunyang; Ma, Li; Li, Shanlong; Tan, Haina; Zhou, Yongfeng; Yan, Deyue

    2016-05-01

    Computer simulation has become a versatile tool for investigating detailed information from the microscopic scale to the mesoscopic scale. However, the crucial first step of molecular simulation is model building, particularly for hyperbranched polymers (HBPs) and hyperbranched multi-arm copolymers (HBMCs) with complex and varied topological structures. Unlike well-defined polymers, HBPs/HBMCs not only have polydisperse molar weights, but HBPs/HBMCs with the same degree of polymerization (DP) and degree of branching (DB) can also adopt many possible topological structures, which makes model building for molecular simulation difficult. To build a bridge between model building and molecular simulation of HBPs and HBMCs, we developed HBP Builder, an open-source HBPs/HBMCs building toolkit written in C. HBP Builder implements an automated protocol to build various coarse-grained and fully atomistic structures of HBPs/HBMCs according to the user's specific requirements. Meanwhile, coarse-grained and fully atomistic output structures can be directly employed in popular simulation packages, including HOOMD, Tinker, and Gromacs. Moreover, HBP Builder has an easy-to-use graphical user interface and a modular architecture, making it easy to extend and to reuse as part of other programs.

  11. Identification of the first multi-exonic WDR72 deletion in isolated amelogenesis imperfecta, and generation of a WDR72-specific copy number screening tool.

    PubMed

    Hentschel, Julia; Tatun, Dana; Parkhomchuk, Dmitri; Kurth, Ingo; Schimmel, Bettina; Heinrich-Weltzien, Roswitha; Bertzbach, Sabine; Peters, Hartmut; Beetz, Christian

    2016-09-15

    Amelogenesis imperfecta (AI) is a clinically and genetically heterogeneous disorder of tooth development which is due to aberrant deposition or composition of enamel. Both syndromic and isolated forms exist; they may be inherited in an X-linked, autosomal recessive, or autosomal dominant manner. WDR72 is one of ten currently known genes for recessive isolated AI; nine WDR72 mutations affecting single nucleotides have been described to date. Based on whole exome sequencing in a large consanguineous AI pedigree, we obtained evidence for presence of a multi-exonic WDR72 deletion. A home-made multiplex ligation-dependent probe amplification assay was used to confirm the aberration, to narrow its extent, and to identify heterozygous carriers. Our study extends the mutational spectrum for WDR72 to include large deletions, and supports a relevance of the previously proposed loss-of-function mechanism. It also introduces an easy-to-use and highly sensitive tool for detecting WDR72 copy number alterations.

  12. Circularly polarized high harmonics generated by a bicircular field from inert atomic gases in the p state: A tool for exploring chirality-sensitive processes

    NASA Astrophysics Data System (ADS)

    Milošević, D. B.

    2015-10-01

    S-matrix theory of high-order harmonic generation (HHG) is generalized to multielectron atoms. In the multielectron case the harmonic power is expressed via a coherent sum of the time-dependent dipoles, while for the one-electron models a corresponding incoherent sum appears. This difference is important for the inert atomic gases having a p ground state as used in a recent HHG experiment with a bicircular field [Nat. Photonics 9, 99 (2015), 10.1038/nphoton.2014.293]. We investigate HHG by such a bicircular field, which consists of two coplanar counter-rotating circularly polarized fields of frequencies rω and sω. Selection rules for HHG by a bicircular field are analyzed from the aspects of dynamical symmetry of the system, conservation of the projection of the angular momentum on a fixed quantization axis, and the quantum number of the initial and final atomic ground states. A distinction is made between the selection rules for atoms with closed [J. Phys. B 48, 171001 (2015), 10.1088/0953-4075/48/17/171001] and nonclosed shells. An asymmetry in emission of the left- and right-circularly polarized harmonics is found and explained by using a semiclassical model and the electron probability currents which are related to a nonzero magnetic quantum number. This asymmetry can be important for the application of such harmonics to the exploration of chirality-sensitive processes and for generation of elliptic or even circular attosecond pulse trains. Such attosecond pulse trains are analyzed for longer wavelengths than in Opt. Lett. 40, 2381 (2015), 10.1364/OL.40.002381, and for various field-component intensities.

  13. The use of the replication region of plasmid pRS7 from Oenococcus oeni as a putative tool to generate cloning vectors for lactic acid bacteria.

    PubMed

    Rodríguez, M Carmen; Alegre, M Teresa; Martín, M Cruz; Mesas, Juan M

    2015-01-01

    A chimeric plasmid, pRS7Rep (6.1 kb), was constructed using the replication region of pRS7, a large plasmid from Oenococcus oeni, and pEM64, a plasmid derived from pIJ2925 and containing a gene for resistance to chloramphenicol. pRS7Rep is a shuttle vector that replicates in Escherichia coli using its pIJ2925 component and in lactic acid bacteria (LAB) using the replication region of pRS7. High levels of transformants per µg of DNA were obtained by electroporation of pRS7Rep into Pediococcus acidilactici (1.5 × 10⁷), Lactobacillus plantarum (5.7 × 10⁵), Lactobacillus casei (2.3 × 10⁵), Leuconostoc citreum (2.7 × 10⁵), and Enterococcus faecalis (2.4 × 10⁵). A preliminary optimisation of the technical conditions of electrotransformation showed that P. acidilactici and L. plantarum are better transformed at a later exponential phase of growth, whereas L. casei requires the early exponential phase for better electrotransformation efficiency. pRS7Rep contains single restriction sites useful for cloning purposes (BamHI, XbaI, SalI, HincII, SphI and PstI), and was maintained at an acceptable rate (>50%) over 100 generations without selective pressure in L. plantarum, but was less stable in L. casei and P. acidilactici. The ability of pRS7Rep to accept and express other genes was assessed. To the best of our knowledge, this is the first time that the replication region of a plasmid from O. oeni has been used to generate a cloning vector.

  14. Assessment of Targeted Next-Generation Sequencing as a Tool for the Diagnosis of Charcot-Marie-Tooth Disease and Hereditary Motor Neuropathy.

    PubMed

    Lupo, Vincenzo; García-García, Francisco; Sancho, Paula; Tello, Cristina; García-Romero, Mar; Villarreal, Liliana; Alberti, Antonia; Sivera, Rafael; Dopazo, Joaquín; Pascual-Pascual, Samuel I; Márquez-Infante, Celedonio; Casasnovas, Carlos; Sevilla, Teresa; Espinós, Carmen

    2016-03-01

    Charcot-Marie-Tooth disease is characterized by broad genetic heterogeneity with >50 known disease-associated genes. Mutations in some of these genes can cause a pure motor form of hereditary motor neuropathy, the genetics of which are poorly characterized. We designed a panel comprising 56 genes associated with Charcot-Marie-Tooth disease/hereditary motor neuropathy. We validated this diagnostic tool by first testing 11 patients with pathological mutations. A cohort of 33 affected subjects was selected for this study. The DNAJB2 c.352+1G>A mutation was detected in two cases; novel changes and/or variants with low frequency (<1%) were found in 12 cases. There were no candidate variants in 18 cases, and amplification failed for one sample. The DNAJB2 c.352+1G>A mutation was also detected in three additional families. On haplotype analysis, all of the patients from these five families shared the same haplotype; therefore, the DNAJB2 c.352+1G>A mutation may be a founder event. Our gene panel allowed us to perform a very rapid and cost-effective screening of genes involved in Charcot-Marie-Tooth disease/hereditary motor neuropathy. Our diagnostic strategy was robust in terms of both coverage and read depth for all of the genes and patient samples. These findings demonstrate the difficulty in achieving a definitive molecular diagnosis because of the complexity of interpreting new variants and the genetic heterogeneity that is associated with these neuropathies. PMID:26752306

  15. A Web Service Tool (SOAR) for the Dynamic Generation of L1 Grids of Coincident AIRS, AMSU and MODIS Satellite Sounding Radiance Data for Climate Studies

    NASA Astrophysics Data System (ADS)

    Halem, M.; Yesha, Y.; Tilmes, C.; Chapman, D.; Goldberg, M.; Zhou, L.

    2007-05-01

    Three decades of Earth remote sensing from NASA, NOAA and DOD operational and research satellites carrying successive generations of improved atmospheric sounder instruments have resulted in petabytes of radiance data with varying spatial and spectral resolutions being stored at different data archives in various data formats by the respective agencies. This evolution of sounders and the diversity of these archived data sets have led to data processing obstacles limiting the science community from readily accessing and analyzing such long-term climate data records. We address this problem by the development of a web-based Service Oriented Atmospheric Radiance (SOAR) system built on the SOA paradigm that makes it practical for the science community to dynamically access, manipulate and generate long-term records of L1 pre-gridded sounding radiances of coincident multi-sensor data for regions specified according to user-chosen criteria. SOAR employs a modification of the standard client-server interactions that allows users to represent themselves directly to the Process Server through their own web browsers. The browser uses AJAX to request JavaScript libraries and DHTML interfaces that define the possible client interactions and communicates the SOAP messages to the Process Server, allowing dynamic web dialogs with the user to take place on the fly. The Process Server is also connected to an underlying high-performance compute cluster and storage system which provides much of the data processing capabilities required to service the client requests. The compute cluster employs optical communications to NOAA and NASA for accessing the data and under the governance of the Process Server invokes algorithms for on-demand spatial, temporal, and spectral gridding. Scientists can choose from a variety of statistical averaging techniques for compositing satellite observed sounder radiances from the AIRS, AMSU or MODIS instruments to form spatial-temporal grids for
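    The on-demand gridding described above amounts to binning point radiance observations into spatial cells and compositing each cell with a chosen statistic. A minimal sketch, assuming simple mean averaging onto a 1-degree lat/lon grid (the function and the synthetic soundings below are illustrative, not SOAR's actual API or data):

```python
from collections import defaultdict

def grid_mean(observations, cell_deg=1.0):
    """Composite point radiances onto a lat/lon grid by mean averaging.

    observations: iterable of (lat, lon, radiance) tuples.
    Returns a dict mapping (lat_index, lon_index) to the mean radiance
    of all observations falling in that cell.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, rad in observations:
        cell = (int(lat // cell_deg), int(lon // cell_deg))  # floor to cell index
        sums[cell] += rad
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Three synthetic soundings; the first two fall in the same 1-degree cell.
obs = [(40.2, -75.5, 100.0), (40.7, -75.1, 110.0), (41.3, -75.4, 90.0)]
print(grid_mean(obs))
```

    Other compositing statistics (median, weighted mean, nearest-in-time) drop in by replacing the accumulation step; the binning itself is the same regardless of which instrument (AIRS, AMSU or MODIS) supplies the radiances.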

  16. Deciding on gender in children with intersex conditions: considerations and controversies.

    PubMed

    Thyen, Ute; Richter-Appelt, Hertha; Wiesemann, Claudia; Holterhus, Paul-Martin; Hiort, Olaf

    2005-01-01

    Biologic factors such as genetic and hormonal influences contribute to gender identity, gender role behavior, and sexual orientation in humans, but this relationship is considerably modified by psychologic, social, and cultural factors. The recognition of biologically determined conditions leading to incongruity of genetically determined sex, somatic phenotype, and gender identity has led to growing interest in gender role development and gender identity in individuals with intersex conditions. Sex assignment of children with ambiguous genitalia remains a difficult decision for the families involved and subject to controversial discussion among professionals and self-help groups. Although systematic empirical data on outcomes of functioning and health-related quality of life are sparse, anecdotal evidence from case series and individual patients about their experiences in healthcare suggests traumatic experiences in some. This article reviews the earlier 'optimal gender policy' as well as the more recent 'full consent policy' and reviews published data on both surgical and psychosocial outcomes. The professional debate on deciding on sex assignment in children with intersex conditions is embedded in a much wider public discourse on gender as a social construction. Given that the empirical basis of our knowledge of the causes, treatment options, long-term outcomes, and patient preferences is insufficient, we suggest preliminary recommendations based on clinical experience, study of the literature, and interviews with affected individuals.

  17. Liberty to decide on dual use biomedical research: an acknowledged necessity.

    PubMed

    Keuleyan, Emma

    2010-03-01

    Humanity entered the twenty-first century with revolutionary achievements in biomedical research. At the same time multiple "dual-use" results have been published. The battle against infectious diseases is meeting new challenges, with newly emerging and re-emerging infections. Both naturally occurring epidemics, such as SARS, avian influenza, haemorrhagic fevers, XDR and MDR tuberculosis and many others, and the possibility of intentional misuse, such as the letters containing anthrax spores in the USA in 2001, have raised awareness of the real threats. Many great men, including Goethe, Spinoza, J.B. Shaw, Fr. Engels, J.F. Kennedy and others, have recognized that liberty is also a responsibility. That is why the liberty to decide now represents an acknowledged necessity: biomedical research should be supported, conducted and published with appropriate measures to prevent potential "dual use". Biomedical scientists should work according to the ethical principles of their Code of Conduct, an analogue of the Hippocratic Oath of doctors; and they should inform government, society and their juniors about the problem. National science consulting boards of experts should be created to prepare guidelines and control the problem at state level. An international board should develop minimum standards to be applicable by each country. Bio-preparedness is considered another key measure.

  18. How to decide on stent insertion or surgery in colorectal obstruction?

    PubMed Central

    Zahid, Assad; Young, Christopher John

    2016-01-01

    Colorectal cancer is one of the most common cancers in western society, and malignant obstruction of the colon accounts for 8%-29% of all large bowel obstructions. Conventional treatment of these patients with malignant obstruction requiring urgent surgery is associated with a greater physiological insult on already nutritionally depleted patients. Of late the utility of colonic stents has offered an option in the management of these patients in both the palliative and bridge-to-surgery settings. This has been the subject of many reviews which highlight its efficacy, particularly in reducing ostomy rates, allowing quicker return to oral diet, minimising extended post-operative recovery, as well as some quality of life benefits. The uncertainty in managing patients with malignant colonic obstructions has led to a more cautious use of stenting technology as community equipoise exists. Decision-making analysis has demonstrated that surgeons favored the use of stents preferentially in the palliative setting when compared to the curative setting, where surgery was preferred. We aim to review the literature regarding the use of stent or surgery in colorectal obstruction, and then provide a discourse with regards to the approach in synthesising the data and applying it when deciding the appropriate application of stent or surgery in colorectal obstruction. PMID:26843916

  19. Liberty to decide on dual use biomedical research: an acknowledged necessity.

    PubMed

    Keuleyan, Emma

    2010-03-01

    Humanity entered the twenty-first century with revolutionary achievements in biomedical research. At the same time multiple "dual-use" results have been published. The battle against infectious diseases is meeting new challenges, with newly emerging and re-emerging infections. Both naturally occurring epidemics, such as SARS, avian influenza, haemorrhagic fevers, XDR and MDR tuberculosis and many others, and the possibility of intentional misuse, such as the letters containing anthrax spores in the USA in 2001, have raised awareness of the real threats. Many great men, including Goethe, Spinoza, J.B. Shaw, Fr. Engels, J.F. Kennedy and others, have recognized that liberty is also a responsibility. That is why the liberty to decide now represents an acknowledged necessity: biomedical research should be supported, conducted and published with appropriate measures to prevent potential "dual use". Biomedical scientists should work according to the ethical principles of their Code of Conduct, an analogue of the Hippocratic Oath of doctors; and they should inform government, society and their juniors about the problem. National science consulting boards of experts should be created to prepare guidelines and control the problem at state level. An international board should develop minimum standards to be applicable by each country. Bio-preparedness is considered another key measure. PMID:18427955

  20. Slime moulds use heuristics based on within-patch experience to decide when to leave.

    PubMed

    Latty, Tanya; Beekman, Madeleine

    2015-04-15

    Animals foraging in patchy, non-renewing or slowly renewing environments must make decisions about how long to remain within a patch. Organisms can use heuristics ('rules of thumb') based on available information to decide when to leave the patch. Here, we investigated proximate patch-departure heuristics in two species of giant, brainless amoeba: the slime moulds Didymium bahiense and Physarum polycephalum. We explicitly tested the importance of information obtained through experience by eliminating chemosensory cues of patch quality. In P. polycephalum, patch departure was influenced by the consumption of high, and to a much lesser extent low, quality food items such that engulfing a food item increased patch-residency time. Physarum polycephalum also tended to forage for longer in darkened, 'safe' patches. In D. bahiense, engulfment of any food item increased patch residency irrespective of that food item's quality. Exposure to light had no effect on the patch-residency time of D. bahiense. Given that these organisms lack a brain, our results illustrate how the use of simple heuristics can give the impression that individuals make sophisticated foraging decisions.