Science.gov

Sample records for generation tool decider

  1. E-DECIDER: Earthquake Disaster Decision Support and Response Tools - Development and Experiences

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Blom, R. G.; Bawden, G. W.; Fox, G.; Pierce, M.; Rundle, J. B.; Wang, J.; Ma, Y.; Yoder, M. R.; Sachs, M. K.; Parker, J. W.

    2011-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision-making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. The overall goal of the project is to deliver these capabilities as standards-compliant Geographical Information System (GIS) data products through a web portal/web services infrastructure that will allow easy use by decision-makers; this design ensures that the system will be readily supportable and extensible in the future. E-DECIDER is incorporating the earthquake forecasting methodology developed through NASA's QuakeSim project, as well as other QuakeSim geophysical modeling tools. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, will allow us to provide both long-term planning information for disaster management decision makers and short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage have occurred and where emergency services may need to be focused). We are also working to provide a catalog of HAZUS input files and models for scenario earthquakes based on the QuakeSim forecast models, as well as designing an automated workflow for generating HAZUS models in the event of an earthquake (triggered from the USGS earthquake feed). Initially, E-DECIDER's focus was to deliver rapid and readily accessible InSAR products following earthquake disasters. Following our experiences with recent past events, such as the Baja Mexico earthquake and the Tohoku-oki Japan earthquake, we found that in many instances radar data is not readily available following the event, whereas optical imagery can be provided fairly quickly as a result of the invocation of the International Charter. This led us to re-evaluate the type of data we would need to process and the products we could deliver

  2. Disaster Response Tools for Decision Support and Data Discovery - E-DECIDER and GeoGateway

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Granat, R. A.; Lyzenga, G. A.; Pierce, M. E.; Wang, J.; Grant Ludwig, L.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2015-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers in order to improve their ability to anticipate requirements and provide appropriate resources for response. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a decision support system producing remote sensing and geophysical modeling products that are relevant to the emergency preparedness and response communities and serves as a gateway to enable the delivery of actionable information to these communities. GeoGateway is a data product search and analysis gateway for scientific discovery, field use, and disaster response focused on NASA UAVSAR and GPS data that integrates with fault data, seismicity and models. Key information on the nature, magnitude and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness are often generated from a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises like the recent May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation. We also provided a number of products, services, and consultation to the NASA agency-wide response to the April 2015 Gorkha, Nepal earthquake. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse and for the Nepal earthquake. Products delivered included map layers as part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration, enabling users to create merged datasets from multiple providers. For the Nepal response effort, products included models

  3. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with the FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a comprehensive set of advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for Dallas-Fort Worth, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and up to cruise altitude along the most efficient routes.

  4. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second-generation tools, help districts save time and money and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, abilities that fall under 21st-century skills. Second-generation tools are growing in popularity…

  5. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen

    2015-08-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers and short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage have occurred and where emergency services may need to be focused). This information in turn is delivered through standards-compliant web services for desktop and hand-held devices.

  6. Sense, decide, act, communicate (SDAC): next generation of smart sensor systems

    NASA Astrophysics Data System (ADS)

    Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian

    2004-09-01

    The recent war on terrorism and increased urban warfare have been a major catalyst for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has generally been governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality of sensor systems. This functionality includes a sensor's ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communicate - internally to the system and externally to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the use of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collectives) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper will also discuss and evaluate the demonstration system developed to test the concepts related to SDAC systems.
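
    The Sense-Decide-Act-Communicate loop described above can be sketched as a single cycle per sensor node. This is an illustrative sketch only: the threshold, sensing modalities, and node name below are assumptions of this example, not details from the paper.

```python
# Minimal sketch of one SDAC cycle for a hypothetical sensor node.
# Thresholds, modalities, and the node id are illustrative assumptions.

def sense(readings):
    """Multi-modal sensing: collect raw environmental readings."""
    return {"acoustic": readings[0], "seismic": readings[1]}

def decide(sample, threshold=0.7):
    """Smart analysis: flag an event when any modality exceeds a threshold."""
    return any(v > threshold for v in sample.values())

def act(event_detected):
    """Respond locally to the environmental event."""
    return "alarm" if event_detected else "idle"

def communicate(state):
    """Report knowledge (not just raw data) to the rest of the system."""
    return {"node": "n1", "state": state}

def sdac_cycle(readings):
    sample = sense(readings)
    event = decide(sample)
    return communicate(act(event))
```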

  7. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Stough, T. M.; Burl, M. C.; Pierce, M.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.; Bawden, G. W.

    2012-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision-making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. Geodetic imaging data, including interferometric synthetic aperture radar (InSAR) and GPS, have a rich scientific heritage of use in earthquake research. Survey-grade GPS was developed in the 1980s, and the first InSAR image of an earthquake was produced for the 1992 Landers event. As these types of data have become increasingly available, they have also shown great utility for providing key information for disaster response. Work has been done to translate these data into useful and actionable information for decision makers in the event of an earthquake disaster. In addition to observed data, modeling tools provide essential preliminary estimates while data are still being collected and/or processed, which can be refined as data products become available. Now, with more data and better models, we are able to apply these capabilities for responders who need easy-to-use tools and routinely produced data products. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers and short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage have occurred and where emergency services may need to be focused). E-DECIDER has taken advantage of the legacy of Earth science data, including MODIS, Landsat, SCIGN, PBO, UAVSAR, and modeling tools such as the ones developed by QuakeSim, in order to deliver successful decision support products for earthquake disaster response. The project has

  8. Health. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Health, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday Living." It…

  9. Systems Prototyping with Fourth Generation Tools.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis

    1983-01-01

    The development of information systems using an engineering approach that combines traditional programming techniques with fourth-generation software tools is described. Fourth-generation application tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)

  10. SUPPORT Tools for evidence-informed health Policymaking (STP) 8: Deciding how much confidence to place in a systematic review

    PubMed Central

    2009-01-01

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. The reliability of systematic reviews of the effects of health interventions is variable. Consequently, policymakers and others need to assess how much confidence can be placed in such evidence. The use of systematic and transparent processes to determine such decisions can help to prevent the introduction of errors and bias in these judgements. In this article, we suggest five questions that can be considered when deciding how much confidence to place in the findings of a systematic review of the effects of an intervention. These are: 1. Did the review explicitly address an appropriate policy or management question? 2. Were appropriate criteria used when considering studies for the review? 3. Was the search for relevant studies detailed and reasonably comprehensive? 4. Were assessments of the studies' relevance to the review topic and of their risk of bias reproducible? 5. Were the results similar from study to study? PMID:20018115
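
    The five questions above amount to a simple appraisal checklist. As an illustrative sketch only, they can be encoded as a screening aid; the high/moderate/low thresholds below are assumptions of this example, not part of the SUPPORT tools.

```python
# Hypothetical sketch: the five appraisal questions as a screening checklist.
# The label thresholds are illustrative assumptions, not from the article.

QUESTIONS = [
    "Addressed an appropriate policy or management question?",
    "Appropriate criteria when considering studies?",
    "Detailed, reasonably comprehensive search?",
    "Reproducible relevance and risk-of-bias assessments?",
    "Similar results from study to study?",
]

def review_confidence(answers):
    """Map five yes/no answers to a coarse confidence label."""
    if len(answers) != len(QUESTIONS):
        raise ValueError("expected one answer per question")
    yes = sum(answers)
    if yes == 5:
        return "high"
    if yes >= 3:
        return "moderate"
    return "low"
```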

  11. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and the associated surface parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves, and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C, the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects, such as points, curves, patches, sources, and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which will be discussed later.
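
    The linked-list object store described above can be sketched as follows. GridTool itself is written in ANSI C; this Python sketch only illustrates the idea of a dynamically growing and shrinking object list with one active object, and the field names are assumptions of this example.

```python
# Illustrative sketch (Python, though GridTool is ANSI C) of a linked list
# of typed geometry objects with exactly one "active" object at a time.

class GridObject:
    def __init__(self, kind, name):
        self.kind = kind     # e.g. "point", "curve", "patch", "source", "surface"
        self.name = name
        self.next = None     # linked-list pointer: memory tracks data size

class ObjectList:
    def __init__(self):
        self.head = None
        self.active = None   # the one object drawn highlighted

    def add(self, kind, name):
        obj = GridObject(kind, name)
        obj.next = self.head     # prepend; list expands dynamically
        self.head = obj
        self.active = obj        # newest object becomes the active one
        return obj

    def remove(self, name):
        prev, cur = None, self.head
        while cur:
            if cur.name == name:
                if prev:
                    prev.next = cur.next
                else:
                    self.head = cur.next
                if self.active is cur:       # keep an active object if any remain
                    self.active = self.head
                return True
            prev, cur = cur, cur.next
        return False

objs = ObjectList()
objs.add("point", "p1")
objs.add("curve", "c1")
```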

  12. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime ones.
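
    The Bayesian-inference core that such decision-generation tools rest on can be illustrated with a minimal discrete belief update. The hypotheses and likelihood values below are illustrative assumptions, not taken from the paper.

```python
# Minimal discrete Bayes update: posterior(h) ∝ prior(h) * P(evidence | h).

def bayes_update(priors, likelihoods):
    """Renormalized product of priors and per-hypothesis likelihoods."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Two hypotheses about a network node: hostile vs. benign (assumed values).
priors = {"hostile": 0.2, "benign": 0.8}
# Observed evidence is assumed 4x more likely under "hostile".
posterior = bayes_update(priors, {"hostile": 0.8, "benign": 0.2})
```

Repeated calls with new evidence feed each posterior back in as the next prior, which is how beliefs about an adverse network sharpen as observations accumulate.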

  13. Automatic tool path generation for finish machining

    SciTech Connect

    Kwok, Kwan S.; Loucks, C.S.; Driessen, B.J.

    1997-03-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch diameter CBN grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm.
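
    The paper's fuzzified digital filter is not spelled out in the abstract, so as a stand-in, a plain sliding-window median filter illustrates the underlying idea: rejecting isolated false sensor readings (burrs, holes, overspray) without distorting the surrounding scan data.

```python
# Sliding-window median filter: a common outlier-rejection technique,
# used here as a simple stand-in for the paper's fuzzified filter.

def median_filter(samples, window=3):
    """Replace each sample with the median of its neighborhood."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        neighborhood = sorted(samples[lo:hi])
        out.append(neighborhood[len(neighborhood) // 2])
    return out

# A spike at index 2 (e.g. a burr in the laser scan) is suppressed:
cleaned = median_filter([1.0, 1.1, 9.0, 1.2, 1.3])
```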

  14. Next generation tools for genomic data generation, distribution, and visualization

    PubMed Central

    2010-01-01

    Background With the rapidly falling cost and availability of high-throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform-independent software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting-edge command-line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq. PMID:20828407

  15. Groundwater Monitoring Report Generation Tools - 12005

    SciTech Connect

    Lopez, Natalie

    2012-07-01

    Compliance with National and State environmental regulations (e.g. Resource Conservation and Recovery Act (RCRA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) aka SuperFund) requires Savannah River Site (SRS) to extensively collect and report groundwater monitoring data, with potential fines for missed reporting deadlines. Several utilities have been developed at SRS to facilitate production of the regulatory reports which include maps, data tables, charts and statistics. Components of each report are generated in accordance with complex sets of regulatory requirements specific to each site monitored. SRS developed a relational database to incorporate the detailed reporting rules with the groundwater data, and created a set of automation tools to interface with the information and generate the report components. These process improvements enhanced quality and consistency by centralizing the information, and have reduced manpower and production time through automated efficiencies. (author)
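
    The rule-driven report assembly described above can be sketched as a lookup of per-site reporting rules stored centrally. The site names, regulations, and component lists below are hypothetical illustrations, not SRS data.

```python
# Hypothetical sketch: centralized per-site reporting rules drive which
# report components (maps, tables, charts, statistics) get generated.

REPORT_RULES = {
    "site-A": {"regulation": "RCRA", "components": ["map", "data_table", "statistics"]},
    "site-B": {"regulation": "CERCLA", "components": ["map", "chart"]},
}

def generate_report(site, rules=REPORT_RULES):
    """Assemble the report components a site's rules require."""
    rule = rules[site]
    return [f"{site}:{rule['regulation']}:{c}" for c in rule["components"]]
```

Centralizing the rules in one store means a regulatory change is made once and every affected report picks it up, which is the consistency gain the abstract describes.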

  16. GROUNDWATER MONITORING REPORT GENERATION TOOLS - 12005

    SciTech Connect

    Lopez, N.

    2011-11-21

    Compliance with National and State environmental regulations (e.g. Resource Conservation and Recovery Act (RCRA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) aka SuperFund) requires Savannah River Site (SRS) to extensively collect and report groundwater monitoring data, with potential fines for missed reporting deadlines. Several utilities have been developed at SRS to facilitate production of the regulatory reports which include maps, data tables, charts and statistics. Components of each report are generated in accordance with complex sets of regulatory requirements specific to each site monitored. SRS developed a relational database to incorporate the detailed reporting rules with the groundwater data, and created a set of automation tools to interface with the information and generate the report components. These process improvements enhanced quality and consistency by centralizing the information, and have reduced manpower and production time through automated efficiencies.

  17. Next Generation of Visualization Tools for Astrophysics

    NASA Astrophysics Data System (ADS)

    Gheller, C.; Becciani, U.; Teuben, P. J.

    2007-10-01

    Visualization exists in many niche packages, each with their own strengths and weaknesses. In this BoF a series of presentations was meant to stimulate a discussion between developers and users about the future of visualization tools. A wiki page {http://wiki.eurovotech.org/twiki/bin/view/VOTech/BoFADASSTucson2006} has been set up to log and continue this discussion. One recent technique, dubbed "plastifying," has enabled different tools to inter-operate on data, resulting in a very flexible environment. Also deemed important is the ability to write plug-ins for analysis in visualization tools. Publication-quality graphs are sometimes missing from the tools we use.

  18. Next-Generation Design and Simulation Tools

    NASA Technical Reports Server (NTRS)

    Weber, Tod A.

    1997-01-01

    Thirty years ago, the CAD industry was created as electronic drafting tools were developed to move people away from traditional two-dimensional drafting boards. While these tools provided an improvement in accuracy (true perpendicular lines, etc.), they did not offer a significant improvement in productivity or impact development times. They electronically captured a manual process.

  19. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  20. Projectile-generating explosive access tool

    SciTech Connect

    Jakaboski, Juan-Carlos; Hughs, Chance G; Todd, Steven N

    2013-06-11

    A method for generating a projectile using an explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  1. Dewarless Logging Tool - 1st Generation

    SciTech Connect

    Henfling, Joseph A.; Normann, Randy A.

    2000-08-01

    This report focuses on Sandia National Laboratories' effort to create high-temperature logging tools for geothermal applications without the need for heat shielding. One of the mechanisms for failure in conventional downhole tools is temperature. They can only survive a limited number of hours in high temperature environments. For the first time since the evolution of integrated circuits, components are now commercially available that are qualified to 225 C with many continuing to work up to 300 C. These components are primarily based on Silicon-On-Insulator (SOI) technology. Sandia has developed and tested a simple data logger based on this technology that operates up to 300 C with a few limiting components operating to only 250 C without thermal protection. An actual well log to 240 C without shielding is discussed. The first prototype high-temperature tool measures pressure and temperature using a wire-line for power and communication. The tool is based around the HT83C51 microcontroller. A brief discussion of the background and status of the High Temperature Instrumentation program at Sandia, objectives, data logger development, and future project plans are given.

  2. Generating genomic tools for blueberry improvement

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Because of their recognized health benefits, there has been increased demand and consumption of blueberries in recent years. Great strides have been made in cultivar development since its domestication using traditional breeding approaches. However, genomic tools are lacking in blueberry, which coul...

  3. Projectile-generating explosive access tool

    DOEpatents

    Jakaboski, Juan-Carlos; Todd, Steven N.

    2011-10-18

    An explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  4. Next-Generation Ion Thruster Design Tool

    NASA Technical Reports Server (NTRS)

    Stolz, Peter

    2015-01-01

    Computational tools that accurately predict the performance of electric propulsion devices are highly desirable and beneficial to NASA and the broader electric propulsion community. The current state of the art in electric propulsion modeling relies heavily on empirical data and numerous computational "knobs." In Phase I of this project, Tech-X Corporation developed the most detailed ion engine discharge chamber model that currently exists. This kinetic model simulates all particles in the discharge chamber along with a physically correct simulation of the electric fields. In addition, kinetic erosion models are included for modeling the ion-impingement effects on thruster component erosion. In Phase II, Tech-X developed a user-friendly computer program for NASA and other governmental and industry customers. Tech-X has implemented a number of advanced numerical routines to bring the computational time down to a commercially acceptable level. NASA now has a highly sophisticated, user-friendly ion engine discharge chamber modeling tool.

  5. BGen: A UML Behavior Network Generator Tool

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Reder, Leonard J.; Balian, Harry

    2010-01-01

    BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling automatic vehicles. A common format used for describing a behavior network, such as that used in the JPL-developed behavior-based control system, CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40] includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program to translate Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of Behavior Network has been developed in order to simplify the development of C-code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a Behavior Network is functioning as a decision tree.
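
    The behavior network functioning as a decision tree can be illustrated with a minimal sketch: sensory inputs flow through prioritized behaviors to produce actuator signals for driving and steering. The behavior priorities, sensor fields, and thresholds below are assumptions of this example, not BGen's actual network.

```python
# Illustrative behavior network evaluated as a decision tree: the first
# behavior whose condition fires determines the actuator commands.

def behavior_network(sensors):
    """Route sensory inputs through prioritized behaviors to actuator signals."""
    if sensors["obstacle_distance"] < 2.0:          # highest-priority: avoid obstacle
        return {"steer": "hard_left", "throttle": 0.1}
    if abs(sensors["heading_error"]) > 0.2:         # next: correct course
        direction = "left" if sensors["heading_error"] > 0 else "right"
        return {"steer": direction, "throttle": 0.5}
    return {"steer": "straight", "throttle": 0.8}   # default: cruise
```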

  6. Integrating Fourth-Generation Tools Into the Applications Development Environment.

    ERIC Educational Resources Information Center

    Litaker, R. G.; And Others

    1985-01-01

    Much of the power of the "information center" comes from its ability to effectively use fourth-generation productivity tools to provide information processing services. A case study of the use of these tools at Western Michigan University is presented. (Author/MLW)

  7. HepMCAnalyser: A tool for Monte Carlo generator validation

    NASA Astrophysics Data System (ADS)

    Ay, C.; Johnert, S.; Katzy, J.; Qin, Zhonghua

    2010-04-01

    HepMCAnalyser is a tool for Monte Carlo (MC) generator validation and comparisons. It is a stable, easy-to-use and extendable framework allowing easy access to, and integration of, generator-level analysis. It comprises a class library with benchmark physics processes to analyse the HepMC output of MC generators and to fill root histograms. A web interface is provided to display all or selected histograms, compare them to references, and validate the results based on Kolmogorov tests. Steerable example programs can be used for event generation. The default steering is tuned to optimally align the distributions of the different MC generators. The tool will be used for MC generator validation by the Generator Services (GENSER) LCG project, e.g. for version upgrades. It is supported on the same platforms as the GENSER libraries and is already in use at ATLAS.

  8. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods motivated the aim of this study: to assess the algorithms available in various software tools for classifying LIDAR "point cloud" data, through a careful examination of the Digital Elevation Models (DEMs) generated from LIDAR data with these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
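    The comparison metrics named in the study (minimum, maximum, mean difference and RMSE between a reference DEM and a generated DEM) are straightforward to compute. A minimal sketch, with hypothetical elevation samples flattened to lists:

```python
# Difference statistics between two DEMs sampled at the same grid cells.
import math

def dem_difference_stats(reference, generated):
    """Min, max, mean difference and RMSE of generated minus reference."""
    diffs = [g - r for r, g in zip(reference, generated)]
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"min": min(diffs), "max": max(diffs),
            "mean": sum(diffs) / len(diffs), "rmse": rmse}

ref = [100.0, 101.0, 102.0, 103.0]   # reference elevations (m), illustrative
gen = [100.5, 100.5, 102.0, 104.0]   # elevations from an analysed tool
print(dem_difference_stats(ref, gen))
```

    In practice the same computation would run over full raster arrays rather than short lists.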

  9. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general-purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
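    The table-driven idea is that the grammar lives in a data table while the parsing driver stays generic. A minimal sketch for a toy query language; the grammar and token names are invented for the example, not taken from the paper:

```python
# LL(1)-style table-driven parser: (non-terminal, lookahead) -> production.
# Grammar (hypothetical): QUERY -> 'show' FIELD 'from' TABLE

PARSE_TABLE = {
    ("QUERY", "show"):  ["show", "FIELD", "from", "TABLE"],
    ("FIELD", "name"):  ["name"],
    ("FIELD", "age"):   ["age"],
    ("TABLE", "users"): ["users"],
    ("TABLE", "jobs"):  ["jobs"],
}
TERMINALS = {"show", "from", "name", "age", "users", "jobs"}

def parse(tokens, start="QUERY"):
    """Generic driver: expand non-terminals via the table, match terminals."""
    stack, pos = [start], 0
    while stack:
        top = stack.pop(0)
        if top in TERMINALS:
            if pos >= len(tokens) or tokens[pos] != top:
                return False
            pos += 1
        else:
            lookahead = tokens[pos] if pos < len(tokens) else None
            rule = PARSE_TABLE.get((top, lookahead))
            if rule is None:
                return False
            stack = rule + stack
    return pos == len(tokens)

print(parse(["show", "name", "from", "users"]))  # → True
```

    Retargeting the parser to a menu system or a translator means swapping the table, not rewriting the driver, which is the point the abstract makes.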

  10. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises two components, the ``Population Generator'' and the ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. 
Users can define sensor systems using ground- or space-based locations as well as

  11. MEAT: An Authoring Tool for Generating Adaptable Learning Resources

    ERIC Educational Resources Information Center

    Kuo, Yen-Hung; Huang, Yueh-Min

    2009-01-01

    Mobile learning (m-learning) is a new trend in the e-learning field. The learning services in m-learning environments are supported by fundamental functions, especially the content and assessment services, which need an authoring tool to rapidly generate adaptable learning resources. To fulfill the imperious demand, this study proposes an…

  12. Deciding to quit drinking alcohol

    MedlinePlus

    ... Alcoholism - deciding to quit References American Psychiatric Association. Diagnostic and statistical manual of mental disorders . 5th ed. Arlington, VA: American Psychiatric Association, 2013. ...

  13. Tools for Simulation and Benchmark Generation at Exascale

    SciTech Connect

    Lagadapati, Mahesh; Mueller, Frank; Engelmann, Christian

    2013-01-01

    The path to exascale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer an insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events and memory profiles, but can be extended to other areas, such as I/O, control flow, and data flow. It further focuses on extreme-scale simulation of millions of Message Passing Interface (MPI) ranks using a lightweight parallel discrete event simulation (PDES) toolkit for performance evaluation. Instead of simply replaying a trace within a simulation, the approach is to generate a benchmark from it and to run this benchmark within a simulation using models to reflect the performance characteristics of future-generation HPC systems. This provides a number of benefits, such as eliminating the data intensive trace replay and enabling simulations at different scales. The presented work utilizes the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to run the benchmark within a simulation.

  14. An Infrastructure for UML-Based Code Generation Tools

    NASA Astrophysics Data System (ADS)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a means to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects that have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  15. A computer-based tool for generation of progress notes.

    PubMed Central

    Campbell, K. E.; Wieckert, K.; Fagan, L. M.; Musen, M. A.

    1993-01-01

    IVORY, a computer-based tool that uses clinical findings as the basic unit for composing progress notes, generates progress notes more efficiently than does a character-based word processor. IVORY's clinical findings are contained within a structured vocabulary that we developed to support generation of both prose progress notes and SNOMED III codes. Observational studies of physician participation in the development of IVORY's structured vocabulary have helped us to identify areas where changes are required before IVORY will be acceptable for routine clinical use. PMID:8130479

  16. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  17. Nearly arc-length tool path generation and tool radius compensation algorithm research in FTS turning

    NASA Astrophysics Data System (ADS)

    Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao

    2014-08-01

    In the generation of non-rotationally symmetrical microstructure surfaces by turning with a Fast Tool Servo (FTS), non-uniform distribution of the interpolation data points leads to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points spaced at nearly equal arc lengths instead of the traditional interpolation rule of equal angles, and adds tool radius compensation. All interpolation points are equidistant in radial distribution because of the constant feed speed of the X slider; the high-frequency tool radius compensation components lie in both the X and Z directions, which makes the X slider, due to its large mass, difficult to drive to follow the input orders. Newton's iterative method is used to calculate the coordinates of the neighboring contour tangent point, with the interpolation point's X position as the initial value; in this way the new Z coordinate is obtained, and the high-frequency motion component in the X direction is decomposed into the Z direction. Taking as a test a typical microstructure with 4 μm PV value, mixed from two sine waves of 70 μm wavelength, the maximum profile error at an angle of fifteen degrees is less than 0.01 μm when turning with a diamond tool with a large radius of 80 μm. The sinusoidal grid was machined successfully on an ultra-precision lathe; the wavelength is 70.2278 μm and the Ra value is 22.81 nm, evaluated from data points generated by filtering out the first five harmonics.
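    The tangent-point step can be sketched with a standard Newton iteration. This is a hedged illustration of the general technique, not the paper's implementation: the surface profile, tool radius, and all parameter values below are invented for the example. The contact point of a round tool on a profile z = s(x) satisfies the stationarity condition (x - xc) + (s(x) - zc)·s'(x) = 0, which Newton's method solves starting from the interpolation point's X value.

```python
# Newton iteration for the tool/surface tangent point (illustrative values).
import math

R = 0.08        # tool radius (mm), hypothetical
WAVELEN = 0.07  # sine-wave surface wavelength (mm), hypothetical
AMP = 0.002     # amplitude (mm), hypothetical

def s(x):       # surface profile z = s(x)
    return AMP * math.sin(2 * math.pi * x / WAVELEN)

def s1(x):      # first derivative s'(x)
    return AMP * (2 * math.pi / WAVELEN) * math.cos(2 * math.pi * x / WAVELEN)

def s2(x):      # second derivative s''(x)
    return -AMP * (2 * math.pi / WAVELEN) ** 2 * math.sin(2 * math.pi * x / WAVELEN)

def tangent_point(xc, zc, x0, iters=20):
    """Solve g(x) = (x - xc) + (s(x) - zc) * s'(x) = 0 by Newton's method,
    where (xc, zc) is the tool center and x0 the initial guess."""
    x = x0
    for _ in range(iters):
        g = (x - xc) + (s(x) - zc) * s1(x)
        dg = 1 + s1(x) ** 2 + (s(x) - zc) * s2(x)
        x -= g / dg
    return x
```

    With the tangent point known, the compensated Z coordinate follows from offsetting the surface by the tool radius along the normal, which is how the high-frequency compensation motion moves from the heavy X slider to the Z direction.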

  18. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

  19. Retraction Notice: Generation of Knock down Tools for Transcription Factor 7-like-2 (TCF7L2) and Evaluation of its Expression Pattern in Developing Chicken Optic Tectum.

    PubMed

    2016-01-01

    The publishers have decided to retract the manuscript entitled "Generation of Knock Down Tools for Transcription Factor 7-Like-2 (TCF7L2) and Evaluation of its Expression Pattern in Developing Chicken Optic Tectum", published in MicroRNA, volume 4, issue 3, pages 209-216, 2015, for the following reason: a conflict of interest between the authors and the Principal Investigator. PMID:26861895

  20. Automatic Tool Path Generation for Robot Integrated Surface Sculpturing System

    NASA Astrophysics Data System (ADS)

    Zhu, Jiang; Suzuki, Ryo; Tanaka, Tomohisa; Saito, Yoshio

    In this paper, a surface sculpturing system based on an 8-axis robot is proposed, and the CAD/CAM software and tool path generation algorithm for this sculpturing system are presented. The 8-axis robot is composed of a 6-axis manipulator and a 2-axis worktable; it carves blocks of polystyrene foam with heated cutting tools. The multi-DOF (Degree of Freedom) robot offers faster fabrication than traditional RP (Rapid Prototyping) methods and more flexibility than CNC machining. With the flexibility derived from its 8-axis configuration, as well as efficient custom-developed software for rough cutting and finish cutting, this surface sculpturing system can carve sculptured surfaces accurately and efficiently.

  1. The Requirements Generation System: A tool for managing mission requirements

    NASA Technical Reports Server (NTRS)

    Sheppard, Sylvia B.

    1994-01-01

    Historically, NASA's cost for developing mission requirements has been a significant part of a mission's budget. Large amounts of time have been allocated in mission schedules for the development and review of requirements by the many groups who are associated with a mission. Additionally, tracing requirements from a current document to a parent document has been time-consuming and costly. The Requirements Generation System (RGS) is a computer-supported cooperative-work tool that assists mission developers in the online creation, review, editing, tracing, and approval of mission requirements as well as in the production of requirements documents. This paper describes the RGS and discusses some lessons learned during its development.

  2. MCM generator: a Java-based tool for generating medical metadata.

    PubMed

    Munoz, F; Hersh, W

    1998-01-01

    In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, help towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server, and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as images, movie, and sound files. PMID:9929299
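    The tag-generation step the paper describes can be sketched simply. This is a hedged illustration: the function name and field values are invented, and the MeSH subject term is supplied by hand here, since in the actual tool it would come from a SAPHIRE lookup.

```python
# Render a dict of Dublin Core elements as HTML META tags, in the spirit
# of the MCM generator (illustrative, not the tool's API).

def dublin_core_meta(fields):
    """Render Dublin Core elements as DC.* META tags."""
    tags = []
    for element, value in fields.items():
        tags.append('<meta name="DC.%s" content="%s">' % (element, value))
    return "\n".join(tags)

print(dublin_core_meta({
    "Title": "Chest Radiograph Atlas",
    "Subject": "Radiography, Thoracic",   # MeSH heading, chosen by hand here
    "Type": "Image",
}))
```

    The full Dublin Core element set has 15 such elements; a document's HEAD would carry one META tag per populated element.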

  3. MCM generator: a Java-based tool for generating medical metadata.

    PubMed Central

    Munoz, F.; Hersh, W.

    1998-01-01

    In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, help towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server, and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as images, movie, and sound files. PMID:9929299

  4. Ambit-Tautomer: An Open Source Tool for Tautomer Generation.

    PubMed

    Kochev, Nikolay T; Paskaleva, Vesselina H; Jeliazkova, Nina

    2013-06-01

    We present a new open source tool for automatic generation of all tautomeric forms of a given organic compound. Ambit-Tautomer is a part of the open source software package Ambit2. It implements three tautomer generation algorithms: a combinatorial method, an improved combinatorial method and an incremental depth-first search algorithm. All algorithms utilize a set of fully customizable rules for tautomeric transformations. The predefined knowledge base covers 1-3, 1-5 and 1-7 proton tautomeric shifts. Some typical supported tautomerism rules are keto-enol, imine-amine, nitroso-oxime, azo-hydrazone, thioketo-thioenol, thionitroso-thiooxime, amidine-imidine, diazoamino-diazoamino, thioamide-iminothiol and nitrosamine-diazohydroxide. Ambit-Tautomer uses a simple energy-based system for tautomer ranking, implemented by a set of empirically derived rules. Fine-grained output control is achieved by a set of post-generation filters. We performed an exhaustive comparison of the Ambit-Tautomer incremental algorithm against several other software packages which offer tautomer generation: ChemAxon Marvin, Molecular Networks MN.TAUTOMER, ACDLabs, CACTVS and the CDK implementation of the algorithm based on the mobile H atoms listed in the InChI. According to the presented test results, Ambit-Tautomer's performance is either comparable to or better than the competing algorithms. The Ambit-Tautomer module is available for download as a Java library, a command line application, a demo web page or an OpenTox API compatible Web service. PMID:27481667

  5. EIGER: A new generation of computational electromagnetics tools

    SciTech Connect

    Wilton, D.R.; Johnson, W.A.; Jorgenson, R.E.; Sharpe, R.M.; Grant, J.B.

    1996-03-01

    The EIGER project (Electromagnetic Interactions GenERalized) endeavors to bring the next generation of spectral domain electromagnetic analysis tools to maturity and to cast them in a general form which is amenable to a variety of applications. The tools are written in Fortran 90 with an object-oriented philosophy to yield a package that is easily ported to a variety of platforms, simply maintained, and above all efficiently modified to address wide-ranging applications. The modular development style and the choice of Fortran 90 are also driven by the desire to run efficiently on existing high performance computer platforms and to remain flexible for new architectures that are anticipated. The electromagnetic tool box consists of extremely accurate physics models for 2D and 3D electromagnetic scattering, radiation, and penetration problems. The models include surface and volume formulations for conductors and complex materials. In addition, realistic excitations and symmetries are incorporated, as well as complex environments through the use of Green's functions.

  6. Governance and Factions--Who Decides Who Decides?

    ERIC Educational Resources Information Center

    Hodgkinson, Harold L.

    In several projects, the Center is studying the question: who will decide which factions will be represented in the decision-making process. In the Campus Governance Project investigating the nature of governance, over 3,000 questionnaires were administered and 900 intensive interviews conducted at 19 institutions. The questionnaire was designed…

  7. Benchmarking the next generation of homology inference tools

    PubMed Central

    Saripella, Ganapathi Varma; Sonnhammer, Erik L. L.; Forslund, Kristoffer

    2016-01-01

    Motivation: Over the last decades, vast numbers of sequences have been deposited in public databases. Bioinformatics tools allow homology, and consequently functional, inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate the ‘next generation’ of profile-based approaches, including CS-BLAST, HHSEARCH and PHMMER, in comparison with the non-profile-based search tools NCBI-BLAST, USEARCH, UBLAST and FASTA. Method: We generated challenging benchmark datasets based on protein domain architectures within either the PFAM + Clan, SCOP/Superfamily or CATH/Gene3D domain definition schemes. From each dataset, homologous and non-homologous protein pairs were aligned using each tool, and standard performance metrics calculated. We further measured congruence of domain architecture assignments in the three domain databases. Results: CS-BLAST and PHMMER had overall highest accuracy. FASTA, UBLAST and USEARCH showed large trade-offs of accuracy for speed optimization. Conclusion: Profile methods are superior at inferring remote homologs, but the difference in accuracy between methods is relatively small. PHMMER and CS-BLAST stand out with the highest accuracy, yet still at a reasonable computational cost. Additionally, we show that less than 0.1% of Swiss-Prot protein pairs considered homologous by one database are considered non-homologous by another, implying that these classifications represent equivalent underlying biological phenomena, differing mostly in coverage and granularity. Availability and Implementation: Benchmark datasets and all scripts are available at http://sonnhammer.org/download/Homology_benchmark. Contact: forslund@embl.de Supplementary information: Supplementary data are available at

  8. Deciding where to Stop Speaking

    ERIC Educational Resources Information Center

    Tydgat, Ilse; Stevens, Michael; Hartsuiker, Robert J.; Pickering, Martin J.

    2011-01-01

    This study investigated whether speakers strategically decide where to interrupt their speech once they need to stop. We conducted four naming experiments in which pictures of colored shapes occasionally changed in color or shape. Participants then merely had to stop (Experiment 1); or they had to stop and resume speech (Experiments 2-4). They…

  9. Improving Dynamic Load and Generator Response Performance Tools

    SciTech Connect

    Lesieutre, Bernard C.

    2005-11-01

    This report is a scoping study to examine research opportunities to improve the accuracy of the system dynamic load and generator models, data and performance assessment tools used by CAISO operations engineers and planning engineers, as well as those used by their counterparts at the California utilities, to establish safe operating margins. Model-based simulations are commonly used to assess the impact of credible contingencies in order to determine system operating limits (path ratings, etc.) to ensure compliance with NERC and WECC reliability requirements. Improved models and a better understanding of the impact of uncertainties in these models will increase the reliability of grid operations by allowing operators to more accurately study system voltage problems and the dynamic stability response of the system to disturbances.

  10. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    NASA Technical Reports Server (NTRS)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  11. Tool for Generating Realistic Residential Hot Water Event Schedules: Preprint

    SciTech Connect

    Hendron, B.; Burch, J.; Barker, G.

    2010-08-01

    The installed energy savings for advanced residential hot water systems can depend greatly on detailed occupant use patterns. Quantifying these patterns is essential for analyzing measures such as tankless water heaters, solar hot water systems with demand-side heat exchangers, distribution system improvements, and recirculation loops. This paper describes the development of an advanced spreadsheet tool that can generate a series of year-long hot water event schedules consistent with realistic probability distributions of start time, duration and flow rate variability, clustering, fixture assignment, vacation periods, and seasonality. This paper also presents the application of the hot water event schedules in the context of an integral-collector-storage solar water heating system in a moderate climate.
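    Drawing a schedule from probability distributions of start time, duration, and flow rate can be sketched as below. This is a minimal illustration of the general approach, not the spreadsheet tool itself: the distribution families and parameter values are invented, and the tool's clustering, fixture assignment, vacation, and seasonality logic is omitted.

```python
# Draw a set of hot water events from simple distributions (illustrative).
import random

def draw_events(n_events, seed=42):
    """Return n_events hot water draws, sorted by start time."""
    rng = random.Random(seed)   # fixed seed: reproducible schedule
    events = []
    for _ in range(n_events):
        start = rng.uniform(0.0, 24.0)       # start hour of day
        duration = rng.expovariate(1 / 5.0)  # minutes, mean 5 (hypothetical)
        flow = rng.uniform(1.0, 8.0)         # liters per minute (hypothetical)
        events.append({"start_h": start, "dur_min": duration,
                       "flow_lpm": flow, "liters": duration * flow})
    return sorted(events, key=lambda e: e["start_h"])

schedule = draw_events(10)
print(sum(e["liters"] for e in schedule))   # total daily draw volume
```

    A year-long schedule would repeat this per day while layering on the clustering and seasonal effects the paper describes.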

  12. PRIST: a fourth-generation tool for medical information systems.

    PubMed

    Cristiani, P; Larizza, C

    1990-04-01

    PRIST is a fourth-generation software package oriented to the development and management of medical applications, running under MS-DOS on IBM-compatible personal computers. The tool has been developed on top of the DBIII Plus language, utilizing the Clipper compiler's networking features for integration in a LAN environment. Several routines written in Microsoft C and BASIC complement this DBMS-kernel system, providing I/O, graphics, statistics, and retrieval utilities. To increase the interactivity of the system, both menu-driven and windowing interfaces have been implemented. PRIST has been used to develop a wide variety of small medical applications, ranging from research laboratories to intensive care units. The great majority of reactions from the use of these applications were positive, confirming that PRIST can assist in practice management and patient care as well as research. PMID:2345045

  13. Automatic Generation of Remote Visualization Tools with WATT

    NASA Astrophysics Data System (ADS)

    Jensen, P. A.; Bollig, E. F.; Yuen, D. A.; Erlebacher, G.; Momsen, A. R.

    2006-12-01

    The ever increasing size and complexity of geophysical and other scientific datasets has forced developers to turn to more powerful alternatives for visualizing the results of computations and experiments. These alternatives need to be faster, scalable, more efficient, and able to run on large machines. At the same time, advances in scripting languages and visualization libraries have significantly decreased the development time of smaller, desktop visualization tools. Ideally, programmers would be able to develop visualization tools in a high-level, local, scripted environment and then automatically convert their programs into compiled, remote visualization tools for integration into larger computation environments. The Web Automation and Translation Toolkit (WATT) [1] converts a Tcl script for the Visualization Toolkit (VTK) [2] into a standards-compliant web service. We will demonstrate the use of WATT for the automated conversion of a desktop visualization application (written in Tcl for VTK) into a remote visualization service of interest to geoscientists. The resulting service will allow real-time access to a large dataset through the Internet, and will be easily integrated into the existing architecture of the Virtual Laboratory for Earth and Planetary Materials (VLab) [3]. [1] Jensen, P.A., Yuen, D.A., Erlebacher, G., Bollig, E.F., Kigelman, D.G., Shukh, E.A., Automated Generation of Web Services for Visualization Toolkits, Eos Trans. AGU, 86(52), Fall Meet. Suppl., Abstract IN42A-06, 2005. [2] The Visualization Toolkit, http://www.vtk.org [3] The Virtual Laboratory for Earth and Planetary Materials, http://vlab.msi.umn.edu

  14. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    NASA Astrophysics Data System (ADS)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGESBeta

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; et al

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  16. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  17. Sandia Generated Matrix Tool (SGMT) v. 1.0

    Energy Science and Technology Software Center (ESTSC)

    2010-03-24

    Provides a tool with which to create and characterize a very large set of matrix-based visual analogy problems that have properties similar to Raven's Progressive Matrices (RPMs). The software uses the same underlying patterns found in RPMs to generate large numbers of unique matrix problems using parameters chosen by the researcher. Specifically, the software is designed so that researchers can choose the type, direction, and number of relations in a problem and then create any number of unique matrices that share the same underlying structure (e.g. changes in numerosity in a diagonal pattern) but have different surface features (e.g. shapes, colors). Raven's Progressive Matrices (RPMs) are a widely-used test for assessing intelligence and reasoning ability. Since the test is non-verbal, it can be applied to many different populations and has been used all over the world. However, there are relatively few matrices in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. This tool creates a matrix set in a systematic way that allows researchers to have a great deal of control over the underlying structure, surface features, and difficulty of the matrix problems while providing a large set of novel matrices with which to conduct experiments.

  18. Sandia Generated Matrix Tool (SGMT) v. 1.0

    SciTech Connect

    Benz, Zachary; Dixon, Kevin

    2010-03-24

    Provides a tool with which to create and characterize a very large set of matrix-based visual analogy problems that have properties similar to Raven's Progressive Matrices (RPMs). The software uses the same underlying patterns found in RPMs to generate large numbers of unique matrix problems using parameters chosen by the researcher. Specifically, the software is designed so that researchers can choose the type, direction, and number of relations in a problem and then create any number of unique matrices that share the same underlying structure (e.g. changes in numerosity in a diagonal pattern) but have different surface features (e.g. shapes, colors). Raven's Progressive Matrices (RPMs) are a widely-used test for assessing intelligence and reasoning ability. Since the test is non-verbal, it can be applied to many different populations and has been used all over the world. However, there are relatively few matrices in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. This tool creates a matrix set in a systematic way that allows researchers to have a great deal of control over the underlying structure, surface features, and difficulty of the matrix problems while providing a large set of novel matrices with which to conduct experiments.
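    As a rough illustration of how such parameterized generation might work (a minimal sketch, not SGMT's actual implementation; the function and feature names are invented), the following Python builds 3x3 matrices that all share one underlying relation, a numerosity increase across rows and columns, while re-sampling the surface features per problem:

    ```python
    import random

    SHAPES = ["circle", "square", "triangle", "star"]
    COLORS = ["red", "blue", "green"]

    def generate_matrix(seed=None):
        """Generate one 3x3 RPM-style problem. The relation (count =
        row + col + 1) is fixed; the surface features (shape, color)
        are re-sampled for each problem, so many unique matrices share
        the same underlying structure."""
        rng = random.Random(seed)
        shape = rng.choice(SHAPES)
        color = rng.choice(COLORS)
        # Each cell is a (shape, color, count) tuple.
        return [[(shape, color, row + col + 1) for col in range(3)]
                for row in range(3)]

    m = generate_matrix(seed=1)
    ```

    Varying the relation type (numerosity, rotation, shading) and its direction would then be a matter of swapping the count rule for another cell-wise function.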

  19. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5,000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil, and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, either at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow pattern and flow behavior (pressure gradient and phase fractions) prediction modeling are separated. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate design that leads to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The

  20. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2 MB of RAM, a Microsoft-compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K

  1. Bioethics in America: Who decides

    SciTech Connect

    Yesley, M.S.

    1992-01-01

    This paper is concerned with the process by which bioethics decisions are made as well as the actual decisions that are reached. The process commonly is one of "shared decision-making," that is, decisionmaking at several levels, beginning with the government and ending with the individual. After the government has defined a scope of permissible activity, the research or health care institution may further limit what activities are permitted. Finally, the individual patient, or, if the patient is incompetent, the patient's legal representative decides whether or not to participate in the activity. Because bioethics in general, and bioethics related to genetics in particular, evolves through this process of decisionmaking at several levels, this paper briefly traces the process, to see how it works in several areas of bioethics, in order to provide a perspective on the way in which ethical decisions related to genetics are or will be made.

  2. Bioethics in America: Who decides?

    SciTech Connect

    Yesley, M.S.

    1992-05-01

    This paper is concerned with the process by which bioethics decisions are made as well as the actual decisions that are reached. The process commonly is one of "shared decision-making," that is, decisionmaking at several levels, beginning with the government and ending with the individual. After the government has defined a scope of permissible activity, the research or health care institution may further limit what activities are permitted. Finally, the individual patient, or, if the patient is incompetent, the patient's legal representative decides whether or not to participate in the activity. Because bioethics in general, and bioethics related to genetics in particular, evolves through this process of decisionmaking at several levels, this paper briefly traces the process, to see how it works in several areas of bioethics, in order to provide a perspective on the way in which ethical decisions related to genetics are or will be made.

  3. Developing the Next Generation of Tools for Simulating Galaxy Outflows

    NASA Astrophysics Data System (ADS)

    Scannapieco, Evan

    Outflows are observed in starbursting galaxies of all masses and at all cosmological epochs. They play a key role throughout the history of the Universe: shaping the galaxy mass-metallicity relation, drastically affecting the content and number density of dwarf galaxies, and transforming the chemical composition of the intergalactic medium. Yet, a complete model of galaxy outflows has proven to be elusive, as it requires both a better understanding of the evolution of the turbulent, multiphase gas in and around starbursting galaxies, and better tools to reproduce this evolution in galaxy-scale simulations. Here we propose to conduct a detailed series of numerical simulations designed to help develop such next-generation tools for the simulation of galaxy outflows. The program will consist of three types of direct numerical simulations, each of which will be targeted to allow galaxy-scale simulations to more accurately model key microphysical processes and their observational consequences. Our first set of simulations will be targeted at better modeling the starbursting interstellar medium (ISM) from which galaxy outflows are driven. The surface densities in starbursting galaxies are much larger than those in the Milky Way, resulting in larger gravitational accelerations and random velocities exceeding 30 or even 100 km/s. Under these conditions, the thermal stability of the ISM is changed dramatically, due to the sharp peak in gas cooling efficiency near 200,000 K. Our simulations will carefully quantify the key ways in which this medium differs from the local ISM, and the consequences of these differences for when, where, and how outflows are driven. A second set of simulations will be targeted at better modeling the observed properties of rapidly cooling, highly turbulent gas. 
Because gas cooling in and around starbursts is extremely efficient, turbulent motions are often supersonic, which leads to a distribution of ionization states that is vastly different than

  4. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5,000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil, and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). For any multiphase separation technique employed at topside, seabed or bottom-hole, it is crucial to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas-oil-water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict the gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and

  5. Fine-Tuning Next-Generation Genome Editing Tools.

    PubMed

    Kanchiswamy, Chidananda Nagamangala; Maffei, Massimo; Malnoy, Mickael; Velasco, Riccardo; Kim, Jin-Soo

    2016-07-01

    The availability of genome sequences of numerous organisms and the revolution brought about by genome editing tools (e.g., ZFNs, TALENs, and CRISPR/Cas9 or RGENs) has provided a breakthrough in introducing targeted genetic changes both to explore emergent phenotypes and to introduce new functionalities. However, the wider application of these tools in biology, agriculture, medicine, and biotechnology is limited by off-target mutation effects. In this review, we compare available methods for detecting, measuring, and analyzing off-target mutations. Furthermore, we particularly focus on CRISPR/Cas9 regarding various methods, tweaks, and software tools available to nullify off-target effects. PMID:27167723

  6. Prostate Cancer: Take Time to Decide

    MedlinePlus

    ... Prostate Cancer: Take Time to Decide (infographic; available in English and Español/Spanish; PDF, 983 KB). Most prostate cancers grow slowly, and ...

  7. Ontodog: a web-based ontology community view generation tool.

    PubMed

    Zheng, Jie; Xiang, Zuoshuang; Stoeckert, Christian J; He, Yongqun

    2014-05-01

    Biomedical ontologies are often very large and complex. Only a subset of the ontology may be needed for a specified application or community. For ontology end users, it is desirable to have community-based labels rather than the labels generated by ontology developers. Ontodog is a web-based system that can generate an ontology subset based on Excel input, and support generation of an ontology community view, which is defined as the whole or a subset of the source ontology with user-specified annotations including user-preferred labels. Ontodog allows users to easily generate community views with minimal ontology knowledge and no programming skills or installation required. Currently >100 ontologies including all OBO Foundry ontologies are available to generate the views based on user needs. We demonstrate the application of Ontodog for the generation of community views using the Ontology for Biomedical Investigations as the source ontology. PMID:24413522
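    A community view of this kind can be pictured as a subset-plus-relabel operation. The sketch below is hypothetical Python, not Ontodog's code (Ontodog is a web system operating on OWL ontologies, not on dicts), but it shows the shape of the transformation:

    ```python
    def community_view(ontology, keep_terms, preferred_labels):
        """Restrict an ontology (term id -> developer label) to a
        community's chosen subset and overlay user-preferred labels
        where the community has supplied one."""
        return {tid: preferred_labels.get(tid, label)
                for tid, label in ontology.items() if tid in keep_terms}

    # Illustrative term ids; real input would come from an Excel sheet.
    onto = {"T:1": "assay", "T:2": "planned process", "T:3": "material"}
    view = community_view(onto, {"T:1", "T:2"},
                          {"T:1": "experimental assay"})
    ```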

  8. Virtual Tool Mark Generation for Efficient Striation Analysis

    SciTech Connect

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
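    The core geometric step, projecting the tip's point cloud along the travel direction and taking the edge of the projection as the virtual mark, can be sketched as follows. This is an illustrative NumPy reading of the method, not the authors' code; the published pipeline works on optical-microscope scans and includes the statistical comparison step:

    ```python
    import numpy as np

    def virtual_mark(tip_points, n_bins=50):
        """Project a 3-D tip point cloud (columns x, y, z; y assumed to
        be the direction of tool travel) onto the x-z plane and keep the
        lowest z per x-bin. The resulting lower envelope is the 'virtual
        tool mark' profile to compare against marked-plate cross-sections."""
        x, z = tip_points[:, 0], tip_points[:, 2]
        bins = np.linspace(x.min(), x.max(), n_bins + 1)
        idx = np.clip(np.digitize(x, bins) - 1, 0, n_bins - 1)
        profile = np.full(n_bins, np.inf)
        np.minimum.at(profile, idx, z)  # per-bin minimum = envelope edge
        return profile
    ```

    A profile produced this way can then be cross-correlated against plate cross-sections at candidate marking angles, which is where the likelihood comparison of Chumbley et al. would enter.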

  9. E-DECIDER Decision Support Gateway For Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.

    2013-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that

  10. Skateboards or Wildlife? Kids Decide!

    ERIC Educational Resources Information Center

    Thomas, Julie; Cooper, Sandra; Haukos, David

    2004-01-01

    How can teachers make science learning relevant to today's technology savvy students? They can incorporate the Internet and use it as a tool to help solve real-life problems. A group of university professors, a field biologist, and classroom teachers teamed up to create an exciting, interactive Web-based learning environment for students and…

  11. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  12. JVM: Java Visual Mapping tool for next generation sequencing read.

    PubMed

    Yang, Ye; Liu, Juan

    2015-01-01

    We developed a program JVM (Java Visual Mapping) for mapping next generation sequencing reads to a reference sequence. The program is implemented in Java and is designed to deal with millions of short reads generated by the Illumina sequencing technology. It employs a seed-index strategy and octal encoding operations for sequence alignment. JVM is useful for DNA-Seq and RNA-Seq when dealing with single-end resequencing. JVM is a desktop application, which supports read capacity from 1 MB to 10 GB. PMID:25387956
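    The seed-index strategy mentioned in the abstract is, at heart, a k-mer hash lookup followed by verification at each candidate position. A minimal Python sketch (not JVM's Java implementation; the seed length, mismatch threshold, and function names are assumptions for illustration):

    ```python
    def build_seed_index(ref, k=8):
        """Index every k-mer (seed) of the reference by its start positions."""
        index = {}
        for i in range(len(ref) - k + 1):
            index.setdefault(ref[i:i + k], []).append(i)
        return index

    def map_read(read, ref, index, k=8, max_mismatches=2):
        """Seed with the read's first k-mer, then verify each candidate
        position by counting mismatches over the full read length."""
        hits = []
        for pos in index.get(read[:k], []):
            window = ref[pos:pos + len(read)]
            if len(window) == len(read):
                mm = sum(a != b for a, b in zip(read, window))
                if mm <= max_mismatches:
                    hits.append((pos, mm))
        return hits
    ```

    Production mappers add compact encodings (JVM's octal encoding plays this role), multiple seeds per read, and gapped verification, but the index-then-verify structure is the same.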

  13. Rover Team Decides: Safety First

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA's Mars Exploration Rover Spirit recorded this view while approaching the northwestern edge of 'Home Plate,' a circular plateau-like area of bright, layered outcrop material roughly 80 meters (260 feet) in diameter. The images combined into this mosaic were taken by Spirit's navigation camera during the rover's 746th, 748th and 750th Martian days, or sols (Feb. 7, 9 and 11, 2006).

    With Martian winter closing in, engineers and scientists working with NASA's Mars Exploration Rover Spirit decided to play it safe for the time being rather than attempt to visit the far side of Home Plate in search of rock layers that might show evidence of a past watery environment. This feature has been one of the major milestones of the mission. Though it's conceivable that rock layers might be exposed on the opposite side, sunlight is diminishing on the rover's solar panels and team members chose not to travel in a counterclockwise direction that would take the rover to the west and south slopes of the plateau. Slopes in that direction are hidden from view and team members chose, following a long, thorough discussion, to have the rover travel clockwise and remain on north-facing slopes rather than risk sending the rover deeper into unknown terrain.

    In addition to studying numerous images from Spirit's cameras, team members studied three-dimensional models created with images from the Mars Orbiter Camera on NASA's Mars Global Surveyor orbiter. The models showed a valley on the southern side of Home Plate, the slopes of which might cause the rover's solar panels to lose power for unknown lengths of time. In addition, images from Spirit's cameras showed a nearby, talus-covered section of slope on the west side of Home Plate, rather than exposed rock layers scientists eventually hope to investigate.

    Home Plate has been on the rover's potential itinerary since the early days of the mission, when it stood out in images taken by the Mars Orbiter Camera shortly after

  14. Next generation tools to accelerate the synthetic biology process.

    PubMed

    Shih, Steve C C; Moraes, Christopher

    2016-05-16

    Synthetic biology follows the traditional engineering paradigm of designing, building, testing and learning to create new biological systems. While such approaches have enormous potential, major challenges still exist in this field including increasing the speed at which this workflow can be performed. Here, we present recently developed microfluidic tools that can be used to automate the synthetic biology workflow with the goal of advancing the likelihood of producing desired functionalities. With the potential for programmability, automation, and robustness, the integration of microfluidics and synthetic biology has the potential to accelerate advances in areas such as bioenergy, health, and biomaterials. PMID:27146265

  15. CUEMAP: A tool for generating hierarchical charts and dataflow diagrams

    SciTech Connect

    Lee, J.W.

    1987-12-01

    CUEMAP is a preprocessor to the MAPPER program, which generates report quality visual aids. CUEMAP uses text blocks, symbols, and line connectors to lay out hierarchical charts and dataflow diagrams. A grid is specified as a reference point on which the labels and symbols can be placed. Connectors are added to complete the diagram. Modifications and enhancements require knowledge of the MAPPER syntax. 1 ref., 2 figs.

  16. HALOGEN: a tool for fast generation of mock halo catalogues

    NASA Astrophysics Data System (ADS)

    Avila, Santiago; Murray, Steven G.; Knebe, Alexander; Power, Chris; Robotham, Aaron S. G.; Garcia-Bellido, Juan

    2015-06-01

    We present a simple method of generating approximate synthetic halo catalogues: HALOGEN. This method uses a combination of second-order Lagrangian Perturbation Theory (2LPT) in order to generate the large-scale matter distribution, analytical mass functions to generate halo masses, and a single-parameter stochastic model for halo bias to position haloes. HALOGEN represents a simplification of similar recently published methods. Our method is constrained to recover the two-point function at intermediate (10 h-1 Mpc < r < 50 h-1 Mpc) scales, which we show is successful to within 2 per cent. Larger scales (˜100 h-1 Mpc) are reproduced to within 15 per cent. We compare several other statistics (e.g. power spectrum, point distribution function, redshift space distortions) with results from N-body simulations to determine the validity of our method for different purposes. One of the benefits of HALOGEN is its flexibility, and we demonstrate this by showing how it can be adapted to varying cosmologies and simulation specifications. A driving motivation for the development of such approximate schemes is the need to compute covariance matrices and study the systematic errors for large galaxy surveys, which requires thousands of simulated realizations. We discuss the applicability of our method in this context, and conclude that it is well suited to mass production of appropriate halo catalogues. The code is publicly available at https://github.com/savila/halogen.
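    Two of HALOGEN's ingredients, drawing halo masses from an analytical mass function and placing haloes with a single-parameter stochastic bias, can be sketched as follows. This is illustrative Python with a power-law stand-in for the mass function (the real code uses 2LPT density fields and calibrated mass functions; all names and parameter values here are assumptions):

    ```python
    import numpy as np

    def sample_halo_masses(n, m_min=1e12, slope=-1.9, rng=None):
        """Draw n halo masses from a power-law stand-in for the analytical
        mass function, dn/dM proportional to M**slope with slope < -1,
        via inverse-transform sampling on [m_min, infinity)."""
        if rng is None:
            rng = np.random.default_rng(0)
        u = rng.random(n)
        return m_min * (1.0 - u) ** (1.0 / (slope + 1.0))

    def place_haloes(density, n, alpha=1.5, rng=None):
        """Assign n haloes to grid cells with probability proportional to
        density**alpha: the single-parameter stochastic bias step."""
        if rng is None:
            rng = np.random.default_rng(0)
        weights = density.ravel() ** alpha
        return rng.choice(density.size, size=n, p=weights / weights.sum())
    ```

    Tuning the one bias parameter (alpha here) against an N-body two-point function is what constrains such a scheme at intermediate scales.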

  17. Automated tools for the generation of performance-based training

    SciTech Connect

    Trainor, M.S.; Fries, J.

    1990-01-01

    The field of educational technology is not a new one, but the emphasis in the past has been on the use of technologies for the delivery of instruction and tests. This paper explores the application of technology to the development of performance-based instruction and to the analyses leading up to the development of the instruction. Several technologies are discussed, with specific software packages described. The purpose of these technologies is to streamline the instructional analysis and design process, using the computer for its strengths to aid the human-in-the-loop. Currently, the process is all accomplished manually. Applying automated tools to the process frees the humans from some of the tedium involved so that they can be dedicated to the more complex aspects of the process. 12 refs.

  18. Multiscale Toxicology - Building the Next Generation Tools for Toxicology

    SciTech Connect

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.; Waters, Katrina M.

    2012-09-01

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus), to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  19. Profiting from competition: Financial tools for electric generation companies

    NASA Astrophysics Data System (ADS)

    Richter, Charles William, Jr.

Regulations governing the operation of electric power systems in North America and many other areas of the world are undergoing major changes designed to promote competition. This process of change is often referred to as deregulation. Participants in deregulated electricity systems may find that their profits will greatly benefit from the implementation of successful bidding strategies. While the goal of the regulators may be to create rules which balance reliable power system operation with maximization of the total benefit to society, the goal of generation companies is to maximize their profit, i.e., the return to their shareholders. The majority of the research described here is conducted from the point of view of generation companies (GENCOs) wishing to maximize their expected utility function, which is generally composed of expected profit and risk. Strategies that help a GENCO to maximize its objective function must consider the impact of (and aid in making) operating decisions that may occur within a few seconds to multiple years. The work described here assumes an environment in which energy service companies (ESCOs) buy and GENCOs sell power via double auctions in regional commodity exchanges. Power is transported on wires owned by transmission companies (TRANSCOs) and distribution companies (DISTCOs). The proposed market framework allows participants to trade electrical energy contracts via the spot, futures, options, planning, and swap markets. An important method of studying these proposed markets and the behavior of participating agents is the field of experimental/computational economics. For much of the research reported here, the market simulator developed by Kumar and Sheble, along with similar simulators, has been adapted to allow computerized agents to trade energy. Creating computerized agents that can react as rationally or irrationally as a human trader is a difficult problem for which we have turned to the field of artificial intelligence. Some of our…

  20. PC graphics generation and management tool for real-time applications

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1992-01-01

    A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.

  1. Deciding as Intentional Action: Control over Decisions

    PubMed Central

    Shepherd, Joshua

    2015-01-01

    Common-sense folk psychology and mainstream philosophy of action agree about decisions: these are under an agent's direct control, and are thus intentional actions for which agents can be held responsible. I begin this paper by presenting a problem for this view. In short, since the content of the motivational attitudes that drive deliberation and decision remains open-ended until the moment of decision, it is unclear how agents can be thought to exercise control over what they decide at the moment of deciding. I note that this problem might motivate a non-actional view of deciding—a view that decisions are not actions, but are instead passive events of intention acquisition. For without an understanding of how an agent might exercise control over what is decided at the moment of deciding, we lack a good reason for maintaining commitment to an actional view of deciding. However, I then offer the required account of how agents exercise control over decisions at the moment of deciding. Crucial to this account is an understanding of the relation of practical deliberation to deciding, an understanding of skilled deliberative activity, and the role of attention in the mental action of deciding. PMID:26321765

  2. Multiscale Toxicology- Building the Next Generation Tools for Toxicology

    SciTech Connect

    Retterer, S. T.; Holsapple, M. P.

    2013-10-31

A Cooperative Research and Development Agreement (CRADA) was established between Battelle Memorial Institute (BMI), Pacific Northwest National Laboratory (PNNL), Oak Ridge National Laboratory (ORNL), Brookhaven National Laboratory (BNL), and Lawrence Livermore National Laboratory (LLNL), with the goal of combining the analytical and synthetic strengths of the National Laboratories with BMI's expertise in basic and translational medical research to develop a collaborative pipeline and suite of high-throughput and imaging technologies that could be used to provide a more comprehensive understanding of material and drug toxicology in humans. The Multi-Scale Toxicity Initiative (MSTI), consisting of the team members above, was established to coordinate cellular-scale, high-throughput in vitro testing, computational modeling and whole-animal in vivo toxicology studies between MSTI team members. Development of a common, well-characterized set of materials for testing was identified as a crucial need for the initiative. Two research tracks were established by BMI during the course of the CRADA. The first research track focused on the development of tools and techniques for understanding the toxicity of nanomaterials, specifically inorganic nanoparticles (NPs). ORNL's work focused primarily on the synthesis, functionalization and characterization of a common set of NPs for dissemination to the participating laboratories. These particles were synthesized to retain the same surface characteristics and size, but to allow visualization using the variety of imaging technologies present across the team. Characterization included the quantitative analysis of physical and chemical properties of the materials as well as the preliminary assessment of NP toxicity using commercially available toxicity screens and emerging optical imaging strategies. Additional efforts examined the development of high-throughput microfluidic and imaging assays for measuring NP uptake, localization, and…

  3. Next generation sequencing technologies: tool to study avian virus diversity.

    PubMed

    Kapgate, S S; Barbuddhe, S B; Kumanan, K

    2015-03-01

Increased globalisation, climatic changes and the wildlife-livestock interface have led to the emergence of novel viral pathogens or zoonoses that have become a serious concern for avian, animal and human health. High biodiversity and bird migration facilitate the spread of pathogens and provide reservoirs for emerging infectious diseases. Current classical diagnostic methods, designed to be virus-specific or limited to a group of viral agents, hinder the identification of novel viruses or viral variants. Recently developed approaches of next-generation sequencing (NGS) provide culture-independent methods that are useful for understanding viral diversity and the discovery of novel viruses, thereby enabling better diagnosis and disease control. This review discusses the different possible steps of an NGS study utilizing sequence-independent amplification, high-throughput sequencing and bioinformatics approaches to identify novel avian viruses and their diversity. NGS has led to the identification of a wide range of new viruses, such as picobirnavirus, picornavirus, orthoreovirus and avian gamma coronavirus associated with fulminating disease in guinea fowl, and is also used in describing viral diversity among avian species. The review also briefly discusses areas of viral-host interaction and disease-associated causalities with newly identified avian viruses. PMID:25790045

  4. Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST): A web tool for generating protein sequence similarity networks

    PubMed Central

    Gerlt, John A.; Bouvier, Jason T.; Davidson, Daniel B.; Imker, Heidi J.; Sadkhin, Boris; Slater, David R.; Whalen, Katie L.

    2015-01-01

    The Enzyme Function Initiative, an NIH/NIGMS-supported Large-Scale Collaborative Project (EFI; U54GM093342; http://enzymefunction.org/), is focused on devising and disseminating bioinformatics and computational tools as well as experimental strategies for the prediction and assignment of functions (in vitro activities and in vivo physiological/metabolic roles) to uncharacterized enzymes discovered in genome projects. Protein sequence similarity networks (SSNs) are visually powerful tools for analyzing sequence relationships in protein families (H.J. Atkinson, J.H. Morris, T.E. Ferrin, and P.C. Babbitt, PLoS One 2009, 4, e4345). However, the members of the biological/biomedical community have not had access to the capability to generate SSNs for their “favorite” protein families. In this article we announce the EFI-EST (Enzyme Function Initiative-Enzyme Similarity Tool) web tool (http://efi.igb.illinois.edu/efi-est/) that is available without cost for the automated generation of SSNs by the community. The tool can create SSNs for the “closest neighbors” of a user-supplied protein sequence from the UniProt database (Option A) or of members of any user-supplied Pfam and/or InterPro family (Option B). We provide an introduction to SSNs, a description of EFI-EST, and a demonstration of the use of EFI-EST to explore sequence-function space in the OMP decarboxylase superfamily (PF00215). This article is designed as a tutorial that will allow members of the community to use the EFI-EST web tool for exploring sequence/function space in protein families. PMID:25900361
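The SSN concept above can be pictured with a minimal sketch: each sequence is a node, and an edge joins two sequences whenever their pairwise similarity passes a threshold. The sequence IDs, scores, and threshold below are invented placeholders; EFI-EST itself derives edges from all-by-all BLAST comparisons.

```python
# Build a toy sequence similarity network (SSN): nodes are sequence IDs,
# edges connect pairs whose similarity score passes a threshold.
# The scores here are made-up placeholders, not real BLAST output.

def build_ssn(pairwise_scores, threshold):
    """pairwise_scores: dict mapping (id_a, id_b) -> similarity score."""
    network = {}
    for (a, b), score in pairwise_scores.items():
        network.setdefault(a, set())
        network.setdefault(b, set())
        if score >= threshold:
            network[a].add(b)
            network[b].add(a)
    return network

scores = {("seqA", "seqB"): 92.0,
          ("seqA", "seqC"): 41.0,
          ("seqB", "seqC"): 88.5}
ssn = build_ssn(scores, threshold=80.0)
# seqA-seqB and seqB-seqC are connected; seqA-seqC is not.
```

Lowering the threshold merges clusters while raising it fragments them, which is exactly the knob a user turns when exploring sequence-function space with a tool like EFI-EST.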

  5. E-DECIDER Rapid Response to the M 6.0 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

E-DECIDER initiated rapid response mode when the California Earthquake Clearinghouse was activated the morning following the M6 Napa earthquake. Data products, including: 1) rapid damage and loss estimates, 2) deformation magnitude and slope change maps, and 3) aftershock forecasts were provided to the Clearinghouse partners within 24 hours of the event via XchangeCore Web Service Data Orchestration sharing. NASA data products were provided to end-users via XchangeCore, EERI and Clearinghouse websites, and ArcGIS online for Napa response, reaching a wide response audience. The E-DECIDER team helped facilitate rapid delivery of NASA products to stakeholders and participated in Clearinghouse Napa earthquake briefings to update stakeholders on product information. Rapid response products from E-DECIDER can be used to help prioritize response efforts shortly after the event has occurred. InLET (Internet Loss Estimation Tool) post-event damage and casualty estimates were generated quickly after the Napa earthquake. InLET provides immediate post-event estimates of casualties and building damage by performing loss/impact simulations using USGS ground motion data and FEMA HAZUS damage estimation technology. These results were provided to E-DECIDER by their collaborators, ImageCat, Inc. and the Community Stakeholder Network (CSN). Strain magnitude and slope change maps were automatically generated when the Napa earthquake appeared on the USGS feed. These maps provide an early estimate of where deformation has occurred and where damage may be localized. Using E-DECIDER critical infrastructure overlays with damage estimates, decision makers can direct response efforts that can be verified later with field reconnaissance and remote sensing-based observations. Earthquake aftershock forecast maps were produced within hours of the event. These maps highlight areas where aftershocks are likely to occur and can also be coupled with infrastructure overlays to help direct response efforts.

  6. Generation X Teaches College: Generation Construction as Pedagogical Tool in the Writing Classroom.

    ERIC Educational Resources Information Center

    Hassel, Holly; Epp, Dawn Vernooy

    In the 1996 book "Generation X Goes to College: An Eye-Opening Account of Teaching in Post-Modern America," Peter Sacks probes the "decay" of higher education in the United States; a decay he attributes to listless, entitled students. This paper interrogates the paradigm of Boomers and Generation Xers poised in opposition to one another,…

  7. Deciding to have knee or hip replacement

    MedlinePlus

... patientinstructions/000368.htm Deciding to have knee or hip replacement ... make a decision. Who Benefits From Knee or Hip Replacement Surgery? The most common reason to have a ...

  8. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  9. A comparison of tools for the simulation of genomic next-generation sequencing data.

    PubMed

    Escalona, Merly; Rocha, Sara; Posada, David

    2016-08-01

    Computer simulation of genomic data has become increasingly popular for assessing and validating biological models or for gaining an understanding of specific data sets. Several computational tools for the simulation of next-generation sequencing (NGS) data have been developed in recent years, which could be used to compare existing and new NGS analytical pipelines. Here we review 23 of these tools, highlighting their distinct functionality, requirements and potential applications. We also provide a decision tree for the informed selection of an appropriate NGS simulation tool for the specific question at hand. PMID:27320129
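At their core, the reviewed simulators draw reads from a reference and corrupt them with a sequencing-error model. The sketch below illustrates only that shared kernel with uniform sampling and substitution errors; the function names and parameters are ours, not taken from any of the 23 reviewed tools, which implement far richer platform-specific error profiles.

```python
import random

def simulate_reads(reference, n_reads, read_len, error_rate, seed=0):
    """Draw fixed-length reads uniformly from `reference`,
    introducing substitution errors at a per-base rate."""
    rng = random.Random(seed)          # seeded for reproducibility
    bases = "ACGT"
    reads = []
    for _ in range(n_reads):
        start = rng.randrange(len(reference) - read_len + 1)
        read = list(reference[start:start + read_len])
        for i, base in enumerate(read):
            if rng.random() < error_rate:
                # substitute with a different base
                read[i] = rng.choice([b for b in bases if b != base])
        reads.append("".join(read))
    return reads

ref = "ACGTACGTACGTACGTACGT"
reads = simulate_reads(ref, n_reads=5, read_len=8, error_rate=0.05)
```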

  10. The Sequence of Events generator: A powerful tool for mission operations

    NASA Technical Reports Server (NTRS)

    Wobbe, Hubertus; Braun, Armin

    1994-01-01

    The functions and features of the sequence of events (SOE) and flight operations procedures (FOP) generator developed and used at DLR/GSOC for the positioning of EUTELSAT 2 satellites are presented. The SOE and FOP are the main operational documents that are prepared for nominal as well as for non-nominal mission execution. Their structure and application are described. Both of these documents are generated, validated, and maintained by a common software tool. Its main features and advantages are demonstrated. The tool has been improved continuously over the last 5 years. Due to its flexibility it can easily be applied to other projects and new features may be added.

  11. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator-overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, source transformation tools appear to be the most efficient choice, capable of handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the…
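The operator-overloading approach mentioned above can be pictured with a minimal forward-mode sketch: each value carries its derivative alongside it, and every arithmetic operator is overloaded to propagate both. This toy Dual class is ours, written in Python for brevity (the tools compared in the study generate adjoint Fortran code); it is only meant to show why overloading works on arbitrary code while paying a per-operation overhead.

```python
# Minimal forward-mode AD via operator overloading ("dual numbers").
# Every arithmetic operation propagates a value and its derivative,
# which is why overloading handles arbitrary code but adds overhead
# to each individual operation.

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

x = Dual(2.0, 1.0)   # seed the derivative dx/dx = 1
y = f(x)             # y.val = f(2) = 17, y.der = f'(2) = 14
```

Note that `f` is ordinary code with no knowledge of AD; the derivative emerges purely from the overloaded operators, which is the property that makes this style attractive for modern object-oriented models.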

  12. A decision support tool for landfill methane generation and gas collection.

    PubMed

    Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart

    2015-09-01

This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date there is no such tool available to provide landfill decision makers with clear and simplified information for evaluating biochemical processes within a landfill site, assessing gas production performance and identifying potential remedies to any issues. The current lack of understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) a methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first-order decay model LandGEM; and (2) a landfill gas indicators' score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is provided using multi-criteria analysis to calculate the sum of weighted scores for each indicator. The weights for each indicator are calculated using an analytic hierarchy process. The tool is tested against five real scenarios for landfill sites in the UK with a range of good, average and poor landfill methane generation over a one-year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for methane output rate enhancement. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site. PMID:26168873
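The weighted-sum step of the multi-criteria analysis can be sketched as follows. The indicator values, ideal ranges, scoring function and weights below are invented for illustration only; the study derives its actual weights with an analytic hierarchy process and calibrates its ranges to landfill conditions.

```python
# Score each landfill gas indicator by its deviation from an ideal
# range, then combine the scores with (AHP-derived) weights. All
# ranges and weights here are illustrative placeholders, not the
# paper's calibrated values.

def indicator_score(value, low, high):
    """1.0 inside the ideal range, decaying with relative deviation outside."""
    if low <= value <= high:
        return 1.0
    nearest = low if value < low else high
    return max(0.0, 1.0 - abs(value - nearest) / abs(nearest))

def total_score(readings, ideal_ranges, weights):
    """Weighted sum of per-indicator scores (multi-criteria analysis)."""
    return sum(weights[k] * indicator_score(readings[k], *ideal_ranges[k])
               for k in readings)

readings = {"pH": 7.2, "temperature_C": 45.0, "moisture_pct": 25.0}
ideal = {"pH": (6.8, 7.4),
         "temperature_C": (30.0, 40.0),
         "moisture_pct": (40.0, 60.0)}
weights = {"pH": 0.5, "temperature_C": 0.3, "moisture_pct": 0.2}

score = total_score(readings, ideal, weights)
# pH is in range (1.0), temperature and moisture deviate, so the
# total falls below the maximum of 1.0.
```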

  13. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a
    numerical study was conducted using both the finite-difference, time-domain method and a frequency- wavenumber method. When the propagation velocity in the borehole was greater than th...

  14. Short circuiting the circadian system with a new generation of precision tools.

    PubMed

    Loh, Dawn H; Kudo, Takashi; Colwell, Christopher S

    2015-03-01

    Circadian behavior in mammals is coordinated by neurons within the suprachiasmatic nucleus (SCN). In this issue, Lee et al. (2015) and Mieda et al. (2015) applied state-of-the-art genetic tools to dissect the microcircuits within the SCN generating circadian rhythmic behavior. PMID:25741718

  15. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning

    ERIC Educational Resources Information Center

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa

    2011-01-01

    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  16. Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations

    NASA Technical Reports Server (NTRS)

    Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy

    2011-01-01

This paper reviews three Next Generation Air Transportation System (NextGen) based high-fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high-altitude environment utilizing high-altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits, including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.

  17. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.

  18. General application of rapid 3-D digitizing and tool path generation for complex shapes

    SciTech Connect

    Kwok, K.S.; Loucks, C.S.; Driessen, B.J.

    1997-09-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation and experimental results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm in simulation studies. In actual experiments, a nose cone and a turbine blade were successfully scanned. A complex shaped turbine blade was successfully scanned and finished machined using these algorithms.

  19. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  20. Fault Simulation and Test Generation for Transistor Shorts Using Stuck-at Test Tools

    NASA Astrophysics Data System (ADS)

    Higami, Yoshinobu; Saluja, Kewal K.; Takahashi, Hiroshi; Kobayashi, Shin-Ya; Takamatsu, Yuzo

This paper presents methods for detecting transistor short faults using logic-level fault simulation and test generation. The paper considers two types of transistor-level faults, namely strong shorts and weak shorts, which were introduced in our previous research. These faults are defined based on the values of the outputs of faulty gates. The proposed fault simulation and test generation are performed using gate-level tools designed to deal with stuck-at faults, and no transistor-level tools are required. In the test generation process, a circuit is modified by inserting inverters, and a stuck-at test generator is used. The modification of the circuit does not constitute a design-for-testability technique, as the modified circuit is used only during the test generation process. Further, the generated test patterns are compacted by fault simulation. Also, since the weak short model involves uncertainty in its behavior, we define fault coverage and fault efficiency in three different ways, namely optimistic, pessimistic and probabilistic, and assess them. Finally, experimental results for ISCAS benchmark circuits are used to demonstrate the effectiveness of the proposed methods.
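Gate-level stuck-at fault simulation of the kind leveraged here can be sketched as: evaluate the circuit normally, re-evaluate with one net pinned to a constant, and flag the fault as detected when any primary output differs. The two-gate circuit and net names below are our own toy example, not one of the ISCAS benchmarks, and the sketch models plain stuck-at faults rather than the paper's strong/weak short models.

```python
# Toy gate-level stuck-at fault simulation. A circuit is a list of
# (output_net, gate_fn, input_nets) in topological order. A stuck-at
# fault pins one net to a constant; a test pattern detects the fault
# if any primary output differs from the fault-free value.

def evaluate(circuit, inputs, stuck=None):
    nets = dict(inputs)
    if stuck:
        nets[stuck[0]] = stuck[1]          # fault on a primary input
    for out, fn, ins in circuit:
        val = fn(*(nets[i] for i in ins))
        # a stuck internal net keeps its faulty value regardless of the gate
        nets[out] = stuck[1] if (stuck and out == stuck[0]) else val
    return nets

AND = lambda a, b: a & b
OR = lambda a, b: a | b

circuit = [("n1", AND, ("a", "b")),   # n1 = a AND b
           ("y", OR, ("n1", "c"))]    # y  = n1 OR c

def detects(pattern, stuck, outputs=("y",)):
    good = evaluate(circuit, pattern)
    bad = evaluate(circuit, pattern, stuck)
    return any(good[o] != bad[o] for o in outputs)

# a=1, b=1, c=0 detects n1 stuck-at-0: y flips from 1 to 0.
```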

  1. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    PubMed

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools, providing reliable support for decision-making processes. In this paper, indicators such as the number of residents, population age, urban life expectancy and total municipal solid waste were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition, considering the Iasi, Romania case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction. PMID:27454099
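The regression step can be illustrated with a minimal least-squares fit of one waste fraction against a single demographic driver. The data points below are invented for illustration; the study fits several fractions against multiple indicators and compares linear, exponential and S-curve trend models.

```python
# Ordinary least-squares fit y = a + b*x for a hypothetical waste
# fraction (tonnes of paper waste) against number of residents.
# The data are illustrative placeholders, not the Iasi case-study values.

def fit_line(xs, ys):
    """Return intercept a and slope b minimizing squared residuals."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

residents = [290_000, 295_000, 300_000, 305_000, 310_000]
paper_tonnes = [14_500, 14_750, 15_000, 15_250, 15_500]

a, b = fit_line(residents, paper_tonnes)
forecast = a + b * 320_000   # prediction for a projected population
```

The same fitted equation, evaluated at projected indicator values, is what turns a regression model into a forecast; the paper's accuracy measures then decide which trend shape (here linear, there an S-curve) generalizes best.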

  2. Developing Next-Generation Telehealth Tools and Technologies: Patients, Systems, and Data Perspectives

    PubMed Central

    Filart, Rosemarie; Burgess, Lawrence P.; Lee, Insup; Poropatich, Ronald K.

    2010-01-01

The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth? PMID:20043711

  3. Developing next-generation telehealth tools and technologies: patients, systems, and data perspectives.

    PubMed

    Ackerman, Michael J; Filart, Rosemarie; Burgess, Lawrence P; Lee, Insup; Poropatich, Ronald K

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth? PMID:20043711

  4. Method and tool for generating and managing image quality allocations through the design and development process

    NASA Astrophysics Data System (ADS)

    Sparks, Andrew W.; Olson, Craig; Theisen, Michael J.; Addiego, Chris J.; Hutchins, Tiffany G.; Goodman, Timothy D.

    2016-05-01

    Performance models for infrared imaging systems require image quality parameters; optical design engineers need image quality design goals; systems engineers develop image quality allocations to test imaging systems against. It is a challenge to maintain consistency and traceability amongst the various expressions of image quality. We present a method and parametric tool for generating and managing expressions of image quality during the system modeling, requirements specification, design, and testing phases of an imaging system design and development project.
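
Allocations of this kind are commonly rolled up by root-sum-square when the contributing error terms are independent; a minimal sketch of that bookkeeping (the paper's actual parametric method is not reproduced here, and the example terms are hypothetical):

```python
import math

def rss_rollup(allocations):
    """Combine independent error allocations by root-sum-square (RSS),
    the usual way a system-level image quality budget is rolled up
    from subsystem allocations."""
    return math.sqrt(sum(a * a for a in allocations))

# Hypothetical wavefront-error allocations (waves RMS) for three subsystems.
budget = rss_rollup([0.02, 0.03, 0.06])
```

A design stays within its top-level requirement as long as the RSS of the subsystem allocations does not exceed it, which is what makes the allocation tree traceable from specification to test.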

  5. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to redress this mismatch, insofar as possible. PMID:26230400

  6. Automatic generation of bioinformatics tools for predicting protein–ligand binding sites

    PubMed Central

    Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-01-01

    Motivation: Predictive tools that model protein–ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein–ligand binding predictive tools would be useful. Results: We developed a system for automatically generating protein–ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5–1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. Availability and implementation: The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26545824

  7. CHOPCHOP v2: a web tool for the next generation of CRISPR genome engineering.

    PubMed

    Labun, Kornel; Montague, Tessa G; Gagnon, James A; Thyme, Summer B; Valen, Eivind

    2016-07-01

    In just 3 years CRISPR genome editing has transformed biology, and its popularity and potency continue to grow. New CRISPR effectors and rules for locating optimum targets continue to be reported, highlighting the need for computational CRISPR targeting tools to compile these rules and facilitate target selection and design. CHOPCHOP is one of the most widely used web tools for CRISPR- and TALEN-based genome editing. Its overarching principle is to provide an intuitive and powerful tool that can serve both novice and experienced users. In this major update we introduce tools for the next generation of CRISPR advances, including Cpf1 and Cas9 nickases. We support a number of new features that improve the targeting power, usability and efficiency of CHOPCHOP. To increase targeting range and specificity we provide support for custom length sgRNAs, and we evaluate the sequence composition of the whole sgRNA and its surrounding region using models compiled from multiple large-scale studies. These and other new features, coupled with an updated interface for increased usability and support for a continually growing list of organisms, maintain CHOPCHOP as one of the leading tools for CRISPR genome editing. CHOPCHOP v2 can be found at http://chopchop.cbu.uib.no. PMID:27185894
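
The core of any CRISPR targeting tool is a scan for protospacers adjacent to a PAM. A minimal sketch of that scan for SpCas9 (20-nt guide, NGG PAM, forward strand only; CHOPCHOP additionally scans the reverse strand and scores every candidate):

```python
import re

def find_cas9_targets(seq, guide_len=20):
    """Scan a DNA sequence for SpCas9 targets: a guide-length
    protospacer immediately followed by an NGG PAM. A lookahead is
    used so that overlapping candidate sites are all reported."""
    seq = seq.upper()
    pattern = r"(?=([ACGT]{%d})([ACGT]GG))" % guide_len
    return [{"start": m.start(), "guide": m.group(1), "pam": m.group(2)}
            for m in re.finditer(pattern, seq)]

# One poly-A protospacer followed by a TGG PAM, embedded in flanking sequence.
example = "TTTT" + "A" * 20 + "TGG" + "CCCC"
hits = find_cas9_targets(example)
```

Real tools then rank each hit by on-target efficiency models and off-target counts; the scan above only enumerates candidates.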

  8. System-level tools and reconfigurable computing for next-generation HWIL systems

    NASA Astrophysics Data System (ADS)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing of the data streams, the tools must also complement this so that a systems engineer is able to construct the final system. The paper will present the work in the area of integration of system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 × 1024 resolutions at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of tera (10^12) operations per second.
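
The quoted projector format implies the data rates such a platform must sustain; a back-of-the-envelope check (the 16-bit pixel depth is an assumption, not stated in the abstract):

```python
def projector_data_rate(width, height, frame_hz, bits_per_pixel=16):
    """First-order data-rate estimate for an IR scene projector feed.
    Returns (pixels per second, bytes per second)."""
    pixels_per_s = width * height * frame_hz
    bytes_per_s = pixels_per_s * bits_per_pixel // 8
    return pixels_per_s, bytes_per_s

# 1024 x 1024 at 400 Hz, as quoted in the abstract.
pix, byt = projector_data_rate(1024, 1024, 400)
```

At 16 bits per pixel this is roughly 0.84 GB/s of sustained throughput, which is why low-latency FPGA pipelines rather than general-purpose processors are used.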

  9. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to redress this mismatch, insofar as possible. PMID:26230400

  10. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    SciTech Connect

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel; Peake, Tom; Hayes, Colin

    2013-07-01

Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
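
A first-order estimate of this kind combines building stock with a decontamination/demolition strategy. A toy sketch of that roll-up (the building classes, fractions, and the 1 cm surface-removal depth are all hypothetical illustrations, not WEST's actual model):

```python
def estimate_waste(buildings, strategy, skim_m=0.01):
    """Toy first-order waste-volume estimate. For each contaminated
    building class, the strategy gives a demolished fraction (whole
    building volume becomes debris); the remainder is decontaminated,
    producing only a skim depth of removed surface material."""
    total_m3 = 0.0
    for b in buildings:
        demo = strategy.get(b["class"], 0.0)            # fraction demolished
        total_m3 += demo * b["volume_m3"]               # demolition debris
        total_m3 += (1 - demo) * b["surface_m2"] * skim_m  # decon waste
    return total_m3

stock = [
    {"class": "residential", "volume_m3": 1000.0, "surface_m2": 600.0},
    {"class": "commercial",  "volume_m3": 5000.0, "surface_m2": 2000.0},
]
waste = estimate_waste(stock, {"residential": 0.1, "commercial": 0.0})
```

Varying the strategy fractions and re-running is the spreadsheet-style what-if loop the abstract describes.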

  11. Automatic generation of conceptual database design tools from data model specifications

    SciTech Connect

    Hong, Shuguang.

    1989-01-01

    The problems faced in the design and implementation of database software systems based on object-oriented data models are similar to that of other software design, i.e., difficult, complex, yet redundant effort. Automatic generation of database software system has been proposed as a solution to the problems. In order to generate database software system for a variety of object-oriented data models, two critical issues: data model specification and software generation, must be addressed. SeaWeed is a software system that automatically generates conceptual database design tools from data model specifications. A meta model has been defined for the specification of a class of object-oriented data models. This meta model provides a set of primitive modeling constructs that can be used to express the semantics, or unique characteristics, of specific data models. Software reusability has been adopted for the software generation. The technique of design reuse is utilized to derive the requirement specification of the software to be generated from data model specifications. The mechanism of code reuse is used to produce the necessary reusable software components. This dissertation presents the research results of SeaWeed including the meta model, data model specification, a formal representation of design reuse and code reuse, and the software generation paradigm.

  12. Geological applications of automatic grid generation tools for finite elements applied to porous flow modeling

    SciTech Connect

    Gable, C.W.; Trease, H.E.; Cherry, T.A.

    1996-04-01

The construction of grids that accurately reflect geologic structure and stratigraphy for computational flow and transport models poses a formidable task. Even with a complete understanding of stratigraphy, material properties, boundary and initial conditions, the task of incorporating data into a numerical model can be difficult and time consuming. Furthermore, most tools available for representing complex geologic surfaces and volumes are not designed for producing optimal grids for flow and transport computation. We have developed a modeling tool, GEOMESH, for automating finite element grid generation that maintains the geometric integrity of geologic structure and stratigraphy. The method produces an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. The process of developing a flow and transport model can be divided into three parts: (1) Developing accurate conceptual models inclusive of geologic interpretation, material characterization and construction of a stratigraphic and hydrostratigraphic framework model, (2) Building and initializing computational frameworks: grid generation, boundary and initial conditions, (3) Computational physics models of flow and transport. Processes (1) and (3) have received considerable attention, whereas (2) has not. This work concentrates on grid generation and its connections to geologic characterization and process modeling. Applications of GEOMESH illustrate grid generation for two dimensional cross sections, three dimensional regional models, and adaptive grid refinement in three dimensions. Examples of grid representation of wells and tunnels with GEOMESH can be found in Cherry et al. The resulting grid can be utilized by unstructured finite element or integrated finite difference models.
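
The "optimal (Delaunay)" property rests on the empty-circumcircle test: no mesh point may lie inside any element's circumcircle (circumsphere in 3D). In 2D the test reduces to the sign of a single determinant; this is the textbook predicate, not GEOMESH code:

```python
def in_circumcircle(a, b, c, p):
    """Return True if point p lies strictly inside the circumcircle of
    triangle (a, b, c). Assumes (a, b, c) is oriented counterclockwise.
    A Delaunay triangulation is one in which this test fails for every
    triangle and every other mesh point."""
    ax, ay = a[0] - p[0], a[1] - p[1]
    bx, by = b[0] - p[0], b[1] - p[1]
    cx, cy = c[0] - p[0], c[1] - p[1]
    det = ((ax * ax + ay * ay) * (bx * cy - cx * by)
           - (bx * bx + by * by) * (ax * cy - cx * ay)
           + (cx * cx + cy * cy) * (ax * by - bx * ay))
    return det > 0

# Unit right triangle: circumcircle centred at (0.5, 0.5).
tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
inside = in_circumcircle(*tri, (0.5, 0.5))    # circumcentre lies inside
outside = in_circumcircle(*tri, (2.0, 2.0))   # distant point lies outside
```

Mesh generators repeatedly flip edges (or faces) until every element passes this test, which is what makes the resulting grid well suited to integrated finite difference schemes.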

  13. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques. PMID:21968574

  14. Simulation Tool to Assess Mechanical and Electrical Stresses on Wind Turbine Generators: Preprint

    SciTech Connect

    Singh, M.; Muljadi, E.; Gevorgian, V.; Jonkman, J.

    2013-10-01

Wind turbine generators (WTGs) consist of many different components to convert kinetic energy of the wind into electrical energy for end users. Wind energy is accessed to provide mechanical torque for driving the shaft of the electrical generator. The conversion from wind power to mechanical power is governed by the aerodynamic conversion. The aerodynamic-electrical-conversion efficiency of a WTG is influenced by the efficiency of the blades, the gearbox, the generator, and the power converter. This paper describes the use of MATLAB/Simulink to simulate the electrical and grid-related aspects of a WTG coupled with the FAST aero-elastic wind turbine computer-aided engineering tool to simulate the aerodynamic and mechanical aspects of a WTG. The combination of the two enables studies involving both electrical and mechanical aspects of a WTG. This digest includes some examples of the capabilities of the FAST and MATLAB coupling, namely the effects of electrical faults on the blade moments.
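
The aerodynamic conversion referred to above is commonly written P = ½ρACₚv³, with shaft torque T = P/ω. A minimal sketch (the power coefficient and air density are illustrative assumptions, not FAST outputs):

```python
import math

def rotor_power_and_torque(v_wind, rotor_radius, omega, cp=0.45, rho=1.225):
    """First-order aerodynamic conversion for a wind turbine rotor:
    P = 0.5 * rho * A * Cp * v^3 and shaft torque T = P / omega.
    cp (power coefficient) and rho (air density, kg/m^3) are assumed
    illustrative values."""
    area = math.pi * rotor_radius ** 2          # swept area, m^2
    power = 0.5 * rho * area * cp * v_wind ** 3  # W
    return power, power / omega                  # (W, N*m)

# Hypothetical 40 m rotor at 12 m/s wind, shaft speed 1.6 rad/s.
p_w, t_nm = rotor_power_and_torque(v_wind=12.0, rotor_radius=40.0, omega=1.6)
```

Coupled tools like FAST replace the constant Cₚ with blade-element aerodynamics and structural dynamics; the cubic dependence on wind speed is the part that survives in any fidelity.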

  15. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  16. Metagenomics: Tools and Insights for Analyzing Next-Generation Sequencing Data Derived from Biodiversity Studies

    PubMed Central

    Oulas, Anastasis; Pavloudi, Christina; Polymenakou, Paraskevi; Pavlopoulos, Georgios A; Papanikolaou, Nikolas; Kotoulas, Georgios; Arvanitidis, Christos; Iliopoulos, Ioannis

    2015-01-01

    Advances in next-generation sequencing (NGS) have allowed significant breakthroughs in microbial ecology studies. This has led to the rapid expansion of research in the field and the establishment of “metagenomics”, often defined as the analysis of DNA from microbial communities in environmental samples without prior need for culturing. Many metagenomics statistical/computational tools and databases have been developed in order to allow the exploitation of the huge influx of data. In this review article, we provide an overview of the sequencing technologies and how they are uniquely suited to various types of metagenomic studies. We focus on the currently available bioinformatics techniques, tools, and methodologies for performing each individual step of a typical metagenomic dataset analysis. We also provide future trends in the field with respect to tools and technologies currently under development. Moreover, we discuss data management, distribution, and integration tools that are capable of performing comparative metagenomic analyses of multiple datasets using well-established databases, as well as commonly used annotation standards. PMID:25983555

  17. Mutation based treatment recommendations from next generation sequencing data: a comparison of web tools

    PubMed Central

    Patel, Jaymin M.; Knopf, Joshua; Reiner, Eric; Bossuyt, Veerle; Epstein, Lianne; DiGiovanna, Michael; Chung, Gina; Silber, Andrea; Sanft, Tara; Hofstatter, Erin; Mougalian, Sarah; Abu-Khalaf, Maysa; Platt, James; Shi, Weiwei; Gershkovich, Peter; Hatzis, Christos; Pusztai, Lajos

    2016-01-01

Interpretation of complex cancer genome data, generated by tumor target profiling platforms, is key for the success of personalized cancer therapy. How to draw therapeutic conclusions from tumor profiling results is not standardized and may vary among commercial and academically-affiliated recommendation tools. We performed targeted sequencing of 315 genes from 75 metastatic breast cancer biopsies using the FoundationOne assay. Results were run through 4 different web tools including the Drug-Gene Interaction Database (DGidb), My Cancer Genome (MCG), Personalized Cancer Therapy (PCT), and cBioPortal, for drug and clinical trial recommendations. These recommendations were compared amongst each other and to those provided by FoundationOne. The identification of a gene as targetable varied across the different recommendation sources. Only 33% of cases had 4 or more sources recommend the same drug for at least one of the usually several altered genes found in tumor biopsies. These results indicate that further development and standardization of broadly applicable software tools that assist in the therapeutic interpretation of genomic data are needed. Existing algorithms for data acquisition, integration and interpretation will likely need to incorporate artificial intelligence tools to improve both content and real-time status. PMID:26980737

  18. Mutation based treatment recommendations from next generation sequencing data: a comparison of web tools.

    PubMed

    Patel, Jaymin M; Knopf, Joshua; Reiner, Eric; Bossuyt, Veerle; Epstein, Lianne; DiGiovanna, Michael; Chung, Gina; Silber, Andrea; Sanft, Tara; Hofstatter, Erin; Mougalian, Sarah; Abu-Khalaf, Maysa; Platt, James; Shi, Weiwei; Gershkovich, Peter; Hatzis, Christos; Pusztai, Lajos

    2016-04-19

Interpretation of complex cancer genome data, generated by tumor target profiling platforms, is key for the success of personalized cancer therapy. How to draw therapeutic conclusions from tumor profiling results is not standardized and may vary among commercial and academically-affiliated recommendation tools. We performed targeted sequencing of 315 genes from 75 metastatic breast cancer biopsies using the FoundationOne assay. Results were run through 4 different web tools including the Drug-Gene Interaction Database (DGidb), My Cancer Genome (MCG), Personalized Cancer Therapy (PCT), and cBioPortal, for drug and clinical trial recommendations. These recommendations were compared amongst each other and to those provided by FoundationOne. The identification of a gene as targetable varied across the different recommendation sources. Only 33% of cases had 4 or more sources recommend the same drug for at least one of the usually several altered genes found in tumor biopsies. These results indicate that further development and standardization of broadly applicable software tools that assist in the therapeutic interpretation of genomic data are needed. Existing algorithms for data acquisition, integration and interpretation will likely need to incorporate artificial intelligence tools to improve both content and real-time status. PMID:26980737

  19. Angular Determination of Toolmarks Using a Computer-Generated Virtual Tool.

    PubMed

    Spotts, Ryan; Chumbley, L Scott; Ekstrand, Laura; Zhang, Song; Kreiser, James

    2015-07-01

    A blind study to determine whether virtual toolmarks created using a computer could be used to identify and characterize angle of incidence of physical toolmarks was conducted. Six sequentially manufactured screwdriver tips and one random screwdriver were used to create toolmarks at various angles. An apparatus controlled tool angle. Resultant toolmarks were randomly coded and sent to the researchers, who scanned both tips and toolmarks using an optical profilometer to obtain 3D topography data. Developed software was used to create virtual marks based on the tool topography data. Virtual marks generated at angles from 30 to 85° (5° increments) were compared to physical toolmarks using a statistical algorithm. Twenty of twenty toolmarks were correctly identified by the algorithm. On average, the algorithm misidentified the correct angle of incidence by -6.12°. This study presents the results, their significance, and offers reasons for the average angular misidentification. PMID:25929523
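
Matching a virtual mark to a physical one ultimately reduces to a similarity statistic over aligned surface profiles. A minimal sketch using zero-mean normalized correlation (the study's actual statistical algorithm is not reproduced here):

```python
import math

def normalized_correlation(a, b):
    """Zero-mean normalized correlation of two equal-length 1D surface
    profiles; 1.0 means identical shape regardless of height offset.
    A sketch of the kind of similarity statistic used to compare
    virtual and physical toolmark profiles."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [y - mb for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den

mark       = [0.0, 1.0, 3.0, 2.0, 0.5, 0.0]   # hypothetical profile heights
same_tool  = [0.1, 1.1, 3.1, 2.1, 0.6, 0.1]   # same shape, constant offset
other_tool = [3.0, 0.0, 0.5, 0.0, 3.0, 2.0]   # unrelated profile
```

In the study, the virtual mark is regenerated at each candidate angle (30° to 85°) and the best-scoring angle is reported, which is how the angular misidentification figure is obtained.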

  20. ModelMage: a tool for automatic model generation, selection and management.

    PubMed

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML-model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating models the software can automatically fit all these models to the data and provides a ranking for model selection, in case data is available. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine. Thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software. PMID:19425122
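
Ranking candidate models fitted to the same data is typically done with an information criterion that penalizes parameter count. A sketch using AIC over hypothetical fit results (ModelMage delegates the actual fitting to COPASI, and its exact ranking criterion is not specified in the abstract):

```python
import math

def aic(rss, n_points, n_params):
    """Akaike information criterion for a least-squares fit; lower is
    better. Trades goodness of fit (residual sum of squares) against
    model complexity (number of free parameters)."""
    return n_points * math.log(rss / n_points) + 2 * n_params

# Hypothetical results for three candidates derived from one master model
# by leaving out reactions: (name, residual sum of squares, #parameters).
candidates = [
    ("full_model",     0.80, 6),
    ("no_feedback",    0.85, 4),
    ("no_degradation", 4.10, 4),
]
n = 50  # number of data points
ranking = sorted(candidates, key=lambda c: aic(c[1], n, c[2]))
best = ranking[0][0]
```

Here the simpler "no_feedback" variant wins: its slightly worse fit is outweighed by the two-parameter saving, which is exactly the discrimination step the tool automates.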

  1. Automated protein motif generation in the structure-based protein function prediction tool ProMOL.

    PubMed

    Osipovitch, Mikhail; Lambrecht, Mitchell; Baker, Cameron; Madha, Shariq; Mills, Jeffrey L; Craig, Paul A; Bernstein, Herbert J

    2015-12-01

    ProMOL, a plugin for the PyMOL molecular graphics system, is a structure-based protein function prediction tool. ProMOL includes a set of routines for building motif templates that are used for screening query structures for enzyme active sites. Previously, each motif template was generated manually and required supervision in the optimization of parameters for sensitivity and selectivity. We developed an algorithm and workflow for the automation of motif building and testing routines in ProMOL. The algorithm uses a set of empirically derived parameters for optimization and requires little user intervention. The automated motif generation algorithm was first tested in a performance comparison with a set of manually generated motifs based on identical active sites from the same 112 PDB entries. The two sets of motifs were equally effective in identifying alignments with homologs and in rejecting alignments with unrelated structures. A second set of 296 active site motifs were generated automatically, based on Catalytic Site Atlas entries with literature citations, as an expansion of the library of existing manually generated motif templates. The new motif templates exhibited comparable performance to the existing ones in terms of hit rates against native structures, homologs with the same EC and Pfam designations, and randomly selected unrelated structures with a different EC designation at the first EC digit, as well as in terms of RMSD values obtained from local structural alignments of motifs and query structures. This research is supported by NIH grant GM078077. PMID:26573864
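
The RMSD values reported for motif-vs-query alignments are the standard root-mean-square deviation over corresponding atoms. For coordinate sets that have already been superposed:

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two already-aligned sets of
    3D coordinates. Assumes a one-to-one atom correspondence; the
    superposition step itself (e.g. Kabsch) is not shown here."""
    assert len(coords_a) == len(coords_b) and coords_a
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Hypothetical three-atom motif vs. a query site perturbed by 0.1 A per atom.
motif = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (0.0, 2.0, 0.0)]
query = [(0.1, 0.0, 0.0), (1.5, 0.1, 0.0), (0.0, 2.0, 0.1)]
value = rmsd(motif, query)
```

Low RMSD between a motif template and a local alignment in the query structure is what flags a putative active-site match.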

  2. Face acquisition camera design using the NV-IPM image generation tool

    NASA Astrophysics Data System (ADS)

    Howell, Christopher L.; Choi, Hee-Sue; Reynolds, Joseph P.

    2015-05-01

    In this paper, we demonstrate the utility of the Night Vision Integrated Performance Model (NV-IPM) image generation tool by using it to create a database of face images with controlled degradations. Available face recognition algorithms can then be used to directly evaluate camera designs using these degraded images. By controlling camera effects such as blur, noise, and sampling, we can analyze algorithm performance and establish a more complete performance standard for face acquisition cameras. The ability to accurately simulate imagery and directly test with algorithms not only improves the system design process but greatly reduces development cost.
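
Building a database of controlled degradations amounts to composing parameterized blur, noise, and sampling operators. A toy grayscale sketch of that idea (not the NV-IPM image-generation model itself):

```python
import random

def degrade(image, blur=True, noise_sigma=0.0, subsample=1, seed=0):
    """Apply controlled degradations to a grayscale image given as a
    list of rows: an optional 3x3 box blur (wrap-around borders),
    additive Gaussian noise, and integer subsampling. Seeded so a
    degraded database is reproducible."""
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    if blur:
        out = [[sum(image[(y + dy) % h][(x + dx) % w]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
                for x in range(w)] for y in range(h)]
    if noise_sigma > 0:
        out = [[p + rng.gauss(0.0, noise_sigma) for p in row] for row in out]
    return [row[::subsample] for row in out[::subsample]]

flat = [[10.0] * 8 for _ in range(8)]
degraded = degrade(flat, blur=True, noise_sigma=0.0, subsample=2)
```

Sweeping these parameters over a face database and scoring a recognition algorithm at each setting yields the performance-vs-degradation curves the paper uses to set camera requirements.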

  3. Customised 3D Printing: An Innovative Training Tool for the Next Generation of Orbital Surgeons.

    PubMed

    Scawn, Richard L; Foster, Alex; Lee, Bradford W; Kikkawa, Don O; Korn, Bobby S

    2015-01-01

    Additive manufacturing or 3D printing is the process by which three dimensional data fields are translated into real-life physical representations. 3D printers create physical printouts using heated plastics in a layered fashion resulting in a three-dimensional object. We present a technique for creating customised, inexpensive 3D orbit models for use in orbital surgical training using 3D printing technology. These models allow trainee surgeons to perform 'wet-lab' orbital decompressions and simulate upcoming surgeries on orbital models that replicate a patient's bony anatomy. We believe this represents an innovative training tool for the next generation of orbital surgeons. PMID:26121063

  4. Synthetic biology in mammalian cells: Next generation research tools and therapeutics

    PubMed Central

    Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A

    2014-01-01

    Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884

  5. Unexpected benefits of deciding by mind wandering

    PubMed Central

    Giblin, Colleen E.; Morewedge, Carey K.; Norton, Michael I.

    2013-01-01

The mind wanders, even when people are attempting to make complex decisions. We suggest that mind wandering—allowing one's thoughts to wander until the “correct” choice comes to mind—can positively impact people's feelings about their decisions. We compare post-choice satisfaction from choices made by mind wandering to reason-based choices and randomly assigned outcomes. Participants chose a poster by mind wandering or deliberating, or were randomly assigned a poster. Whereas forecasters predicted that participants who chose by mind wandering would evaluate their outcome as inferior to participants who deliberated (Experiment 1), participants who used mind wandering as a decision strategy evaluated their choice just as positively as did participants who used deliberation (Experiment 2). In some cases, it appears that people can spare themselves the effort of deliberation and instead “decide by mind wandering,” yet experience no decrease in satisfaction. PMID:24046760

  6. Experiences with the application of the ADIC automatic differentiation tool to the CSCMDO 3-D volume grid generation code

    SciTech Connect

    Bischof, C.H.; Mauer, A.; Jones, W.T.

    1995-12-31

    Automatic differentiation (AD) is a methodology for developing reliable sensitivity-enhanced versions of arbitrary computer programs with little human effort. It can vastly accelerate the use of advanced simulation codes in multidisciplinary design optimization, since the time for generating and verifying derivative codes is greatly reduced. In this paper, we report on the application of the recently developed ADIC automatic differentiation tool for ANSI C programs to the CSCMDO multiblock three-dimensional volume grid generator. The ADIC-generated code can easily be interfaced with Fortran derivative codes generated with the ADIFOR AD tool for Fortran 77 programs, thus providing efficient sensitivity-enhancement techniques for multilanguage, multidiscipline problems.
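
The core idea behind AD tools like ADIC and ADIFOR can be illustrated with a minimal forward-mode dual-number class in Python. This is a sketch of the mathematics only; the actual tools work by source-to-source transformation of C and Fortran programs, not by operator overloading:

```python
import math

class Dual:
    """Minimal forward-mode AD value: v + d*eps, with eps**2 == 0."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v + o.v, self.d + o.d)
    __radd__ = __add__
    def __mul__(self, o):
        # product rule carried alongside the value
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v * o.v, self.v * o.d + self.d * o.v)
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.v), math.cos(self.v) * self.d)

def derivative(f, x):
    """Evaluate f(x) and df/dx in a single sweep."""
    out = f(Dual(x, 1.0))
    return out.v, out.d

# Example: f(x) = x*x + sin(x), so f'(x) = 2x + cos(x)
val, grad = derivative(lambda x: x * x + x.sin(), 0.5)
```

The derivative emerges exactly (to machine precision), with no finite-difference step-size tuning, which is what makes AD attractive for sensitivity-enhanced simulation codes.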

  7. Gravity Waves Generated by Convection: A New Idealized Model Tool and Direct Validation with Satellite Observations

    NASA Astrophysics Data System (ADS)

    Alexander, M. Joan; Stephan, Claudia

    2015-04-01

    In climate models, gravity waves remain too poorly resolved to be directly modelled. Instead, simplified parameterizations are used to include gravity wave effects on model winds. A few climate models link some of the parameterized waves to convective sources, providing a mechanism for feedback between changes in convection and gravity wave-driven changes in circulation in the tropics and above high-latitude storms. These convective wave parameterizations are based on limited case studies with cloud-resolving models, but they are poorly constrained by observational validation, and tuning parameters have large uncertainties. Our new work distills results from complex, full-physics cloud-resolving model studies to essential variables for gravity wave generation. We use the Weather Research and Forecasting (WRF) model to study how precipitation, latent heating/cooling and other cloud properties relate to the spectrum of gravity wave momentum flux above midlatitude storm systems. Results show the gravity wave spectrum is surprisingly insensitive to the representation of microphysics in WRF. This is good news for use of these models for gravity wave parameterization development since microphysical properties are a key uncertainty. We further use the full-physics cloud-resolving model as a tool to directly link observed precipitation variability to gravity wave generation. We show that waves in an idealized model forced with radar-observed precipitation can quantitatively reproduce instantaneous satellite-observed features of the gravity wave field above storms, which is a powerful validation of our understanding of waves generated by convection. The idealized model directly links observations of surface precipitation to observed waves in the stratosphere, and the simplicity of the model permits deep/large-area domains for studies of wave-mean flow interactions. This unique validated model tool permits quantitative studies of gravity wave driving of regional

  8. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    NASA Astrophysics Data System (ADS)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunamis, because of the proximity between lakes, rivers or sea shores and potential instabilities. The concentration of population and infrastructure on water body shores and in downstream valleys could lead to catastrophic consequences. In order to assess this phenomenon comprehensively, together with the induced risks, we have developed a tool which allows the construction of the landslide geometry and is able to simulate its propagation, the generation and propagation of the wave and, eventually, the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is handled by the combination of the two latter sets of equations. The intensity map is based on the criterion for flooding in Switzerland provided by the OFEG and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less well-known cases, various failure plane geometries can be built automatically within a given range, and thus a multi-scenario approach is used. In any case, less well-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi

  9. Developing a tool to estimate water withdrawal and consumption in electricity generation in the United States.

    SciTech Connect

    Wu, M.; Peng, J.

    2011-02-24

    Freshwater consumption for electricity generation is projected to increase dramatically in the next couple of decades in the United States. The increased demand is likely to further strain freshwater resources in regions where water has already become scarce. Meanwhile, the automotive industry has stepped up its research, development, and deployment efforts on electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs). Large-scale, escalated production of EVs and PHEVs nationwide would require increased electricity production, and so meeting the water demand becomes an even greater challenge. The goal of this study is to provide a baseline assessment of freshwater use in electricity generation in the United States and at the state level. Freshwater withdrawal and consumption requirements for power generated from fossil, nonfossil, and renewable sources via various technologies and by use of different cooling systems are examined. A data inventory has been developed that compiles data from government statistics, reports, and literature issued by major research institutes. A spreadsheet-based model has been developed to conduct the estimates by means of a transparent and interactive process. The model further allows us to project future water withdrawal and consumption in electricity production under the forecasted increases in demand. This tool is intended to provide decision makers with the means to make a quick comparison among various fuel, technology, and cooling system options. The model output can be used to address water resource sustainability when considering new projects or expansion of existing plants.
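
A spreadsheet-style estimate of the kind described can be sketched in a few lines: multiply generation by per-MWh water-use factors for each fuel/cooling combination. The factor values below are illustrative placeholders, not the report's inventory data:

```python
# Illustrative (NOT authoritative) water-use factors, gallons per MWh:
# (fuel, cooling) -> (withdrawal, consumption)
FACTORS = {
    ("coal", "once-through"):   (27000, 140),
    ("coal", "recirculating"):  (600, 500),
    ("nuclear", "recirculating"): (1100, 670),
    ("gas-cc", "recirculating"): (250, 200),
}

def water_use(generation_mwh):
    """generation_mwh: {(fuel, cooling): MWh}. Returns (withdrawal, consumption) in gallons."""
    w = c = 0.0
    for key, mwh in generation_mwh.items():
        fw, fc = FACTORS[key]
        w += fw * mwh
        c += fc * mwh
    return w, c

withdrawal, consumption = water_use({("coal", "recirculating"): 1000.0,
                                     ("gas-cc", "recirculating"): 500.0})
```

Scenario comparison then amounts to re-running the estimate with a different generation mix, which is exactly the "quick comparison" use case the abstract describes.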

  10. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge-based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
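
Inductive rule generation from expert-classified examples can be illustrated with a toy one-attribute ("1R") learner: for each attribute, map every observed value to its majority label, and keep the attribute whose rule makes the fewest training errors. This is a sketch of the general idea, not the SDB's actual algorithm:

```python
from collections import Counter, defaultdict

def induce_rules(examples, labels):
    """1R rule induction over discrete attributes.
    Returns (best attribute index, {value: predicted label})."""
    best = None
    n_attrs = len(examples[0])
    for a in range(n_attrs):
        table = defaultdict(Counter)
        for x, y in zip(examples, labels):
            table[x[a]][y] += 1                      # count labels per value
        rule = {v: c.most_common(1)[0][0] for v, c in table.items()}
        errors = sum(y != rule[x[a]] for x, y in zip(examples, labels))
        if best is None or errors < best[0]:
            best = (errors, a, rule)
    return best[1], best[2]

# sensor readings classified by an expert as 'ok' / 'fault'
X = [("low", "hot"), ("low", "cold"), ("high", "hot"), ("high", "cold")]
y = ["fault", "fault", "ok", "ok"]
attr, rule = induce_rules(X, y)
```

The induced rule ("pressure low implies fault") is exactly the kind of human-readable knowledge-base entry a monitoring or advisory system can consume.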

  11. Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools

    PubMed Central

    Ozcan, Aydogan

    2014-01-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550

  12. New generation of medium wattage metal halide lamps and spectroscopic tools for their diagnostics

    NASA Astrophysics Data System (ADS)

    Dunaevsky, A.; Tu, J.; Gibson, R.; Steere, T.; Graham, K.; van der Eyden, J.

    2010-11-01

    A new generation of ceramic metal halide high intensity discharge (HID) lamps has achieved high efficiencies by implementing new design concepts. The shape of the ceramic burner is optimized to withstand high temperatures with minimal thermal stress. Corrosion processes at the ceramic walls are slowed down via adoption of non-aggressive metal halide chemistry. Light losses over life due to tungsten deposition on the walls are minimized by maintaining a self-cleaning chemical process, known as the tungsten cycle. All these advancements have made the new ceramic metal halide lamps comparable to high pressure sodium lamps in luminous efficacy, life, and maintenance while providing white light with high color rendering. Direct replacement of quartz metal halide lamps and systems results in energy savings of 18% up to 50%. High resolution spectroscopy remains the major non-destructive diagnostic tool for ceramic metal halide lamps. Approaches to reliable measurements of the relative partial pressures of the arc species are discussed.

  13. Numerical study of electromagnetic waves generated by a prototype dielectric logging tool

    USGS Publications Warehouse

    Ellefsen, K.J.; Abraham, J.D.; Wright, D.L.; Mazzella, A.T.

    2004-01-01

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than that in the formation (e.g., an air-filled borehole in the unsaturated zone), only a guided wave propagated along the borehole. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave radiated electromagnetic energy into the formation, causing its amplitude to decrease. When the propagation velocity in the borehole was less than that in the formation (e.g., a water-filled borehole in the saturated zone), both a refracted wave and a guided wave propagated along the borehole. The velocity of the refracted wave equaled the phase velocity of a plane wave in the formation, and the refracted wave preceded the guided wave. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave did not radiate electromagnetic energy into the formation. To analyze traces recorded by the prototype tool during laboratory tests, they were compared to traces calculated with the finite-difference method. The first parts of both the recorded and the calculated traces were similar, indicating that guided and refracted waves indeed propagated along the prototype tool. ?? 2004 Society of Exploration Geophysicists. All rights reserved.
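
The finite-difference, time-domain (FDTD) method used in the study can be sketched in its simplest one-dimensional form: interleaved (Yee) updates of the electric and magnetic fields in normalized units, with a Courant number of 0.5 and a hard sinusoidal source. The real simulations are of course full borehole/formation geometries:

```python
import math

def fdtd_1d(steps=150, n=200, src=100):
    """Bare-bones 1-D FDTD in free space (normalized units, Courant 0.5).
    Returns the electric field after `steps` updates."""
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for k in range(1, n):                 # E update from curl of H
            ez[k] += 0.5 * (hy[k - 1] - hy[k])
        ez[src] = math.sin(0.1 * t)           # hard sinusoidal source
        for k in range(n - 1):                # H update from curl of E
            hy[k] += 0.5 * (ez[k] - ez[k + 1])
    return ez

field = fdtd_1d()
```

Two wave trains propagate symmetrically away from the source at half a cell per step; adding material regions (borehole fluid versus formation) is a matter of making the update coefficients spatially varying.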

  14. Virtual Geographic Environments (VGEs): A New Generation of Geographic Analysis Tool

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Chen, Min; Lu, Guonian; Zhu, Qing; Gong, Jiahua; You, Xiong; Wen, Yongning; Xu, Bingli; Hu, Mingyuan

    2013-11-01

    Virtual Geographic Environments (VGEs) are proposed as a new generation of geographic analysis tool to contribute to human understanding of the geographic world and assist in solving geographic problems at a deeper level. The development of VGEs is focused on meeting the three scientific requirements of Geographic Information Science (GIScience) — multi-dimensional visualization, dynamic phenomenon simulation, and public participation. To provide a clearer image that improves user understanding of VGEs and to contribute to future scientific development, this article reviews several aspects of VGEs. First, the evolutionary process from maps to previous GISystems and then to VGEs is illustrated, with a particular focus on the reasons VGEs were created. Then, extended from the conceptual framework and the components of a complete VGE, three use cases are identified that together encompass the current state of VGEs at different application levels: 1) a tool for geo-object-based multi-dimensional spatial analysis and multi-channel interaction, 2) a platform for geo-process-based simulation of dynamic geographic phenomena, and 3) a workspace for multi-participant-based collaborative geographic experiments. Based on the above analysis, the differences between VGEs and other similar platforms are discussed to draw their clear boundaries. Finally, a short summary of the limitations of current VGEs is given, and future directions are proposed to facilitate ongoing progress toward forming a comprehensive version of VGEs.

  15. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    SciTech Connect

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan; Saha, Pradip; Loewen, Eric

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: Status of analysis model development; Improvements made to older simulations; and Comparison to experimental data.

  16. A Practical Comparison of De Novo Genome Assembly Software Tools for Next-Generation Sequencing Technologies

    PubMed Central

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Due to the poor knowledge about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide information on the adaptivity of each program and, above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering computational time, maximum random access memory (RAM) occupancy, assembly accuracy and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers would be more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, of which SOAPdenovo requires a complex configuration file to be created. Our comparison study will assist researchers in selecting a well-suited assembler and offers essential information for the improvement of existing assemblers or the development of novel assemblers. PMID:21423806
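
The De Bruijn graph approach favored for large short-read datasets can be sketched in a few lines: distinct k-mers become edges between (k-1)-mer nodes, and an Eulerian walk re-spells the genome. A toy, error-free illustration only; real assemblers such as SOAPdenovo add error correction, bubble popping and scaffolding on top of this core idea:

```python
from collections import defaultdict

def de_bruijn_assemble(reads, k):
    """Assemble a linear sequence from error-free reads via a De Bruijn graph."""
    kmers = {read[i:i + k] for read in reads
             for i in range(len(read) - k + 1)}
    graph = defaultdict(list)
    indeg = defaultdict(int)
    for kmer in kmers:                      # edge: prefix -> suffix
        graph[kmer[:-1]].append(kmer[1:])
        indeg[kmer[1:]] += 1
    # a linear genome starts at the node with out-degree > in-degree
    start = next(n for n in list(graph) if len(graph[n]) > indeg[n])
    stack, path = [start], []
    while stack:                            # iterative Hierholzer's algorithm
        node = stack[-1]
        if graph[node]:
            stack.append(graph[node].pop())
        else:
            path.append(stack.pop())
    path.reverse()
    return path[0] + "".join(n[-1] for n in path[1:])

genome = "ATGGCGTGCA"
reads = [genome[i:i + 5] for i in range(len(genome) - 4)]  # error-free 5-mers
assembled = de_bruijn_assemble(reads, k=4)
```

The memory cost of storing all k-mers is exactly the RAM-occupancy axis along which the paper compares the graph-based tools.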

  17. Recombineering strategies for developing next generation BAC transgenic tools for optogenetics and beyond

    PubMed Central

    Ting, Jonathan T.; Feng, Guoping

    2014-01-01

    The development and application of diverse BAC transgenic rodent lines has enabled rapid progress for precise molecular targeting of genetically-defined cell types in the mammalian central nervous system. These transgenic tools have played a central role in the optogenetic revolution in neuroscience. Indeed, an overwhelming proportion of studies in this field have made use of BAC transgenic Cre driver lines to achieve targeted expression of optogenetic probes in the brain. In addition, several BAC transgenic mouse lines have been established for direct cell-type specific expression of Channelrhodopsin-2 (ChR2). While the benefits of these new tools largely outweigh any accompanying challenges, many available BAC transgenic lines may suffer from confounds due in part to increased gene dosage of one or more “extra” genes contained within the large BAC DNA sequences. Here we discuss this under-appreciated issue and propose strategies for developing the next generation of BAC transgenic lines that are devoid of extra genes. Furthermore, we provide evidence that these strategies are simple, reproducible, and do not disrupt the intended cell-type specific transgene expression patterns for several distinct BAC clones. These strategies may be widely implemented for improved BAC transgenesis across diverse disciplines. PMID:24772073

  18. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in management of systems. For monitoring, diagnosis, and/or controlling purposes, circuit or systems diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus the status of the system itself can be displayed. The Living Color Frame Maker is user-friendly, with graphical interfaces, and provides on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette.
The contents of the diskette are compressed using the PKWARE archiving tools

  19. gLab: A Fully Software Tool to Generate, Process and Analyze GNSS Signals

    NASA Astrophysics Data System (ADS)

    Dionisio, Cesare; Citterico, Dario; Pirazzi, Gabriele; De Quattro, Nicola; Marracci, Riccardo; Cucchi, Luca; Valdambrini, Nicola; Formaioni, Irene

    2010-08-01

    In this paper the concept of Software Defined Radio (SDR) and its use in modern GNSS receivers is highlighted, demonstrating how software receivers are important in many situations, especially for verification and validation. After a brief introduction to gLab, a fully software, highly modular tool to generate, process and analyze current and future GNSS signals, the different software modules are described. Demonstrating the wide range of uses of gLab, different practical examples are briefly overviewed: from the analysis of real data over the experimental GIOVE-B satellite, to antenna group delay determination, or C/N0 estimation under a wide dynamic range, etc. gLab is the result of different projects led by Intecs in GNSS SW Radio: the signal generator is the result of the SWAN (Sistemi softWare per Applicazioni di Navigazione) project under Italian Space Agency (ASI) contract, while the analyzer and the processing module have been developed for ESA to verify and validate the IOV (In Orbit Validation) Galileo phase. In this case the GNSS SW RX works in parallel with Test User Receivers (TUR) in order to validate the Signal In Space (SiS). It is remarkable that gLab is the result of over three years of development and approximately one year of test and validation under ESA (European Space Agency) supervision.
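
A GNSS signal generator of this kind must, at minimum, produce the GPS C/A spreading codes. Below is a sketch of the standard two-LFSR Gold code generator (register taps per IS-GPS-200); gLab's internal implementation is not described in the paper, so this is a generic illustration:

```python
def ca_code(prn_taps, n_chips=1023):
    """GPS C/A (Gold) code from two 10-stage LFSRs.
    prn_taps: pair of G2 stages whose XOR forms the PRN-specific output
    (stages 2 and 6 select PRN 1)."""
    g1 = [1] * 10          # index i holds register bit i+1, all-ones init
    g2 = [1] * 10
    chips = []
    for _ in range(n_chips):
        g2out = g2[prn_taps[0] - 1] ^ g2[prn_taps[1] - 1]
        chips.append(g1[9] ^ g2out)                  # chip = G1 bit10 XOR G2 selection
        g1 = [g1[2] ^ g1[9]] + g1[:9]                # G1 feedback: stages 3, 10
        g2 = [g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]] + g2[:9]  # stages 2,3,6,8,9,10
    return chips

prn1 = ca_code((2, 6))     # PRN 1; first 10 chips are octal 1440
```

Modulating these chips onto a carrier and correlating against received samples is the essence of the signal generation and processing modules a software receiver chains together.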

  20. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    NASA Astrophysics Data System (ADS)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The last generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code, runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
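
The ray-tracing approach (object points as sources of spherical waves) amounts to summing e^{ikr}/r contributions over the hologram plane and binarizing the interference with a reference wave. A minimal sketch with illustrative units and geometry, not the authors' tool:

```python
import math

def binary_cgh(points, nx=64, ny=64, wavelength=0.633, pitch=5.0):
    """Binary CGH frame: 1 where the real part of the summed point-source
    field (interfering with an on-axis plane reference) is positive.
    Units are microns and purely illustrative."""
    k = 2 * math.pi / wavelength
    frame = []
    for j in range(ny):
        row = []
        for i in range(nx):
            x, y = (i - nx / 2) * pitch, (j - ny / 2) * pitch
            re = 0.0
            for (px, py, pz) in points:
                r = math.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2)
                re += math.cos(k * r) / r      # spherical wave ~ e^{ikr}/r, real part
            row.append(1 if re > 0 else 0)     # binarize for the printer
        frame.append(row)
    return frame

# two object points ~5 mm behind the hologram plane (hypothetical layout)
hologram = binary_cgh([(0.0, 0.0, 5000.0), (100.0, 50.0, 5000.0)])
```

Each 0/1 cell maps directly to an opaque/transparent printer dot, which is why printer resolution is the limiting factor the abstract emphasizes.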

  1. Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool

    NASA Astrophysics Data System (ADS)

    Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin

    2016-02-01

    The majority of existing scheduling techniques are based on static demand and deterministic processing times, while most job shop scheduling problems are concerned with dynamic demand and stochastic processing times. As a consequence, the solutions obtained from traditional scheduling techniques are ineffective wherever changes occur in the system. Therefore, this research intends to develop a decision support tool (DST), based on promising artificial intelligence techniques, that is able to accommodate the dynamics that regularly occur in job shop scheduling problems. The DST was designed through three phases, i.e. (i) look-up table generation, (ii) inverse model development and (iii) integration of the DST components. This paper reports the generation of look-up tables for various scenarios as a part of the development of the DST. A discrete event simulation model was used to compare the performance of the SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measures of various scheduling scenarios and the job order requirements will be mapped using an ANN inverse model.
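
The discrete-event comparison of dispatching rules can be sketched for a single machine: simulate job selection under each rule and compare mean flow time. This toy is far simpler than the paper's job shop model, and the job data are invented:

```python
def mean_flow_time(jobs, rule):
    """Single-machine dispatching simulation.
    jobs: list of (arrival, processing_time, due_date). Returns mean flow time."""
    t, flows = 0.0, []
    pending = sorted(jobs)          # by arrival time
    queue = []
    while pending or queue:
        while pending and pending[0][0] <= t:
            queue.append(pending.pop(0))
        if not queue:               # machine idle: jump to next arrival
            t = pending[0][0]
            continue
        key = {"SPT": lambda j: j[1],    # shortest processing time
               "EDD": lambda j: j[2],    # earliest due date
               "FCFS": lambda j: j[0]}[rule]
        job = min(queue, key=key)
        queue.remove(job)
        t += job[1]
        flows.append(t - job[0])
    return sum(flows) / len(flows)

jobs = [(0, 5, 12), (0, 2, 6), (1, 8, 30), (2, 1, 4)]
spt = mean_flow_time(jobs, "SPT")
fcfs = mean_flow_time(jobs, "FCFS")
```

SPT provably minimizes mean flow time on a single machine, which is the kind of rule-versus-measure relationship the look-up tables record scenario by scenario.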

  2. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damage, depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take into account different biological conditions. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in the Monte Carlo simulation. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels representing heterochromatin and euchromatin. The fibres were imported into a Geant4 application where computations were performed to estimate the influence of the DNA compaction on the amount of calculated DNA damage. The relative difference of the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, then the relative difference is more important for low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular levels.

  3. Evaluating an image-fusion algorithm with synthetic-image-generation tools

    NASA Astrophysics Data System (ADS)

    Gross, Harry N.; Schott, John R.

    1996-06-01

    An algorithm that combines spectral mixing and nonlinear optimization is used to fuse multiresolution images. Image fusion merges images of different spatial and spectral resolutions to create a high spatial resolution multispectral combination. High spectral resolution allows identification of materials in the scene, while high spatial resolution locates those materials. In this algorithm, conventional spectral mixing estimates the percentage of each material (called endmembers) within each low resolution pixel. Three spectral mixing models are compared; unconstrained, partially constrained, and fully constrained. In the partially constrained application, the endmember fractions are required to sum to one. In the fully constrained application, all fractions are additionally required to lie between zero and one. While negative fractions seem inappropriate, they can arise from random spectral realizations of the materials. In the second part of the algorithm, the low resolution fractions are used as inputs to a constrained nonlinear optimization that calculates the endmember fractions for the high resolution pixels. The constraints mirror the low resolution constraints and maintain consistency with the low resolution fraction results. The algorithm can use one or more higher resolution sharpening images to locate the endmembers to high spatial accuracy. The algorithm was evaluated with synthetic image generation (SIG) tools. A SIG developed image can be used to control the various error sources that are likely to impair the algorithm performance. These error sources include atmospheric effects, mismodeled spectral endmembers, and variability in topography and illumination. By controlling the introduction of these errors, the robustness of the algorithm can be studied and improved upon. The motivation for this research is to take advantage of the next generation of multi/hyperspectral sensors. Although the hyperspectral images will be of modest to low resolution
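
For two endmembers under the sum-to-one constraint, the spectral mixing step has a closed-form least-squares solution, and the "fully constrained" variant simply clips the fraction into [0, 1]. A sketch with made-up spectra; real unmixing uses many bands and endmembers and a constrained optimizer:

```python
def unmix_two(pixel, e1, e2, constrained=True):
    """Sum-to-one two-endmember linear unmixing: pixel ~ f*e1 + (1-f)*e2.
    Closed-form least squares for f; 'fully constrained' clips f to [0, 1]."""
    d = [a - b for a, b in zip(e1, e2)]
    num = sum((p - b) * di for p, b, di in zip(pixel, e2, d))
    den = sum(di * di for di in d)
    f = num / den
    if constrained:
        f = min(1.0, max(0.0, f))
    return f

# toy 3-band spectra for hypothetical 'soil' and 'vegetation' endmembers
soil = [0.30, 0.35, 0.40]
veg = [0.05, 0.45, 0.10]
# noise-free 60/40 mixture of the two
mixed = [0.30 * 0.6 + 0.05 * 0.4, 0.35 * 0.6 + 0.45 * 0.4, 0.40 * 0.6 + 0.10 * 0.4]
frac = unmix_two(mixed, soil, veg)
```

Without the clip, noisy pixels can yield the negative or greater-than-one fractions the abstract mentions; the clip is what distinguishes the fully constrained model from the partially constrained one.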

  4. DDBJ launches a new archive database with analytical tools for next-generation sequence data.

    PubMed

    Kaminuma, Eli; Mashima, Jun; Kodama, Yuichi; Gojobori, Takashi; Ogasawara, Osamu; Okubo, Kousaku; Takagi, Toshihisa; Nakamura, Yasukazu

    2010-01-01

    The DNA Data Bank of Japan (DDBJ) (http://www.ddbj.nig.ac.jp) has collected and released 1,701,110 entries/1,116,138,614 bases between July 2008 and June 2009. A few highlighted data releases from DDBJ were the complete genome sequence of an endosymbiont within protist cells in the termite gut and Cap Analysis Gene Expression tags for human and mouse deposited from the Functional Annotation of the Mammalian cDNA consortium. In this period, we started a novel user announcement service using Really Simple Syndication (RSS) to deliver a list of data released from DDBJ on a daily basis. Comprehensive visualization of a DDBJ release data was attempted by using a word cloud program. Moreover, a new archive for sequencing data from next-generation sequencers, the 'DDBJ Read Archive' (DRA), was launched. Concurrently, for read data registered in DRA, a semi-automatic annotation tool called the 'DDBJ Read Annotation Pipeline' was released as a preliminary step. The pipeline consists of two parts: basic analysis for reference genome mapping and de novo assembly and high-level analysis of structural and functional annotations. These new services will aid users' research and provide easier access to DDBJ databases. PMID:19850725

  5. Diagnostic tool for red blood cell membrane disorders: Assessment of a new generation ektacytometer.

    PubMed

    Da Costa, Lydie; Suner, Ludovic; Galimand, Julie; Bonnel, Amandine; Pascreau, Tiffany; Couque, Nathalie; Fenneteau, Odile; Mohandas, Narla

    2016-01-01

    Inherited red blood cell (RBC) membrane disorders, such as hereditary spherocytosis, elliptocytosis and hereditary ovalocytosis, result from mutations in genes encoding various RBC membrane and skeletal proteins. The RBC membrane, a composite structure composed of a lipid bilayer linked to a spectrin/actin-based membrane skeleton, confers upon the RBC unique features of deformability and mechanical stability. The disease severity is primarily dependent on the extent of membrane surface area loss. RBC membrane disorders can be readily diagnosed by various laboratory approaches that include RBC cytology, flow cytometry, ektacytometry, electrophoresis of RBC membrane proteins and genetics. The reference technique for diagnosis of RBC membrane disorders is osmotic gradient ektacytometry. However, in spite of its recognition as the reference technique, it is rarely used as a routine diagnostic tool for RBC membrane disorders due to its limited availability. This may soon change, as a new generation of ektacytometer has recently been engineered. In this review, we describe the workflow of the samples shipped to our Hematology laboratory for RBC membrane disorder analysis and the data obtained for a large cohort of French patients presenting with RBC membrane disorders using a newly available version of the ektacytometer. PMID:26603718
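
Ektacytometry reports deformability as an elongation index computed from the axes of the laser diffraction ellipse, and an osmotic gradient scan is commonly summarized by a few landmark points. A sketch with an invented curve; the half-of-EImax convention for the hypertonic landmark is one common choice, used here as an assumption:

```python
def elongation_index(a, b):
    """Elongation index from the diffraction ellipse axes: EI = (A - B) / (A + B)."""
    return (a - b) / (a + b)

def osmoscan_summary(curve):
    """curve: list of (osmolality, EI), in increasing osmolality.
    Returns (EI_max, O_min, O_hyper): maximal deformability, the osmolality
    of minimal EI on the hypotonic arm, and the hypertonic osmolality where
    EI falls to half of EI_max."""
    ei_max, o_at_max = max((ei, o) for o, ei in curve)
    o_min = min((ei, o) for o, ei in curve if o < o_at_max)[1]
    half = ei_max / 2
    o_hyper = next(o for o, ei in curve if o > o_at_max and ei <= half)
    return ei_max, o_min, o_hyper

# invented osmoscan data: (mOsm/kg, EI)
curve = [(100, 0.2), (150, 0.1), (200, 0.3), (250, 0.6),
         (300, 0.5), (400, 0.3), (500, 0.25)]
ei_max, o_min, o_hyper = osmoscan_summary(curve)
```

Shifts of these landmarks relative to healthy controls (e.g. reduced EI_max, shifted O_min) are what distinguish spherocytosis from elliptocytosis on the scan.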

  6. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically, it aims to build a high-fidelity tabletop model that can be used for risk mitigation, failure mode analysis, contamination tracking, and reliability testing. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. It envisions a 10% scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched-index-of-refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities of 67% of full scale, and thereby a model time scale of 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index-matching fluid (a fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model (with complex internal geometry) transparent and hence conducive to non-intrusive optical diagnostics. Using such a system, one can test environment control parameters such as core flows (axial flows), cross flows (from registers and diffusers), and potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels; test mixing processes within the system at local nodes or compartments; and assess the overall system performance. The system allows quantitative measurements of contaminants introduced in the system and allows testing and optimizing the tracking and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration change and subsequent testing.
The data
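    The similitude figures quoted in this abstract (model velocity 67% of full scale, time scale 15%) follow from Reynolds-number matching between air at full scale and water in the 1:10 model. A minimal check of that arithmetic, using assumed textbook kinematic viscosities (~20 °C) rather than values from the record:

```python
# Reynolds-number similitude for the 1:10 water-in-acrylic model.
# Viscosities are assumed textbook values, not taken from the abstract.
nu_air = 1.5e-5    # m^2/s, kinematic viscosity of air
nu_water = 1.0e-6  # m^2/s, kinematic viscosity of water
scale = 0.1        # model length / full-scale length (1 ft vs 10 ft)

# Matching Re: V_m * L_m / nu_water = V_f * L_f / nu_air
velocity_ratio = (nu_water / nu_air) / scale   # V_m / V_f
# Convective time scale t ~ L / V
time_ratio = scale / velocity_ratio            # t_m / t_f

print(round(velocity_ratio, 2))  # 0.67 -> model velocity ~67% of full scale
print(round(time_ratio, 2))      # 0.15 -> model time ~15% of full-scale time
```

Both numbers reproduce the abstract's claims, which is why a single working fluid swap (air to water) buys both optical access and a faster-than-real-time model.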

  7. An accurate tool for the fast generation of dark matter halo catalogues

    NASA Astrophysics Data System (ADS)

    Monaco, P.; Sefusatti, E.; Borgani, S.; Crocce, M.; Fosalba, P.; Sheth, R. K.; Theuns, T.

    2013-08-01

    We present a new parallel implementation of the PINpointing Orbit Crossing-Collapsed HIerarchical Objects (PINOCCHIO) algorithm, a quick tool, based on Lagrangian Perturbation Theory, for the hierarchical build-up of dark matter (DM) haloes in cosmological volumes. To assess its ability to predict halo correlations on large scales, we compare its results with those of an N-body simulation of a 3 h⁻¹ Gpc box sampled with 2048³ particles taken from the MICE suite, matching the same seeds for the initial conditions. Thanks to the Fastest Fourier Transforms in the West (FFTW) libraries and to the relatively simple design, the code shows very good scaling properties. The CPU time required by PINOCCHIO is a tiny fraction (˜1/2000) of that required by the MICE simulation. Varying some of PINOCCHIO's numerical parameters allows one to produce a universal mass function that lies in the range allowed by published fits, although it underestimates the MICE mass function of Friends-of-Friends (FoF) haloes in the high-mass tail. We compare the matter-halo and the halo-halo power spectra with those of the MICE simulation and find that these two-point statistics are well recovered on large scales. In particular, when catalogues are matched in number density, agreement within 10 per cent is achieved for the halo power spectrum. At scales k > 0.1 h Mpc⁻¹, the inaccuracy of the Zel'dovich approximation in locating halo positions causes an underestimate of the power spectrum that can be modelled as a Gaussian factor with a damping scale of d = 3 h⁻¹ Mpc at z = 0, decreasing at higher redshift. Finally, a remarkable match is obtained for the reduced halo bispectrum, showing a good description of non-linear halo bias. Our results demonstrate the potential of PINOCCHIO as an accurate and flexible tool for generating large ensembles of mock galaxy surveys, with interesting applications for the analysis of large galaxy redshift surveys.
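    The abstract describes the small-scale power deficit as a Gaussian factor with damping scale d = 3 Mpc/h at z = 0. A sketch of applying such a correction; the exact functional form P(k)·exp(-(kd)²) and the stand-in power spectrum are assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical Gaussian damping of a halo power spectrum:
# P_damped(k) = P(k) * exp(-(k * d)**2), with d = 3 Mpc/h at z = 0.
d = 3.0                               # damping scale, Mpc/h (z = 0)
k = np.array([0.05, 0.1, 0.2, 0.5])   # wavenumbers, h/Mpc
p_halo = 1.0 / k                      # stand-in power spectrum, illustrative

suppression = np.exp(-(k * d) ** 2)
p_damped = p_halo * suppression

# Suppression is negligible on large scales (small k) and strong for
# k > 0.1 h/Mpc, matching the regime where the abstract reports the deficit.
print(suppression)
```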

  8. Re-Imagining Specialized STEM Academies: Igniting and Nurturing "Decidedly Different Minds", by Design

    ERIC Educational Resources Information Center

    Marshall, Stephanie Pace

    2010-01-01

    This article offers a personal vision and conceptual design for reimagining specialized science, technology, engineering, and mathematics (STEM) academies designed to nurture "decidedly different" STEM minds and ignite a new generation of global STEM talent, innovation, and entrepreneurial leadership. This design enables students to engage…

  9. Existing computer applications, maintain or redesign: how to decide

    SciTech Connect

    Brice, L.

    1981-01-01

    Maintenance of large applications programs is an aspect of performance management that has been largely ignored by those studies that attempt to bring structure to the software production environment. Maintenance in this paper means: fixing bugs, modifying current design features, adding enhancements, and porting applications to other computer systems. It is often difficult to decide whether to maintain or redesign. One reason for the difficulty is that good models and methods do not exist for differentiating between those programs that should be maintained and those that should be redesigned. This enigma is illustrated by the description of a large application case study. The application was monitored for maintenance effort, thereby providing some insight into the redesign/maintain decision. Those tools which currently exist for the collection and measurement of performance data are highlighted. Suggestions are then made for yet other categories of data, difficult to collect and measure, yet ultimately necessary for the establishment of accurate predictions about the value of maintaining versus the value of redesigning. Finally, it is concluded that this aspect of performance management deserves increased attention in order to establish better guidelines with which to aid management in making the necessary but difficult decision: maintain or redesign.

  10. Career Cruising Impact on the Self Efficacy of Deciding Majors

    ERIC Educational Resources Information Center

    Smother, Anthony William

    2012-01-01

    The purpose of this study was to analyze the impact of "Career Cruising"© on the self-efficacy of deciding majors in a university setting. The self-assessment instrument "Career Cruising"© was used to measure career decision-making self-efficacy in a pre- and post-test with deciding majors. The independent…

  11. Tools for Generating Useful Time-series Data from PhenoCam Images

    NASA Astrophysics Data System (ADS)

    Milliman, T. E.; Friedl, M. A.; Frolking, S.; Hufkens, K.; Klosterman, S.; Richardson, A. D.; Toomey, M. P.

    2012-12-01

    The PhenoCam project (http://phenocam.unh.edu/) is tasked with acquiring, processing, and archiving digital repeat photography to be used for scientific studies of vegetation phenological processes. Over the past 5 years the PhenoCam project has collected over 2 million time-series images for a total of over 700 GB of image data. Several papers have been published describing derived "vegetation indices" (such as green-chromatic-coordinate or gcc) which can be compared to standard measures such as NDVI or EVI. Imagery from our archive is available for download but converting series of images for a particular camera into useful scientific data, while simple in principle, is complicated by a variety of factors. Cameras are often exposed to harsh weather conditions (high wind, rain, ice, snow pile up), which result in images where the field of view (FOV) is partially obscured or completely blocked for periods of time. The FOV can also change for other reasons (mount failures, tower maintenance, etc.). Some of the relatively inexpensive cameras that are being used can also temporarily lose color balance or exposure controls resulting in loss of imagery. All these factors negatively influence the automated analysis of the image time series making this a non-trivial task. Here we discuss the challenges of processing PhenoCam image time-series for vegetation monitoring and the associated data management tasks. We describe our current processing framework and a simple standardized output format for the resulting time-series data. The time-series data in this format will be generated for specific "regions of interest" (ROI's) for each of the cameras in the PhenoCam network. This standardized output (which will be updated daily) can be considered 'the pulse' of a particular camera and will provide a default phenological dynamic for said camera. The time-series data can also be viewed as a higher level product which can be used to generate "vegetation indices", like gcc, for
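    The green chromatic coordinate mentioned in this record is conventionally computed per pixel as gcc = G/(R+G+B) and averaged over a region of interest. A minimal sketch; the function name is illustrative and not part of the PhenoCam toolchain:

```python
import numpy as np

def green_chromatic_coordinate(rgb):
    """Mean gcc over an image region: gcc = G / (R + G + B) per pixel.
    `rgb` is an (H, W, 3) array of camera digital numbers."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2)
    safe_total = np.where(total > 0, total, 1.0)  # avoid divide-by-zero
    gcc = np.where(total > 0, rgb[..., 1] / safe_total, 0.0)
    return gcc.mean()

# Toy 1x2 ROI: one pure-green pixel and one grey pixel.
roi = np.array([[[0, 255, 0], [100, 100, 100]]], dtype=np.uint8)
print(green_chromatic_coordinate(roi))  # (1.0 + 1/3) / 2 = 0.666...
```

In practice the ROI masking and per-day aggregation described in the abstract sit on top of exactly this per-pixel calculation.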

  12. Systems Prototyping with Fourth Generation Tools: One Answer to the Productivity Puzzle? AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis A.

    The development of information systems using an engineering approach employing both traditional programming techniques and nonprocedural languages is described. A fourth generation application tool is used to develop a prototype system that is revised and expanded as the user clarifies individual requirements. When fully defined, a combination of…

  13. Generating Animal and Tool Names: An fMRI Study of Effective Connectivity

    ERIC Educational Resources Information Center

    Vitali, P.; Abutalebi, J.; Tettamanti, M.; Rowe, J.; Scifo, P.; Fazio, F.; Cappa, S.F.; Perani, D.

    2005-01-01

    The present fMRI study of semantic fluency for animal and tool names provides further evidence for category-specific brain activations, and reports task-related changes in effective connectivity among defined cerebral regions. Two partially segregated systems of functional integration were highlighted: the tool condition was associated with an…

  14. Future generations of horizontal tools will make tighter turns and last longer

    SciTech Connect

    Lyle, D.

    1995-10-01

    Operators want horizontal tools that turn tighter and last longer, and manufacturers are working to meet the need. An operator needs control of tools in the hole to drill a good horizontal well, and service and supply companies are trying to improve that control.

  15. HAPCAD: An open-source tool to detect PCR crossovers in next-generation sequencing generated HLA data.

    PubMed

    McDevitt, Shana L; Bredeson, Jessen V; Roy, Scott W; Lane, Julie A; Noble, Janelle A

    2016-03-01

    Next-generation sequencing (NGS) based HLA genotyping can generate PCR artifacts corresponding to IMGT/HLA Database alleles, for which multiple examples have been observed, including sequence corresponding to the HLA-DRB1*03:42 allele. Repeat genotyping of 131 samples, previously genotyped as DRB1*03:01 homozygotes using probe-based methods, resulted in the heterozygous call DRB1*03:01+DRB1*03:42. The apparently rare DRB1*03:42 allele is hypothesized to be a "hybrid amplicon" generated by PCR crossover, a process in which a partial PCR product denatures from its template, anneals to a different allele template, and extends to completion. Unlike most PCR crossover products, a "hybrid amplicon" always corresponds to an IMGT/HLA Database allele, necessitating a case-by-case analysis of whether its occurrence reflects the actual allele or is simply the result of PCR crossover. The Hybrid Amplicon/PCR Crossover Artifact Detector (HAPCAD) program mimics jumping PCR in silico and flags allele sequences that may also be generated as hybrid amplicons. PMID:26802209
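    The jumping-PCR mechanism described here can be mimicked with a toy single-crossover model: take every prefix of one allele and finish it on the other. The function name and the single-crossover simplification are illustrative assumptions, not HAPCAD's actual API:

```python
def hybrid_amplicons(allele_a, allele_b, min_prefix=1):
    """Toy single-crossover model of 'jumping PCR': a partial copy of
    allele_a denatures, anneals to allele_b, and extends to completion.
    Returns chimeric sequences matching neither parent."""
    n = min(len(allele_a), len(allele_b))
    hybrids = set()
    for i in range(min_prefix, n):
        hybrids.add(allele_a[:i] + allele_b[i:])
    hybrids.discard(allele_a)
    hybrids.discard(allele_b)
    return hybrids

# Two toy 'alleles' differing at two sites; a crossover between the sites
# yields a chimera that could be miscalled as a third allele.
print(hybrid_amplicons("ACGTA", "ACCTG"))  # {'ACGTG'}
```

A real detector would additionally check each chimera against the IMGT/HLA Database, since only chimeras matching a catalogued allele cause the miscall problem the abstract describes.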

  16. deepTools2: a next generation web server for deep-sequencing data analysis

    PubMed Central

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-01-01

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. PMID:27079975

  17. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-01

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. PMID:27079975

  18. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation; we call it Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of the finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is segmentation of CT medical images (DICOM) to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to create a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions, and to solve the FE equations. The tool uses the PAK solver, open-source software implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  19. A surface data generation method of optical micro-structure and analysis system for Fast Tool Servo fabricating

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Dai, Yi-fan; Wan, Fei; Wang, Gui-lin

    2010-10-01

    High-precision optical micro-structured components are now widely used in both military and civilian fields. Ultra-precision machining with a fast tool servo (FTS) is one of the leading methodologies for the fabrication of such surfaces. The first important issue faced in ultra-precision, high-efficiency fabrication is how to properly describe the complex shapes based on the principle of FTS. To meet the demands of FTS machining for high-frequency tool response, high data throughput, and large memory space, an off-line discrete data point generation method for micro-structure surfaces is presented, which avoids on-line shape calculation during fabrication. A new analysis software package was developed to compute the speed, acceleration, and spectrum over the generated data points, which helps to analyze the tool tracking characteristics needed in fabrication. A new mechanism for FTS machining data transmission based on a high-capacity storage device is also proposed. Experiments show that the off-line surface data generation method and data transfer mechanism can effectively improve FTS fabrication efficiency, and that the surface analysis software can help to determine the machining capability of the tool holder and to guide the optimization of processing parameters such as spindle speed and feed rate.
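    The off-line approach described in this record amounts to pre-sampling the stroke command around each spindle revolution and differentiating it to check the tool's speed and acceleration demands. A minimal sketch under stated assumptions: the function name, the sinusoidal test surface, and the finite-difference analysis are illustrative, not the paper's implementation:

```python
import numpy as np

def fts_stroke_samples(surface_z, radius, rpm, samples_per_rev):
    """Off-line sampling of the FTS stroke command over one spindle
    revolution at a fixed tool radius, plus speed and acceleration by
    finite differences."""
    dt = 60.0 / (rpm * samples_per_rev)            # seconds between samples
    theta = np.arange(samples_per_rev) * 2.0 * np.pi / samples_per_rev
    z = surface_z(theta, radius)                   # stroke demand (m)
    v = np.gradient(z, dt)                         # stroke velocity (m/s)
    a = np.gradient(v, dt)                         # stroke acceleration (m/s^2)
    return z, v, a

# Micro-structure with 8 sinusoidal lobes per revolution, 5 um amplitude.
z, v, a = fts_stroke_samples(lambda th, r: 5e-6 * np.sin(8 * th),
                             radius=5e-3, rpm=1200, samples_per_rev=720)
print(z.size)  # 720 samples for one revolution
```

Peak |v| here is amplitude × lobe count × spindle angular rate, which is exactly the kind of tool-holder bandwidth check the abstract's analysis software performs.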

  20. Conceptual Systems Model as a Tool for Hypothesis Generation and Testing in Ecotoxicological Research

    EPA Science Inventory

    Microarray, proteomic, and metabonomic technologies are becoming increasingly accessible as tools for ecotoxicology research. Effective use of these technologies will depend, at least in part, on the ability to apply these techniques within a paradigm of hypothesis driven researc...

  1. Generated spiral bevel gears - Optimal machine-tool settings and tooth contact analysis

    NASA Technical Reports Server (NTRS)

    Litvin, F. L.; Tsung, W.-J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  2. Generated spiral bevel gears: Optimal machine-tool settings and tooth contact analysis

    NASA Technical Reports Server (NTRS)

    Litvin, F. L.; Tsung, W. J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  3. A Study on Tooling and Its Effect on Heat Generation and Mechanical Properties of Welded Joints in Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Tikader, Sujoy; Biswas, Pankaj; Puri, Asit Baran

    2016-06-01

    Friction stir welding (FSW) has become one of the most attractive solid-state welding processes, as it offers numerous advantages such as good mechanical and metallurgical properties. Aluminium alloys that are difficult to weld by fusion processes, such as the 5XXX and 7XXX series, can be readily joined by this process. In the present study, a mathematical model was developed and experiments were performed to evaluate the mechanical properties of FSW joints in similar aluminium alloys (AA1100) for different process parameters and two kinds of tool geometry (straight cylindrical and tapered/conical pins, each with a flat shoulder). Tensile strength and micro-hardness for different process parameters are reported for the welded plate samples. It was observed that in FSW of similar alloys with a tool made of SS-310 tool steel, friction is the major contributor to heat generation. Tool geometry, tool rotational speed, plunging force, and traverse speed were found to have a significant effect on the tensile strength and hardness of friction stir welded joints.
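    A common first-order estimate of the frictional heat the abstract attributes to the shoulder is Q = (2/3)·π·μ·p·ω·R³ for a flat shoulder under uniform pressure in pure sliding. This is a standard textbook relation, not necessarily the paper's own model, and all numbers below are illustrative:

```python
import math

def fsw_shoulder_heat(mu, pressure, omega_rpm, radius):
    """Frictional heat rate (W) under a flat FSW shoulder assuming pure
    sliding and uniform contact pressure: Q = (2/3)*pi*mu*p*omega*R^3."""
    omega = omega_rpm * 2.0 * math.pi / 60.0   # rev/min -> rad/s
    return (2.0 / 3.0) * math.pi * mu * pressure * omega * radius ** 3

# Illustrative inputs: mu = 0.4, 50 MPa contact pressure,
# 1000 rpm, 10 mm shoulder radius.
q = fsw_shoulder_heat(0.4, 50e6, 1000.0, 0.010)
print(f"{q:.0f} W")  # a few kW of frictional heating
```

The R³ dependence is why shoulder geometry dominates heat input, consistent with the abstract's finding that tool geometry and rotational speed strongly affect joint properties.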

  4. Generator program for computer-assisted instruction: MACGEN. A software tool for generating computer-assisted instructional texts.

    PubMed

    Utsch, M J; Ingram, D

    1983-01-01

    This publication describes MACGEN, an interactive development tool that helps teachers create, modify and extend case simulations, tutorial exercises and multiple-choice question tests designed for computer-aided instruction. The menu-driven software provides full authoring facilities for text files in MACAID format by means of interactive editing. Authors are prompted for items they might want to change, whereas all user-independent items are provided automatically. Optional default values and explanatory messages are available with every prompt. Errors are corrected automatically or commented upon. The program thus eliminates the need to become familiar with a new language or with the details of the text file structure. The options for modifying existing text files include display, renumbering of frames and a line-oriented editor. The resulting text files can be interpreted by the MACAID driver without further changes. The text file is held as ASCII records and as such is also accessible with many standard word-processing systems if desired. PMID:6362978

  5. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Given this need, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point, as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/. PMID:19433510
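    The input format described above (unique reads plus copy numbers) is produced by collapsing the raw read list. A minimal sketch; the function name and most-abundant-first ordering are illustrative choices, not a miRanalyzer requirement:

```python
from collections import Counter

def collapse_reads(reads):
    """Collapse raw small-RNA reads into (unique read, copy number) pairs,
    the style of input file the abstract describes."""
    counts = Counter(reads)
    # Most-abundant first; ties broken by sequence for a stable ordering.
    return sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))

raw = [
    "TGAGGTAGTAGGTTGTATAGTT",  # let-7-like sequence, read three times
    "TGAGGTAGTAGGTTGTATAGTT",
    "TGAGGTAGTAGGTTGTATAGTT",
    "ACTGGCCTACAAAGTCCCAGT",   # read once
]
for seq, n in collapse_reads(raw):
    print(f"{seq}\t{n}")
```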

  6. ArcCN-Runoff: An ArcGIS tool for generating curve number and runoff maps

    USGS Publications Warehouse

    Zhan, X.; Huang, M.-L.

    2004-01-01

    The development and application of the ArcCN-Runoff tool, an extension of ESRI® ArcGIS software, are reported. This tool can be applied to determine curve numbers and to calculate runoff or infiltration for a rainfall event in a watershed. Implementation of GIS techniques such as dissolving, intersecting, and a curve-number reference table improves efficiency. Technical processing time may be reduced from days, if not weeks, to hours for producing spatially varied curve number and runoff maps. An application example for a watershed in Lyon County and Osage County, Kansas, USA, is presented. © 2004 Elsevier Ltd. All rights reserved.
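    The per-cell runoff calculation behind curve-number mapping is the standard USDA-SCS relation: S = 1000/CN − 10 (inches), Ia = 0.2S, and Q = (P − Ia)²/(P − Ia + S) for P > Ia. A minimal sketch of that relation (the abstract does not print the formulas; this is the established method the term "curve number" refers to):

```python
def scs_runoff(p, cn):
    """Direct runoff Q (inches) from rainfall depth p (inches) by the
    standard USDA-SCS curve-number relations:
    S = 1000/CN - 10, Ia = 0.2*S, Q = (P - Ia)^2 / (P - Ia + S) for P > Ia."""
    s = 1000.0 / cn - 10.0        # potential maximum retention
    ia = 0.2 * s                  # initial abstraction
    if p <= ia:
        return 0.0
    return (p - ia) ** 2 / (p - ia + s)

# 3 inches of rain on CN = 80 ground: S = 2.5 in, Ia = 0.5 in.
print(scs_runoff(3.0, 80))  # 1.25 inches of direct runoff
```

In a GIS workflow like ArcCN-Runoff, this scalar formula is simply evaluated over every polygon produced by intersecting land-use and soil layers.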

  7. Virtual tool mark generation for efficient striation analysis in forensic science

    SciTech Connect

    Ekstrand, Laura

    2012-01-01

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles
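    The rotate-then-project step described in this record can be sketched on a toy point cloud. Everything here is an illustrative simplification: the thesis uses a 3D graphics package on microscope scans, while this sketch just rotates points about one axis and drops the travel coordinate:

```python
import numpy as np

def virtual_mark_profile(tip_points, angle_deg):
    """Toy virtual-mark generation: rotate a 3D point cloud of a screwdriver
    tip about the x-axis by the marking angle, project along the direction
    of tool travel (y), and return the (x, z) profile sorted across the tip."""
    t = np.radians(angle_deg)
    rot_x = np.array([[1.0, 0.0, 0.0],
                      [0.0, np.cos(t), -np.sin(t)],
                      [0.0, np.sin(t), np.cos(t)]])
    pts = tip_points @ rot_x.T
    order = np.argsort(pts[:, 0])
    return pts[order][:, [0, 2]]   # drop y: projection along tool travel

# Three toy tip points (x, y, z), in arbitrary units.
tip = np.array([[0.0, 0.0, 1.0],
                [1.0, 0.0, 0.5],
                [2.0, 0.0, 1.2]])
profile = virtual_mark_profile(tip, 30.0)
print(profile.shape)  # (3, 2): one (x, z) pair per tip point
```

Generating such profiles at 5° increments and scoring each against the scanned evidence mark is the comparison loop the abstract describes.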

  8. DRIVE Analysis Tool Generates Custom Vehicle Drive Cycles Based on Real-World Data (Fact Sheet)

    SciTech Connect

    Not Available

    2013-04-01

    This fact sheet from the National Renewable Energy Laboratory describes the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) analysis tool, which uses GPS and controller area network data to characterize vehicle operation and produce custom vehicle drive cycles, analyzing thousands of hours of data in a matter of minutes.

  9. Generating genomic tools for blueberry improvement -- an update of our progress

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There is increased demand for and consumption of blueberries worldwide because of their many recognized health benefits. Great strides have been made in blueberry cultivar development since its domestication using traditional breeding approaches. However, genomic tools are lacking in blueberry, whic...

  10. WORD STATISTICS IN THE GENERATION OF SEMANTIC TOOLS FOR INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    STONE, DON C.

    One of the problems in information storage and retrieval systems for technical documents is the interpretation of the words used to index documents. Semantic tools, defined as channels for the communication of word meanings between technical experts, document indexers, and searchers, provide one method of dealing with the problem of multiple…

  11. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing

    PubMed Central

    2014-01-01

    Background: The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. Results: Here, we provide a cost effective method for the production of 96-well plate compatible zebrafish orientation tools using a desktop 3D printer. The printed tools enable the positioning and orientation of zebrafish embryos within cavities formed in agarose. Their applicability is demonstrated by acquiring lateral and dorsal views of zebrafish embryos arrayed within microtiter plates using an automated screening microscope. This enables the consistent visualization of morphological phenotypes and reporter gene expression patterns. Conclusions: The designs are refined versions of previously demonstrated devices with added functionality and strongly reduced production costs. All corresponding 3D models are freely available and digital design can be easily shared electronically. In combination with the increasingly widespread usage of 3D printers, this provides access to the developed tools to a wide range of zebrafish users. Finally, the design files can serve as templates for other additive and subtractive fabrication methods. PMID:24886511

  12. Generating and Analyzing Visual Representations of Conic Sections with the Use of Technological Tools

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Espinosa-Perez, Hugo; Reyes-Rodriguez, Aaron

    2006-01-01

    Technological tools have the potential to offer students the possibility to represent information and relationships embedded in problems and concepts in ways that involve numerical, algebraic, geometric, and visual approaches. In this paper, the authors present and discuss an example in which an initial representation of a mathematical object…

  13. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Technical Reports Server (NTRS)

    Boyer, Jeffrey S.

    1994-01-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs was developed under technological, political, and fiscal constraints which limited its adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or on-going observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video presentation demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  14. The Three-Generation Pedigree: A Critical Tool in Cancer Genetics Care.

    PubMed

    Mahon, Suzanne M

    2016-09-01

The family history, a rather low-tech tool, is the backbone of genetic assessment and guides risk assessment and genetic testing decisions. The importance of the pedigree and its application to genetic practice is often overlooked and underestimated. Unfortunately, particularly with electronic health records, standard pedigrees are not routinely constructed. A clear understanding of how pedigrees are employed in clinical oncology practice may lead to improved collection and use of family history data. PMID:27541558

  15. Free Tools and Strategies for the Generation of 3D Finite Element Meshes: Modeling of the Cardiac Structures

    PubMed Central

    Pavarino, E.; Neves, L. A.; Machado, J. M.; de Godoy, M. F.; Shiyou, Y.; Momente, J. C.; Zafalon, G. F. D.; Pinto, A. R.; Valêncio, C. R.

    2013-01-01

The Finite Element Method (FEM) is a well-known technique that is extensively applied in different areas. Studies using the FEM are targeted at improving cardiac ablation procedures. For such simulations, the finite element meshes should account for the size and histological features of the target structures. However, some methods and tools used to generate meshes of human body structures remain limited by coarse models, nontrivial preprocessing, or, above all, restrictive conditions of use. In this paper, alternatives are demonstrated for solid modeling and for the automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context. PMID:23762031
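As a hedged illustration of automatic tetrahedral mesh generation with open-source tools (not the authors' actual OSS pipeline, which is not reproduced here), SciPy's Delaunay triangulation builds a tetrahedral mesh directly from a 3D point cloud:

```python
import numpy as np
from scipy.spatial import Delaunay

# Toy point cloud sampling a unit cube; a real pipeline would refine
# the sampling near features of the target cardiac structures.
rng = np.random.default_rng(0)
points = rng.random((200, 3))

mesh = Delaunay(points)          # automatic tetrahedralization
print(mesh.simplices.shape[1])   # 4 vertex indices per tetrahedron
```

Mesh quality control (element sizing, boundary conformity) is what dedicated meshing tools add on top of this basic construction.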

  16. Some decidable results on reachability of solvable systems

    NASA Astrophysics Data System (ADS)

    Xu, Ming; Zhu, Jiaqi; Li, Zhi-Bin

    2013-05-01

Reachability analysis plays an important role in verifying the safety of modern control systems. In the existing work, there are many decidable results on reachability of discrete systems. For continuous systems, however, the known decidable results are established merely for linear systems. In this paper, we propose a class of nonlinear systems (named solvable systems) extending linear systems. We first show that their solutions are of closed form. On that basis, we study a series of reachability problems for various subclasses of solvable systems. Our main results are that these reachability problems are decidable by manipulations in number theory, real root isolation, and quantifier elimination. Finally, the decision procedures are implemented in a Maple package REACH to solve several non-trivial examples.
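For the simplest kind of system with a polynomial closed-form solution, the reduction to real root isolation can be sketched in a few lines. This is an illustrative sketch in Python/SymPy (the REACH package itself is a Maple implementation), with the double-integrator dynamics and target value chosen purely for illustration:

```python
import sympy as sp

t = sp.symbols('t', real=True)

# Double integrator x'' = a with closed-form, polynomial-in-time solution
# (illustrative initial conditions): x(t) = x0 + v0*t + a*t**2/2
x0, v0, a = 1, -3, 2
x = x0 + v0*t + sp.Rational(1, 2)*a*t**2
target = 5

# Reachability of x(t) == target for some t >= 0 reduces to isolating
# the real roots of a univariate polynomial and checking their signs.
roots = sp.real_roots(sp.Poly(x - target, t))
reachable = any(r >= 0 for r in roots)
print(reachable)  # True: x(t) hits 5 at t = 4
```

For richer subclasses (exponentials, trigonometric terms), the paper's procedures bring in number theory and quantifier elimination rather than plain polynomial root isolation.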

  17. Photography as a Data Generation Tool for Qualitative Inquiry in Education.

    ERIC Educational Resources Information Center

    Cappello, Marva

    This paper discusses the ways in which photography was used for data generation in a 9-month qualitative study on a mixed-age elementary school classroom. Through a review of the research literature in anthropology, sociology, and education, and an analysis of the research data, the usefulness of photography for educational research with young…

  18. Messaging, Gaming, Peer-to-Peer Sharing: Language Learning Strategies & Tools for the Millennial Generation

    ERIC Educational Resources Information Center

    Godwin-Jones, Bob

    2005-01-01

    The next generation's enthusiasm for instant messaging, videogames, and peer-to-peer file swapping is likely to be dismissed by their elders as so many ways to waste time and avoid the real worlds of work or school. But these activities may not be quite as vapid as they may seem from the perspective of outsiders--or educators. Researchers point…

  19. Arkose: A Prototype Mechanism and Tool for Collaborative Information Generation and Distillation

    ERIC Educational Resources Information Center

    Nam, Kevin Kyung

    2010-01-01

    The goals of this thesis have been to gain a better understanding of collaborative knowledge sharing and distilling and to build a prototype collaborative system that supports flexible knowledge generation and distillation. To reach these goals, I have conducted two user studies and built two systems. The first system, Arkose 1.0, is a…

  20. The Development of a Tool for Semi-Automated Generation of Structured and Unstructured Grids about Isolated Rotorcraft Blades

    NASA Technical Reports Server (NTRS)

    Shanmugasundaram, Ramakrishnan; Garriz, Javier A.; Samareh, Jamshid A.

    1997-01-01

The grid generation used to model rotorcraft configurations for Computational Fluid Dynamics (CFD) analysis is highly complicated and time consuming. The highly complex geometry and irregular shapes encountered in entire rotorcraft configurations are typically modeled using overset grids. Another promising approach is to utilize unstructured grid methods. With either approach the majority of time is spent manually setting up the topology. For less complicated geometries such as isolated rotor blades, less time is obviously required. This paper discusses the capabilities of a tool called Rotor blade Optimized Topology Organizer and Renderer (ROTOR) being developed to quickly generate block-structured grids and unstructured tetrahedral grids about isolated blades. The key algorithm uses individual airfoil sections to construct a Non-Uniform Rational B-Spline (NURBS) surface representation of the rotor blade. This continuous surface definition can be queried to define the block topology used in constructing a structured mesh around the rotor blade. Alternatively, the surface definition can be used to define the surface patches and grid cell spacing requirements for generating unstructured surface and volume grids. Presently, the primary output for ROTOR is block-structured grids using O-H and H-H topologies suitable for full-potential solvers. This paper will discuss the present capabilities of the tool and highlight future work.
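The idea of querying a continuous definition to place grid nodes can be illustrated for a single airfoil section. The sketch below uses SciPy's B-spline interpolation with hypothetical section coordinates; it is not ROTOR's actual NURBS machinery, only a minimal one-dimensional analogue:

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Hypothetical airfoil upper-surface coordinates (chordwise x, thickness y).
x = np.array([0.0, 0.1, 0.3, 0.6, 1.0])
y = np.array([0.0, 0.06, 0.08, 0.05, 0.0])

# Cubic B-spline through the section points; the continuous definition
# can then be queried at arbitrary chordwise stations to place grid nodes.
spline = make_interp_spline(x, y, k=3)
stations = np.linspace(0.0, 1.0, 21)  # desired grid-node spacing
nodes = spline(stations)              # endpoints reproduce the input data
```

Stacking such sections spanwise and fitting in the second parameter direction yields the surface representation the abstract describes.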

  1. Environmental epigenetics: A promising venue for developing next-generation pollution biomonitoring tools in marine invertebrates.

    PubMed

    Suarez-Ulloa, Victoria; Gonzalez-Romero, Rodrigo; Eirin-Lopez, Jose M

    2015-09-15

Environmental epigenetics investigates the cause-effect relationships between specific environmental factors and the subsequent epigenetic modifications triggering adaptive responses in the cell. Given the dynamic and potentially reversible nature of the different types of epigenetic marks, environmental epigenetics constitutes a promising venue for developing fast and sensitive biomonitoring programs. Indeed, several epigenetic biomarkers have been successfully developed and applied in traditional model organisms (e.g., human and mouse). Nevertheless, the lack of epigenetic knowledge in other ecologically and environmentally relevant organisms has hampered the application of these tools in a broader range of ecosystems, most notably in the marine environment. Fortunately, that scenario is now changing thanks to the growing availability of complete reference genome sequences along with the development of high-throughput DNA sequencing and bioinformatic methods. Altogether, these resources make the epigenetic study of marine organisms (and more specifically marine invertebrates) a reality. By building on this knowledge, the present work provides a timely perspective highlighting the extraordinary potential of environmental epigenetic analyses as a promising source of rapid and sensitive tools for pollution biomonitoring, using marine invertebrates as sentinel organisms. This strategy represents an innovative, groundbreaking approach, improving the conservation and management of natural resources in the oceans. PMID:26088539

  2. Rational protein design: developing next-generation biological therapeutics and nanobiotechnological tools.

    PubMed

    Wilson, Corey J

    2015-01-01

    Proteins are the most functionally diverse macromolecules observed in nature, participating in a broad array of catalytic, biosensing, transport, scaffolding, and regulatory functions. Fittingly, proteins have become one of the most promising nanobiotechnological tools to date, and through the use of recombinant DNA and other laboratory methods we have produced a vast number of biological therapeutics derived from human genes. Our emerging ability to rationally design proteins (e.g., via computational methods) holds the promise of significantly expanding the number and diversity of protein therapies and has opened the gateway to realizing true and uncompromised personalized medicine. In the last decade computational protein design has been transformed from a set of fundamental strategies to stringently test our understanding of the protein structure-function relationship, to practical tools for developing useful biological processes, nano-devices, and novel therapeutics. As protein design strategies improve (i.e., in terms of accuracy and efficiency) clinicians will be able to leverage individual genetic data and biological metrics to develop and deliver personalized protein therapeutics with minimal delay. PMID:25348497

  3. An Automated and Minimally Invasive Tool for Generating Autologous Viable Epidermal Micrografts

    PubMed Central

    Osborne, Sandra N.; Schmidt, Marisa A.; Harper, John R.

    2016-01-01

    ABSTRACT OBJECTIVE: A new epidermal harvesting tool (CelluTome; Kinetic Concepts, Inc, San Antonio, Texas) created epidermal micrografts with minimal donor site damage, increased expansion ratios, and did not require the use of an operating room. The tool, which applies both heat and suction concurrently to normal skin, was used to produce epidermal micrografts that were assessed for uniform viability, donor-site healing, and discomfort during and after the epidermal harvesting procedure. DESIGN: This study was a prospective, noncomparative institutional review board–approved healthy human study to assess epidermal graft viability, donor-site morbidity, and patient experience. SETTING: These studies were conducted at the multispecialty research facility, Clinical Trials of Texas, Inc, San Antonio. PATIENTS: The participants were 15 healthy human volunteers. RESULTS: The average viability of epidermal micrografts was 99.5%. Skin assessment determined that 76% to 100% of the area of all donor sites was the same in appearance as the surrounding skin within 14 days after epidermal harvest. A mean pain of 1.3 (on a scale of 1 to 5) was reported throughout the harvesting process. CONCLUSIONS: Use of this automated, minimally invasive harvesting system provided a simple, low-cost method of producing uniformly viable autologous epidermal micrografts with minimal patient discomfort and superficial donor-site wound healing within 2 weeks. PMID:26765157

  4. Protein engineering for metabolic engineering: Current and next-generation tools

    SciTech Connect

    Marcheschi, RJ; Gronenberg, LS; Liao, JC

    2013-04-16

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. We review advances in selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use; produce non-natural amino acids, alcohols, and carboxylic acids; and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes.

  5. Protein engineering for metabolic engineering: current and next-generation tools

    PubMed Central

    Marcheschi, Ryan J.; Gronenberg, Luisa S.; Liao, James C.

    2014-01-01

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically-produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. This article reviews advances of selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use, produce non-natural amino acids, alcohols, and carboxylic acids, and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes. PMID:23589443

  6. Generation of histo-anatomically representative models of the individual heart: tools and application

    PubMed Central

    Plank, Gernot; Burton, Rebecca A. B.; Hales, Patrick; Bishop, Martin; Mansoori, Tahir; Bernabeu, Miguel; Garny, Alan; Prassl, Anton J.; Bollensdorff, Christian; Mason, Fleur; Mahmood, Fahd; Rodriguez, Blanca; Grau, Vicente; Schneider, Jürgen E.; Gavaghan, David; Kohl, Peter

    2010-01-01

    This paper presents methods to build histo-anatomically detailed individualised cardiac models. The models are based on high-resolution 3D anatomical and/or diffusion tensor magnetic resonance images, combined with serial histological sectioning data, and are used to investigate individualised cardiac function. The current state-of-the-art is reviewed, and its limitations are discussed. We assess the challenges associated with the generation of histo-anatomically representative individualised in-silico models of the heart. The entire processing pipeline including image acquisition, image processing, mesh generation, model set-up and execution of computer simulations, and the underlying methods are described. The multi-faceted challenges associated with these goals are highlighted, suitable solutions are proposed, and an important application of developed high-resolution structure-function models in elucidating the effect of individual structural heterogeneity upon wavefront dynamics is demonstrated. PMID:19414455

  7. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    NASA Astrophysics Data System (ADS)

    Gee, L.; Doucet, M.

    2010-12-01

The latest generation of multibeam sonars now has the ability to map the water column along with the seafloor. Currently, users of these sonars have a limited view of the mid-water data in real time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data has the potential to address a number of research areas, including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals, and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin to study the water-column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter, and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The mid-water tools in this project address this problem by storing the water-column data in a unified generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water-column packets with time-based navigation and attitude, so that downstream in the workflow the tools have access to all relevant data for any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet-type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize processing throughput.
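The per-file packet index the abstract describes can be sketched with a toy packet store. The record layout below (1-byte type tag, 4-byte little-endian payload length, then the payload) is a hypothetical stand-in for the GWC format, which is not specified here:

```python
import io
import struct

# Write a few tagged packets while recording the byte offset of each.
buf = io.BytesIO()
index = {}  # packet type -> byte offsets of every packet of that type
for ptype, data in [(1, b'nav'), (2, b'watercolumn'), (1, b'nav2')]:
    index.setdefault(ptype, []).append(buf.tell())
    buf.write(struct.pack('<BI', ptype, len(data)))
    buf.write(data)

# Fast non-linear lookup: seek straight to the second type-1 packet
# instead of scanning the file from the start.
buf.seek(index[1][1])
tag, length = struct.unpack('<BI', buf.read(5))
payload = buf.read(length)
print(payload)  # b'nav2'
```

In a real implementation the index itself would be persisted alongside the file so that any packet type or packet-type collection can be extracted without a linear scan.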

  8. Thrombin generation assay: a new tool to predict and optimize clinical outcome in cardiovascular patients?

    PubMed

    Campo, Gianluca; Pavasini, Rita; Pollina, Alberto; Fileti, Luca; Marchesini, Jlenia; Tebaldi, Matteo; Ferrari, Roberto

    2012-12-01

Antithrombotic therapy (including antiplatelet and anticoagulant drugs) is the cornerstone of the current medical treatment of patients with acute coronary syndromes (ACS). This therapy, and particularly the new antiplatelet and anticoagulant drugs, has significantly reduced ischemic risk but has increased bleeding complications. Recently, several studies have emphasized the negative prognostic impact of these bleeding adverse events on long-term mortality. Thus, new assays to estimate the bleeding risk and the efficacy of these antithrombotic drugs are clearly in demand. Regarding the anticoagulant drugs, promising new data have emerged about the thrombin generation assay (TGA). TGA measures the ability of plasma to generate thrombin. TGA may be used to check coagulation function, to assess the risk of thrombosis, and to compare the efficacy of different anticoagulants employed in the clinical management of patients with ACS. The TGA result is a curve describing how the amount of thrombin varies during activation of the coagulation cascade. All available anticoagulant drugs influence the principal parameters generated by TGA, so it is possible to evaluate the effects of the medical treatment. In this review we provide a brief description of the assay and summarize the principal findings of previous studies analyzing the relationship between anticoagulant drugs and TGA. A brief summary of the assay's ability to predict ischemic and bleeding risks is also provided. PMID:22688556

  9. Next-Generation Sequencing: A Review of Technologies and Tools for Wound Microbiome Research

    PubMed Central

    Hodkinson, Brendan P.; Grice, Elizabeth A.

    2015-01-01

Significance: The colonization of wounds by specific microbes or communities of microbes may delay healing and/or lead to infection-related complications. Studies of wound-associated microbial communities (microbiomes) to date have primarily relied upon culture-based methods, which are known to have extreme biases and are not reliable for the characterization of microbiomes. Biofilms are very resistant to culture and are therefore especially difficult to study with techniques that remain standard in clinical settings. Recent Advances: Culture-independent approaches employing next-generation DNA sequencing have provided researchers and clinicians a window into wound-associated microbiomes that could not be achieved before and have begun to transform our view of wound-associated biodiversity. Within the past decade, many platforms have arisen for performing this type of sequencing, with various types of applications for microbiome research being possible on each. Critical Issues: Wound care incorporating knowledge of microbiomes gained from next-generation sequencing could guide clinical management and treatments. The purpose of this review is to outline the current platforms, their applications, and the steps necessary to undertake microbiome studies using next-generation sequencing. Future Directions: As DNA sequencing technology progresses, platforms will continue to produce longer reads and more reads per run at lower costs. A major future challenge is to implement these technologies in clinical settings for more precise and rapid identification of wound bioburden. PMID:25566414

  10. Cognitive avionics and watching spaceflight crews think: generation-after-next research tools in functional neuroimaging.

    PubMed

    Genik, Richard J; Green, Christopher C; Graydon, Francis X; Armstrong, Robert E

    2005-06-01

Confinement and isolation have always confounded the extraordinary endeavor of human spaceflight. Psychosocial health is at the forefront in considering risk factors that imperil missions of 1- to 2-yr duration. Current crewmember selection metrics restricted to behavioral observation by definition observe rather than prevent performance degradation and are thus inadequate when preflight training cannot simulate an entire journey. Nascent techniques to monitor functional and task-related cortical neural activity show promise and can be extended to include whole-brain monitoring. Watching spaceflight crews think can reveal the efficiency of training procedures. Moreover, observing subcortical emotion centers may provide early detection of developing neuropsychiatric disorders. The non-invasive functional neuroimaging modalities electroencephalography (EEG), magnetoencephalography (MEG), magnetic resonance imaging (MRI), and near-infrared spectroscopy (NIRS) are detailed, with highlights of how they may be engineered for spacecraft. Preflight and in-flight applications to crewmember behavioral health from current-generation, next-generation, and generation-after-next neuroscience research studies are also described. The emphasis is on preventing the onset of neuropsychiatric dysfunctions, thus reducing the risk of mission failure due to human error. PMID:15943214

  11. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    NASA Astrophysics Data System (ADS)

    Rőszer, Tamás; Pintye, Éva; Benkő, Ilona

    2008-12-01

Transgenic mice can be used either as models of known inherited human diseases or to perform phenotypic tests of genes with unknown function. Some special applications of gene modification require a tissue-specific mutation of a given gene. In some cases, however, the modification is lethal in intrauterine life, so the mutated cells must be engrafted postnatally. After total body irradiation, transplantation of bone marrow cells can introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is thus a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes, and the many metabolic alterations recently linked to leukocyte functions.

  12. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    SciTech Connect

    Roszer, Tamas; Pintye, Eva; Benko', Ilona

    2008-12-08

Transgenic mice can be used either as models of known inherited human diseases or to perform phenotypic tests of genes with unknown function. Some special applications of gene modification require a tissue-specific mutation of a given gene. In some cases, however, the modification is lethal in intrauterine life, so the mutated cells must be engrafted postnatally. After total body irradiation, transplantation of bone marrow cells can introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is thus a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes, and the many metabolic alterations recently linked to leukocyte functions.

  13. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation Electricore, Inc.

    SciTech Connect

    Daye, Tony

    2013-09-30

This project enabled utilities to develop long-term strategic plans that integrate high levels of renewable energy generation and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate operational cost savings for Energy Marketing (Day Ahead generation commitments), Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, long-term planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  14. New tool to detect operation anomalies on automatic voltage regulator equipment of large power units; Generator simulator (GS)

    SciTech Connect

    Blanchet, P. )

    1990-01-01

When large generating plants are installed on sites remote from consumer areas, operating the network with correct stability margins depends on proper adjustment of the automatic voltage regulator (AVR). Any deviation, whether in normal operation or especially under abnormal conditions, must be detected at the first overhaul or first shutdown. This new tool, the generator simulator (GS), then helps minimize the time needed to investigate failures and to requalify AVR equipment after repair. The two main objectives of this paper are: to qualify the AVR performance of a power unit during the scheduled overhaul; and to ease failure investigation in the AVR system, avoiding unnecessary dismantling during an unplanned unit shutdown.

  15. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
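The SDB's inductive learner is not specified in detail here, but the idea of generating rules from expert-classified data can be sketched with a minimal 1R-style learner; the sensor names and labels below are hypothetical:

```python
from collections import Counter

# Toy classified data set: each row is (sensor_a, sensor_b, label),
# with labels assigned by a subject-matter expert (names illustrative).
data = [
    ('high', 'on',  'fault'),
    ('high', 'off', 'fault'),
    ('low',  'on',  'nominal'),
    ('low',  'off', 'nominal'),
]

# 1R-style induction: for each attribute, map each value to its
# majority label and keep the attribute with the fewest errors.
def one_rule(rows, n_attrs=2):
    best = None
    for a in range(n_attrs):
        rule, errors = {}, 0
        for v in set(r[a] for r in rows):
            labels = Counter(r[-1] for r in rows if r[a] == v)
            majority, count = labels.most_common(1)[0]
            rule[v] = majority
            errors += sum(labels.values()) - count
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best

attr, rule, errors = one_rule(data)
# Here attribute 0 (sensor_a) separates the classes with zero errors.
```

The resulting value-to-label mapping is a crude stand-in for the rule bases SDB derives; production systems use stronger inductive algorithms over the same kind of expert-classified data.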

  16. Deciding in Democracies: A Role for Thinking Skills?

    ERIC Educational Resources Information Center

    Gardner, Peter

    2014-01-01

    In societies that respect our right to decide many things for ourselves, exercising that right can be a source of anxiety. We want to make the right decisions, which is difficult when we are confronted with complex issues that are usually the preserve of specialists. But is help at hand? Are thinking skills the very things that non-specialists…

  17. Consumer Economics, Book I [and] Book II. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Consumer Economics, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  18. Deciding when It's Time to Buy a New PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    How to best decide when it's time to replace your PC, whether at home or at work, is always tricky. Spending on computers can make you more productive, but it's money you otherwise cannot spend, invest or save, and faster systems always await you in the future. What is clear is that the computer industry really wants you to buy, and the computer…

  19. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  20. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  1. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  2. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  3. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  4. An aluminium tool for multiple stellar generations in the globular clusters 47 Tucanae and M 4

    NASA Astrophysics Data System (ADS)

    Carretta, E.; Gratton, R. G.; Bragaglia, A.; D'Orazi, V.; Lucatello, S.

    2013-02-01

    We present aluminium abundances for a sample of about 100 red giant stars in each of the Galactic globular clusters 47 Tuc (NGC 104) and M 4 (NGC 6121). We have derived homogeneous abundances from intermediate-resolution FLAMES/GIRAFFE spectra. Aluminium abundances are from the strong doublet Al I 8772-8773 Å, as in previous works done for giants in NGC 6752 and NGC 1851, and nitrogen abundances are extracted from a large number of features of the CN molecules by assuming a suitable carbon abundance. We added previous homogeneous abundances of O and Na and newly derived abundances of Mg and Si for our samples of 83 stars in M 4 and 116 stars in 47 Tuc to obtain the full set of elements from proton-capture reactions produced by different stellar generations in these clusters. By simultaneously studying the Ne-Na and Mg-Al cycles of H-burning at high temperature, our main aims are to understand the nature of the polluters at work in the first generation and to ascertain whether the second generation of cluster stars was formed in one or, rather, several episodes of star formation. Our data confirm that in M 4 only two stellar populations are visible. On the other hand, for 47 Tuc a cluster analysis performed on our full dataset suggests that at least three distinct groups of stars are present on the giant branch. The abundances of O, Na, Mg, and Al in the intermediate group can be produced within a pollution scenario; results for N are ambiguous, depending on the C abundance we adopt for the three groups. Based on observations collected at ESO telescopes under program 085.D-0205 and on public data from the ESO/ST-ECF Science Archive Facility. Tables 2 and 3 are available in electronic form at http://www.aanda.org

  5. Mitochondrial DNA methylation as a next-generation biomarker and diagnostic tool.

    PubMed

    Iacobazzi, Vito; Castegna, Alessandra; Infantino, Vittoria; Andria, Generoso

    2013-01-01

    Recent expansion of our knowledge on epigenetic changes strongly suggests that not only nuclear DNA (nDNA) but also mitochondrial DNA (mtDNA) may be subject to epigenetic modifications related to disease development, environmental exposure, drug treatment and aging. Thus, mtDNA methylation is attracting increasing attention as a potential biomarker for the detection and diagnosis of diseases and the understanding of cellular behavior in particular conditions. In this paper we review the current advances in mtDNA methylation studies, with particular attention to the evidence of mtDNA methylation changes in the diseases and physiological conditions so far investigated. Technological advances for the analysis of epigenetic variations are promising tools to provide insights into methylation of mtDNA with resolution levels similar to those reached for nDNA. However, many aspects related to mtDNA methylation are still unclear. More studies are needed to understand whether and how changes in mtDNA methylation patterns, global and gene-specific, are associated with diseases or risk factors. PMID:23920043

  6. Development of a new generation of active AFM tools for applications in liquids

    NASA Astrophysics Data System (ADS)

    Rollier, A.-S.; Jenkins, D.; Dogheche, E.; Legrand, B.; Faucher, M.; Buchaillot, L.

    2010-08-01

    Atomic force microscopy (AFM) is a powerful imaging tool with high-resolution imaging capability. AFM probes consist of a very sharp tip at the end of a silicon cantilever that can respond to surface artefacts to produce an image of the topography or surface features. They are intrinsically passive devices. For imaging soft biological samples, and also for samples in liquid, it is essential to control the AFM tip position, both statically and dynamically, and this is not possible using external actuators mounted on the AFM chip. AFM cantilevers have been fabricated using silicon micromachining to incorporate a piezoelectric thin film actuator for precise control. The piezoelectric thin films have been fully characterized to determine their actuation performance and to characterize the operation of the integrated device. Examples of the spatial and vertical response are presented to illustrate their imaging capability. For operation in a liquid environment, the dynamic behaviour has been modelled and verified experimentally. The optimal drive conditions for the cantilever, along with their dynamic response, including frequency and phase in air and water, are presented.

  7. NASA's Learning Technology Project: Developing Educational Tools for the Next Generation of Explorers

    NASA Astrophysics Data System (ADS)

    Federman, A. N.; Hogan, P. J.

    2003-12-01

    Since 1996, NASA's Learning Technology project has pioneered the use of innovative technology to inspire students to pursue careers in STEM (Science, Technology, Engineering and Math). In the past this has included Web sites like Quest and the Observatorium, webcasts and distance learning courses, and even interactive television broadcasts. Our current focus is on the development of several mission-oriented software packages, targeted primarily at the middle-school population but flexible enough to be used by elementary to graduate students. These products include contributions to an open source solar system simulator, a 3D planetary encyclopedia, a planetary surface viewer (atlas), and others. Whenever possible these software products are written to be open source and multi-platform, for the widest use and easiest access for developers. Along with the software products, we are developing activities and lesson plans that are tested and used by educators in the classroom. The products are reviewed by professional educators. Together these products constitute the NASA Experiential Platform for learning, in which the tools used by the public are similar (and in some respects identical) to those used by professional investigators. Efforts are now underway to incorporate actual MODIS and other real-time data uplink capabilities.

  8. Third Harmonic Generation microscopy as a diagnostic tool for the investigation of microglia BV-2 and breast cancer cells activation

    NASA Astrophysics Data System (ADS)

    Gavgiotaki, E.; Filippidis, G.; Psilodimitrakopoulos, S.; Markomanolaki, H.; Kalognomou, M.; Agelaki, S.; Georgoulias, V.; Athanassakis, I.

    2015-07-01

    Nonlinear optical imaging techniques have created new opportunities for research in the biomedical field. Specifically, Third Harmonic Generation (THG) appears to be a suitable noninvasive imaging tool for the delineation and quantification of biological structures at the microscopic level. The aim of this study was to extract information on the activation state of different cell types by using THG imaging microscopy as a diagnostic tool. The BV-2 microglia cell line was used as a representative biological model enabling the study of the resting and activated states of cells linked to various pathological conditions. THG and Two-Photon Excitation Fluorescence (TPEF) measurements were collected simultaneously from stained breast cancer cells using a single home-built experimental apparatus, and it was shown that high THG signals arise mostly from lipid bodies. Subsequently, BV-2 microglia cells were examined with or without activation by lipopolysaccharide (LPS) in order to discriminate between control and activated cells based on the quantification of THG signals. Statistical quantification was performed on both the mean area and the mean intensity of the THG signal; both were increased in activated versus non-activated cells. Similar quantification studies are underway in breast cancer cells for the exact discrimination of different cell lines. Furthermore, the laser polarization dependence of the SHG and THG signals in unstained biological samples is being investigated.
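As an illustration of this kind of quantification, here is a minimal sketch (not the authors' pipeline) that scores a THG image by its total above-threshold area and the mean intensity within that mask, the two metrics reported above. The threshold, image sizes and synthetic "cells" are all hypothetical:

```python
import numpy as np

def thg_metrics(image, threshold):
    """Quantify a THG image: total signal area (pixel count above a
    threshold) and mean intensity within that mask."""
    mask = image > threshold
    area = int(mask.sum())                       # pixels above threshold
    mean_intensity = float(image[mask].mean()) if area else 0.0
    return area, mean_intensity

# Synthetic example: an "activated" cell with more/larger lipid bodies
# should score higher on both metrics than a "resting" one.
rng = np.random.default_rng(0)
resting = rng.normal(10.0, 2.0, (64, 64))       # dim background only
activated = resting.copy()
activated[20:30, 20:30] += 50.0                 # bright lipid-body-like region
activated[40:45, 40:45] += 50.0

area_r, int_r = thg_metrics(resting, threshold=25.0)
area_a, int_a = thg_metrics(activated, threshold=25.0)
```

With these synthetic images the resting cell has no pixels above threshold, while the activated cell scores on both area and intensity.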

  9. PLAN-IT-2: The next generation planning and scheduling tool

    NASA Technical Reports Server (NTRS)

    Eggemeyer, William C.; Cruz, Jennifer W.

    1990-01-01

    PLAN-IT is a scheduling program which has been demonstrated and evaluated in a variety of scheduling domains. The capability enhancements being made for the next generation of PLAN-IT, called PLAN-IT-2 is discussed. PLAN-IT-2 represents a complete rewrite of the original PLAN-IT incorporating major changes as suggested by the application experiences with the original PLAN-IT. A few of the enhancements described are additional types of constraints, such as states and resettable-depletables (batteries), dependencies between constraints, multiple levels of activity planning during the scheduling process, pattern constraint searching for opportunities as opposed to just minimizing the amount of conflicts, additional customization construction features for display and handling of diverse multiple time systems, and reduction in both the size and the complexity for creating the knowledge-base to address the different problem domains.

  10. Clustering of cochlear oscillations in frequency plateaus as a tool to investigate SOAE generation

    NASA Astrophysics Data System (ADS)

    Epp, Bastian; Wit, Hero; van Dijk, Pim

    2015-12-01

    Spontaneous otoacoustic emissions (SOAE) reflect the net effect of self-sustained activity in the cochlea, but do not directly provide information about the underlying mechanism and place of origin within the cochlea. The present study investigates whether frequency plateaus, as found in a linear array of coupled oscillators (OAM) [7], are also found in a transmission line model (TLM) that is able to generate realistic SOAEs [2], and whether these frequency plateaus can be used to explain the formation of SOAEs. The simulations showed a clustering of oscillators along the simulated basilar membrane. Both the OAM and the TLM show traveling-wave-like behavior along the oscillators coupled into one frequency plateau. While the TLM requires roughness in order to produce SOAEs, no roughness is required to trigger frequency plateaus in the linear array of oscillators. The formation of frequency plateaus as a consequence of coupling between neighboring active oscillators might be the mechanism underlying SOAEs.
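The plateau formation described above can be reproduced qualitatively with a generic chain of phase oscillators (a Kuramoto-type sketch, not the paper's oscillator array or transmission line model): given a gradient of natural frequencies, moderate nearest-neighbour coupling locks groups of neighbours onto shared effective frequencies, i.e. plateaus, whereas uncoupled oscillators keep distinct frequencies. All parameters below are illustrative:

```python
import numpy as np

def effective_freqs(K, N=20, dt=0.01, steps=40_000, measure=20_000, seed=1):
    """Integrate a chain of phase oscillators with nearest-neighbour
    coupling strength K and a linear gradient of natural frequencies;
    return each oscillator's time-averaged (effective) frequency."""
    omega = np.linspace(0.8, 1.2, N)            # natural frequency gradient
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, N)
    advance = np.zeros(N)                       # accumulated phase advance
    for step in range(steps):
        drive = np.zeros(N)
        drive[:-1] += np.sin(theta[1:] - theta[:-1])   # pull from right neighbour
        drive[1:] += np.sin(theta[:-1] - theta[1:])    # pull from left neighbour
        dtheta = (omega + K * drive) * dt
        theta = theta + dtheta
        if step >= steps - measure:             # measure after a transient
            advance += dtheta
    return advance / (measure * dt)

free = effective_freqs(K=0.0)     # no coupling: every oscillator distinct
locked = effective_freqs(K=0.15)  # coupling: oscillators cluster into plateaus

n_free = len(set(np.round(free, 3)))
n_locked = len(set(np.round(locked, 3)))
```

The coupled chain ends up with fewer distinct effective frequencies than oscillators, the signature of frequency plateaus.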

  11. Fractal analysis of experimentally generated pyroclasts: A tool for volcanic hazard assessment

    NASA Astrophysics Data System (ADS)

    Perugini, Diego; Kueppers, Ulrich

    2012-06-01

    Rapid decompression experiments on natural volcanic rocks mimic explosive eruptions. Fragment size distributions (FSD) of such experimentally generated pyroclasts are investigated using fractal geometry. The fractal dimension of fragmentation, D, of FSD is measured for samples from Unzen (Japan) and Popocatépetl (Mexico) volcanoes. Results show that: (i) FSD are fractal and can be quantified by measuring D values; (ii) D increases linearly with potential energy for fragmentation (PEF) and, thus, with increasing applied pressure; (iii) the rate of increase of D with PEF depends on open porosity: the higher the open porosity, the lower the increase of D with PEF; (iv) at comparable open porosity, samples display a similar behavior for any rock composition. The method proposed here has the potential to become a standard routine to estimate eruptive energy of past and recent eruptions using values of D and open porosity, providing an important step towards volcanic hazard assessment.
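A minimal sketch of the D measurement, assuming the standard cumulative power-law definition N(>r) ∝ r^(-D), where N(>r) is the number of fragments larger than size r; the exact fitting procedure used by the authors may differ:

```python
import numpy as np

def fractal_dimension(sizes, n_bins=20):
    """Estimate the fractal dimension of fragmentation D from a set of
    fragment sizes, assuming a power-law cumulative size distribution
    N(>r) ~ r^-D: D is minus the slope of log N(>r) versus log r."""
    sizes = np.asarray(sizes, float)
    # thresholds spanning the distribution (99th percentile avoids the
    # noisy extreme tail, where counts are tiny)
    r = np.logspace(np.log10(sizes.min() * 1.001),
                    np.log10(np.percentile(sizes, 99.0)), n_bins)
    counts = np.array([(sizes > ri).sum() for ri in r])
    slope, _ = np.polyfit(np.log(r), np.log(counts), 1)
    return -slope

# Synthetic "pyroclasts": sizes drawn from a Pareto law with known D,
# so the estimator can be checked against ground truth.
rng = np.random.default_rng(42)
true_D = 2.5
fragments = (1.0 - rng.uniform(size=50_000)) ** (-1.0 / true_D)

D_est = fractal_dimension(fragments)
```

With 50,000 synthetic fragments the log-log fit recovers D to within a few percent.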

  12. Generative Topographic Mapping (GTM): Universal Tool for Data Visualization, Structure-Activity Modeling and Dataset Comparison.

    PubMed

    Kireeva, N; Baskin, I I; Gaspar, H A; Horvath, D; Marcou, G; Varnek, A

    2012-04-01

    Here, the utility of Generative Topographic Maps (GTM) for data visualization, structure-activity modeling and database comparison is evaluated using subsets of the Directory of Useful Decoys (DUD). Unlike other popular dimensionality reduction approaches like Principal Component Analysis, Sammon Mapping or Self-Organizing Maps, the great advantage of GTM is that it provides data probability distribution functions (PDF), both in the high-dimensional space defined by molecular descriptors and in 2D latent space. PDFs for the molecules of different activity classes were successfully used to build classification models in the framework of the Bayesian approach. Because PDFs are represented by a mixture of Gaussian functions, the Bhattacharyya kernel has been proposed as a measure of the overlap of datasets, which leads to an elegant method of global comparison of chemical libraries. PMID:27477099
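When the latent-space PDFs are discretized on a grid, the Bhattacharyya coefficient between two libraries reduces to a simple sum over grid cells, BC = Σ √(p·q). A minimal sketch with hypothetical toy histograms (the paper's kernel operates on the continuous Gaussian mixtures, but the idea is the same):

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two discretized probability
    distributions (e.g. GTM occupancy over a 2D latent grid):
    1.0 for identical PDFs, 0.0 for non-overlapping ones."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p = p / p.sum()          # normalize to proper PDFs
    q = q / q.sum()
    return float(np.sqrt(p * q).sum())

# Toy 3x3 latent-space histograms for two chemical libraries whose
# occupancy is shifted to opposite corners of the map.
lib_a = np.array([[4, 2, 0], [1, 1, 0], [0, 0, 0]])
lib_b = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 4]])

same = bhattacharyya(lib_a, lib_a)     # identical libraries -> 1.0
overlap = bhattacharyya(lib_a, lib_b)  # partial overlap -> between 0 and 1
```

The only shared cell here is the grid centre, so the overlap score is small but nonzero.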

  13. Transposon assisted gene insertion technology (TAGIT): a tool for generating fluorescent fusion proteins.

    PubMed

    Gregory, James A; Becker, Eric C; Jung, James; Tuwatananurak, Ida; Pogliano, Kit

    2010-01-01

    We constructed a transposon (transposon assisted gene insertion technology, or TAGIT) that allows the random insertion of gfp (or other genes) into chromosomal loci without disrupting operon structure or regulation. TAGIT is a modified Tn5 transposon that uses Kan(R) to select for insertions on the chromosome or plasmid, beta-galactosidase to identify in-frame gene fusions, and Cre recombinase to excise the kan and lacZ genes in vivo. The resulting gfp insertions maintain target gene reading frame (to the 5' and 3' of gfp) and are integrated at the native chromosomal locus, thereby maintaining native expression signals. Libraries can be screened to identify GFP insertions that maintain target protein function at native expression levels, allowing more trustworthy localization studies. Here we use TAGIT to generate a library of GFP insertions in the Escherichia coli lactose repressor (LacI). We identified fully functional GFP insertions and partially functional insertions that bind DNA but fail to repress the lacZ operon. Several of these latter GFP insertions localize to lacO arrays integrated in the E. coli chromosome without producing the elongated cells frequently observed when functional LacI-GFP fusions are used in chromosome tagging experiments. TAGIT thereby facilitates the isolation of fully functional insertions of fluorescent proteins into target proteins expressed from the native chromosomal locus, as well as potentially useful partially functional proteins. PMID:20090956

  14. Development and implementation of an electronic health record generated surgical handoff and rounding tool.

    PubMed

    Raval, Mehul V; Rust, Laura; Thakkar, Rajan K; Kurtovic, Kelli J; Nwomeh, Benedict C; Besner, Gail E; Kenney, Brian D

    2015-02-01

    Electronic health records (EHR) have been adopted across the nation at tremendous effort and expense. The purpose of this study was to assess improvements in accuracy, efficiency, and patient safety for a high-volume pediatric surgical service with adoption of an EHR-generated handoff and rounding list. The quality and quantity of errors were compared pre- and post-EHR-based list implementation. A survey was used to determine time spent by team members using the two versions of the list. Perceived utility, safety, and quality of the list were reported. Serious safety events determined by the hospital were also compared for the two periods. The EHR-based list eliminated clerical errors while improving efficiency by automatically providing data such as vital signs. Survey respondents reported 43 min saved per week per team member, translating to 372 work hours of time saved annually for a single service. EHR-based list users reported higher satisfaction and perceived improvement in efficiency, accuracy, and safety. Serious safety events remained unchanged. In conclusion, creation of an EHR-based list to assist with daily handoffs, rounding, and patient management demonstrated improved accuracy, increased efficiency, and assisted in maintaining a high level of safety. PMID:25631842

  15. Second harmonic generation imaging as a potential tool for staging pregnancy and predicting preterm birth

    NASA Astrophysics Data System (ADS)

    Akins, Meredith L.; Luby-Phelps, Katherine; Mahendroo, Mala

    2010-03-01

    We use second harmonic generation (SHG) microscopy to assess changes in collagen structure of murine cervix during cervical remodeling of normal pregnancy and in a preterm birth model. Visual inspection of SHG images revealed substantial changes in collagen morphology throughout normal gestation. SHG images collected in both the forward and backward directions were analyzed quantitatively for changes in overall mean intensity, forward to backward intensity ratio, collagen fiber size, and porosity. Changes in mean SHG intensity and intensity ratio take place in early pregnancy, suggesting that submicroscopic changes in collagen fibril size and arrangement occur before macroscopic changes become evident. Fiber size progressively increased from early to late pregnancy, while pores between collagen fibers became larger and farther apart. Analysis of collagen features in premature cervical remodeling shows that changes in collagen structure are dissimilar from normal remodeling. The ability to quantify multiple morphological features of collagen that characterize normal cervical remodeling and distinguish abnormal remodeling in preterm birth models supports future studies aimed at development of SHG endoscopic devices for clinical assessment of collagen changes during pregnancy in women and for predicting risk of preterm labor, which occurs in 12.5% of all pregnancies.
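Three of the four image metrics named above (mean intensity, forward/backward ratio, porosity) can be sketched directly; fiber-size measurement needs morphological analysis and is omitted here. The images and threshold below are synthetic stand-ins, not the murine data:

```python
import numpy as np

def shg_metrics(forward, backward, pore_threshold):
    """Compute mean SHG intensity, forward-to-backward intensity ratio,
    and porosity (fraction of pixels below a threshold, i.e. the gaps
    between collagen fibers) for a pair of registered images."""
    mean_intensity = float(forward.mean())
    fb_ratio = float(forward.sum() / backward.sum())
    porosity = float((forward < pore_threshold).mean())
    return mean_intensity, fb_ratio, porosity

# Synthetic early- vs late-pregnancy-like images: the "late" image has
# brighter but sparser fibers, so porosity goes up.
early = np.full((32, 32), 40.0)
early[::2, :] = 120.0                 # dense fibers on alternating rows
late = np.full((32, 32), 10.0)
late[::4, :] = 200.0                  # sparse, bright fibers
back = np.full((32, 32), 50.0)        # flat backward-channel stand-in

m_e, r_e, p_e = shg_metrics(early, back, pore_threshold=30.0)
m_l, r_l, p_l = shg_metrics(late, back, pore_threshold=30.0)
```

On these stand-ins the late image shows markedly higher porosity, mirroring the larger inter-fiber pores reported for late pregnancy.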

  16. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Arora, N. S., Russell, S., and Sudderth, E., Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.

  17. Generation of Fluorogen-Activating Designed Ankyrin Repeat Proteins (FADAs) as Versatile Sensor Tools.

    PubMed

    Schütz, Marco; Batyuk, Alexander; Klenk, Christoph; Kummer, Lutz; de Picciotto, Seymour; Gülbakan, Basri; Wu, Yufan; Newby, Gregory A; Zosel, Franziska; Schöppe, Jendrik; Sedlák, Erik; Mittl, Peer R E; Zenobi, Renato; Wittrup, K Dane; Plückthun, Andreas

    2016-03-27

    Fluorescent probes constitute a valuable toolbox to address a variety of biological questions and they have become irreplaceable for imaging methods. Commonly, such probes consist of fluorescent proteins or small organic fluorophores coupled to biological molecules of interest. Recently, a novel class of fluorescence-based probes, fluorogen-activating proteins (FAPs), has been reported. These binding proteins are based on antibody single-chain variable fragments and activate fluorogenic dyes, which only become fluorescent upon activation and do not fluoresce when free in solution. Here we present a novel class of fluorogen activators, termed FADAs, based on the very robust designed ankyrin repeat protein scaffold, which also readily folds in the reducing environment of the cytoplasm. The FADA generated in this study was obtained by combined selections with ribosome display and yeast surface display. It enhances the fluorescence of malachite green (MG) dyes by a factor of more than 11,000 and thus activates MG to a similar extent as FAPs based on single-chain variable fragments. As shown by structure determination and in vitro measurements, this FADA was evolved to form a homodimer for the activation of MG dyes. Exploiting the favorable properties of the designed ankyrin repeat protein scaffold, we created a FADA biosensor suitable for imaging of proteins on the cell surface, as well as in the cytosol. Moreover, based on the requirement of dimerization for strong fluorogen activation, a prototype FADA biosensor for in situ detection of a target protein and protein-protein interactions was developed. Therefore, FADAs are versatile fluorescent probes that are easily produced and suitable for diverse applications and thus extend the FAP technology. PMID:26812208

  18. Astronomy as a Tool for Training the Next Generation Technical Workforce

    NASA Astrophysics Data System (ADS)

    Romero, V.; Walsh, G.; Ryan, W.; Ryan, E.

    A major challenge for today's institutes of higher learning is training the next generation of scientists, engineers, and optical specialists to be proficient in the latest technologies they will encounter when they enter the workforce. Although research facilities can offer excellent hands-on instructional opportunities, integrating such experiential learning into academic coursework without disrupting normal operations at such facilities can be difficult. Also, motivating entry-level students to increase their skill levels by undertaking and successfully completing difficult coursework can require more creative instructional approaches, including fostering a fun, non-threatening environment for enhancing basic abilities. Astronomy is a universally appealing subject area and can be very effective as a foundation for cultivating advanced competencies. We report on a project underway at the New Mexico Institute of Mining and Technology (NM Tech), a science and engineering school in Socorro, NM, to incorporate a state-of-the-art optical telescope and laboratory experiments into an entry-level course in basic engineering. Students enrolled in an explosive engineering course were given a topical problem in Planetary Astronomy: they were asked to develop a method to energetically mitigate a potentially hazardous impact between our planet and a Near-Earth asteroid predicted to occur sometime in the future. They were first exposed to basic engineering training in the areas of fracture and material response to failure under different environmental conditions through lectures and traditional laboratory exercises. The students were then given access to NM Tech's Magdalena Ridge Observatory's (MRO) 2.4-meter telescope to collect physical characterization data (specifically, shape information) on two potentially hazardous asteroids (one roughly spherical, the other an elongated ellipsoid).
Finally, the students used NM Tech's Energetic Materials Research and Testing Center (EMRTC) to

  19. The Society-Deciders Model and Fairness in Nations

    NASA Astrophysics Data System (ADS)

    Flomenbom, Ophir

    2015-05-01

    Modeling the dynamics of nations from economic and sociological perspectives is a central theme in economics and sociology. Accurate models can predict and therefore help all the world's citizens. Yet recent years have shown that current models fall short. Here, we develop a dynamical society-deciders model that can explain the stability of a nation, based on concepts from dynamics, ecology and socio-econo-physics; a nation has two interconnected groups, the deciders and the society. We show that a nation is either stable or it collapses, depending on just two coefficients that we relate to sociological and economic indicators. We define a new socio-economic indicator, fairness. Fairness can measure the stability of a nation and how probable a change favoring the society is. We compute fairness for all the world's nations. Interestingly, in comparison with other indicators, fairness shows that the USA loses its rank among Western democracies, India is the best among the 15 most populated nations, and Egypt, Libya and Tunisia have significantly improved their rankings as a result of recent revolutions, further increasing the probability of additional positive changes. Within the model, long-lasting crises are solved not by increasing governmental spending or by cuts, but with regulations that reduce the stability of the deciders, namely by increasing fairness, for example by shifting wealth in the direction of the people, thereby further increasing opportunities.
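The stable-or-collapsing dichotomy governed by two coefficients is characteristic of a two-variable linear dynamical system; a hedged sketch with a hypothetical parametrization (not the paper's actual equations), using the standard 2x2 stability criterion trace(A) < 0 and det(A) > 0:

```python
import numpy as np

def is_stable(a, b):
    """Toy two-group (society/deciders) linear model
        d[s, d]/dt = A @ [s, d],  A = [[-1, a], [b, -1]],
    where a and b are the two interconnection coefficients (hypothetical
    parametrization for illustration).  A 2x2 linear system is stable
    iff trace(A) < 0 and det(A) > 0, i.e. both eigenvalues have
    negative real parts."""
    A = np.array([[-1.0, a], [b, -1.0]])
    return bool(np.trace(A) < 0 and np.linalg.det(A) > 0)

stable = is_stable(0.5, 0.5)    # weak coupling: det = 1 - 0.25 > 0
collapse = is_stable(2.0, 2.0)  # strong coupling: det = 1 - 4 < 0
```

In this toy parametrization, weakly coupled groups relax back to equilibrium, while strong mutual reinforcement produces a runaway (collapse) mode.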

  20. Generation of an ABCG2^GFPn-puro transgenic line - A tool to study ABCG2 expression in mice

    SciTech Connect

    Orford, Michael; Mean, Richard; Lapathitis, George; Genethliou, Nicholas; Panayiotou, Elena; Panayi, Helen; Malas, Stavros

    2009-06-26

    The ATP-binding cassette (ABC) transporter ABCG2 is expressed by stem cells in many organs and in stem cells of solid tumors. These cells are isolated based on the side population (SP) phenotype, a Hoechst 33342 dye efflux property believed to be conferred by ABCG2. Because of the limitations of this approach we generated transgenic mice that express nuclear GFP (GFPn) coupled to the Puromycin-resistance gene, under the control of ABCG2 promoter/enhancer sequences. We show that ABCG2 is expressed in neural progenitors of the developing forebrain and spinal cord and in embryonic and adult endothelial cells of the brain. Using the neurosphere assay, we isolated tripotent ABCG2-expressing neural stem cells from embryonic mouse brain. This transgenic line is a powerful tool for studying the expression of ABCG2 in many tissues and for performing functional studies in different experimental settings.

  1. Generation of a Knockout Mouse Embryonic Stem Cell Line Using a Paired CRISPR/Cas9 Genome Engineering Tool.

    PubMed

    Wettstein, Rahel; Bodak, Maxime; Ciaudo, Constance

    2016-01-01

    CRISPR/Cas9, originally discovered as a bacterial immune system, has recently been engineered into the latest tool to successfully introduce site-specific mutations in a variety of different organisms. Composed only of the Cas9 protein and one engineered guide RNA, this system is much less complex in its setup and easier to handle than other guided nucleases such as zinc-finger nucleases or TALENs. Here, we describe the simultaneous transfection of two paired CRISPR sgRNA-Cas9 plasmids in mouse embryonic stem cells (mESCs), resulting in the knockout of the selected target gene. Together with a four-primer evaluation system, this poses an efficient way to generate new independent knockout mouse embryonic stem cell lines. PMID:25762293
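For context, sgRNA target selection for SpCas9 hinges on finding an "NGG" PAM immediately 3' of a 20-nt protospacer. A minimal forward-strand scan, illustrative only; real design tools also scan the reverse strand and score off-targets:

```python
def find_sgrna_sites(seq, protospacer_len=20):
    """Scan the forward strand for SpCas9 'NGG' PAMs and return
    (start, protospacer, PAM) for every site with a full-length
    protospacer immediately 5' of the PAM."""
    sites = []
    for i in range(protospacer_len, len(seq) - 2):
        if seq[i + 1 : i + 3] == "GG":           # N-G-G at i, i+1, i+2
            start = i - protospacer_len
            sites.append((start, seq[start:i], seq[i : i + 3]))
    return sites

# Hypothetical demo sequence with exactly two PAM-adjacent sites.
demo = "A" * 20 + "TGG" + "C" * 20 + "AGG"
hits = find_sgrna_sites(demo)
```

Each hit gives the protospacer coordinates a design pipeline would then filter on GC content and off-target score.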

  2. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genomic hybridization (arrayCGH) and genotyping arrays were the standard technologies for detecting large regions subject to copy number changes in genomes until, more recently, high-resolution sequence data became available through next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169
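The simplest family of NGS-based CNV methods reviewed in such articles works on read depth: bin the genome, normalize coverage by the genome-wide median, and flag bins whose log2 ratio departs from zero. A minimal sketch with synthetic coverage (the threshold and bin counts are hypothetical; real callers also correct for GC bias and mappability and segment adjacent bins):

```python
import numpy as np

def call_cnv(depth, threshold=0.4):
    """Minimal read-depth CNV caller: normalize per-bin coverage by the
    genome-wide median and flag bins whose log2 ratio exceeds the
    threshold (gain) or falls below its negative (loss)."""
    depth = np.asarray(depth, float)
    log2_ratio = np.log2(depth / np.median(depth))
    return np.where(log2_ratio > threshold, "gain",
                    np.where(log2_ratio < -threshold, "loss", "normal"))

# Synthetic diploid coverage of ~100x per bin, with a duplicated region
# (bins 10-14, ~150x -> log2 1.5 ~ 0.58) and a heterozygous-deletion-like
# region (bins 30-34, ~50x -> log2 0.5 = -1).
depth = np.full(50, 100.0)
depth[10:15] = 150.0
depth[30:35] = 50.0

calls = call_cnv(depth)
```

The duplicated bins come back as "gain" and the depleted bins as "loss", with everything else "normal".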

  3. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Abstract Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  4. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," however, a crucial missing piece in the current era is our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®—the human gene database; MalaCards—the human diseases database; and PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
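    GeneAnalytics' scoring algorithms are proprietary, but the core operation of any gene set enrichment tool can be illustrated with the standard one-sided hypergeometric (Fisher exact) tail test: given a query gene list, how surprising is its overlap with an annotated gene set? A minimal sketch (gene names, set sizes, and the background universe below are invented for illustration):

```python
from math import comb

def enrichment_p_value(query_genes, gene_set, universe_size):
    """One-sided hypergeometric p-value for the overlap between a query
    gene list and an annotated gene set, given a background universe."""
    q = len(query_genes)
    s = len(gene_set)
    k = len(set(query_genes) & set(gene_set))  # observed overlap
    # P(overlap >= k): sum the hypergeometric upper tail
    total = comb(universe_size, q)
    return sum(comb(s, i) * comb(universe_size - s, q - i)
               for i in range(k, min(q, s) + 1)) / total

# Toy example: 3 of 5 query genes fall in a 10-gene pathway out of 100 genes
p = enrichment_p_value(["A", "B", "C", "D", "E"],
                       ["A", "B", "C", "X", "Y", "Z", "U", "V", "W", "T"],
                       universe_size=100)
```

A real tool would apply this (or an evidence-weighted variant) across thousands of curated sets and correct for multiple testing.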

  5. A Scalable and Accurate Targeted Gene Assembly Tool (SAT-Assembler) for Next-Generation Sequencing Data

    PubMed Central

    Zhang, Yuan; Sun, Yanni; Cole, James R.

    2014-01-01

    Gene assembly, which recovers gene segments from short reads, is an important step in the functional analysis of next-generation sequencing data. In the absence of quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and for metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented or chimeric contigs, or have a high memory footprint. In this work, we introduce SAT-Assembler, a targeted gene assembly program that aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and a lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in the supplementary material. PMID:25122209
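    The overlap graph construction and traversal mentioned above can be illustrated, in a much simplified form, by a greedy suffix-prefix overlap assembler. This is not SAT-Assembler's algorithm (which is family-specific and homology-guided); it is only a toy sketch of the underlying overlap-and-merge idea, with invented reads:

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that is a prefix of b (>= min_len)."""
    for l in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-l:] == b[:l]:
            return l
    return 0

def assemble(reads, min_len=3):
    """Greedy traversal of the suffix-prefix overlap graph: repeatedly merge
    the pair of reads with the largest overlap until none remains."""
    reads = list(reads)
    while len(reads) > 1:
        best = (0, None, None)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    l = overlap(a, b, min_len)
                    if l > best[0]:
                        best = (l, i, j)
        l, i, j = best
        if l == 0:
            break  # no remaining overlaps: contigs stay fragmented
        merged = reads[i] + reads[j][l:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)] + [merged]
    return reads

contigs = assemble(["ATGGCGT", "GCGTACG", "ACGTTAA"])
```

Real assemblers avoid this quadratic pairwise comparison with indexing, and, as the abstract notes, must additionally cope with uneven coverage and near-identical homologs.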

  6. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    PubMed

    Zhang, Yuan; Sun, Yanni; Cole, James R

    2014-08-01

    Gene assembly, which recovers gene segments from short reads, is an important step in the functional analysis of next-generation sequencing data. In the absence of quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and for metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented or chimeric contigs, or have a high memory footprint. In this work, we introduce SAT-Assembler, a targeted gene assembly program that aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and a lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in the supplementary material. PMID:25122209

  7. A new generation of tools for search, recovery and quality evaluation of World Wide Web medical resources.

    PubMed

    Aguillo, I

    2000-01-01

    Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of topics addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites. PMID:11142063
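    The cybermetric indicator proposed above, counting hypertext links from a controlled sample of quality super-sites as a form of indirect peer review, amounts to a simple inbound-link tally. A hypothetical sketch (all site names and link data are invented):

```python
def visibility_score(site, link_sources):
    """Cybermetric visibility: number of distinct 'quality' source sites
    whose outbound links cite the target site (indirect peer review)."""
    return sum(1 for source, links in link_sources.items()
               if site in links and source != site)

# Toy controlled sample of super-sites and the sites they link to
sample = {
    "supersite-a.org": {"clinic.example", "atlas.example"},
    "supersite-b.org": {"clinic.example"},
    "supersite-c.org": {"atlas.example"},
}
score = visibility_score("clinic.example", sample)
```

In practice the link sets would be extracted by crawling the controlled sample, and the raw count could be normalized by site size or sample coverage.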

  8. Next-Generation Phage Display: Integrating and Comparing Available Molecular Tools to Enable Cost-Effective High-Throughput Analysis

    PubMed Central

    Dias-Neto, Emmanuel; Nunes, Diana N.; Giordano, Ricardo J.; Sun, Jessica; Botz, Gregory H.; Yang, Kuan; Setubal, João C.; Pasqualini, Renata; Arap, Wadih

    2009-01-01

    Background: Combinatorial phage display has been used over the last 20 years in the identification of protein ligands and protein-protein interactions, uncovering relevant molecular recognition events. The rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges. Methodology/Principal Findings: We gained efficiency by applying, in tandem, real-time PCR for rapid quantification to enable bacteria-free phage display library screening, and phage DNA next-generation sequencing for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy in a longer timeframe post library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing units (TU), with no biases due to GC content, codon usage, or amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate, and reduce costs ∼250-fold for generating 10^6 ligand sequences. Conclusions/Significance: Our analyses demonstrate that whereas this approach correlates with traditional colony counting, it is also capable of much larger sampling, allowing a faster, less expensive, more accurate and more consistent analysis of phage enrichment. Overall, qPhage plus pyrosequencing is

  9. Evaluation of pulsed laser ablation in liquids generated gold nanoparticles as novel transfection tools: efficiency and cytotoxicity

    NASA Astrophysics Data System (ADS)

    Willenbrock, Saskia; Durán, María Carolina; Barchanski, Annette; Barcikowski, Stephan; Feige, Karsten; Nolte, Ingo; Murua Escobar, Hugo

    2014-03-01

    Varying transfection efficiencies and cytotoxicity are crucial aspects in cell manipulation. The utilization of gold nanoparticles (AuNP) has lately attracted special interest for enhancing transfection efficiency. Conventional AuNP are usually generated by chemical reactions or gas pyrolysis, often requiring cell-toxic stabilizers or coatings to conserve their characteristics. Alternatively, stabilizer- and coating-free, highly pure, colloidal AuNP can be generated by pulsed laser ablation in liquids (PLAL). Mammalian cells have been transfected efficiently by the addition of PLAL-AuNP, but data systematically evaluating their cell-toxic potential are lacking. Herein, the transfection efficiency and cytotoxicity of PLAL-AuNP were evaluated by transfection of a mammalian cell line with a recombinant HMGB1/GFP DNA expression vector. Different methods were compared using two sizes of PLAL-AuNP, commercialized AuNP, two magnetic-NP-based protocols and a conventional transfection reagent (FuGENE HD; FHD). PLAL-AuNP were generated using a Spitfire Pro femtosecond laser system delivering 120 fs laser pulses at a wavelength of 800 nm, focusing the fs-laser beam on a 99.99% pure gold target placed in ddH2O. Transfection efficiencies were analyzed after 24 h using fluorescence microscopy and flow cytometry. Toxicity was assessed by measuring cell proliferation and the percentage of necrotic, propidium iodide-positive cells (PI%). The addition of PLAL-AuNP significantly enhanced transfection efficiencies (FHD: 31%; PLAL-AuNP size-1: 46%; size-2: 50%) with an increased PI% but no reduction in cell proliferation. Transfection with commercial AuNP showed significantly lower efficiency (23%), a slightly increased PI% and reduced cell proliferation. Magnetic-NP-based methods were less effective but also showed the lowest cytotoxicity. In conclusion, the addition of PLAL-AuNP provides a novel tool for enhancing transfection efficiency with acceptable cytotoxic side-effects.

  10. Do sunbirds use taste to decide how much to drink?

    PubMed

    Bailey, Ida E; Nicolson, Susan W

    2016-03-01

    Nectarivorous birds typically consume smaller meals of more concentrated sugar solutions than of less concentrated ones. It is not clear, however, whether they use taste to decide how much to consume or whether they base this decision on post-ingestive feedback. Taste, a cue to nectar concentration, is available to nectarivores during ingestion, whereas post-ingestive information about resource quality becomes available only after a meal. When conditions are variable, we would expect nectarivorous birds to base their decisions on how much to consume on taste, as post-ingestive feedback from previous meals would not be a reliable cue to current resource quality. Here, we tested whether white-bellied sunbirds (Cinnyris talatala), foraging from an array of artificial flowers, use taste to decide how much to consume per meal when nectar concentration is highly variable: they did not. Instead, how much they chose to consume per meal appeared to depend on the energy intake at the previous meal, that is, on how hungry they were. Our birds did, however, appear to use taste to decide how much to consume per flower visited within a meal. Unexpectedly, some individuals preferred to consume more from flowers with lower-concentration rewards and some preferred to do the opposite. We draw attention to the fact that many studies perhaps misleadingly claim that birds use sweet taste to inform their foraging decisions, as they analyse mean data for multiple meals, over which post-ingestive feedback will have become available, rather than data for individual meals, when only sensory information is available. We discuss how conflicting foraging rules could explain why sunbirds do not use sweet taste to inform their meal size decisions. PMID:26618299

  11. Wireless sensor systems for sense/decide/act/communicate.

    SciTech Connect

    Berry, Nina M.; Cushner, Adam; Baker, James A.; Davis, Jesse Zehring; Stark, Douglas P.; Ko, Teresa H.; Kyker, Ronald D.; Stinnett, Regan White; Pate, Ronald C.; Van Dyke, Colin; Kyckelhahn, Brian

    2003-12-01

    After 9/11, the United States (U.S.) was suddenly pushed into challenging situations it could no longer ignore as a simple spectator. The War on Terrorism (WoT) ignited, and no one knows when this war will end. While the government is exploring many existing and potential technologies, the area of wireless sensor networks (WSN) has emerged as a foundation for establishing future national security. Unlike other technologies, WSN could provide the virtual-presence capabilities needed for precision awareness and response in military, intelligence, and homeland security applications. The Advanced Concepts Group (ACG) vision of a Sense/Decide/Act/Communicate (SDAC) sensor system is an instantiation of the WSN concept that takes a 'system of systems' view. Each sensing node will exhibit the ability to: sense the environment around it, decide as a collective what the situation of the environment is, act in an intelligent and coordinated manner in response to this situational determination, and communicate its actions amongst its peers and to a human command. This LDRD report reviews the research and development done to bring the SDAC vision closer to reality.
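    The SDAC cycle described above can be sketched as a minimal node object. The threshold, the readings, and the collective averaging rule below are invented for illustration and are not taken from the report:

```python
class SensorNode:
    """Minimal sketch of a Sense/Decide/Act/Communicate (SDAC) node."""

    def __init__(self, node_id, threshold=0.5):
        self.node_id = node_id
        self.threshold = threshold
        self.last_reading = None

    def sense(self, reading):
        # Sense: record the local environment measurement
        self.last_reading = reading
        return reading

    def decide(self, neighbor_readings):
        # Decide as a collective: alarm if the group's mean reading
        # crosses the threshold (a stand-in for real data fusion)
        readings = [self.last_reading] + list(neighbor_readings)
        return sum(readings) / len(readings) > self.threshold

    def act(self, alarm):
        # Act: pick a coordinated response to the situational determination
        return "raise_alarm" if alarm else "stand_by"

    def communicate(self, action):
        # Communicate: report the action to peers and to human command
        return {"node": self.node_id, "action": action}

node = SensorNode("n1", threshold=0.5)
node.sense(0.9)
msg = node.communicate(node.act(node.decide([0.4, 0.7])))
```

A real deployment would replace each stage with radio communication, distributed consensus, and actuation, but the control flow is the same four-step loop.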

  12. Dynamic combinatorial/covalent chemistry: a tool to read, generate and modulate the bioactivity of compounds and compound mixtures.

    PubMed

    Herrmann, Andreas

    2014-03-21

    Reversible covalent bond formation under thermodynamic control adds reactivity to self-assembled supramolecular systems, and is therefore an ideal tool to assess complexity of chemical and biological systems. Dynamic combinatorial/covalent chemistry (DCC) has been used to read structural information by selectively assembling receptors with the optimum molecular fit around a given template from a mixture of reversibly reacting building blocks. This technique allows access to efficient sensing devices and the generation of new biomolecules, such as small molecule receptor binders for drug discovery, but also larger biomimetic polymers and macromolecules with particular three-dimensional structural architectures. Adding a kinetic factor to a thermodynamically controlled equilibrium results in dynamic resolution and in self-sorting and self-replicating systems, all of which are of major importance in biological systems. Furthermore, the temporary modification of bioactive compounds by reversible combinatorial/covalent derivatisation allows control of their release and facilitates their transport across amphiphilic self-assembled systems such as artificial membranes or cell walls. The goal of this review is to give a conceptual overview of how the impact of DCC on supramolecular assemblies at different levels can allow us to understand, predict and modulate the complexity of biological systems. PMID:24296754

  13. Next generation genome-wide association tool: Design and coverage of a high-throughput European-optimized SNP array

    PubMed Central

    Hoffmann, Thomas J.; Kvale, Mark N.; Hesselson, Stephanie E.; Zhan, Yiping; Aquino, Christine; Cao, Yang; Cawley, Simon; Chung, Elaine; Connell, Sheryl; Eshragh, Jasmin; Ewing, Marcia; Gollub, Jeremy; Henderson, Mary; Hubbell, Earl; Iribarren, Carlos; Kaufman, Jay; Lao, Richard Z.; Lu, Yontao; Ludwig, Dana; Mathauda, Gurpreet K.; McGuire, William; Mei, Gangwu; Miles, Sunita; Purdy, Matthew M.; Quesenberry, Charles; Ranatunga, Dilrini; Rowell, Sarah; Sadler, Marianne; Shapero, Michael H.; Shen, Ling; Shenoy, Tanushree R.; Smethurst, David; Van den Eeden, Stephen K.; Walter, Larry; Wan, Eunice; Wearley, Reid; Webster, Teresa; Wen, Christopher C.; Weng, Li; Whitmer, Rachel A.; Williams, Alan; Wong, Simon C.; Zau, Chia; Finn, Andrea; Schaefer, Catherine; Kwok, Pui-Yan; Risch, Neil

    2011-01-01

    The success of genome-wide association studies has paralleled the development of efficient genotyping technologies. We describe the development of a next-generation microarray based on the new, highly efficient Affymetrix Axiom genotyping technology, which we are using to genotype individuals of European ancestry from the Kaiser Permanente Research Program on Genes, Environment and Health (RPGEH). The array contains 674,517 SNPs and provides excellent genome-wide as well as gene-based and candidate-SNP coverage. Coverage was calculated using an approach based on imputation and cross-validation. Preliminary results for the first 80,301 saliva-derived DNA samples from the RPGEH demonstrate very high quality genotypes, with sample success rates above 94% and over 98% of successful samples having SNP call rates exceeding 98%. At steady state, we have produced 462 million genotypes per week for each Axiom system. The new array provides a valuable addition to the repertoire of tools for large-scale genome-wide association studies. PMID:21565264
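    The quality metrics quoted above reduce to simple fractions of non-missing genotype calls. A toy sketch with an invented genotype matrix; the 98% per-sample call-rate threshold echoes the figure in the text, but the data and filtering rule are purely illustrative:

```python
def call_rate(calls):
    """Fraction of non-missing genotype calls (None marks a no-call)."""
    return sum(c is not None for c in calls) / len(calls)

# Toy genotype matrix: rows = samples, columns = SNPs
genotypes = [
    ["AA", "AB", "BB", None],   # sample 1: 3 of 4 SNPs called
    ["AA", "AB", "BB", "AB"],   # sample 2: 4 of 4 SNPs called
    ["AA", None, None, None],   # sample 3: 1 of 4 SNPs called
]
sample_rates = [call_rate(row) for row in genotypes]
# Keep only samples whose call rate clears the (illustrative) 98% bar
passing = [r for r in sample_rates if r >= 0.98]
```

Per-SNP call rates are computed the same way down the columns, typically after failing samples have been excluded.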

  14. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes.

    PubMed

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A; Albrectsen, Benedicte R; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high-throughput assay for characterizing and monitoring microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of PM data challenging. For best results, we recommend that the growth conditions for the fungi be carefully standardized. In addition, rigorous replication and control strategies should be employed, whether using pre-configured commercial microwell plates or in-house-designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool for characterizing the metabolic capabilities of individual endophyte isolates, or of successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees. PMID:26441951

  15. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes

    PubMed Central

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A.; Albrectsen, Benedicte R.; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high-throughput assay for characterizing and monitoring microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of PM data challenging. For best results, we recommend that the growth conditions for the fungi be carefully standardized. In addition, rigorous replication and control strategies should be employed, whether using pre-configured commercial microwell plates or in-house-designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool for characterizing the metabolic capabilities of individual endophyte isolates, or of successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees. PMID:26441951

  16. Profiling biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets using gastrointestinal simulation technology.

    PubMed

    Wu, Chunnuan; Sun, Le; Sun, Jin; Yang, Yajun; Ren, Congcong; Ai, Xiaoyu; Lian, He; He, Zhonggui

    2013-09-10

    The aim of the present study was to correlate the in vitro properties of a drug formulation with its in vivo performance, and to elucidate the deciding properties of oral absorption. Gastrointestinal simulation technology (GST), implemented in the GastroPlus™ software, was used to simulate the in vivo plasma concentration-time curve. Lansoprazole, a typical BCS class II drug, was chosen as the model drug. First, physicochemical and pharmacokinetic parameters of lansoprazole were determined or collected from the literature to construct the model. The developed model was validated by comparing the predicted plasma concentration data with the experimental data; the predicted curve was in good agreement with the experimental data. Then, parameter sensitivity analysis (PSA) was performed to find the key parameters of oral absorption. For lansoprazole enteric-coated tablets, absorption was particularly sensitive to dose, solubility and particle size. With a single dose of 30 mg and a solubility of 0.04 mg/ml, absorption was complete. Good absorption could be achieved with the lansoprazole particle radius down to about 25 μm. In summary, GST is a useful tool for profiling the biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets and for guiding formulation optimization. PMID:23806811
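    Parameter sensitivity analysis of the kind described, varying one input at a time and observing the predicted absorption, can be sketched with a deliberately crude dissolution-limited model. The model form, the 750 ml fluid volume, and the 25 μm reference radius are illustrative assumptions, not GastroPlus internals; the toy model is merely tuned so that the abstract's base case (30 mg, 0.04 mg/ml) comes out fully absorbed:

```python
def fraction_absorbed(dose_mg, solubility_mg_ml, particle_radius_um):
    """Toy dissolution-limited absorption model (illustrative only):
    absorption falls as the dose/solubility ratio and particle radius grow."""
    dissolution = solubility_mg_ml * 750.0 / dose_mg   # assumed 750 ml luminal fluid
    size_penalty = 25.0 / max(particle_radius_um, 25.0)
    return min(1.0, dissolution) * size_penalty

def sensitivity(param_values, fixed, name):
    """One-at-a-time PSA: sweep one parameter, hold the others fixed."""
    return [fraction_absorbed(**{**fixed, name: v}) for v in param_values]

base = {"dose_mg": 30.0, "solubility_mg_ml": 0.04, "particle_radius_um": 25.0}
fa_base = fraction_absorbed(**base)
fa_radius = sensitivity([25.0, 50.0, 100.0], base, "particle_radius_um")
```

The PSA output (here, absorption dropping as the radius sweep coarsens the particles) is what identifies dose, solubility and particle size as the deciding properties.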

  17. Methylation status of IGFBP-3 as a useful clinical tool for deciding on a concomitant radiotherapy

    PubMed Central

    Pernía, Olga; Belda-Iniesta, Cristobal; Pulido, Veronica; Cortes-Sempere, María; Rodriguez, Carlos; Vera, Olga; Soto, Javier; Jiménez, Julia; Taus, Alvaro; Rojo, Federico; Arriola, Edurne; Rovira, Ana; Albanell, Joan; Macías, M Teresa; de Castro, Javier; Perona, Rosario; Ibañez de Caceres, Inmaculada

    2014-01-01

    The methylation status of the IGFBP-3 gene is strongly associated with cisplatin sensitivity in patients with non-small cell lung cancer (NSCLC). In this study, we found in vitro evidence linking the presence of an unmethylated promoter with poor response to radiation. Our data also indicate that radiation might sensitize chemotherapy-resistant cells by reactivating IGFBP-3 expression through promoter demethylation, inactivating the PI3K/AKT pathway. We also explored the effect of IGFBP-3 methylation on overall survival (OS) in a population of 40 NSCLC patients who received adjuvant therapy after R0 surgery. Our results indicate that patients harboring an unmethylated promoter could benefit more from a chemotherapy schedule alone than from multimodality therapy involving radiotherapy and platinum-based treatments, increasing their OS by 2.5 years (p = 0.03). Our findings rule out this epi-marker as a prognostic factor in a patient population without adjuvant therapy, indicating that radiotherapy does not improve survival for patients harboring an unmethylated IGFBP-3 promoter. PMID:25482372

  18. Methylation status of IGFBP-3 as a useful clinical tool for deciding on a concomitant radiotherapy.

    PubMed

    Pernía, Olga; Belda-Iniesta, Cristobal; Pulido, Veronica; Cortes-Sempere, María; Rodriguez, Carlos; Vera, Olga; Soto, Javier; Jiménez, Julia; Taus, Alvaro; Rojo, Federico; Arriola, Edurne; Rovira, Ana; Albanell, Joan; Macías, M Teresa; de Castro, Javier; Perona, Rosario; Ibañez de Caceres, Inmaculada

    2014-11-01

    The methylation status of the IGFBP-3 gene is strongly associated with cisplatin sensitivity in patients with non-small cell lung cancer (NSCLC). In this study, we found in vitro evidence linking the presence of an unmethylated promoter with poor response to radiation. Our data also indicate that radiation might sensitize chemotherapy-resistant cells by reactivating IGFBP-3 expression through promoter demethylation, inactivating the PI3K/AKT pathway. We also explored the effect of IGFBP-3 methylation on overall survival (OS) in a population of 40 NSCLC patients who received adjuvant therapy after R0 surgery. Our results indicate that patients harboring an unmethylated promoter could benefit more from a chemotherapy schedule alone than from multimodality therapy involving radiotherapy and platinum-based treatments, increasing their OS by 2.5 years (p = 0.03). Our findings rule out this epi-marker as a prognostic factor in a patient population without adjuvant therapy, indicating that radiotherapy does not improve survival for patients harboring an unmethylated IGFBP-3 promoter. PMID:25482372

  19. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. This kind of termination criterion has been criticized for its lack of automation. In an earlier paper we showed how to construct an automated termination criterion when the recursion is aligned with the rewrite relation, and demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.
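    To make the objects concrete: a string rewriting system is a set of rules l → r applied to substrings, and termination asks whether every derivation is finite. The bounded search below is only an illustrative approximation (the paper's contribution is a genuine decision procedure for the ancestor match-bounded class, not a depth cutoff):

```python
def rewrite_steps(word, rules):
    """All words reachable in one rewrite step: replace any one
    occurrence of a left-hand side by the corresponding right-hand side."""
    out = []
    for lhs, rhs in rules:
        start = word.find(lhs)
        while start != -1:
            out.append(word[:start] + rhs + word[start + len(lhs):])
            start = word.find(lhs, start + 1)
    return out

def terminates_on(word, rules, max_depth=100):
    """Check that every derivation from `word` halts within max_depth steps.
    This bounded search can only approximate true termination."""
    frontier = [word]
    for _ in range(max_depth):
        frontier = [w for u in frontier for w in rewrite_steps(u, rules)]
        if not frontier:
            return True  # no derivation survives: the system halted on `word`
    return False

# The rule ab -> ba terminates on any finite word: each step moves an 'a'
# rightward past a 'b', and that can happen only finitely often.
halts = terminates_on("aabb", [("ab", "ba")])
```

Deciding termination for all inputs, rather than checking one word to a depth bound, is exactly what requires the match-bound machinery.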

  20. Monte Carlo event generators in atomic collisions: A new tool to tackle the few-body dynamics

    NASA Astrophysics Data System (ADS)

    Ciappina, M. F.; Kirchner, T.; Schulz, M.

    2010-04-01

    We present a set of routines to produce theoretical event files, for both single and double ionization of atoms by ion impact, based on a Monte Carlo event generator (MCEG) scheme. Such event files are the theoretical counterpart of the data obtained from a kinematically complete experiment; i.e. they contain the momentum components of all collision fragments for a large number of ionization events. Among the advantages of working with theoretical event files is the possibility to incorporate the conditions present in a real experiment, such as the uncertainties in the measured quantities. Additionally, by manipulating them it is possible to generate any type of cross section, especially those that are usually too complicated to compute with conventional methods due to a lack of symmetry. Consequently, the numerical effort of such calculations is dramatically reduced. We show examples for both single and double ionization, with special emphasis on a new data analysis tool, called four-body Dalitz plots, developed very recently.
    Program summary:
    Program title: MCEG
    Catalogue identifier: AEFV_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFV_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2695
    No. of bytes in distributed program, including test data, etc.: 18 501
    Distribution format: tar.gz
    Programming language: FORTRAN 77 with parallelization directives using scripting
    Computer: single machines using Linux and Linux servers/clusters
    Operating system: Linux (any version and flavor) with FORTRAN 77 compilers
    Has the code been vectorised or parallelized?: Yes
    RAM: 64-128 kBytes (the codes are very CPU intensive)
    Classification: 2.6
    Nature of problem: The code deals with single and double
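    Independent of the physics model, the core of an MCEG is sampling fragment momenta from a theoretical distribution so that each accepted sample becomes one line of the event file. A sketch using acceptance-rejection sampling with an invented toy distribution (the distributed FORTRAN code implements far more elaborate ionization cross sections):

```python
import math
import random

def generate_events(n, pdf, bounds, pdf_max=1.0, seed=0):
    """Acceptance-rejection sampling of (px, py, pz) momentum triples from
    an unnormalized distribution; each accepted triple is one 'event'.
    pdf_max must bound the pdf over the sampling box (assumed known)."""
    rng = random.Random(seed)
    lo, hi = bounds
    events = []
    while len(events) < n:
        p = tuple(rng.uniform(lo, hi) for _ in range(3))
        # Accept the candidate with probability pdf(p) / pdf_max
        if rng.uniform(0.0, pdf_max) < pdf(*p):
            events.append(p)
    return events

# Toy 'cross section': Gaussian-like falloff in momentum magnitude (a.u.)
toy_pdf = lambda px, py, pz: math.exp(-(px**2 + py**2 + pz**2))
events = generate_events(500, toy_pdf, bounds=(-2.0, 2.0))
```

Once the event file exists, any cross section (or a four-body Dalitz plot) is just a histogram over the stored momenta, with experimental resolution applied by smearing each component.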

  1. E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making, and then directly to the mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain situational awareness

  2. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    S, Kyriacou; E, Kontoleontos; S, Weissenberger; L, Mangani; E, Casartelli; I, Skouteropoulou; M, Gattringer; A, Gehrer; M, Buchmayr

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (the EASY software), a fast solver (block-coupled CFD) and a flexible geometry generation tool. The EASY optimization software is a PCA-driven, metamodel-assisted evolutionary algorithm (MAEA(PCA)) that can be used in both single-objective (SOO) and multi-objective (MOO) optimization problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For an additional reduction of the optimization CPU cost, the PCA technique is used to identify dependencies among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based, block-coupled approach, solving the governing equations simultaneously, which, apart from being robust, yields a considerable saving in computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on B-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
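    The surrogate-screening step of a metamodel-assisted evolutionary algorithm can be sketched in a few lines. This is an illustrative toy, not the EASY software: the inverse-distance surrogate, the recombination operator and all parameters are assumptions, and a cheap analytic function stands in for the CFD evaluation.

```python
import random

def expensive_eval(x):
    # Stand-in for the costly CFD evaluation (cheap analytic test function).
    return sum((xi - 0.5) ** 2 for xi in x)

def surrogate(x, archive):
    """Inverse-distance-weighted prediction from already evaluated designs."""
    num = den = 0.0
    for xa, fa in archive:
        d = sum((a - b) ** 2 for a, b in zip(xa, x)) ** 0.5
        if d < 1e-12:
            return fa
        w = 1.0 / d
        num += w * fa
        den += w
    return num / den

def maea_step(population, archive, n_offspring=20, n_exact=5):
    """One generation: breed offspring, screen them with the surrogate,
    and run the expensive evaluation only on the most promising few."""
    offspring = []
    for _ in range(n_offspring):
        p1, p2 = random.sample(population, 2)
        child = [(a + b) / 2 + random.gauss(0, 0.05) for a, b in zip(p1, p2)]
        offspring.append(child)
    # Surrogate pre-screening: keep only the n_exact best predictions.
    offspring.sort(key=lambda x: surrogate(x, archive))
    for child in offspring[:n_exact]:
        archive.append((child, expensive_eval(child)))
    # Survivor selection over everything evaluated so far.
    archive.sort(key=lambda p: p[1])
    return [x for x, _ in archive[:len(population)]], archive

random.seed(1)
pop = [[random.random() for _ in range(3)] for _ in range(10)]
archive = [(x, expensive_eval(x)) for x in pop]
for _ in range(15):
    pop, archive = maea_step(pop, archive)
print(round(archive[0][1], 4))  # best objective value found so far
```

    The PCA-driven part of MAEA(PCA), which rotates the evolution operators along directions of variable dependency, is omitted here for brevity.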

  3. FANSe2: a robust and cost-efficient alignment tool for quantitative next-generation sequencing applications.

    PubMed

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-Jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of deep sequencing data inevitably depends on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To combine both advantages, we developed FANSe2, an algorithm with an iterative mapping strategy based on the statistics of real-world sequencing error distributions, which substantially accelerates the mapping without compromising accuracy. Its sensitivity and accuracy are higher than those of the BWT-based algorithms in tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms yield false positives and false negatives. FANSe2 showed remarkably better consistency with microarray data than most other algorithms in terms of gene expression quantification. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours, with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/. PMID:24743329
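    The iterative, seed-based mapping idea can be illustrated with a naive sketch. This is not the FANSe2 algorithm, only a toy: the exact k-mer index, the fixed mismatch schedule (0, 1, 2) and the tiny reference are assumptions made for illustration.

```python
from collections import defaultdict

def build_index(ref, k):
    # Exact k-mer index of the reference sequence.
    idx = defaultdict(list)
    for i in range(len(ref) - k + 1):
        idx[ref[i:i + k]].append(i)
    return idx

def map_read(read, ref, idx, k, max_mm):
    # Seed with each k-mer of the read, extend, and count mismatches.
    for off in range(len(read) - k + 1):
        for pos in idx.get(read[off:off + k], []):
            start = pos - off
            if start < 0 or start + len(read) > len(ref):
                continue
            mm = sum(a != b for a, b in zip(read, ref[start:start + len(read)]))
            if mm <= max_mm:
                return start, mm
    return None

def iterative_map(reads, ref, k=8):
    # Round 1 allows no mismatches; unmapped reads are retried with a
    # progressively relaxed threshold (a crude analogue of rounds driven
    # by sequencing-error statistics).
    idx = build_index(ref, k)
    hits, pending = {}, list(reads)
    for max_mm in (0, 1, 2):
        remaining = []
        for r in pending:
            h = map_read(r, ref, idx, k, max_mm)
            if h is not None:
                hits[r] = h
            else:
                remaining.append(r)
        pending = remaining
    return hits

ref = "ACGTACGTTTGCACGTAGGCTTACGGATC" * 3
reads = [ref[5:20], ref[10:25].replace("G", "C", 1)]
print(iterative_map(reads, ref))  # maps to (start, mismatches) pairs
```

    The perfect read maps in the first round; the read with one substitution is picked up in the relaxed second round.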

  5. A Graphic Symbol Tool for the Evaluation of Communication, Satisfaction and Priorities of Individuals with Intellectual Disability Who Use a Speech Generating Device

    ERIC Educational Resources Information Center

    Valiquette, Christine; Sutton, Ann; Ska, Bernadette

    2010-01-01

    This article reports on the views of individuals with learning disability (LD) on their use of their speech generating devices (SGDs), their satisfaction about their communication, and their priorities. The development of an interview tool made of graphic symbols and entitled Communication, Satisfaction and Priorities of SGD Users (CSPU) is…

  6. Assessing next-generation sequencing and 4 bioinformatics tools for detection of Enterovirus D68 and other respiratory viruses in clinical samples.

    PubMed

    Huang, Weihua; Wang, Guiqing; Lin, Henry; Zhuge, Jian; Nolan, Sheila M; Vail, Eric; Dimitrova, Nevenka; Fallon, John T

    2016-05-01

    We used 4 different bioinformatics algorithms to evaluate the application of a metagenomic shotgun sequencing method for the detection of Enterovirus D68 and other respiratory viruses in clinical specimens. Our data support the conclusion that next-generation sequencing, combined with improved bioinformatics tools, is practically feasible and useful for the clinical diagnosis of viral infections. PMID:26971640

  7. Generations.

    PubMed

    Chambers, David W

    2005-01-01

    Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession. PMID:16623137

  8. Dormancy and germination: How does the crop seed decide?

    PubMed

    Shu, K; Meng, Y J; Shuai, H W; Liu, W G; Du, J B; Liu, J; Yang, W Y

    2015-11-01

    Whether seeds germinate or maintain dormancy is decided through intricate physiological processes, and the correct timing of these processes is critical to the plant's life cycle. If moist conditions are encountered, a low dormancy level causes pre-harvest sprouting in various crop species, such as wheat, corn and rice; this decreases crop yield and negatively impacts downstream industrial processing. In contrast, a deep level of seed dormancy prevents normal germination even under favourable conditions, resulting in a low emergence rate during agricultural production. Therefore, an optimal seed dormancy level is valuable for modern mechanised agricultural systems. Over the past several years, numerous studies have demonstrated that diverse endogenous and environmental factors regulate the balance between dormancy and germination, including light, temperature, water status and soil bacteria, as well as phytohormones such as ABA (abscisic acid) and GA (gibberellic acid). In this updated review, we highlight recent advances regarding the molecular mechanisms underlying the regulation of seed dormancy and germination, including external environmental and internal hormonal cues, focusing primarily on staple crop species. Furthermore, future challenges and research directions for developing a full understanding of crop seed dormancy and germination are also discussed. PMID:26095078

  9. Using Tableau to Decide Expressive Description Logics with Role Negation

    NASA Astrophysics Data System (ADS)

    Schmidt, Renate A.; Tishkovsky, Dmitry

    This paper presents a tableau approach for deciding description logics outside the scope of OWL DL/1.1 and current state-of-the-art tableau-based description logic systems. In particular, we define a sound and complete tableau calculus for the description logic ALBO and show that it provides a basis for decision procedures for this logic and numerous other description logics with full role negation. ALBO is the extension of ALC with the Boolean role operators, inverse of roles, and domain and range restriction operators, and it includes full support for nominals (individuals). ALBO is a very expressive description logic which subsumes Boolean modal logic and the two-variable fragment of first-order logic, and reasoning in it is NExpTime-complete. An important novelty is the use of a generic, unrestricted blocking rule as a replacement for the standard loop-checking mechanisms implemented in description logic systems. An implementation of our approach exists in the MetTeL system.
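    The flavor of such a tableau procedure can be shown on plain ALC (without the role operators of ALBO). The sketch below is a hypothetical toy decision procedure, not the paper's calculus: it omits blocking, so termination is only guaranteed on inputs that do not force infinite models.

```python
from itertools import count

def sat(concept):
    """Decide satisfiability of an ALC concept in negation normal form.
    Concepts: ('atom', A), ('neg', A), ('and', C, D), ('or', C, D),
    ('some', r, C), ('all', r, C)."""
    fresh = count(1)

    def run(facts, edges):
        facts, edges = set(facts), set(edges)
        while True:
            # Clash: an atom and its negation on the same individual.
            for (x, c) in facts:
                if c[0] == "neg" and (x, ("atom", c[1])) in facts:
                    return False
            # Deterministic rules: conjunction and universal restriction.
            new = set()
            for (x, c) in facts:
                if c[0] == "and":
                    new |= {(x, c[1]), (x, c[2])}
                elif c[0] == "all":
                    new |= {(y, c[2]) for (u, r, y) in edges
                            if (u, r) == (x, c[1])}
            if not new <= facts:
                facts |= new
                continue
            # Existential rule: create a fresh role successor if needed.
            created = False
            for (x, c) in list(facts):
                if c[0] == "some" and not any(
                        (u, r) == (x, c[1]) and (y, c[2]) in facts
                        for (u, r, y) in edges):
                    y = next(fresh)
                    edges.add((x, c[1], y))
                    facts.add((y, c[2]))
                    created = True
                    break
            if created:
                continue
            # Disjunction: branch, with backtracking via recursion.
            for (x, c) in facts:
                if c[0] == "or" and (x, c[1]) not in facts \
                        and (x, c[2]) not in facts:
                    return (run(facts | {(x, c[1])}, edges)
                            or run(facts | {(x, c[2])}, edges))
            return True  # fully expanded and clash-free: satisfiable

    return run({(0, concept)}, set())

# ∃r.A ⊓ ∀r.¬A is unsatisfiable; ∃r.A alone is satisfiable.
print(sat(("and", ("some", "r", ("atom", "A")),
           ("all", "r", ("neg", "A")))))   # False
print(sat(("some", "r", ("atom", "A"))))   # True
```

    The paper's unrestricted blocking rule would replace this sketch's missing termination mechanism by nondeterministically identifying individuals.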

  10. Point of decision: when do pigeons decide to head home?

    NASA Astrophysics Data System (ADS)

    Schiffner, Ingo; Wiltschko, Roswitha

    2009-02-01

    Pigeons released away from their loft usually fly around at the release site for a while before they finally leave. Visual observations had suggested that the moment when the birds decide to head home is associated with a certain change in flying style. To see whether this change is also reflected in GPS-recorded tracks, a group of pigeons equipped with flight recorders was released at two sites about 10 km from their home loft. The initial part of their flight paths was analyzed in order to find objective criteria indicating the point of decision. We selected the highest increase in steadiness as the best estimate for the moment of decision. This criterion allows us to divide the pigeons’ paths into two distinct phases, an initial phase and the homing phase, with the moment of decision, on average, 2 min after release. The moment of decision marks a change in behavior, with a significant increase in steadiness and flying speed and headings significantly closer to the home direction. The behavior of the individual birds at the two sites was not correlated, suggesting no pronounced individual traits for the length of the initial phase. The behavior during this phase seems to be controlled by flight preparation, exploration, and non-navigational motivations rather than by navigational necessities alone.
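    A "highest increase in steadiness" criterion can be approximated by a simple change-point scan over windowed circular steadiness (the mean resultant length of the headings). The code below is a hypothetical sketch on synthetic data, not the authors' analysis; the window size and the synthetic track are assumptions.

```python
import math
import random

def steadiness(headings):
    # Mean resultant length of a set of headings (1 = perfectly steady).
    n = len(headings)
    c = sum(math.cos(h) for h in headings) / n
    s = sum(math.sin(h) for h in headings) / n
    return math.hypot(c, s)

def decision_point(headings, w=5):
    """Index with the largest jump in windowed steadiness, a simple
    stand-in for the 'highest increase in steadiness' criterion."""
    best_i, best_gain = None, -1.0
    for i in range(w, len(headings) - w):
        gain = steadiness(headings[i:i + w]) - steadiness(headings[i - w:i])
        if gain > best_gain:
            best_i, best_gain = i, gain
    return best_i

# Synthetic track: erratic circling, then a steady homeward heading.
random.seed(0)
track = [random.uniform(0, 2 * math.pi) for _ in range(30)]
track += [0.8 + random.gauss(0, 0.05) for _ in range(30)]
print(decision_point(track))  # detected near the true transition (index 30)
```

    On real tracks one would work with successive GPS fixes converted to headings, and combine steadiness with speed and home-direction criteria as the paper does.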

  11. Share and share alike: deciding how to distribute the scientific and social benefits of genomic data.

    PubMed

    Foster, Morris W; Sharp, Richard R

    2007-08-01

    Emerging technologies make genomic analyses more efficient and less expensive, enabling genome-wide association and gene-environment interaction studies. In anticipation of their results, funding agencies such as the US National Institutes of Health and the Wellcome Trust are formulating guidelines for sharing the large amounts of genomic data that are generated by the projects that they sponsor. Data-sharing policies can have varying implications for how disease susceptibility and drug-response research will be pursued by the scientific community, and for who will benefit from the resulting medical discoveries. We suggest that the complex interplay of stakeholders and their interests, rather than single-issue and single-stakeholder perspectives, should be considered when deciding genomic data-sharing policies. PMID:17607307

  12. Analysis of the heat transfer at the tool-workpiece interface in machining: determination of heat generation and heat transfer coefficients

    NASA Astrophysics Data System (ADS)

    Haddag, B.; Atlati, S.; Nouari, M.; Zenasni, M.

    2015-10-01

    This paper deals with the modelling and identification of the heat exchange at the tool-workpiece interface in machining. A thermomechanical model has been established, including heat balance equations for the tool-workpiece interface that take into account the heat generated by friction and the heat transfer by conduction due to the thermal contact resistance. The interface heat balance equations involve two coefficients: a heat generation coefficient (HGC) for the frictional heat and a heat transfer coefficient (HTC) for the heat conduction (the inverse of the thermal contact resistance coefficient). Using the experimental average heat flux in the tool, estimated for several cutting speeds, an identification procedure for the HGC-HTC couple, involved in the established thermomechanical FE-based modelling of the cutting process, has been proposed, which makes the numerical heat flux match the measured one for each cutting speed. Using the identified values of the HGC-HTC couple, evolution laws are proposed for the HGC as a function of cutting speed, and then as a function of sliding velocity at the tool-workpiece interface. Such laws can be implemented, for instance, in a finite element code for machining simulations.
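    The identification step can be illustrated with a one-dimensional sketch: given a measured tool heat flux and an assumed HTC, find the HGC for which a forward model reproduces the measurement. The forward model below is a made-up stand-in for the paper's FE simulation, and all numbers are illustrative.

```python
def tool_flux(hgc, htc, v, friction_power=1.0e3, dT=150.0):
    """Hypothetical forward model: heat entering the tool is the
    frictionally generated share minus conduction across the contact.
    (Illustrative stand-in for the FE simulation in the paper.)"""
    return hgc * friction_power * v - htc * dT

def identify_hgc(measured, htc, v, lo=0.0, hi=1.0, tol=1e-9):
    # Bisection on the HGC so that the modelled flux matches the
    # measurement (valid because the model is monotonic in the HGC).
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if tool_flux(mid, htc, v) < measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

htc = 10.0   # assumed heat transfer coefficient (illustrative value)
for v in (1.0, 2.0, 4.0):                  # cutting speeds
    measured = tool_flux(0.35, htc, v)     # synthetic "measurement"
    hgc = identify_hgc(measured, htc, v)
    print(v, round(hgc, 4))  # recovers the HGC of 0.35 at each speed
```

    Repeating this for several cutting speeds yields the HGC-versus-speed pairs from which the paper's evolution laws could be fitted.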

  13. Canute Rules the Waves?: Hope for E-Library Tools Facing the Challenge of the "Google Generation"

    ERIC Educational Resources Information Center

    Myhill, Martin

    2007-01-01

    Purpose: To consider the findings of a recent e-resources survey at the University of Exeter Library in the context of the dominance of web search engines in academia, balanced by the development of e-library tools such as the library OPAC, OpenURL resolvers, metasearch engines, LDAP and proxy servers, and electronic resource management modules.…

  14. Development of a 2nd Generation Decision Support Tool to Optimize Resource and Energy Recovery for Municipal Solid Waste

    EPA Science Inventory

    In 2012, EPA’s Office of Research and Development released the MSW decision support tool (MSW-DST) to help identify strategies for more sustainable MSW management. Depending upon local infrastructure, energy grid mix, population density, and waste composition and quantity,...

  15. The Circuit of Culture as a Generative Tool of Contemporary Analysis: Examining the Construction of an Education Commodity

    ERIC Educational Resources Information Center

    Leve, Annabelle M.

    2012-01-01

    Contemporary studies in the field of education cannot afford to neglect the ever present interrelationships between power and politics, economics and consumption, representation and identity. In studying a recent cultural phenomenon in government schools, it became clear that a methodological tool that made sense of these interlinked processes was…

  16. Generation Y, Learner Autonomy and the Potential of Web 2.0 Tools for Language Learning and Teaching

    ERIC Educational Resources Information Center

    Morgan, Liam

    2012-01-01

    Purpose: The purpose of this paper is to examine the relationship between the development of learner autonomy and the application of Web 2.0 tools in the language classroom. Design/methodology/approach: The approach taken is that of qualitative action research within an explicit theoretical framework and the data were collected via surveys and…

  17. The Effectiveness of Virtual Learning Tools for Millennial Generation Students in a Community College Criminal Justice Degree Program

    ERIC Educational Resources Information Center

    Snyder, Lawrence

    2013-01-01

    An analysis of data from the Community College Survey of Student Engagement and multiyear analysis of pretest/posttest scores in introductory criminal justice courses revealed there was a systemic decline in student engagement and achievement. Because of this analysis, a commercial virtual learning tool (CJI) that purported great success in…

  18. PARTNERSHIP FOR THE DEVELOPMENT OF NEXT GENERATION SIMULATION TOOLS TO EVALUATE CEMENTITIOUS BARRIERS AND MATERIALS USED IN NUCLEAR APPLICATION - 8388

    SciTech Connect

    Langton, C; Dimenna, R

    2008-01-29

    The US DOE has initiated a multidisciplinary, cross-cutting project to develop a reasonable and credible set of tools to predict the structural, hydraulic and chemical performance of cement barriers used in nuclear applications over extended time frames (e.g., > 100 years for operating facilities and > 1000 years for waste management). A partnership that combines DOE, NRC, academia, private sector, and international expertise has been formed to accomplish the project objectives by integrating existing information and realizing advancements where necessary. The set of simulation tools and data developed under this project will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems, e.g., waste forms, containment structures, entombments and environmental remediation, including decontamination and decommissioning (D&D) activities. The simulation tools will also support analysis of structural concrete components of nuclear facilities (spent fuel pools, dry spent fuel storage units, and recycling facilities, e.g., fuel fabrication, separations processes). Simulation parameters will be obtained from prior literature and will be experimentally measured under this project, as necessary, to demonstrate application of the simulation tools for three prototype applications (waste form in concrete vault, high level waste tank grouting, and spent fuel pool). Test methods and data needs to support use of the simulation tools for future applications will be defined. This is a national issue that affects all waste disposal sites that use cementitious waste forms and structures, decontamination and decommissioning activities, service life determination of existing structures, and design of future public and private nuclear facilities. The problem is difficult because it requires projecting conditions and responses over extremely long times.
Current performance assessment analyses show that engineered barriers are

  19. A Useful Laboratory Tool

    ERIC Educational Resources Information Center

    Johnson, Samuel A.; Tutt, Tye

    2008-01-01

    Recently, a high school Science Club generated a large number of questions involving temperature. Therefore, they decided to construct a thermal gradient apparatus in order to conduct a wide range of experiments beyond the standard "cookbook" labs. They felt that this apparatus could be especially useful in future ninth-grade biology classes, in…

  20. The determination of waste generation and composition as an essential tool to improve the waste management plan of a university.

    PubMed

    Gallardo, A; Edo-Alcón, N; Carlos, M; Renau, M

    2016-07-01

    When many people work in organized institutions or enterprises, those institutions become large gathering places that also have energy, water and resource needs. One of these needs is the correct management of the waste produced daily by these communities. Universities are a good example of institutions where a great number of people go every day to work or to study. Independently of their task, they use the different services at the university, such as cafeterias, canteens and photocopying, and as a result of their activity a cleaning service is also needed. All these activities generate an environmental impact. Nowadays, many universities have accepted the challenge of minimizing this impact by applying several measures. One of the impacts to be reduced is waste generation. The first step in implementing a waste management plan at a university is to know the composition, the amount and the distribution of the waste generated in its facilities. As waste composition and generation depend, among other things, on the climate, these variables should be analysed over one year. This research work estimates the waste generation and composition of a Spanish university, the Universitat Jaume I, during a school year. To achieve this, all the waste streams generated at the University have been identified and quantified, with emphasis on those that are not controlled. Furthermore, several statistical analyses have been carried out to determine whether the season of the year or the day of the week affects waste generation and composition. All this information will allow the University authorities to propose a set of minimization measures to enhance the current management. PMID:27107706
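    A seasonal effect on waste generation can be tested, for example, with a one-way ANOVA over daily totals grouped by season. The sketch below uses invented numbers and is not the study's actual analysis.

```python
def anova_f(groups):
    """One-way ANOVA F statistic (pure Python): ratio of between-group
    to within-group variance for, e.g., daily waste totals per season."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical daily waste totals (kg) sampled in three seasons.
autumn = [120, 135, 128, 140, 132]
winter = [118, 125, 122, 130, 127]
summer = [60, 72, 65, 70, 68]    # campus nearly empty
print(round(anova_f([autumn, winter, summer]), 1))
```

    A large F relative to the critical value of the F distribution indicates that mean generation differs between seasons; day-of-week effects could be tested the same way.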

  1. Creating User-Friendly Tools for Data Analysis and Visualization in K-12 Classrooms: A Fortran Dinosaur Meets Generation Y

    NASA Technical Reports Server (NTRS)

    Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.

    2008-01-01

    During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), and the brief addition of a local high school student fulfilling a mentorship requirement. The Clouds group was given the task of exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, and the corresponding satellite data. This project began in 1997. The primary analysis tools developed for it were in FORTRAN, a computer language none of the students were familiar with. While they persevered through computer challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google Earth. The result of their efforts, so far, is two different Excel spreadsheets and a Google Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses and comparisons of the ground cloud observation reports and their correspondence with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. This provides a statistical counterpart to the existing S'COOL data visualization tool.
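    The kind of ground-versus-satellite comparison such a spreadsheet performs can be sketched as follows; the paired cloud-cover values and the agreement tolerance are invented for illustration.

```python
# Hypothetical paired observations: ground-reported and satellite-derived
# cloud cover fraction (0-1) for the same time and place.
pairs = [(0.25, 0.30), (0.90, 0.85), (0.10, 0.40), (0.75, 0.70), (0.0, 0.05)]

def cover_stats(pairs, tol=0.1):
    """Summary comparison in the spirit of the S'COOL spreadsheets: mean
    cover from each viewpoint and the share of matches within tolerance."""
    g = sum(p[0] for p in pairs) / len(pairs)
    s = sum(p[1] for p in pairs) / len(pairs)
    agree = sum(abs(a - b) <= tol for a, b in pairs) / len(pairs)
    return round(g, 2), round(s, 2), agree

print(cover_stats(pairs))  # (mean ground, mean satellite, agreement rate)
```

    In a classroom setting the same computation drops naturally into spreadsheet formulas (AVERAGE plus a tolerance-based COUNTIF).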

  2. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories. PMID:24227591
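    A minimal version of the detect-and-filter step (depth and allele-fraction thresholds, which such pipelines commonly apply) might look like the sketch below. The thresholds and record format are assumptions for illustration, not the actual tool's defaults.

```python
def filter_variants(variants, min_cov=30, min_freq=0.2):
    """Toy variant filter in the spirit of the Web tool described above:
    keep calls with enough read depth and a plausible allele fraction."""
    kept = []
    for v in variants:
        freq = v["alt_reads"] / v["depth"]
        if v["depth"] >= min_cov and freq >= min_freq:
            kept.append({**v, "freq": round(freq, 3)})
    return kept

calls = [
    {"pos": 101, "ref": "A", "alt": "G", "depth": 250, "alt_reads": 118},
    {"pos": 202, "ref": "C", "alt": "T", "depth": 12,  "alt_reads": 6},   # low depth
    {"pos": 303, "ref": "G", "alt": "A", "depth": 400, "alt_reads": 14},  # likely noise
]
print(filter_variants(calls))  # only the well-supported heterozygous call survives
```

    A real diagnostic pipeline would additionally handle strand bias, homopolymer artifacts (a known weakness of 454-type chemistry) and annotation against a mutation database, as the described tool does.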

  3. The C3 Framework: A Powerful Tool for Preparing Future Generations for Informed and Engaged Civic Life

    ERIC Educational Resources Information Center

    Croddy, Marshall; Levine, Peter

    2014-01-01

    As the C3 Framework for the social studies rolls out, it is hoped that its influence will grow, offering a vision and guidance for the development of a new generation of state social studies standards that promote deeper student learning and the acquisition of essentials skills for college, career, and civic life. In the interim, it can be an…

  4. Effectiveness of Student-Generated Video as a Teaching Tool for an Instrumental Technique in the Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Jordan, Jeremy T.; Box, Melinda C.; Eguren, Kristen E.; Parker, Thomas A.; Saraldi-Gallardo, Victoria M.; Wolfe, Michael I.; Gallardo-Williams, Maria T.

    2016-01-01

    Multimedia instruction has been shown to serve as an effective learning aid for chemistry students. In this study, the viability of student-generated video instruction for organic chemistry laboratory techniques and procedure was examined and its effectiveness compared to instruction provided by a teaching assistant (TA) was evaluated. After…

  5. Using Tablets as Tools for Learner-Generated Drawings in the Context of Teaching the Kinetic Theory of Gases

    ERIC Educational Resources Information Center

    Lehtinen, A.; Viiri, J.

    2014-01-01

    Even though research suggests that the use of drawings could be an important part of learning science, learner-generated drawings have not received much attention in physics classrooms. This paper presents a method for recording students' drawings and group discussions using tablets. Compared to pen and paper, tablets offer unique benefits,…

  6. Restoring the ON Switch in Blind Retinas: Opto-mGluR6, a Next-Generation, Cell-Tailored Optogenetic Tool

    PubMed Central

    van Wyk, Michiel; Pielecka-Fortuna, Justyna; Löwel, Siegrid; Kleinlogel, Sonja

    2015-01-01

    Photoreceptor degeneration is one of the most prevalent causes of blindness. Despite photoreceptor loss, the inner retina and central visual pathways remain intact over an extended time period, which has led to creative optogenetic approaches to restore light sensitivity in the surviving inner retina. The major drawbacks of all optogenetic tools recently developed and tested in mouse models are their low light sensitivity and lack of physiological compatibility. Here we introduce a next-generation optogenetic tool, Opto-mGluR6, designed for retinal ON-bipolar cells, which overcomes these limitations. We show that Opto-mGluR6, a chimeric protein consisting of the intracellular domains of the ON-bipolar cell–specific metabotropic glutamate receptor mGluR6 and the light-sensing domains of melanopsin, reliably recovers vision at the retinal, cortical, and behavioral levels under moderate daylight illumination. PMID:25950461

  7. Family presence during cardiopulmonary resuscitation: who should decide?

    PubMed

    Lederman, Zohar; Garasic, Mirko; Piperberg, Michelle

    2014-05-01

    Whether to allow the presence of family members during cardiopulmonary resuscitation (CPR) has been a highly contentious topic in recent years. Even though a great deal of evidence and professional guidelines support the option of family presence during resuscitation (FPDR), many healthcare professionals still oppose it. One of the main arguments espoused by the latter is that family members should not be allowed for the sake of the patient's best interests, whether it is to increase his chances of survival, respect his privacy or leave his family with a last positive impression of him. In this paper, we examine the issue of FPDR from the patient's point of view. Since the patient requires CPR, he is invariably unconscious and therefore incompetent. We discuss the Autonomy Principle and the Three-Tiered process for surrogate decision making, as well as the Beneficence Principle, and show that these are limited in providing us with an adequate tool for decision making in this particular case. Rather, we rely on a novel principle (or, rather, a novel specification of an existing principle) and a novel integrated model for surrogate decision making. We show that this model is more satisfactory in taking the patient's true wishes into consideration and encourages a joint decision-making process by all parties involved. PMID:23557910

  8. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false Who decides where Job Corps centers will be... LABOR (CONTINUED) THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located?...

  9. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Who decides where Job Corps centers will be... LABOR THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located? (a) The...

  10. 43 CFR 4.625 - How will my application be decided?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 1 2012-10-01 2011-10-01 true How will my application be decided? 4.625 Section 4.625 Public Lands: Interior Office of the Secretary of the Interior DEPARTMENT HEARINGS AND... Considering Applications § 4.625 How will my application be decided? The adjudicative officer must issue...

  11. 43 CFR 4.625 - How will my application be decided?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 1 2014-10-01 2014-10-01 false How will my application be decided? 4.625 Section 4.625 Public Lands: Interior Office of the Secretary of the Interior DEPARTMENT HEARINGS AND... Considering Applications § 4.625 How will my application be decided? The adjudicative officer must issue...

  12. 43 CFR 4.625 - How will my application be decided?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 1 2013-10-01 2013-10-01 false How will my application be decided? 4.625 Section 4.625 Public Lands: Interior Office of the Secretary of the Interior DEPARTMENT HEARINGS AND... Considering Applications § 4.625 How will my application be decided? The adjudicative officer must issue...

  13. 49 CFR 40.377 - Who decides whether to issue a PIE?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Who decides whether to issue a PIE? 40.377 Section... a PIE? (a) The ODAPC Director, or his or her designee, decides whether to issue a PIE. If a designee... determination about whether to start a PIE proceeding. (c) There is a “firewall” between the initiating...

  14. 30 CFR 285.1006 - How will MMS decide whether to issue an Alternate Use RUE?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false How will MMS decide whether to issue an Alternate Use RUE? 285.1006 Section 285.1006 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF... Requesting An Alternate Use Rue § 285.1006 How will MMS decide whether to issue an Alternate Use RUE? (a)...

  15. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether...

  16. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve...

  17. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether...

  18. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve...

  19. 12 CFR 617.7515 - How does the FCA decide whether to issue a directive?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false How does the FCA decide whether to issue a directive? 617.7515 Section 617.7515 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM BORROWER RIGHTS Distressed Loan Restructuring Directive § 617.7515 How does the FCA decide whether to...

  20. 20 CFR 404.708 - How we decide what is enough evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false How we decide what is enough evidence. 404... DISABILITY INSURANCE (1950- ) Evidence General § 404.708 How we decide what is enough evidence. When you give us evidence, we examine it to see if it is convincing evidence. If it is, no other evidence is...

  1. 75 FR 57396 - List of Nonconforming Vehicles Decided To Be Eligible for Importation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-21

    ... National Highway Traffic Safety Administration 49 CFR Part 593 List of Nonconforming Vehicles Decided To Be... rule. SUMMARY: This document revises the list of vehicles not originally manufactured to conform to the Federal Motor Vehicle Safety Standards (FMVSS) that NHTSA has decided to be eligible for importation....

  2. 78 FR 54182 - List of Nonconforming Vehicles Decided To Be Eligible for Importation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... National Highway Traffic Safety Administration 49 CFR Part 593 List of Nonconforming Vehicles Decided To Be... rule. SUMMARY: This document revises the list of vehicles not originally manufactured to conform to the Federal Motor Vehicle Safety Standards (FMVSS) that NHTSA has decided to be eligible for importation....

  3. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... suspending or debarring the provider, based on the same facts. Effect of Debarment...

  4. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... suspending or debarring the provider, based on the same facts. Effect of Debarment...

  5. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  6. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  7. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  8. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  9. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... suspending or debarring the provider, based on the same facts. Effect of Debarment...

  10. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  11. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... suspending or debarring the provider, based on the same facts. Effect of Debarment...

  12. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... suspending or debarring the provider, based on the same facts. Effect of Debarment...

  13. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false Who decides how Language Development funds can be used... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local...

  14. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 1 2012-04-01 2011-04-01 true Who decides how Language Development funds can be used? 39... SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school boards...

  15. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Who decides how Language Development funds can be used... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local...

  16. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false Who decides how Language Development funds can be used... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local...

  17. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Who decides how Language Development funds can be used... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local...

  18. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  19. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  20. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  1. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH..., ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding an Appeal before...

  2. Decided and Undecided Students: Career Self-Efficacy, Negative Thinking, and Decision-Making Difficulties

    ERIC Educational Resources Information Center

    Bullock-Yowell, Emily; McConnell, Amy E.; Schedin, Emily A.

    2014-01-01

    The career concern differences between undecided and decided college students (N = 223) are examined. Undecided college students (n = 83) reported lower career decision-making self-efficacy, higher incidences of negative career thoughts, and more career decision-making difficulties than their decided peers (n = 143). Results reveal that undecided…

  3. Functional connectivity associated with hand shape generation: Imitating novel hand postures and pantomiming tool grips challenge different nodes of a shared neural network.

    PubMed

    Vingerhoets, Guy; Clauwaert, Amanda

    2015-09-01

    Clinical research suggests that imitating meaningless hand postures and pantomiming tool-related hand shapes rely on different neuroanatomical substrates. We investigated the BOLD responses to different tasks of hand posture generation in 14 right-handed volunteers. Conjunction and contrast analyses were applied to select regions that were either common or sensitive to imitation and/or pantomime tasks. The selection included bilateral areas of medial and lateral extrastriate cortex, superior and inferior regions of the lateral and medial parietal lobe, primary motor and somatosensory cortex, and left dorsolateral prefrontal, and ventral and dorsal premotor cortices. Functional connectivity analysis revealed that during hand shape generation the BOLD response of every region correlated significantly with every other area regardless of the hand posture task performed, although some regions were more involved in some hand posture tasks than others. Based on between-task differences in functional connectivity we predict that imitation of novel hand postures would suffer most from left superior parietal disruption and that pantomiming hand postures for tools would be impaired following left frontal damage, whereas both tasks would be sensitive to inferior parietal dysfunction. We also found that posterior temporal cortex is committed to pantomiming tool grips, but that its involvement in the execution of hand postures in general appears limited. We conclude that the generation of hand postures is subserved by a highly interconnected task-general neural network. Depending on task requirements some nodes/connections will be more engaged than others, and these task-sensitive findings are in general agreement with recent lesion studies. PMID:26095674

  4. A comprehensive tool for image-based generation of fetus and pregnant women mesh models for numerical dosimetry studies

    NASA Astrophysics Data System (ADS)

    Dahdouh, S.; Varsier, N.; Serrurier, A.; De la Plata, J.-P.; Anquez, J.; Angelini, E. D.; Wiart, J.; Bloch, I.

    2014-08-01

    Fetal dosimetry studies require the development of accurate numerical 3D models of the pregnant woman and the fetus. This paper proposes a 3D articulated fetal growth model covering the main phases of pregnancy and a pregnant woman model combining the utero-fetal structures and a deformable non-pregnant woman body envelope. The structures of interest were automatically or semi-automatically (depending on the stage of pregnancy) segmented from a database of images and surface meshes were generated. By interpolating linearly between fetal structures, each one can be generated at any age and in any position. A method is also described to insert the utero-fetal structures in the maternal body. A validation of the fetal models is proposed, comparing a set of biometric measurements to medical reference charts. The usability of the pregnant woman model in dosimetry studies is also investigated, with respect to the influence of the abdominal fat layer.

  5. electronic Ligand Builder and Optimisation Workbench (eLBOW): A tool for ligand coordinate and restraint generation

    SciTech Connect

    Moriarty, Nigel; Grosse-Kunstleve, Ralf; Adams, Paul

    2009-07-01

    The electronic Ligand Builder and Optimisation Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed to be a flexible procedure that uses simple and fast quantum-chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow the attainment of a number of diverse goals including geometry optimisation and generation of restraints.

  6. electronic Ligand Builder and Optimization Workbench (eLBOW): a tool for ligand coordinate and restraint generation

    PubMed Central

    Moriarty, Nigel W.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.

    2009-01-01

    The electronic Ligand Builder and Optimization Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed to be a flexible procedure that uses simple and fast quantum-chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow the attainment of a number of diverse goals including geometry optimization and generation of restraints. PMID:19770504

  7. Next generation of Z* modelling tool for high intensity EUV and soft x-ray plasma sources simulations

    NASA Astrophysics Data System (ADS)

    Zakharov, S. V.; Zakharov, V. S.; Choi, P.; Krukovskiy, A. Y.; Novikov, V. G.; Solomyannaya, A. D.; Berezin, A. V.; Vorontsov, A. S.; Markov, M. B.; Parot'kin, S. V.

    2011-04-01

    In the specifications for EUV sources, high EUV power at IF for lithography HVM and very high brightness for actinic mask and in-situ inspections are required. In practice, non-equilibrium plasma dynamics and self-absorption of radiation limit the in-band radiance of the plasma and the usable radiation power of a conventional single-unit EUV source. A new generation of the computational code Z* is currently being developed through an international collaboration within the FP7 IAPP project FIRE for modelling multi-physics phenomena in radiation plasma sources, particularly for EUVL. The radiation plasma dynamics, the spectral effects of self-absorption in LPP and DPP, and the resulting conversion efficiencies are considered. The generation of fast electrons, ions and neutrals is discussed. Conditions for the enhanced radiance of highly ionized plasma in the presence of fast electrons are evaluated. The modelling results are guiding a new generation of EUV sources being developed at Nano-UV, based on spatial/temporal multiplexing of individual high-brightness units, to deliver the requisite brightness and power for both lithography HVM and actinic metrology applications.

  8. NOTE: A software tool for 2D/3D visualization and analysis of phase-space data generated by Monte Carlo modelling of medical linear accelerators

    NASA Astrophysics Data System (ADS)

    Neicu, Toni; Aljarrah, Khaled M.; Jiang, Steve B.

    2005-10-01

    A computer program has been developed for novel 2D/3D visualization and analysis of the phase-space parameters of Monte Carlo simulations of medical accelerator radiation beams. The software is written in the IDL language and reads the phase-space data generated in the BEAMnrc/BEAM Monte Carlo code format. Contour and colour-wash plots of the fluence, mean energy, energy fluence, mean angle, spectra distribution, energy fluence distribution, angular distribution, and slices and projections of the 3D ZLAST distribution can be calculated and displayed. Based on our experience of using it at Massachusetts General Hospital, the software has proven to be a useful tool for analysis and verification of the Monte Carlo generated phase-space files. The software is in the public domain.

  9. InfiniCharges: A tool for generating partial charges via the simultaneous fit of multiframe electrostatic potential (ESP) and total dipole fluctuations (TDF)

    NASA Astrophysics Data System (ADS)

    Sant, Marco; Gabrieli, Andrea; Demontis, Pierfranco; Suffritti, Giuseppe B.

    2016-03-01

    The InfiniCharges computer program, for generating reliable partial charges for molecular simulations in periodic systems, is presented here. This tool is an efficient implementation of the recently developed DM-REPEAT method, in which the stability of the resulting charges over a large set of fitting regions is obtained through the simultaneous fit of multiple electrostatic potential (ESP) configurations together with the total dipole fluctuations (TDF). Besides DM-REPEAT, the program can also perform a standard REPEAT fit and its multiframe extension (M-REPEAT), with the option of restraining the charges to arbitrary values. Finally, the code is employed to generate partial charges for ZIF-90, a microporous material of the metal-organic framework (MOF) family, and an extensive analysis of the results is carried out.
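    At its core, ESP-based charge derivation of the REPEAT family reduces to a constrained least-squares fit: choose point charges that best reproduce a reference electrostatic potential on a grid, subject to a total-charge constraint. The toy sketch below (not InfiniCharges' actual algorithm; positions, grid, and charges are invented for illustration) fits one free charge for a neutral two-site system, where the constraint q2 = -q1 leaves a single parameter with a closed-form solution.

```python
import math

# Toy ESP charge fit: two point charges with a neutrality constraint
# (q2 = -q1), so the model potential at sample point i is
#   V_i = q1 * (1/r_i1 - 1/r_i2)   (Coulomb constant set to 1).
# One-parameter least squares over the grid then has a closed form.

sites = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0)]   # charge positions (made up)
q_true = (0.42, -0.42)                        # "reference" charges

def potential(q, point):
    """Coulomb potential of the point charges q at `point` (k = 1)."""
    return sum(qi / math.dist(site, point) for qi, site in zip(q, sites))

# ESP sample points off the bond axis, standing in for a real ESP grid
grid = [(x, y, 1.0) for x in (-1.0, 0.5, 2.0) for y in (-1.0, 1.0)]
esp = [potential(q_true, p) for p in grid]    # synthetic reference ESP

# Sensitivities a_i = dV_i/dq1 under the constraint q2 = -q1
a = [1.0 / math.dist(sites[0], p) - 1.0 / math.dist(sites[1], p)
     for p in grid]

# Closed-form least squares: q1 = (a . V) / (a . a)
q1 = sum(ai * vi for ai, vi in zip(a, esp)) / sum(ai * ai for ai in a)
print(round(q1, 6))   # recovers 0.42 (the data are noise-free)
```

With several ESP frames, methods like M-REPEAT/DM-REPEAT simply stack the rows of all frames (plus dipole-fluctuation terms) into one such least-squares system.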

  10. Myofibroblasts: Trust your heart and let fate decide

    PubMed Central

    Davis, Jennifer; Molkentin, Jeffery D.

    2014-01-01

    Cardiac fibrosis is a substantial problem in managing multiple forms of heart disease. Fibrosis results from an unrestrained tissue repair process orchestrated predominantly by the myofibroblast. These are highly specialized cells characterized by their ability to secrete extracellular matrix (ECM) components and remodel tissue due to their contractile properties. This contractile activity of the myofibroblast is ascribed, in part, to the expression of smooth muscle α-actin (αSMA) and other tension-associated structural genes. Myofibroblasts are a newly generated cell type derived largely from residing mesenchymal cells in response to both mechanical and neurohumoral stimuli. Several cytokines, chemokines, and growth factors are induced in the injured heart, and in conjunction with elevated wall tension, specific signaling pathways and downstream effectors are mobilized to initiate myofibroblast differentiation. Here we will review the cell fates that contribute to the myofibroblast as well as nodal molecular signaling effectors that promote their differentiation and activity. We will discuss canonical versus non-canonical transforming growth factor-β (TGFβ), angiotensin II (AngII), endothelin-1 (ET-1), serum response factor (SRF), transient receptor potential (TRP) channels, mitogen-activated protein kinases (MAPKs) and mechanical signaling pathways that are required for myofibroblast transformation and fibrotic disease. PMID:24189039

  11. Appendix 2. Guide for Running AgMIP Climate Scenario Generation Tools with R in Windows, Version 2.3

    NASA Technical Reports Server (NTRS)

    Hudson, Nicholas; Ruane, Alexander Clark

    2013-01-01

    This Guide explains how to create climate series and climate change scenarios by using the AgMIP Climate team's methodology as outlined in the AgMIP Guide for Regional Assessment: Handbook of Methods and Procedures. It details how to: install R and the required packages to run the AgMIP Climate Scenario Generation scripts, and create climate scenarios from CMIP5 GCMs using a 30-year baseline daily weather dataset. The Guide also outlines a workflow that can be modified for application to your own climate data.

  12. Using tablets as tools for learner-generated drawings in the context of teaching the kinetic theory of gases

    NASA Astrophysics Data System (ADS)

    Lehtinen, A.; Viiri, J.

    2014-05-01

    Even though research suggests that the use of drawings could be an important part of learning science, learner-generated drawings have not received much attention in physics classrooms. This paper presents a method for recording students’ drawings and group discussions using tablets. Compared to pen and paper, tablets offer unique benefits, which include the recording of the whole drawing process and of the discussion associated with the drawing. A study, which investigated the use of drawings and the need for guidance among Finnish upper secondary school students, is presented alongside ideas for teachers on how to see drawing in a new light.

  13. LymAnalyzer: a tool for comprehensive analysis of next generation sequencing data of T cell receptors and immunoglobulins.

    PubMed

    Yu, Yaxuan; Ceredig, Rhodri; Seoighe, Cathal

    2016-02-29

    The adaptive immune system includes populations of B and T cells capable of binding foreign epitopes via antigen-specific receptors, called immunoglobulin (IG) for B cells and the T cell receptor (TCR) for T cells. In order to provide protection from a wide range of pathogens, these cells display highly diverse repertoires of IGs and TCRs. This is achieved through combinatorial rearrangement of multiple gene segments and, for B cells, somatic hypermutation. Deep sequencing technologies have revolutionized analysis of the diversity of these repertoires; however, accurate TCR/IG diversity profiling requires specialist bioinformatics tools. Here we present LymAnalyzer, a software package that significantly improves the completeness and accuracy of TCR/IG profiling from deep sequence data and includes procedures to identify novel alleles of gene segments. On real and simulated data sets LymAnalyzer produces highly accurate and complete results. Although to date we have applied it only to TCR/IG data from human and mouse, it can be applied to data from any species for which an appropriate database of reference genes is available. Implemented in Java, it includes both a command-line version and a graphical user interface and is freely available at https://sourceforge.net/projects/lymanalyzer/. PMID:26446988
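    Whatever the alignment machinery, repertoire profiling ultimately reports how often each receptor sequence (clonotype) occurs in the read set. The minimal sketch below (not LymAnalyzer's algorithm; the sequences are invented) shows that final tallying step as a frequency table:

```python
from collections import Counter

# Toy clonotype tally: after reads are assigned to receptor sequences,
# profiling reduces to counting how often each clonotype occurs.
# The CDR3-like strings below are invented for illustration.
reads = ["CASSLGT", "CASSLGT", "CAWSVGQ", "CASSLGT", "CAWSVGQ", "CASRTDT"]

counts = Counter(reads)
total = sum(counts.values())

# Frequency table, most abundant clonotype first
table = [(seq, n, n / total) for seq, n in counts.most_common()]
for seq, n, freq in table:
    print(f"{seq}\t{n}\t{freq:.2f}")
```

Real tools add the hard parts upstream: gene-segment assignment, error correction, and novel-allele detection.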

  14. COV2HTML: a visualization and analysis tool of bacterial next generation sequencing (NGS) data for postgenomics life scientists.

    PubMed

    Monot, Marc; Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-03-01

    COV2HTML is an interactive web interface, aimed at biologists, that allows both coverage visualization and analysis of NGS alignments from prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts the huge NGS mapping or coverage files into light, specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of the data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first data set was from an RNA-seq analysis of Campylobacter jejuni, based on a comparison of two conditions with two replicates; using COV2HTML we were able to recover 26 out of 27 genes highlighted in the publication. The second comprised stranded TSS and RNA-seq data sets on the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and allows biologists to visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is that it makes NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/ . This website is free and open to users without any login requirement. PMID:24512253

  15. COV2HTML: A Visualization and Analysis Tool of Bacterial Next Generation Sequencing (NGS) Data for Postgenomics Life Scientists

    PubMed Central

    Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-01-01

    COV2HTML is an interactive web interface, aimed at biologists, that allows both coverage visualization and analysis of NGS alignments from prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts the huge NGS mapping or coverage files into light, specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of the data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first data set was from an RNA-seq analysis of Campylobacter jejuni, based on a comparison of two conditions with two replicates; using COV2HTML we were able to recover 26 out of 27 genes highlighted in the publication. The second comprised stranded TSS and RNA-seq data sets on the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and allows biologists to visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is that it makes NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/. This website is free and open to users without any login requirement. PMID:24512253
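    The key reduction the COV2HTML records describe, condensing huge per-base coverage files into light per-element summaries, can be illustrated with a toy sketch. The depth values and gene coordinates below are made up, and COV2HTML's real input and output formats differ; this only shows the idea of summarizing coverage over annotated genetic elements:

```python
# Toy reduction of per-base NGS coverage to per-gene summaries.
# depth[i] is the read depth at genome position i (invented values).
depth = [0, 2, 5, 7, 7, 6, 3, 1, 0, 0, 4, 9, 9, 8, 2]

# (name, start, end) with 0-based, end-exclusive coordinates (invented)
genes = [("geneA", 1, 8), ("geneB", 10, 14)]

def summarize(depth, genes):
    """Return {gene: (mean_depth, fraction_of_bases_covered)}."""
    out = {}
    for name, start, end in genes:
        window = depth[start:end]
        mean = sum(window) / len(window)
        covered = sum(1 for d in window if d > 0) / len(window)
        out[name] = (round(mean, 2), round(covered, 2))
    return out

print(summarize(depth, genes))
# {'geneA': (4.43, 1.0), 'geneB': (7.5, 1.0)}
```

Storing only such per-element summaries is what lets a browser-based viewer stay responsive without shipping the full alignment files.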

  16. The Michigan Healthy School Action Tools process generates improvements in school nutrition policies and practices, and student dietary intake.

    PubMed

    Alaimo, Katherine; Oleksyk, Shannon; Golzynski, Diane; Drzal, Nick; Lucarelli, Jennifer; Reznar, Melissa; Wen, Yalu; Krabill Yoder, Karen

    2015-05-01

    The Michigan Healthy School Action Tools (HSAT) is an online self-assessment and action planning process for schools seeking to improve their health policies and practices. The School Nutrition Advances Kids study, a 2-year quasi-experimental intervention with low-income middle schools, evaluated whether completing the HSAT with facilitator assistance and small grant funding resulted in (1) improvements in school nutrition practices and policies and (2) improvements in student dietary intake. A total of 65 low-income Michigan middle schools participated in the study. The Block Youth Food Frequency Questionnaire was completed by 1,176 seventh-grade students at baseline and in eighth grade (during intervention). Schools reported nutrition-related policies and practices/education using the School Environment and Policy Survey. Schools completing the HSAT were compared to schools that did not complete the HSAT with regard to the number of policy and practice changes and student dietary intake. Schools that completed the HSAT made significantly more nutrition practice/education changes than schools that did not complete the HSAT, and students in those schools made dietary improvements in fruit, fiber, and cholesterol intake. The Michigan HSAT process is an effective strategy to initiate improvements in nutrition policies and practices within schools, and to improve student dietary intake. PMID:25733730

  17. Developmental dysplasia of the hip: usefulness of next generation genomic tools for characterizing the underlying genes - a mini review.

    PubMed

    Basit, S; Hannan, M A; Khoshhal, K I

    2016-07-01

    Developmental dysplasia of the hip (DDH) is one of the most common skeletal anomalies. DDH encompasses a spectrum of the disorder ranging from minor acetabular dysplasia to irreducible dislocation, which may lead to premature arthritis in later life. The involvement of genetic factors underlying DDH became evident when several studies reported chromosomal loci linked to DDH in families with multiple affected individuals. Moreover, using association studies, variants in genes involved in chondrogenesis and joint formation have been shown to be associated with DDH. At least one study identified a pathogenic variant in a chemokine receptor gene in DDH. No genetic analysis has been reported or carried out in DDH patients from the Middle East. Here, we review the literature related to the genetics of DDH and emphasize the usefulness of new-generation technologies in identifying genetic variants underlying DDH in consanguineous families. PMID:26842108

  18. Next-Generation Sequencing: A powerful tool for the discovery of molecular markers in breast ductal carcinoma in situ

    PubMed Central

    Kaur, Hitchintan; Mao, Shihong; Shah, Seema; Gorski, David H.; Krawetz, Stephen A.; Sloane, Bonnie F.; Mattingly, Raymond R.

    2013-01-01

    Mammographic screening leads to frequent biopsies and concomitant overdiagnosis of breast cancer, particularly ductal carcinoma in situ (DCIS). Some DCIS lesions rapidly progress to invasive carcinoma whereas others remain indolent. Because we cannot yet predict which lesions will not progress, all DCIS is regarded as malignant, and many women are overtreated. Thus, there is a pressing need for a panel of molecular markers in addition to the current clinical and pathologic factors to provide prognostic information. Genomic technologies such as microarrays have made major contributions to defining sub-types of breast cancer. Next-generation sequencing (NGS) modalities offer unprecedented depth of expression analysis through revealing transcriptional boundaries, mutations, rare transcripts and alternative splice variants. NGS approaches are just beginning to be applied to DCIS. Here, we review the applications and challenges of NGS in discovering novel potential therapeutic targets and candidate biomarkers in the premalignant progression of breast cancer. PMID:23477556

  19. Generation and Optimization of the Self-Administered Bleeding Assessment Tool (Self-BAT) and its Validation as a Screening Test for von Willebrand Disease

    PubMed Central

    Deforest, Meghan; Grabell, Julie; Albert, Shirren; Young, Jane; Tuttle, Angie; Hopman, Wilma M.; James, Paula D.

    2015-01-01

    Introduction/Aim: Our aim was to generate, optimize and validate a self-administered bleeding assessment tool (BAT) for von Willebrand disease (VWD). Methods: In Phase 1, medical terminology in the expert-administered ISTH-BAT was converted to a grade 4 reading level to produce the first version of the Self-BAT, which was then optimized to ensure agreement with the ISTH-BAT. In Phase 2, the normal range of bleeding scores (BS) was determined and test-retest reliability analyzed. In Phase 3, the optimized Self-BAT was tested as a screening tool for first-time referrals to the Hematology clinic. Results: BS from the final optimized version of the Self-BAT showed an excellent intra-class correlation coefficient (ICC) of 0.87 with ISTH-BAT BS in Phase 1. In Phase 2, the normal range of BS for the optimized Self-BAT was determined to be 0 to +5 for females and 0 to +3 for males, and excellent test-retest reliability was shown (ICC = 0.95). In Phase 3, we showed that a positive Self-BAT BS (≥6 for females, ≥4 for males) has a sensitivity of 78%, a specificity of 23%, a positive predictive value (PPV) of 0.15 and a negative predictive value (NPV) of 0.86 for VWD; these figures improved when only the females were analyzed: sensitivity of 100%, specificity of 21%, PPV = 0.17 and NPV = 1.0. Conclusion: We show that an optimized Self-BAT can generate BS comparable to the expert-administered ISTH-BAT and is a reliable, effective screening tool to incorporate into the assessment of individuals, particularly women, referred for a possible bleeding disorder. PMID:26179127
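    The screening figures above follow directly from the standard 2x2 confusion-table definitions. A minimal sketch; the counts below are hypothetical (chosen only so the rounded results match the figures reported above), not the study's data:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)   # true positives / all with the disorder
    specificity = tn / (tn + fp)   # true negatives / all without it
    ppv = tp / (tp + fp)           # P(disorder | positive screen)
    npv = tn / (tn + fn)           # P(no disorder | negative screen)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for illustration only:
sens, spec, ppv, npv = screening_metrics(tp=14, fp=80, fn=4, tn=24)
print(f"sens={sens:.2f} spec={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```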

  20. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant quantities are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps are produced on a Google map for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or as a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, local earthquakes that occurred within the network are automatically discriminated from regional/teleseismic events that occurred outside it. Finally, for the largest events, if a consistent number of P-wave polarity readings is available, the focal mechanism is also computed. For each event, all of the available information is stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded in the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may represent a relevant tool not only for seismologists, but also for non
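    The duration magnitude mentioned above is classically obtained from a coda-duration regression of the generic form Md = a + b*log10(tau), optionally with a distance term. The coefficients are network- and region-specific; the defaults below are illustrative assumptions only, not those of the software described:

```python
import math

def duration_magnitude(tau_s, delta_km, a=-0.87, b=2.0, c=0.0035):
    """Generic coda-duration magnitude: Md = a + b*log10(tau) + c*delta.

    a, b and c are region-dependent calibration constants; these default
    values are purely illustrative.
    """
    return a + b * math.log10(tau_s) + c * delta_km

# Example: a 60 s coda measured 20 km from the source
print(round(duration_magnitude(60.0, 20.0), 2))  # -> 2.76
```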

  1. miRMOD: a tool for identification and analysis of 5′ and 3′ miRNA modifications in Next Generation Sequencing small RNA data

    PubMed Central

    Mukherjee, Sunil K.

    2015-01-01

    In the past decade, the microRNAs (miRNAs) have emerged to be important regulators of gene expression across various species. Several studies have confirmed different types of post-transcriptional modifications at terminal ends of miRNAs. The reports indicate that miRNA modifications are conserved and functionally significant as it may affect miRNA stability and ability to bind mRNA targets, hence affecting target gene repression. Next Generation Sequencing (NGS) of the small RNA (sRNA) provides an efficient and reliable method to explore miRNA modifications. The need for dedicated software, especially for users with little knowledge of computers, to determine and analyze miRNA modifications in sRNA NGS data, motivated us to develop miRMOD. miRMOD is a user-friendly, Microsoft Windows and Graphical User Interface (GUI) based tool for identification and analysis of 5′ and 3′ miRNA modifications (non-templated nucleotide additions and trimming) in sRNA NGS data. In addition to identification of miRNA modifications, the tool also predicts and compares the targets of query and modified miRNAs. In order to compare binding affinities for the same target, miRMOD utilizes minimum free energies of the miRNA:target and modified-miRNA:target interactions. Comparisons of the binding energies may guide experimental exploration of miRNA post-transcriptional modifications. The tool is available as a stand-alone package to overcome large data transfer problems commonly faced in web-based high-throughput (HT) sequencing data analysis tools. miRMOD package is freely available at http://bioinfo.icgeb.res.in/miRMOD. PMID:26623179

  2. Using soil properties as a tool to differentiate landslide generations and constrain their ages - Rogowiec landslide, Sudetes (SW Poland)

    NASA Astrophysics Data System (ADS)

    Kacprzak, Andrzej; Migoń, Piotr

    2013-04-01

    profiles in the landslide body do not show evidence of protracted soil evolution under the contemporary climate and hence are interpreted as having formed during a fraction of the Holocene. This implies a Holocene age for the landslide. In addition, an older shallow translational landslide has been recognized on the valley side, with its toe buried by the main Rogowiec landslide. Its depletion area was identified through the occurrence of thin, truncated soils (compared to the neighbouring slopes). This, and the occurrence of weakly horizonated and poorly structured soils in the landslide body itself, suggests that this valley-side landslide is of Holocene age too. Thus, soils proved a powerful tool to establish the relative chronology of landslides and give strong evidence of their Holocene age. Soil research is recommended as a part of hazard and risk assessment for landslides of unknown age.

  3. Bioanalytical tools for the evaluation of organic micropollutants during sewage treatment, water recycling and drinking water generation.

    PubMed

    Macova, Miroslava; Toze, Simon; Hodgers, Leonie; Mueller, Jochen F; Bartkow, Michael; Escher, Beate I

    2011-08-01

    performed that allows direct comparison of different treatment technologies and covers several orders of magnitude of TEQ, from highly contaminated sewage to drinking water with TEQ close to or below the limit of detection. Detection limits of the bioassays were decreased in comparison to earlier studies by optimizing sample preparation and test protocols, and were comparable to or lower than the quantification limits of the routine chemical analysis, which allowed monitoring of the presence and removal of micropollutants after Barrier 2 and in drinking water. The results obtained with bioanalytical tools were reproducible, robust and consistent with previous studies assessing the effectiveness of the wastewater and advanced water treatment plants. The results of this study indicate that bioanalytical results expressed as TEQ are useful for assessing the removal efficiency of micropollutants throughout all treatment steps of water recycling. PMID:21704353

  4. On resonant ICRF absorption in three-ion component plasmas: a new promising tool for fast ion generation

    NASA Astrophysics Data System (ADS)

    Kazakov, Ye. O.; Van Eester, D.; Dumont, R.; Ongena, J.

    2015-03-01

    We report on a very efficient ion-cyclotron-resonance-frequency (ICRF) absorption scheme (Z)-Y-X, which hinges on the presence of three ion species in the plasma. A mode conversion (cutoff-resonance) layer is well known to appear in two-ion-species plasmas. If the location of the L-cutoff in Y-X plasmas, which can be controlled by varying the Y : X density ratio, almost coincides with the fundamental cyclotron resonance of the third ion species Z (the resonant absorber), the latter, albeit present only in trace quantities, is shown to absorb almost all the incoming RF power. A quantitative criterion for the resonant Y : X plasma composition is derived and a few numerical examples are given. Since the absorbed power per resonant particle is much larger than for any other ICRF scheme, the scenarios discussed here are particularly promising for fast particle generation. Their possible application as a source of high-energy ions for the stellarator W7-X, and to mimic alpha particles during the non-activated phase of the ITER tokamak, is briefly discussed.

  5. Points Clouds Generation Using Tls and Dense-Matching Techniques. a Test on Approachable Accuracies of Different Tools

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Spanò, A.

    2013-07-01

    The use of detailed 3D models derived from digital survey techniques has increasingly developed in many fields of application, ranging from land and urban survey, using remotely sensed data, to landscape assets and finally to Cultural Heritage items. The high level of detail and accuracy of such models makes them attractive and usable for a large set of purposes. The present paper focuses on a test aimed at point cloud generation from archaeological data; active and passive sensor techniques and related image-matching systems have been used in order to evaluate and compare the accuracy of the results achievable using proper TLS and low-cost image-matching software and techniques. After a short review of the available methods, some of the attained results are discussed; the test area consists of a set of mosaic floorings in a late Roman domus located in Aquileia (UD, Italy), requiring a very high level of detail, scale and precision. The experimental section describes the tests applied in order to compare the different software packages and the employed methods.

  6. Generation of growth arrested Leishmania amastigotes: a tool to develop live attenuated vaccine candidates against visceral leishmaniasis.

    PubMed

    Selvapandiyan, Angamuthu; Dey, Ranadhir; Gannavaram, Sreenivas; Solanki, Sumit; Salotra, Poonam; Nakhasi, Hira L

    2014-06-30

    Visceral leishmaniasis (VL) is fatal if not treated and is widely prevalent in the tropical and sub-tropical regions of the world. VL is caused by the protozoan parasite Leishmania donovani or Leishmania infantum. Although several second-generation vaccines have been licensed to protect dogs against VL, there are no effective vaccines against human VL [1]. Since people cured of leishmaniasis develop lifelong protection, the development of live attenuated Leishmania parasites as vaccines, which cause only a controlled infection, may be a close surrogate to leishmanization. This can be achieved by deletion of genes involved in the regulation of growth and/or virulence of the parasite. Such mutant parasites generally do not revert to virulence in animal models, even under conditions of induced immune suppression, owing to the complete deletion of the essential gene(s). In the Leishmania life cycle, the intracellular amastigote form is the virulent form and causes disease in the mammalian hosts. We developed centrin gene-deleted L. donovani parasites that displayed attenuated growth only in the amastigote stage and were found safe and efficacious against virulent challenge in experimental animal models. Thus, targeting genes differentially expressed in the amastigote stage would potentially attenuate only the amastigote stage, and the resulting controlled infectivity may be effective in developing immunity. This review lays out the strategies for attenuating the growth of the amastigote form of Leishmania for use as a live vaccine against leishmaniasis, with a focus on visceral leishmaniasis. PMID:24837513

  8. Constraint-Based Model of Shewanella oneidensis MR-1 Metabolism: a Tool for Data Analysis and Hypothesis Generation

    SciTech Connect

    Pinchuk, Grigoriy E.; Hill, Eric A.; Geydebrekht, Oleg V.; De Ingeniis, Jessica; Zhang, Xiaolin; Osterman, Andrei; Scott, James H.; Reed, Samantha B.; Romine, Margaret F.; Konopka, Allan; Beliaev, Alex S.; Fredrickson, Jim K.; Reed, Jennifer L.

    2010-06-24

    Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and flexibility of the electron transfer networks as well as central and peripheral carbon metabolism pathways. To understand the factors contributing to the ecophysiological success of Shewanellae, the metabolic network of S. oneidensis MR-1 was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify futile cycles, (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism.
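    The constraint-based reasoning above (steady-state mass balance S v = 0, flux bounds, a growth objective) can be illustrated on a toy network small enough to solve by hand; the real model of course requires a linear-programming solver over its 774 reactions. A minimal sketch:

```python
# Toy stoichiometric model: uptake ->(v1) A ->(v2) B ->(v3) biomass.
# Rows of S are metabolites (A, B); columns are reactions (v1, v2, v3).
S = [
    [1, -1,  0],   # A: produced by v1, consumed by v2
    [0,  1, -1],   # B: produced by v2, consumed by v3
]
uptake_bound = 10.0  # upper bound on v1 (units illustrative)

# Steady state S @ v = 0 forces v1 = v2 = v3, so the maximal "biomass"
# flux v3 simply equals the uptake bound -- no LP solver needed here.
v = [uptake_bound] * 3

residual = [sum(S[i][j] * v[j] for j in range(3)) for i in range(2)]
assert residual == [0.0, 0.0]          # mass balance holds
print("optimal biomass flux:", v[2])   # -> optimal biomass flux: 10.0
```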

  9. Hillslope Hydrological Modeling in a Boreal Forest - Tools for Assessment of Logging Impacts on Runoff Generation and Nitrogen Loads

    NASA Astrophysics Data System (ADS)

    Koivusalo, H.; Kokkonen, T.

    2003-12-01

    Advances in runoff generation models require measurements of water quantity and quality both at the outlet of a catchment and along the hillslopes residing within it. Predicting the response of streamflow volumes to land-use changes is challenging, but it is even more demanding to produce realistic estimates of surface and subsurface runoff components and solute fluxes within a catchment. This work, which is part of the FEMMA project, presents the development and application of a deterministic hillslope hydrological model, which combines hydrological, biological, and solute transport computation schemes. The model simulates canopy processes, snow accumulation and melt, soil and ground water movement, and nitrogen transport. Here the modeling system is applied to two coniferous-forested catchments (56 and 29 ha) located in Eastern Finland. One of the catchments was partially clear-cut (35 %) while the other was left as a control. Measured meteorological, snow, streamflow, and nitrogen concentration data cover a nine-year period starting five years before the treatment. The model parameterization is based on spatial data on topography, soil types, soil depths, and the location of treated areas with respect to the stream network draining the catchment. Spatial data analysis condenses information on catchment physical properties for the identification of one or several typical hillslopes, which are assumed to represent the behavior of the entire catchment. Results derived directly from the measured data suggest that the timing and volume of streamflow changed soon after the loggings were completed. In the modeling exercise these changes are attributed to increased melt intensity and the absence of interception in the logged areas. Measured concentrations of nitrite-nitrate nitrogen showed changes only two years after the logging. This lag can result from slow initiation of felling waste decomposition and the long residence of nitrates in mineral soils and the wetland areas surrounding the

  10. Constraint-Based Model of Shewanella oneidensis MR-1 Metabolism: A Tool for Data Analysis and Hypothesis Generation

    PubMed Central

    Hill, Eric A.; Geydebrekht, Oleg V.; De Ingeniis, Jessica; Zhang, Xiaolin; Osterman, Andrei; Scott, James H.; Reed, Samantha B.; Romine, Margaret F.; Konopka, Allan E.; Beliaev, Alexander S.; Fredrickson, Jim K.

    2010-01-01

    Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and utilization of electron transfer networks, as well as central and peripheral carbon metabolism. To understand how Shewanella oneidensis MR-1 utilizes its resources, the metabolic network was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify cycles (such as futile cycles and circulations), (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism at a systems

  11. LSG: An External-Memory Tool to Compute String Graphs for Next-Generation Sequencing Data Assembly.

    PubMed

    Bonizzoni, Paola; Vedova, Gianluca Della; Pirola, Yuri; Previtali, Marco; Rizzi, Raffaella

    2016-03-01

    The large amount of short read data that has to be assembled in future applications, such as metagenomics or cancer genomics, strongly motivates the investigation of disk-based approaches to indexing next-generation sequencing (NGS) data. Positive results in this direction stimulate the investigation of efficient external-memory algorithms for de novo assembly from NGS data. Our article is also motivated by the open problem of designing a space-efficient algorithm to compute a string graph using an indexing procedure based on the Burrows-Wheeler transform (BWT). We have developed a disk-based algorithm for computing string graphs in external memory: the light string graph (LSG). LSG relies on a new representation of the FM-index that is exploited to keep the main-memory requirement independent of the size of the data set. Moreover, we have developed a pipeline for genome assembly from NGS data that integrates LSG with the assembly step of SGA (Simpson and Durbin, 2012), a state-of-the-art string-graph-based assembler, and uses BEETL for indexing the input data. LSG is open-source software and is available online. We have analyzed our implementation on an 875-million-read whole-genome dataset, on which LSG built the string graph using only 1 GB of main memory (reducing the memory occupation by a factor of 50 with respect to SGA), while requiring slightly more than twice the time of SGA. The analysis of the entire pipeline shows an important decrease in memory usage, with only a moderate increase in running time. PMID:26953874
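    The Burrows-Wheeler transform at the heart of the FM-index can be sketched in a few lines; the in-memory version below only illustrates the transform itself, not LSG's disk-based representation:

```python
def bwt(text):
    """Burrows-Wheeler transform of text (with an appended '$' sentinel)."""
    s = text + "$"
    # Sort all cyclic rotations; the BWT is the last column of that matrix.
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

print(bwt("banana"))  # -> annb$aa
```

Because the transform groups identical characters together, it compresses well and supports the backward-search queries that FM-index-based assemblers rely on.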

  12. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists.

  13. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists.

  14. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists.

  15. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists.

  16. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists.

  17. Developing molecular tools and insights into the Penstemon genome using genomic reduction and next-generation sequencing

    PubMed Central

    2013-01-01

    (93 taxa), DNA sequence within these amplicons (12 SSR/INDEL markers) was highly diverse. With the continued decline in next-generation sequencing costs, it will soon be feasible to use genomic reduction techniques to simultaneously sequence thousands of homologous loci across dozens of Penstemon species. Such efforts will greatly facilitate our understanding of the phylogenetic structure within this important drought tolerant genus. In the interim, this study identified thousands of SNPs and over 50 SSRs/INDELs which should provide a foundation for future Penstemon phylogenetic studies and breeding efforts. PMID:23924218

  18. Electric pulses: a flexible tool to manipulate cytosolic calcium concentrations and generate spontaneous-like calcium oscillations in mesenchymal stem cells

    PubMed Central

    de Menorval, Marie-Amelie; Andre, Franck M.; Silve, Aude; Dalmay, Claire; Français, Olivier; Le Pioufle, Bruno; Mir, Lluis M.

    2016-01-01

    Human adipose mesenchymal stem cells (haMSCs) are multipotent adult stem cells of great interest in regenerative medicine and oncology. They present spontaneous calcium oscillations related to cell cycle progression or differentiation, but the correlation between these events is still unclear. Indeed, it is difficult to mimic haMSC spontaneous calcium oscillations with chemical means. Pulsed electric fields (PEFs) can permeabilise plasma and/or organelle membranes depending on the applied pulses and can therefore generate cytosolic calcium peaks by recruiting calcium from the external medium or from internal stores. We show that it is possible to mimic haMSC spontaneous calcium oscillations (same amplitude, duration and shape) using 100 μs PEFs or 10 ns PEFs. We propose a model that explains the experimental situations reported. PEFs can therefore be a flexible tool to manipulate cytosolic calcium concentrations. This tool, which can be switched on and off instantaneously, contrary to chemical agents, can be very useful for investigating the role of calcium oscillations in cell physiology and/or for manipulating cell fate. PMID:27561994

  19. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
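    The Bayesian hazard curves described above reduce, per intensity level, to percentiles over a sample of exceedance probabilities (the spread across the sample representing epistemic uncertainty). A minimal sketch with synthetic numbers, not PyBetVH output:

```python
import random

def percentile(samples, q):
    """Nearest-rank percentile of a sample (q in [0, 100])."""
    s = sorted(samples)
    k = round(q / 100 * (len(s) - 1))
    return s[k]

random.seed(0)
intensities = [1, 2, 5, 10]  # e.g. tephra load thresholds (illustrative units)
# One synthetic sample of exceedance probabilities per intensity level:
curves = {x: [random.betavariate(2, 2 + x) for _ in range(1000)]
          for x in intensities}

# Report the 10th/50th/90th percentile hazard curve at each threshold.
for x in intensities:
    p10, p50, p90 = (percentile(curves[x], q) for q in (10, 50, 90))
    print(f"threshold {x:>2}: 10th={p10:.3f} median={p50:.3f} 90th={p90:.3f}")
```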

  20. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows:
    - Each next-generation station measures all parameters needed for flux computations
    - Field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.
    - Final fluxes, radiation, weather and soil data are merged into a single quality-controlled file
    - Multiple flux stations are linked into an automated time-synchronized network
    - Flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts
    - PI can assign rights, allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions
    - Researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from

  1. Modeling of low-temperature plasmas generated using laser-induced breakdown spectroscopy: the ChemCam diagnostic tool on the Mars Science Laboratory Rover

    NASA Astrophysics Data System (ADS)

    Colgan, James

    2016-05-01

    We report on efforts to model the low-temperature plasmas generated using laser-induced breakdown spectroscopy (LIBS). LIBS is a minimally invasive technique that can quickly and efficiently determine the elemental composition of a target and is employed in an extremely wide range of applications due to its ease of use and fast turnaround. In particular, LIBS is the diagnostic tool used by the ChemCam instrument on the Mars Science Laboratory rover Curiosity. In this talk, we report on the use of the Los Alamos plasma modeling code ATOMIC to simulate LIBS plasmas, which are typically at temperatures of order 1 eV and electron densities of order 10^16-10^17 cm^-3. At such conditions, these plasmas are usually in local-thermodynamic equilibrium (LTE) and normally contain neutral and singly ionized species only, which then requires that modeling must use accurate atomic structure data for the element under investigation. Since LIBS devices are often employed in a very wide range of applications, it is therefore desirable to have accurate data for most of the elements in the periodic table, ideally including actinides. Here, we discuss some recent applications of our modeling using ATOMIC that have explored the plasma physics aspects of LIBS generated plasmas, and in particular discuss the modeling of a plasma formed from a basalt sample used as a ChemCam standard [1]. We also highlight some of the more general atomic physics challenges that are encountered when attempting to model low-temperature plasmas. The Los Alamos National Laboratory is operated by Los Alamos National Security, LLC for the National Nuclear Security Administration of the U.S. Department of Energy under Contract No. DE-AC52-06NA25396. Work performed in conjunction with D. P. Kilcrease, H. M. Johns, E. J. Judge, J. E. Barefield, R. C. Wiens, S. M. Clegg.
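At the LIBS-like conditions the abstract cites (temperatures near 1 eV, electron densities of 10^16-10^17 cm^-3), the LTE ionization balance between neutral and singly ionized species is governed by the Saha equation. The sketch below is purely illustrative and is not the ATOMIC code; the partition-function ratio and the Fe I ionization potential are assumed example values.

```python
import math

# Illustrative sketch (not the ATOMIC code): the Saha equation gives the
# LTE ionization balance at LIBS-like conditions (T ~ 1 eV,
# n_e ~ 1e16-1e17 cm^-3). The partition-function ratio and the Fe I
# ionization potential below are assumed example values.
K_B = 1.380649e-23       # Boltzmann constant, J/K
M_E = 9.1093837015e-31   # electron mass, kg
H = 6.62607015e-34       # Planck constant, J s
EV = 1.602176634e-19     # 1 eV in J

def saha_ratio(t_ev, n_e_cm3, chi_ev, g_ratio=2.0):
    """Ratio n_ion / n_neutral from the Saha equation.

    t_ev: temperature (eV); n_e_cm3: electron density (cm^-3);
    chi_ev: ionization potential (eV); g_ratio: assumed ratio of ion
    to neutral partition functions.
    """
    t_k = t_ev * EV / K_B    # temperature in K
    n_e = n_e_cm3 * 1e6      # electron density in m^-3
    lam = (2.0 * math.pi * M_E * K_B * t_k / H**2) ** 1.5
    return 2.0 * g_ratio * lam / n_e * math.exp(-chi_ev / t_ev)

# Fe I -> Fe II (chi ~ 7.90 eV) at typical LIBS plasma conditions:
ratio = saha_ratio(t_ev=1.0, n_e_cm3=1e17, chi_ev=7.90)
```

At these conditions the ratio is large for first ionization but collapses for second ionization (much higher chi), which is why such plasmas contain essentially only neutral and singly ionized species.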

  2. Rapid SAW Sensor Development Tools

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.

  3. 49 CFR 40.387 - What matters does the Director decide concerning a proposed PIE?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... proposed PIE? 40.387 Section 40.387 Transportation Office of the Secretary of Transportation PROCEDURES FOR... the Director decide concerning a proposed PIE? (a) Following the service agent's response (see § 40... from the service agent (see § 40.379(b)(1)) or on his or her own motion, the Director may dismiss a...

  4. ENGINEERING FORUM ISSUE: CONSIDERATIONS IN DECIDING TO TREAT CONTAMINATED UNSATURATED SOILS

    EPA Science Inventory

    The purpose of this document is to provide assistance in deciding whether in situ treatment of contaminated soils is a potentially feasible remedial alternative. Technical considerations that affect the decision to treat soils in situ are discussed. General factors which influence the sele...

  5. Improving Vocational Students' Consideration of Source Information When Deciding about Science Controversies

    ERIC Educational Resources Information Center

    Stadtler, Marc; Scharrer, Lisa; Macedo-Rouet, Monica; Rouet, Jean-François; Bromme, Rainer

    2016-01-01

    We present an empirical investigation of a classroom training fostering vocational students' consideration of source information when deciding about science-based controversies. The training was specifically aimed at raising students' awareness of the division of cognitive labor and the resulting need to take a source's competence into account…

  6. The Effect of "Career Cruising" on the Self-Efficacy of Students Deciding on Majors

    ERIC Educational Resources Information Center

    Cunningham, Karen; Smothers, Anthony

    2014-01-01

    We analyzed the impact of a self-assessment instrument on the self-efficacy of those deciding on majors in a university setting. Using a pre- and post-test methodology, we employed "Career Cruising" to measure career decision-making self-efficacy. Participants completed the "Career Decision Self-Efficacy-Short Form" (CDSE-SF)…

  7. 42 CFR 405.1016 - Time frames for deciding an appeal before an ALJ.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Time frames for deciding an appeal before an ALJ. 405.1016 Section 405.1016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... § 405.1037 against another party to the hearing, the adjudication periods discussed in paragraphs...

  8. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... indicates, or the ALJ determines that applying the standard timeframe for making a decision may seriously... 42 Public Health 3 2010-10-01 2010-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  9. Vaginal Birth After Cesarean Delivery: Deciding on a Trial of Labor After a Cesarean Delivery (TOLAC)

    MedlinePlus

    ... a previous pregnancy, TOLAC is not advised. • A pregnancy problem or a medical condition that makes vaginal delivery risky • Type of hospital—The hospital in which you have a TOLAC should be prepared to deal with emergencies that may arise. Whatever I decide, are there ...

  10. College or Training Programs: How to Decide. PACER Center ACTion Information Sheets. PHP-c115

    ERIC Educational Resources Information Center

    PACER Center, 2006

    2006-01-01

    A high school diploma opens the door to many exciting new options. These might include a first full-time job, or part-time or full-time attendance at a technical school, community college, or university. Students might want to obtain a certificate, an associate degree, or a diploma. With so many choices, it can be a challenge to decide which path…

  11. 5 CFR 890.1024 - Standard and burden of proof for deciding contests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1024 Standard and burden of proof for... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Standard and burden of proof for deciding contests. 890.1024 Section 890.1024 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT...

  12. Live and Let Die: CITES--How We Decide the Fate of the World's Species.

    ERIC Educational Resources Information Center

    Beasley, Conger, Jr.

    1992-01-01

    Discusses the significance of the decisions made at the Eighth Convention on the International Trade of Endangered Species (CITES) when governmental delegates and nongovernmental organizations from around the world decided the fate of potentially threatened and endangered species of plants and animals. Particular emphasis is placed on the politics…

  13. 20 CFR 416.1861 - Deciding whether you are a child: Are you a student?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... student? 416.1861 Section 416.1861 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered A Child § 416.1861 Deciding whether you are a child: Are you a student? (a) Are you a student? You are a student...

  14. 20 CFR 416.1861 - Deciding whether you are a child: Are you a student?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... student? 416.1861 Section 416.1861 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered A Child § 416.1861 Deciding whether you are a child: Are you a student? (a) Are you a student? You are a student...

  15. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to decide...) The extent to which the project will contribute to the advancement of maternal and child health and/or... mortality rate (relative to the latest average infant mortality rate in the United States or in the State...

  16. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to decide...) The extent to which the project will contribute to the advancement of maternal and child health and/or... mortality rate (relative to the latest average infant mortality rate in the United States or in the State...

  17. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... fact-finding. (a) Written decision. The suspending official shall issue a written decision on...

  18. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... fact-finding. (a) Written decision. The suspending official shall issue a written decision on...

  19. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... fact-finding. (a) Written decision. The suspending official shall issue a written decision on...

  20. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... fact-finding. (a) Written decision. The suspending official shall issue a written decision on...

  1. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... fact-finding. (a) Written decision. The suspending official shall issue a written decision on...

  2. The Five Stages of Deciding on a Purchase...or a Job.

    ERIC Educational Resources Information Center

    Summey, John H.; Anderson, Carol H.

    1992-01-01

    Describes five stages of deciding on purchase or job: recognition of employment need; career information search; evaluation of career alternatives; identification and acceptance of employment; and postchoice evaluation. Evaluated importance of freedom/significance, growth, and variety in career decisions of 362 college students. Concludes…

  3. 31 CFR 375.20 - When will the Treasury decide on which offers to accept?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false When will the Treasury decide on which offers to accept? 375.20 Section 375.20 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT...

  4. 31 CFR 375.20 - When will the Treasury decide on which offers to accept?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false When will the Treasury decide on which offers to accept? 375.20 Section 375.20 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  5. 31 CFR 375.20 - When will the Treasury decide on which offers to accept?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 2 2013-07-01 2013-07-01 false When will the Treasury decide on which offers to accept? 375.20 Section 375.20 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT...

  6. 31 CFR 375.20 - When will the Treasury decide on which offers to accept?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false When will the Treasury decide on which offers to accept? 375.20 Section 375.20 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT...

  7. 12 CFR 617.7415 - How does a qualified lender decide to restructure a loan?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false How does a qualified lender decide to restructure a loan? 617.7415 Section 617.7415 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM... projections, the lender may use benchmarks to determine the operational input costs and chattel...

  8. 12 CFR 617.7415 - How does a qualified lender decide to restructure a loan?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false How does a qualified lender decide to restructure a loan? 617.7415 Section 617.7415 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM... projections, the lender may use benchmarks to determine the operational input costs and chattel...

  9. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  10. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  11. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  12. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  13. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP... meet professionally-recognized quality standards, OPM shall obtain the input of trained...

  14. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false What criteria will DHHS use to decide which projects to fund? 51a.5 Section 51a.5 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to...

  15. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false What criteria will DHHS use to decide which projects to fund? 51a.5 Section 51a.5 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to...

  16. 42 CFR 51a.5 - What criteria will DHHS use to decide which projects to fund?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false What criteria will DHHS use to decide which projects to fund? 51a.5 Section 51a.5 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS PROJECT GRANTS FOR MATERNAL AND CHILD HEALTH § 51a.5 What criteria will DHHS use to...

  17. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... protest? 124.1008 Section 124.1008 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1008 When will SBA not decide...

  18. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... protest? 124.1008 Section 124.1008 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1008 When will SBA not decide...

  19. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... protest? 124.1008 Section 124.1008 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1008 When will SBA not decide...

  20. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... protest? 124.1008 Section 124.1008 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1008 When will SBA not decide...

  1. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... protest? 124.1008 Section 124.1008 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1008 When will SBA not decide...

  2. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  3. Polypharmacy or medication washout: an old tool revisited

    PubMed Central

    Hoffman, Daniel A; Schiller, Mark; Greenblatt, James M; Iosifescu, Dan V

    2011-01-01

    There has been a rapid increase in the use of polypharmacy in psychiatry possibly due to the introduction of newer drugs, greater availability of these newer drugs, excessive confidence in clinical trial results, widespread prescribing of psychotropic medications by primary care, and pressure to augment with additional medications for unresolved side effects or greater efficacy. Even the new generation of medications may not hold significant advantages over older drugs. In fact, there may be additional safety risks with polypharmacy being so widespread. Washout, as a clinical tool, is rarely done in medication management today. Studies have shown that augmenting therapy with additional medications resulted in 9.1%–34.1% dropouts due to intolerance of the augmentation, whereas studies of medication washout demonstrated only 5.9%–7.8% intolerance to the washout procedure. These perils justify reconsideration of medication washout before deciding on augmentation. There are unwarranted fears and resistance in the medical community toward medication washout, especially at the moment a physician is trying to decide whether to washout or add more medications to the treatment regimen. However, medication washout provides unique benefits to the physician: it establishes a new baseline of the disorder, helps identify medication efficacy from their adverse effects, and provides clarity of diagnosis and potential reduction of drug treatments, drug interactions, and costs. It may also reduce overall adverse events, not to mention a potential to reduce liability. After washout, physicians may be able to select the appropriate polypharmacy more effectively and safely, if necessary. Washout, while not for every patient, may be an effective tool for physicians who need to decide on whether to add potentially risky polypharmacy for a given patient. The risks of washout may, in some cases, be lower and the benefits may be clearly helpful for diagnosis, understanding medication

  4. Downhole tool

    DOEpatents

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprise an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  5. Examining the impact of larval source management and insecticide-treated nets using a spatial agent-based model of Anopheles gambiae and a landscape generator tool

    PubMed Central

    2013-01-01

    Background Agent-based models (ABMs) have been used to estimate the effects of malaria-control interventions. Early studies have shown the efficacy of larval source management (LSM) and insecticide-treated nets (ITNs) as vector-control interventions, applied both in isolation and in combination. However, the robustness of results can be affected by several important modelling assumptions, including the type of boundary used for landscapes, and the number of replicated simulation runs reported in results. Selection of the ITN coverage definition may also affect the predictive findings. Hence, by replication, independent verification of prior findings of published models bears special importance. Methods A spatially-explicit entomological ABM of Anopheles gambiae is used to simulate the resource-seeking process of mosquitoes in grid-based landscapes. To explore LSM and replicate results of an earlier LSM study, the original landscapes and scenarios are replicated by using a landscape generator tool, and 1,800 replicated simulations are run using absorbing and non-absorbing boundaries. To explore ITNs and evaluate the relative impacts of the different ITN coverage schemes, the settings of an earlier ITN study are replicated, the coverage schemes are defined and simulated, and 9,000 replicated simulations for three ITN parameters (coverage, repellence and mortality) are run. To evaluate LSM and ITNs in combination, landscapes with varying densities of houses and human populations are generated, and 12,000 simulations are run. Results General agreement with an earlier LSM study is observed when an absorbing boundary is used. However, using a non-absorbing boundary produces significantly different results, which may be attributed to the unrealistic killing effect of an absorbing boundary. Abundance cannot be completely suppressed by removing aquatic habitats within 300 m of houses. Also, with density-dependent oviposition, removal of insufficient number of aquatic
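The landscape-generation and LSM steps described above can be illustrated with a toy grid model. This is not the study's actual landscape generator tool; grid size, cell resolution and densities are invented, with only the 300 m habitat-removal radius taken from the text.

```python
import numpy as np

# Toy landscape generator and LSM intervention (habitat removal within
# 300 m of houses, as in the study); grid size, cell resolution and
# densities are invented for illustration, not the actual tool's values.
rng = np.random.default_rng(42)

SIZE = 100        # 100 x 100 cells; assume 10 m cells -> 1 km x 1 km
LSM_RADIUS = 30   # in cells: 30 * 10 m = 300 m

houses = rng.random((SIZE, SIZE)) < 0.005    # sparse houses
habitats = rng.random((SIZE, SIZE)) < 0.02   # aquatic habitats

# Distance (in cells) from every cell to its nearest house (brute force).
hy, hx = np.nonzero(houses)
yy, xx = np.mgrid[0:SIZE, 0:SIZE]
dist = np.hypot(yy[..., None] - hy, xx[..., None] - hx).min(axis=-1)

# LSM: remove all aquatic habitats within 300 m of any house.
habitats_after = habitats & (dist > LSM_RADIUS)
removed = int(habitats.sum() - habitats_after.sum())
```

In an ABM, the surviving `habitats_after` cells would then be the oviposition sites available to resource-seeking mosquitoes; the boundary condition (absorbing versus non-absorbing) is applied when agents reach the grid edge.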

  6. On the utility of spectroscopic imaging as a tool for generating geometrically accurate MR images and parameter maps in the presence of field inhomogeneities and chemical shift effects.

    PubMed

    Bakker, Chris J G; de Leeuw, Hendrik; van de Maat, Gerrit H; van Gorp, Jetse S; Bouwman, Job G; Seevinck, Peter R

    2013-01-01

    Lack of spatial accuracy is a recognized problem in magnetic resonance imaging (MRI) which severely detracts from its value as a stand-alone modality for applications that put high demands on geometric fidelity, such as radiotherapy treatment planning and stereotactic neurosurgery. In this paper, we illustrate the potential and discuss the limitations of spectroscopic imaging as a tool for generating purely phase-encoded MR images and parameter maps that preserve the geometry of an object and allow localization of object features in world coordinates. Experiments were done on a clinical system with standard facilities for imaging and spectroscopy. Images were acquired with a regular spin echo sequence and a corresponding spectroscopic imaging sequence. In the latter, successive samples of the acquired echo were used for the reconstruction of a series of evenly spaced images in the time and frequency domain. Experiments were done with a spatial linearity phantom and a series of test objects representing a wide range of susceptibility- and chemical-shift-induced off-resonance conditions. In contrast to regular spin echo imaging, spectroscopic imaging was shown to be immune to off-resonance effects, such as those caused by field inhomogeneity, susceptibility, chemical shift, f0 offset and field drift, and to yield geometrically accurate images and parameter maps that allowed object structures to be localized in world coordinates. From these illustrative examples and a discussion of the limitations of purely phase-encoded imaging techniques, it is concluded that spectroscopic imaging offers a fundamental solution to the geometric deficiencies of MRI which may evolve toward a practical solution when full advantage will be taken of current developments with regard to scan time reduction. This perspective is backed up by a demonstration of the significant scan time reduction that may be achieved by the use of compressed sensing for a simple phantom. PMID:22898694
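The core idea, position encoded purely by phase while off-resonance is resolved along a separate spectral dimension, can be demonstrated in a small 1D simulation. This is an illustrative sketch, not the authors' reconstruction code; matrix size, dwell time and the off-resonance value are invented.

```python
import numpy as np

# Illustrative 1D sketch (not the authors' code): in purely phase-encoded
# (spectroscopic) imaging, position comes only from the phase-encoding
# dimension, so off-resonance cannot shift an object geometrically; it
# appears instead along the separately resolved spectral dimension.
N = 64      # phase-encoding steps / voxels (assumed)
T = 64      # echo samples per encoding (assumed)
dt = 1e-3   # seconds between echo samples (assumed)

rho = np.zeros(N); rho[20] = 1.0; rho[40] = 1.0   # two point objects
f_off = np.zeros(N); f_off[40] = 200.0            # 200 Hz off-resonance at x=40

x = np.arange(N)
k = np.arange(N)
t = np.arange(T) * dt

# Acquired signal: spatial phase encoding over k, free evolution at the
# local off-resonance frequency over the echo time t.
s = np.einsum('x,kx,tx->kt', rho,
              np.exp(-2j * np.pi * np.outer(k, x) / N),
              np.exp(2j * np.pi * np.outer(t, f_off)))

img_t = np.fft.ifft(s, axis=0)    # image per echo sample (time series)
spec = np.fft.fft(img_t, axis=1)  # spectrum per voxel

# Spatial profile: both objects stay at x=20 and x=40; the off-resonant
# one is displaced in frequency, not in position.
profile = np.abs(spec).sum(axis=1)
```

In a frequency-encoded spin echo, by contrast, the 200 Hz offset would translate into a spatial shift of the second object; here it only moves the object's spectral peak.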

  7. NI-49SMART SUCKER: NEXT GENERATION SMART SURGICAL TOOL FOR INTRAOPERATIVE BRAIN TUMOR RESECTION USING TIME RESOLVED LASER INDUCED FLUORESCENCE SPECTROSCOPY

    PubMed Central

    Kittle, David S.; Butte, Pramod V.; Vasefi, Fartash; Patil, Chirag G.; Black, Keith

    2014-01-01

    Primary brain tumors are highly lethal tumors for which surgical resection is the primary treatment of choice. It has been shown that the survival rate is directly related to the extent of tumor resection. To aid the surgeon in achieving near-complete resection, novel technologies are required. Time-resolved laser induced fluorescence spectroscopy (TRLIFS) promises to be one such technology: the tissue is excited using an ultra-short laser pulse and the corresponding fluorescence intensity decay is captured. Based on the fluorescence spectrum and the decay characteristics at various color bands from TRLIFS, differentiation of tumor from normal brain tissue is possible in real time. We built a portable TRLIFS system using custom optics and hardware (laser excitation: 355 nm, 400 ps pulse width, 5 μJ/pulse; PMT detector: Photek, rise time 80 ps; digitizer: 7 giga-samples per second) which is capable of providing results in real time (every 50 milliseconds). We have designed a custom probe attached to a Rhoton sucker (the "Smart Sucker") to collect data during surgical resection from patients at Cedars-Sinai Medical Center. The histopathological diagnosis of the site under study with TRLIFS is confirmed with a biopsy and H&E staining. We will present our preliminary data from human brain tumor samples collected in vivo. Our preliminary study shows that TRLIFS is capable of classifying low grade tumors with high sensitivity and specificity. This study will also demonstrate the potential of using the TRLIFS system to enhance surgical instrumentation, aiding surgeons in near-complete excision of tumors and bringing these instruments into the next generation of smart tools.
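    The lifetime analysis described above can be illustrated with a minimal sketch: a mono-exponential fluorescence decay I(t) = I0·exp(-t/τ) is fitted by log-linear least squares to recover the lifetime τ. The lifetime and sampling values below are illustrative assumptions, not parameters from the study.

```python
import math

def fit_lifetime(times_ns, counts):
    """Fit I(t) = I0 * exp(-t / tau) by least squares on log(counts).
    Returns (I0, tau). Assumes all counts > 0 (noiseless illustration)."""
    ys = [math.log(c) for c in counts]
    n = len(times_ns)
    sx, sy = sum(times_ns), sum(ys)
    sxx = sum(x * x for x in times_ns)
    sxy = sum(x * y for x, y in zip(times_ns, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return math.exp(intercept), -1.0 / slope

# Synthetic decay: assumed lifetime of 2.5 ns, sampled every 0.1 ns
tau_true = 2.5
t = [0.1 * i for i in range(64)]
signal = [1000.0 * math.exp(-ti / tau_true) for ti in t]
i0, tau = fit_lifetime(t, signal)
```

    In practice a measured decay is noisy and convolved with the instrument response, so real TRLIFS pipelines use deconvolution and more robust estimators rather than a plain log-linear fit; this sketch only shows the underlying idea of extracting a decay constant.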

  8. Cratylia mollis 1, 4 lectin: a new biotechnological tool in IL-6, IL-17A, IL-22, and IL-23 induction and generation of immunological memory.

    PubMed

    de Oliveira, Priscilla Stela Santana; Rêgo, Moacyr Jesus Barreto de Melo; da Silva, Rafael Ramos; Cavalcanti, Mariana Brayner; Galdino, Suely Lins; Correia, Maria Tereza dos Santos; Coelho, Luana Cassandra Breitenbach Barroso; Pitta, Maira Galdino da Rocha

    2013-01-01

    Cratylia mollis lectin has already been shown to induce cytokines in the Th1 and Th2 pathways. This study therefore aimed to evaluate Cramoll 1,4 for IL-6, IL-17A, IL-22, and IL-23 induction, and to analyze the immunological memory mechanism by reinducing lymphocyte stimulation. Initially we performed a screening in cultured splenocytes, where Cramoll 1,4 stimulated IL-6 production 5-fold more than ConA (P < 0.05). The same behavior was observed for IL-22, where the increase was greater than 4-fold. Nevertheless, IL-17A induction was similar for both lectins. In PBMCs, IL-6 and IL-17A followed the same course as in splenocytes. For IL-22 and IL-23 stimulation, Cramoll 1,4 was more efficient than ConA, especially for IL-23 (P < 0.01). Analyzing reinduced lymphocyte stimulation, IL-17A production was higher (P < 0.001) when the first stimulus was performed with Cramoll 1,4 at 1 μg/mL and the second at 5 μg/mL. IL-22 showed significant differences (P < 0.01) under the same condition. IL-23, however, showed the best response when the first stimulus was performed with Cramoll 1,4 at 100 ng/mL and the second at 5 μg/mL. We conclude that Cramoll 1,4 induces IL-6, IL-17A, IL-22, and IL-23 in vitro more effectively than Concanavalin A, besides generating immunological memory, making it a potential biotechnological tool for Th17 pathway studies. PMID:23586026

  9. RHyThM, a tool for analysis of PDOS formatted hyperthermia treatment data generated by the BSD2000/3D system.

    PubMed

    Fatehi, Daryoush; de Bruijne, Maarten; van der Zee, Jacoba; van Rhoon, Gerard C

    2006-03-01

    One of the systems used by hyperthermia (HT) groups for heating tumours in the pelvic region is the BSD2000 system. Previous versions of the BSD2000 operate on a PDOS machine, and the majority of the currently installed BSD2000/3D systems are still running under PDOS. Accessing the PDOS-formatted treatment data provided by the BSD2000/3D presents some difficulties. To facilitate analysis of the PDOS-formatted treatment data generated by the BSD2000/3D, a programme called RHyThM (Rotterdam Hyperthermia Thermal Modulator) has been created. The purpose of RHyThM is first to read and check the integrity and validity of the treatment data for each measurement in time and space as provided by the BSD2000/3D, and secondly to register a tissue type, based on computer tomography information, for each temperature probe position. Prior to any analyses, RHyThM shows the temperature profiles, enabling the user to check on probe movement and to correct for unrealistically high temperature gradients in time and space. Subsequently, this approved data set is saved in a 'mother-file' for future on-demand thermal dose analyses. A unique feature of RHyThM is that it also shows all radiofrequency (RF) power signals for inspection. Finally, to make a quick assessment of the quality of the applied HT treatment, RHyThM reports several temperature indices for bladder, vagina and rectum as well as RF-power related quantities. In summary, RHyThM is considered a valuable tool as it quickly provides a quality index per treatment, which serves as input for the preparation of the next treatment. Further, it makes verified and improved primary data sets accessible for further analysis with advanced statistical programmes. PMID:16754600

  10. Analysis of the Vaginal Microbiome by Next-Generation Sequencing and Evaluation of its Performance as a Clinical Diagnostic Tool in Vaginitis

    PubMed Central

    Hong, Ki Ho; Hong, Sung Kuk; Cho, Sung Im; Ra, Eunkyung; Han, Kyung Hee; Kang, Soon Beom; Kim, Eui-Chong; Park, Sung Sup

    2016-01-01

    Background Next-generation sequencing (NGS) can detect many more microorganisms of a microbiome than traditional methods. This study aimed to analyze the vaginal microbiomes of Korean women by using NGS that included bacteria and other microorganisms. The NGS results were compared with the results of other assays, and NGS was evaluated for its feasibility for predicting vaginitis. Methods In total, 89 vaginal swab specimens were collected. Microscopic examinations of Gram staining and microbiological cultures were conducted on 67 specimens. NGS was performed with the GS Junior system on all of the vaginal specimens for the 16S rRNA, internal transcribed spacer (ITS), and Tvk genes to detect bacteria, fungi, and Trichomonas vaginalis. In addition, DNA probe assays of the Candida spp., Gardnerella vaginalis, and Trichomonas vaginalis were performed. Various predictors of diversity that were obtained from the NGS data were analyzed to predict vaginitis. Results ITS sequences were obtained in over half of the specimens (56.2%). The compositions of the intermediate and vaginitis Nugent score groups were similar to each other but differed from the composition of the normal score group. The fraction of the Lactobacillus spp. showed the highest area under the curve value (0.8559) in ROC curve analysis. The NGS and DNA probe assay results showed good agreement (range, 86.2-89.7%). Conclusions Fungi as well as bacteria should be considered in the investigation of the vaginal microbiome. The intermediate and vaginitis Nugent score groups were indistinguishable by NGS. NGS is a promising diagnostic tool for the vaginal microbiome and vaginitis, although some problems need to be resolved. PMID:27374709
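    The ROC analysis reported above can be sketched in a few lines: the area under the ROC curve equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case (the Mann-Whitney formulation). The labels and Lactobacillus fractions below are made-up illustration values, not data from the study.

```python
def roc_auc(labels, scores):
    """Mann-Whitney AUC: fraction of (positive, negative) pairs in which
    the positive case scores higher; ties count as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical example: a higher Lactobacillus fraction suggests lower
# vaginitis risk, so (1 - fraction) is used as the score for vaginitis (label 1).
labels = [1, 1, 1, 0, 0, 0, 1, 0]
fraction = [0.05, 0.10, 0.20, 0.80, 0.90, 0.35, 0.40, 0.70]
auc = roc_auc(labels, [1 - f for f in fraction])
```

    With these made-up values the AUC comes out below 1 because one negative specimen scores above one positive specimen, mirroring the imperfect but high discrimination (0.8559) reported in the abstract.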

  11. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…
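    The abstract above is truncated, but the mathematical crux of the question can be illustrated, on the assumption that the article concerns distributions over the positive integers: no uniform distribution exists over all positive integers, yet a "random" positive integer can still be drawn from a non-uniform distribution that gives every integer positive probability, such as the geometric distribution. This sketch is an interpretation, not a reconstruction of the article's method.

```python
import random

def random_positive_integer(p=0.5, rng=random):
    """Sample from the geometric distribution on {1, 2, 3, ...}:
    P(N = k) = (1 - p)**(k - 1) * p.  Every positive integer has
    positive probability, though small integers are far more likely."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(42)  # seeded for reproducibility
samples = [random_positive_integer(0.5, rng) for _ in range(1000)]
```

    With p = 0.5 the expected value is 1/p = 2, so roughly half of all draws are 1 and arbitrarily large integers appear only with exponentially small probability.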

  12. 5 CFR 890.1069 - Information the debarring official must consider in deciding a provider's contest of proposed...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL... deciding a provider's contest of proposed penalties and assessments. (a) Documentary material and...

  13. Split views among parents regarding children's right to decide about participation in research: a questionnaire survey.

    PubMed

    Swartling, U; Helgesson, G; Hansson, M G; Ludvigsson, J

    2009-07-01

    Based on extensive questionnaire data, this paper focuses on parents' views about children's right to decide about participation in research. The data originate from 4000 families participating in a longitudinal prospective screening since 1997. Although current regulations and recommendations underline that children should have influence over their participation, many parents in this study disagree. Most (66%) were positive about providing information to the child about relevant aspects of the study. However, responding parents were split about whether or not children should at some point be allowed decisional authority when participating in research: 41.6% of the parents reported being against or unsure. Those who responded positively believed that children should be allowed to decide about blood-sampling procedures (70%), but to a lesser extent about participation (48.5%), analyses of samples (19.7%) and biological bank storage (15.4%). Respondents who do not think children should have a right to decide strongly stressed that as many participants as possible should remain in the study, and that children do not have the competence to understand the consequences for research. When asked what interests they consider most important in paediatric research, child autonomy and decision-making was ranked lowest. We discuss the implications of these findings. PMID:19567697

  14. Databases and tools for nuclear astrophysics applications. BRUSsels Nuclear LIBrary (BRUSLIB), Nuclear Astrophysics Compilation of REactions II (NACRE II) and Nuclear NETwork GENerator (NETGEN)

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Goriely, S.; Jorissen, A.; Chen, G. L.; Arnould, M.

    2013-01-01

    An update of a previous description of the BRUSLIB + NACRE package of nuclear data for astrophysics and of the web-based nuclear network generator NETGEN is presented. The new version of BRUSLIB contains the latest predictions of a wide variety of nuclear data based on the most recent version of the Brussels-Montreal Skyrme-Hartree-Fock-Bogoliubov model. The nuclear masses, radii, spin/parities, deformations, single-particle schemes, matter densities, nuclear level densities, E1 strength functions, fission properties, and partition functions are provided for all nuclei lying between the proton and neutron drip lines over the 8 ≤ Z ≤ 110 range, whose evaluation is based on a unique microscopic model that ensures a good compromise between accuracy, reliability, and feasibility. In addition, these various ingredients are used to calculate about 100 000 Hauser-Feshbach neutron-, proton-, α-, and γ-induced reaction rates based on the reaction code TALYS. NACRE is superseded by the NACRE II compilation for 15 charged-particle transfer reactions and 19 charged-particle radiative captures on stable targets with mass numbers A < 16. NACRE II features the inclusion of experimental data made available after the publication of NACRE in 1999 and up to 2011. In addition, the extrapolation of the available data to the very low energies of astrophysical relevance is improved through the systematic use of phenomenological potential models. Uncertainties in the rates are also evaluated on this basis. Finally, the latest release v10.0 of the web-based tool NETGEN is presented. In addition to the data already used in the previous NETGEN package, it contains in a fully documented form the new BRUSLIB and NACRE II data, as well as new experiment-based radiative neutron capture cross sections. The full new versions of BRUSLIB, NACRE II, and NETGEN are available electronically from the nuclear database at http://www.astro.ulb.ac.be/NuclearData. The nuclear material is presented in

  15. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  16. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  17. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  18. 33 CFR 96.440 - How will the Coast Guard decide whether to approve an organization's request to be authorized?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false How will the Coast Guard decide... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE... Act on Behalf of the U.S. § 96.440 How will the Coast Guard decide whether to approve an...

  19. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15...

  20. 12 CFR 617.7620 - What should the System institution do when it decides to sell acquired agricultural real estate...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... decides to sell acquired agricultural real estate at a public auction? 617.7620 Section 617.7620 Banks and... What should the System institution do when it decides to sell acquired agricultural real estate at a public auction? System institutions electing to sell or lease acquired agricultural real estate or...

  1. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it...

  2. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15...

  3. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it...

  4. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15...

  5. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it...

  6. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15...

  7. 12 CFR 617.7620 - What should the System institution do when it decides to sell acquired agricultural real estate...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... decides to sell acquired agricultural real estate at a public auction? 617.7620 Section 617.7620 Banks and... What should the System institution do when it decides to sell acquired agricultural real estate at a public auction? System institutions electing to sell or lease acquired agricultural real estate or...

  8. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it...

  9. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1,...

  10. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1,...

  11. 20 CFR 408.630 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false How will we notify you when we decide you need a representative payee? 408.630 Section 408.630 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SPECIAL BENEFITS FOR CERTAIN WORLD WAR II VETERANS Representative Payment § 408.630 How will we notify you when we decide you need a...

  12. 20 CFR 416.630 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false How will we notify you when we decide you need a representative payee? 416.630 Section 416.630 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Representative Payment § 416.630 How will we notify you when we decide you need...

  13. 20 CFR 404.2030 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false How will we notify you when we decide you need a representative payee? 404.2030 Section 404.2030 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Representative Payment § 404.2030 How will we notify you when we decide you need...

  14. 20 CFR 404.2030 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false How will we notify you when we decide you need a representative payee? 404.2030 Section 404.2030 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Representative Payment § 404.2030 How will we notify you when we decide you need...

  15. 25 CFR 103.34 - What if the lender and borrower decide to change the terms of the loan?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... identity or organizational structure of the borrower. (5) Allow any material change in the use of loan... 25 Indians 1 2010-04-01 2010-04-01 false What if the lender and borrower decide to change the... What if the lender and borrower decide to change the terms of the loan? (a) The lender must...

  16. 5 CFR 890.1069 - Information the debarring official must consider in deciding a provider's contest of proposed...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Information the debarring official must consider in deciding a provider's contest of proposed penalties and assessments. 890.1069 Section 890.1069... deciding a provider's contest of proposed penalties and assessments. (a) Documentary material and...

  17. 5 CFR 1201.175 - Judicial review of cases decided under 5 U.S.C. 7702.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Discrimination Special Panel § 1201.175 Judicial review of cases decided under 5 U.S.C. 7702. (a) Place and type... of cases decided under 5 U.S.C. 7702. Those cases include appeals from actions taken under the... 5 U.S.C. 7702 must be filed within 30 days after the appellant received notice of the...

  18. 40 CFR 35.4225 - What if my group decides a prospective contractor has a conflict of interest?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... contractor has a conflict of interest? 35.4225 Section 35.4225 Protection of Environment ENVIRONMENTAL... decides a prospective contractor has a conflict of interest? If, after evaluating the information in § 35.4220, your group decides a prospective contractor has a significant conflict of interest that cannot...

  19. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Human Services use to decide which family planning services projects to fund and in what amount? 59.7 Section 59.7 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS GRANTS... Department of Health and Human Services use to decide which family planning services projects to fund and...

  20. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Human Services use to decide which family planning services projects to fund and in what amount? 59.7... FOR FAMILY PLANNING SERVICES Project Grants for Family Planning Services § 59.7 What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and...

  1. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Human Services use to decide which family planning services projects to fund and in what amount? 59.7... FOR FAMILY PLANNING SERVICES Project Grants for Family Planning Services § 59.7 What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and...

  2. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Human Services use to decide which family planning services projects to fund and in what amount? 59.7... FOR FAMILY PLANNING SERVICES Project Grants for Family Planning Services § 59.7 What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and...

  3. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Human Services use to decide which family planning services projects to fund and in what amount? 59.7... FOR FAMILY PLANNING SERVICES Project Grants for Family Planning Services § 59.7 What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and...

  4. 31 CFR 19.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false What does the debarring official consider in deciding whether to debar me? 19.845 Section 19.845 Money and Finance: Treasury Office of the... does the debarring official consider in deciding whether to debar me? (a) The debarring official...

  5. 21 CFR 1404.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 9 2013-04-01 2013-04-01 false What does the debarring official consider in deciding whether to debar me? 1404.845 Section 1404.845 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL... debarring official consider in deciding whether to debar me? (a) The debarring official may debar you...

  6. 29 CFR 1471.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false What does the debarring official consider in deciding whether to debar me? 1471.845 Section 1471.845 Labor Regulations Relating to Labor (Continued) FEDERAL....845 What does the debarring official consider in deciding whether to debar me? (a) The...

  7. 31 CFR 19.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance: Treasury 1 2012-07-01 2012-07-01 false What does the debarring official consider in deciding whether to debar me? 19.845 Section 19.845 Money and Finance: Treasury Office of the... does the debarring official consider in deciding whether to debar me? (a) The debarring official...

  8. 41 CFR 105-68.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What does the debarring official consider in deciding whether to debar me? 105-68.845 Section 105-68.845 Public Contracts and... (NONPROCUREMENT) Debarment § 105-68.845 What does the debarring official consider in deciding whether to debar...

  9. 22 CFR 208.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 1 2011-04-01 2011-04-01 false What does the debarring official consider in deciding whether to debar me? 208.845 Section 208.845 Foreign Relations AGENCY FOR INTERNATIONAL... debarring official consider in deciding whether to debar me? (a) The debarring official may debar you...

  10. 31 CFR 19.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance: Treasury 1 2011-07-01 2011-07-01 false What does the debarring official consider in deciding whether to debar me? 19.845 Section 19.845 Money and Finance: Treasury Office of the... does the debarring official consider in deciding whether to debar me? (a) The debarring official...

  11. 29 CFR 1471.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false What does the debarring official consider in deciding whether to debar me? 1471.845 Section 1471.845 Labor Regulations Relating to Labor (Continued) FEDERAL....845 What does the debarring official consider in deciding whether to debar me? (a) The...

  12. 29 CFR 1471.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false What does the debarring official consider in deciding whether to debar me? 1471.845 Section 1471.845 Labor Regulations Relating to Labor (Continued) FEDERAL....845 What does the debarring official consider in deciding whether to debar me? (a) The...

  13. 21 CFR 1404.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false What does the debarring official consider in deciding whether to debar me? 1404.845 Section 1404.845 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL... debarring official consider in deciding whether to debar me? (a) The debarring official may debar you...

  14. 21 CFR 1404.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 9 2014-04-01 2014-04-01 false What does the debarring official consider in deciding whether to debar me? 1404.845 Section 1404.845 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL... debarring official consider in deciding whether to debar me? (a) The debarring official may debar you...

  15. 29 CFR 1471.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false What does the debarring official consider in deciding whether to debar me? 1471.845 Section 1471.845 Labor Regulations Relating to Labor (Continued) FEDERAL....845 What does the debarring official consider in deciding whether to debar me? (a) The...

  16. 2 CFR 180.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 2 Grants and Agreements 1 2013-01-01 2013-01-01 false What does the debarring official consider in deciding whether to debar me? 180.845 Section 180.845 Grants and Agreements Office of Management and Budget... (NONPROCUREMENT) Debarment § 180.845 What does the debarring official consider in deciding whether to debar me?...

  17. 5 CFR 919.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false What does the debarring official consider in deciding whether to debar me? 919.845 Section 919.845 Administrative Personnel OFFICE OF PERSONNEL... (NONPROCUREMENT) Debarment § 919.845 What does the debarring official consider in deciding whether to debar me?...

  18. 41 CFR 105-68.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What does the debarring official consider in deciding whether to debar me? 105-68.845 Section 105-68.845 Public Contracts and... (NONPROCUREMENT) Debarment § 105-68.845 What does the debarring official consider in deciding whether to debar...

  19. 31 CFR 19.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance: Treasury 1 2013-07-01 2013-07-01 false What does the debarring official consider in deciding whether to debar me? 19.845 Section 19.845 Money and Finance: Treasury Office of the... does the debarring official consider in deciding whether to debar me? (a) The debarring official...

  20. 21 CFR 1404.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false What does the debarring official consider in deciding whether to debar me? 1404.845 Section 1404.845 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL... debarring official consider in deciding whether to debar me? (a) The debarring official may debar you...

  1. 5 CFR 919.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false What does the debarring official consider in deciding whether to debar me? 919.845 Section 919.845 Administrative Personnel OFFICE OF PERSONNEL... (NONPROCUREMENT) Debarment § 919.845 What does the debarring official consider in deciding whether to debar me?...

  2. 5 CFR 919.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false What does the debarring official consider in deciding whether to debar me? 919.845 Section 919.845 Administrative Personnel OFFICE OF PERSONNEL... (NONPROCUREMENT) Debarment § 919.845 What does the debarring official consider in deciding whether to debar me?...

  3. 29 CFR 1471.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false What does the debarring official consider in deciding whether to debar me? 1471.845 Section 1471.845 Labor Regulations Relating to Labor (Continued) FEDERAL....845 What does the debarring official consider in deciding whether to debar me? (a) The...

  4. 5 CFR 919.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false What does the debarring official consider in deciding whether to debar me? 919.845 Section 919.845 Administrative Personnel OFFICE OF PERSONNEL... (NONPROCUREMENT) Debarment § 919.845 What does the debarring official consider in deciding whether to debar me?...

  5. 41 CFR 105-68.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What does the debarring official consider in deciding whether to debar me? 105-68.845 Section 105-68.845 Public Contracts and... (NONPROCUREMENT) Debarment § 105-68.845 What does the debarring official consider in deciding whether to debar...

  6. 2 CFR 180.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false What does the debarring official consider in deciding whether to debar me? 180.845 Section 180.845 Grants and Agreements Office of Management and Budget...) Debarment § 180.845 What does the debarring official consider in deciding whether to debar me? (a)...

  7. 2 CFR 180.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 2 Grants and Agreements 1 2012-01-01 2012-01-01 false What does the debarring official consider in deciding whether to debar me? 180.845 Section 180.845 Grants and Agreements Office of Management and Budget... (NONPROCUREMENT) Debarment § 180.845 What does the debarring official consider in deciding whether to debar me?...

  8. 21 CFR 1404.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false What does the debarring official consider in deciding whether to debar me? 1404.845 Section 1404.845 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL... debarring official consider in deciding whether to debar me? (a) The debarring official may debar you...

  9. 5 CFR 919.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false What does the debarring official consider in deciding whether to debar me? 919.845 Section 919.845 Administrative Personnel OFFICE OF PERSONNEL... (NONPROCUREMENT) Debarment § 919.845 What does the debarring official consider in deciding whether to debar me?...

  10. 2 CFR 180.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false What does the debarring official consider in deciding whether to debar me? 180.845 Section 180.845 Grants and Agreements Office of Management and Budget... § 180.845 What does the debarring official consider in deciding whether to debar me? (a) The...

  11. 22 CFR 208.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false What does the debarring official consider in deciding whether to debar me? 208.845 Section 208.845 Foreign Relations AGENCY FOR INTERNATIONAL... debarring official consider in deciding whether to debar me? (a) The debarring official may debar you...

  12. 31 CFR 19.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 1 2014-07-01 2014-07-01 false What does the debarring official consider in deciding whether to debar me? 19.845 Section 19.845 Money and Finance: Treasury Office of the... does the debarring official consider in deciding whether to debar me? (a) The debarring official...

  13. Tools, flies and what to do next

    NASA Astrophysics Data System (ADS)

    Gomez-Marin, A.

    2013-01-01

In these brief notes addressed to students and researchers, recent advances in modern neurobiology are discussed in the light of some of its challenges. I use fly larval chemotaxis as a platform to debate how much we are able to do with the available tools, as opposed to how little we actually understand about what it means to decide.

  14. Deciding when to decide: time-variant sequential sampling models explain the emergence of value-based decisions in the human brain.

    PubMed

    Gluth, Sebastian; Rieskamp, Jörg; Büchel, Christian

    2012-08-01

The cognitive and neuronal mechanisms of perceptual decision making have been successfully linked to sequential sampling models. These models describe the decision process as a gradual accumulation of sensory evidence over time. The temporal evolution of economic choices, however, remains largely unexplored. We tested whether sequential sampling models help to understand the formation of value-based decisions in terms of behavior and brain responses. We used functional magnetic resonance imaging (fMRI) to measure brain activity while human participants performed a buying task in which they freely decided how and when to choose. Behavior was accurately predicted by a time-variant sequential sampling model that uses a decreasing rather than fixed decision threshold to estimate the time point of the decision. Presupplementary motor area, caudate nucleus, and anterior insula activation was associated with the accumulation of evidence over time. Furthermore, at the beginning of the decision process the fMRI signal in these regions accounted for trial-by-trial deviations from behavioral model predictions: relatively high activation preceded relatively early responses. The updating of value information was correlated with signals in the ventromedial prefrontal cortex, left and right orbitofrontal cortex, and ventral striatum but also in the primary motor cortex well before the response itself. Our results support a view of value-based decisions as emerging from sequential sampling of evidence and suggest a close link between the accumulation process and activity in the motor system when people are free to respond at any time. PMID:22855817
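The model class described above, noisy evidence accumulating toward a bound that shrinks over time, can be sketched in a few lines. The function and all parameter values below are illustrative stand-ins, not the model actually fitted in the study:

```python
import random

def simulate_decision(drift=0.05, noise=0.3, b0=10.0, decay=0.02,
                      dt=1.0, max_steps=10000, seed=1):
    """Accumulate noisy evidence until it crosses a threshold that
    decreases (collapses) over time; return the decision time."""
    rng = random.Random(seed)
    x = 0.0
    for t in range(1, max_steps + 1):
        x += drift * dt + rng.gauss(0.0, noise)
        bound = max(b0 - decay * t, 0.0)  # linearly collapsing threshold
        if abs(x) >= bound:
            return t  # decision made at step t
    return max_steps

# With a collapsing bound a decision is always reached eventually,
# because the bound shrinks to zero even when evidence drifts slowly.
t_fast_decay = simulate_decision(decay=0.05)
t_slow_decay = simulate_decision(decay=0.001)
```

This is what distinguishes the time-variant model from a fixed-threshold one: slow trials are still forced to terminate, so the model predicts when a decision occurs, not only its outcome.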

  15. How many invariant polynomials are needed to decide local unitary equivalence of qubit states?

    SciTech Connect

    Maciążek, Tomasz; Oszmaniec, Michał

    2013-09-15

Given L-qubit states with the fixed spectra of reduced one-qubit density matrices, we find a formula for the minimal number of invariant polynomials needed for solving the local unitary (LU) equivalence problem, that is, the problem of deciding whether two states can be connected by local unitary operations. Interestingly, this number is not the same for every collection of spectra. Some spectra require fewer polynomials to solve the LU equivalence problem than others. The result is obtained using geometric methods, i.e., by calculating the dimensions of reduced spaces stemming from the symplectic reduction procedure.

  16. The risk-benefit balance in the United States: who decides?

    PubMed

    Graham, John; Hu, Jianhui

    2007-01-01

    A health policy decision often requires a balancing of risks, costs, and benefits. In this paper we illustrate that there is no uniform answer in the United States to the question of who decides the risk-benefit balance. We use a wide range of case examples from medicine and public health to show the different approaches that are used to allocate decision-making responsibility. Our ultimate purpose is to urge the U.S. health policy community to develop a more consistent way of thinking about how risk-benefit decisions could be guided by general principles. PMID:17485737

  17. Application of ChemDraw NMR Tool: Correlation of Program-Generated ¹³C Chemical Shifts and pKa Values of Para-Substituted Benzoic Acids

    ERIC Educational Resources Information Center

    Hongyi Wang

    2005-01-01

A study uses the ChemDraw nuclear magnetic resonance spectroscopy (NMR) tool to process 15 para-substituted benzoic acids and generate ¹³C NMR chemical shifts of C1 through C5. The data were plotted against their pKa values, and a fairly good linear fit was found for pKa versus δC1.

  18. Comparison of select reference management tools.

    PubMed

    Zhang, Yingting

    2012-01-01

Bibliographic management tools have been widely used by researchers to store, organize, and manage their references for research papers, theses, dissertations, journal articles, and other publications. A number of reference management tools are available, and in order for users to decide which tool best fits their needs, it is important to know each tool's strengths and weaknesses. This article compares four reference management tools, one of which is licensed by the University of Medicine and Dentistry of New Jersey libraries, while the other three are open source and freely available. They were chosen based on their functionality, ease of use, availability to library users, and popularity. These four tools are EndNote/EndNote Web, Zotero, Connotea, and Mendeley Desktop/Mendeley Web. Each tool is analyzed in terms of the following features: accessing, collecting, organizing, collaborating, and citing/formatting. A comparison table is included to summarize the key features of these tools. PMID:22289095

  19. The Synthesis Map Is a Multidimensional Educational Tool That Provides Insight into Students' Mental Models and Promotes Students' Synthetic Knowledge Generation

    ERIC Educational Resources Information Center

    Ortega, Ryan A.; Brame, Cynthia J.

    2015-01-01

    Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping,…

  20. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy and, depending on the tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
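As a rough, dependency-light illustration of one measurement LensTools automates, the sketch below bins the 2D Fourier power of a toy convergence map into an azimuthally averaged spectrum using plain numpy. The function name and binning choices are ours, not the LensTools API:

```python
import numpy as np

def azimuthal_power_spectrum(kappa, n_bins=8):
    """Bin the 2D Fourier power of a convergence map into an
    azimuthally averaged 1D spectrum (a simplified version of the
    kind of measurement LensTools provides)."""
    n = kappa.shape[0]
    ft = np.fft.fft2(kappa)
    power2d = np.abs(ft) ** 2 / n ** 2
    # radial wavenumber of every Fourier mode
    freqs = np.fft.fftfreq(n) * n
    kx, ky = np.meshgrid(freqs, freqs)
    k = np.hypot(kx, ky)
    edges = np.linspace(0.0, k.max(), n_bins + 1)
    spectrum = np.empty(n_bins)
    for i in range(n_bins):
        mask = (k >= edges[i]) & (k < edges[i + 1])
        spectrum[i] = power2d[mask].mean() if mask.any() else 0.0
    return edges, spectrum

rng = np.random.default_rng(0)
kappa = rng.normal(size=(64, 64))   # toy "convergence map": white noise
edges, spec = azimuthal_power_spectrum(kappa)
```

A real analysis would use the package's own convergence-map classes, which also handle map units, masks, and noise generation.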

  1. 30 CFR 1227.107 - When will the ONRR Director decide whether to approve a State's delegation proposal?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... MINING RECLAMATION AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR Natural Resources Revenue DELEGATION TO STATES Delegation Process § 1227.107 When will the ONRR Director decide whether to approve a...

  2. 30 CFR 250.1470 - How does BSEE decide what the amount of the penalty should be?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Outer Continental Shelf Civil Penalties General Provisions § 250.1470 How does BSEE decide...

  3. 30 CFR 250.1470 - How does BSEE decide what the amount of the penalty should be?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Outer Continental Shelf Civil Penalties General Provisions § 250.1470 How does BSEE decide...

  4. 30 CFR 250.1470 - How does BSEE decide what the amount of the penalty should be?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Outer Continental Shelf Civil Penalties General Provisions § 250.1470 How does BSEE decide...

  5. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated...

  6. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated...

  7. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated...

  8. Involving Communities in Deciding What Benefits They Receive in Multinational Research.

    PubMed

    Wendler, David; Shah, Seema

    2015-10-01

    There is wide agreement that communities in lower-income countries should benefit when they participate in multinational research. Debate now focuses on how and to what extent these communities should benefit. This debate has identified compelling reasons to reject the claim that whatever benefits a community agrees to accept are necessarily fair. Yet, those who conduct clinical research may conclude from this rejection that there is no reason to involve communities in the process of deciding how they benefit. Against this possibility, the present manuscript argues that involving host communities in this process helps to promote four important goals: (1) protecting host communities, (2) respecting host communities, (3) promoting transparency, and (4) enhancing social value. PMID:26224724

  9. PSI decides to write off most of its $2.7B Marble Hill investment

    SciTech Connect

    Not Available

    1986-03-01

After the Indiana Supreme Court ruled last November that the utility may not recover its investment in the cancelled plant, Public Service Indiana (PSI) decided to write off a substantial portion of the $2.7 billion already invested in the cancelled Marble Hill nuclear plant. The board will omit common stock dividends for three years and the preferred stock dividend for the first quarter. It will also accept a negotiated rate settlement with an 8.2% increase. A 5% emergency surcharge will become permanent. The settlement calls for the utility to restrict capital expenditures over the next three years to the $285.1 million already budgeted for construction. A consumer group argues in opposition that ratepayers should not be the risk bearers for PSI, but the utility argues that its long-term financial health depends on attracting and keeping investors.

  10. How to decide whether to move species threatened by climate change.

    PubMed

    Rout, Tracy M; McDonald-Madden, Eve; Martin, Tara G; Mitchell, Nicola J; Possingham, Hugh P; Armstrong, Doug P

    2013-01-01

Introducing species to areas outside their historical range to secure their future under climate change is a controversial strategy for preventing extinction. While the debate over the wisdom of this strategy continues, such introductions are already taking place. Previous frameworks for analysing the decision to introduce have lacked a quantifiable management objective and mathematically rigorous problem formulation. Here we develop the first rigorous quantitative framework for deciding whether or not a particular introduction should go ahead, which species to prioritize for introduction, and where and how to introduce them. It can also be used to compare introduction with alternative management actions, and to prioritize questions for future research. We apply the framework to a case study of tuatara (Sphenodon punctatus) in New Zealand. While simple and accessible, this framework can accommodate uncertainty in predictions and values. It provides essential support for the existing IUCN guidelines by presenting a quantitative process for better decision-making about conservation introductions. PMID:24146778

  11. How do children weigh competence and benevolence when deciding whom to trust?

    PubMed

    Johnston, Angie M; Mills, Candice M; Landrum, Asheley R

    2015-11-01

In three experiments, we investigate how 187 3- to 5-year-olds weigh competence and benevolence when deciding whom to trust. Children were presented with two informants who provided conflicting labels for novel objects: one informant was competent but mean, the other incompetent but nice. Across experiments, we manipulated the order in which competence and benevolence were presented and the way in which they were described (via trait labels or descriptions of prior behavior). When competence was described via prior behavior (Experiments 1-2), children endorsed the informants' labels equally. In contrast, when competence was described via trait labels (Experiment 3), children endorsed labels provided by the competent, mean informant. When considering children's endorsement at the individual level, we found that their ability to evaluate competence, not benevolence, related to their endorsements. These findings emphasize the importance of considering how children process information about informants and use this information to determine whom to trust. PMID:26254218

  12. Decidability for a temporal logic used in discrete-event system analysis

    NASA Technical Reports Server (NTRS)

    Knight, J. F.; Passino, K. M.

    1990-01-01

    The type of plant considered is one that can be modeled by a nondeterministic finite-state machine P. The regulator is a deterministic finite state machine R. The closed-loop system is formed by connecting P and R in a regulator configuration. Formulas in a propositional temporal language are used to describe the behavior of the closed-loop system. It is shown that there is a mechanical procedure which, for a given P and R, and a temporal formula Psi, will determine in a finite number of steps whether or not Psi must be true. This 'decidability' result could be proven using other known results on temporal logic. The proof given here shows that the behavior of the closed-loop system may safely be assumed to be ultimately periodic. The results are illustrated on two discrete-event system examples.
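The key fact used in the proof, that the trajectory of a deterministic finite-state closed-loop system must eventually revisit a state and is therefore ultimately periodic, is easy to demonstrate by the pigeonhole principle. The helper below is an illustrative sketch of that idea, not the paper's decision procedure:

```python
def ultimately_periodic(step, state0, max_states):
    """Iterate a deterministic finite-state transition function and
    return (prefix_length, period). Any trajectory of a finite-state
    system is ultimately periodic by the pigeonhole principle, which
    is what makes mechanical checking of temporal formulas possible."""
    seen = {}                 # state -> first time it was visited
    state, t = state0, 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
        assert t <= max_states + 1, "more states than declared"
    first = seen[state]
    return first, t - first   # transient prefix length, cycle period

# Toy closed-loop system on 8 states: a transient 0..3, then the
# trajectory loops forever through states 4, 5, 6, 7.
step = lambda s: s + 1 if s < 4 else 4 + (s + 1) % 4
prefix, period = ultimately_periodic(step, 0, 8)
```

Once the prefix and period are known, a temporal formula only needs to be evaluated over finitely many time steps, which is the decidability argument in miniature.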

  13. Neurocomputational account of how the human brain decides when to have a break.

    PubMed

    Meyniel, Florent; Sergent, Claire; Rigoux, Lionel; Daunizeau, Jean; Pessiglione, Mathias

    2013-02-12

    No pain, no gain: cost-benefit trade-off has been formalized in classical decision theory to account for how we choose whether to engage effort. However, how the brain decides when to have breaks in the course of effort production remains poorly understood. We propose that decisions to cease and resume work are triggered by a cost evidence accumulation signal reaching upper and lower bounds, respectively. We developed a task in which participants are free to exert a physical effort knowing that their payoff would be proportional to their effort duration. Functional MRI and magnetoencephalography recordings conjointly revealed that the theoretical cost evidence accumulation signal was expressed in proprioceptive regions (bilateral posterior insula). Furthermore, the slopes and bounds of the accumulation process were adapted to the difficulty of the task and the money at stake. Cost evidence accumulation might therefore provide a dynamical mechanistic account of how the human brain maximizes benefits while preventing exhaustion. PMID:23341598
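The proposed mechanism, a cost signal that rises during effort until an upper bound triggers a break and falls during rest until a lower bound resumes work, can be sketched as a two-bound accumulator. All names and parameter values below are illustrative, not the quantities fitted in the study:

```python
import random

def work_rest_schedule(upper=8.0, lower=2.0, accumulate=0.4,
                       dissipate=0.6, noise=0.2, steps=200, seed=7):
    """Simulate a noisy cost-evidence signal: effort stops when the
    upper bound is reached and resumes at the lower bound."""
    rng = random.Random(seed)
    cost, working = 0.0, True
    switches = []                       # sequence of 'rest'/'work' events
    for _ in range(steps):
        drift = accumulate if working else -dissipate
        cost = max(cost + drift + rng.gauss(0.0, noise), 0.0)
        if working and cost >= upper:
            working = False
            switches.append("rest")     # upper bound hit: take a break
        elif not working and cost <= lower:
            working = True
            switches.append("work")     # lower bound hit: resume effort
    return switches

switches = work_rest_schedule()
```

The two bounds act as a hysteresis loop, so the model alternates strictly between work and rest episodes rather than chattering at a single threshold.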

  14. Decide now, pay later: Early influences in math and science education

    SciTech Connect

    Malcom, S.

    1995-12-31

    Who are the people deciding to major in science, math or engineering in college? The early interest in science and math education which can lead to science and engineering careers, is shaped as much by the encompassing world of the child as it is by formal education experiences. This paper documents what we know and what we need to know about the influences on children from pre-kindergarten through sixth grade, including the home, pre-school groups, science and math programs in churches, community groups, the media, cultural institutions (museums, zoos, botanical gardens), libraries, and schools (curriculum, instruction, policies and assessment). It also covers the nature and quality of curricular and intervention programs, and identifies strategies that appear to be most effective for various groups.

  15. Regulating (or not) reproductive medicine: an alternative to letting the market decide.

    PubMed

    Dickenson, Donna L

    2011-01-01

Whilst India has been debating how to regulate 'surrogacy', the UK has undergone a major consultation on increasing the amount of 'expenses' paid to egg 'donors', while France has recently finished debating its entire package of bioethics regulation and the role of its Biomedicine Agency. Although it is often claimed that there is no alternative to the neo-liberal, market-based approach to regulating (or not) reproductive medicine (the ideology prevalent in both India and the UK), advocates of that position ignore the alternative model offered by France's tighter regulation, as well as its overarching concern with protecting the vulnerable and ensuring social justice. Whilst the concepts underpinning the French model of regulation also have their provenance in Western political philosophy and not in the developing world, they embody a very different attitude and suggest that there is indeed an alternative to letting the market decide. However, even in France that alternative is highly contested. PMID:22106647

  16. [DECIDE: developing and evaluating communication strategies to support informed decisions and practice based on evidence].

    PubMed

    Parmelli, Elena; Amato, Laura; Saitto, Carlo; Davoli, Marina

    2013-10-01

Healthcare systems are offered a wide range of technologies and services, but they have to cope with decreasing resources and uncertainty about what is effective and most appropriate. Making decisions about health care interventions is complex. Decisions should be informed by the best available evidence, be comprehensive enough to take into account all the relevant aspects (e.g. efficacy, safety, equity, costs), and be taken within a limited time period. DECIDE is a project funded by the European Community that, using the GRADE methodology, aims to implement strategies to enhance the dissemination and communication of scientific evidence, supporting timely evidence-based decision making in clinical practice and healthcare policies. Communication strategies are developed to address different target audiences and meet their information needs. One key target audience is policy makers and managers who are responsible for coverage decisions. PMID:24326703

  17. An absurd inconsistency in law: Nicklinson's case and deciding to die.

    PubMed

    Douglas, Michael

    2014-03-01

    R (Nicklinson) v Ministry of Justice [2012] EWHC 2381 was a tragic case that considered a perennial question: whether voluntary active euthanasia is murder. The traditional position was affirmed, that is, it is indeed murder. The law's treatment of decisions to refuse treatment resulting in death is a stark contrast to the position in respect of voluntary, active euthanasia. In cases of refusing treatment, principles of individual autonomy are paramount. This article presents an overview of the legal distinction between refusing medical treatment and voluntary, active euthanasia. It questions the purported differences between what are described as acts of "active" or "passive" euthanasia. It also highlights the inconsistency of the law's treatment of different ways that people decide to die. PMID:24804532

  18. Role of serum interleukin-6 in deciding therapy for multidrug resistant oral lichen planus

    PubMed Central

    Marwah, Akanksha; Kaushik, Smita; Garg, Vijay K.; Gupta, Sunita

    2015-01-01

Background Oral lichen planus (OLP) is a T cell mediated immune response. T cells locally present in the involved tissues release cytokines like interleukin-6 (IL-6), which contributes to the pathogenesis of OLP. IL-6 has also been associated with multidrug resistance protein (MRP) expression by keratinocytes. Correspondingly, upregulation of MRP was found in OLP. We conducted this study to evaluate the effects of various drugs on serum IL-6 in OLP, and the correlation of these effects with the nature of the clinical response and the resistance pattern seen in OLP lesions under various therapeutic modalities. We thus evaluated the role of serum IL-6 in deciding therapy for multidrug resistant OLP. Material and Methods Serum IL-6 was evaluated in 42 erosive OLP (EOLP) patients, 10 normal mucosa cases, and 10 oral squamous cell carcinoma cases using the ELISA technique. OLP patients were randomly divided into 3 groups of 14 patients each, which were subjected to Pimecrolimus local application, oral Mycophenolate Mofetil (MMF), or Methotrexate (MTX) along with Pimecrolimus local application, respectively. IL-6 levels were evaluated before and after treatment. Results Serum IL-6 levels were raised above 3 pg/ml in 26.19% of EOLP cases (mean 3.72±8.14). EOLP cases (5%) with IL-6 levels above 5 pg/ml were resistant in the MTX group. However, a significant decrease in serum IL-6, corresponding with clinical resolution, was seen in the MMF group. Conclusions Significantly raised IL-6 levels in EOLP reflect the chronic inflammatory nature of the disease. As serum IL-6 levels significantly decreased in the MMF group, correspondingly no resistance to treatment was noted. However, with MTX there was no significant decrease in IL-6, and resistance to treatment was noted in some cases, especially plaque-type lesions. Thus IL-6 can be a possible biomarker in deciding the best possible therapy for treatment resistant OLP. Key words:Lichen planus, biological markers, cytokines, enzyme-linked immunosorbent assay, immunosuppressive

  19. Deciding Optimal Noise Monitoring Sites with Matrix Gray Absolute Relation Degree Theory

    NASA Astrophysics Data System (ADS)

    Gao, Zhihua; Li, Yadan; Zhao, Limin; Wang, Shuangwei

    2015-08-01

Noise maps are applied to assess noise levels in cities all around the world. There are mainly two ways of producing noise maps: one is theoretical simulation from the surrounding conditions, such as traffic flow and building distribution; the other is calculating noise levels from actual measurement data collected by noise monitors. The current literature mainly focuses on considering more factors that affect sound propagation during theoretical simulation, and on interpolation methods for producing noise maps based on noise measurements. However many factors are considered during simulation, noise maps still have to be calibrated against actual noise measurements. Therefore, the way noise data are obtained matters both for producing and for calibrating a noise map. Yet little literature addresses how to decide the right monitoring sites when a specified number of noise sensors is to be placed, or the resulting deviation of a noise map produced with data from them. In this work, utilizing matrix Gray Absolute Relation Degree Theory, we calculated the relation degrees between the most precise noise surface and surfaces interpolated from different combinations of a specified number of noise measurements. We found that surfaces plotted from different combinations of noise data yielded different relation degrees with the most precise one. We then identified the least significant measurement among the total and calculated the corresponding deviation when it was excluded from the noise surface. Processing the remaining noise data in the same way, we identified the least significant measurements one by one. With this method, we optimized the noise sensor distribution in an area of about 2 km². We also calculated the bias of surfaces with the least significant data removed. Our practice provides a practical solution to the situation faced by most governments, in which only a limited financial budget is available for noise monitoring, especially in
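A scalar version of the relation measure used above can be sketched as follows. This is the standard absolute degree of grey incidence for two equal-length series; the paper applies a matrix generalization, so this is a simplification of ours, not the authors' exact computation:

```python
def grey_absolute_relation(x0, x1):
    """Absolute degree of grey incidence between two equal-length
    series: values close to 1 mean the series have similar shapes."""
    assert len(x0) == len(x1) and len(x0) >= 2

    def s(seq):
        # zero-starting-point image, then the signed area term
        z = [v - seq[0] for v in seq]
        return sum(z[1:-1]) + 0.5 * z[-1]

    s0, s1 = s(x0), s(x1)
    s_diff = s([a - b for a, b in zip(x1, x0)])
    return (1 + abs(s0) + abs(s1)) / (1 + abs(s0) + abs(s1) + abs(s_diff))

# Identical series have relation degree 1; the degree drops as the
# shapes of the two series diverge.
base = [2.0, 3.5, 4.1, 5.0, 6.2]
same = grey_absolute_relation(base, base)
far = grey_absolute_relation(base, [2.0, 9.0, 1.0, 12.0, 0.5])
```

In a site-selection loop like the one described in the abstract, a surface interpolated without a given sensor would be compared to the full-data surface with such a measure, and the sensor whose removal changes the relation degree least would be dropped first.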

  20. Are Women Deciding against Home Births in Low and Middle Income Countries?

    PubMed Central

    Johnson, Fiifi Amoako; Padmadas, Sabu S.; Matthews, Zoë

    2013-01-01

    Background Although there is evidence tracking progress towards facility births within the UN Millennium Development Goals framework, we do not know whether women are deciding against home birth over their reproductive lives. Using Demographic and Health Surveys (DHS) data from 44 countries, this study aims to investigate the patterns and shifts in childbirth locations and to determine whether these shifts favour home or health settings. Methods and Findings The analyses considered 108,777 women who had at least two births in the five years preceding the most recent DHS over the period 2000–2010. The vast majority of women opted for the same place of childbirth for their successive births. However, about 14% did switch their place, and not all of these decisions favoured a health facility over the home setting. In 24 of the 44 countries analysed, a higher proportion of women switched from a health facility to home. Multilevel regression analyses show significantly higher odds of switching from home to a facility for high-parity women, those with frequent antenatal visits, and those with more wealth. However, in countries with high infant mortality rates, low-parity women had an increased probability of switching from home to a health facility. Conclusions There is clear evidence that women do change their childbirth locations over successive births in low and middle income countries. After two decades of efforts to improve maternal health, it might be expected that a higher proportion of women would be deciding against home births in favour of facility births. The results from this analysis show that this is not the case. PMID:23799022

  1. DG-AMMOS: A New tool to generate 3D conformation of small molecules using Distance Geometry and Automated Molecular Mechanics Optimization for in silico Screening

    PubMed Central

    2009-01-01

    Background Discovery of new bioactive molecules that could enter drug discovery programs or serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening, in order to increase the effectiveness of the process and to facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require a suitable chemical compound collection as input, and most often the 3D structures of the small molecules have to be generated, since compounds are usually delivered in 1D SMILES, CANSMILES or 2D SDF formats. Results Here, we describe the new open source program DG-AMMOS, which allows the generation of 3D conformations of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega, both of which generate 3D structures of small molecules, is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generation engine. Conclusion DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that the generated structures are generally of equal quality to, and sometimes better than, structures obtained by the other tested methods. PMID:19912625
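
    DG-AMMOS itself couples distance geometry with automated molecular-mechanics minimization; as a loose illustration of the distance-geometry step only, the sketch below recovers 3D coordinates from a target inter-atomic distance matrix by gradient-descent stress minimization. This is not the program's actual algorithm, and the distance matrix is a toy example (a regular tetrahedron, all pairwise distances 1).

```python
# Distance-geometry toy: find 3D coordinates whose pairwise distances match a
# target matrix D by minimizing the stress sum_{i<j} (|xi - xj| - D[i][j])^2.
import math, random

def embed(D, dim=3, steps=4000, lr=0.02, seed=1):
    """Gradient-descent stress minimization from a random starting geometry."""
    n = len(D)
    rng = random.Random(seed)
    X = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]
    for _ in range(steps):
        grad = [[0.0] * dim for _ in range(n)]
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = math.dist(X[i], X[j]) or 1e-9
                coeff = 2.0 * (d - D[i][j]) / d
                for k in range(dim):
                    grad[i][k] += coeff * (X[i][k] - X[j][k])
        for i in range(n):
            for k in range(dim):
                X[i][k] -= lr * grad[i][k]
    return X

# target: regular tetrahedron, all six pairwise distances equal to 1
D = [[0.0 if i == j else 1.0 for j in range(4)] for i in range(4)]
coords = embed(D)
worst = max(abs(math.dist(coords[i], coords[j]) - 1.0)
            for i in range(4) for j in range(i + 1, 4))
```

    A real conformer generator would follow such an embedding with a force-field minimization step, which is the role the AMMOS component plays in the program described above.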

  2. Next-generation genome sequencing and assembly provides tools for phylogenetics and identification of closely related species of Spathius, parasitoids of Agrilus planipennis (emerald ash borer)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A crucial step in biological control programs is identification of candidates for introduction. This is often difficult when cryptic species are involved. However, recent advances in next-generation sequencing allows whole genome sequencing in non-model species for the discovery and genotyping of ...

  3. Sowing the Seeds for a Bountiful Harvest: Shaping the Rules and Creating the Tools for Wisconsin's Next Generation of Wind Farms

    SciTech Connect

    Vickerman, Michael Jay

    2012-03-29

    Project objectives are twofold: (1) to engage wind industry stakeholders in formulating uniform permitting standards applicable to commercial wind energy installations; and (2) to create and maintain an online Wisconsin Wind Information Center to enable policymakers and the public to increase their knowledge of, and support for, wind generation in Wisconsin.

  4. The Project Manager's Tool Kit

    NASA Technical Reports Server (NTRS)

    Cameron, W. Scott

    2003-01-01

    Project managers are rarely described as being funny. Moreover, a good sense of humor rarely seems to be one of the deciding factors in choosing someone to be a project manager, or something that pops up as a major discussion point at an annual performance review. Perhaps this is because people think you aren't serious about your work if you laugh. I disagree with this assessment, but that's not really my point. As I talk to people either pursuing a career in project management, or broadening their assignment to include project management, I encourage them to consider what tools they need to be successful. I suggest that they consider any strength they have to be part of their Project Management (PM) Tool Kit, and being funny could be one of the tools they need.

  5. Machine tool locator

    DOEpatents

    Hanlon, John A.; Gill, Timothy J.

    2001-01-01

    Machine tools can be accurately measured and positioned on manufacturing machines within very small tolerances by using an autocollimator on a three-axis mount, positioned to focus on a reference tooling ball or a machine tool; a digital camera connected to the viewing end of the autocollimator; and a marker-and-measure generator that receives digital images from the camera, displays or measures distances between the projection reticle and the reference reticle on the monitoring screen, and relates those distances to the actual position of the autocollimator relative to the reference tooling ball. The images and measurements are used to set the position of the machine tool, to measure the size and shape of the machine tool tip, and to examine cutting-edge wear.
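
    As a hypothetical numeric companion to the autocollimator principle above: an autocollimator converts a tilt of the target surface into a lateral displacement of the reflected reticle image, with tilt θ ≈ d / (2f) for image displacement d and focal length f. The pixel size, focal length, and offset below are invented for illustration and are not from the patent.

```python
# Convert a reticle-image offset measured on the camera into a surface tilt.
# Assumed geometry: reflected-beam displacement d = 2 * theta * f, so
# theta = d / (2 * f). All numbers here are illustrative assumptions.
import math

def tilt_from_offset(pixel_offset, pixel_size_mm, focal_length_mm):
    """Surface tilt (arc-seconds) from reticle image offset in pixels."""
    displacement_mm = pixel_offset * pixel_size_mm
    theta_rad = displacement_mm / (2.0 * focal_length_mm)
    return math.degrees(theta_rad) * 3600.0

# e.g. a 12-pixel offset with 5 um pixels and a 300 mm focal length
tilt_arcsec = tilt_from_offset(12, 0.005, 300.0)
```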

  6. Shifting tools

    SciTech Connect

    Fisher, E.P.; Welch, W.R.

    1984-03-13

    An improved shifting tool connectable in a well tool string and useful to engage and position a slidable sleeve in a sliding sleeve device in a well flow conductor. The selectively profiled shifting tool keys provide better fit with and more contact area between keys and slidable sleeves. When the engaged slidable sleeve cannot be moved up and the shifting tool is not automatically disengaged, emergency disengagement means may be utilized by applying upward force to the shifting tool sufficient to shear pins and cause all keys to be cammed inwardly at both ends to completely disengage for removal of the shifting tool from the sliding sleeve device.

  7. The Xygra gun simulation tool.

    SciTech Connect

    Garasi, Christopher Joseph; Lamppa, Derek C.; Aubuchon, Matthew S.; Shirley, David Noyes; Robinson, Allen Conrad; Russo, Thomas V.

    2008-12-01

    Inductive electromagnetic launchers, or coilguns, use discrete solenoidal coils to accelerate a coaxial conductive armature. To date, Sandia has been using an internally developed code, SLINGSHOT, as a point-mass lumped circuit element simulation tool for modeling coilgun behavior for design and verification purposes. This code has shortcomings in terms of accurately modeling gun performance under stressful electromagnetic propulsion environments. To correct for these limitations, it was decided to attempt to closely couple two Sandia simulation codes, Xyce and ALEGRA, to develop a more rigorous simulation capability for demanding launch applications. This report summarizes the modifications made to each respective code and the path forward to completing interfacing between them.
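
    A point-mass lumped-element model of the SLINGSHOT kind mentioned above can be caricatured in a few lines: the armature is accelerated by a force F = ½ I² dL/dx arising from the gradient of the coil-armature inductance. The current pulse, inductance profile, and all constants below are invented, so this is a sketch of the modeling style only, not of SLINGSHOT or of the Xyce/ALEGRA coupling.

```python
# Toy point-mass coilgun step: Euler-integrate the armature's motion under a
# decaying current pulse and an assumed Gaussian coil-inductance profile.
import math

def simulate(mass=0.05, dt=1e-5, steps=2000):
    """Return (position, velocity) of the armature after one coil pulse."""
    x, v = -0.05, 0.0                                   # start 5 cm before coil center (m, m/s)
    for n in range(steps):
        t = n * dt
        current = 5000.0 * math.exp(-t / 2e-3)          # decaying drive pulse (A)
        # assumed inductance L(x) = L0 * exp(-x^2 / w) with L0 = 1e-6 H, w = 1e-3 m^2,
        # so dL/dx = -2 * (x / w) * L0 * exp(-x^2 / w)
        dL_dx = -2.0 * (x / 1e-3) * 1e-6 * math.exp(-x * x / 1e-3)
        force = 0.5 * current ** 2 * dL_dx              # pulls armature toward coil center
        v += (force / mass) * dt
        x += v * dt
    return x, v

final_x, final_v = simulate()
```

    Even this caricature shows the key design hazard such codes must capture: if the current persists after the armature passes the coil center, the sign of dL/dx flips and the same force term decelerates ("sucks back") the armature.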

  8. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries

    PubMed Central

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Background and objective Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Methods Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. Results The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. Conclusions The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents. PMID:25332357
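
    The F-measures reported above combine precision and recall over extracted terms. A minimal sketch of the scoring, with invented term sets (only the metric definitions are standard; everything else is illustrative):

```python
# Precision, recall, and F-measure of extracted terms against a gold standard.
def f_measure(extracted, gold):
    tp = len(extracted & gold)                      # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0, 0.0, 0.0
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

gold = {"hypertension", "aspirin", "colonoscopy", "diabetes"}
extracted = {"hypertension", "aspirin", "colonoscopy", "fever"}
p, r, f1 = f_measure(extracted, gold)   # p = 0.75, r = 0.75, f1 = 0.75
```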

  9. Mutual information analysis as a tool to assess the role of aneuploidy in the generation of cancer-associated differential gene expression patterns.

    PubMed

    Klus, G T; Song, A; Schick, A; Wahde, M; Szallasi, Z

    2001-01-01

    Most human tumors are characterized by: (1) an aberrant set of chromosomes, a state termed aneuploidy; (2) an aberrant gene expression pattern; and (3) an aberrant phenotype of uncontrolled growth. One of the goals of cancer research is to establish causative relationships between these three important characteristics. In this paper we were searching for evidence that aneuploidy is a major cause of differential gene expression. We describe how mutual information analysis of cancer-associated gene expression patterns could be exploited to answer this question. In addition to providing general guidelines, we have applied the proposed analysis to a recently published breast cancer-associated gene expression matrix. The results derived from this particular data set provided preliminary evidence that mutual information analysis may become a useful tool to investigate the link between differential gene expression and aneuploidy. PMID:11262960
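
    The core quantity in such an analysis is the mutual information between discretized expression profiles. A minimal sketch with invented expression vectors (only the MI definition is standard; the binarization scheme and data are assumptions):

```python
# Mutual information (in bits) between two genes' binarized expression
# profiles across samples, from empirical marginal and joint frequencies.
import math
from collections import Counter

def mutual_information(a, b):
    """MI in bits between two equal-length discrete sequences."""
    n = len(a)
    pa, pb = Counter(a), Counter(b)
    pab = Counter(zip(a, b))
    mi = 0.0
    for (x, y), c in pab.items():
        pxy = c / n
        mi += pxy * math.log2(pxy / ((pa[x] / n) * (pb[y] / n)))
    return mi

gene1 = [1, 0, 1, 1, 0, 0, 1, 0]
gene2 = [1, 0, 1, 1, 0, 0, 1, 0]      # identical profile: MI = H(gene1) = 1 bit
gene3 = [1, 1, 0, 0, 1, 1, 0, 0]      # partially related profile
mi_same = mutual_information(gene1, gene2)
mi_other = mutual_information(gene1, gene3)
```

    High pairwise MI across many gene pairs, aligned with chromosomal location, is the kind of signature the authors propose as evidence that aneuploidy drives the differential expression pattern.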

  10. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  11. Percussion tool

    SciTech Connect

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described and which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing and which is operable to repeatedly strike the tool bit; and a reciprocally moveable piston enclosed within the hammer and which imparts reciprocal movement to the reciprocally moveable hammer.

  12. Upper limb splints and the right to drive--who decides?

    PubMed

    Hobman, J W; Southern, S J

    2004-06-01

    Management of upper limb pathology frequently requires the wearing of a splint for a period of time. Our Occupational Therapy Department fits approximately 2000 thermoplastic splints per year, and a significant number of these patients drive. To elucidate who is thought to have, and who actually has, responsibility for deciding which splints are safe to drive in, we sent photographic questionnaires to patients, general practitioners (GPs), the police, and the Driver and Vehicle Licensing Agency (DVLA), and performed a telephone survey of insurance companies. It is the duty of the patient to contact the DVLA if they have any doubt about their ability to drive safely whilst wearing the splint. Our results demonstrate that only 10% of patients and 4% of GPs are aware of this. There was strong agreement between patients, GPs and the police about which splints would probably be safe to drive in, but patients need to be reviewed on an individual basis. Our study demonstrates a lack of knowledge among patients and GPs which could expose either group to adverse legal action in the event of an accident. PMID:15145740

  13. How to decide on stent insertion or surgery in colorectal obstruction?

    PubMed Central

    Zahid, Assad; Young, Christopher John

    2016-01-01

    Colorectal cancer is one of the most common cancers in western society, and malignant obstruction of the colon accounts for 8%-29% of all large bowel obstructions. Conventional treatment of patients with malignant obstruction requiring urgent surgery imposes a greater physiological insult on already nutritionally replete patients. Of late, the utility of colonic stents has offered an option in the management of these patients in both the palliative and bridge-to-surgery settings. This has been the subject of many reviews, which highlight its efficacy, particularly in reducing ostomy rates, allowing quicker return to oral diet, and minimising extended post-operative recovery, as well as some quality of life benefits. The uncertainty in managing patients with malignant colonic obstructions has led to a more cautious use of stenting technology as community equipoise exists. Decision-making analysis has demonstrated that surgeons favored the use of stents preferentially in the palliative setting when compared to the curative setting, where surgery was preferred. We aim to review the literature regarding the use of stent or surgery in colorectal obstruction, and then provide a discourse on how to synthesise the data and apply it when deciding the appropriate application of stent or surgery in colorectal obstruction. PMID:26843916

  14. How to decide on stent insertion or surgery in colorectal obstruction?

    PubMed

    Zahid, Assad; Young, Christopher John

    2016-01-27

    Colorectal cancer is one of the most common cancers in western society, and malignant obstruction of the colon accounts for 8%-29% of all large bowel obstructions. Conventional treatment of patients with malignant obstruction requiring urgent surgery imposes a greater physiological insult on already nutritionally replete patients. Of late, the utility of colonic stents has offered an option in the management of these patients in both the palliative and bridge-to-surgery settings. This has been the subject of many reviews, which highlight its efficacy, particularly in reducing ostomy rates, allowing quicker return to oral diet, and minimising extended post-operative recovery, as well as some quality of life benefits. The uncertainty in managing patients with malignant colonic obstructions has led to a more cautious use of stenting technology as community equipoise exists. Decision-making analysis has demonstrated that surgeons favored the use of stents preferentially in the palliative setting when compared to the curative setting, where surgery was preferred. We aim to review the literature regarding the use of stent or surgery in colorectal obstruction, and then provide a discourse on how to synthesise the data and apply it when deciding the appropriate application of stent or surgery in colorectal obstruction. PMID:26843916

  15. Mitochondrial Mg2+ homeostasis decides cellular energy metabolism and vulnerability to stress

    PubMed Central

    Yamanaka, Ryu; Tabata, Sho; Shindo, Yutaka; Hotta, Kohji; Suzuki, Koji; Soga, Tomoyoshi; Oka, Kotaro

    2016-01-01

    Cellular energy production comprises many Mg2+-dependent enzymatic reactions, and dysregulation of Mg2+ homeostasis is involved in various cellular malfunctions and diseases. Recently, mitochondria, the energy-producing organelles, have been recognized as major intracellular Mg2+ stores. Several biological stimuli alter mitochondrial Mg2+ concentration through intracellular redistribution. However, whether mitochondrial Mg2+ alterations affect cellular energy metabolism in living cells remains unclear. The Mg2+ transporter MRS2 in the mitochondrial inner membrane is an essential component of the mitochondrial Mg2+ uptake system. Here, we comprehensively analyzed intracellular Mg2+ levels and energy metabolism in Mrs2 knockdown (KD) cells using fluorescence imaging and metabolome analysis. Dysregulation of mitochondrial Mg2+ homeostasis disrupted ATP production via a shift in mitochondrial energy metabolism and morphology. Moreover, Mrs2 KD sensitized cells to cellular stress. These results indicate that regulation of mitochondrial Mg2+ via MRS2 critically determines cellular energy status and cell vulnerability in response to physiological stimuli. PMID:27458051

  16. Revisiting Decidability and Optimum Reachability for Multi-Priced Timed Automata

    NASA Astrophysics Data System (ADS)

    Fränzle, Martin; Swaminathan, Mani

    We investigate the optimum reachability problem for Multi-Priced Timed Automata (MPTA) that admit both positive and negative costs on edges and locations, thus bridging the gap between the results of Bouyer et al. (2007) and of Larsen and Rasmussen (2008). Our contributions are the following: (1) We show that even the location reachability problem is undecidable for MPTA equipped with both positive and negative costs, provided the costs are subject to a bounded budget, in the sense that paths of the underlying Multi-Priced Transition System (MPTS) that operationally exceed the budget are considered as not being viable. This undecidability result follows from an encoding of Stop-Watch Automata using such MPTA, and applies to MPTA with as few as two cost variables, and even when no costs are incurred upon taking edges. (2) We then restrict the MPTA such that each viable quasi-cyclic path of the underlying MPTS incurs a minimum absolute cost. Under such a condition, the location reachability problem is shown to be decidable and the optimum cost is shown to be computable for MPTA with positive and negative costs and a bounded budget. These results follow from a reduction of the optimum reachability problem to the solution of a linear constraint system representing the path conditions over a finite number of viable paths of bounded length.
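
    Contribution (2) reduces optimum reachability to constraints over finitely many viable bounded-length paths. As a loose illustration of that idea only (not the paper's construction, which works on multi-priced timed automata rather than finite graphs): enumerate paths up to a length bound in a finite transition system with positive and negative costs, discarding any path whose accumulated cost ever exceeds the budget. The graph and numbers are invented.

```python
# Bounded-budget, bounded-length optimum reachability in a finite weighted
# transition system with positive and negative edge costs.
def optimum_reach(edges, start, goal, budget, max_len):
    """Min accumulated cost over viable paths start -> goal of length <= max_len."""
    best = None
    stack = [(start, 0.0, 0)]          # (node, accumulated cost, path length)
    while stack:
        node, cost, length = stack.pop()
        if cost > budget:              # prefix operationally exceeds the budget
            continue
        if node == goal:
            best = cost if best is None else min(best, cost)
        if length < max_len:
            for u, v, w in edges:
                if u == node:
                    stack.append((v, cost + w, length + 1))
    return best

# costs may be negative (the b -> b self-loop "recharges"), so without the
# length bound the loop could be pumped indefinitely
edges = [("s", "a", 3.0), ("a", "g", 4.0), ("s", "b", 6.0),
         ("b", "b", -1.0), ("b", "g", 2.0)]
best = optimum_reach(edges, "s", "g", budget=7.0, max_len=6)   # -> 4.0
```

    Here the direct path s-a-g costs 7, but looping four times on b (cost -1 each) before exiting to g reaches the goal at cost 4 while every prefix stays within the budget of 7: exactly the interplay of negative costs, budgets, and bounded paths that drives the decidability results above.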

  17. Liberty to decide on dual use biomedical research: an acknowledged necessity.

    PubMed

    Keuleyan, Emma

    2010-03-01

    Humanity entered the twenty-first century with revolutionary achievements in biomedical research, and at the same time multiple "dual-use" results have been published. The battle against infectious diseases is meeting new challenges, with newly emerging and re-emerging infections. Both natural epidemic threats, such as SARS, avian influenza, haemorrhagic fevers, XDR and MDR tuberculosis and many others, and the possibility of intentional misuse, such as the letters containing anthrax spores in the USA in 2001, have raised awareness of the real threats. Many great men, including Goethe, Spinoza, J.B. Shaw, Fr. Engels, J.F. Kennedy and others, have recognized that liberty is also a responsibility. That is why the liberty to decide now represents an acknowledged necessity: biomedical research should be supported, conducted and published with appropriate measures to prevent potential "dual use". Biomedical scientists should work according to the ethical principles of their Code of Conduct, an analogue of the Hippocratic Oath of doctors, and they should inform government, society and their juniors about the problem. National science consulting boards of experts should be created to prepare guidelines and control the problem at the state level. An international board should develop minimum standards applicable in each country. Bio-preparedness is considered another key measure. PMID:18427955

  18. Act quickly, decide later: long-latency visual processing underlies perceptual decisions but not reflexive behavior.

    PubMed

    Jolij, Jacob; Scholte, H Steven; van Gaal, Simon; Hodgson, Timothy L; Lamme, Victor A F

    2011-12-01

    Humans largely guide their behavior by their visual representation of the world. Recent studies have shown that visual information can trigger behavior within 150 msec, suggesting that visually guided responses to external events, in fact, precede conscious awareness of those events. However, is such a view correct? By using a texture discrimination task, we show that the brain relies on long-latency visual processing in order to guide perceptual decisions. Decreasing stimulus saliency leads to selective changes in long-latency visually evoked potential components reflecting scene segmentation. These latency changes are accompanied by almost equal changes in simple RTs and points of subjective simultaneity. Furthermore, we find a strong correlation between individual RTs and the latencies of scene segmentation related components in the visually evoked potentials, showing that the processes underlying these late brain potentials are critical in triggering a response. However, using the same texture stimuli in an antisaccade task, we found that reflexive, but erroneous, prosaccades, but not antisaccades, can be triggered by earlier visual processes. In other words: The brain can act quickly, but decides late. Differences between our study and earlier findings suggesting that action precedes conscious awareness can be explained by assuming that task demands determine whether a fast and unconscious, or a slower and conscious, representation is used to initiate a visually guided response. PMID:21557644

  19. Mitochondrial Mg(2+) homeostasis decides cellular energy metabolism and vulnerability to stress.

    PubMed

    Yamanaka, Ryu; Tabata, Sho; Shindo, Yutaka; Hotta, Kohji; Suzuki, Koji; Soga, Tomoyoshi; Oka, Kotaro

    2016-01-01

    Cellular energy production comprises many Mg(2+)-dependent enzymatic reactions, and dysregulation of Mg(2+) homeostasis is involved in various cellular malfunctions and diseases. Recently, mitochondria, the energy-producing organelles, have been recognized as major intracellular Mg(2+) stores. Several biological stimuli alter mitochondrial Mg(2+) concentration through intracellular redistribution. However, whether mitochondrial Mg(2+) alterations affect cellular energy metabolism in living cells remains unclear. The Mg(2+) transporter MRS2 in the mitochondrial inner membrane is an essential component of the mitochondrial Mg(2+) uptake system. Here, we comprehensively analyzed intracellular Mg(2+) levels and energy metabolism in Mrs2 knockdown (KD) cells using fluorescence imaging and metabolome analysis. Dysregulation of mitochondrial Mg(2+) homeostasis disrupted ATP production via a shift in mitochondrial energy metabolism and morphology. Moreover, Mrs2 KD sensitized cells to cellular stress. These results indicate that regulation of mitochondrial Mg(2+) via MRS2 critically determines cellular energy status and cell vulnerability in response to physiological stimuli. PMID:27458051

  20. Deciding to institutionalize: why do family members cease caregiving at home?

    PubMed

    McLennon, Susan M; Habermann, Barbara; Davis, Linda Lindsey

    2010-04-01

    The primary purpose of this secondary analysis was to identify common themes from the statements of caregivers who ultimately decided to institutionalize their relative with Alzheimer or Parkinson disease. Content analysis of transcripts from caregivers (n=11) who institutionalized their relative during their participation in a caregiver intervention study was performed. Two categories identified from the caregivers' stories were anticipating the inevitable and reaching the limit. The results of the descriptive analysis indicated that 3 to 4 months before institutionalization, caregivers discussed knowing that they would not be able to continue caring for their relative. The most frequent reasons for institutionalization were serious health events. The incidental finding that there were more institutionalizations in the Alzheimer disease participant group than in the Parkinson disease group may indicate that caregiving is more difficult for caregivers in Alzheimer disease than in Parkinson disease. This analysis contributes new and important information about the time interval between caregivers' anticipation of the need for alternative care arrangements and the subsequent placement in formal care. Nurses and other healthcare providers should be alert to the fact that when caregivers express anticipation of the need for change in care arrangements, it may be a signal for immediate assessment and referral to appropriate resources for assistance. PMID:20422795