Science.gov

Sample records for generation tool decider

  1. E-DECIDER: Earthquake Disaster Decision Support and Response Tools - Development and Experiences

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Blom, R. G.; Bawden, G. W.; Fox, G.; Pierce, M.; Rundle, J. B.; Wang, J.; Ma, Y.; Yoder, M. R.; Sachs, M. K.; Parker, J. W.

    2011-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision-making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. The overall goal of the project is to deliver these capabilities as standards-compliant Geographical Information System (GIS) data products through a web portal/web services infrastructure that will allow easy use by decision-makers; this design ensures that the system will be readily supportable and extensible in the future. E-DECIDER is incorporating the earthquake forecasting methodology developed through NASA's QuakeSim project, as well as other QuakeSim geophysical modeling tools. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, will allow us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). We are also working to provide a catalog of HAZUS input files and models for scenario earthquakes based on the QuakeSim forecast models, as well as designing an automated workflow for generating HAZUS models in the event of an earthquake (triggered from the USGS earthquake feed). Initially, E-DECIDER's focus was to deliver rapid and readily accessible InSAR products following earthquake disasters. Following our experiences with recent past events, such as the Baja Mexico earthquake and the Tohoku-oki Japan earthquake, we found that in many instances, radar data is not readily available following the event, whereas optical imagery can be provided fairly quickly as a result of the invocation of the International Charter. This led us to re-evaluate the type of data we would need to process and the products we could deliver

  2. Disaster Response Tools for Decision Support and Data Discovery - E-DECIDER and GeoGateway

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Granat, R. A.; Lyzenga, G. A.; Pierce, M. E.; Wang, J.; Grant Ludwig, L.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2015-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers in order to improve their ability to anticipate requirements and provide appropriate resources for response. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a decision support system producing remote sensing and geophysical modeling products that are relevant to the emergency preparedness and response communities and serves as a gateway to enable the delivery of actionable information to these communities. GeoGateway is a data product search and analysis gateway for scientific discovery, field use, and disaster response focused on NASA UAVSAR and GPS data that integrates with fault data, seismicity and models. Key information on the nature, magnitude and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness are often generated from a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises like the recent May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation. We also provided a number of products, services, and consultation to the NASA agency-wide response to the April 2015 Gorkha, Nepal earthquake. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse and for the Nepal earthquake. Products delivered included map layers as part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration, enabling users to create merged datasets from multiple providers. For the Nepal response effort, products included models

  3. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for Dallas-Fort Worth, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and to climb to cruise altitude along the most efficient routes.

  4. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which fall under 21st-century skills. The second-generation tools are growing in popularity…

  5. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen

    2015-08-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). This in turn is delivered through standards-compliant web services for desktop and hand-held devices.

  6. Sense, decide, act, communicate (SDAC): next generation of smart sensor systems

    NASA Astrophysics Data System (ADS)

    Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian

    2004-09-01

    The recent war on terrorism and increased urban warfare has been a major catalyst for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has been generally governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality related to sensor systems. This functionality includes a sensor's ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communicate - internal to the system and external to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the usage of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collectives) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper will also discuss and evaluate the demonstration system developed to test the concepts related to SDAC systems.

  7. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Stough, T. M.; Burl, M. C.; Pierce, M.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.; Bawden, G. W.

    2012-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision-making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. Geodetic imaging data, including from interferometric synthetic aperture radar (InSAR) and GPS, have a rich scientific heritage for use in earthquake research. Survey-grade GPS was developed in the 1980s and the first InSAR image of an earthquake was produced for the 1992 Landers event. As more of these types of data have become increasingly available they have also shown great utility for providing key information for disaster response. Work has been done to translate these data into useful and actionable information for decision makers in the event of an earthquake disaster. In addition to observed data, modeling tools provide essential preliminary estimates while data are still being collected and/or processed, which can be refined as data products become available. Now, with more data and better models, we are able to apply these to support responders who need easy-to-use tools and routinely produced data products. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER has taken advantage of the legacy of Earth science data, including MODIS, Landsat, SCIGN, PBO, UAVSAR, and modeling tools such as the ones developed by QuakeSim, in order to deliver successful decision support products for earthquake disaster response. The project has

  8. Health. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Health, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday Living." It…

  9. Systems Prototyping with Fourth Generation Tools.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis

    1983-01-01

    The development of information systems using an engineering approach that uses both traditional programming techniques and fourth generation software tools is described. Fourth generation applications tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)

  10. SUPPORT Tools for evidence-informed health Policymaking (STP) 8: Deciding how much confidence to place in a systematic review

    PubMed Central

    2009-01-01

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. The reliability of systematic reviews of the effects of health interventions is variable. Consequently, policymakers and others need to assess how much confidence can be placed in such evidence. The use of systematic and transparent processes to determine such decisions can help to prevent the introduction of errors and bias in these judgements. In this article, we suggest five questions that can be considered when deciding how much confidence to place in the findings of a systematic review of the effects of an intervention. These are: 1. Did the review explicitly address an appropriate policy or management question? 2. Were appropriate criteria used when considering studies for the review? 3. Was the search for relevant studies detailed and reasonably comprehensive? 4. Were assessments of the studies' relevance to the review topic and of their risk of bias reproducible? 5. Were the results similar from study to study? PMID:20018115

  11. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. The memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time, there is always an active object, which is drawn in magenta, or in its highlighted color as defined by the resource file, which will be discussed later.

  12. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime networks.
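
    As a rough illustration of the kind of Bayesian update that underlies such decision generation tools (a minimal Python sketch, not the paper's DDG software; the hypothesis, likelihoods and numbers below are invented):

      # Minimal sketch: Bayesian update of the probability that a suspected link
      # in an adverse network is real, given independent noisy reports.
      # All probabilities here are illustrative, not from the paper.
      def posterior(prior, p_report_if_link, p_report_if_no_link):
          """P(link | report) via Bayes' rule."""
          num = p_report_if_link * prior
          den = num + p_report_if_no_link * (1.0 - prior)
          return num / den

      p = 0.10                       # prior belief that the link exists
      p = posterior(p, 0.7, 0.2)     # first corroborating report
      p = posterior(p, 0.7, 0.2)     # second independent report
      print(round(p, 3))             # posterior after two reports (about 0.58)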

  13. Automatic tool path generation for finish machining

    SciTech Connect

    Kwok, Kwan S.; Loucks, C.S.; Driessen, B.J.

    1997-03-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch diameter CBN grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm.
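
    The paper's fuzzified digital filter is not reproduced here; as a hedged illustration of the same idea, the Python sketch below drops scan points whose height deviates too far from a local median, a simple way to reject spikes caused by burrs, holes or overspray before tool path generation (window size and threshold are invented):

      import numpy as np

      def reject_spikes(z, window=5, max_dev=0.1):
          """Return a boolean mask of scan points to keep (spikes masked out)."""
          z = np.asarray(z, dtype=float)
          half = window // 2
          keep = np.ones(z.size, dtype=bool)
          for i in range(z.size):
              lo, hi = max(0, i - half), min(z.size, i + half + 1)
              if abs(z[i] - np.median(z[lo:hi])) > max_dev:
                  keep[i] = False    # far from the local median: likely a false reading
          return keep

      z_scan = [0.00, 0.01, 0.02, 0.90, 0.03, 0.02, 0.01]   # 0.90 is a spurious spike
      print(reject_spikes(z_scan))   # -> [ True  True  True False  True  True  True]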

  14. Next generation tools for genomic data generation, distribution, and visualization

    PubMed Central

    2010-01-01

    Background With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq. PMID:20828407

  15. Groundwater Monitoring Report Generation Tools - 12005

    SciTech Connect

    Lopez, Natalie

    2012-07-01

    Compliance with National and State environmental regulations (e.g. Resource Conservation and Recovery Act (RCRA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) aka SuperFund) requires Savannah River Site (SRS) to extensively collect and report groundwater monitoring data, with potential fines for missed reporting deadlines. Several utilities have been developed at SRS to facilitate production of the regulatory reports which include maps, data tables, charts and statistics. Components of each report are generated in accordance with complex sets of regulatory requirements specific to each site monitored. SRS developed a relational database to incorporate the detailed reporting rules with the groundwater data, and created a set of automation tools to interface with the information and generate the report components. These process improvements enhanced quality and consistency by centralizing the information, and have reduced manpower and production time through automated efficiencies. (author)
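
    As a hedged sketch of the pattern described above (not the SRS utilities themselves), the Python snippet below shows reporting rules held in a relational database driving which report components are produced for a monitored site; the table, columns and values are invented for illustration:

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE reporting_rules (site TEXT, analyte TEXT, component TEXT, reporting_limit REAL);
          INSERT INTO reporting_rules VALUES
              ('A-Area', 'trichloroethylene', 'trend_chart', 5.0),
              ('A-Area', 'tritium',           'data_table',  20000.0),
              ('H-Area', 'tritium',           'map_layer',   20000.0);
      """)

      def components_for(site):
          """List the report components required for one site by its stored rules."""
          rows = conn.execute(
              "SELECT analyte, component, reporting_limit FROM reporting_rules WHERE site = ?",
              (site,)).fetchall()
          return [f"{component} for {analyte} (limit {limit})"
                  for analyte, component, limit in rows]

      for line in components_for("A-Area"):
          print(line)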

  16. GROUNDWATER MONITORING REPORT GENERATION TOOLS - 12005

    SciTech Connect

    Lopez, N.

    2011-11-21

    Compliance with National and State environmental regulations (e.g. Resource Conservation and Recovery Act (RCRA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) aka SuperFund) requires Savannah River Site (SRS) to extensively collect and report groundwater monitoring data, with potential fines for missed reporting deadlines. Several utilities have been developed at SRS to facilitate production of the regulatory reports which include maps, data tables, charts and statistics. Components of each report are generated in accordance with complex sets of regulatory requirements specific to each site monitored. SRS developed a relational database to incorporate the detailed reporting rules with the groundwater data, and created a set of automation tools to interface with the information and generate the report components. These process improvements enhanced quality and consistency by centralizing the information, and have reduced manpower and production time through automated efficiencies.

  17. Next Generation of Visualization Tools for Astrophysics

    NASA Astrophysics Data System (ADS)

    Gheller, C.; Becciani, U.; Teuben, P. J.

    2007-10-01

    Visualization exists in many niche packages, each with their own strengths and weaknesses. In this BoF a series of presentations was meant to stimulate a discussion between developers and users about the future of visualization tools. A wiki page {http://wiki.eurovotech.org/twiki/bin/view/VOTech/BoFADASSTucson2006} has been set up to log and continue this discussion. One recent technique, dubbed "plastifying," has enabled different tools to inter-operate on shared data, resulting in a very flexible environment. Also deemed important is the ability to write plug-ins for analysis in visualization tools. Publication-quality graphs are sometimes missing from the tools we use.

  18. Next-Generation Design and Simulation Tools

    NASA Technical Reports Server (NTRS)

    Weber, Tod A.

    1997-01-01

    Thirty years ago, the CAD industry was created as electronic drafting tools were developed to move people from the traditional two-dimensional drafting boards. While these tools provided an improvement in accuracy (true perpendicular lines, etc.), they did not offer a significant improvement in productivity or impact development times. They electronically captured a manual process.

  19. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  20. Projectile-generating explosive access tool

    SciTech Connect

    Jakaboski, Juan-Carlos; Hughs, Chance G; Todd, Steven N

    2013-06-11

    A method for generating a projectile using an explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  1. Dewarless Logging Tool - 1st Generation

    SciTech Connect

    HENFLING,JOSEPH A.; NORMANN,RANDY A.

    2000-08-01

    This report focuses on Sandia National Laboratories' effort to create high-temperature logging tools for geothermal applications without the need for heat shielding. One of the mechanisms for failure in conventional downhole tools is temperature. They can only survive a limited number of hours in high temperature environments. For the first time since the evolution of integrated circuits, components are now commercially available that are qualified to 225 C with many continuing to work up to 300 C. These components are primarily based on Silicon-On-Insulator (SOI) technology. Sandia has developed and tested a simple data logger based on this technology that operates up to 300 C with a few limiting components operating to only 250 C without thermal protection. An actual well log to 240 C without shielding is discussed. The first prototype high-temperature tool measures pressure and temperature using a wire-line for power and communication. The tool is based around the HT83C51 microcontroller. A brief discussion of the background and status of the High Temperature Instrumentation program at Sandia, objectives, data logger development, and future project plans are given.

  2. Generating genomic tools for blueberry improvement

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Because of their recognized health benefits, there has been increased demand and consumption of blueberries in recent years. Great strides have been made in cultivar development since its domestication using traditional breeding approaches. However, genomic tools are lacking in blueberry, which coul...

  3. Projectile-generating explosive access tool

    DOEpatents

    Jakaboski, Juan-Carlos; Todd, Steven N.

    2011-10-18

    An explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  4. Next-Generation Ion Thruster Design Tool

    NASA Technical Reports Server (NTRS)

    Stolz, Peter

    2015-01-01

    Computational tools that accurately predict the performance of electric propulsion devices are highly desirable and beneficial to NASA and the broader electric propulsion community. The current state of the art in electric propulsion modeling relies heavily on empirical data and numerous computational "knobs." In Phase I of this project, Tech-X Corporation developed the most detailed ion engine discharge chamber model that currently exists. This kinetic model simulates all particles in the discharge chamber along with a physically correct simulation of the electric fields. In addition, kinetic erosion models are included for modeling the ion-impingement effects on thruster component erosion. In Phase II, Tech-X developed a user-friendly computer program for NASA and other governmental and industry customers. Tech-X has implemented a number of advanced numerical routines to bring the computational time down to a commercially acceptable level. NASA now has a highly sophisticated, user-friendly ion engine discharge chamber modeling tool.

  5. BGen: A UML Behavior Network Generator Tool

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Reder, Leonard J.; Balian, Harry

    2010-01-01

    BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling automatic vehicles. A common format used for describing a behavior network, such as that used in the JPL-developed behavior-based control system, CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40] includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program to translate Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of Behavior Network has been developed in order to simplify the development of C-code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a Behavior Network is functioning as a decision tree.

  6. Integrating Fourth-Generation Tools Into the Applications Development Environment.

    ERIC Educational Resources Information Center

    Litaker, R. G.; And Others

    1985-01-01

    Much of the power of the "information center" comes from its ability to effectively use fourth-generation productivity tools to provide information processing services. A case study of the use of these tools at Western Michigan University is presented. (Author/MLW)

  7. HepMCAnalyser: A tool for Monte Carlo generator validation

    NASA Astrophysics Data System (ADS)

    Ay, C.; Johnert, S.; Katzy, J.; Qin, Zhonghua

    2010-04-01

    HepMCAnalyser is a tool for Monte Carlo (MC) generator validation and comparisons. It is a stable, easy-to-use and extendable framework allowing for easy access/integration to generator-level analysis. It comprises a class library with benchmark physics processes to analyse MC generator HepMC output and to fill ROOT histograms. A web interface is provided to display all or selected histograms, compare them to references and validate the results based on Kolmogorov tests. Steerable example programs can be used for event generation. The default steering is tuned to optimally align the distributions of the different MC generators. The tool will be used for MC generator validation by the Generator Services (GENSER) LCG project, e.g. for version upgrades. It is supported on the same platforms as the GENSER libraries and is already in use at ATLAS.
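
    As a hedged, simplified illustration of the comparison step (HepMCAnalyser itself is C++/ROOT; this Python sketch uses SciPy and made-up samples), a Kolmogorov-Smirnov test comparing one kinematic distribution from two generators might look like:

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(0)
      pt_generator_a = rng.exponential(scale=20.0, size=5000)   # stand-in pT samples, generator A
      pt_generator_b = rng.exponential(scale=21.0, size=5000)   # stand-in pT samples, generator B

      stat, p_value = ks_2samp(pt_generator_a, pt_generator_b)  # two-sample KS test
      print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3g}")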

  8. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools: both commercial and open source ones. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
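
    A minimal sketch of the raster comparison described above (assuming the DEMs are already loaded as arrays; reading them with a library such as GDAL is omitted), computing the per-cell differences and their minimum, maximum, mean and RMSE:

      import numpy as np

      def dem_stats(dem_reference, dem_test):
          """Summary statistics of the per-cell differences between two DEM rasters."""
          diff = np.asarray(dem_test, float) - np.asarray(dem_reference, float)
          return {
              "min":  float(diff.min()),
              "max":  float(diff.max()),
              "mean": float(diff.mean()),
              "rmse": float(np.sqrt(np.mean(diff ** 2))),
          }

      ref  = np.array([[100.0, 101.2], [99.8, 100.5]])   # illustrative elevation values
      test = np.array([[100.1, 101.0], [99.9, 100.9]])
      print(dem_stats(ref, test))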

  9. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
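
    As a hedged, self-contained illustration of the table-driven idea (a toy grammar of our own, not one of the paper's examples), the Python sketch below parses a tiny menu/query language from an explicit parse table:

      # Toy grammar:  CMD -> "show" ID | "set" ID NUM
      import re

      TOKEN_RE = re.compile(r"\s*(?:(show|set)\b|([A-Za-z_]\w*)|(\d+))")

      def tokenize(text):
          tokens, pos = [], 0
          while pos < len(text.rstrip()):
              m = TOKEN_RE.match(text, pos)
              if not m:
                  raise SyntaxError(f"bad input at position {pos}")
              kw, ident, num = m.groups()
              tokens.append((kw, kw) if kw else ("ID", ident) if ident else ("NUM", num))
              pos = m.end()
          return tokens + [("EOF", None)]

      # Parse table: (nonterminal, lookahead kind) -> right-hand side to push
      TABLE = {("CMD", "show"): ["show", "ID"],
               ("CMD", "set"):  ["set", "ID", "NUM"]}
      TERMINALS = {"show", "set", "ID", "NUM", "EOF"}

      def parse(text):
          tokens, i = tokenize(text), 0
          stack = ["EOF", "CMD"]                 # start symbol on top
          while stack:
              top = stack.pop()
              kind, _ = tokens[i]
              if top in TERMINALS:               # terminal: must match the lookahead
                  if top != kind:
                      raise SyntaxError(f"expected {top}, got {kind}")
                  i += 1
              else:                              # nonterminal: expand via the table
                  rhs = TABLE.get((top, kind))
                  if rhs is None:
                      raise SyntaxError(f"no rule for {top} with lookahead {kind}")
                  stack.extend(reversed(rhs))
          return "ok"

      print(parse("set timeout 30"))             # -> ok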

  10. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC) which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work the best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool is divided into two components, the "Population Generator" and the "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation) which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as
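
    As a rough illustration of the "generate a population, then plot it with gnuplot" workflow (a Python sketch with uniform, fictitious distributions; the actual tool is written in Fortran and uses the Bottke or Granvik models):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 1000
      a   = rng.uniform(0.8, 3.0, n)     # semi-major axis [AU], fictitious distribution
      e   = rng.uniform(0.0, 0.6, n)     # eccentricity
      inc = rng.uniform(0.0, 30.0, n)    # inclination [deg]
      h   = rng.uniform(17.0, 25.0, n)   # absolute magnitude H (size proxy)

      with open("neo_population.dat", "w") as f:
          f.write("# a_AU  e  i_deg  H\n")
          for row in zip(a, e, inc, h):
              f.write("  ".join(f"{v:.4f}" for v in row) + "\n")

      # In gnuplot:  plot "neo_population.dat" using 1:2    (a-e scatter plot)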

  11. MEAT: An Authoring Tool for Generating Adaptable Learning Resources

    ERIC Educational Resources Information Center

    Kuo, Yen-Hung; Huang, Yueh-Min

    2009-01-01

    Mobile learning (m-learning) is a new trend in the e-learning field. The learning services in m-learning environments are supported by fundamental functions, especially the content and assessment services, which need an authoring tool to rapidly generate adaptable learning resources. To fulfill the imperious demand, this study proposes an…

  12. Deciding to quit drinking alcohol

    MedlinePlus

    ... Alcoholism - deciding to quit. References: American Psychiatric Association. Diagnostic and statistical manual of mental disorders. 5th ed. Arlington, VA: American Psychiatric Association; 2013. ...

  13. Tools for Simulation and Benchmark Generation at Exascale

    SciTech Connect

    Lagadapati, Mahesh; Mueller, Frank; Engelmann, Christian

    2013-01-01

    The path to exascale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer an insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events and memory profiles, but can be extended to other areas, such as I/O, control flow, and data flow. It further focuses on extreme-scale simulation of millions of Message Passing Interface (MPI) ranks using a lightweight parallel discrete event simulation (PDES) toolkit for performance evaluation. Instead of simply replaying a trace within a simulation, the approach is to generate a benchmark from it and to run this benchmark within a simulation using models to reflect the performance characteristics of future-generation HPC systems. This provides a number of benefits, such as eliminating the data intensive trace replay and enabling simulations at different scales. The presented work utilizes the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to run the benchmark within a simulation.

  14. An Infrastructure for UML-Based Code Generation Tools

    NASA Astrophysics Data System (ADS)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance in order to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  15. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  16. A computer-based tool for generation of progress notes.

    PubMed Central

    Campbell, K. E.; Wieckert, K.; Fagan, L. M.; Musen, M. A.

    1993-01-01

    IVORY, a computer-based tool that uses clinical findings as the basic unit for composing progress notes, generates progress notes more efficiently than does a character-based word processor. IVORY's clinical findings are contained within a structured vocabulary that we developed to support generation of both prose progress notes and SNOMED III codes. Observational studies of physician participation in the development of IVORY's structured vocabulary have helped us to identify areas where changes are required before IVORY will be acceptable for routine clinical use. PMID:8130479

  17. Nearly arc-length tool path generation and tool radius compensation algorithm research in FTS turning

    NASA Astrophysics Data System (ADS)

    Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao

    2014-08-01

    In the generation of non-rotationally symmetric microstructured surfaces by turning with a Fast Tool Servo (FTS), a non-uniform distribution of the interpolation data points will lead to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points at nearly equal arc lengths instead of following the traditional equal-angle interpolation rule, and adds tool radius compensation. All the interpolation points are equidistant in the radial direction because of the constant feed speed of the X slider; the high-frequency tool radius compensation components lie in both the X direction and the Z direction, which makes it difficult for the X slider to follow the commanded motion because of its large mass. Newton's iterative method is used to calculate the coordinates of the neighboring contour tangent point, with the interpolation point's X position as the initial value; in this way the new Z coordinate value is obtained and the high-frequency motion component in the X direction is decomposed into the Z direction. Taking as a test case a typical microstructure with a 4μm PV value, composed of two 70μm wavelength sine waves, the maximum profile error at an angle of fifteen degrees is less than 0.01μm when turning with a diamond tool with a large radius of 80μm. The sinusoidal grid is machined successfully on an ultra-precision lathe; the wavelength is 70.2278μm and the Ra value is 22.81nm, evaluated from data points generated by filtering out the first five harmonics.
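
    A hedged sketch of one common formulation of round-tool radius compensation (normal offsetting of the tool centre, solved with Newton's method from the interpolation point's X position; parameter values are illustrative and this is not necessarily the paper's exact algorithm):

      import numpy as np

      R  = 0.080                 # tool radius [mm] (80 um)
      WL = 0.070                 # sine-wave wavelength [mm] (70 um)
      A  = 0.001                 # sine-wave amplitude [mm] (illustrative)

      def z(x):   return A * np.sin(2 * np.pi * x / WL)            # profile height
      def dz(x):  return A * (2 * np.pi / WL) * np.cos(2 * np.pi * x / WL)

      def centre_x(x):
          """Tool-centre X position when the tool touches the profile at x."""
          return x - R * dz(x) / np.sqrt(1.0 + dz(x) ** 2)

      def compensated_point(x_c, tol=1e-9, max_iter=50):
          """Newton iteration for the contact point whose normal offset lands on x_c."""
          x = x_c                                   # interpolation point as initial guess
          for _ in range(max_iter):
              f = centre_x(x) - x_c
              if abs(f) < tol:
                  break
              dfdx = (centre_x(x + 1e-7) - centre_x(x - 1e-7)) / 2e-7   # numerical derivative
              x -= f / dfdx
          z_c = z(x) + R / np.sqrt(1.0 + dz(x) ** 2)  # tool-centre Z from the normal offset
          return x, z_c

      print(compensated_point(0.010))               # compensated point for x_c = 10 um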

  18. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
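
    As a hedged sketch of what an automatically constructed fault-occurrence model can look like (a Python example of our own, not the proposed tool): a k-of-n redundant set with a constant per-unit failure rate and no repair maps to a small Markov chain whose generator matrix can be built and solved directly:

      import numpy as np
      from scipy.linalg import expm

      def reliability(n_units, k_needed, lam, t_hours):
          """Probability that at least k_needed of n_units still work at time t."""
          n_states = n_units + 1                     # state i = i units failed
          Q = np.zeros((n_states, n_states))         # CTMC generator matrix
          for i in range(n_units):
              rate = (n_units - i) * lam             # any surviving unit may fail next
              Q[i, i + 1] = rate
              Q[i, i] = -rate
          p0 = np.zeros(n_states); p0[0] = 1.0       # all units healthy at t = 0
          p_t = p0 @ expm(Q * t_hours)               # transient state probabilities
          return sum(p_t[i] for i in range(n_units - k_needed + 1))

      print(reliability(n_units=3, k_needed=2, lam=1e-4, t_hours=1000.0))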

  19. Retraction Notice: Generation of Knock down Tools for Transcription Factor 7-like-2 (TCF7L2) and Evaluation of its Expression Pattern in Developing Chicken Optic Tectum.

    PubMed

    2016-01-01

    The publishers have decided to retract the manuscript entitled "Generation of Knock Down Tools for Transcription Factor 7-Like-2 (TCF7L2) and Evaluation of its Expression Pattern in Developing Chicken Optic Tectum" published in MicroRNA, volume 4, issue 3, page numbers 209-216, 2015, for the following reasons. • Due to conflict of interests between authors and the Principal Investigator. PMID:26861895

  20. Automatic Tool Path Generation for Robot Integrated Surface Sculpturing System

    NASA Astrophysics Data System (ADS)

    Zhu, Jiang; Suzuki, Ryo; Tanaka, Tomohisa; Saito, Yoshio

    In this paper, a surface sculpturing system based on an 8-axis robot is proposed, and the CAD/CAM software and tool path generation algorithm for this sculpturing system are presented. The 8-axis robot is composed of a 6-axis manipulator and a 2-axis worktable; it carves blocks of polystyrene foam with heated cutting tools. The multi-DOF (Degree of Freedom) robot benefits from faster fabrication than traditional RP (Rapid Prototyping) methods and more flexibility than CNC machining. With its flexibility derived from an 8-axis configuration, as well as efficient custom-developed software for rough cutting and finish cutting, this surface sculpturing system can carve sculptured surfaces accurately and efficiently.

  1. The Requirements Generation System: A tool for managing mission requirements

    NASA Technical Reports Server (NTRS)

    Sheppard, Sylvia B.

    1994-01-01

    Historically, NASA's cost for developing mission requirements has been a significant part of a mission's budget. Large amounts of time have been allocated in mission schedules for the development and review of requirements by the many groups who are associated with a mission. Additionally, tracing requirements from a current document to a parent document has been time-consuming and costly. The Requirements Generation System (RGS) is a computer-supported cooperative-work tool that assists mission developers in the online creation, review, editing, tracing, and approval of mission requirements as well as in the production of requirements documents. This paper describes the RGS and discusses some lessons learned during its development.

  2. MCM generator: a Java-based tool for generating medical metadata.

    PubMed

    Munoz, F; Hersh, W

    1998-01-01

    In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, helps towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server, and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as images, movie, and sound files. PMID:9929299
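
    A minimal sketch of the tag-generation step (the original tool is written in Java and consults the SAPHIRE server; the Python below and its element values are purely illustrative):

      from html import escape

      def dublin_core_meta(**elements):
          """Render Dublin Core elements (title, creator, subject, ...) as HTML META tags."""
          return "\n".join(
              f'<meta name="DC.{name}" content="{escape(value)}">'
              for name, value in elements.items()
          )

      print(dublin_core_meta(
          title="Chest radiograph, posteroanterior view",
          creator="Munoz F; Hersh W",
          subject="Radiography, Thoracic",     # e.g. a Medical Subject Heading
          type="Image",
      ))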

  3. MCM generator: a Java-based tool for generating medical metadata.

    PubMed Central

    Munoz, F.; Hersh, W.

    1998-01-01

    In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, helps towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server, and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as images, movie, and sound files. PMID:9929299

  4. Ambit-Tautomer: An Open Source Tool for Tautomer Generation.

    PubMed

    Kochev, Nikolay T; Paskaleva, Vesselina H; Jeliazkova, Nina

    2013-06-01

    We present a new open source tool for automatic generation of all tautomeric forms of a given organic compound. Ambit-Tautomer is a part of the open source software package Ambit2. It implements three tautomer generation algorithms: combinatorial method, improved combinatorial method and incremental depth-first search algorithm. All algorithms utilize a set of fully customizable rules for tautomeric transformations. The predefined knowledge base covers 1-3, 1-5 and 1-7 proton tautomeric shifts. Some typical supported tautomerism rules are keto-enol, imin-amin, nitroso-oxime, azo-hydrazone, thioketo-thioenol, thionitroso-thiooxime, amidine-imidine, diazoamino-diazoamino, thioamide-iminothiol and nitrosamine-diazohydroxide. Ambit-Tautomer uses a simple energy based system for tautomer ranking implemented by a set of empirically derived rules. A fine-grained output control is achieved by a set of post-generation filters. We performed an exhaustive comparison of the Ambit-Tautomer Incremental algorithm against several other software packages which offer tautomer generation: ChemAxon Marvin, Molecular Networks MN.TAUTOMER, ACDLabs, CACTVS and the CDK implementation of the algorithm, based on the mobile H atoms listed in the InChI. According to the presented test results, Ambit-Tautomer's performance is either comparable to or better than the competing algorithms. Ambit-Tautomer module is available for download as a Java library, a command line application, a demo web page or OpenTox API compatible Web service. PMID:27481667

  5. EIGER: A new generation of computational electromagnetics tools

    SciTech Connect

    Wilton, D.R.; Johnson, W.A.; Jorgenson, R.E.; Sharpe, R.M.; Grant, J.B.

    1996-03-01

    The EIGER project (Electromagnetic Interactions GenERalized) endeavors to bring the next generation of spectral domain electromagnetic analysis tools to maturity and to cast them in a general form which is amenable to a variety of applications. The tools are written in Fortran 90 with an object-oriented philosophy to yield a package that is easily ported to a variety of platforms, simply maintained, and above all efficiently modified to address wide-ranging applications. The modular development style and the choice of Fortran 90 are also driven by the desire to run efficiently on existing high performance computer platforms and to remain flexible for new architectures that are anticipated. The electromagnetic tool box consists of extremely accurate physics models for 2D and 3D electromagnetic scattering, radiation, and penetration problems. The models include surface and volume formulations for conductors and complex materials. In addition, realistic excitations and symmetries are incorporated, as well as complex environments through the use of Green's functions.

  6. Governance and Factions--Who Decides Who Decides?

    ERIC Educational Resources Information Center

    Hodgkinson, Harold L.

    In several projects, the Center is studying the question: who will decide which factions will be represented in the decision-making process. In the Campus Governance Project investigating the nature of governance, over 3,000 questionnaires were administered and 900 intensive interviews conducted at 19 institutions. The questionnaire was designed…

  7. Benchmarking the next generation of homology inference tools

    PubMed Central

    Saripella, Ganapathi Varma; Sonnhammer, Erik L. L.; Forslund, Kristoffer

    2016-01-01

    Motivation: Over the last decades, vast numbers of sequences were deposited in public databases. Bioinformatics tools allow homology and consequently functional inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate the ‘next generation’ of profile-based approaches, including CS-BLAST, HHSEARCH and PHMMER, in comparison with the non-profile based search tools NCBI-BLAST, USEARCH, UBLAST and FASTA. Method: We generated challenging benchmark datasets based on protein domain architectures within either the PFAM + Clan, SCOP/Superfamily or CATH/Gene3D domain definition schemes. From each dataset, homologous and non-homologous protein pairs were aligned using each tool, and standard performance metrics calculated. We further measured congruence of domain architecture assignments in the three domain databases. Results: CSBLAST and PHMMER had overall highest accuracy. FASTA, UBLAST and USEARCH showed large trade-offs of accuracy for speed optimization. Conclusion: Profile methods are superior at inferring remote homologs but the difference in accuracy between methods is relatively small. PHMMER and CSBLAST stand out with the highest accuracy, yet still at a reasonable computational cost. Additionally, we show that less than 0.1% of Swiss-Prot protein pairs considered homologous by one database are considered non-homologous by another, implying that these classifications represent equivalent underlying biological phenomena, differing mostly in coverage and granularity. Availability and Implementation: Benchmark datasets and all scripts are placed at (http://sonnhammer.org/download/Homology_benchmark). Contact: forslund@embl.de Supplementary information: Supplementary data are available at
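
    As a hedged illustration of the kind of performance metric such a benchmark reports (the scores below are invented; the real benchmark uses domain-architecture-based positive and negative pairs), ROC AUC can be estimated from homologous and non-homologous pair scores via the rank-sum identity:

      def roc_auc(pos_scores, neg_scores):
          """Probability that a random homologous pair outscores a random non-homologous pair."""
          wins = ties = 0
          for p in pos_scores:
              for n in neg_scores:
                  if p > n:
                      wins += 1
                  elif p == n:
                      ties += 1
          return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))

      homologous_scores     = [52.1, 48.7, 40.2, 35.5, 30.0]   # invented alignment scores
      non_homologous_scores = [33.0, 28.4, 25.1, 22.9, 18.6]
      print(f"AUC = {roc_auc(homologous_scores, non_homologous_scores):.2f}")   # -> AUC = 0.96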

  8. Deciding where to Stop Speaking

    ERIC Educational Resources Information Center

    Tydgat, Ilse; Stevens, Michael; Hartsuiker, Robert J.; Pickering, Martin J.

    2011-01-01

    This study investigated whether speakers strategically decide where to interrupt their speech once they need to stop. We conducted four naming experiments in which pictures of colored shapes occasionally changed in color or shape. Participants then merely had to stop (Experiment 1); or they had to stop and resume speech (Experiments 2-4). They…

  9. Improving Dynamic Load and Generator Response Performance Tools

    SciTech Connect

    Lesieutre, Bernard C.

    2005-11-01

    This report is a scoping study to examine research opportunities to improve the accuracy of the system dynamic load and generator models, data and performance assessment tools used by CAISO operations engineers and planning engineers, as well as those used by their counterparts at the California utilities, to establish safe operating margins. Model-based simulations are commonly used to assess the impact of credible contingencies in order to determine system operating limits (path ratings, etc.) to ensure compliance with NERC and WECC reliability requirements. Improved models and a better understanding of the impact of uncertainties in these models will increase the reliability of grid operations by allowing operators to more accurately study system voltage problems and the dynamic stability response of the system to disturbances.

  10. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    NASA Technical Reports Server (NTRS)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  11. Tool for Generating Realistic Residential Hot Water Event Schedules: Preprint

    SciTech Connect

    Hendron, B.; Burch, J.; Barker, G.

    2010-08-01

    The installed energy savings for advanced residential hot water systems can depend greatly on detailed occupant use patterns. Quantifying these patterns is essential for analyzing measures such as tankless water heaters, solar hot water systems with demand-side heat exchangers, distribution system improvements, and recirculation loops. This paper describes the development of an advanced spreadsheet tool that can generate a series of year-long hot water event schedules consistent with realistic probability distributions of start time, duration and flow rate variability, clustering, fixture assignment, vacation periods, and seasonality. This paper also presents the application of the hot water event schedules in the context of an integral-collector-storage solar water heating system in a moderate climate.
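
    As a rough, hedged illustration of the kind of stochastic event generation the tool performs (this is not NREL's spreadsheet logic), the sketch below draws one day of hot water events from simple start-time, duration, and flow-rate distributions; all distribution parameters are made-up assumptions.

```python
# Illustrative sketch only: draw one day of hot water events from simple
# probability distributions for start time, duration, and flow rate.
import random

def sample_day(n_events=10, seed=None):
    rng = random.Random(seed)
    events = []
    for _ in range(n_events):
        # Morning/evening clustering approximated by a two-peak mixture (assumed).
        peak = rng.choice([7.0, 19.0])                 # hours of day
        start = max(0.0, min(24.0, rng.gauss(peak, 1.5)))
        duration = max(0.5, rng.expovariate(1 / 4.0))  # minutes
        flow = max(0.2, rng.gauss(1.5, 0.5))           # gallons per minute
        events.append({"start_hr": round(start, 2),
                       "duration_min": round(duration, 1),
                       "flow_gpm": round(flow, 2)})
    return sorted(events, key=lambda e: e["start_hr"])

print(sample_day(n_events=5, seed=1))
```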

  12. PRIST: a fourth-generation tool for medical information systems.

    PubMed

    Cristiani, P; Larizza, C

    1990-04-01

    PRIST is a fourth-generation software package oriented to the development and management of medical applications, running under MS/DOS on IBM-compatible personal computers. The tool was developed on top of the DBIII Plus language, using the Clipper compiler's networking features for integration into a LAN environment. Several routines written in Microsoft C and BASIC complement this DBMS-kernel system, providing I/O, graphics, statistics, and retrieval utilities. To increase the interactivity of the system, both menu-driven and windowing interfaces have been implemented. PRIST has been used to develop a wide variety of small medical applications, ranging from research laboratories to intensive care units. The great majority of reactions to these applications were positive, confirming that PRIST can assist in practice management and patient care as well as in research. PMID:2345045

  13. Automatic Generation of Remote Visualization Tools with WATT

    NASA Astrophysics Data System (ADS)

    Jensen, P. A.; Bollig, E. F.; Yuen, D. A.; Erlebacher, G.; Momsen, A. R.

    2006-12-01

    The ever-increasing size and complexity of geophysical and other scientific datasets have forced developers to turn to more powerful alternatives for visualizing results of computations and experiments. These alternatives need to be faster, scalable, more efficient, and able to be run on large machines. At the same time, advances in scripting languages and visualization libraries have significantly decreased the development time of smaller, desktop visualization tools. Ideally, programmers would be able to develop visualization tools in a high-level, local, scripted environment and then automatically convert their programs into compiled, remote visualization tools for integration into larger computation environments. The Web Automation and Translation Toolkit (WATT) [1] converts a Tcl script for the Visualization Toolkit (VTK) [2] into a standards-compliant web service. We will demonstrate the use of WATT for the automated conversion of a desktop visualization application (written in Tcl for VTK) into a remote visualization service of interest to geoscientists. The resulting service will allow real-time access to a large dataset through the Internet, and will be easily integrated into the existing architecture of the Virtual Laboratory for Earth and Planetary Materials (VLab) [3]. [1] Jensen, P.A., Yuen, D.A., Erlebacher, G., Bollig, E.F., Kigelman, D.G., Shukh, E.A., Automated Generation of Web Services for Visualization Toolkits, Eos Trans. AGU, 86(52), Fall Meet. Suppl., Abstract IN42A-06, 2005. [2] The Visualization Toolkit, http://www.vtk.org [3] The Virtual Laboratory for Earth and Planetary Materials, http://vlab.msi.umn.edu

  14. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    NASA Astrophysics Data System (ADS)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGESBeta

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; et al

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  16. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  17. Sandia Generated Matrix Tool (SGMT) v. 1.0

    Energy Science and Technology Software Center (ESTSC)

    2010-03-24

    Provides a tool with which to create and characterize a very large set of matrix-based visual analogy problems that have properties similar to Raven's Progressive Matrices (RPMs). The software uses the same underlying patterns found in RPMs to generate large numbers of unique matrix problems using parameters chosen by the researcher. Specifically, the software is designed so that researchers can choose the type, direction, and number of relations in a problem and then create any number of unique matrices that share the same underlying structure (e.g. changes in numerosity in a diagonal pattern) but have different surface features (e.g. shapes, colors). Raven's Progressive Matrices (RPMs) are a widely used test for assessing intelligence and reasoning ability. Since the test is non-verbal, it can be applied to many different populations and has been used all over the world. However, there are relatively few matrices in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. This tool creates a matrix set in a systematic way that allows researchers to have a great deal of control over the underlying structure, surface features, and difficulty of the matrix problems while providing a large set of novel matrices with which to conduct experiments.

  18. Sandia Generated Matrix Tool (SGMT) v. 1.0

    SciTech Connect

    Benz, Zachary; Dixon, Kevin

    2010-03-24

    Provides a tool with which to create and characterize a very large set of matrix-based visual analogy problems that have properties similar to Raven's Progressive Matrices (RPMs). The software uses the same underlying patterns found in RPMs to generate large numbers of unique matrix problems using parameters chosen by the researcher. Specifically, the software is designed so that researchers can choose the type, direction, and number of relations in a problem and then create any number of unique matrices that share the same underlying structure (e.g. changes in numerosity in a diagonal pattern) but have different surface features (e.g. shapes, colors). Raven's Progressive Matrices (RPMs) are a widely used test for assessing intelligence and reasoning ability. Since the test is non-verbal, it can be applied to many different populations and has been used all over the world. However, there are relatively few matrices in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. This tool creates a matrix set in a systematic way that allows researchers to have a great deal of control over the underlying structure, surface features, and difficulty of the matrix problems while providing a large set of novel matrices with which to conduct experiments.
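
    To illustrate the parameterized-generation idea in the description above (this is not the SGMT code), the sketch below builds a 3x3 analogy matrix governed by one structural rule, numerosity increasing along each row, while a surface feature (the shape name) varies; the shapes and rule parameters are arbitrary choices.

```python
# Hedged sketch of parameterized matrix generation: one structural rule
# (numerosity along each row), freely varied surface features (shape per row).
import random

def generate_matrix(start=1, step=1, seed=None):
    rng = random.Random(seed)
    shapes = ["circle", "square", "triangle"]
    matrix = []
    for _ in range(3):
        shape = rng.choice(shapes)          # surface feature varies per row
        cells = []
        for col in range(3):
            count = start + step * col      # structural rule: numerosity along the row
            cells.append(f"{count} x {shape}")
        matrix.append(cells)
    matrix[2][2] = "?"                      # the cell the participant must infer
    return matrix

for row in generate_matrix(seed=42):
    print(row)
```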

  19. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels, or minimal steps, to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2 MB of RAM, a Microsoft-compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K

  20. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5,000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, either at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow pattern prediction and flow behavior prediction (pressure gradient and phase fractions) are handled separately. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate design that leads to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The

  1. Bioethics in America: Who decides

    SciTech Connect

    Yesley, M.S.

    1992-01-01

    This paper is concerned with the process by which bioethics decisions are made as well as the actual decisions that are reached. The process commonly is one of "shared decision-making," that is, decisionmaking at several levels, beginning with the government and ending with the individual. After the government has defined a scope of permissible activity, the research or health care institution may further limit what activities are permitted. Finally, the individual patient, or, if the patient is incompetent, the patient's legal representative decides whether or not to participate in the activity. Because bioethics in general, and bioethics related to genetics in particular, evolves through this process of decisionmaking at several levels, this paper briefly traces the process, to see how it works in several areas of bioethics, in order to provide a perspective on the way in which ethical decisions related to genetics are or will be made.

  2. Bioethics in America: Who decides?

    SciTech Connect

    Yesley, M.S.

    1992-05-01

    This paper is concerned with the process by which bioethics decisions are made as well as the actual decisions that are reached. The process commonly is one of "shared decision-making," that is, decisionmaking at several levels, beginning with the government and ending with the individual. After the government has defined a scope of permissible activity, the research or health care institution may further limit what activities are permitted. Finally, the individual patient, or, if the patient is incompetent, the patient's legal representative decides whether or not to participate in the activity. Because bioethics in general, and bioethics related to genetics in particular, evolves through this process of decisionmaking at several levels, this paper briefly traces the process, to see how it works in several areas of bioethics, in order to provide a perspective on the way in which ethical decisions related to genetics are or will be made.

  3. Developing the Next Generation of Tools for Simulating Galaxy Outflows

    NASA Astrophysics Data System (ADS)

    Scannapieco, Evan

    Outflows are observed in starbursting galaxies of all masses and at all cosmological epochs. They play a key role throughout the history of the Universe: shaping the galaxy mass-metallicity relation, drastically affecting the content and number density of dwarf galaxies, and transforming the chemical composition of the intergalactic medium. Yet, a complete model of galaxy outflows has proven to be elusive, as it requires both a better understanding of the evolution of the turbulent, multiphase gas in and around starbursting galaxies, and better tools to reproduce this evolution in galaxy-scale simulations. Here we propose to conduct a detailed series of numerical simulations designed to help develop such next-generation tools for the simulation of galaxy outflows. The program will consist of three types of direct numerical simulations, each of which will be targeted to allow galaxy-scale simulations to more accurately model key microphysical processes and their observational consequences. Our first set of simulations will be targeted at better modeling the starbursting interstellar medium (ISM) from which galaxy outflows are driven. The surface densities in starbursting galaxies are much larger than those in the Milky Way, resulting in larger gravitational accelerations and random velocities exceeding 30 or even 100 km/s. Under these conditions, the thermal stability of the ISM is changed dramatically, due to the sharp peak in gas cooling efficiency at around 200,000 K. Our simulations will carefully quantify the key ways in which this medium differs from the local ISM, and the consequences of these differences for when, where, and how outflows are driven. A second set of simulations will be targeted at better modeling the observed properties of rapidly cooling, highly turbulent gas. Because gas cooling in and around starbursts is extremely efficient, turbulent motions are often supersonic, which leads to a distribution of ionization states that is vastly different than

  4. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5,000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether employed topside, at the seabed or bottom-hole, to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas-oil-water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict the gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and

  5. Fine-Tuning Next-Generation Genome Editing Tools.

    PubMed

    Kanchiswamy, Chidananda Nagamangala; Maffei, Massimo; Malnoy, Mickael; Velasco, Riccardo; Kim, Jin-Soo

    2016-07-01

    The availability of genome sequences of numerous organisms and the revolution brought about by genome editing tools (e.g., ZFNs, TALENs, and CRISPR/Cas9 or RGENs) has provided a breakthrough in introducing targeted genetic changes both to explore emergent phenotypes and to introduce new functionalities. However, the wider application of these tools in biology, agriculture, medicine, and biotechnology is limited by off-target mutation effects. In this review, we compare available methods for detecting, measuring, and analyzing off-target mutations. Furthermore, we particularly focus on CRISPR/Cas9 regarding various methods, tweaks, and software tools available to nullify off-target effects. PMID:27167723

  6. Prostate Cancer: Take Time to Decide

    MedlinePlus

    ... Most prostate cancers grow slowly, and ...

  7. Ontodog: a web-based ontology community view generation tool.

    PubMed

    Zheng, Jie; Xiang, Zuoshuang; Stoeckert, Christian J; He, Yongqun

    2014-05-01

    Biomedical ontologies are often very large and complex. Only a subset of the ontology may be needed for a specified application or community. For ontology end users, it is desirable to have community-based labels rather than the labels generated by ontology developers. Ontodog is a web-based system that can generate an ontology subset based on Excel input, and support generation of an ontology community view, which is defined as the whole or a subset of the source ontology with user-specified annotations including user-preferred labels. Ontodog allows users to easily generate community views with minimal ontology knowledge and no programming skills or installation required. Currently >100 ontologies including all OBO Foundry ontologies are available to generate the views based on user needs. We demonstrate the application of Ontodog for the generation of community views using the Ontology for Biomedical Investigations as the source ontology. PMID:24413522

  8. Virtual Tool Mark Generation for Efficient Striation Analysis

    SciTech Connect

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
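
    A hedged sketch of the projection idea follows; it is not the published pipeline. The virtual mark is taken as the extreme of a tip height map along the direction of tool travel and compared with a measured cross-section by normalized correlation; the numpy arrays below stand in for real 3D scan data.

```python
# Hedged sketch: project tip geometry along the travel direction to form a
# virtual mark profile, then compare profiles by normalized correlation.
import numpy as np

def virtual_mark(tip_heights):
    # tip_heights: 2D array (travel direction x width); the deepest point of
    # the tip along the travel direction is what scores the plate.
    return tip_heights.max(axis=0)

def normalized_correlation(profile_a, profile_b):
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))

rng = np.random.default_rng(0)
tip = rng.normal(size=(50, 200)).cumsum(axis=1)              # synthetic tip topography
mark = virtual_mark(tip) + rng.normal(scale=0.1, size=200)   # synthetic measured mark
print(normalized_correlation(virtual_mark(tip), mark))
```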

  9. E-DECIDER Decision Support Gateway For Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.

    2013-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that
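
    Since E-DECIDER serves its products through standard OGC protocols such as WMS, a client request can be illustrated as below. The host, layer name, and bounding box are placeholders, not actual E-DECIDER endpoints or layers.

```python
# Hedged illustration of fetching a standards-compliant map layer from a
# GeoServer WMS endpoint; all endpoint details below are placeholders.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(800, 600), srs="EPSG:4326"):
    params = {
        "service": "WMS", "version": "1.1.1", "request": "GetMap",
        "layers": layer, "styles": "",
        "bbox": ",".join(str(v) for v in bbox),
        "width": size[0], "height": size[1],
        "srs": srs, "format": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

# Example: a hypothetical deformation layer over the San Francisco Bay Area.
print(wms_getmap_url("http://example.org/geoserver/wms",
                     "edecider:slope_change",
                     bbox=(-123.0, 37.0, -121.5, 38.5)))
```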

  10. Skateboards or Wildlife? Kids Decide!

    ERIC Educational Resources Information Center

    Thomas, Julie; Cooper, Sandra; Haukos, David

    2004-01-01

    How can teachers make science learning relevant to today's technology savvy students? They can incorporate the Internet and use it as a tool to help solve real-life problems. A group of university professors, a field biologist, and classroom teachers teamed up to create an exciting, interactive Web-based learning environment for students and…

  11. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  12. Rover Team Decides: Safety First

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA's Mars Exploration Rover Spirit recorded this view while approaching the northwestern edge of 'Home Plate,' a circular plateau-like area of bright, layered outcrop material roughly 80 meters (260 feet) in diameter. The images combined into this mosaic were taken by Spirit's navigation camera during the rover's 746th, 748th and 750th Martian days, or sols (Feb. 7, 9 and 11, 2006).

    With Martian winter closing in, engineers and scientists working with NASA's Mars Exploration Rover Spirit decided to play it safe for the time being rather than attempt to visit the far side of Home Plate in search of rock layers that might show evidence of a past watery environment. This feature has been one of the major milestones of the mission. Though it's conceivable that rock layers might be exposed on the opposite side, sunlight is diminishing on the rover's solar panels and team members chose not to travel in a counterclockwise direction that would take the rover to the west and south slopes of the plateau. Slopes in that direction are hidden from view and team members chose, following a long, thorough discussion, to have the rover travel clockwise and remain on north-facing slopes rather than risk sending the rover deeper into unknown terrain.

    In addition to studying numerous images from Spirit's cameras, team members studied three-dimensional models created with images from the Mars Orbiter Camera on NASA's Mars Global Surveyor orbiter. The models showed a valley on the southern side of Home Plate, the slopes of which might cause the rover's solar panels to lose power for unknown lengths of time. In addition, images from Spirit's cameras showed a nearby, talus-covered section of slope on the west side of Home Plate, rather than the exposed rock layers scientists eventually hope to investigate.

    Home Plate has been on the rover's potential itinerary since the early days of the mission, when it stood out in images taken by the Mars Orbiter Camera shortly after

  13. JVM: Java Visual Mapping tool for next generation sequencing read.

    PubMed

    Yang, Ye; Liu, Juan

    2015-01-01

    We developed a program, JVM (Java Visual Mapping), for mapping next-generation sequencing reads to a reference sequence. The program is implemented in Java and is designed to deal with millions of short reads generated by the Illumina sequencing technology for sequence alignment. It employs a seed index strategy and octal encoding operations for sequence alignment. JVM is useful for DNA-Seq and RNA-Seq when dealing with single-end resequencing. JVM is a desktop application, which supports read volumes from 1 MB to 10 GB. PMID:25387956
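
    A toy sketch of the seed-and-extend idea behind such short-read mappers follows; it is not the JVM implementation. The reference is indexed by k-mer, the read's leading k-mer is looked up as a seed, and each candidate placement is verified by counting mismatches; the value of k and the sequences are illustrative.

```python
# Toy seed-index read mapping: index reference k-mers, seed on the read's
# first k-mer, verify the full read by mismatch count.
from collections import defaultdict

def build_seed_index(reference, k=8):
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def map_read(read, reference, index, k=8, max_mismatches=2):
    hits = []
    for pos in index.get(read[:k], []):
        window = reference[pos:pos + len(read)]
        if len(window) == len(read):
            mismatches = sum(a != b for a, b in zip(read, window))
            if mismatches <= max_mismatches:
                hits.append((pos, mismatches))
    return hits

reference = "ACGTACGTTTGACCAGGTACGTAGCTAGCTAGGATCCA"
index = build_seed_index(reference)
print(map_read("TTGACCAGGTACGTAG", reference, index))   # [(8, 0)]
```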

  14. Next generation tools to accelerate the synthetic biology process.

    PubMed

    Shih, Steve C C; Moraes, Christopher

    2016-05-16

    Synthetic biology follows the traditional engineering paradigm of designing, building, testing and learning to create new biological systems. While such approaches have enormous potential, major challenges still exist in this field including increasing the speed at which this workflow can be performed. Here, we present recently developed microfluidic tools that can be used to automate the synthetic biology workflow with the goal of advancing the likelihood of producing desired functionalities. With the potential for programmability, automation, and robustness, the integration of microfluidics and synthetic biology has the potential to accelerate advances in areas such as bioenergy, health, and biomaterials. PMID:27146265

  15. CUEMAP: A tool for generating hierarchical charts and dataflow diagrams

    SciTech Connect

    Lee, J.W.

    1987-12-01

    CUEMAP is a preprocessor to the MAPPER program, which generates report quality visual aids. CUEMAP uses text blocks, symbols, and line connectors to lay out hierarchical charts and dataflow diagrams. A grid is specified as a reference point on which the labels and symbols can be placed. Connectors are added to complete the diagram. Modifications and enhancements require knowledge of the MAPPER syntax. 1 ref., 2 figs.

  16. HALOGEN: a tool for fast generation of mock halo catalogues

    NASA Astrophysics Data System (ADS)

    Avila, Santiago; Murray, Steven G.; Knebe, Alexander; Power, Chris; Robotham, Aaron S. G.; Garcia-Bellido, Juan

    2015-06-01

    We present a simple method of generating approximate synthetic halo catalogues: HALOGEN. This method uses a combination of second-order Lagrangian Perturbation Theory (2LPT) in order to generate the large-scale matter distribution, analytical mass functions to generate halo masses, and a single-parameter stochastic model for halo bias to position haloes. HALOGEN represents a simplification of similar recently published methods. Our method is constrained to recover the two-point function at intermediate (10 h⁻¹ Mpc < r < 50 h⁻¹ Mpc) scales, which we show is successful to within 2 per cent. Larger scales (~100 h⁻¹ Mpc) are reproduced to within 15 per cent. We compare several other statistics (e.g. power spectrum, point distribution function, redshift space distortions) with results from N-body simulations to determine the validity of our method for different purposes. One of the benefits of HALOGEN is its flexibility, and we demonstrate this by showing how it can be adapted to varying cosmologies and simulation specifications. A driving motivation for the development of such approximate schemes is the need to compute covariance matrices and study the systematic errors for large galaxy surveys, which requires thousands of simulated realizations. We discuss the applicability of our method in this context, and conclude that it is well suited to mass production of appropriate halo catalogues. The code is publicly available at https://github.com/savila/halogen.
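
    One ingredient of such approximate catalogues, drawing halo masses from an analytical mass function, can be sketched as below. The pure power-law form and its parameters are illustrative assumptions, not HALOGEN's actual choices, which additionally involve 2LPT particle placement and a stochastic bias parameter.

```python
# Hedged sketch: inverse-transform sampling of halo masses from a power-law
# mass function dn/dM proportional to M**slope over [m_min, m_max].
import random

def sample_halo_masses(n, m_min=1e12, m_max=1e15, slope=-1.9, seed=None):
    rng = random.Random(seed)
    a = slope + 1.0                       # closed-form inverse CDF needs slope != -1
    lo, hi = m_min ** a, m_max ** a
    return [(lo + rng.random() * (hi - lo)) ** (1.0 / a) for _ in range(n)]

masses = sample_halo_masses(5, seed=3)
print([f"{m:.3e}" for m in masses])       # solar-mass-scale values between m_min and m_max
```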

  17. Automated tools for the generation of performance-based training

    SciTech Connect

    Trainor, M.S.; Fries, J.

    1990-01-01

    The field of educational technology is not a new one, but the emphasis in the past has been on the use of technologies for the delivery of instruction and tests. This paper explores the application of technology to the development of performance-based instruction and to the analyses leading up to the development of the instruction. Several technologies are discussed, with specific software packages described. The purpose of these technologies is to streamline the instructional analysis and design process, using the computer for its strengths to aid the human-in-the-loop. Currently, the process is all accomplished manually. Applying automated tools to the process frees the humans from some of the tedium involved so that they can be dedicated to the more complex aspects of the process. 12 refs.

  18. Multiscale Toxicology - Building the Next Generation Tools for Toxicology

    SciTech Connect

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.; Waters, Katrina M.

    2012-09-01

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus), to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  19. Profiting from competition: Financial tools for electric generation companies

    NASA Astrophysics Data System (ADS)

    Richter, Charles William, Jr.

    Regulations governing the operation of electric power systems in North America and many other areas of the world are undergoing major changes designed to promote competition. This process of change is often referred to as deregulation. Participants in deregulated electricity systems may find that their profits will greatly benefit from the implementation of successful bidding strategies. While the goal of the regulators may be to create rules which balance reliable power system operation with maximization of the total benefit to society, the goal of generation companies is to maximize their profit, i.e., return to their shareholders. The majority of the research described here is conducted from the point of view of generation companies (GENCOs) wishing to maximize their expected utility function, which is generally comprised of expected profit and risk. Strategies that help a GENCO to maximize its objective function must consider the impact of (and aid in making) operating decisions that may occur within a few seconds to multiple years. The work described here assumes an environment in which energy service companies (ESCOs) buy and GENCOs sell power via double auctions in regional commodity exchanges. Power is transported on wires owned by transmission companies (TRANSCOs) and distribution companies (DISTCOs). The proposed market framework allows participants to trade electrical energy contracts via the spot, futures, options, planning, and swap markets. An important method of studying these proposed markets and the behavior of participating agents is the field of experimental/computational economics. For much of the research reported here, the market simulator developed by Kumar and Sheble and similar simulators has been adapted to allow computerized agents to trade energy. Creating computerized agents that can react as rationally or irrationally as a human trader is a difficult problem for which we have turned to the field of artificial intelligence. Some of our

  20. Deciding as Intentional Action: Control over Decisions

    PubMed Central

    Shepherd, Joshua

    2015-01-01

    Common-sense folk psychology and mainstream philosophy of action agree about decisions: these are under an agent's direct control, and are thus intentional actions for which agents can be held responsible. I begin this paper by presenting a problem for this view. In short, since the content of the motivational attitudes that drive deliberation and decision remains open-ended until the moment of decision, it is unclear how agents can be thought to exercise control over what they decide at the moment of deciding. I note that this problem might motivate a non-actional view of deciding—a view that decisions are not actions, but are instead passive events of intention acquisition. For without an understanding of how an agent might exercise control over what is decided at the moment of deciding, we lack a good reason for maintaining commitment to an actional view of deciding. However, I then offer the required account of how agents exercise control over decisions at the moment of deciding. Crucial to this account is an understanding of the relation of practical deliberation to deciding, an understanding of skilled deliberative activity, and the role of attention in the mental action of deciding. PMID:26321765

  1. PC graphics generation and management tool for real-time applications

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1992-01-01

    A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.

  2. Multiscale Toxicology- Building the Next Generation Tools for Toxicology

    SciTech Connect

    Retterer, S. T.; Holsapple, M. P.

    2013-10-31

    A Cooperative Research and Development Agreement (CRADA) was established between Battelle Memorial Institute (BMI), Pacific Northwest National Laboratory (PNNL), Oak Ridge National Laboratory (ORNL), Brookhaven National Laboratory (BNL) and Lawrence Livermore National Laboratory (LLNL), with the goal of combining the analytical and synthetic strengths of the National Laboratories with BMI's expertise in basic and translational medical research to develop a collaborative pipeline and suite of high-throughput and imaging technologies that could be used to provide a more comprehensive understanding of material and drug toxicology in humans. The Multi-Scale Toxicity Initiative (MSTI), consisting of the team members above, was established to coordinate cellular-scale, high-throughput in vitro testing, computational modeling and whole-animal in vivo toxicology studies between MSTI team members. Development of a common, well-characterized set of materials for testing was identified as a crucial need for the initiative. Two research tracks were established by BMI during the course of the CRADA. The first research track focused on the development of tools and techniques for understanding the toxicity of nanomaterials, specifically inorganic nanoparticles (NPs). ORNL's work focused primarily on the synthesis, functionalization and characterization of a common set of NPs for dissemination to the participating laboratories. These particles were synthesized to retain the same surface characteristics and size, but to allow visualization using the variety of imaging technologies present across the team. Characterization included the quantitative analysis of physical and chemical properties of the materials as well as the preliminary assessment of NP toxicity using commercially available toxicity screens and emerging optical imaging strategies. Additional efforts examined the development of high-throughput microfluidic and imaging assays for measuring NP uptake, localization, and

  3. Next generation sequencing technologies: tool to study avian virus diversity.

    PubMed

    Kapgate, S S; Barbuddhe, S B; Kumanan, K

    2015-03-01

    Increased globalisation, climatic changes and the wildlife-livestock interface have led to the emergence of novel viral pathogens and zoonoses that have become a serious concern to avian, animal and human health. High biodiversity and bird migration facilitate spread of pathogens and provide reservoirs for emerging infectious diseases. Current classical diagnostic methods, designed to be virus-specific or limited to a group of viral agents, hinder the identification of novel viruses or viral variants. Recently developed approaches of next-generation sequencing (NGS) provide culture-independent methods that are useful for understanding viral diversity and for the discovery of novel viruses, thereby enabling better diagnosis and disease control. This review discusses the different possible steps of an NGS study utilizing sequence-independent amplification, high-throughput sequencing and bioinformatics approaches to identify novel avian viruses and their diversity. NGS has led to the identification of a wide range of new viruses, such as picobirnavirus, picornavirus, orthoreovirus and avian gamma coronavirus associated with fulminating disease in guinea fowl, and is also used in describing viral diversity among avian species. The review also briefly discusses areas of virus-host interaction and disease-associated causalities with newly identified avian viruses. PMID:25790045

  4. Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST): A web tool for generating protein sequence similarity networks

    PubMed Central

    Gerlt, John A.; Bouvier, Jason T.; Davidson, Daniel B.; Imker, Heidi J.; Sadkhin, Boris; Slater, David R.; Whalen, Katie L.

    2015-01-01

    The Enzyme Function Initiative, an NIH/NIGMS-supported Large-Scale Collaborative Project (EFI; U54GM093342; http://enzymefunction.org/), is focused on devising and disseminating bioinformatics and computational tools as well as experimental strategies for the prediction and assignment of functions (in vitro activities and in vivo physiological/metabolic roles) to uncharacterized enzymes discovered in genome projects. Protein sequence similarity networks (SSNs) are visually powerful tools for analyzing sequence relationships in protein families (H.J. Atkinson, J.H. Morris, T.E. Ferrin, and P.C. Babbitt, PLoS One 2009, 4, e4345). However, the members of the biological/biomedical community have not had access to the capability to generate SSNs for their “favorite” protein families. In this article we announce the EFI-EST (Enzyme Function Initiative-Enzyme Similarity Tool) web tool (http://efi.igb.illinois.edu/efi-est/) that is available without cost for the automated generation of SSNs by the community. The tool can create SSNs for the “closest neighbors” of a user-supplied protein sequence from the UniProt database (Option A) or of members of any user-supplied Pfam and/or InterPro family (Option B). We provide an introduction to SSNs, a description of EFI-EST, and a demonstration of the use of EFI-EST to explore sequence-function space in the OMP decarboxylase superfamily (PF00215). This article is designed as a tutorial that will allow members of the community to use the EFI-EST web tool for exploring sequence/function space in protein families. PMID:25900361
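
    A small illustration of the SSN concept (not the EFI-EST service itself) is given below: nodes are sequences, an edge is drawn when a pairwise similarity score exceeds a chosen threshold, and tightening the threshold splits the network into putative isofunctional clusters. The scores and cutoff are made up.

```python
# Hedged sketch of a sequence similarity network: threshold pairwise scores
# into edges, then report connected components as clusters.
def build_ssn(pairwise_scores, cutoff):
    # pairwise_scores: dict mapping (seq_a, seq_b) -> score (e.g. -log10 E-value)
    edges = {pair for pair, score in pairwise_scores.items() if score >= cutoff}
    nodes = {n for pair in pairwise_scores for n in pair}
    return nodes, edges

def connected_components(nodes, edges):
    neighbours = {n: set() for n in nodes}
    for a, b in edges:
        neighbours[a].add(b)
        neighbours[b].add(a)
    seen, clusters = set(), []
    for start in nodes:
        if start in seen:
            continue
        stack, group = [start], set()
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                group.add(n)
                stack.extend(neighbours[n])
        clusters.append(group)
    return clusters

scores = {("P1", "P2"): 80, ("P2", "P3"): 75, ("P3", "P4"): 12, ("P4", "P5"): 90}
nodes, edges = build_ssn(scores, cutoff=50)
print(connected_components(nodes, edges))   # two clusters: {P1,P2,P3} and {P4,P5}
```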

  5. E-DECIDER Rapid Response to the M 6.0 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    E-DECIDER initiated rapid response mode when the California Earthquake Clearinghouse was activated the morning following the M6 Napa earthquake. Data products, including: 1) rapid damage and loss estimates, 2) deformation magnitude and slope change maps, and 3) aftershock forecasts were provided to the Clearinghouse partners within 24 hours of the event via XchangeCore Web Service Data Orchestration sharing. NASA data products were provided to end-users via XchangeCore, EERI and Clearinghouse websites, and ArcGIS online for Napa response, reaching a wide response audience. The E-DECIDER team helped facilitate rapid delivery of NASA products to stakeholders and participated in Clearinghouse Napa earthquake briefings to update stakeholders on product information. Rapid response products from E-DECIDER can be used to help prioritize response efforts shortly after the event has occurred. InLET (Internet Loss Estimation Tool) post-event damage and casualty estimates were generated quickly after the Napa earthquake. InLET provides immediate post-event estimates of casualties and building damage by performing loss/impact simulations using USGS ground motion data and FEMA HAZUS damage estimation technology. These results were provided to E-DECIDER by their collaborators, ImageCat, Inc. and the Community Stakeholder Network (CSN). Strain magnitude and slope change maps were automatically generated when the Napa earthquake appeared on the USGS feed. These maps provide an early estimate of where the deformation has occurred and where damage may be localized. Using E-DECIDER critical infrastructure overlays with damage estimates, decision makers can direct response effort that can be verified later with field reconnaissance and remote sensing-based observations. Earthquake aftershock forecast maps were produced within hours of the event. These maps highlight areas where aftershocks are likely to occur and can also be coupled with infrastructure overlays to help direct response

  6. Generation X Teaches College: Generation Construction as Pedagogical Tool in the Writing Classroom.

    ERIC Educational Resources Information Center

    Hassel, Holly; Epp, Dawn Vernooy

    In the 1996 book "Generation X Goes to College: An Eye-Opening Account of Teaching in Post-Modern America," Peter Sacks probes the "decay" of higher education in the United States; a decay he attributes to listless, entitled students. This paper interrogates the paradigm of Boomers and Generation Xers poised in opposition to one another,…

  7. Deciding to have knee or hip replacement

    MedlinePlus

    ... make a decision. Who Benefits From Knee or Hip Replacement Surgery? The most common reason to have a ...

  8. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including, real-time simulations, immersive systems, collaborative engineering environment, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  9. A comparison of tools for the simulation of genomic next-generation sequencing data.

    PubMed

    Escalona, Merly; Rocha, Sara; Posada, David

    2016-08-01

    Computer simulation of genomic data has become increasingly popular for assessing and validating biological models or for gaining an understanding of specific data sets. Several computational tools for the simulation of next-generation sequencing (NGS) data have been developed in recent years, which could be used to compare existing and new NGS analytical pipelines. Here we review 23 of these tools, highlighting their distinct functionality, requirements and potential applications. We also provide a decision tree for the informed selection of an appropriate NGS simulation tool for the specific question at hand. PMID:27320129

  10. The Sequence of Events generator: A powerful tool for mission operations

    NASA Technical Reports Server (NTRS)

    Wobbe, Hubertus; Braun, Armin

    1994-01-01

    The functions and features of the sequence of events (SOE) and flight operations procedures (FOP) generator developed and used at DLR/GSOC for the positioning of EUTELSAT 2 satellites are presented. The SOE and FOP are the main operational documents that are prepared for nominal as well as for non-nominal mission execution. Their structure and application are described. Both of these documents are generated, validated, and maintained by a common software tool. Its main features and advantages are demonstrated. The tool has been improved continuously over the last 5 years. Due to its flexibility it can easily be applied to other projects and new features may be added.

  11. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the
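
    To make the notion of an adjoint code concrete, the hedged toy example below hand-writes the reverse sweep that an AD tool would generate for the scalar model y = sin(a*x)*x**2 and checks it against a central finite difference; it is unrelated to the geophysical codes discussed in the record.

```python
# Hedged toy adjoint: reverse-mode derivative of y = sin(a*x) * x**2 with
# respect to the parameter a, checked by finite differences.
import math

def forward(a, x):
    u = a * x
    v = math.sin(u)
    y = v * x ** 2
    return y, (u, v)

def adjoint(a, x, y_bar=1.0):
    # Reverse sweep: revisit the intermediates in reverse order.
    _, (u, v) = forward(a, x)
    v_bar = y_bar * x ** 2        # y = v * x^2
    u_bar = v_bar * math.cos(u)   # v = sin(u)
    a_bar = u_bar * x             # u = a * x
    return a_bar

a, x, eps = 0.7, 1.3, 1e-6
fd = (forward(a + eps, x)[0] - forward(a - eps, x)[0]) / (2 * eps)
print(adjoint(a, x), fd)   # the two derivative estimates should agree closely
```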

  12. A decision support tool for landfill methane generation and gas collection.

    PubMed

    Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart

    2015-09-01

    This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date there is no such tool available to provide landfill decision makers with clear and simplified information to evaluate biochemical processes within a landfill site, to assess performance of gas production and to identify potential remedies to any issues. The current lack in understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first order decay model LandGEM; and (2) landfill gas indicators' score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is provided using multi-criteria analysis to calculate the sum of weighted scores for each indicator. The weights for each indicator are calculated using an analytical hierarchical process. The tool is tested against five real scenarios for landfill sites in the UK with a range of good, average and poor landfill methane generation over a one year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for methane output rate enhancement. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site. PMID:26168873
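
    As a rough illustration of the second scoring set, the Python sketch below combines per-indicator deviation scores into a weighted total. The indicator names, ideal ranges and weights are illustrative placeholders, not the AHP-derived values used by the DST.

    def indicator_score(value, ideal_min, ideal_max):
        """Return 1.0 inside the ideal range, decaying linearly with relative deviation outside."""
        if ideal_min <= value <= ideal_max:
            return 1.0
        ref = ideal_min if value < ideal_min else ideal_max
        deviation = abs(value - ref) / ref
        return max(0.0, 1.0 - deviation)

    # hypothetical readings, ideal ranges and weights (placeholders, not the paper's values)
    indicators = {
        #  name        (value, ideal_min, ideal_max, weight)
        "pH":          (7.8,   6.8,      7.5,       0.20),
        "temperature": (38.0,  30.0,     45.0,      0.15),
        "moisture_%":  (28.0,  40.0,     65.0,      0.35),
        "BOD/COD":     (0.25,  0.10,     0.50,      0.30),
    }

    total = sum(w * indicator_score(v, lo, hi) for v, lo, hi, w in indicators.values())
    print(f"total landfill gas indicator score: {total:.2f}")  # 1.0 = all indicators in ideal range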

  13. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning

    ERIC Educational Resources Information Center

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa

    2011-01-01

    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  14. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...

  15. Short circuiting the circadian system with a new generation of precision tools.

    PubMed

    Loh, Dawn H; Kudo, Takashi; Colwell, Christopher S

    2015-03-01

    Circadian behavior in mammals is coordinated by neurons within the suprachiasmatic nucleus (SCN). In this issue, Lee et al. (2015) and Mieda et al. (2015) applied state-of-the-art genetic tools to dissect the microcircuits within the SCN generating circadian rhythmic behavior. PMID:25741718

  16. Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations

    NASA Technical Reports Server (NTRS)

    Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy

    2011-01-01

    This paper reviews three Next Generation Air Transportation System (NextGen) based high fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high altitude environment utilizing high altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.

  17. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
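
    To make the role of the optimizer concrete, the Python sketch below solves a toy constrained layout problem with SciPy. The objective, constraints and bounds are invented stand-ins, not the paper's landing-gear model or its approximation technique.

    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        # x = [longitudinal position, lateral track]; minimize a weight-like surrogate (assumption)
        pos, track = x
        return pos**2 + 0.5 * track**2

    constraints = [
        # hypothetical tip-back style constraint: gear must sit at least 1.5 m aft of a reference point
        {"type": "ineq", "fun": lambda x: x[0] - 1.5},
        # hypothetical overturn style constraint: track must be at least 40% of the gear position
        {"type": "ineq", "fun": lambda x: x[1] - 0.4 * x[0]},
    ]

    result = minimize(objective, x0=np.array([3.0, 2.0]), constraints=constraints)
    print("layout variables:", result.x, "objective:", result.fun)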

  18. General application of rapid 3-D digitizing and tool path generation for complex shapes

    SciTech Connect

    Kwok, K.S.; Loucks, C.S.; Driessen, B.J.

    1997-09-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation and experimental results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm in simulation studies. In actual experiments, a nose cone and a turbine blade were successfully scanned. A complex shaped turbine blade was successfully scanned and finished machined using these algorithms.
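
    The Python sketch below shows, in simplified form, the kind of outlier rejection such a filter performs on scanned profile data before tool-path generation; it uses a plain median filter rather than the paper's fuzzified digital filter, and the data are synthetic.

    import numpy as np
    from scipy.signal import medfilt

    z = np.linspace(0.0, 1.0, 50)                    # nominal scanned heights along a profile
    z[[7, 23, 24, 41]] += [5.0, -4.0, -4.0, 6.0]     # spikes standing in for burrs, holes, overspray
    z_clean = medfilt(z, kernel_size=5)              # isolated spikes shorter than the window are removed
    print("largest residual after filtering:", np.max(np.abs(z_clean - np.linspace(0.0, 1.0, 50))))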

  19. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  20. Fault Simulation and Test Generation for Transistor Shorts Using Stuck-at Test Tools

    NASA Astrophysics Data System (ADS)

    Higami, Yoshinobu; Saluja, Kewal K.; Takahashi, Hiroshi; Kobayashi, Shin-Ya; Takamatsu, Yuzo

    This paper presents methods for detecting transistor short faults using logic level fault simulation and test generation. The paper considers two types of transistor level faults, namely strong shorts and weak shorts, which were introduced in our previous research. These faults are defined based on the values of outputs of faulty gates. The proposed fault simulation and test generation are performed using gate-level tools designed to deal with stuck-at faults, and no transistor-level tools are required. In the test generation process, a circuit is modified by inserting inverters, and a stuck-at test generator is used. The modification of a circuit does not mean a design-for-testability technique, as the modified circuit is used only during the test generation process. Further, generated test patterns are compacted by fault simulation. Also, since the weak short model involves uncertainty in its behavior, we define fault coverage and fault efficiency in three different ways, namely optimistic, pessimistic and probabilistic, and assess them. Finally, experimental results for ISCAS benchmark circuits are used to demonstrate the effectiveness of the proposed methods.
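
    The Python sketch below illustrates the underlying gate-level mechanics on a toy circuit: one net is forced to a stuck value, and input patterns whose outputs differ from the fault-free circuit are collected as tests. The circuit and fault are invented for illustration; the strong/weak short models themselves are not reproduced here.

    from itertools import product

    def simulate(a, b, c, stuck=None):
        """Evaluate a small circuit; `stuck` = (net_name, value) forces one internal net."""
        nets = {"a": a, "b": b, "c": c}
        def val(name, computed):
            if stuck and stuck[0] == name:
                return stuck[1]
            return computed
        nets["n1"] = val("n1", nets["a"] & nets["b"])     # AND gate
        nets["n2"] = val("n2", nets["n1"] | nets["c"])    # OR gate (circuit output)
        return nets["n2"]

    fault = ("n1", 0)   # stuck-at-0 on the AND-gate output
    tests = [p for p in product([0, 1], repeat=3)
             if simulate(*p) != simulate(*p, stuck=fault)]
    print("patterns detecting n1 stuck-at-0:", tests)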

  1. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    PubMed

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools that provide reliable support for decision-making processes. In this paper some indicators, such as number of residents, population age, urban life expectancy and total municipal solid waste, were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition by considering the Iasi, Romania, case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction. PMID:27454099
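
    As a minimal illustration of the S-curve trend modelling mentioned above, the Python sketch below fits a logistic curve to synthetic annual waste totals; the functional form, data and starting values are assumptions for demonstration only.

    import numpy as np
    from scipy.optimize import curve_fit

    def s_curve(t, K, r, t0):
        """Logistic S-curve: K is the saturation level, r the growth rate, t0 the midpoint year."""
        return K / (1.0 + np.exp(-r * (t - t0)))

    years = np.arange(2000, 2013)
    waste = np.array([62, 65, 69, 74, 80, 87, 93, 99, 104, 108, 111, 113, 115], float)  # kt/yr, synthetic

    params, _ = curve_fit(s_curve, years, waste, p0=[120.0, 0.5, 2006.0])
    print("fitted K, r, t0:", params)
    print("forecast for 2016:", s_curve(2016, *params), "kt")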

  2. Developing next-generation telehealth tools and technologies: patients, systems, and data perspectives.

    PubMed

    Ackerman, Michael J; Filart, Rosemarie; Burgess, Lawrence P; Lee, Insup; Poropatich, Ronald K

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth? PMID:20043711

  3. Developing Next-Generation Telehealth Tools and Technologies: Patients, Systems, and Data Perspectives

    PubMed Central

    Filart, Rosemarie; Burgess, Lawrence P.; Lee, Insup; Poropatich, Ronald K.

    2010-01-01

    Abstract The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth? PMID:20043711

  4. Method and tool for generating and managing image quality allocations through the design and development process

    NASA Astrophysics Data System (ADS)

    Sparks, Andrew W.; Olson, Craig; Theisen, Michael J.; Addiego, Chris J.; Hutchins, Tiffany G.; Goodman, Timothy D.

    2016-05-01

    Performance models for infrared imaging systems require image quality parameters; optical design engineers need image quality design goals; systems engineers develop image quality allocations to test imaging systems against. It is a challenge to maintain consistency and traceability amongst the various expressions of image quality. We present a method and parametric tool for generating and managing expressions of image quality during the system modeling, requirements specification, design, and testing phases of an imaging system design and development project.

  5. CHOPCHOP v2: a web tool for the next generation of CRISPR genome engineering.

    PubMed

    Labun, Kornel; Montague, Tessa G; Gagnon, James A; Thyme, Summer B; Valen, Eivind

    2016-07-01

    In just 3 years CRISPR genome editing has transformed biology, and its popularity and potency continue to grow. New CRISPR effectors and rules for locating optimum targets continue to be reported, highlighting the need for computational CRISPR targeting tools to compile these rules and facilitate target selection and design. CHOPCHOP is one of the most widely used web tools for CRISPR- and TALEN-based genome editing. Its overarching principle is to provide an intuitive and powerful tool that can serve both novice and experienced users. In this major update we introduce tools for the next generation of CRISPR advances, including Cpf1 and Cas9 nickases. We support a number of new features that improve the targeting power, usability and efficiency of CHOPCHOP. To increase targeting range and specificity we provide support for custom length sgRNAs, and we evaluate the sequence composition of the whole sgRNA and its surrounding region using models compiled from multiple large-scale studies. These and other new features, coupled with an updated interface for increased usability and support for a continually growing list of organisms, maintain CHOPCHOP as one of the leading tools for CRISPR genome editing. CHOPCHOP v2 can be found at http://chopchop.cbu.uib.no. PMID:27185894
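
    As a simplified illustration of target enumeration (not CHOPCHOP's actual algorithm or scoring), the Python sketch below lists forward-strand SpCas9 candidates as 20-nt protospacers followed by an NGG PAM; the demo sequence is synthetic.

    import re

    def find_cas9_targets(seq):
        """Return forward-strand candidates: 20-nt protospacer immediately followed by an NGG PAM."""
        seq = seq.upper()
        targets = []
        # lookahead so overlapping sites are all reported
        for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
            targets.append({"start": m.start(), "protospacer": m.group(1), "pam": m.group(2)})
        return targets

    demo = "TTGACCTGAAGCTCGGATCCATTGACGTGGCGTACGTAGCTAGCTAGGACGTTGGTACG"
    for t in find_cas9_targets(demo):
        print(t)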

  6. System-level tools and reconfigurable computing for next-generation HWIL systems

    NASA Astrophysics Data System (ADS)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware in the loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing of the data streams, the tools must also complement this so that a systems engineer is able to construct their final system. The paper will present the work in the area of integration of system level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems leading to a tool suite for system development and implementation. As the development tools have evolved the core-processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 x 1024 resolutions at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of tera (10^12) operations per second.

  7. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  8. Automatic generation of bioinformatics tools for predicting protein–ligand binding sites

    PubMed Central

    Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-01-01

    Motivation: Predictive tools that model protein–ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein–ligand binding predictive tools would be useful. Results: We developed a system for automatically generating protein–ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5–1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. Availability and implementation: The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26545824

  9. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  10. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    SciTech Connect

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel; Peake, Tom; Hayes, Colin

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shape-files defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates are then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
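
    The Python sketch below gives a first-order flavour of the volume estimate WEST automates: building stock multiplied by a strategy-dependent debris factor. The factors and building data are invented placeholders, not WEST or Hazus values.

    def estimate_waste_volume(buildings, strategy):
        """buildings: list of dicts with footprint_m2 and stories; strategy: 'strip_surfaces' or 'demolish'."""
        debris_m3_per_m2 = {"strip_surfaces": 0.05, "demolish": 0.9}   # illustrative factors only
        factor = debris_m3_per_m2[strategy]
        return sum(b["footprint_m2"] * b["stories"] * factor for b in buildings)

    # hypothetical contaminated building stock
    contaminated_buildings = [
        {"footprint_m2": 450.0, "stories": 2},
        {"footprint_m2": 1200.0, "stories": 1},
    ]
    for strategy in ("strip_surfaces", "demolish"):
        print(strategy, estimate_waste_volume(contaminated_buildings, strategy), "m3 of debris")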

  11. Automatic generation of conceptual database design tools from data model specifications

    SciTech Connect

    Hong, Shuguang.

    1989-01-01

    The problems faced in the design and implementation of database software systems based on object-oriented data models are similar to those of other software design, i.e., difficult, complex, yet redundant effort. Automatic generation of database software system has been proposed as a solution to the problems. In order to generate database software system for a variety of object-oriented data models, two critical issues: data model specification and software generation, must be addressed. SeaWeed is a software system that automatically generates conceptual database design tools from data model specifications. A meta model has been defined for the specification of a class of object-oriented data models. This meta model provides a set of primitive modeling constructs that can be used to express the semantics, or unique characteristics, of specific data models. Software reusability has been adopted for the software generation. The technique of design reuse is utilized to derive the requirement specification of the software to be generated from data model specifications. The mechanism of code reuse is used to produce the necessary reusable software components. This dissertation presents the research results of SeaWeed including the meta model, data model specification, a formal representation of design reuse and code reuse, and the software generation paradigm.

  12. Geological applications of automatic grid generation tools for finite elements applied to porous flow modeling

    SciTech Connect

    Gable, C.W.; Trease, H.E.; Cherry, T.A.

    1996-04-01

    The construction of grids that accurately reflect geologic structure and stratigraphy for computational flow and transport models poses a formidable task. Even with a complete understanding of stratigraphy, material properties, boundary and initial conditions, the task of incorporating data into a numerical model can be difficult and time consuming. Furthermore, most tools available for representing complex geologic surfaces and volumes are not designed for producing optimal grids for flow and transport computation. We have developed a modeling tool, GEOMESH, for automating finite element grid generation that maintains the geometric integrity of geologic structure and stratigraphy. The method produces an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. The process of developing a flow and transport model can be divided into three parts: (1) Developing accurate conceptual models inclusive of geologic interpretation, material characterization and construction of a stratigraphic and hydrostratigraphic framework model, (2) Building and initializing computational frameworks; grid generation, boundary and initial conditions, (3) Computational physics models of flow and transport. Processes (1) and (3) have received considerable attention whereas (2) has not. This work concentrates on grid generation and its connections to geologic characterization and process modeling. Applications of GEOMESH illustrate grid generation for two dimensional cross sections, three dimensional regional models, and adaptive grid refinement in three dimensions. Examples of grid representation of wells and tunnels with GEOMESH can be found in Cherry et al. The resulting grid can be utilized by unstructured finite element or integrated finite difference models.
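
    As a minimal stand-in for the gridding step (not GEOMESH itself), the Python sketch below builds a Delaunay tetrahedral mesh from scattered points with SciPy; the points are random rather than real stratigraphic data.

    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)
    points = rng.random((200, 3))          # stand-in for stratigraphic control points (x, y, z)
    mesh = Delaunay(points)                # Delaunay tetrahedralization of the point cloud
    print("number of tetrahedra:", len(mesh.simplices))
    print("first cell (point indices):", mesh.simplices[0])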

  13. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques. PMID:21968574

  14. Simulation Tool to Assess Mechanical and Electrical Stresses on Wind Turbine Generators: Preprint

    SciTech Connect

    Singh, M.; Muljadi, E.; Gevorgian, V.; Jonkman, J.

    2013-10-01

    Wind turbine generators (WTGs) consist of many different components to convert kinetic energy of the wind into electrical energy for end users. Wind energy is captured to provide mechanical torque for driving the shaft of the electrical generator. The conversion from wind power to mechanical power is governed by the aerodynamic conversion. The aerodynamic-electrical-conversion efficiency of a WTG is influenced by the efficiency of the blades, the gearbox, the generator, and the power converter. This paper describes the use of MATLAB/Simulink to simulate the electrical and grid-related aspects of a WTG coupled with the FAST aero-elastic wind turbine computer-aided engineering tool to simulate the aerodynamic and mechanical aspects of a WTG. The combination of the two enables studies involving both electrical and mechanical aspects of a WTG. This digest includes some examples of the capabilities of the FAST and MATLAB coupling, namely the effects of electrical faults on the blade moments.
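
    The Python sketch below shows the basic aerodynamic-to-mechanical conversion that the coupled FAST/MATLAB simulation resolves in far more detail: rotor power extracted from the wind and the corresponding shaft torque. All numbers are illustrative assumptions, not values from the preprint.

    import numpy as np

    rho = 1.225          # air density, kg/m^3
    radius = 40.0        # rotor radius, m (assumed)
    cp = 0.45            # power coefficient, below the Betz limit of ~0.593 (assumed)
    wind_speed = 10.0    # m/s (assumed)
    rotor_speed = 1.5    # rad/s (assumed)

    area = np.pi * radius**2
    mech_power = 0.5 * rho * area * cp * wind_speed**3    # aerodynamic power captured, W
    shaft_torque = mech_power / rotor_speed               # torque delivered to the shaft, N*m
    print(f"rotor power {mech_power/1e6:.2f} MW, shaft torque {shaft_torque/1e3:.0f} kN*m")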

  15. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  16. Mutation based treatment recommendations from next generation sequencing data: a comparison of web tools.

    PubMed

    Patel, Jaymin M; Knopf, Joshua; Reiner, Eric; Bossuyt, Veerle; Epstein, Lianne; DiGiovanna, Michael; Chung, Gina; Silber, Andrea; Sanft, Tara; Hofstatter, Erin; Mougalian, Sarah; Abu-Khalaf, Maysa; Platt, James; Shi, Weiwei; Gershkovich, Peter; Hatzis, Christos; Pusztai, Lajos

    2016-04-19

    Interpretation of complex cancer genome data, generated by tumor target profiling platforms, is key for the success of personalized cancer therapy. How to draw therapeutic conclusions from tumor profiling results is not standardized and may vary among commercial and academically-affiliated recommendation tools. We performed targeted sequencing of 315 genes from 75 metastatic breast cancer biopsies using the FoundationOne assay. Results were run through 4 different web tools including the Drug-Gene Interaction Database (DGidb), My Cancer Genome (MCG), Personalized Cancer Therapy (PCT), and cBioPortal, for drug and clinical trial recommendations. These recommendations were compared amongst each other and to those provided by FoundationOne. The identification of a gene as targetable varied across the different recommendation sources. Only 33% of cases had 4 or more sources recommend the same drug for at least one of the usually several altered genes found in tumor biopsies. These results indicate further development and standardization of broadly applicable software tools that assist in our therapeutic interpretation of genomic data is needed. Existing algorithms for data acquisition, integration and interpretation will likely need to incorporate artificial intelligence tools to improve both content and real-time status. PMID:26980737

  17. Metagenomics: Tools and Insights for Analyzing Next-Generation Sequencing Data Derived from Biodiversity Studies

    PubMed Central

    Oulas, Anastasis; Pavloudi, Christina; Polymenakou, Paraskevi; Pavlopoulos, Georgios A; Papanikolaou, Nikolas; Kotoulas, Georgios; Arvanitidis, Christos; Iliopoulos, Ioannis

    2015-01-01

    Advances in next-generation sequencing (NGS) have allowed significant breakthroughs in microbial ecology studies. This has led to the rapid expansion of research in the field and the establishment of “metagenomics”, often defined as the analysis of DNA from microbial communities in environmental samples without prior need for culturing. Many metagenomics statistical/computational tools and databases have been developed in order to allow the exploitation of the huge influx of data. In this review article, we provide an overview of the sequencing technologies and how they are uniquely suited to various types of metagenomic studies. We focus on the currently available bioinformatics techniques, tools, and methodologies for performing each individual step of a typical metagenomic dataset analysis. We also provide future trends in the field with respect to tools and technologies currently under development. Moreover, we discuss data management, distribution, and integration tools that are capable of performing comparative metagenomic analyses of multiple datasets using well-established databases, as well as commonly used annotation standards. PMID:25983555

  18. Mutation based treatment recommendations from next generation sequencing data: a comparison of web tools

    PubMed Central

    Patel, Jaymin M.; Knopf, Joshua; Reiner, Eric; Bossuyt, Veerle; Epstein, Lianne; DiGiovanna, Michael; Chung, Gina; Silber, Andrea; Sanft, Tara; Hofstatter, Erin; Mougalian, Sarah; Abu-Khalaf, Maysa; Platt, James; Shi, Weiwei; Gershkovich, Peter; Hatzis, Christos; Pusztai, Lajos

    2016-01-01

    Interpretation of complex cancer genome data, generated by tumor target profiling platforms, is key for the success of personalized cancer therapy. How to draw therapeutic conclusions from tumor profiling results is not standardized and may vary among commercial and academically-affiliated recommendation tools. We performed targeted sequencing of 315 genes from 75 metastatic breast cancer biopsies using the FoundationOne assay. Results were run through 4 different web tools including the Drug-Gene Interaction Database (DGidb), My Cancer Genome (MCG), Personalized Cancer Therapy (PCT), and cBioPortal, for drug and clinical trial recommendations. These recommendations were compared amongst each other and to those provided by FoundationOne. The identification of a gene as targetable varied across the different recommendation sources. Only 33% of cases had 4 or more sources recommend the same drug for at least one of the usually several altered genes found in tumor biopsies. These results indicate further development and standardization of broadly applicable software tools that assist in our therapeutic interpretation of genomic data is needed. Existing algorithms for data acquisition, integration and interpretation will likely need to incorporate artificial intelligence tools to improve both content and real-time status. PMID:26980737

  19. Angular Determination of Toolmarks Using a Computer-Generated Virtual Tool.

    PubMed

    Spotts, Ryan; Chumbley, L Scott; Ekstrand, Laura; Zhang, Song; Kreiser, James

    2015-07-01

    A blind study to determine whether virtual toolmarks created using a computer could be used to identify and characterize angle of incidence of physical toolmarks was conducted. Six sequentially manufactured screwdriver tips and one random screwdriver were used to create toolmarks at various angles. An apparatus controlled tool angle. Resultant toolmarks were randomly coded and sent to the researchers, who scanned both tips and toolmarks using an optical profilometer to obtain 3D topography data. Developed software was used to create virtual marks based on the tool topography data. Virtual marks generated at angles from 30 to 85° (5° increments) were compared to physical toolmarks using a statistical algorithm. Twenty of twenty toolmarks were correctly identified by the algorithm. On average, the algorithm misidentified the correct angle of incidence by -6.12°. This study presents the results, their significance, and offers reasons for the average angular misidentification. PMID:25929523
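
    The Python sketch below illustrates the matching idea in one dimension: virtual profiles are generated over the 30-85 degree range in 5-degree steps, and the angle whose profile best correlates with a noisy "physical" profile is reported. The profile model and statistic are invented stand-ins for the study's 3D topography algorithm.

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 500)

    def virtual_mark(angle_deg):
        # hypothetical dependence of the striation pattern on angle of incidence
        return np.sin(40 * x * np.cos(np.radians(angle_deg))) + 0.2 * np.sin(90 * x)

    true_angle = 55.0
    physical = virtual_mark(true_angle) + 0.1 * rng.standard_normal(x.size)  # noisy "measured" mark

    candidates = np.arange(30, 90, 5)   # 30..85 degrees in 5-degree increments
    scores = [np.corrcoef(physical, virtual_mark(a))[0, 1] for a in candidates]
    best = candidates[int(np.argmax(scores))]
    print("best-matching angle of incidence:", best, "degrees")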

  20. ModelMage: a tool for automatic model generation, selection and management.

    PubMed

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML-model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating models the software can automatically fit all these models to the data and provides a ranking for model selection, in case data is available. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine. Thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software. PMID:19425122
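
    The core combinatorial idea, stripped of ModelMage's SBML and COPASI handling, can be sketched in Python as enumerating candidate models by dropping user-designated optional reactions from a master model; the reaction names below are hypothetical.

    from itertools import combinations

    master_reactions = ["synthesis", "degradation", "feedback", "transport"]
    optional = ["feedback", "transport"]          # reactions the user allows to be left out

    candidates = []
    for k in range(len(optional) + 1):
        for dropped in combinations(optional, k):
            candidates.append([r for r in master_reactions if r not in dropped])

    for i, model in enumerate(candidates):
        print(f"candidate {i}: {model}")
    # Each candidate would then be fitted to data (e.g. via a simulator such as COPASI)
    # and the fitted models ranked, e.g. by an information criterion.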

  1. Automated protein motif generation in the structure-based protein function prediction tool ProMOL.

    PubMed

    Osipovitch, Mikhail; Lambrecht, Mitchell; Baker, Cameron; Madha, Shariq; Mills, Jeffrey L; Craig, Paul A; Bernstein, Herbert J

    2015-12-01

    ProMOL, a plugin for the PyMOL molecular graphics system, is a structure-based protein function prediction tool. ProMOL includes a set of routines for building motif templates that are used for screening query structures for enzyme active sites. Previously, each motif template was generated manually and required supervision in the optimization of parameters for sensitivity and selectivity. We developed an algorithm and workflow for the automation of motif building and testing routines in ProMOL. The algorithm uses a set of empirically derived parameters for optimization and requires little user intervention. The automated motif generation algorithm was first tested in a performance comparison with a set of manually generated motifs based on identical active sites from the same 112 PDB entries. The two sets of motifs were equally effective in identifying alignments with homologs and in rejecting alignments with unrelated structures. A second set of 296 active site motifs were generated automatically, based on Catalytic Site Atlas entries with literature citations, as an expansion of the library of existing manually generated motif templates. The new motif templates exhibited comparable performance to the existing ones in terms of hit rates against native structures, homologs with the same EC and Pfam designations, and randomly selected unrelated structures with a different EC designation at the first EC digit, as well as in terms of RMSD values obtained from local structural alignments of motifs and query structures. This research is supported by NIH grant GM078077. PMID:26573864

  2. Customised 3D Printing: An Innovative Training Tool for the Next Generation of Orbital Surgeons.

    PubMed

    Scawn, Richard L; Foster, Alex; Lee, Bradford W; Kikkawa, Don O; Korn, Bobby S

    2015-01-01

    Additive manufacturing or 3D printing is the process by which three dimensional data fields are translated into real-life physical representations. 3D printers create physical printouts using heated plastics in a layered fashion resulting in a three-dimensional object. We present a technique for creating customised, inexpensive 3D orbit models for use in orbital surgical training using 3D printing technology. These models allow trainee surgeons to perform 'wet-lab' orbital decompressions and simulate upcoming surgeries on orbital models that replicate a patient's bony anatomy. We believe this represents an innovative training tool for the next generation of orbital surgeons. PMID:26121063

  3. Face acquisition camera design using the NV-IPM image generation tool

    NASA Astrophysics Data System (ADS)

    Howell, Christopher L.; Choi, Hee-Sue; Reynolds, Joseph P.

    2015-05-01

    In this paper, we demonstrate the utility of the Night Vision Integrated Performance Model (NV-IPM) image generation tool by using it to create a database of face images with controlled degradations. Available face recognition algorithms can then be used to directly evaluate camera designs using these degraded images. By controlling camera effects such as blur, noise, and sampling, we can analyze algorithm performance and establish a more complete performance standard for face acquisition cameras. The ability to accurately simulate imagery and directly test with algorithms not only improves the system design process but greatly reduces development cost.
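
    The Python sketch below mimics, in simplified form, the kind of controlled degradation such a database would contain: blur, sampling and noise applied at chosen levels. It uses a synthetic image and is not the NV-IPM image generation chain.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    image = rng.random((128, 128))                 # stand-in for a face image, values in [0, 1]

    def degrade(img, blur_sigma, noise_sigma, downsample):
        out = gaussian_filter(img, sigma=blur_sigma)            # optical blur
        out = out[::downsample, ::downsample]                   # detector sampling
        out = out + rng.normal(0.0, noise_sigma, out.shape)     # sensor noise
        return np.clip(out, 0.0, 1.0)

    for blur in (0.5, 1.5, 3.0):
        d = degrade(image, blur_sigma=blur, noise_sigma=0.02, downsample=2)
        print(f"blur sigma {blur}: degraded image shape {d.shape}")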

  4. Synthetic biology in mammalian cells: Next generation research tools and therapeutics

    PubMed Central

    Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A

    2014-01-01

    Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884

  5. Unexpected benefits of deciding by mind wandering

    PubMed Central

    Giblin, Colleen E.; Morewedge, Carey K.; Norton, Michael I.

    2013-01-01

    The mind wanders, even when people are attempting to make complex decisions. We suggest that mind wandering—allowing one's thoughts to wander until the “correct” choice comes to mind—can positively impact people's feelings about their decisions. We compare post-choice satisfaction from choices made by mind wandering to reason-based choices and randomly assigned outcomes. Participants chose a poster by mind wandering or deliberating, or were randomly assigned a poster. Whereas forecasters predicted that participants who chose by mind wandering would evaluate their outcome as inferior to participants who deliberated (Experiment 1), participants who used mind wandering as a decision strategy evaluated their choice just as positively as did participants who used deliberation (Experiment 2). In some cases, it appears that people can spare themselves the effort of deliberation and instead “decide by mind wandering,” yet experience no decrease in satisfaction. PMID:24046760

  6. Experiences with the application of the ADIC automatic differentiation tool to the CSCMDO 3-D volume grid generation code

    SciTech Connect

    Bischof, C.H.; Mauer, A.; Jones, W.T.

    1995-12-31

    Automatic differentiation (AD) is a methodology for developing reliable sensitivity-enhanced versions of arbitrary computer programs with little human effort. It can vastly accelerate the use of advanced simulation codes in multidisciplinary design optimization, since the time for generating and verifying derivative codes is greatly reduced. In this paper, we report on the application of the recently developed ADIC automatic differentiation tool for ANSI C programs to the CSCMDO multiblock three-dimensional volume grid generator. The ADIC-generated code can easily be interfaced with Fortran derivative codes generated with the ADIFOR AD tool for FORTRAN 77 programs, thus providing efficient sensitivity-enhancement techniques for multilanguage, multidiscipline problems.
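
    The essence of what tools like ADIC and ADIFOR automate can be sketched with forward-mode dual numbers, as below in Python; the real tools instead transform C or Fortran source, and the differentiated function here is an invented toy.

    import math

    class Dual:
        """Value plus derivative, propagated through arithmetic (forward-mode AD)."""
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)
        __rmul__ = __mul__

    def sin(x):
        return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

    def grid_metric(x):
        # toy stand-in for a grid-generation quantity whose sensitivity is wanted
        return x * x + 3.0 * sin(x)

    x = Dual(2.0, 1.0)                 # seed derivative d(x)/d(x) = 1
    y = grid_metric(x)
    print("value:", y.value, "d/dx:", y.deriv)   # derivative equals 2*x + 3*cos(x)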

  7. Gravity Waves Generated by Convection: A New Idealized Model Tool and Direct Validation with Satellite Observations

    NASA Astrophysics Data System (ADS)

    Alexander, M. Joan; Stephan, Claudia

    2015-04-01

    In climate models, gravity waves remain too poorly resolved to be directly modelled. Instead, simplified parameterizations are used to include gravity wave effects on model winds. A few climate models link some of the parameterized waves to convective sources, providing a mechanism for feedback between changes in convection and gravity wave-driven changes in circulation in the tropics and above high-latitude storms. These convective wave parameterizations are based on limited case studies with cloud-resolving models, but they are poorly constrained by observational validation, and tuning parameters have large uncertainties. Our new work distills results from complex, full-physics cloud-resolving model studies to essential variables for gravity wave generation. We use the Weather Research and Forecasting (WRF) model to study relationships between precipitation, latent heating/cooling and other cloud properties and the spectrum of gravity wave momentum flux above midlatitude storm systems. Results show the gravity wave spectrum is surprisingly insensitive to the representation of microphysics in WRF. This is good news for use of these models for gravity wave parameterization development since microphysical properties are a key uncertainty. We further use the full-physics cloud-resolving model as a tool to directly link observed precipitation variability to gravity wave generation. We show that waves in an idealized model forced with radar-observed precipitation can quantitatively reproduce instantaneous satellite-observed features of the gravity wave field above storms, which is a powerful validation of our understanding of waves generated by convection. The idealized model directly links observations of surface precipitation to observed waves in the stratosphere, and the simplicity of the model permits deep/large-area domains for studies of wave-mean flow interactions. This unique validated model tool permits quantitative studies of gravity wave driving of regional

  8. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    NASA Astrophysics Data System (ADS)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunami, because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of the population and infrastructures on the water body shores and downstream valleys could lead to catastrophic consequences. In order to assess comprehensively this phenomenon together with the induced risks, we have developed a tool which allows the construction of the landslide geometry, and which is able to simulate its propagation, the generation and the propagation of the wave and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab® environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps, each involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is performed by the combination of the two latter sets of equations. The intensity map is based on the criterion of flooding in Switzerland provided by the OFEG and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less-known cases, various failure plane geometries can be automatically built within a given range and thus a multi-scenario approach is used. In any case, less-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi
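
    As a minimal illustration of the wave-propagation step only (flat bed, periodic boundaries, no wet/dry transition and no SLBL geometry), the Python sketch below advances the one-dimensional shallow-water equations with the Lax-Friedrichs scheme; all parameters are invented.

    import numpy as np

    g, nx, dx = 9.81, 200, 5.0            # gravity, number of cells, cell size (m)
    x = np.arange(nx) * dx
    h = 10.0 + 2.0 * np.exp(-((x - 500.0) / 50.0) ** 2)   # initial hump standing in for the wave source
    hu = np.zeros(nx)                                      # water initially at rest

    def flux(h, hu):
        u = hu / h
        return hu, hu * u + 0.5 * g * h**2                 # mass and momentum fluxes

    t, t_end = 0.0, 60.0
    while t < t_end:
        c = np.max(np.abs(hu / h) + np.sqrt(g * h))
        dt = 0.9 * dx / c                                  # CFL-limited time step
        f1, f2 = flux(h, hu)
        # Lax-Friedrichs update with periodic boundaries
        h_new  = 0.5 * (np.roll(h, -1) + np.roll(h, 1))   - dt / (2 * dx) * (np.roll(f1, -1) - np.roll(f1, 1))
        hu_new = 0.5 * (np.roll(hu, -1) + np.roll(hu, 1)) - dt / (2 * dx) * (np.roll(f2, -1) - np.roll(f2, 1))
        h, hu = h_new, hu_new
        t += dt

    print("max surface elevation after 60 s:", np.max(h) - 10.0, "m")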

  9. Developing a tool to estimate water withdrawal and consumption in electricity generation in the United States.

    SciTech Connect

    Wu, M.; Peng, J.

    2011-02-24

    Freshwater consumption for electricity generation is projected to increase dramatically in the next couple of decades in the United States. The increased demand is likely to further strain freshwater resources in regions where water has already become scarce. Meanwhile, the automotive industry has stepped up its research, development, and deployment efforts on electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs). Large-scale, escalated production of EVs and PHEVs nationwide would require increased electricity production, and so meeting the water demand becomes an even greater challenge. The goal of this study is to provide a baseline assessment of freshwater use in electricity generation in the United States and at the state level. Freshwater withdrawal and consumption requirements for power generated from fossil, nonfossil, and renewable sources via various technologies and by use of different cooling systems are examined. A data inventory has been developed that compiles data from government statistics, reports, and literature issued by major research institutes. A spreadsheet-based model has been developed to conduct the estimates by means of a transparent and interactive process. The model further allows us to project future water withdrawal and consumption in electricity production under the forecasted increases in demand. This tool is intended to provide decision makers with the means to make a quick comparison among various fuel, technology, and cooling system options. The model output can be used to address water resource sustainability when considering new projects or expansion of existing plants.
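
    The Python sketch below shows the kind of calculation such a tool automates: generation by fuel and cooling category multiplied by water-intensity factors. The factors and generation figures are illustrative placeholders, not values from the study's data inventory.

    # annual generation by category, MWh (hypothetical)
    generation_mwh = {"coal_recirculating": 5.0e6, "gas_cc_recirculating": 3.0e6, "wind": 1.0e6}

    # (withdrawal, consumption) in gallons per MWh -- placeholders for illustration only
    water_intensity = {
        "coal_recirculating":   (1000.0, 700.0),
        "gas_cc_recirculating": (250.0, 180.0),
        "wind":                 (0.0, 0.0),
    }

    withdrawal = sum(generation_mwh[k] * water_intensity[k][0] for k in generation_mwh)
    consumption = sum(generation_mwh[k] * water_intensity[k][1] for k in generation_mwh)
    print(f"withdrawal: {withdrawal/1e9:.2f} billion gal, consumption: {consumption/1e9:.2f} billion gal")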

  10. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state- of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
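
    As a simplified illustration of inductive rule generation from expert-classified data (not the SDB's own induction algorithm), the Python sketch below trains a decision tree on synthetic labelled sensor readings and prints the learned rules in human-readable form.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    # synthetic sensor readings: [temperature, pressure]; expert labels: 0 = nominal, 1 = fault
    X = rng.random((200, 2)) * [100.0, 10.0]
    y = ((X[:, 0] > 80.0) & (X[:, 1] < 3.0)).astype(int)   # hidden behaviour the "expert" recognises

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(export_text(tree, feature_names=["temperature", "pressure"]))  # rule base for a monitor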

  11. New generation of medium wattage metal halide lamps and spectroscopic tools for their diagnostics

    NASA Astrophysics Data System (ADS)

    Dunaevsky, A.; Tu, J.; Gibson, R.; Steere, T.; Graham, K.; van der Eyden, J.

    2010-11-01

    A new generation of ceramic metal halide high intensity discharge (HID) lamps has achieved high efficiencies by implementing new design concepts. The shape of the ceramic burner is optimized to withstand high temperatures with minimal thermal stress. Corrosion of the ceramic walls is slowed down by adopting a non-aggressive metal halide chemistry. Light losses over life due to tungsten deposition on the walls are minimized by maintaining a self-cleaning chemical process known as the tungsten cycle. All these advancements have made the new ceramic metal halide lamps comparable to high pressure sodium lamps in luminous efficacy, life, and maintenance while providing white light with high color rendering. Direct replacement of quartz metal halide lamps and systems results in energy savings of 18% up to 50%. High-resolution spectroscopy remains the major non-destructive diagnostic tool for ceramic metal halide lamps. Approaches to reliable measurement of the relative partial pressures of the arc species are discussed.

  12. Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools

    PubMed Central

    Ozcan, Aydogan

    2014-01-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550

  13. Numerical study of electromagnetic waves generated by a prototype dielectric logging tool

    USGS Publications Warehouse

    Ellefsen, K.J.; Abraham, J.D.; Wright, D.L.; Mazzella, A.T.

    2004-01-01

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than that in the formation (e.g., an air-filled borehole in the unsaturated zone), only a guided wave propagated along the borehole. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave radiated electromagnetic energy into the formation, causing its amplitude to decrease. When the propagation velocity in the borehole was less than that in the formation (e.g., a water-filled borehole in the saturated zone), both a refracted wave and a guided wave propagated along the borehole. The velocity of the refracted wave equaled the phase velocity of a plane wave in the formation, and the refracted wave preceded the guided wave. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave did not radiate electromagnetic energy into the formation. To analyze traces recorded by the prototype tool during laboratory tests, they were compared to traces calculated with the finite-difference method. The first parts of both the recorded and the calculated traces were similar, indicating that guided and refracted waves indeed propagated along the prototype tool. © 2004 Society of Exploration Geophysicists. All rights reserved.
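
    For readers unfamiliar with the finite-difference, time-domain method mentioned above, a textbook one-dimensional Yee update is sketched below; the actual borehole modeling is multidimensional and includes material properties, so this is only an illustration with assumed parameters.

        import numpy as np

        # Normalized 1D FDTD update for Ez/Hy in free space (Yee scheme).
        nz, nt = 400, 600
        ez = np.zeros(nz)
        hy = np.zeros(nz)
        courant = 0.5                      # dt * c / dz

        for n in range(nt):
            hy[:-1] += courant * (ez[1:] - ez[:-1])
            ez[1:]  += courant * (hy[1:] - hy[:-1])
            ez[nz // 4] += np.exp(-((n - 60) / 15.0) ** 2)   # soft Gaussian source
        print("peak |Ez| at final step:", np.abs(ez).max())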

  14. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    SciTech Connect

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan; Saha, Pradip; Loewen, Eric

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: Status of analysis model development; Improvements made to older simulations; and Comparison to experimental data.

  15. A Practical Comparison of De Novo Genome Assembly Software Tools for Next-Generation Sequencing Technologies

    PubMed Central

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software packages, especially for de novo fragment assembly. Because little is known about the applicability and performance of these software tools, choosing a suitable assembler is a tough task. Here, we provide information on the adaptability of each program and, above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering computational time, maximum random access memory (RAM) occupancy, and assembly accuracy and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers would be more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, among which SOAPdenovo requires a relatively complex configuration file. Our comparison study will assist researchers in selecting a well-suited assembler and offers essential information for the improvement of existing assemblers or the development of novel assemblers. PMID:21423806
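
    As a small illustration of the De Bruijn graph construction underlying the graph-based assemblers in this comparison (not any specific tool's implementation), consider:

        from collections import defaultdict

        # Toy De Bruijn graph: nodes are (k-1)-mers, edges follow consecutive k-mers in reads.
        def de_bruijn(reads, k):
            graph = defaultdict(list)     # (k-1)-mer -> list of successor (k-1)-mers
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmer = read[i:i + k]
                    graph[kmer[:-1]].append(kmer[1:])
            return graph

        reads = ["ACGTACGA", "CGTACGAT", "GTACGATT"]
        for node, succs in de_bruijn(reads, 4).items():
            print(node, "->", succs)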

  16. Virtual Geographic Environments (VGEs): A New Generation of Geographic Analysis Tool

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Chen, Min; Lu, Guonian; Zhu, Qing; Gong, Jiahua; You, Xiong; Wen, Yongning; Xu, Bingli; Hu, Mingyuan

    2013-11-01

    Virtual Geographic Environments (VGEs) are proposed as a new generation of geographic analysis tool to contribute to human understanding of the geographic world and assist in solving geographic problems at a deeper level. The development of VGEs is focused on meeting the three scientific requirements of Geographic Information Science (GIScience) — multi-dimensional visualization, dynamic phenomenon simulation, and public participation. To provide a clearer image that improves user understanding of VGEs and to contribute to future scientific development, this article reviews several aspects of VGEs. First, the evolutionary process from maps to previous GISystems and then to VGEs is illustrated, with a particular focus on the reasons VGEs were created. Then, extended from the conceptual framework and the components of a complete VGE, three use cases are identified that together encompass the current state of VGEs at different application levels: 1) a tool for geo-object-based multi-dimensional spatial analysis and multi-channel interaction, 2) a platform for geo-process-based simulation of dynamic geographic phenomena, and 3) a workspace for multi-participant-based collaborative geographic experiments. Based on the above analysis, the differences between VGEs and other similar platforms are discussed to draw their clear boundaries. Finally, a short summary of the limitations of current VGEs is given, and future directions are proposed to facilitate ongoing progress toward forming a comprehensive version of VGEs.

  17. Recombineering strategies for developing next generation BAC transgenic tools for optogenetics and beyond

    PubMed Central

    Ting, Jonathan T.; Feng, Guoping

    2014-01-01

    The development and application of diverse BAC transgenic rodent lines has enabled rapid progress for precise molecular targeting of genetically-defined cell types in the mammalian central nervous system. These transgenic tools have played a central role in the optogenetic revolution in neuroscience. Indeed, an overwhelming proportion of studies in this field have made use of BAC transgenic Cre driver lines to achieve targeted expression of optogenetic probes in the brain. In addition, several BAC transgenic mouse lines have been established for direct cell-type specific expression of Channelrhodopsin-2 (ChR2). While the benefits of these new tools largely outweigh any accompanying challenges, many available BAC transgenic lines may suffer from confounds due in part to increased gene dosage of one or more “extra” genes contained within the large BAC DNA sequences. Here we discuss this under-appreciated issue and propose strategies for developing the next generation of BAC transgenic lines that are devoid of extra genes. Furthermore, we provide evidence that these strategies are simple, reproducible, and do not disrupt the intended cell-type specific transgene expression patterns for several distinct BAC clones. These strategies may be widely implemented for improved BAC transgenesis across diverse disciplines. PMID:24772073

  18. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in management of systems. For monitoring, diagnosis, and/or controlling purposes, circuit or systems diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus status of the system itself can be displayed. The Living Color Frame Maker is user friendly with graphical interfaces, and provides on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette. The contents of the diskette are compressed using the PKWARE archiving tools

  19. gLAB-A Fully Software Tool to Generate, Process and Analyze GNSS Signals

    NASA Astrophysics Data System (ADS)

    Dionisio, Cesare; Citterico, Dario; Pirazzi, Gabriele; De Quattro, Nicola; Marracci, Riccardo; Cucchi, Luca; Valdambrini, Nicola; Formaioni, Irene

    2010-08-01

    In this paper the concept of Software Defined Radio (SDR) and its use in modern GNSS receivers is highlighted, demonstrating how software receivers are important in many situations, especially for verification and validation. After a brief introduction to gLab, a fully software, highly modular tool to generate, process and analyze current and future GNSS signals, the different software modules are described. To demonstrate the wide range of uses of gLab, several practical examples are briefly overviewed: from the analysis of real data from the experimental GIOVE-B satellite, to antenna group delay determination and C/N0 estimation over a wide dynamic range. gLab is the result of different projects led by Intecs in GNSS software radio: the signal generator is the result of the SWAN (Sistemi softWare per Applicazioni di Navigazione) project under Italian Space Agency (ASI) contract, while the analyzer and the processing module have been developed for ESA to verify and validate the IOV (In Orbit Validation) Galileo phase. In this case the GNSS software receiver works in parallel with Test User Receivers (TUR) in order to validate the Signal In Space (SiS). It is remarkable that gLab is the result of over three years of development and approximately one year of test and validation under ESA (European Space Agency) supervision.

  20. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    NASA Astrophysics Data System (ADS)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of the design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactic environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce holograms of more complex objects. Many examples of generated CGHs are presented.
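
    A hedged sketch of the ray-tracing idea described above: one object point is treated as a spherical-wave source, interfered with a tilted plane reference wave, and the intensity is thresholded into a binary pattern suitable for a high-resolution printer. Wavelength, pixel pitch and geometry are illustrative assumptions, not values from the paper.

        import numpy as np

        wavelength = 633e-9                      # HeNe, metres
        k = 2 * np.pi / wavelength
        n = 1024                                 # hologram samples per side
        pitch = 10e-6                            # printer spot size, metres
        x = (np.arange(n) - n / 2) * pitch
        X, Y = np.meshgrid(x, x)

        src = np.array([0.0, 0.0, 0.2])          # object point 20 cm behind the plane
        r = np.sqrt((X - src[0])**2 + (Y - src[1])**2 + src[2]**2)
        object_wave = np.exp(1j * k * r) / r                         # spherical wave
        reference = np.exp(1j * k * np.sin(np.radians(1.0)) * X)     # tilted plane wave

        intensity = np.abs(object_wave + reference) ** 2
        binary_cgh = (intensity > np.median(intensity)).astype(np.uint8)
        print("transparent fraction:", binary_cgh.mean())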

  1. Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool

    NASA Astrophysics Data System (ADS)

    Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin

    2016-02-01

    The majority of existing scheduling techniques are based on static demand and deterministic processing times, while most job shop scheduling problems are concerned with dynamic demand and stochastic processing times. As a consequence, the solutions obtained from traditional scheduling techniques are ineffective whenever changes occur in the system. Therefore, this research intends to develop a decision support tool (DST) based on promising artificial intelligence techniques that is able to accommodate the dynamics that regularly occur in job shop scheduling problems. The DST was designed in three phases, i.e. (i) look-up table generation, (ii) inverse model development and (iii) integration of the DST components. This paper reports the generation of look-up tables for various scenarios as a part of the development of the DST. A discrete event simulation model was used to compare the performance of the SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measures of various scheduling scenarios and the job order requirements will be mapped using an ANN inverse model.
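
    The sketch below shows, for a single machine rather than a full job shop simulation, how dispatching rules such as FCFS, SPT and EDD translate into the mean flow time, tardiness and lateness figures that populate such look-up tables; the job data are invented.

        # Single-machine bookkeeping for three dispatching rules.
        def evaluate(jobs, key):
            t, flow, tard, late = 0.0, [], [], []
            for p, due in sorted(jobs, key=key):
                t += p                       # completion time on the single machine
                flow.append(t)
                late.append(t - due)
                tard.append(max(0.0, t - due))
            n = len(jobs)
            return sum(flow) / n, sum(tard) / n, sum(late) / n

        jobs = [(5.0, 12.0), (2.0, 6.0), (8.0, 20.0), (3.0, 9.0)]   # (processing time, due date)
        rules = {
            "FCFS": lambda j: 0,             # keep arrival order (stable sort)
            "SPT":  lambda j: j[0],          # shortest processing time first
            "EDD":  lambda j: j[1],          # earliest due date first
        }
        for name, key in rules.items():
            print(name, "-> mean flow, tardiness, lateness:", evaluate(jobs, key))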

  2. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damage, depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take into account different biological conditions. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in the Monte Carlo simulation. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels representing heterochromatin and euchromatin. The fibres were imported into a Geant4 application where computations were performed to estimate the influence of the DNA compaction on the amount of calculated DNA damage. The relative difference of the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, then the relative difference is larger at low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular levels.

  3. Evaluating an image-fusion algorithm with synthetic-image-generation tools

    NASA Astrophysics Data System (ADS)

    Gross, Harry N.; Schott, John R.

    1996-06-01

    An algorithm that combines spectral mixing and nonlinear optimization is used to fuse multiresolution images. Image fusion merges images of different spatial and spectral resolutions to create a high spatial resolution multispectral combination. High spectral resolution allows identification of materials in the scene, while high spatial resolution locates those materials. In this algorithm, conventional spectral mixing estimates the percentage of each material (called endmembers) within each low resolution pixel. Three spectral mixing models are compared; unconstrained, partially constrained, and fully constrained. In the partially constrained application, the endmember fractions are required to sum to one. In the fully constrained application, all fractions are additionally required to lie between zero and one. While negative fractions seem inappropriate, they can arise from random spectral realizations of the materials. In the second part of the algorithm, the low resolution fractions are used as inputs to a constrained nonlinear optimization that calculates the endmember fractions for the high resolution pixels. The constraints mirror the low resolution constraints and maintain consistency with the low resolution fraction results. The algorithm can use one or more higher resolution sharpening images to locate the endmembers to high spatial accuracy. The algorithm was evaluated with synthetic image generation (SIG) tools. A SIG developed image can be used to control the various error sources that are likely to impair the algorithm performance. These error sources include atmospheric effects, mismodeled spectral endmembers, and variability in topography and illumination. By controlling the introduction of these errors, the robustness of the algorithm can be studied and improved upon. The motivation for this research is to take advantage of the next generation of multi/hyperspectral sensors. Although the hyperspectral images will be of modest to low resolution
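
    A minimal sketch of the linear spectral mixing step for one low-resolution pixel, assuming synthetic endmember spectra; the sum-to-one constraint is imposed softly by an appended, heavily weighted row (a common trick), and the fully constrained case additionally bounds fractions to [0, 1]. This is not the authors' optimization code.

        import numpy as np
        from scipy.optimize import lsq_linear

        rng = np.random.default_rng(0)
        E = rng.random((6, 3))                     # 6 bands, 3 endmember spectra (synthetic)
        f_true = np.array([0.6, 0.3, 0.1])
        pixel = E @ f_true + 0.01 * rng.standard_normal(6)

        w = 1e3                                    # weight on the sum-to-one equation
        A = np.vstack([E, w * np.ones((1, 3))])
        b = np.concatenate([pixel, [w]])

        unconstrained = np.linalg.lstsq(E, pixel, rcond=None)[0]
        fully = lsq_linear(A, b, bounds=(0.0, 1.0)).x    # sum-to-one (soft) and 0..1 bounds
        print("unconstrained:", unconstrained)
        print("fully constrained:", fully)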

  4. Diagnostic tool for red blood cell membrane disorders: Assessment of a new generation ektacytometer.

    PubMed

    Da Costa, Lydie; Suner, Ludovic; Galimand, Julie; Bonnel, Amandine; Pascreau, Tiffany; Couque, Nathalie; Fenneteau, Odile; Mohandas, Narla

    2016-01-01

    Inherited red blood cell (RBC) membrane disorders, such as hereditary spherocytosis, elliptocytosis and hereditary ovalocytosis, result from mutations in genes encoding various RBC membrane and skeletal proteins. The RBC membrane, a composite structure composed of a lipid bilayer linked to a spectrin/actin-based membrane skeleton, confers upon the RBC unique features of deformability and mechanical stability. The disease severity is primarily dependent on the extent of membrane surface area loss. RBC membrane disorders can be readily diagnosed by various laboratory approaches that include RBC cytology, flow cytometry, ektacytometry, electrophoresis of RBC membrane proteins and genetics. The reference technique for diagnosis of RBC membrane disorders is osmotic gradient ektacytometry. However, in spite of its recognition as the reference technique, it is rarely used as a routine diagnostic tool for RBC membrane disorders due to its limited availability. This may soon change, as a new generation of ektacytometer has recently been engineered. In this review, we describe the workflow of the samples shipped to our Hematology laboratory for RBC membrane disorder analysis and the data obtained for a large cohort of French patients presenting with RBC membrane disorders using a newly available version of the ektacytometer. PMID:26603718

  5. DDBJ launches a new archive database with analytical tools for next-generation sequence data.

    PubMed

    Kaminuma, Eli; Mashima, Jun; Kodama, Yuichi; Gojobori, Takashi; Ogasawara, Osamu; Okubo, Kousaku; Takagi, Toshihisa; Nakamura, Yasukazu

    2010-01-01

    The DNA Data Bank of Japan (DDBJ) (http://www.ddbj.nig.ac.jp) has collected and released 1,701,110 entries/1,116,138,614 bases between July 2008 and June 2009. A few highlighted data releases from DDBJ were the complete genome sequence of an endosymbiont within protist cells in the termite gut and Cap Analysis Gene Expression tags for human and mouse deposited from the Functional Annotation of the Mammalian cDNA consortium. In this period, we started a novel user announcement service using Really Simple Syndication (RSS) to deliver a list of data released from DDBJ on a daily basis. Comprehensive visualization of a DDBJ release data was attempted by using a word cloud program. Moreover, a new archive for sequencing data from next-generation sequencers, the 'DDBJ Read Archive' (DRA), was launched. Concurrently, for read data registered in DRA, a semi-automatic annotation tool called the 'DDBJ Read Annotation Pipeline' was released as a preliminary step. The pipeline consists of two parts: basic analysis for reference genome mapping and de novo assembly and high-level analysis of structural and functional annotations. These new services will aid users' research and provide easier access to DDBJ databases. PMID:19850725

  6. An accurate tool for the fast generation of dark matter halo catalogues

    NASA Astrophysics Data System (ADS)

    Monaco, P.; Sefusatti, E.; Borgani, S.; Crocce, M.; Fosalba, P.; Sheth, R. K.; Theuns, T.

    2013-08-01

    We present a new parallel implementation of the PINpointing Orbit Crossing-Collapsed HIerarchical Objects (PINOCCHIO) algorithm, a quick tool, based on Lagrangian Perturbation Theory, for the hierarchical build-up of dark matter (DM) haloes in cosmological volumes. To assess its ability to predict halo correlations on large scales, we compare its results with those of an N-body simulation of a 3 h^-1 Gpc box sampled with 2048^3 particles taken from the MICE suite, matching the same seeds for the initial conditions. Thanks to the Fastest Fourier Transforms in the West (FFTW) libraries and to the relatively simple design, the code shows very good scaling properties. The CPU time required by PINOCCHIO is a tiny fraction (~1/2000) of that required by the MICE simulation. Varying some of PINOCCHIO's numerical parameters allows one to produce a universal mass function that lies in the range allowed by published fits, although it underestimates the MICE mass function of Friends-of-Friends (FoF) haloes in the high-mass tail. We compare the matter-halo and the halo-halo power spectra with those of the MICE simulation and find that these two-point statistics are well recovered on large scales. In particular, when catalogues are matched in number density, agreement within 10 per cent is achieved for the halo power spectrum. At scales k > 0.1 h Mpc^-1, the inaccuracy of the Zel'dovich approximation in locating halo positions causes an underestimate of the power spectrum that can be modelled as a Gaussian factor with a damping scale of d = 3 h^-1 Mpc at z = 0, decreasing at higher redshift. Finally, a remarkable match is obtained for the reduced halo bispectrum, showing a good description of non-linear halo bias. Our results demonstrate the potential of PINOCCHIO as an accurate and flexible tool for generating large ensembles of mock galaxy surveys, with interesting applications for the analysis of large galaxy redshift surveys.
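
    One plausible reading of the small-scale damping model quoted above is a Gaussian suppression factor governed by the displacement error scale d; the exact functional form and normalization below are assumptions of this sketch, not taken from the paper.

        import numpy as np

        # Gaussian damping applied to a model halo power spectrum relative to the N-body one.
        d = 3.0                                     # damping scale, Mpc/h (z = 0)
        k = np.logspace(-2, 0, 50)                  # wavenumbers, h/Mpc
        damping = np.exp(-(k * d) ** 2)             # assumed suppression factor
        print("suppression at k = 0.1, 0.3 h/Mpc:", np.interp([0.1, 0.3], k, damping))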

  7. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically it aims to build a high fidelity tabletop model that can be used for the purpose of risk mitigation, failure mode analysis, contamination tracking, and testing reliability. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. It envisions a 10% scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched index of refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities to be 67% of full-scale and thereby the time scale of the model to represent 15% of the full-scale system; meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index matching fluid (fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model (with complex internal geometry) transparent and hence conducive to non-intrusive optical diagnostics. So using such a system one can test environment control parameters such as core flows (axial flows), cross flows (from registers and diffusers), potential problem areas such as flow short circuits, inadequate oxygen content, build up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments and assess the overall system performance. The system allows quantitative measurements of contaminants introduced in the system and allows testing and optimizing the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration change and subsequent testing. The data
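
    The quoted scaling can be checked with a quick Reynolds-number similitude estimate, assuming nominal kinematic viscosities for air and water; this is only a back-of-the-envelope verification of the 67% and 15% figures above.

        # Matching Reynolds number between the full-scale air-filled cabin and a 1:10
        # water-filled model gives the model velocity and time scale.
        nu_air, nu_water = 1.5e-5, 1.0e-6          # kinematic viscosity, m^2/s (nominal)
        length_ratio = 0.1                          # model length / full-scale length

        velocity_ratio = (nu_water / nu_air) / length_ratio     # V_model / V_full for Re_m = Re_f
        time_ratio = length_ratio / velocity_ratio               # (L_m/V_m) / (L_f/V_f)
        print(f"model velocity = {velocity_ratio:.0%} of full scale")   # ~67%
        print(f"model time scale = {time_ratio:.0%} of full scale")     # ~15%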

  8. Re-Imagining Specialized STEM Academies: Igniting and Nurturing "Decidedly Different Minds", by Design

    ERIC Educational Resources Information Center

    Marshall, Stephanie Pace

    2010-01-01

    This article offers a personal vision and conceptual design for reimagining specialized science, technology, engineering, and mathematics (STEM) academies designed to nurture "decidedly different" STEM minds and ignite a new generation of global STEM talent, innovation, and entrepreneurial leadership. This design enables students to engage…

  9. Existing computer applications, maintain or redesign: how to decide

    SciTech Connect

    Brice, L.

    1981-01-01

    Maintenance of large applications programs is an aspect of performance management that has been largely ignored by those studies that attempt to bring structure to the software production environment. Maintenance in this paper means: fixing bugs, modifying current design features, adding enhancements, and porting applications to other computer systems. It is often difficult to decide whether to maintain or redesign. One reason for the difficulty is that good models and methods do not exist for differentiating between those programs that should be maintained and those that should be redesigned. This enigma is illustrated by the description of a large application case study. The application was monitored for maintenance effort, thereby providing some insight into the redesign/maintain decision. Those tools which currently exist for the collection and measurement of performance data are highlighted. Suggestions are then made for yet other categories of data, difficult to collect and measure, yet ultimately necessary for the establishment of accurate predictions about the value of maintaining versus the value of redesigning. Finally, it is concluded that this aspect of performance management deserves increased attention in order to establish better guidelines with which to aid management in making the necessary but difficult decision: maintain or redesign.

  10. Career Cruising Impact on the Self Efficacy of Deciding Majors

    ERIC Educational Resources Information Center

    Smother, Anthony William

    2012-01-01

    The purpose of this study was to analyze the impact of "Career Cruising"© on self-efficacy of deciding majors in a university setting. The use of the self-assessment instrument, "Career Cruising"©, was used with measuring the career-decision making self-efficacy in a pre and post-test with deciding majors. The independent…

  11. Tools for Generating Useful Time-series Data from PhenoCam Images

    NASA Astrophysics Data System (ADS)

    Milliman, T. E.; Friedl, M. A.; Frolking, S.; Hufkens, K.; Klosterman, S.; Richardson, A. D.; Toomey, M. P.

    2012-12-01

    The PhenoCam project (http://phenocam.unh.edu/) is tasked with acquiring, processing, and archiving digital repeat photography to be used for scientific studies of vegetation phenological processes. Over the past 5 years the PhenoCam project has collected over 2 million time series images for a total of over 700 GB of image data. Several papers have been published describing derived "vegetation indices" (such as green-chromatic-coordinate or gcc) which can be compared to standard measures such as NDVI or EVI. Imagery from our archive is available for download but converting series of images for a particular camera into useful scientific data, while simple in principle, is complicated by a variety of factors. Cameras are often exposed to harsh weather conditions (high wind, rain, ice, snow pile up), which result in images where the field of view (FOV) is partially obscured or completely blocked for periods of time. The FOV can also change for other reasons (mount failures, tower maintenance, etc.). Some of the relatively inexpensive cameras that are being used can also temporarily lose color balance or exposure controls resulting in loss of imagery. All these factors negatively influence the automated analysis of the image time series making this a non-trivial task. Here we discuss the challenges of processing PhenoCam image time-series for vegetation monitoring and the associated data management tasks. We describe our current processing framework and a simple standardized output format for the resulting time-series data. The time-series data in this format will be generated for specific "regions of interest" (ROI's) for each of the cameras in the PhenoCam network. This standardized output (which will be updated daily) can be considered 'the pulse' of a particular camera and will provide a default phenological dynamic for said camera. The time-series data can also be viewed as a higher level product which can be used to generate "vegetation indices", like gcc, for
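
    As an illustration of the kind of "vegetation index" mentioned above, the green chromatic coordinate (gcc) for a region of interest is simply the mean of G/(R+G+B) over the ROI pixels; the image and ROI below are synthetic stand-ins, not PhenoCam data or the project's processing code.

        import numpy as np

        def gcc(image, roi_mask):
            """Mean green chromatic coordinate over the ROI of an RGB image."""
            rgb = image[roi_mask].astype(float)          # (n_pixels, 3) within the ROI
            total = rgb.sum(axis=1)
            valid = total > 0                            # skip pure-black pixels
            return float(np.mean(rgb[valid, 1] / total[valid]))

        image = np.random.randint(0, 255, size=(480, 640, 3), dtype=np.uint8)
        roi_mask = np.zeros((480, 640), dtype=bool)
        roi_mask[100:300, 200:500] = True                # hypothetical vegetated region
        print("gcc:", gcc(image, roi_mask))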

  12. Systems Prototyping with Fourth Generation Tools: One Answer to the Productivity Puzzle? AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis A.

    The development of information systems using an engineering approach employing both traditional programming techniques and nonprocedural languages is described. A fourth generation application tool is used to develop a prototype system that is revised and expanded as the user clarifies individual requirements. When fully defined, a combination of…

  13. Generating Animal and Tool Names: An fMRI Study of Effective Connectivity

    ERIC Educational Resources Information Center

    Vitali, P.; Abutalebi, J.; Tettamanti, M.; Rowe, J.; Scifo, P.; Fazio, F.; Cappa, S.F.; Perani, D.

    2005-01-01

    The present fMRI study of semantic fluency for animal and tool names provides further evidence for category-specific brain activations, and reports task-related changes in effective connectivity among defined cerebral regions. Two partially segregated systems of functional integration were highlighted: the tool condition was associated with an…

  14. Future generations of horizontal tools will make tighter turns and last longer

    SciTech Connect

    Lyle, D.

    1995-10-01

    Operators want horizontal tools that turn tighter and last longer, and manufacturers are working to meet the need. An operator needs control of tools in the hole to drill a good horizontal well, and service and supply companies are trying to improve that control.

  15. HAPCAD: An open-source tool to detect PCR crossovers in next-generation sequencing generated HLA data.

    PubMed

    McDevitt, Shana L; Bredeson, Jessen V; Roy, Scott W; Lane, Julie A; Noble, Janelle A

    2016-03-01

    Next-generation sequencing (NGS) based HLA genotyping can generate PCR artifacts corresponding to IMGT/HLA Database alleles, for which multiple examples have been observed, including sequence corresponding to the HLA-DRB1(∗)03:42 allele. Repeat genotyping of 131 samples, previously genotyped as DRB1(∗)03:01 homozygotes using probe-based methods, resulted in the heterozygous call DRB1(∗)03:01+DRB1(∗)03:42. The apparently rare DRB1(∗)03:42 allele is hypothesized to be a "hybrid amplicon" generated by PCR crossover, a process in which a partial PCR product denatures from its template, anneals to a different allele template, and extends to completion. Unlike most PCR crossover products, a "hybrid amplicon" always corresponds to an IMGT/HLA Database allele, necessitating a case-by-case analysis of whether its occurrence reflects the actual allele or is simply the result of PCR crossover. The Hybrid Amplicon/PCR Crossover Artifact Detector (HAPCAD) program mimics jumping PCR in silico and flags allele sequences that may also be generated as hybrid amplicons. PMID:26802209
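
    A toy sketch of the in silico "jumping PCR" idea: every crossover position yields prefix(A)+suffix(B), and any hybrid that equals a database allele is flagged. The sequences and allele names below are invented, not IMGT/HLA data, and this is not the HAPCAD code itself.

        def hybrid_amplicons(allele_a, allele_b):
            """All prefix(A)+suffix(B) products from a single crossover event."""
            assert len(allele_a) == len(allele_b)
            return {allele_a[:i] + allele_b[i:] for i in range(1, len(allele_a))}

        database = {"AGCTTAGC": "allele_X", "AGCTAAGT": "allele_Y", "AGCTTAGT": "allele_Z"}
        hybrids = hybrid_amplicons("AGCTTAGC", "TCGTAAGT")
        for seq, name in database.items():
            if seq in hybrids:
                print("possible PCR-crossover artifact:", name, seq)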

  16. deepTools2: a next generation web server for deep-sequencing data analysis

    PubMed Central

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-01-01

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. PMID:27079975

  17. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-01

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. PMID:27079975

  18. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation, which we call Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. This tool uses the PAK solver, the open source software implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.
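
    A hedged sketch of the first pipeline step (segmenting a volume into one tissue layer and extracting a surface mesh via marching cubes), using a synthetic volume in place of DICOM data; STL export, tetrahedralization and the PAK solver steps are omitted.

        import numpy as np
        from skimage import measure

        # Synthetic "CT" volume: a radius field whose shell stands in for one skull layer.
        zz, yy, xx = np.mgrid[-32:32, -32:32, -32:32]
        radius = np.sqrt(xx**2 + yy**2 + zz**2).astype(float)
        layer = ((radius > 20) & (radius < 28)).astype(float)      # hollow shell "layer"

        # Extract a triangulated surface; in the real pipeline this would be written to STL.
        verts, faces, normals, values = measure.marching_cubes(layer, level=0.5)
        print(f"surface mesh: {len(verts)} vertices, {len(faces)} triangles")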

  19. A surface data generation method of optical micro-structure and analysis system for Fast Tool Servo fabricating

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Dai, Yi-fan; Wan, Fei; Wang, Gui-lin

    2010-10-01

    High-precision optical micro-structured components are now widely used in both military and civilian fields. Ultraprecision machining with a fast tool servo (FTS) is one of the leading methodologies for fabrication of such surfaces. The first important issue faced in ultra-precision, high-efficiency fabrication is how to properly describe the complex shapes according to the principle of the FTS. In order to meet the demands of FTS machining, which requires high-frequency tool response, high data throughput and large memory space, an off-line discrete data point generation method for microstructure surfaces is presented which avoids on-line shape calculation during the fabrication process. A new analysis software package is developed to compute the speed, acceleration and spectrum over the generated data points, which helps to analyze the tool-tracking characteristics needed in fabrication. A new mechanism for FTS machining data transmission based on a high-capacity storage device is also proposed. Experiments show that the off-line surface data generation method and the data transfer mechanism can effectively improve FTS fabrication efficiency, and that the surface analysis software can help to determine the machining capability of the tool-holder and to guide and optimize processing parameters such as spindle speed, feed rate, etc.
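
    The sketch below illustrates, under assumed spindle and surface parameters, what off-line data-point generation plus velocity/acceleration/spectrum analysis can look like; the sinusoidal micro-structure is invented and this is not the authors' software.

        import numpy as np

        spindle_rpm = 1200.0
        fs = 20000.0                                  # generated data points per second
        t = np.arange(0.0, 0.5, 1.0 / fs)             # half a second of tool path
        theta = 2 * np.pi * spindle_rpm / 60.0 * t    # spindle angle
        rho = 5e-3 - 1e-3 * t                         # slow radial feed (m)

        # Desired tool stroke along the spiral path (illustrative micro-structure, metres).
        z = 2e-6 * np.sin(8 * theta) * np.cos(2 * np.pi * rho / 1e-3)
        velocity = np.gradient(z, 1.0 / fs)
        acceleration = np.gradient(velocity, 1.0 / fs)
        spectrum = np.abs(np.fft.rfft(z))
        freqs = np.fft.rfftfreq(len(z), 1.0 / fs)

        print("peak tool velocity %.3e m/s, peak acceleration %.3e m/s^2"
              % (np.max(np.abs(velocity)), np.max(np.abs(acceleration))))
        print("dominant stroke frequency: %.1f Hz" % freqs[np.argmax(spectrum[1:]) + 1])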

  20. Conceptual Systems Model as a Tool for Hypothesis Generation and Testing in Ecotoxicological Research

    EPA Science Inventory

    Microarray, proteomic, and metabonomic technologies are becoming increasingly accessible as tools for ecotoxicology research. Effective use of these technologies will depend, at least in part, on the ability to apply these techniques within a paradigm of hypothesis driven researc...

  1. Generated spiral bevel gears - Optimal machine-tool settings and tooth contact analysis

    NASA Technical Reports Server (NTRS)

    Litvin, F. L.; Tsung, W.-J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  2. Generated spiral bevel gears: Optimal machine-tool settings and tooth contact analysis

    NASA Technical Reports Server (NTRS)

    Litvin, F. L.; Tsung, W. J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  3. A Study on Tooling and Its Effect on Heat Generation and Mechanical Properties of Welded Joints in Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Tikader, Sujoy; Biswas, Pankaj; Puri, Asit Baran

    2016-06-01

    Friction stir welding (FSW) has become the most attractive solid-state welding process, as it offers numerous advantages such as good mechanical and metallurgical properties. Aluminium alloys that are considered non-weldable, such as the 5XXX and 7XXX series, can be joined simply by this process. In the present study a mathematical model has been developed and experiments were successfully performed to evaluate the mechanical properties of FSW on similar aluminium alloys (AA1100) for different process parameters and mainly two kinds of tool geometry (straight cylindrical, and conical or cylindrically tapered pin, each with a flat shoulder). Tensile strength and microhardness of the welded plate samples are reported for different process parameters. It was noticed that in FSW of similar alloys with a tool made of SS-310 tool steel, friction is the major contributor to heat generation. It was also seen that tool geometry, tool rotational speed, plunging force and traverse speed have a significant effect on the tensile strength and hardness of friction stir welded joints.
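
    For orientation only, a generic sliding-friction estimate of FSW heat input for a flat shoulder with a straight cylindrical pin can be obtained by integrating omega*r*tau over the contact surfaces (tau = mu*p); the formula and all numbers below are textbook-style assumptions, not the paper's model or its measured parameters.

        import numpy as np

        mu = 0.4                      # assumed friction coefficient
        p = 40e6                      # assumed contact pressure, Pa (plunge force / shoulder area)
        rpm = 1000.0
        omega = 2 * np.pi * rpm / 60.0
        R_shoulder = 10e-3            # shoulder radius, m
        R_pin = 3e-3                  # pin radius, m
        H_pin = 5e-3                  # pin height, m

        tau = mu * p                                  # sliding shear stress at the interface
        Q_shoulder_and_tip = (2.0 / 3.0) * np.pi * omega * tau * R_shoulder**3
        Q_pin_side = 2.0 * np.pi * omega * tau * R_pin**2 * H_pin
        print("estimated heat input: %.1f kW" % ((Q_shoulder_and_tip + Q_pin_side) / 1e3))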

  4. Generator program for computer-assisted instruction: MACGEN. A software tool for generating computer-assisted instructional texts.

    PubMed

    Utsch, M J; Ingram, D

    1983-01-01

    This publication describes MACGEN, an interactive development tool to assist teachers to create, modify and extend case simulations, tutorial exercises and multiple-choice question tests designed for computer-aided instruction. The menu-driven software provides full authoring facilities for text files in MACAID format by means of interactive editing. Authors are prompted for items which they might want to change whereas all user-independent items are provided automatically. Optional default values and explanatory messages are available with every prompt. Errors are corrected automatically or commented upon. Thus the program eliminates the need to familiarize with a new language or details of the text file structure. The options for modification of existing text files include display, renumbering of frames and a line-oriented editor. The resulting text files can be interpreted by the MACAID driver without further changes. The text file is held as ASCII records and as such is also accessible with many standard word-processing systems if desired. PMID:6362978

  5. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Given this scenario, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/. PMID:19433510
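
    A minimal sketch of the expected input (unique reads with copy numbers) and of step (i), exact matching against known mature miRNAs; the sequences and annotations below are toy stand-ins for miRBase, and this is not the miRanalyzer implementation.

        from collections import Counter

        known_mirnas = {  # toy stand-in for miRBase mature sequences
            "UGAGGUAGUAGGUUGUAUAGUU": "hsa-let-7a-5p",
            "UAAAGUGCUGACAGUGCAGAU": "hsa-miR-106b-5p",
        }

        reads = {  # unique read -> copy number, as in the upload file
            "UGAGGUAGUAGGUUGUAUAGUU": 1540,
            "UAAAGUGCUGACAGUGCAGAU": 87,
            "ACGUACGUACGUACGUACGU": 3,       # unmatched; candidate for the prediction step
        }

        expression = Counter()
        for seq, count in reads.items():
            name = known_mirnas.get(seq)
            if name:
                expression[name] += count
        print(dict(expression))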

  6. ArcCN-Runoff: An ArcGIS tool for generating curve number and runoff maps

    USGS Publications Warehouse

    Zhan, X.; Huang, M.-L.

    2004-01-01

    The development and the application of the ArcCN-Runoff tool, an extension of ESRI® ArcGIS software, are reported. This tool can be applied to determine curve numbers and to calculate runoff or infiltration for a rainfall event in a watershed. Implementation of GIS techniques such as dissolving, intersecting, and a curve-number reference table improves efficiency. Technical processing time may be reduced from days, if not weeks, to hours for producing spatially varied curve number and runoff maps. An application example for a watershed in Lyon County and Osage County, Kansas, USA, is presented. © 2004 Elsevier Ltd. All rights reserved.
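
    The arithmetic such a tool applies to each polygon is the standard SCS curve-number relationship, Q = (P - 0.2S)^2 / (P + 0.8S) with S = 1000/CN - 10 (inches); the snippet below is a per-value illustration of that formula, not the ArcGIS extension itself.

        def scs_runoff_inches(p_inches, cn):
            """SCS curve-number runoff for a storm depth P (inches) and curve number CN."""
            s = 1000.0 / cn - 10.0          # potential maximum retention, inches
            ia = 0.2 * s                    # initial abstraction
            if p_inches <= ia:
                return 0.0
            return (p_inches - ia) ** 2 / (p_inches - ia + s)

        # A 3-inch storm over areas with different land cover / soil group curve numbers.
        for cn in (60, 75, 90):
            print(f"CN={cn}: runoff = {scs_runoff_inches(3.0, cn):.2f} in")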

  7. Virtual tool mark generation for efficient striation analysis in forensic science

    SciTech Connect

    Ekstrand, Laura

    2012-01-01

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles
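
    The following sketch mimics the virtual-marking workflow in miniature (rotate a tip point cloud, project it along the travel direction, keep the leading edge as a 1D profile, and score candidate angles by normalized cross-correlation); it uses random synthetic geometry and is not the statistical package cited above.

        import numpy as np

        def virtual_mark(points, angle_deg, n_bins=200):
            """Leading-edge profile of the rotated tip, projected along the travel direction."""
            a = np.radians(angle_deg)
            rot = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
            p = points @ rot.T
            bins = np.linspace(p[:, 1].min(), p[:, 1].max(), n_bins + 1)
            idx = np.digitize(p[:, 1], bins) - 1
            profile = np.full(n_bins, np.nan)
            for i in range(n_bins):
                sel = idx == i
                if sel.any():
                    profile[i] = p[sel, 2].max()     # leading edge along the travel direction
            return profile

        def similarity(a, b):
            m = ~np.isnan(a) & ~np.isnan(b)
            a, b = a[m] - a[m].mean(), b[m] - b[m].mean()
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        tip = np.random.default_rng(1).normal(size=(5000, 3)) * [0.5, 2.0, 0.2]  # synthetic tip scan
        evidence = virtual_mark(tip, 35.0)                                       # pretend evidence mark
        best = max(range(0, 90, 5), key=lambda ang: similarity(virtual_mark(tip, ang), evidence))
        print("best matching virtual mark angle:", best)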

  8. WORD STATISTICS IN THE GENERATION OF SEMANTIC TOOLS FOR INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    STONE, DON C.

    ONE OF THE PROBLEMS IN INFORMATION STORAGE AND RETRIEVAL SYSTEMS OF TECHNICAL DOCUMENTS IS THE INTERPRETATION OF WORDS USED TO INDEX DOCUMENTS. SEMANTIC TOOLS, DEFINED AS CHANNELS FOR THE COMMUNICATION OF WORD MEANINGS BETWEEN TECHNICAL EXPERTS, DOCUMENT INDEXERS, AND SEARCHERS, PROVIDE ONE METHOD OF DEALING WITH THE PROBLEM OF MULTIPLE…

  9. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing

    PubMed Central

    2014-01-01

    Background The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. Results Here, we provide a cost effective method for the production of 96-well plate compatible zebrafish orientation tools using a desktop 3D printer. The printed tools enable the positioning and orientation of zebrafish embryos within cavities formed in agarose. Their applicability is demonstrated by acquiring lateral and dorsal views of zebrafish embryos arrayed within microtiter plates using an automated screening microscope. This enables the consistent visualization of morphological phenotypes and reporter gene expression patterns. Conclusions The designs are refined versions of previously demonstrated devices with added functionality and strongly reduced production costs. All corresponding 3D models are freely available and digital design can be easily shared electronically. In combination with the increasingly widespread usage of 3D printers, this provides access to the developed tools to a wide range of zebrafish users. Finally, the design files can serve as templates for other additive and subtractive fabrication methods. PMID:24886511

  10. DRIVE Analysis Tool Generates Custom Vehicle Drive Cycles Based on Real-World Data (Fact Sheet)

    SciTech Connect

    Not Available

    2013-04-01

    This fact sheet from the National Renewable Energy Laboratory describes the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) analysis tool, which uses GPS and controller area network data to characterize vehicle operation and produce custom vehicle drive cycles, analyzing thousands of hours of data in a matter of minutes.

  11. Generating genomic tools for blueberry improvement -- an update of our progress

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There is increased demand for and consumption of blueberries worldwide because of their many recognized health benefits. Great strides have been made in blueberry cultivar development since its domestication using traditional breeding approaches. However, genomic tools are lacking in blueberry, whic...

  12. Generating and Analyzing Visual Representations of Conic Sections with the Use of Technological Tools

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Espinosa-Perez, Hugo; Reyes-Rodriguez, Aaron

    2006-01-01

    Technological tools have the potential to offer students the possibility to represent information and relationships embedded in problems and concepts in ways that involve numerical, algebraic, geometric, and visual approaches. In this paper, the authors present and discuss an example in which an initial representation of a mathematical object…

  13. The Three-Generation Pedigree: A Critical Tool in Cancer Genetics Care.

    PubMed

    Mahon, Suzanne M

    2016-09-01

    The family history, a rather low-tech tool, is the backbone of genetic assessment and guides risk assessment and genetic testing decisions. The importance of the pedigree and its application to genetic practice is often overlooked and underestimated. Unfortunately, particularly with electronic health records, standard pedigrees are not routinely constructed. A clear understanding of how pedigrees are employed in clinical oncology practice may lead to improved collection and use of family history data. PMID:27541558

  14. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Technical Reports Server (NTRS)

    Boyer, Jeffrey S.

    1994-01-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs were developed under technological, political, and fiscal constraints which limited their adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or on-going observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video presentation demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  15. Free Tools and Strategies for the Generation of 3D Finite Element Meshes: Modeling of the Cardiac Structures

    PubMed Central

    Pavarino, E.; Neves, L. A.; Machado, J. M.; de Godoy, M. F.; Shiyou, Y.; Momente, J. C.; Zafalon, G. F. D.; Pinto, A. R.; Valêncio, C. R.

    2013-01-01

    The Finite Element Method is a well-known technique, being extensively applied in different areas. Studies using the Finite Element Method (FEM) are targeted to improve cardiac ablation procedures. For such simulations, the finite element meshes should consider the size and histological features of the target structures. However, it is possible to verify that some methods or tools used to generate meshes of human body structures are still limited, due to non-detailed models, nontrivial preprocessing, or, above all, restrictive conditions of use. In this paper, alternatives are demonstrated for solid modeling and automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context. PMID:23762031

  16. Some decidable results on reachability of solvable systems

    NASA Astrophysics Data System (ADS)

    Xu, Ming; Zhu, Jiaqi; Li, Zhi-Bin

    2013-05-01

    Reachability analysis plays an important role in verifying the safety of modern control systems. In the existing work, there are many decidable results on reachability of discrete systems. For continuous systems, however, the known decidable results are established merely for linear systems. In this paper, we propose a class of nonlinear systems (named solvable systems) extending linear systems. We first show that their solutions are of closed form. On the basis of it, we study a series of reachability problems for various subclasses of solvable systems. Our main results are that these reachability problems are decidable by manipulations in number theory, real root isolation, and quantifier elimination. Finally the decision procedures are implemented in a Maple package REACH to solve several non-trivial examples.
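    The paper's decision procedures are implemented in a Maple package (REACH), which is not reproduced here. As a hedged illustration of the underlying idea for the simplest case, a one-dimensional linear (hence solvable) system, the sketch below obtains the closed-form solution symbolically and then decides reachability of a target value by checking for a real root with t >= 0. The example system and all names are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch (not the Maple package REACH): decide whether the
# closed-form solution of a simple solvable system reaches a target value.
import sympy as sp

t = sp.symbols('t', nonnegative=True)
x = sp.Function('x')

# Example system: dx/dt = -x + 2, x(0) = 0  ->  closed form x(t) = 2 - 2*exp(-t)
sol = sp.dsolve(sp.Eq(x(t).diff(t), -x(t) + 2), x(t), ics={x(0): 0}).rhs

def reaches(expr, target):
    """Decide reachability of `target` by looking for a real solution with t >= 0."""
    sols = sp.solveset(sp.Eq(expr, target), t, domain=sp.Interval(0, sp.oo))
    return sols is not sp.S.EmptySet

print(sol)                 # 2 - 2*exp(-t)
print(reaches(sol, 1))     # True  (reached at t = log(2))
print(reaches(sol, 3))     # False (the solution approaches 2 asymptotically)
```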

  17. Arkose: A Prototype Mechanism and Tool for Collaborative Information Generation and Distillation

    ERIC Educational Resources Information Center

    Nam, Kevin Kyung

    2010-01-01

    The goals of this thesis have been to gain a better understanding of collaborative knowledge sharing and distilling and to build a prototype collaborative system that supports flexible knowledge generation and distillation. To reach these goals, I have conducted two user studies and built two systems. The first system, Arkose 1.0, is a…

  18. Photography as a Data Generation Tool for Qualitative Inquiry in Education.

    ERIC Educational Resources Information Center

    Cappello, Marva

    This paper discusses the ways in which photography was used for data generation in a 9-month qualitative study on a mixed-age elementary school classroom. Through a review of the research literature in anthropology, sociology, and education, and an analysis of the research data, the usefulness of photography for educational research with young…

  19. Messaging, Gaming, Peer-to-Peer Sharing: Language Learning Strategies & Tools for the Millennial Generation

    ERIC Educational Resources Information Center

    Godwin-Jones, Bob

    2005-01-01

    The next generation's enthusiasm for instant messaging, videogames, and peer-to-peer file swapping is likely to be dismissed by their elders as so many ways to waste time and avoid the real worlds of work or school. But these activities may not be quite as vapid as they may seem from the perspective of outsiders--or educators. Researchers point…

  20. The Development of a Tool for Semi-Automated Generation of Structured and Unstructured Grids about Isolated Rotorcraft Blades

    NASA Technical Reports Server (NTRS)

    Shanmugasundaram, Ramakrishnan; Garriz, Javier A.; Samareh, Jamshid A.

    1997-01-01

    The grid generation used to model rotorcraft configurations for Computational Fluid Dynamics (CFD) analysis is highly complicated and time consuming. The highly complex geometry and irregular shapes encountered in entire rotorcraft configurations are typically modeled using overset grids. Another promising approach is to utilize unstructured grid methods. With either approach the majority of time is spent manually setting up the topology. For less complicated geometries such as isolated rotor blades, less time is obviously required. This paper discusses the capabilities of a tool called Rotor blade Optimized Topology Organizer and Renderer (ROTOR) being developed to quickly generate block structured grids and unstructured tetrahedral grids about isolated blades. The key algorithm uses individual airfoil sections to construct a Non-Uniform Rational B-Spline (NURBS) surface representation of the rotor blade. This continuous surface definition can be queried to define the block topology used in constructing a structured mesh around the rotor blade. Alternatively, the surface definition can be used to define the surface patches and grid cell spacing requirements for generating unstructured surface and volume grids. Presently, the primary output for ROTOR is block structured grids using 0-H and H-H topologies suitable for full-potential solvers. This paper will discuss the present capabilities of the tool and highlight future work.
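    As an illustration of the lofting idea described above (a continuous surface built from stacked airfoil sections that can later be queried for block topology or surface-grid points), the sketch below fits a bivariate spline through a few synthetic sections. It is not ROTOR itself, and it uses plain B-splines from SciPy in place of the NURBS representation named in the record; the section shapes are invented.

```python
# Sketch of lofting stacked airfoil sections into a queryable surface.
import numpy as np
from scipy.interpolate import RectBivariateSpline

n_sections, n_pts = 5, 50
span = np.linspace(0.0, 1.0, n_sections)      # spanwise stations of the sections
u = np.linspace(0.0, 2.0 * np.pi, n_pts)      # parameter around each section

# Fake thin elliptical "airfoil" sections that taper toward the tip
chord = 1.0 - 0.5 * span
x = np.outer(chord, 0.5 * np.cos(u))          # chordwise coordinate, shape (n_sections, n_pts)
z = np.outer(chord, 0.05 * np.sin(u))         # thickness coordinate

# One spline per coordinate gives a continuous surface (s, u) that can be
# queried anywhere, e.g. to place block edges or surface-grid points.
surf_x = RectBivariateSpline(span, u, x)
surf_z = RectBivariateSpline(span, u, z)

print(surf_x(0.37, 1.0), surf_z(0.37, 1.0))   # a surface point between the input sections
```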

  1. Environmental epigenetics: A promising venue for developing next-generation pollution biomonitoring tools in marine invertebrates.

    PubMed

    Suarez-Ulloa, Victoria; Gonzalez-Romero, Rodrigo; Eirin-Lopez, Jose M

    2015-09-15

    Environmental epigenetics investigates the cause-effect relationships between specific environmental factors and the subsequent epigenetic modifications triggering adaptive responses in the cell. Given the dynamic and potentially reversible nature of the different types of epigenetic marks, environmental epigenetics constitutes a promising venue for developing fast and sensitive biomonitoring programs. Indeed, several epigenetic biomarkers have been successfully developed and applied in traditional model organisms (e.g., human and mouse). Nevertheless, the lack of epigenetic knowledge in other ecologically and environmentally relevant organisms has hampered the application of these tools in a broader range of ecosystems, most notably in the marine environment. Fortunately, that scenario is now changing thanks to the growing availability of complete reference genome sequences along with the development of high-throughput DNA sequencing and bioinformatic methods. Altogether, these resources make the epigenetic study of marine organisms (and more specifically marine invertebrates) a reality. By building on this knowledge, the present work provides a timely perspective highlighting the extraordinary potential of environmental epigenetic analyses as a promising source of rapid and sensitive tools for pollution biomonitoring, using marine invertebrates as sentinel organisms. This strategy represents an innovative, groundbreaking approach, improving the conservation and management of natural resources in the oceans. PMID:26088539

  2. Rational protein design: developing next-generation biological therapeutics and nanobiotechnological tools.

    PubMed

    Wilson, Corey J

    2015-01-01

    Proteins are the most functionally diverse macromolecules observed in nature, participating in a broad array of catalytic, biosensing, transport, scaffolding, and regulatory functions. Fittingly, proteins have become one of the most promising nanobiotechnological tools to date, and through the use of recombinant DNA and other laboratory methods we have produced a vast number of biological therapeutics derived from human genes. Our emerging ability to rationally design proteins (e.g., via computational methods) holds the promise of significantly expanding the number and diversity of protein therapies and has opened the gateway to realizing true and uncompromised personalized medicine. In the last decade computational protein design has been transformed from a set of fundamental strategies to stringently test our understanding of the protein structure-function relationship, to practical tools for developing useful biological processes, nano-devices, and novel therapeutics. As protein design strategies improve (i.e., in terms of accuracy and efficiency) clinicians will be able to leverage individual genetic data and biological metrics to develop and deliver personalized protein therapeutics with minimal delay. PMID:25348497

  3. An Automated and Minimally Invasive Tool for Generating Autologous Viable Epidermal Micrografts

    PubMed Central

    Osborne, Sandra N.; Schmidt, Marisa A.; Harper, John R.

    2016-01-01

    ABSTRACT OBJECTIVE: A new epidermal harvesting tool (CelluTome; Kinetic Concepts, Inc, San Antonio, Texas) created epidermal micrografts with minimal donor site damage, increased expansion ratios, and did not require the use of an operating room. The tool, which applies both heat and suction concurrently to normal skin, was used to produce epidermal micrografts that were assessed for uniform viability, donor-site healing, and discomfort during and after the epidermal harvesting procedure. DESIGN: This study was a prospective, noncomparative institutional review board–approved healthy human study to assess epidermal graft viability, donor-site morbidity, and patient experience. SETTING: These studies were conducted at the multispecialty research facility, Clinical Trials of Texas, Inc, San Antonio. PATIENTS: The participants were 15 healthy human volunteers. RESULTS: The average viability of epidermal micrografts was 99.5%. Skin assessment determined that 76% to 100% of the area of all donor sites was the same in appearance as the surrounding skin within 14 days after epidermal harvest. A mean pain of 1.3 (on a scale of 1 to 5) was reported throughout the harvesting process. CONCLUSIONS: Use of this automated, minimally invasive harvesting system provided a simple, low-cost method of producing uniformly viable autologous epidermal micrografts with minimal patient discomfort and superficial donor-site wound healing within 2 weeks. PMID:26765157

  4. Protein engineering for metabolic engineering: Current and next-generation tools

    SciTech Connect

    Marcheschi, RJ; Gronenberg, LS; Liao, JC

    2013-04-16

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. We review advances in selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use; produce non-natural amino acids, alcohols, and carboxylic acids; and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes.

  5. Generation of histo-anatomically representative models of the individual heart: tools and application

    PubMed Central

    Plank, Gernot; Burton, Rebecca A. B.; Hales, Patrick; Bishop, Martin; Mansoori, Tahir; Bernabeu, Miguel; Garny, Alan; Prassl, Anton J.; Bollensdorff, Christian; Mason, Fleur; Mahmood, Fahd; Rodriguez, Blanca; Grau, Vicente; Schneider, Jürgen E.; Gavaghan, David; Kohl, Peter

    2010-01-01

    This paper presents methods to build histo-anatomically detailed individualised cardiac models. The models are based on high-resolution 3D anatomical and/or diffusion tensor magnetic resonance images, combined with serial histological sectioning data, and are used to investigate individualised cardiac function. The current state-of-the-art is reviewed, and its limitations are discussed. We assess the challenges associated with the generation of histo-anatomically representative individualised in-silico models of the heart. The entire processing pipeline including image acquisition, image processing, mesh generation, model set-up and execution of computer simulations, and the underlying methods are described. The multi-faceted challenges associated with these goals are highlighted, suitable solutions are proposed, and an important application of developed high-resolution structure-function models in elucidating the effect of individual structural heterogeneity upon wavefront dynamics is demonstrated. PMID:19414455

  6. Protein engineering for metabolic engineering: current and next-generation tools

    PubMed Central

    Marcheschi, Ryan J.; Gronenberg, Luisa S.; Liao, James C.

    2014-01-01

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically-produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. This article reviews advances of selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use, produce non-natural amino acids, alcohols, and carboxylic acids, and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes. PMID:23589443

  7. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    NASA Astrophysics Data System (ADS)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars now has the ability to map the water-column, along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real-time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data has the potential to address a number of research areas including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter and water-column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The development of the mid-water tools on this project addressed this problem by using a unified way of storing the water column data in a generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water column packets with time-based navigation and attitude, such that downstream in the workflow, the tools will have access to all relevant data of any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize the throughput
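    The record describes converting vendor formats into a generic water column format (GWC) and indexing each file so that any packet type can be fetched without a linear scan. The sketch below shows one minimal way such a byte-offset index could work; the packet layout, header format, and function names are invented for illustration and are not the actual GWC specification.

```python
# Minimal sketch of a packet index in the spirit of the GWC description:
# record each packet's type and byte offset so packets can be fetched directly.
import struct
from collections import defaultdict

def build_index(path):
    """Scan a file of [type:1 byte][length:4 bytes][payload] packets and
    index byte offsets by packet type (hypothetical layout)."""
    index = defaultdict(list)
    with open(path, 'rb') as f:
        offset = 0
        while True:
            header = f.read(5)
            if len(header) < 5:
                break
            ptype, length = struct.unpack('<BI', header)
            index[ptype].append(offset)
            f.seek(length, 1)          # skip the payload
            offset += 5 + length
    return index

def read_packet(path, offset):
    """Fetch one packet directly by its indexed byte offset."""
    with open(path, 'rb') as f:
        f.seek(offset)
        ptype, length = struct.unpack('<BI', f.read(5))
        return ptype, f.read(length)
```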

  8. Thrombin generation assay: a new tool to predict and optimize clinical outcome in cardiovascular patients?

    PubMed

    Campo, Gianluca; Pavasini, Rita; Pollina, Alberto; Fileti, Luca; Marchesini, Jlenia; Tebaldi, Matteo; Ferrari, Roberto

    2012-12-01

    Antithrombotic therapy (including antiplatelet and anticoagulant drugs) is the cornerstone of the current medical treatment of patients with acute coronary syndromes (ACS). This therapy and particularly the new antiplatelet and anticoagulant drugs have significantly reduced the ischemic risk, but have increased bleeding complications. Recently, several studies have emphasized the negative prognostic impact on long-term mortality of these bleeding adverse events. Thus, new assays to estimate the bleeding risk and the efficacy of these antithrombotic drugs are clearly in demand. Regarding the anticoagulant drugs, new promising data have emerged about the thrombin generation assay (TGA). TGA measures the ability of plasma to generate thrombin. TGA may be used to check coagulation function, to evaluate the risk of thrombosis and to compare the efficacy of different anticoagulants employed in clinical management of patients with ACS. The TGA result is a curve that describes how the amount of thrombin varies during activation of the coagulation cascade. All available anticoagulant drugs influence the principal parameters generated by TGA and so it is possible to evaluate the effects of the medical treatment. In this review we provide a brief description of the assay and we summarize the principal findings of previous studies by analyzing the relationship between anticoagulant drugs and TGA. Moreover, a brief summary of its ability to predict ischemic and bleeding risks has been provided. PMID:22688556

  9. Cognitive avionics and watching spaceflight crews think: generation-after-next research tools in functional neuroimaging.

    PubMed

    Genik, Richard J; Green, Christopher C; Graydon, Francis X; Armstrong, Robert E

    2005-06-01

    Confinement and isolation have always confounded the extraordinary endeavor of human spaceflight. Psychosocial health is at the forefront in considering risk factors that imperil missions of 1- to 2-yr duration. Current crewmember selection metrics restricted to behavioral observation by definition observe rather than prevent performance degradation and are thus inadequate when preflight training cannot simulate an entire journey. Nascent techniques to monitor functional and task-related cortical neural activity show promise and can be extended to include whole-brain monitoring. Watching spaceflight crews think can reveal the efficiency of training procedures. Moreover, observing subcortical emotion centers may provide early detection of developing neuropsychiatric disorders. The non-invasive functional neuroimaging modalities electroencephalography (EEG), magnetoencephalography (MEG), magnetic resonance imaging (MRI), and near-infrared spectroscopy (NIRS) are detailed, along with highlights of how they may be engineered for spacecraft. Preflight and in-flight applications to crewmember behavioral health from current generation, next generation, and generation-after-next neuroscience research studies are also described. The emphasis is on preventing the onset of neuropsychiatric dysfunctions, thus reducing the risk of mission failure due to human error. PMID:15943214

  10. Next-Generation Sequencing: A Review of Technologies and Tools for Wound Microbiome Research

    PubMed Central

    Hodkinson, Brendan P.; Grice, Elizabeth A.

    2015-01-01

    Significance: The colonization of wounds by specific microbes or communities of microbes may delay healing and/or lead to infection-related complication. Studies of wound-associated microbial communities (microbiomes) to date have primarily relied upon culture-based methods, which are known to have extreme biases and are not reliable for the characterization of microbiomes. Biofilms are very resistant to culture and are therefore especially difficult to study with techniques that remain standard in clinical settings. Recent Advances: Culture-independent approaches employing next-generation DNA sequencing have provided researchers and clinicians a window into wound-associated microbiomes that could not be achieved before and has begun to transform our view of wound-associated biodiversity. Within the past decade, many platforms have arisen for performing this type of sequencing, with various types of applications for microbiome research being possible on each. Critical Issues: Wound care incorporating knowledge of microbiomes gained from next-generation sequencing could guide clinical management and treatments. The purpose of this review is to outline the current platforms, their applications, and the steps necessary to undertake microbiome studies using next-generation sequencing. Future Directions: As DNA sequencing technology progresses, platforms will continue to produce longer reads and more reads per run at lower costs. A major future challenge is to implement these technologies in clinical settings for more precise and rapid identification of wound bioburden. PMID:25566414

  11. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    NASA Astrophysics Data System (ADS)

    Rőszer, Tamás; Pintye, Éva; Benkő, Ilona

    2008-12-01

    Transgenic mice can be used either as models of known inherited human diseases or can be applied to perform phenotypic tests of genes with unknown function. In some special applications of gene modification we have to create a tissue-specific mutation of a given gene. In some cases, however, the gene modification can be lethal in intrauterine life; therefore the mutated cells should be engrafted in the postnatal period. After total body irradiation, transplantation of bone marrow cells can be a solution to introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes and many metabolic alterations recently linked to leukocyte functions.

  12. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    SciTech Connect

    Roszer, Tamas; Pintye, Eva; Benko', Ilona

    2008-12-08

    Transgenic mice can be used either as models of known inherited human diseases or can be applied to perform phenotypic tests of genes with unknown function. In some special applications of gene modification we have to create a tissue-specific mutation of a given gene. In some cases, however, the gene modification can be lethal in intrauterine life; therefore the mutated cells should be engrafted in the postnatal period. After total body irradiation, transplantation of bone marrow cells can be a solution to introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes and many metabolic alterations recently linked to leukocyte functions.

  13. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation Electricore, Inc.

    SciTech Connect

    Daye, Tony

    2013-09-30

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real time simulation of distributed power generation within utility grids with the emphasis on potential applications in day ahead (market) and real time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high penetration solar PV on utility operations is not only limited to control centers, but across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and have had an immediate direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  14. New tool to detect operation anomalies on automatic voltage regulator equipment of large power units; Generator simulator (GS)

    SciTech Connect

    Blanchet, P. )

    1990-01-01

    When large generating plants are installed on sites remote from consumer areas, operating the network with correct margins of stability depends on proper adjustment of the automatic voltage regulator (AVR). Any deviation from normal operation, and especially any abnormal run, must be detected at the first overhaul or first shutdown. The new tool described here, the generator simulator (GS), then helps to minimize the time needed to investigate failures and to re-qualify AVR equipment after repair. The two main objectives of this paper are: to qualify the AVR performance of a power unit during the scheduled overhaul; and to simplify the investigation of failures in the AVR system, avoiding unnecessary dismantling during an unplanned unit shutdown.

  15. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
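    The SDB itself is not shown in this record; as a hedged illustration of the general idea of inducing rules from expert-classified data, the sketch below fits a small decision tree with scikit-learn and prints its branches as if-then rules. The toy telemetry values, feature names, and labels are invented and are not the Shuttle Mission Simulator data.

```python
# Illustrative sketch of rule induction from expert-classified data (not the
# actual SDB): fit a decision tree and render each root-to-leaf path as a rule.
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy telemetry: [pressure, temperature], labels supplied by a subject matter expert
X = [[30, 210], [32, 215], [80, 300], [85, 310], [31, 400], [29, 395]]
y = ['nominal', 'nominal', 'overpressure', 'overpressure', 'overtemp', 'overtemp']

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

# export_text prints each branch of the tree, i.e. one induced rule per leaf
print(export_text(tree, feature_names=['pressure', 'temperature']))
```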

  16. Consumer Economics, Book I [and] Book II. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Consumer Economics, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  17. Deciding in Democracies: A Role for Thinking Skills?

    ERIC Educational Resources Information Center

    Gardner, Peter

    2014-01-01

    In societies that respect our right to decide many things for ourselves, exercising that right can be a source of anxiety. We want to make the right decisions, which is difficult when we are confronted with complex issues that are usually the preserve of specialists. But is help at hand? Are thinking skills the very things that non-specialists…

  18. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  19. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  20. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  1. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  2. 13 CFR 124.1009 - Who decides disadvantaged status protests?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... protests? 124.1009 Section 124.1009 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS Eligibility, Certification, and Protests Relating to Federal Small Disadvantaged Business Programs § 124.1009 Who decides...

  3. Deciding when It's Time to Buy a New PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    How to best decide when it's time to replace your PC, whether at home or at work, is always tricky. Spending on computers can make you more productive, but it's money you otherwise cannot spend, invest or save, and faster systems always await you in the future. What is clear is that the computer industry really wants you to buy, and the computer…

  4. An aluminium tool for multiple stellar generations in the globular clusters 47 Tucanae and M 4

    NASA Astrophysics Data System (ADS)

    Carretta, E.; Gratton, R. G.; Bragaglia, A.; D'Orazi, V.; Lucatello, S.

    2013-02-01

    We present aluminium abundances for a sample of about 100 red giant stars in each of the Galactic globular clusters 47 Tuc (NGC 104) and M 4 (NGC 6121). We have derived homogeneous abundances from intermediate-resolution FLAMES/GIRAFFE spectra. Aluminium abundances are from the strong doublet Al i 8772-8773 Å, as in previous works done for giants in NGC 6752 and NGC 1851, and nitrogen abundances are extracted from a large number of features of the CN molecules by assuming a suitable carbon abundance. We added previous homogeneous abundances of O and Na and newly derived abundances of Mg and Si for our samples of 83 stars in M 4 and 116 stars in 47 Tuc to obtain the full set of elements from proton-capture reactions produced by different stellar generations in these clusters. By simultaneously studying the Ne-Na and Mg-Al cycles of H-burning at high temperature, our main aims are to understand the nature of the polluters at work in the first generation and to ascertain whether the second generation of cluster stars was formed in one or, rather, several episodes of star formation. Our data confirm that in M 4 only two stellar populations are visible. On the other hand, for 47 Tuc a cluster analysis performed on our full dataset suggests that at least three distinct groups of stars are present on the giant branch. The abundances of O, Na, Mg, and Al in the intermediate group can be produced within a pollution scenario; results for N are ambiguous, depending on the C abundance we adopt for the three groups. Based on observations collected at ESO telescopes under program 085.D-0205 and on public data from the ESO/ST-ECF Science Archive Facility. Tables 2 and 3 are available in electronic form at http://www.aanda.org
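    The record mentions a cluster analysis over the full set of per-star abundances that suggests at least three groups of giants in 47 Tuc. The sketch below shows the general shape of such an analysis (standardize per-star abundance ratios, then cluster); the numbers are invented placeholders rather than the published abundances, and k-means is only one of several clustering choices the authors might have used.

```python
# Sketch of a cluster analysis over per-star abundance ratios, in the spirit of
# the grouping described for 47 Tuc. Values are invented, not the published data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Columns: [O/Fe], [Na/Fe], [Mg/Fe], [Al/Fe] for a handful of fake giants
abundances = np.array([
    [0.40, 0.10, 0.45, 0.15],
    [0.38, 0.12, 0.44, 0.18],
    [0.20, 0.35, 0.40, 0.45],
    [0.18, 0.38, 0.41, 0.50],
    [0.05, 0.55, 0.30, 0.80],
    [0.02, 0.60, 0.28, 0.85],
])

X = StandardScaler().fit_transform(abundances)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)   # group membership for each star
```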

  5. Development of a new generation of active AFM tools for applications in liquids

    NASA Astrophysics Data System (ADS)

    Rollier, A.-S.; Jenkins, D.; Dogheche, E.; Legrand, B.; Faucher, M.; Buchaillot, L.

    2010-08-01

    Atomic force microscopy (AFM) is a powerful imaging tool with high-resolution imaging capability. AFM probes consist of a very sharp tip at the end of a silicon cantilever that can respond to surface artefacts to produce an image of the topography or surface features. They are intrinsically passive devices. For imaging soft biological samples, and also for samples in liquid, it is essential to control the AFM tip position, both statically and dynamically, and this is not possible using external actuators mounted on the AFM chip. AFM cantilevers have been fabricated using silicon micromachining to incorporate a piezoelectric thin film actuator for precise control. The piezoelectric thin films have been fully characterized to determine their actuation performance and to characterize the operation of the integrated device. Examples of the spatial and vertical response are presented to illustrate their imaging capability. For operation in a liquid environment, the dynamic behaviour has been modelled and verified experimentally. The optimal drive conditions for the cantilever, along with their dynamic response, including frequency and phase in air and water, are presented.

  6. Mitochondrial DNA methylation as a next-generation biomarker and diagnostic tool.

    PubMed

    Iacobazzi, Vito; Castegna, Alessandra; Infantino, Vittoria; Andria, Generoso

    2013-01-01

    Recent expansion of our knowledge on epigenetic changes strongly suggests that not only nuclear DNA (nDNA), but also mitochondrial DNA (mtDNA) may be subjected to epigenetic modifications related to disease development, environmental exposure, drug treatment and aging. Thus, mtDNA methylation is attracting increasing attention as a potential biomarker for the detection and diagnosis of diseases and the understanding of cellular behavior in particular conditions. In this paper we review the current advances in mtDNA methylation studies with particular attention to the evidences of mtDNA methylation changes in diseases and physiological conditions so far investigated. Technological advances for the analysis of epigenetic variations are promising tools to provide insights into methylation of mtDNA with similar resolution levels as those reached for nDNA. However, many aspects related to mtDNA methylation are still unclear. More studies are needed to understand whether and how changes in mtDNA methylation patterns, global and gene specific, are associated to diseases or risk factors. PMID:23920043

  7. NASA's Learning Technology Project: Developing Educational Tools for the Next Generation of Explorers

    NASA Astrophysics Data System (ADS)

    Federman, A. N.; Hogan, P. J.

    2003-12-01

    Since 1996, NASA's Learning Technology has pioneered the use of innovative technology to inspire students to pursue careers in STEM (Science, Technology, Engineering, and Math). In the past this has included Web sites like Quest and the Observatorium, webcasts and distance learning courses, and even interactive television broadcasts. Our current focus is on development of several mission-oriented software packages, targeted primarily at the middle-school population, but flexible enough to be used by elementary to graduate students. These products include contributions to an open-source solar system simulator, a 3D planetary encyclopedia, development of a planetary surface viewer (atlas), and others. Whenever possible these software products are written to be 'open source' and multi-platform, for the widest use and easiest access for developers. Along with the software products, we are developing activities and lesson plans that are tested and used by educators in the classroom. The products are reviewed by professional educators. Together these products constitute the NASA Experiential Platform for learning, in which the tools used by the public are similar to (and in some respects the same as) those used by professional investigators. Efforts are now underway to incorporate actual MODIS and other real-time data uplink capabilities.

  8. Third Harmonic Generation microscopy as a diagnostic tool for the investigation of microglia BV-2 and breast cancer cells activation

    NASA Astrophysics Data System (ADS)

    Gavgiotaki, E.; Filippidis, G.; Psilodimitrakopoulos, S.; Markomanolaki, H.; Kalognomou, M.; Agelaki, S.; Georgoulias, V.; Athanassakis, I.

    2015-07-01

    Nonlinear optical imaging techniques have created new opportunities for research in the biomedical field. Specifically, Third Harmonic Generation (THG) seems to be a suitable noninvasive imaging tool for the delineation and quantification of biological structures at the microscopic level. The aim of this study was to extract information as to the activation state of different cell types by using THG imaging microscopy as a diagnostic tool. The BV-2 microglia cell line was used as a representative biological model enabling the study of the resting and activated states of the cells linked to various pathological conditions. Third Harmonic Generation (THG) and Two Photon Excitation Fluorescence (TPEF) measurements were simultaneously collected from stained breast cancer cells by employing a single homemade experimental apparatus, and it was shown that high THG signals mostly arise from lipid bodies. Subsequently, BV-2 microglia cells were examined with or without activation by lipopolysaccharide (LPS) in order to discriminate between control and activated cells based on the quantification of THG signals. Statistical quantification was performed on both the mean area and the mean intensity values of THG. Both the mean total area and the mean THG intensity values increased in activated versus non-activated cells. Similar quantification studies are underway in breast cancer cells for the exact discrimination of different cell lines. Furthermore, the laser polarization dependence of the SHG and THG signals in unstained biological samples is investigated.

  9. Clustering of cochlear oscillations in frequency plateaus as a tool to investigate SOAE generation

    NASA Astrophysics Data System (ADS)

    Epp, Bastian; Wit, Hero; van Dijk, Pim

    2015-12-01

    Spontaneous otoacoustic emissions (SOAE) reflect the net effect of self-sustained activity in the cochlea, but do not directly provide information about the underlying mechanism and place of origin within the cochlea. The present study investigates whether frequency plateaus, as found in a linear array of coupled oscillators (OAM) [7], are also found in a transmission line model (TLM) that is able to generate realistic SOAEs [2], and whether these frequency plateaus can be used to explain the formation of SOAEs. The simulations showed a clustering of oscillators along the simulated basilar membrane. Both the OAM and the TLM show traveling-wave-like behavior along the oscillators coupled into one frequency plateau. While in the TLM roughness is required in order to produce SOAEs, no roughness is required to trigger frequency plateaus in the linear array of oscillators. The formation of frequency plateaus as a consequence of coupling between neighboring active oscillators might be the mechanism underlying SOAEs.

  10. Fractal analysis of experimentally generated pyroclasts: A tool for volcanic hazard assessment

    NASA Astrophysics Data System (ADS)

    Perugini, Diego; Kueppers, Ulrich

    2012-06-01

    Rapid decompression experiments on natural volcanic rocks mimic explosive eruptions. Fragment size distributions (FSD) of such experimentally generated pyroclasts are investigated using fractal geometry. The fractal dimension of fragmentation, D, of FSD is measured for samples from Unzen (Japan) and Popocatépetl (Mexico) volcanoes. Results show that: (i) FSD are fractal and can be quantified by measuring D values; (ii) D increases linearly with potential energy for fragmentation (PEF) and, thus, with increasing applied pressure; (iii) the rate of increase of D with PEF depends on open porosity: the higher the open porosity, the lower the increase of D with PEF; (iv) at comparable open porosity, samples display a similar behavior for any rock composition. The method proposed here has the potential to become a standard routine to estimate eruptive energy of past and recent eruptions using values of D and open porosity, providing an important step towards volcanic hazard assessment.
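    A common way to estimate the fractal dimension of fragmentation D is from the cumulative fragment size distribution, N(>r) ∝ r^(-D), so that D is the negative slope of log N(>r) against log r. The sketch below applies that definition to synthetic fragment sizes; it is not the authors' procedure, and the distribution parameters and thresholds are arbitrary.

```python
# Sketch of estimating the fractal dimension of fragmentation D from a fragment
# size distribution, using N(>r) ~ r**(-D). Synthetic sizes, not the Unzen or
# Popocatepetl measurements.
import numpy as np

rng = np.random.default_rng(0)
sizes = rng.pareto(a=2.5, size=5000) + 1.0   # synthetic fragment sizes (arbitrary units)

# Cumulative number of fragments larger than each threshold r
r = np.logspace(np.log10(sizes.min()), np.log10(sizes.max()), 30)[:-1]
n_gt = np.array([(sizes > ri).sum() for ri in r])

# D is the negative slope of log N(>r) versus log r
mask = n_gt > 0
slope, _ = np.polyfit(np.log(r[mask]), np.log(n_gt[mask]), 1)
print(f"estimated D = {-slope:.2f}")
```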

  11. PLAN-IT-2: The next generation planning and scheduling tool

    NASA Technical Reports Server (NTRS)

    Eggemeyer, William C.; Cruz, Jennifer W.

    1990-01-01

    PLAN-IT is a scheduling program which has been demonstrated and evaluated in a variety of scheduling domains. The capability enhancements being made for the next generation of PLAN-IT, called PLAN-IT-2 is discussed. PLAN-IT-2 represents a complete rewrite of the original PLAN-IT incorporating major changes as suggested by the application experiences with the original PLAN-IT. A few of the enhancements described are additional types of constraints, such as states and resettable-depletables (batteries), dependencies between constraints, multiple levels of activity planning during the scheduling process, pattern constraint searching for opportunities as opposed to just minimizing the amount of conflicts, additional customization construction features for display and handling of diverse multiple time systems, and reduction in both the size and the complexity for creating the knowledge-base to address the different problem domains.

  12. Generative Topographic Mapping (GTM): Universal Tool for Data Visualization, Structure-Activity Modeling and Dataset Comparison.

    PubMed

    Kireeva, N; Baskin, I I; Gaspar, H A; Horvath, D; Marcou, G; Varnek, A

    2012-04-01

    Here, the utility of Generative Topographic Maps (GTM) for data visualization, structure-activity modeling and database comparison is evaluated using subsets of the Database of Useful Decoys (DUD). Unlike other popular dimensionality reduction approaches like Principal Component Analysis, Sammon Mapping or Self-Organizing Maps, the great advantage of GTMs is providing data probability distribution functions (PDF), both in the high-dimensional space defined by molecular descriptors and in 2D latent space. PDFs for the molecules of different activity classes were successfully used to build classification models in the framework of the Bayesian approach. Because PDFs are represented by a mixture of Gaussian functions, the Bhattacharyya kernel has been proposed as a measure of the overlap of datasets, which leads to an elegant method of global comparison of chemical libraries. PMID:27477099
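    For two densities p and q, the Bhattacharyya coefficient ∫ sqrt(p(x) q(x)) dx ranges from 0 (disjoint) to 1 (identical), which is what makes it usable as a measure of dataset overlap. The sketch below evaluates it numerically for two one-dimensional Gaussian mixtures on a grid; the actual GTM work operates on mixtures in descriptor/latent space, so this is only a simplified illustration with invented mixture parameters.

```python
# Sketch of comparing two "libraries" via the Bhattacharyya coefficient of
# their probability density functions (1D Gaussian mixtures on a grid).
import numpy as np
from scipy.stats import norm

x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]

# Two datasets modeled as Gaussian mixtures over a single descriptor
p = 0.5 * norm.pdf(x, -2, 1.0) + 0.5 * norm.pdf(x, 1, 0.8)
q = 0.7 * norm.pdf(x, -1, 1.2) + 0.3 * norm.pdf(x, 3, 0.5)

# Bhattacharyya coefficient: integral of sqrt(p * q); 1 = identical, 0 = disjoint
bc = np.sum(np.sqrt(p * q)) * dx
print(f"Bhattacharyya coefficient = {bc:.3f}")
```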

  13. Development and implementation of an electronic health record generated surgical handoff and rounding tool.

    PubMed

    Raval, Mehul V; Rust, Laura; Thakkar, Rajan K; Kurtovic, Kelli J; Nwomeh, Benedict C; Besner, Gail E; Kenney, Brian D

    2015-02-01

    Electronic health records (EHR) have been adopted across the nation at tremendous effort and expense. The purpose of this study was to assess improvements in accuracy, efficiency, and patient safety for a high-volume pediatric surgical service with adoption of an EHR-generated handoff and rounding list. The quality and quantity of errors were compared pre- and post-EHR-based list implementation. A survey was used to determine time spent by team members using the two versions of the list. Perceived utility, safety, and quality of the list were reported. Serious safety events determined by the hospital were also compared for the two periods. The EHR-based list eliminated clerical errors while improving efficiency by automatically providing data such as vital signs. Survey respondents reported 43 min saved per week per team member, translating to 372 work hours of time saved annually for a single service. EHR-based list users reported higher satisfaction and perceived improvement in efficiency, accuracy, and safety. Serious safety events remained unchanged. In conclusion, creation of an EHR-based list to assist with daily handoffs, rounding, and patient management demonstrated improved accuracy, increased efficiency, and assisted in maintaining a high level of safety. PMID:25631842

  14. Second harmonic generation imaging as a potential tool for staging pregnancy and predicting preterm birth

    NASA Astrophysics Data System (ADS)

    Akins, Meredith L.; Luby-Phelps, Katherine; Mahendroo, Mala

    2010-03-01

    We use second harmonic generation (SHG) microscopy to assess changes in collagen structure of murine cervix during cervical remodeling of normal pregnancy and in a preterm birth model. Visual inspection of SHG images revealed substantial changes in collagen morphology throughout normal gestation. SHG images collected in both the forward and backward directions were analyzed quantitatively for changes in overall mean intensity, forward to backward intensity ratio, collagen fiber size, and porosity. Changes in mean SHG intensity and intensity ratio take place in early pregnancy, suggesting that submicroscopic changes in collagen fibril size and arrangement occur before macroscopic changes become evident. Fiber size progressively increased from early to late pregnancy, while pores between collagen fibers became larger and farther apart. Analysis of collagen features in premature cervical remodeling show that changes in collagen structure are dissimilar from normal remodeling. The ability to quantify multiple morphological features of collagen that characterize normal cervical remodeling and distinguish abnormal remodeling in preterm birth models supports future studies aimed at development of SHG endoscopic devices for clinical assessment of collagen changes during pregnancy in women and for predicting risk of preterm labor which occurs in 12.5% of all pregnancies.
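    As a rough illustration of the per-image measures named above (mean SHG intensity, forward-to-backward intensity ratio, and porosity), the sketch below computes them on synthetic arrays. The arrays stand in for real forward- and backward-detected SHG micrographs, and the porosity threshold is an arbitrary assumption, not the authors' analysis pipeline.

```python
# Sketch of simple per-image SHG measures on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
forward = rng.poisson(lam=40, size=(512, 512)).astype(float)   # forward-detected SHG
backward = rng.poisson(lam=10, size=(512, 512)).astype(float)  # backward-detected SHG

mean_intensity = forward.mean()
fb_ratio = forward.sum() / backward.sum()

# Porosity here: fraction of pixels below a (crude) collagen intensity threshold
threshold = 0.5 * forward.mean()
porosity = (forward < threshold).mean()

print(f"mean SHG intensity: {mean_intensity:.1f}")
print(f"forward/backward ratio: {fb_ratio:.2f}")
print(f"porosity (fraction of sub-threshold pixels): {porosity:.3f}")
```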

  15. Transposon assisted gene insertion technology (TAGIT): a tool for generating fluorescent fusion proteins.

    PubMed

    Gregory, James A; Becker, Eric C; Jung, James; Tuwatananurak, Ida; Pogliano, Kit

    2010-01-01

    We constructed a transposon (transposon assisted gene insertion technology, or TAGIT) that allows the random insertion of gfp (or other genes) into chromosomal loci without disrupting operon structure or regulation. TAGIT is a modified Tn5 transposon that uses Kan(R) to select for insertions on the chromosome or plasmid, beta-galactosidase to identify in-frame gene fusions, and Cre recombinase to excise the kan and lacZ genes in vivo. The resulting gfp insertions maintain target gene reading frame (to the 5' and 3' of gfp) and are integrated at the native chromosomal locus, thereby maintaining native expression signals. Libraries can be screened to identify GFP insertions that maintain target protein function at native expression levels, allowing more trustworthy localization studies. Here we use TAGIT to generate a library of GFP insertions in the Escherichia coli lactose repressor (LacI). We identified fully functional GFP insertions and partially functional insertions that bind DNA but fail to repress the lacZ operon. Several of these latter GFP insertions localize to lacO arrays integrated in the E. coli chromosome without producing the elongated cells frequently observed when functional LacI-GFP fusions are used in chromosome tagging experiments. TAGIT thereby facilitates the isolation of fully functional insertions of fluorescent proteins into target proteins expressed from the native chromosomal locus as well as potentially useful partially functional proteins. PMID:20090956

  16. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next-generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that passed the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Nimar S. Arora, Stuart Russell, and Erik Sudderth, Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.

  17. Generation of Fluorogen-Activating Designed Ankyrin Repeat Proteins (FADAs) as Versatile Sensor Tools.

    PubMed

    Schütz, Marco; Batyuk, Alexander; Klenk, Christoph; Kummer, Lutz; de Picciotto, Seymour; Gülbakan, Basri; Wu, Yufan; Newby, Gregory A; Zosel, Franziska; Schöppe, Jendrik; Sedlák, Erik; Mittl, Peer R E; Zenobi, Renato; Wittrup, K Dane; Plückthun, Andreas

    2016-03-27

    Fluorescent probes constitute a valuable toolbox to address a variety of biological questions and they have become irreplaceable for imaging methods. Commonly, such probes consist of fluorescent proteins or small organic fluorophores coupled to biological molecules of interest. Recently, a novel class of fluorescence-based probes, fluorogen-activating proteins (FAPs), has been reported. These binding proteins are based on antibody single-chain variable fragments and activate fluorogenic dyes, which only become fluorescent upon activation and do not fluoresce when free in solution. Here we present a novel class of fluorogen activators, termed FADAs, based on the very robust designed ankyrin repeat protein scaffold, which also readily folds in the reducing environment of the cytoplasm. The FADA generated in this study was obtained by combined selections with ribosome display and yeast surface display. It enhances the fluorescence of malachite green (MG) dyes by a factor of more than 11,000 and thus activates MG to a similar extent as FAPs based on single-chain variable fragments. As shown by structure determination and in vitro measurements, this FADA was evolved to form a homodimer for the activation of MG dyes. Exploiting the favorable properties of the designed ankyrin repeat protein scaffold, we created a FADA biosensor suitable for imaging of proteins on the cell surface, as well as in the cytosol. Moreover, based on the requirement of dimerization for strong fluorogen activation, a prototype FADA biosensor for in situ detection of a target protein and protein-protein interactions was developed. Therefore, FADAs are versatile fluorescent probes that are easily produced and suitable for diverse applications and thus extend the FAP technology. PMID:26812208

  18. Astronomy as a Tool for Training the Next Generation Technical Workforce

    NASA Astrophysics Data System (ADS)

    Romero, V.; Walsh, G.; Ryan, W.; Ryan, E.

    A major challenge for today's institutes of higher learning is training the next generation of scientists, engineers, and optical specialists to be proficient in the latest technologies they will encounter when they enter the workforce. Although research facilities can offer excellent hands-on instructional opportunities, integrating such experiential learning into academic coursework without disrupting normal operations at such facilities can be difficult. Also, motivating entry level students to increase their skill levels by undertaking and successfully completing difficult coursework can require more creative instructional approaches, including fostering a fun, non-threatening environment for enhancing basic abilities. Astronomy is a universally appealing subject area, and can be very effective as a foundation for cultivating advanced competencies. We report on a project underway at the New Mexico Institute of Mining and Technology (NM Tech), a science and engineering school in Socorro, NM, to incorporate a state-of-the-art optical telescope and laboratory experiments into an entry-level course in basic engineering. Students enrolled in an explosive engineering course were given a topical problem in Planetary Astronomy: they were asked to develop a method to energetically mitigate a potentially hazardous impact between our planet and a Near-Earth asteroid to occur sometime in the future. They were first exposed to basic engineering training in the areas of fracture and material response to failure under different environmental conditions through lectures and traditional laboratory exercises. The students were then given access to NM Tech's Magdalena Ridge Observatory's (MRO) 2.4-meter telescope to collect physical characterization data, (specifically shape information) on two potentially hazardous asteroids (one roughly spherical, the other an elongated ellipsoid). Finally, the students used NM Tech's Energetic Materials Research and Testing Center (EMRTC) to

  19. The Society-Deciders Model and Fairness in Nations

    NASA Astrophysics Data System (ADS)

    Flomenbom, Ophir

    2015-05-01

    Modeling the dynamics in nations from economic and sociological perspectives is a central theme in economics and sociology. Accurate models can predict and therefore help all the world's citizens. Yet recent years have shown that adequate models are lacking. Here, we develop a dynamical society-deciders model that can explain the stability in a nation, based on concepts from dynamics, ecology and socio-econo-physics; a nation has two groups that interconnect, the deciders and the society. We show that a nation is either stable or it collapses. This depends on just two coefficients that we relate to sociological and economic indicators. We define a new socio-economic indicator, fairness. Fairness can measure the stability in a nation and how probable a change favoring the society is. We compute fairness among all the world's nations. Interestingly, in comparison with other indicators, fairness shows that the USA loses its rank among Western democracies, India is the best among the 15 most populated nations, and Egypt, Libya and Tunisia have significantly improved their rankings as a result of recent revolutions, further increasing the probability of additional positive changes. Within the model, long-lasting crises are solved not with increased governmental spending or cuts, but with regulations that reduce the stability of the deciders, namely by increasing fairness while, for example, shifting wealth in the direction of the people and therefore further increasing opportunities.
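    The record does not reproduce the model's equations, so the sketch below is only a generic illustration of the stated idea that stability hinges on two coefficients: a two-group linear system whose fate is decided by the eigenvalues of its coupling matrix. The coefficient values and the self-damping terms are assumptions for illustration, not the paper's calibration.

```python
# Generic two-group stability illustration (not the paper's actual equations):
# d/dt [society, deciders] = A @ [society, deciders] is stable if all
# eigenvalues of A have negative real part.
import numpy as np

def stable(a, b):
    """True if the coupled society-deciders system decays back to equilibrium."""
    A = np.array([[-1.0, a],
                  [b, -1.0]])
    return bool(np.all(np.real(np.linalg.eigvals(A)) < 0))

print(stable(0.5, 0.5))   # True: weak mutual reinforcement, the system relaxes
print(stable(2.0, 2.0))   # False: strong mutual reinforcement, runaway growth
```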

  20. Generation of a Knockout Mouse Embryonic Stem Cell Line Using a Paired CRISPR/Cas9 Genome Engineering Tool.

    PubMed

    Wettstein, Rahel; Bodak, Maxime; Ciaudo, Constance

    2016-01-01

    CRISPR/Cas9, originally discovered as a bacterial immune system, has recently been engineered into the latest tool to successfully introduce site-specific mutations in a variety of different organisms. Composed only of the Cas9 protein as well as one engineered guide RNA for its functionality, this system is much less complex in its setup and easier to handle than other guided nucleases such as zinc-finger nucleases or TALENs. Here, we describe the simultaneous transfection of two paired CRISPR sgRNA-Cas9 plasmids in mouse embryonic stem cells (mESCs), resulting in the knockout of the selected target gene. Together with a four-primer evaluation system, this provides an efficient way to generate new independent knockout mouse embryonic stem cell lines. PMID:25762293
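
    A four-primer evaluation of this kind can be pictured as a simple decision rule on PCR products: outer primers flank the targeted region and inner primers lie inside it, so a shortened outer product together with loss of the inner product points to a deletion. The sketch below is a hypothetical reading of such a scheme; the primer layout, product sizes and tolerance are assumptions, not the protocol's actual values.

      # Hypothetical four-primer genotyping rule (illustrative only; see the
      # original protocol for the actual evaluation scheme and primer design).
      def classify_clone(outer_product_bp, inner_product_present,
                         wt_outer_bp=2000, deletion_bp=1500, tol=100):
          if inner_product_present and abs(outer_product_bp - wt_outer_bp) < tol:
              return "wild type"
          if (not inner_product_present
                  and abs(outer_product_bp - (wt_outer_bp - deletion_bp)) < tol):
              return "homozygous knockout candidate"
          return "heterozygous or ambiguous - verify by sequencing"

      print(classify_clone(520, False))    # -> homozygous knockout candidate
      print(classify_clone(1980, True))    # -> wild type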

  1. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays have been the standard technologies for detecting large regions subject to copy number changes in genomes; only recently has high-resolution sequence data from next-generation sequencing (NGS) become available for such analysis. During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169
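
    Most read-depth approaches in this family reduce to comparing per-window coverage against a genome-wide baseline. The sketch below is a deliberately minimal, single-sample illustration of that idea; the window handling, normalization and thresholds are assumptions, not any reviewed tool's defaults.

      import numpy as np

      def call_cnv_from_depth(window_counts, gain_z=3.0, loss_z=-3.0):
          """Very simplified read-depth CNV caller: z-score each window's read
          count against the genome-wide distribution and flag outliers."""
          counts = np.asarray(window_counts, dtype=float)
          z = (counts - counts.mean()) / counts.std()
          calls = []
          for i, zi in enumerate(z):
              if zi >= gain_z:
                  calls.append((i, "gain", round(float(zi), 2)))
              elif zi <= loss_z:
                  calls.append((i, "loss", round(float(zi), 2)))
          return calls

      # Example: windows 120-124 show a duplication-like jump in coverage.
      depth = [100] * 120 + [200] * 5 + [100] * 120
      print(call_cnv_from_depth(depth))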

  2. Generation of an ABCG2{sup GFPn-puro} transgenic line - A tool to study ABCG2 expression in mice

    SciTech Connect

    Orford, Michael; Mean, Richard; Lapathitis, George; Genethliou, Nicholas; Panayiotou, Elena; Panayi, Helen; Malas, Stavros

    2009-06-26

    The ATP-binding cassette (ABC) transporter 2 (ABCG2) is expressed by stem cells in many organs and in stem cells of solid tumors. These cells are isolated based on the side population (SP) phenotype, a Hoechst 33342 dye efflux property believed to be conferred by ABCG2. Because of the limitations of this approach, we generated transgenic mice that express nuclear GFP (GFPn) coupled to the puromycin-resistance gene, under the control of ABCG2 promoter/enhancer sequences. We show that ABCG2 is expressed in neural progenitors of the developing forebrain and spinal cord and in embryonic and adult endothelial cells of the brain. Using the neurosphere assay, we isolated tripotent ABCG2-expressing neural stem cells from embryonic mouse brain. This transgenic line is a powerful tool for studying the expression of ABCG2 in many tissues and for performing functional studies in different experimental settings.

  3. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  4. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  5. A new generation of tools for search, recovery and quality evaluation of World Wide Web medical resources.

    PubMed

    Aguillo, I

    2000-01-01

    Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of topics addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites. PMID:11142063
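
    The hypertext-visibility idea can be made concrete in a few lines of code: count how many pages in a controlled sample of quality super-sites link to each candidate resource and rank by that count. The link data below is a made-up stand-in for a real crawl, and the site names are hypothetical.

      from collections import Counter

      # Hypothetical crawl of a controlled sample of quality "super-sites":
      # each maps to the external medical sites it links to.
      links_from_supersites = {
          "supersite-a.org": ["site1.example", "site2.example"],
          "supersite-b.edu": ["site2.example", "site3.example"],
          "supersite-c.gov": ["site2.example"],
      }

      # Visibility indicator: number of super-site pages citing each resource.
      visibility = Counter(target
                           for targets in links_from_supersites.values()
                           for target in targets)
      print(visibility.most_common())   # site2.example ranks highest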

  6. A Scalable and Accurate Targeted Gene Assembly Tool (SAT-Assembler) for Next-Generation Sequencing Data

    PubMed Central

    Zhang, Yuan; Sun, Yanni; Cole, James R.

    2014-01-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209
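
    The core data structure behind this class of assembler is an overlap graph: reads become nodes, suffix-prefix overlaps above a minimum length become edges, and contigs are read off paths through the graph. The toy sketch below illustrates only that generic idea; it is not SAT-Assembler's homology-guided construction, and the overlap threshold and reads are arbitrary.

      # Minimal overlap-graph sketch (illustrative; not SAT-Assembler's code).
      MIN_OVERLAP = 4

      def overlap(a, b):
          """Length of the longest suffix of a that is a prefix of b (>= MIN_OVERLAP)."""
          for k in range(min(len(a), len(b)), MIN_OVERLAP - 1, -1):
              if a[-k:] == b[:k]:
                  return k
          return 0

      reads = ["ATTAGACC", "GACCTGAA", "TGAACCGT"]
      edges = {(i, j): overlap(a, b) for i, a in enumerate(reads)
               for j, b in enumerate(reads) if i != j and overlap(a, b)}
      print(edges)                                # {(0, 1): 4, (1, 2): 4}

      # Greedy walk along the overlaps to build one contig.
      contig = reads[0] + reads[1][4:] + reads[2][4:]
      print(contig)                               # ATTAGACCTGAACCGT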

  7. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    PubMed

    Zhang, Yuan; Sun, Yanni; Cole, James R

    2014-08-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209

  8. Next-Generation Phage Display: Integrating and Comparing Available Molecular Tools to Enable Cost-Effective High-Throughput Analysis

    PubMed Central

    Dias-Neto, Emmanuel; Nunes, Diana N.; Giordano, Ricardo J.; Sun, Jessica; Botz, Gregory H.; Yang, Kuan; Setubal, João C.; Pasqualini, Renata; Arap, Wadih

    2009-01-01

    Background: Combinatorial phage display has been used in the last 20 years in the identification of protein-ligands and protein-protein interactions, uncovering relevant molecular recognition events. Rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges. Methodology/Principal Findings: We gained efficiency by applying in tandem real-time PCR for rapid quantification to enable bacteria-free phage display library screening, and added phage DNA next-generation sequencing for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy in a longer timeframe post library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing-units (TU), with no biases due to GC content, codon usage, and amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate, and reduce costs ∼250-fold for generating 10^6 ligand sequences. Conclusions/Significance: Our analyses demonstrate that whereas this approach correlates with the traditional colony-counting, it is also capable of a much larger sampling, allowing a faster, less expensive, more accurate and consistent analysis of phage enrichment. Overall, qPhage plus pyrosequencing is
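
    The quantification step that replaces colony counting boils down to a qPCR standard curve: Ct is approximately linear in log10 of the template copy number, so fitting that line on known standards lets unknown phage samples be converted from Ct to copies. The sketch below shows only that arithmetic, with made-up standard-curve values.

      import numpy as np

      # Hypothetical standard curve: Ct values measured for known phage copy numbers.
      std_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
      std_ct     = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

      # Fit Ct = slope * log10(copies) + intercept, then invert for unknowns.
      slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

      def copies_from_ct(ct):
          return 10 ** ((ct - intercept) / slope)

      print(round(copies_from_ct(22.0)))   # estimated phage copies in an unknown sample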

  9. Evaluation of pulsed laser ablation in liquids generated gold nanoparticles as novel transfection tools: efficiency and cytotoxicity

    NASA Astrophysics Data System (ADS)

    Willenbrock, Saskia; Durán, María Carolina; Barchanski, Annette; Barcikowski, Stephan; Feige, Karsten; Nolte, Ingo; Murua Escobar, Hugo

    2014-03-01

    Varying transfection efficiencies and cytotoxicity are crucial aspects in cell manipulation. The utilization of gold nanoparticles (AuNP) has lately attracted special interest to enhance transfection efficiency. Conventional AuNP are usually generated by chemical reactions or gas pyrolysis, often requiring cell-toxic stabilizers or coatings to conserve their characteristics. Alternatively, stabilizer- and coating-free, highly pure, colloidal AuNP can be generated by pulsed laser ablation in liquids (PLAL). Mammalian cells were transfected efficiently by addition of PLAL-AuNP, but data systematically evaluating the cell-toxic potential are lacking. Herein, the transfection efficiency and cytotoxicity of PLAL-AuNP were evaluated by transfection of a mammalian cell line with a recombinant HMGB1/GFP DNA expression vector. Different methods were compared using two sizes of PLAL-AuNP, commercialized AuNP, two magnetic NP-based protocols and a conventional transfection reagent (FuGENE HD; FHD). PLAL-AuNP were generated using a Spitfire Pro femtosecond laser system delivering 120 fs laser pulses at a wavelength of 800 nm, focusing the fs-laser beam on a 99.99% pure gold target placed in ddH2O. Transfection efficiencies were analyzed after 24 h using fluorescence microscopy and flow cytometry. Toxicity was assessed by measuring cell proliferation and the percentage of necrotic, propidium iodide positive cells (PI %). The addition of PLAL-AuNP significantly enhanced transfection efficiencies (FHD: 31 %; PLAL-AuNP size-1: 46 %; size-2: 50 %) with increased PI % but no reduction in cell proliferation. Commercial AuNP transfection showed significantly lower efficiency (23 %), slightly increased PI % and reduced cell proliferation. Magnetic NP-based methods were less effective but also showed the lowest cytotoxicity. In conclusion, addition of PLAL-AuNP provides a novel tool for transfection efficiency enhancement with acceptable cytotoxic side-effects.

  10. Do sunbirds use taste to decide how much to drink?

    PubMed

    Bailey, Ida E; Nicolson, Susan W

    2016-03-01

    Nectarivorous birds typically consume smaller meals of more concentrated than of less concentrated sugar solutions. It is not clear, however, whether they use taste to decide how much to consume or whether they base this decision on post-ingestive feedback. Taste, a cue to nectar concentration, is available to nectarivores during ingestion whereas post-ingestive information about resource quality becomes available only after a meal. When conditions are variable, we would expect nectarivorous birds to base their decisions on how much to consume on taste, as post-ingestive feedback from previous meals would not be a reliable cue to current resource quality. Here, we tested whether white-bellied sunbirds (Cinnyris talatala), foraging from an array of artificial flowers, use taste to decide how much to consume per meal when nectar concentration is highly variable: they did not. Instead, how much they chose to consume per meal appeared to depend on the energy intake at the previous meal, that is how hungry they were. Our birds did, however, appear to use taste to decide how much to consume per flower visited within a meal. Unexpectedly, some individuals preferred to consume more from flowers with lower concentration rewards and some preferred to do the opposite. We draw attention to the fact that many studies perhaps misleadingly claim that birds use sweet taste to inform their foraging decisions, as they analyse mean data for multiple meals over which post-ingestive feedback will have become available rather than data for individual meals when only sensory information is available. We discuss how conflicting foraging rules could explain why sunbirds do not use sweet taste to inform their meal size decisions. PMID:26618299

  11. Wireless sensor systems for sense/decide/act/communicate.

    SciTech Connect

    Berry, Nina M.; Cushner, Adam; Baker, James A.; Davis, Jesse Zehring; Stark, Douglas P.; Ko, Teresa H.; Kyker, Ronald D.; Stinnett, Regan White; Pate, Ronald C.; Van Dyke, Colin; Kyckelhahn, Brian

    2003-12-01

    After 9/11, the United States (U.S.) was suddenly pushed into challenging situations it could no longer ignore as a simple spectator. The War on Terrorism (WoT) was suddenly ignited and no one knows when this war will end. While the government is exploring many existing and potential technologies, the area of wireless sensor networks (WSN) has emerged as a foundation for establishing future national security. Unlike other technologies, WSN could provide the virtual presence capabilities needed for precision awareness and response in military, intelligence, and homeland security applications. The Advanced Concepts Group (ACG) vision of a Sense/Decide/Act/Communicate (SDAC) sensor system is an instantiation of the WSN concept that takes a 'systems of systems' view. Each sensing node will exhibit the ability to: Sense the environment around them, Decide as a collective what the situation of their environment is, Act in an intelligent and coordinated manner in response to this situational determination, and Communicate their actions amongst each other and to a human command. This LDRD report provides a review of the research and development done to bring the SDAC vision closer to reality.

  12. Next generation genome-wide association tool: Design and coverage of a high-throughput European-optimized SNP array

    PubMed Central

    Hoffmann, Thomas J.; Kvale, Mark N.; Hesselson, Stephanie E.; Zhan, Yiping; Aquino, Christine; Cao, Yang; Cawley, Simon; Chung, Elaine; Connell, Sheryl; Eshragh, Jasmin; Ewing, Marcia; Gollub, Jeremy; Henderson, Mary; Hubbell, Earl; Iribarren, Carlos; Kaufman, Jay; Lao, Richard Z.; Lu, Yontao; Ludwig, Dana; Mathauda, Gurpreet K.; McGuire, William; Mei, Gangwu; Miles, Sunita; Purdy, Matthew M.; Quesenberry, Charles; Ranatunga, Dilrini; Rowell, Sarah; Sadler, Marianne; Shapero, Michael H.; Shen, Ling; Shenoy, Tanushree R.; Smethurst, David; Van den Eeden, Stephen K.; Walter, Larry; Wan, Eunice; Wearley, Reid; Webster, Teresa; Wen, Christopher C.; Weng, Li; Whitmer, Rachel A.; Williams, Alan; Wong, Simon C.; Zau, Chia; Finn, Andrea; Schaefer, Catherine; Kwok, Pui-Yan; Risch, Neil

    2011-01-01

    The success of genome-wide association studies has paralleled the development of efficient genotyping technologies. We describe the development of a next-generation microarray based on the new highly-efficient Affymetrix Axiom genotyping technology that we are using to genotype individuals of European ancestry from the Kaiser Permanente Research Program on Genes, Environment and Health (RPGEH). The array contains 674,517 SNPs, and provides excellent genome-wide as well as gene-based and candidate-SNP coverage. Coverage was calculated using an approach based on imputation and cross validation. Preliminary results for the first 80,301 saliva-derived DNA samples from the RPGEH demonstrate very high quality genotypes, with sample success rates above 94% and over 98% of successful samples having SNP call rates exceeding 98%. At steady state, we have produced 462 million genotypes per week for each Axiom system. The new array provides a valuable addition to the repertoire of tools for large scale genome-wide association studies. PMID:21565264
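
    The quality metrics quoted above (sample success rates and SNP call rates) are straightforward to compute from a genotype matrix with missing calls, as in the toy sketch below. The encoding is illustrative; only the 98% threshold is borrowed from the abstract.

      import numpy as np

      # Toy genotype matrix: rows = samples, columns = SNPs,
      # values 0/1/2 = allele counts, -1 = missing call (illustrative encoding).
      geno = np.array([[0, 1, 2, -1],
                       [1, 1, -1, 2],
                       [2, 0, 1, 1]])

      called = geno >= 0
      sample_call_rate = called.mean(axis=1)   # per-sample call rate
      snp_call_rate = called.mean(axis=0)      # per-SNP call rate

      # QC in the spirit of the reported thresholds: keep samples whose
      # call rate exceeds 98%.
      keep_samples = sample_call_rate > 0.98
      print(sample_call_rate, snp_call_rate, keep_samples)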

  13. Dynamic combinatorial/covalent chemistry: a tool to read, generate and modulate the bioactivity of compounds and compound mixtures.

    PubMed

    Herrmann, Andreas

    2014-03-21

    Reversible covalent bond formation under thermodynamic control adds reactivity to self-assembled supramolecular systems, and is therefore an ideal tool to assess complexity of chemical and biological systems. Dynamic combinatorial/covalent chemistry (DCC) has been used to read structural information by selectively assembling receptors with the optimum molecular fit around a given template from a mixture of reversibly reacting building blocks. This technique allows access to efficient sensing devices and the generation of new biomolecules, such as small molecule receptor binders for drug discovery, but also larger biomimetic polymers and macromolecules with particular three-dimensional structural architectures. Adding a kinetic factor to a thermodynamically controlled equilibrium results in dynamic resolution and in self-sorting and self-replicating systems, all of which are of major importance in biological systems. Furthermore, the temporary modification of bioactive compounds by reversible combinatorial/covalent derivatisation allows control of their release and facilitates their transport across amphiphilic self-assembled systems such as artificial membranes or cell walls. The goal of this review is to give a conceptual overview of how the impact of DCC on supramolecular assemblies at different levels can allow us to understand, predict and modulate the complexity of biological systems. PMID:24296754

  14. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes.

    PubMed

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A; Albrectsen, Benedicte R; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high throughput assay for characterization and monitoring the microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of the PM data challenging. For the best result, we recommend that the growth conditions for the fungi are carefully standardized. In addition, rigorous replication and control strategies should be employed whether using pre-configured, commercial microwell-plates or in-house designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool to characterize the metabolic capabilities of individual endophyte isolates, or successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees. PMID:26441951

  15. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes

    PubMed Central

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A.; Albrectsen, Benedicte R.; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high throughput assay for characterization and monitoring the microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of the PM data challenging. For the best result, we recommend that the growth conditions for the fungi are carefully standardized. In addition, rigorous replication and control strategies should be employed whether using pre-configured, commercial microwell-plates or in-house designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool to characterize the metabolic capabilities of individual endophyte isolates, or successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees. PMID:26441951

  16. Profiling biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets using gastrointestinal simulation technology.

    PubMed

    Wu, Chunnuan; Sun, Le; Sun, Jin; Yang, Yajun; Ren, Congcong; Ai, Xiaoyu; Lian, He; He, Zhonggui

    2013-09-10

    The aim of the present study was to correlate the in vitro properties of a drug formulation with its in vivo performance, and to elucidate the deciding properties of oral absorption. Gastrointestinal simulation technology (GST) was used to simulate the in vivo plasma concentration-time curve and was implemented by GastroPlus™ software. Lansoprazole, a typical BCS class II drug, was chosen as a model drug. Firstly, physicochemical and pharmacokinetic parameters of lansoprazole were determined or collected from the literature to construct the model. Validation of the developed model was performed by comparison of the predicted and the experimental plasma concentration data. We found that the predicted curve was in good agreement with the experimental data. Then, parameter sensitivity analysis (PSA) was performed to find the key parameters of oral absorption. The absorption was particularly sensitive to dose, solubility and particle size for lansoprazole enteric-coated tablets. With a single dose of 30 mg and a solubility of 0.04 mg/ml, the absorption was complete. Good absorption could be achieved with a lansoprazole particle radius down to about 25 μm. In summary, GST is a useful tool for profiling biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets and guiding the formulation optimization. PMID:23806811
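
    Parameter sensitivity analysis of this general kind can be sketched as a one-at-a-time perturbation of each input around its baseline, recording the normalized change in the model output. The toy absorption model below is an illustrative stand-in, not the GastroPlus model; only the baseline values (30 mg dose, 0.04 mg/ml solubility, 25 μm radius) are taken from the abstract, and the scaling constant is arbitrary.

      import math

      def sensitivity(model, baseline, delta=0.2):
          """One-at-a-time sensitivity: normalized change in the model output
          when each parameter is perturbed by +/- delta around its baseline."""
          f0 = model(**baseline)
          result = {}
          for name, value in baseline.items():
              hi = model(**{**baseline, name: value * (1 + delta)})
              lo = model(**{**baseline, name: value * (1 - delta)})
              result[name] = (hi - lo) / (2 * delta * f0)
          return result

      # Toy dissolution-limited absorption model (illustrative stand-in only):
      # fraction absorbed rises with solubility, falls with dose and particle
      # radius. The constant 800.0 is an arbitrary scaling for the example.
      def toy_fraction_absorbed(dose, solubility, radius):
          return 1.0 - math.exp(-800.0 * solubility / (dose * radius))

      baseline = {"dose": 30.0, "solubility": 0.04, "radius": 25.0}
      print(sensitivity(toy_fraction_absorbed, baseline))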

  17. Methylation status of IGFBP-3 as a useful clinical tool for deciding on a concomitant radiotherapy

    PubMed Central

    Pernía, Olga; Belda-Iniesta, Cristobal; Pulido, Veronica; Cortes-Sempere, María; Rodriguez, Carlos; Vera, Olga; Soto, Javier; Jiménez, Julia; Taus, Alvaro; Rojo, Federico; Arriola, Edurne; Rovira, Ana; Albanell, Joan; Macías, M Teresa; de Castro, Javier; Perona, Rosario; Ibañez de Caceres, Inmaculada

    2014-01-01

    The methylation status of the IGFBP-3 gene is strongly associated with cisplatin sensitivity in patients with non-small cell lung cancer (NSCLC). In this study, we found in vitro evidence that linked the presence of an unmethylated promoter with poor response to radiation. Our data also indicate that radiation might sensitize chemotherapy-resistant cells by reactivating IGFBP-3-expression through promoter demethylation, inactivating the PI3K/AKT pathway. We also explored the IGFBP-3 methylation effect on overall survival (OS) in a population of 40 NSCLC patients who received adjuvant therapy after R0 surgery. Our results indicate that patients harboring an unmethylated promoter could benefit more from a chemotherapy schedule alone than from a multimodality therapy involving radiotherapy and platinum-based treatments, increasing their OS by 2.5 y (p = .03). Our findings discard this epi-marker as a prognostic factor in a patient population without adjuvant therapy, indicating that radiotherapy does not improve survival for patients harboring an unmethylated IGFBP-3 promoter. PMID:25482372

  18. Methylation status of IGFBP-3 as a useful clinical tool for deciding on a concomitant radiotherapy.

    PubMed

    Pernía, Olga; Belda-Iniesta, Cristobal; Pulido, Veronica; Cortes-Sempere, María; Rodriguez, Carlos; Vera, Olga; Soto, Javier; Jiménez, Julia; Taus, Alvaro; Rojo, Federico; Arriola, Edurne; Rovira, Ana; Albanell, Joan; Macías, M Teresa; de Castro, Javier; Perona, Rosario; Ibañez de Caceres, Inmaculada

    2014-11-01

    The methylation status of the IGFBP-3 gene is strongly associated with cisplatin sensitivity in patients with non-small cell lung cancer (NSCLC). In this study, we found in vitro evidence that linked the presence of an unmethylated promoter with poor response to radiation. Our data also indicate that radiation might sensitize chemotherapy-resistant cells by reactivating IGFBP-3-expression through promoter demethylation, inactivating the PI3K/AKT pathway. We also explored the IGFBP-3 methylation effect on overall survival (OS) in a population of 40 NSCLC patients who received adjuvant therapy after R0 surgery. Our results indicate that patients harboring an unmethylated promoter could benefit more from a chemotherapy schedule alone than from a multimodality therapy involving radiotherapy and platinum-based treatments, increasing their OS by 2.5 y (p = .03). Our findings discard this epi-marker as a prognostic factor in a patient population without adjuvant therapy, indicating that radiotherapy does not improve survival for patients harboring an unmethylated IGFBP-3 promoter. PMID:25482372

  19. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. This kind of termination criteria has been criticized for its lack of automation. In an earlier paper we have shown how to construct an automated termination criterion if the recursion is aligned with the rewrite relation. We have demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.

  20. Monte Carlo event generators in atomic collisions: A new tool to tackle the few-body dynamics

    NASA Astrophysics Data System (ADS)

    Ciappina, M. F.; Kirchner, T.; Schulz, M.

    2010-04-01

    We present a set of routines to produce theoretical event files, for both single and double ionization of atoms by ion impact, based on a Monte Carlo event generator (MCEG) scheme. Such event files are the theoretical counterpart of the data obtained from a kinematically complete experiment; i.e. they contain the momentum components of all collision fragments for a large number of ionization events. Among the advantages of working with theoretical event files is the possibility to incorporate the conditions present in a real experiment, such as the uncertainties in the measured quantities. Additionally, by manipulating them it is possible to generate any type of cross sections, especially those that are usually too complicated to compute with conventional methods due to a lack of symmetry. Consequently, the numerical effort of such calculations is dramatically reduced. We show examples for both single and double ionization, with special emphasis on a new data analysis tool, called four-body Dalitz plots, developed very recently.
    Program summary
    Program title: MCEG
    Catalogue identifier: AEFV_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFV_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2695
    No. of bytes in distributed program, including test data, etc.: 18 501
    Distribution format: tar.gz
    Programming language: FORTRAN 77 with parallelization directives using scripting
    Computer: Single machines using Linux and Linux servers/clusters (with cores with any clock speed, cache memory and bits in a word)
    Operating system: Linux (any version and flavor) and FORTRAN 77 compilers
    Has the code been vectorised or parallelized?: Yes
    RAM: 64-128 kBytes (the codes are very cpu intensive)
    Classification: 2.6
    Nature of problem: The code deals with single and double
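
    In spirit, working with theoretical event files amounts to drawing a large sample of fragment momenta from the model, optionally folding in the experimental resolution, and then histogramming whatever observable is wanted. The Python sketch below mimics only that workflow with a toy momentum distribution; it is unrelated to the actual FORTRAN 77 code and its physics, and the resolution value is an assumption.

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative "event file": each row holds the momentum components of the
      # emitted electron for one ionization event (toy distribution only).
      n_events = 100_000
      p_true = rng.normal(loc=[0.0, 0.0, 1.0], scale=0.3, size=(n_events, 3))

      # Fold in an experimental momentum resolution, as one would when comparing
      # to a kinematically complete experiment.
      p_meas = p_true + rng.normal(scale=0.05, size=p_true.shape)

      # Any spectrum (here: the longitudinal-momentum distribution) is then just
      # a histogram over the event file.
      hist, edges = np.histogram(p_meas[:, 2], bins=50, range=(-1.0, 3.0))
      print(hist[:10])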

  1. E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making and then directly to mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain a situational awareness

  2. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    Kyriacou, S.; Kontoleontos, E.; Weissenberger, S.; Mangani, L.; Casartelli, E.; Skouteropoulou, I.; Gattringer, M.; Gehrer, A.; Buchmayr, M.

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
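
    The metamodel-assisted step can be sketched in a few dozen lines: offspring are first scored with a cheap surrogate trained on the archive of already-evaluated designs, and only the most promising ones receive the expensive evaluation. The sketch below uses an analytic test function in place of the CFD solve and an inverse-distance surrogate in place of the PCA-driven metamodel, so all names and settings are illustrative assumptions rather than the EASY implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def expensive_eval(x):
          # Stand-in for the CFD solve: a simple analytic test function.
          return float(np.sum((x - 0.3) ** 2))

      def surrogate(x, archive_x, archive_f, k=3):
          # Inverse-distance weighted average of the k nearest archived designs.
          d = np.linalg.norm(archive_x - x, axis=1) + 1e-12
          idx = np.argsort(d)[:k]
          w = 1.0 / d[idx]
          return float(np.dot(w, archive_f[idx]) / w.sum())

      dim, pop_size, n_gen = 4, 20, 30
      pop = rng.random((pop_size, dim))
      fit = np.array([expensive_eval(x) for x in pop])
      archive_x, archive_f = pop.copy(), fit.copy()

      for _ in range(n_gen):
          # Offspring by mutation of randomly chosen parents.
          parents = pop[rng.integers(pop_size, size=pop_size)]
          offspring = np.clip(parents + rng.normal(0, 0.1, parents.shape), 0, 1)
          # Surrogate pre-screening: only the most promising half gets the
          # expensive evaluation (the metamodel-assisted step).
          scores = np.array([surrogate(x, archive_x, archive_f) for x in offspring])
          chosen = offspring[np.argsort(scores)[: pop_size // 2]]
          new_fit = np.array([expensive_eval(x) for x in chosen])
          archive_x = np.vstack([archive_x, chosen])
          archive_f = np.concatenate([archive_f, new_fit])
          # Elitist replacement: keep the best designs seen so far.
          order = np.argsort(archive_f)[:pop_size]
          pop, fit = archive_x[order], archive_f[order]

      print(fit.min())   # approaches 0 as the optimum x = 0.3 is found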

  3. FANSe2: A Robust and Cost-Efficient Alignment Tool for Quantitative Next-Generation Sequencing Applications

    PubMed Central

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of the deep sequencing data is inevitably dependent on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To have both advantages, we developed an algorithm, FANSe2, with an iterative mapping strategy based on the statistics of real-world sequencing error distribution to substantially accelerate the mapping without compromising the accuracy. Its sensitivity and accuracy are higher than the BWT-based algorithms in the tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms have false positives and false negatives. FANSe2 showed remarkably better consistency to the microarray than most other algorithms in terms of gene expression quantifications. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/. PMID:24743329
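
    A seed-based mapper of the general kind contrasted here can be reduced to a k-mer index plus mismatch-tolerant verification, as in the minimal sketch below; this illustrates the seed-and-extend idea only and is not FANSe2's iterative, error-statistics-driven algorithm.

      from collections import defaultdict

      K = 11

      def build_index(ref):
          """k-mer index of the reference: k-mer -> list of positions."""
          index = defaultdict(list)
          for i in range(len(ref) - K + 1):
              index[ref[i:i + K]].append(i)
          return index

      def map_read(read, ref, index, max_mismatches=2):
          """Seed with overlapping k-mers, then verify candidate loci allowing mismatches."""
          hits = set()
          for s in range(len(read) - K + 1):
              for pos in index.get(read[s:s + K], []):
                  start = pos - s
                  if start < 0 or start + len(read) > len(ref):
                      continue
                  mm = sum(a != b for a, b in zip(read, ref[start:start + len(read)]))
                  if mm <= max_mismatches:
                      hits.add((start, mm))
          return sorted(hits)

      ref = "TTGACCGTAAGCTTACGGATCCTAGCAATGCGTCAGGTCA"
      read = ref[10:30]
      read = read[:5] + "A" + read[6:]              # introduce one sequencing error
      print(map_read(read, ref, build_index(ref)))  # -> [(10, 1)]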

  4. FANSe2: a robust and cost-efficient alignment tool for quantitative next-generation sequencing applications.

    PubMed

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-Jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of the deep sequencing data is inevitably dependent on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To have both advantages, we developed an algorithm, FANSe2, with an iterative mapping strategy based on the statistics of real-world sequencing error distribution to substantially accelerate the mapping without compromising the accuracy. Its sensitivity and accuracy are higher than the BWT-based algorithms in the tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms have false positives and false negatives. FANSe2 showed remarkably better consistency to the microarray than most other algorithms in terms of gene expression quantifications. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/. PMID:24743329

  5. A Graphic Symbol Tool for the Evaluation of Communication, Satisfaction and Priorities of Individuals with Intellectual Disability Who Use a Speech Generating Device

    ERIC Educational Resources Information Center

    Valiquette, Christine; Sutton, Ann; Ska, Bernadette

    2010-01-01

    This article reports on the views of individuals with learning disability (LD) on their use of their speech generating devices (SGDs), their satisfaction about their communication, and their priorities. The development of an interview tool made of graphic symbols and entitled Communication, Satisfaction and Priorities of SGD Users (CSPU) is…

  6. Assessing next-generation sequencing and 4 bioinformatics tools for detection of Enterovirus D68 and other respiratory viruses in clinical samples.

    PubMed

    Huang, Weihua; Wang, Guiqing; Lin, Henry; Zhuge, Jian; Nolan, Sheila M; Vail, Eric; Dimitrova, Nevenka; Fallon, John T

    2016-05-01

    We used 4 different bioinformatics algorithms to evaluate the application of a metagenomic shot-gun sequencing method in detection of Enterovirus D68 and other respiratory viruses in clinical specimens. Our data supported that next-generation sequencing, combined with improved bioinformatics tools, is practically feasible and useful for clinical diagnosis of viral infections. PMID:26971640

  7. Generations.

    PubMed

    Chambers, David W

    2005-01-01

    Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession. PMID:16623137

  8. Dormancy and germination: How does the crop seed decide?

    PubMed

    Shu, K; Meng, Y J; Shuai, H W; Liu, W G; Du, J B; Liu, J; Yang, W Y

    2015-11-01

    Whether seeds germinate or maintain dormancy is decided upon through very intricate physiological processes. Correct timing of these processes is most important for the plant's life cycle. If moist conditions are encountered, a low dormancy level causes pre-harvest sprouting in various crop species, such as wheat, corn and rice; this decreases crop yield and negatively impacts downstream industrial processing. In contrast, a deep level of seed dormancy prevents normal germination even under favourable conditions, resulting in a low emergence rate during agricultural production. Therefore, an optimal seed dormancy level is valuable for modern mechanised agricultural systems. Over the past several years, numerous studies have demonstrated that diverse endogenous and environmental factors regulate the balance between dormancy and germination, such as light, temperature, water status and bacteria in soil, and phytohormones such as ABA (abscisic acid) and GA (gibberellic acid). In this updated review, we highlight recent advances regarding the molecular mechanisms underlying regulation of seed dormancy and germination processes, including the external environmental and internal hormonal cues, primarily focusing on staple crop species. Furthermore, future challenges and research directions for developing a full understanding of crop seed dormancy and germination are also discussed. PMID:26095078

  9. Point of decision: when do pigeons decide to head home?

    NASA Astrophysics Data System (ADS)

    Schiffner, Ingo; Wiltschko, Roswitha

    2009-02-01

    Pigeons released away from their loft usually fly around at the release site for a while before they finally leave. Visual observations had suggested that the moment when the birds decide to head home is associated with a certain change in flying style. To see whether this change is also reflected by GPS-recorded tracks, a group of pigeons equipped with flight recorders was released at two sites about 10 km from their home loft. The initial part of their flight paths was analyzed in order to find objective criteria indicating the point of decision. We selected the highest increase in steadiness as the best estimate for the moment of decision. This criterion allows us to divide the pigeons’ paths into two distinct phases, an initial phase and the homing phase, with the moment of decision occurring, on average, 2 min after release. The moment of decision marks a change in behavior, with a significant increase in steadiness and flying speed and headings significantly closer to the home direction. The behavior of the individual birds at the two sites was not correlated, suggesting no pronounced individual traits for the length of the initial phase. The behavior during this phase seems to be controlled by flight preparation, exploration, and non-navigational motivations rather than by navigational necessities alone.
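
    One concrete reading of the steadiness criterion is the mean resultant length of headings in a sliding window, with the point of decision taken at the largest step increase. The sketch below applies that reading to a synthetic track; the window size and the toy track are assumptions, not the study's parameters.

      import numpy as np

      def steadiness(headings_deg, window=10):
          """Mean resultant length of headings in a sliding window
          (0 = erratic, 1 = perfectly steady). Headings in degrees."""
          h = np.radians(np.asarray(headings_deg))
          r = []
          for i in range(len(h) - window + 1):
              w = h[i:i + window]
              r.append(np.hypot(np.cos(w).mean(), np.sin(w).mean()))
          return np.array(r)

      def point_of_decision(headings_deg, window=10):
          """Index of the largest step increase in steadiness."""
          r = steadiness(headings_deg, window)
          return int(np.argmax(np.diff(r)) + 1)

      # Toy track: erratic circling for 60 fixes, then a steady homeward heading.
      rng = np.random.default_rng(1)
      track = np.concatenate([rng.uniform(0, 360, 60), rng.normal(220, 5, 60)])
      print(point_of_decision(track))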

  10. Using Tableau to Decide Expressive Description Logics with Role Negation

    NASA Astrophysics Data System (ADS)

    Schmidt, Renate A.; Tishkovsky, Dmitry

    This paper presents a tableau approach for deciding description logics outside the scope of OWL DL/1.1 and current state-of-the-art tableau-based description logic systems. In particular, we define a sound and complete tableau calculus for the description logic {ALBO} and show that it provides a basis for decision procedures for this logic and numerous other description logics with full role negation. {ALBO} is the extension of {ALC} with the Boolean role operators, inverse of roles, domain and range restriction operators and it includes full support for nominals (individuals). {ALBO} is a very expressive description logic which subsumes Boolean modal logic and the two-variable fragment of first-order logic and reasoning in it is NExpTime-complete. An important novelty is the use of a generic, unrestricted blocking rule as a replacement for standard loop checking mechanisms implemented in description logic systems. An implementation of our approach exists in the {textsc{MetTeL}} system.