Science.gov

Sample records for generation tool decider

  1. DECIDE: a Decision Support Tool to Facilitate Parents' Choices Regarding Genome-Wide Sequencing.

    PubMed

    Birch, Patricia; Adam, S; Bansback, N; Coe, R R; Hicklin, J; Lehman, A; Li, K C; Friedman, J M

    2016-12-01

    We describe the rationale, development, and usability testing for an integrated e-learning tool and decision aid for parents facing decisions about genome-wide sequencing (GWS) for their children with a suspected genetic condition. The online tool, DECIDE, is designed to provide decision support and to promote high-quality decisions about undergoing GWS with or without return of optional incidental finding results. DECIDE works by integrating educational material with decision aids. Users may tailor their learning by controlling both the amount of information and its format - text and diagrams and/or short videos. The decision aid guides users to weigh the importance of various relevant factors in their own lives and circumstances. After considering the pros and cons of GWS and the return of incidental findings, DECIDE summarizes the user's responses and apparent preferred choices. In a usability study of 16 parents who had already chosen GWS after conventional genetic counselling, all participants found DECIDE to be helpful. Many would have been satisfied to use it alone to guide their GWS decisions, but most would prefer to also have the option of consulting a health care professional. Further testing is necessary to establish the effectiveness of DECIDE as an adjunct to, or replacement for, conventional pre-test genetic counselling for clinical genome-wide sequencing.
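
    The values-clarification step described above can be illustrated with a small multi-criteria scoring sketch. This is a hypothetical illustration, not the DECIDE implementation; the factor names, weights, and decision rule are all invented:

        # Hypothetical sketch of a decision-aid summary step: each factor gets an
        # importance weight (0-5) and a leaning (-1 against testing .. +1 for).
        # Factor names, weights, and the threshold rule are illustrative only.

        def summarize_preference(ratings):
            """Return the apparent preferred choice from weighted factor ratings."""
            score = sum(weight * leaning for weight, leaning in ratings.values())
            total = sum(weight for weight, _ in ratings.values())
            leaning = score / total if total else 0.0
            return "leaning toward testing" if leaning > 0 else "leaning against testing"

        parent_ratings = {
            "wish to identify a cause": (5, +1),
            "worry about incidental findings": (2, -1),
            "possible insurance implications": (1, -1),
        }
        print(summarize_preference(parent_ratings))  # leaning toward testing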

  2. Disaster Response Tools for Decision Support and Data Discovery - E-DECIDER and GeoGateway

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Granat, R. A.; Lyzenga, G. A.; Pierce, M. E.; Wang, J.; Grant Ludwig, L.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2015-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers, improving their ability to anticipate requirements and provide appropriate resources for response. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a decision support system that produces remote sensing and geophysical modeling products relevant to the emergency preparedness and response communities and serves as a gateway for delivering actionable information to those communities. GeoGateway is a data product search and analysis gateway for scientific discovery, field use, and disaster response focused on NASA UAVSAR and GPS data, which it integrates with fault data, seismicity, and models. Key information on the nature, magnitude, and scope of damage, the Essential Elements of Information (EEI) necessary to achieve situational awareness, is often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises such as the May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation. We also provided a number of products, services, and consultation to the NASA agency-wide response to the April 2015 Gorkha, Nepal earthquake. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse and for the Nepal earthquake. Products delivered included map layers as part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration, enabling users to create merged datasets from multiple providers. For the Nepal response effort, products included models

  3. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a comprehensive set of advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for Dallas-Fort Worth, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and climb to cruise altitude along the most efficient routes.

  5. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second-generation tools, help districts save time and money and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, skills that fall under 21st-century skills. The second-generation tools are growing in popularity…

  7. Next generation VLSI tools

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1990-01-01

    This paper focuses on what features would be useful in VLSI Computer Aided Design Tools and Systems to be used in the next five to ten years. Examples of current design tasks will be used to emphasize the areas where new or expanded VLSI CAD tools are needed. To provide a basis for projecting the future of VLSI tools, a brief history of the evolution of VLSI design software and hardware platforms is presented. The role of design methodology is considered with respect to the anticipated scale of future VLSI design projects. Future requirements of design verification and manufacturing testing are projected based on the challenge of surviving in a competitive market. Examples of VLSI tools reflect the author's involvement on VLSI design teams developing integrated circuits for disk memory and other computer peripherals for the last eight years.

  8. Sense, decide, act, communicate (SDAC): next generation of smart sensor systems

    NASA Astrophysics Data System (ADS)

    Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian

    2004-09-01

    The recent war on terrorism and increased urban warfare have been a major catalyst for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has generally been governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality of sensor systems. This functionality includes a sensor's ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communicate - internal to the system and external to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the use of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collective) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper will also discuss and evaluate the demonstration system developed to test the concepts related to SDAC systems.

  9. Deciding about hormone therapy

    MedlinePlus

    HRT - deciding; Estrogen replacement therapy - deciding; ERT - deciding; Hormone replacement therapy - deciding; Menopause - deciding; HT - deciding; Menopausal hormone therapy - deciding; MHT - deciding

  10. Quantitative versus Qualitative Evaluation: A Tool to Decide Which to Use

    ERIC Educational Resources Information Center

    Dobrovolny, Jackie L.; Fuentes, Stephanie Christine G.

    2008-01-01

    Evaluation is often avoided in human performance technology (HPT), but it is an essential and frequently catalytic activity that adds significant value to projects. Knowing how to approach an evaluation and whether to use qualitative, quantitative, or both methods makes evaluation much easier. In this article, we provide tools to help determine…

  11. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Stough, T. M.; Burl, M. C.; Pierce, M.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.; Bawden, G. W.

    2012-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. Geodetic imaging data, including from interferometric synthetic aperture radar (InSAR) and GPS, have a rich scientific heritage for use in earthquake research. Survey-grade GPS was developed in the 1980s, and the first InSAR image of an earthquake was produced for the 1992 Landers event. As these types of data have become increasingly available, they have also shown great utility for providing key information for disaster response. Work has been done to translate these data into useful and actionable information for decision makers in the event of an earthquake disaster. In addition to observed data, modeling tools provide essential preliminary estimates while data are still being collected and/or processed, which can be refined as data products become available. Now, with more data and better models, we are able to apply these capabilities for responders who need easy-to-use tools and routinely produced data products. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers and short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and where emergency services may need to be focused). E-DECIDER has taken advantage of the legacy of Earth science data, including MODIS, Landsat, SCIGN, PBO, UAVSAR, and modeling tools such as the ones developed by QuakeSim, in order to deliver successful decision support products for earthquake disaster response. The project has

  12. Health. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Health, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  13. Thrust generator for boring tools

    SciTech Connect

    Dismukes, N.B.

    1984-03-13

    The present invention provides an electrically powered system for advancing a rotary boring tool in situations where the inclination of the bore hole is such that the force of gravity does not provide sufficient forward thrust. One or more marine screw propellers are rotated by the motor, which is itself restrained from rotation by being fixedly connected to a flexible, twist-resistant conduit that conducts the drilling fluid and electric power from the surface. The system may also provide for different rotative speeds for propeller and bit, and for counter-rotating propellers to minimize torque forces on the conduit.

  14. SUPPORT Tools for evidence-informed health Policymaking (STP) 8: Deciding how much confidence to place in a systematic review.

    PubMed

    Lewin, Simon; Oxman, Andrew D; Lavis, John N; Fretheim, Atle

    2009-12-16

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. The reliability of systematic reviews of the effects of health interventions is variable. Consequently, policymakers and others need to assess how much confidence can be placed in such evidence. The use of systematic and transparent processes to determine such decisions can help to prevent the introduction of errors and bias in these judgements. In this article, we suggest five questions that can be considered when deciding how much confidence to place in the findings of a systematic review of the effects of an intervention. These are: 1. Did the review explicitly address an appropriate policy or management question? 2. Were appropriate criteria used when considering studies for the review? 3. Was the search for relevant studies detailed and reasonably comprehensive? 4. Were assessments of the studies' relevance to the review topic and of their risk of bias reproducible? 5. Were the results similar from study to study?

  15. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves, and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C, the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources, and surfaces. At any given time, there is always an active object, which is drawn in magenta, or in its highlighted color as defined by the resource file, which is discussed later.

  16. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software subsystems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian inference, applied to adverse (hostile) networks, including such important applications as terrorism-related and organized-crime networks.
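
    The abstract does not give the model details, but the core mechanism of a Bayesian DDG can be sketched as repeated belief updating. Below is a minimal sketch assuming a binary hypothesis and invented likelihoods; it is not the paper's system:

        # Minimal Bayesian-update sketch: posterior belief that an actor belongs
        # to an adverse network after repeated observations. The prior and the
        # likelihoods of the "suspicious communication" event are invented.

        def bayes_update(prior, p_obs_given_h, p_obs_given_not_h):
            """Posterior P(H | observation) for one binary observation."""
            evidence = p_obs_given_h * prior + p_obs_given_not_h * (1 - prior)
            return p_obs_given_h * prior / evidence

        p = 0.10  # prior: actor belongs to the hostile network
        for _ in range(3):  # three independent suspicious communications
            p = bayes_update(p, p_obs_given_h=0.6, p_obs_given_not_h=0.1)
        print(f"posterior after 3 observations: {p:.2f}")  # 0.96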

  17. Automatic tool path generation for finish machining

    SciTech Connect

    Kwok, Kwan S.; Loucks, C.S.; Driessen, B.J.

    1997-03-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch diameter CBN grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm.
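
    The details of the fuzzified filter are not given in the abstract. As a stand-in that shows the general idea of suppressing spurious readings from burrs, holes, and overspray, the sketch below replaces scan points that deviate strongly from a local median; the window size and threshold are invented:

        import statistics

        # Stand-in illustration (not the paper's fuzzified filter): reject scan
        # depths that deviate strongly from the local median, a common way to
        # suppress spikes from burrs, holes, or overspray in laser scan data.

        def reject_outliers(depths, window=5, threshold=0.5):
            """Return depths (mm) with spikes replaced by the local median."""
            half = window // 2
            cleaned = []
            for i, d in enumerate(depths):
                neighborhood = depths[max(0, i - half):i + half + 1]
                med = statistics.median(neighborhood)
                cleaned.append(med if abs(d - med) > threshold else d)
            return cleaned

        scan = [10.0, 10.1, 10.1, 13.7, 10.2, 10.2, 8.4, 10.3]  # two spikes
        print(reject_outliers(scan))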

  18. Maraviroc Clinical Test (MCT) as an alternative tool to decide CCR5-antagonists prescription in naïve HIV-infected patients.

    PubMed

    Genebat, Miguel; de Pablo-Bernal, Rebeca S; Pulido, Ildefonso; Jiménez-Mejías, Manuel E; Martínez, Onofre; Pacheco, Yolanda M; Raffi-El-Idrissi Benhia, Mohammed; Abad, María Antonia; Ruiz-Mateos, Ezequiel; Leal, Manuel

    2015-09-01

    Our aim was to analyze the virological response to combined antiretroviral therapy (cART) started after the Maraviroc Clinical Test (MCT) in naïve HIV-infected patients. Forty-one patients were exposed to MCT, based on an 8-day course of MVC monotherapy. If undetectability or a viral load reduction >1 log10 HIV-RNA copies/ml was achieved, an MVC-containing cART was prescribed. Forty patients showed a positive MCT; undetectability after 48 weeks on cART was achieved in 34/41 (82.9%) patients. The result of MCT was compared with a genotypic tropism method and with Trofile®, showing 10.7% and 18.75% discordance rates, respectively. MCT is a reliable tool for deciding on CCR5-antagonist prescription, even in the naïve scenario, where most patients show a virological response to MVC independently of the tropism result reported by genotypic or phenotypic methods.

  19. Protocol for a randomised controlled trial of a web-based healthy relationship tool and safety decision aid for women experiencing domestic violence (I-DECIDE).

    PubMed

    Hegarty, Kelsey; Tarzia, Laura; Murray, Elizabeth; Valpied, Jodie; Humphreys, Cathy; Taft, Angela; Gold, Lisa; Glass, Nancy

    2015-08-01

    Domestic violence is a serious problem affecting the health and wellbeing of women globally. Interventions in health care settings have primarily focused on screening and referral; however, women often may not disclose abuse to health practitioners. The internet offers a confidential space in which women can assess the health of their relationships and make a plan for safety and wellbeing for themselves and their children. This randomised controlled trial is testing the effectiveness of a web-based healthy relationship tool and safety decision aid (I-DECIDE). Based broadly on the IRIS trial in the United States, it has been adapted for the Australian context, where it is conducted entirely online and uses the Psychosocial Readiness Model as the basis for the intervention. In this two-arm, pragmatic randomised controlled trial, women who have experienced abuse or fear of a partner in the previous 6 months will be computer randomised to receive either the I-DECIDE website or a comparator website (basic relationship and safety advice). The intervention includes self-directed reflection exercises on their relationship, danger level, and priority setting, and results in an individualised, tailored action plan. Primary self-reported outcomes are: self-efficacy (General Self-Efficacy Scale) immediately after completion and at 6 and 12 months post-baseline; and depressive symptoms (Center for Epidemiologic Studies Depression Scale, Revised; 6 and 12 months post-baseline). Secondary outcomes include the mean number of helpful actions for safety and wellbeing, the mean level of fear of partner, and cost-effectiveness. This fully-automated trial will evaluate a web-based self-information, self-reflection and self-management tool for domestic violence. We hypothesise that the improvement in self-efficacy and mental health will be mediated by increased perceived support and awareness encouraging positive change. If shown to be effective, I-DECIDE could be easily incorporated into the community

  20. Next generation tools for genomic data generation, distribution, and visualization.

    PubMed

    Nix, David A; Di Sera, Tonya L; Dalley, Brian K; Milash, Brett A; Cundick, Robert M; Quinn, Kevin S; Courdy, Samir J

    2010-09-09

    With the rapidly falling cost and increasing availability of high-throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Here we present three open-source, platform-independent software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community-vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting-edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich-client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq.

  1. Groundwater Monitoring Report Generation Tools - 12005

    SciTech Connect

    Lopez, N.

    2011-11-21

    Compliance with national and state environmental regulations (e.g., the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), aka Superfund) requires the Savannah River Site (SRS) to extensively collect and report groundwater monitoring data, with potential fines for missed reporting deadlines. Several utilities have been developed at SRS to facilitate production of the regulatory reports, which include maps, data tables, charts, and statistics. Components of each report are generated in accordance with complex sets of regulatory requirements specific to each site monitored. SRS developed a relational database to incorporate the detailed reporting rules with the groundwater data, and created a set of automation tools to interface with the information and generate the report components. These process improvements enhanced quality and consistency by centralizing the information, and have reduced manpower and production time through automated efficiencies.
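
    The pattern described, reporting rules stored alongside monitoring data in a relational database and queried by automation tools, can be sketched generically. The schema, site names, analytes, limits, and units below are invented for illustration and are not the SRS system:

        import sqlite3

        # Generic sketch of rule-driven report generation: join monitoring
        # results to site-specific reporting rules and flag exceedances.
        # Schema, sites, analytes, limits, and units are invented.

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE results (site TEXT, analyte TEXT, value REAL);
            CREATE TABLE rules (site TEXT, analyte TEXT, limit_val REAL);
            INSERT INTO results VALUES ('H-Area', 'tritium', 22.0), ('H-Area', 'TCE', 4.1);
            INSERT INTO rules   VALUES ('H-Area', 'tritium', 20.0), ('H-Area', 'TCE', 5.0);
        """)
        rows = conn.execute("""
            SELECT r.site, r.analyte, r.value, ru.limit_val,
                   CASE WHEN r.value > ru.limit_val THEN 'EXCEEDS' ELSE 'ok' END
            FROM results r JOIN rules ru ON r.site = ru.site AND r.analyte = ru.analyte
        """)
        for site, analyte, value, limit, flag in rows:
            print(f"{site:8s} {analyte:8s} {value:6.1f} / {limit:5.1f}  {flag}")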

  3. Particle generation mechanisms in vacuum processing tools

    NASA Astrophysics Data System (ADS)

    Fu, Thomas T. H.; Bennett, Marylyn H.; Bowling, R. A.

    1992-06-01

    It is estimated that by the year 1995, as much as ninety percent of the contamination in IC manufacturing will be caused by equipment and processes. Contamination can be in the form of particles, defects, scratches, stains, and so on. All are major concerns for yielding ULSI devices. In order to eliminate process/equipment-induced particles, particle formation/generation must be understood before appropriate action can be taken to meet the contamination-free requirements of the future. A variety of vacuum processing tools were studied, including CVD, PECVD, and plasma etch systems with heat lamps, RF, and remote microwave energy sources. A particle collection and characterization methodology was adopted to analyze the particles generated from the vacuum processing tools. By using SEM and EDS to analyze particles collected from equipment chamber walls, both the particle morphology and composition were discerned. The elemental analyses indicate that the composition of particles varied a great deal depending on the chemical nature of the process, chamber material/process compatibility, and energy source.

  4. Next-Generation Tools For Next-Generation Surveys

    NASA Astrophysics Data System (ADS)

    Murray, S. G.

    2017-04-01

    The next generation of large-scale galaxy surveys, across the electromagnetic spectrum, loom on the horizon as explosively game-changing datasets, in terms of our understanding of cosmology and structure formation. We are on the brink of a torrent of data that is set to both confirm and constrain current theories to an unprecedented level, and potentially overturn many of our conceptions. One of the great challenges of this forthcoming deluge is to extract maximal scientific content from the vast array of raw data. This challenge requires not only well-understood and robust physical models, but a commensurate network of software implementations with which to efficiently apply them. The halo model, a semi-analytic treatment of cosmological spatial statistics down to nonlinear scales, provides an excellent mathematical framework for exploring the nature of dark matter. This thesis presents a next-generation toolkit based on the halo model formalism, intended to fulfil the requirements of next-generation surveys. Our toolkit comprises three tools: (i) hmf, a comprehensive and flexible calculator for halo mass functions (HMFs) within extended Press-Schechter theory, (ii) the MRP distribution for extremely efficient analytic characterisation of HMFs, and (iii) halomod, an extension of hmf which provides support for the full range of halo model components. In addition to the development and technical presentation of these tools, we apply each to the task of physical modelling. With hmf, we determine the precision of our knowledge of the HMF, due to uncertainty in our knowledge of the cosmological parameters, over the past decade of cosmic microwave background (CMB) experiments. We place rule-of-thumb uncertainties on the predicted HMF for the Planck cosmology, and find that current limits on the precision are driven by modeling uncertainties rather than those from cosmological parameters. With the MRP, we create and test a method for robustly fitting the HMF to observed
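
    The MRP parameterisation mentioned above is a generalized-gamma-type form for the halo mass function. The sketch below evaluates a function of that general shape; the normalisation and parameter values are illustrative assumptions, not fitted numbers from the thesis:

        import numpy as np

        # MRP-style (generalized gamma) halo mass function shape:
        #   dn/dm = A * (m/Hs)**alpha * exp(-(m/Hs)**beta)
        # Parameter values here are illustrative, not fitted values.

        def mrp(m, hs=1e14, alpha=-1.9, beta=0.75, norm=1e-19):
            x = m / hs
            return norm * x**alpha * np.exp(-(x**beta))

        masses = np.logspace(10, 16, 7)  # halo masses in solar masses
        for m, n in zip(masses, mrp(masses)):
            print(f"M = {m:.1e} Msun  ->  dn/dm = {n:.3e}")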

  5. Next Generation of Visualization Tools for Astrophysics

    NASA Astrophysics Data System (ADS)

    Gheller, C.; Becciani, U.; Teuben, P. J.

    2007-10-01

    Visualization exists in many niche packages, each with their own strengths and weaknesses. In this BoF, a series of presentations was meant to stimulate a discussion between developers and users about the future of visualization tools. A wiki page (http://wiki.eurovotech.org/twiki/bin/view/VOTech/BoFADASSTucson2006) has been set up to log and continue this discussion. One recent technique, dubbed "plastifying," has enabled different tools to inter-operate data between them, resulting in a very flexible environment. Also deemed important is the ability to write plug-ins for analysis in visualization tools. Publication-quality graphs are sometimes missing from the tools we use.

  6. Next-Generation Design and Simulation Tools

    NASA Technical Reports Server (NTRS)

    Weber, Tod A.

    1997-01-01

    Thirty years ago, the CAD industry was created as electronic drafting tools were developed to move people from the traditional two-dimensional drafting boards. While these tools provided an improvement in accuracy (true perpendicular lines, etc.), they did not offer a significant improvement in productivity or impact on development times. They electronically captured a manual process.

  7. Projectile-generating explosive access tool

    SciTech Connect

    Jakaboski, Juan-Carlos; Hughs, Chance G; Todd, Steven N

    2013-06-11

    A method for generating a projectile using an explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  8. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts who can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  9. Deciding about an IUD

    MedlinePlus

    ... a small, plastic, T-shaped device used for birth control. It is inserted into the uterus, where it stays to prevent pregnancy. Alternative names: Contraception - IUD; Birth control - IUD; Intrauterine - deciding. Types of ...

  10. Dewarless Logging Tool - 1st Generation

    SciTech Connect

    Henfling, Joseph A.; Normann, Randy A.

    2000-08-01

    This report focuses on Sandia National Laboratories' effort to create high-temperature logging tools for geothermal applications without the need for heat shielding. One of the mechanisms for failure in conventional downhole tools is temperature: they can only survive a limited number of hours in high-temperature environments. For the first time since the evolution of integrated circuits, components are now commercially available that are qualified to 225 °C, with many continuing to work up to 300 °C. These components are primarily based on Silicon-On-Insulator (SOI) technology. Sandia has developed and tested a simple data logger based on this technology that operates up to 300 °C, with a few limiting components operating only to 250 °C, without thermal protection. An actual well log to 240 °C without shielding is discussed. The first prototype high-temperature tool measures pressure and temperature using a wire-line for power and communication. The tool is based around the HT83C51 microcontroller. A brief discussion of the background and status of the High Temperature Instrumentation program at Sandia, objectives, data logger development, and future project plans is given.

  11. Projectile-generating explosive access tool

    DOEpatents

    Jakaboski, Juan-Carlos; Todd, Steven N.

    2011-10-18

    An explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  12. Next-Generation Ion Thruster Design Tool

    NASA Technical Reports Server (NTRS)

    Stolz, Peter

    2015-01-01

    Computational tools that accurately predict the performance of electric propulsion devices are highly desirable and beneficial to NASA and the broader electric propulsion community. The current state of the art in electric propulsion modeling relies heavily on empirical data and numerous computational "knobs." In Phase I of this project, Tech-X Corporation developed the most detailed ion engine discharge chamber model that currently exists. This kinetic model simulates all particles in the discharge chamber along with a physically correct simulation of the electric fields. In addition, kinetic erosion models are included for modeling the ion-impingement effects on thruster component erosion. In Phase II, Tech-X developed a user-friendly computer program for NASA and other governmental and industry customers. Tech-X has implemented a number of advanced numerical routines to bring the computational time down to a commercially acceptable level. NASA now has a highly sophisticated, user-friendly ion engine discharge chamber modeling tool.

  13. BGen: A UML Behavior Network Generator Tool

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Reder, Leonard J.; Balian, Harry

    2010-01-01

    BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling automatic vehicles. A common format for describing a behavior network, such as that used in the JPL-developed behavior-based control system CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40], includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program to translate Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of a behavior network has been developed in order to simplify the development of C code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a behavior network functions as a decision tree.
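
    The behavior-network-as-decision-tree idea can be sketched compactly. The example below is written in Python for brevity (BGen emits C), and the behaviors, sensor fields, and priorities are invented for illustration:

        # Toy behavior network evaluated as a decision tree, analogous in
        # spirit to the generated C code; behaviors and sensors are invented.

        def behavior_network(sensors):
            """Map sensor inputs to actuator commands via prioritized behaviors."""
            if sensors["obstacle_m"] < 2.0:          # highest priority: avoid
                return {"steer": "hard_left", "throttle": 0.2}
            if abs(sensors["cross_track_m"]) > 1.0:  # regain the planned path
                side = "left" if sensors["cross_track_m"] > 0 else "right"
                return {"steer": side, "throttle": 0.6}
            return {"steer": "straight", "throttle": 0.8}  # default: follow

        print(behavior_network({"obstacle_m": 9.0, "cross_track_m": 1.4}))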

  14. The Exercise: An Exercise Generator Tool for the SOURCe Project

    ERIC Educational Resources Information Center

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  15. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general-purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
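
    The paper's example grammars are not reproduced in this abstract. As a flavor of the menu-system case, here is a hedged sketch of a tiny command grammar with a hand-written parser (a real parser generator would emit the parse tables from the grammar instead):

        import re

        # Tiny grammar in the spirit of the menu-system example:
        #   command -> VERB ITEM | 'help'
        #   VERB    -> 'open' | 'close'
        #   ITEM    -> identifier
        # The grammar and commands are invented for illustration.
        TOKEN = re.compile(r"[a-zA-Z_]\w*")

        def parse_command(text):
            words = TOKEN.findall(text)
            if words == ["help"]:
                return ("help",)
            if len(words) == 2 and words[0] in ("open", "close"):
                return (words[0], words[1])
            raise SyntaxError(f"unrecognized command: {text!r}")

        print(parse_command("open settings"))  # ('open', 'settings')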

  16. MEAT: An Authoring Tool for Generating Adaptable Learning Resources

    ERIC Educational Resources Information Center

    Kuo, Yen-Hung; Huang, Yueh-Min

    2009-01-01

    Mobile learning (m-learning) is a new trend in the e-learning field. The learning services in m-learning environments are supported by fundamental functions, especially the content and assessment services, which need an authoring tool to rapidly generate adaptable learning resources. To fulfill this pressing demand, this study proposes an…

  18. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be divided into the components "Population Generator" and "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool gnuplot. The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as
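
    The tool itself is Fortran, but the "fictitious (random) population" idea and the gnuplot hand-off can be sketched briefly. The element ranges below are rough illustrative NEO bounds, not the Bottke or Granvik model distributions:

        import random

        # Sketch of a fictitious (random) NEO population written as a
        # whitespace-separated file that gnuplot can plot directly, e.g.:
        #   plot 'neo_population.dat' using 1:2
        # Element ranges are rough illustrative bounds only.

        random.seed(1)
        with open("neo_population.dat", "w") as f:
            f.write("# a_au  e  i_deg  h_mag\n")
            for _ in range(1000):
                a = random.uniform(0.6, 3.5)    # semi-major axis (au)
                e = random.uniform(0.0, 0.9)    # eccentricity
                i = random.uniform(0.0, 40.0)   # inclination (deg)
                h = random.uniform(15.0, 28.0)  # absolute magnitude
                f.write(f"{a:.4f} {e:.4f} {i:.2f} {h:.2f}\n")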

  19. Designing the next generation of user support tools: methodology

    NASA Astrophysics Data System (ADS)

    Koratkar, Anuradha; Douglas, Robert E.; Gerb, Andrew; Jones, Jeremy E.; Peterson, Karla A.; Van Der Marel, Roeland P.

    2000-07-01

    In this paper we present a strategy for developing the next generation of proposal preparation tools so that we can continue to optimize scientific returns from the Hubble Space Telescope in an era of constrained budgets. The new proposal preparation tools must be built with two goals: (1) to facilitate scientific investigation for observers, and (2) to decrease the effort spent on routine matters by observatory staff. We have based our conclusions on lessons learned from the Next Generation Space Telescope's Scientist's Expert Assistant experiment. We conclude that: (1) Compared to the Hubble Space Telescope's existing Phase II RPS2 software, a modern set of proposal tools and an environment that integrates them will be appreciated by the user community. From the user's perspective, the proposed software must be more intuitive, visual, and responsive. From the observatory's perspective, the tools must be interoperable and extensible to other observatories. (2) To ensure state-of-the-art proposal preparation tools for the user community, there needs to be a management structure that supports innovation. Further, the development activities need to be divided into innovating and fielding efforts to prevent operational pressures from inhibiting innovation. This will allow the use of up-to-date technology so that the system can remain fluid and responsive to changes.

  20. Tools for Simulation and Benchmark Generation at Exascale

    SciTech Connect

    Lagadapati, Mahesh; Mueller, Frank; Engelmann, Christian

    2013-01-01

    The path to exascale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer an insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events and memory profiles, but can be extended to other areas, such as I/O, control flow, and data flow. It further focuses on extreme-scale simulation of millions of Message Passing Interface (MPI) ranks using a lightweight parallel discrete event simulation (PDES) toolkit for performance evaluation. Instead of simply replaying a trace within a simulation, the approach is to generate a benchmark from it and to run this benchmark within a simulation using models to reflect the performance characteristics of future-generation HPC systems. This provides a number of benefits, such as eliminating the data intensive trace replay and enabling simulations at different scales. The presented work utilizes the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to run the benchmark within a simulation.

  1. Deciding for themselves.

    PubMed

    Mccormack, J; Nelson, C

    1985-11-01

    World Education, in collaboration with PfP/International and with funding from US AID, has begun a comprehensive program in Kenya that offers non-governmental organizations non-formal training, technical assistance in organization and business management, and financial assistance in the form of loans for revolving credit funds. The approach emphasizes Kenyans deciding for themselves about the directions projects should take. This article discusses the Tototo Home Industries' rural economic development program. After receiving a loan from Tototo, the women of Bofu village planned to stock their small village shop with matches, kerosene, soap, salt, and cooking oil. The remainder of the loan was saved to purchase future stock. For this project, bookkeeping and management skills were necessary. To meet this need, the Tototo small business advisor designed a simple cash book system to be used by all the groups. Sessions in accounting were included in the annual training-of-trainers workshop. Currently, accounts advisors visit the groups monthly to provide follow-up training and assistance to ensure the women understand how to record project transactions accurately. The lesson to be drawn from these projects is simple. It is not unrealistic to set high expectations for project participants, but it is important to remain aware of the difficulty of the new concepts presented to these groups. In order for them to adequately master both concept and practice, the participants must be given sufficient time and support. In fact, consistent follow-up and close contact with the villages is the key to Tototo's success.

  2. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
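
    The end product of such automated construction, a Markov reliability model, can be sketched for a trivial design description: two redundant units with a common failure rate and no repair. The system and rate below are invented; real tools would generate far larger state spaces:

        import numpy as np
        from scipy.linalg import expm

        # Markov reliability model for an invented design description:
        # two redundant units, per-hour failure rate lam each, no repair.
        # States: 0 = both up, 1 = one up, 2 = system failed (absorbing).

        lam = 1e-4  # illustrative failure rate (per hour)
        Q = np.array([
            [-2 * lam, 2 * lam, 0.0],
            [0.0,      -lam,    lam],
            [0.0,      0.0,     0.0],
        ])

        t = 1000.0  # mission time (hours)
        p0 = np.array([1.0, 0.0, 0.0])  # start with both units up
        pt = p0 @ expm(Q * t)           # state probabilities at time t
        print(f"P(system up at t={t:.0f} h) = {1 - pt[2]:.6f}")  # ~0.990944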

  3. Measuring the Effectiveness of "The Major Decision": A Career Counseling Group for Undecided and Re-Deciding First Year, First-Generation College Students

    ERIC Educational Resources Information Center

    Wheeler, Melissa

    2014-01-01

    Researchers have reported that the graduation and retention rates of students whose parents do not hold college degrees (first-generation college students or FGCS) are lower than those of their peers whose parents do hold college degrees. FGCS are 1.3 times more likely to leave college after their first year compared to their non-FGCS peers…

  4. Nearly arc-length tool path generation and tool radius compensation algorithm research in FTS turning

    NASA Astrophysics Data System (ADS)

    Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao

    2014-08-01

    In the generation of non-rotationally symmetric microstructure surfaces by turning with a Fast Tool Servo (FTS), non-uniform distribution of the interpolation data points leads to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points at nearly equal arc lengths instead of the traditional interpolation rule of equal angle, and adds tool radius compensation. All the interpolation points are equidistant in the radial direction because of the constant feed speed of the X slider; the high-frequency tool radius compensation components lie in both the X direction and the Z direction, which makes it difficult for the X slider to follow the input orders due to its large mass. Newton's iterative method is used to calculate the coordinates of the neighboring contour tangent point, with the interpolation point's X position as the initial value; in this way, the new Z coordinate value is obtained, and the high-frequency motion component in the X direction is decomposed into the Z direction. Taking as a test case a typical microstructure with a 4 μm PV value, mixed from two sine waves of 70 μm wavelength, the maximum profile error at an angle of fifteen degrees is less than 0.01 μm when turning with a diamond tool with a large radius of 80 μm. The sinusoidal grid was machined successfully on an ultra-precision lathe; the wavelength is 70.2278 μm and the Ra value is 22.81 nm, evaluated from data points generated by filtering out the first five harmonics.
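
    The Newton step described above can be sketched for a round-nosed tool on a sinusoidal profile: for each target tool-center X position, find the contact point where the tool is tangent to the surface, then offset the center along the surface normal. The amplitude, wavelength, and radius below are illustrative, not the paper's exact setup:

        import math

        # Tool-radius compensation by Newton iteration on z(x) = A*sin(2*pi*x/lam).
        # Solves g(x) = x - r*z'(x)/sqrt(1+z'(x)^2) - Xc = 0 for the contact
        # point x, then places the tool center at distance r along the normal.
        # Values (micrometres) are illustrative only.

        A, lam, r = 1.0, 70.0, 80.0  # amplitude, wavelength, tool radius

        z = lambda x: A * math.sin(2 * math.pi * x / lam)
        dz = lambda x: A * (2 * math.pi / lam) * math.cos(2 * math.pi * x / lam)
        d2z = lambda x: -A * (2 * math.pi / lam) ** 2 * math.sin(2 * math.pi * x / lam)

        def tool_center(xc_target):
            x = xc_target  # slider position is a good initial guess
            for _ in range(20):
                s = math.sqrt(1 + dz(x) ** 2)
                g = x - r * dz(x) / s - xc_target
                dg = 1 - r * d2z(x) / s ** 3
                x -= g / dg
                if abs(g) < 1e-9:
                    break
            return x, z(x) + r / math.sqrt(1 + dz(x) ** 2)  # contact x, center Z

        print(tool_center(35.0))  # near the steepest slope of the sine wave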

  5. Automatic Tool Path Generation for Robot Integrated Surface Sculpturing System

    NASA Astrophysics Data System (ADS)

    Zhu, Jiang; Suzuki, Ryo; Tanaka, Tomohisa; Saito, Yoshio

    In this paper, a surface sculpturing system based on an 8-axis robot is proposed, and the CAD/CAM software and tool path generation algorithm for this sculpturing system are presented. The 8-axis robot is composed of a 6-axis manipulator and a 2-axis worktable; it carves blocks of polystyrene foam with heated cutting tools. A multi-DOF (Degree of Freedom) robot benefits from faster operation than traditional RP (Rapid Prototyping) methods and more flexibility than CNC machining. With its flexibility derived from an 8-axis configuration, as well as efficient custom-developed software for rough cutting and finish cutting, this surface sculpturing system can carve sculptured surfaces accurately and efficiently.

  6. Benchmarking the next generation of homology inference tools.

    PubMed

    Saripella, Ganapathi Varma; Sonnhammer, Erik L L; Forslund, Kristoffer

    2016-09-01

    Over the last decades, vast numbers of sequences have been deposited in public databases. Bioinformatics tools allow homology and consequently functional inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but they have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate the 'next generation' of profile-based approaches, including CS-BLAST, HHSEARCH and PHMMER, in comparison with the non-profile-based search tools NCBI-BLAST, USEARCH, UBLAST and FASTA. We generated challenging benchmark datasets based on protein domain architectures within either the PFAM + Clan, SCOP/Superfamily or CATH/Gene3D domain definition schemes. From each dataset, homologous and non-homologous protein pairs were aligned using each tool, and standard performance metrics were calculated. We further measured congruence of domain architecture assignments in the three domain databases. CS-BLAST and PHMMER had overall highest accuracy. FASTA, UBLAST and USEARCH showed large trade-offs of accuracy for speed optimization. Profile methods are superior at inferring remote homologs, but the difference in accuracy between methods is relatively small. PHMMER and CS-BLAST stand out with the highest accuracy, yet still at a reasonable computational cost. Additionally, we show that less than 0.1% of Swiss-Prot protein pairs considered homologous by one database are considered non-homologous by another, implying that these classifications represent equivalent underlying biological phenomena, differing mostly in coverage and granularity. Benchmark datasets and all scripts are available at http://sonnhammer.org/download/Homology_benchmark. Contact: forslund@embl.de. Supplementary data are available at Bioinformatics online.
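
    One of the "standard performance metrics" in such a benchmark can be sketched directly: given a tool's alignment scores for homologous and non-homologous pairs, the probability that a homolog outscores a non-homolog equals the ROC AUC. The scores below are invented; real input would come from the benchmark alignments:

        # ROC AUC as the probability that a homologous pair outscores a
        # non-homologous pair (ties count half). Scores are invented.

        homolog_scores    = [52.1, 47.3, 33.0, 61.8, 29.4]
        nonhomolog_scores = [12.0, 30.5, 8.7, 25.1, 14.9]

        def roc_auc(pos, neg):
            wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
            return wins / (len(pos) * len(neg))

        print(f"AUC = {roc_auc(homolog_scores, nonhomolog_scores):.3f}")  # 0.960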

  7. How do we decide what to do? Resting-state connectivity patterns and components of self-generated thought linked to the development of more concrete personal goals.

    PubMed

    Medea, Barbara; Karapanagiotidis, Theodoros; Konishi, Mahiko; Ottaviani, Cristina; Margulies, Daniel; Bernasconi, Andrea; Bernasconi, Neda; Bernhardt, Boris C; Jefferies, Elizabeth; Smallwood, Jonathan

    2016-07-21

    Human cognition is not limited to the available environmental input but can consider realities that are different to the here and now. We describe the cognitive states and neural processes linked to the refinement of descriptions of personal goals. When personal goals became concrete, participants reported greater thoughts about the self and the future during mind-wandering. This pattern was not observed for descriptions of TV programmes. Connectivity analysis of participants who underwent a resting-state functional magnetic resonance imaging scan revealed neural traits associated with this pattern. Strong hippocampal connectivity with ventromedial pre-frontal cortex was common to better-specified descriptions of goals and TV programmes, while connectivity between hippocampus and the pre-supplementary motor area was associated with individuals whose goals were initially abstract but became more concrete over the course of the experiment. We conclude that self-generated cognition that arises during the mind-wandering state can allow goals to be refined, and this depends on neural systems anchored in the hippocampus.

  8. DNA Assembly Tools and Strategies for the Generation of Plasmids.

    PubMed

    Baek, Chang-Ho; Liss, Michael; Clancy, Kevin; Chesnut, Jonathan; Katzen, Federico

    2014-10-01

    Since the discovery of restriction enzymes and the generation of the first recombinant DNA molecule over 40 years ago, molecular biology has evolved into a multidisciplinary field that has democratized the conversion of a digitized DNA sequence stored in a computer into its biological counterpart, usually as a plasmid, stored in a living cell. In this article, we summarize the most relevant tools that allow the swift assembly of DNA sequences into useful plasmids for biotechnological purposes. We cover the main components and stages in a typical DNA assembly workflow, namely in silico design, de novo gene synthesis, and in vitro and in vivo sequence assembly methodologies.
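    As a toy illustration of the assembly idea described above (fragments joined through shared terminal homology, as in overlap-directed methods), the sketch below merges two sequences on their common overlap. The sequences and minimum-overlap threshold are invented, and real assembly chemistry is of course far richer than this string operation.

```python
# Sketch: merge two DNA fragments if the 3' end of one matches the 5' end
# of the other; a simplified stand-in for overlap-directed assembly.
def join(frag_a: str, frag_b: str, min_overlap: int = 15):
    max_k = min(len(frag_a), len(frag_b))
    for k in range(max_k, min_overlap - 1, -1):   # prefer the longest overlap
        if frag_a[-k:] == frag_b[:k]:
            return frag_a + frag_b[k:]
    return None   # no sufficient terminal homology

left = "ATGCGTACGTTAGCCGGATCCAAGTT"
right = "GGATCCAAGTTGCGTATCGA"
print(join(left, right, min_overlap=10))   # fragments share 'GGATCCAAGTT'
```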

  9. The Requirements Generation System: A tool for managing mission requirements

    NASA Technical Reports Server (NTRS)

    Sheppard, Sylvia B.

    1994-01-01

    Historically, NASA's cost for developing mission requirements has been a significant part of a mission's budget. Large amounts of time have been allocated in mission schedules for the development and review of requirements by the many groups who are associated with a mission. Additionally, tracing requirements from a current document to a parent document has been time-consuming and costly. The Requirements Generation System (RGS) is a computer-supported cooperative-work tool that assists mission developers in the online creation, review, editing, tracing, and approval of mission requirements as well as in the production of requirements documents. This paper describes the RGS and discusses some lessons learned during its development.
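    Requirements tracing of the kind described, from a current document back to its parent, is essentially a matter of maintaining and walking parent links. A minimal sketch follows, with invented requirement IDs and fields rather than the actual RGS schema.

```python
# Sketch: requirement records with parent-trace links, plus a walk to root.
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    text: str
    parent_id: str | None = None   # trace link into the parent document

def trace_to_root(reqs: dict, req_id: str) -> list:
    """Follow parent links upward, returning the full trace chain."""
    chain = []
    while req_id is not None:
        chain.append(req_id)
        req_id = reqs[req_id].parent_id
    return chain

reqs = {
    "MRD-12": Requirement("MRD-12", "Mission shall downlink daily.", None),
    "SRD-45": Requirement("SRD-45", "X-band downlink at 150 Mbps.", "MRD-12"),
    "ICD-7":  Requirement("ICD-7", "Transmitter interface per spec.", "SRD-45"),
}
print(trace_to_root(reqs, "ICD-7"))   # ['ICD-7', 'SRD-45', 'MRD-12']
```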

  10. Benchmarking the next generation of homology inference tools

    PubMed Central

    Saripella, Ganapathi Varma; Sonnhammer, Erik L. L.; Forslund, Kristoffer

    2016-01-01

    Motivation: Over the last decades, vast numbers of sequences were deposited in public databases. Bioinformatics tools allow homology and consequently functional inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate the ‘next generation’ of profile-based approaches, including CS-BLAST, HHSEARCH and PHMMER, in comparison with the non-profile based search tools NCBI-BLAST, USEARCH, UBLAST and FASTA. Method: We generated challenging benchmark datasets based on protein domain architectures within either the PFAM + Clan, SCOP/Superfamily or CATH/Gene3D domain definition schemes. From each dataset, homologous and non-homologous protein pairs were aligned using each tool, and standard performance metrics calculated. We further measured congruence of domain architecture assignments in the three domain databases. Results: CS-BLAST and PHMMER had the highest overall accuracy. FASTA, UBLAST and USEARCH showed large trade-offs of accuracy for speed optimization. Conclusion: Profile methods are superior at inferring remote homologs but the difference in accuracy between methods is relatively small. PHMMER and CS-BLAST stand out with the highest accuracy, yet still at a reasonable computational cost. Additionally, we show that less than 0.1% of Swiss-Prot protein pairs considered homologous by one database are considered non-homologous by another, implying that these classifications represent equivalent underlying biological phenomena, differing mostly in coverage and granularity. Availability and Implementation: Benchmark datasets and all scripts are available at http://sonnhammer.org/download/Homology_benchmark. Contact: forslund@embl.de Supplementary information: Supplementary data are available at Bioinformatics online.

  11. Deciding to quit drinking alcohol

    MedlinePlus

    Also called: Alcohol abuse - quitting drinking; Quitting drinking; Quitting alcohol; Alcoholism - deciding to quit. The entry cites the National Institute on Alcohol Abuse and Alcoholism's "Alcohol and health" resources (www.niaaa.nih.gov).

  12. Ambit-Tautomer: An Open Source Tool for Tautomer Generation.

    PubMed

    Kochev, Nikolay T; Paskaleva, Vesselina H; Jeliazkova, Nina

    2013-06-01

    We present a new open source tool for automatic generation of all tautomeric forms of a given organic compound. Ambit-Tautomer is a part of the open source software package Ambit2. It implements three tautomer generation algorithms: combinatorial method, improved combinatorial method and incremental depth-first search algorithm. All algorithms utilize a set of fully customizable rules for tautomeric transformations. The predefined knowledge base covers 1-3, 1-5 and 1-7 proton tautomeric shifts. Some typical supported tautomerism rules are keto-enol, imin-amin, nitroso-oxime, azo-hydrazone, thioketo-thioenol, thionitroso-thiooxime, amidine-imidine, diazoamino-diazoamino, thioamide-iminothiol and nitrosamine-diazohydroxide. Ambit-Tautomer uses a simple energy based system for tautomer ranking implemented by a set of empirically derived rules. A fine-grained output control is achieved by a set of post-generation filters. We performed an exhaustive comparison of the Ambit-Tautomer Incremental algorithm against several other software packages which offer tautomer generation: ChemAxon Marvin, Molecular Networks MN.TAUTOMER, ACDLabs, CACTVS and the CDK implementation of the algorithm, based on the mobile H atoms listed in the InChI. According to the presented test results, Ambit-Tautomer's performance is either comparable to or better than the competing algorithms. Ambit-Tautomer module is available for download as a Java library, a command line application, a demo web page or OpenTox API compatible Web service. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
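    The incremental depth-first search over transformation rules can be sketched compactly. Real tautomer generators operate on molecular graphs; the toy version below applies a single keto-enol-like rewrite to SMILES-like strings purely to illustrate the rule-plus-DFS pattern, and both the rules and the input are invented.

```python
# Sketch: rule-based tautomer enumeration via depth-first search. Real
# tools rewrite molecular graphs; this toy rewrites SMILES-like strings
# with one invented keto-enol rule pair, purely to show the pattern.
import re

RULES = [
    (re.compile(r"C\(=O\)C"), "C(O)=C"),   # toy "keto -> enol" shift
    (re.compile(r"C\(O\)=C"), "C(=O)C"),   # and its reverse
]

def tautomers(smiles: str) -> set:
    seen, stack = {smiles}, [smiles]
    while stack:                           # incremental depth-first search
        current = stack.pop()
        for pattern, repl in RULES:
            for m in pattern.finditer(current):
                variant = current[:m.start()] + repl + current[m.end():]
                if variant not in seen:
                    seen.add(variant)
                    stack.append(variant)
    return seen

print(tautomers("CC(=O)C"))   # {'CC(=O)C', 'CC(O)=C'}
```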

  13. Tool for Generating Realistic Residential Hot Water Event Schedules: Preprint

    SciTech Connect

    Hendron, B.; Burch, J.; Barker, G.

    2010-08-01

    The installed energy savings for advanced residential hot water systems can depend greatly on detailed occupant use patterns. Quantifying these patterns is essential for analyzing measures such as tankless water heaters, solar hot water systems with demand-side heat exchangers, distribution system improvements, and recirculation loops. This paper describes the development of an advanced spreadsheet tool that can generate a series of year-long hot water event schedules consistent with realistic probability distributions of start time, duration and flow rate variability, clustering, fixture assignment, vacation periods, and seasonality. This paper also presents the application of the hot water event schedules in the context of an integral-collector-storage solar water heating system in a moderate climate.
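    A compact sketch of the event-sampling core of such a schedule generator follows; the fixtures, distributions, and parameters are invented for illustration, and the real tool additionally models clustering, fixture assignment, vacation periods, and seasonality.

```python
# Sketch: draw a day's hot water events from simple distributions.
import random

FIXTURES = {              # fixture: (mean duration s, mean flow gal/min)
    "shower": (480, 2.0),
    "sink": (30, 1.0),
    "dishwasher": (600, 1.5),
}

def sample_day(n_events: int = 20, seed: int = 1):
    rng = random.Random(seed)
    events = []
    for _ in range(n_events):
        fixture = rng.choice(list(FIXTURES))
        mean_dur, mean_flow = FIXTURES[fixture]
        start = rng.uniform(0, 24 * 3600)          # toy uniform start times
        duration = rng.expovariate(1 / mean_dur)   # exponential durations
        flow = max(rng.gauss(mean_flow, 0.2 * mean_flow), 0.1)
        events.append((start, fixture, duration, flow))
    return sorted(events)                          # chronological schedule

for start, fixture, duration, flow in sample_day(5):
    print(f"{start:8.0f}s  {fixture:10s} {duration:6.0f}s  {flow:.2f} gpm")
```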

  14. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    NASA Technical Reports Server (NTRS)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.
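    One common way to seed a high-fidelity optimizer is to propagate a simplified dynamical model and hand the optimizer discrete state nodes along the arc. The sketch below does this for planar two-body motion in normalized units; the node format and dynamics are simplifications, not Mystic's actual input.

```python
# Sketch: discretize a ballistic two-body arc into (time, position, velocity)
# nodes usable as an initial guess. Semi-implicit Euler keeps the toy orbit
# stable enough for illustration.
import math

def accel(r, mu=1.0):
    d = math.hypot(r[0], r[1])
    return (-mu * r[0] / d**3, -mu * r[1] / d**3)

def propagate_nodes(r0, v0, t_end, n_nodes, dt=1e-3):
    nodes, next_sample = [], 0.0
    r, v, t = list(r0), list(v0), 0.0
    while t <= t_end:
        if t >= next_sample:                       # record a guess node
            nodes.append((t, tuple(r), tuple(v)))
            next_sample += t_end / (n_nodes - 1)
        a = accel(r)
        v = [v[i] + a[i] * dt for i in range(2)]
        r = [r[i] + v[i] * dt for i in range(2)]
        t += dt
    if len(nodes) < n_nodes:                       # ensure the final node exists
        nodes.append((t, tuple(r), tuple(v)))
    return nodes

# Half an orbit of a circular case (r=1, v=1, mu=1), sampled at 10 nodes.
guess = propagate_nodes((1.0, 0.0), (0.0, 1.0), t_end=math.pi, n_nodes=10)
```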

  16. Automatic Generation of Remote Visualization Tools with WATT

    NASA Astrophysics Data System (ADS)

    Jensen, P. A.; Bollig, E. F.; Yuen, D. A.; Erlebacher, G.; Momsen, A. R.

    2006-12-01

    The ever-increasing size and complexity of geophysical and other scientific datasets has forced developers to turn to more powerful alternatives for visualizing results of computations and experiments. These alternatives need to be faster, more scalable, more efficient, and able to run on large machines. At the same time, advances in scripting languages and visualization libraries have significantly decreased the development time of smaller, desktop visualization tools. Ideally, programmers would be able to develop visualization tools in a high-level, local, scripted environment and then automatically convert their programs into compiled, remote visualization tools for integration into larger computation environments. The Web Automation and Translation Toolkit (WATT) [1] converts a Tcl script for the Visualization Toolkit (VTK) [2] into a standards-compliant web service. We will demonstrate the use of WATT for the automated conversion of a desktop visualization application (written in Tcl for VTK) into a remote visualization service of interest to geoscientists. The resulting service will allow real-time access to a large dataset through the Internet, and will be easily integrated into the existing architecture of the Virtual Laboratory for Earth and Planetary Materials (VLab) [3]. [1] Jensen, P.A., Yuen, D.A., Erlebacher, G., Bollig, E.F., Kigelman, D.G., Shukh, E.A., Automated Generation of Web Services for Visualization Toolkits, Eos Trans. AGU, 86(52), Fall Meet. Suppl., Abstract IN42A-06, 2005. [2] The Visualization Toolkit, http://www.vtk.org [3] The Virtual Laboratory for Earth and Planetary Materials, http://vlab.msi.umn.edu

  17. World wide matching of registration metrology tools of various generations

    NASA Astrophysics Data System (ADS)

    Laske, F.; Pudnos, A.; Mackey, L.; Tran, P.; Higuchi, M.; Enkrich, C.; Roeth, K.-D.; Schmidt, K.-H.; Adam, D.; Bender, J.

    2008-10-01

    Turnaround time/cycle time is a key success criterion in the semiconductor photomask business. Therefore, global mask suppliers typically allocate workloads based on fab capability and utilization capacity. From a logistical point of view, the manufacturing location of a photomask should be transparent to the customer (mask user). Matching capability of production equipment, and especially metrology tools, is considered a key enabler of cross-site manufacturing flexibility. Toppan, with manufacturing sites in eight countries worldwide, has an on-going program to match the registration metrology systems of all its production sites. This allows for manufacturing flexibility and risk mitigation. In cooperation with Vistec Semiconductor Systems, Toppan has recently completed a program to match the Vistec LMS IPRO systems at all production sites worldwide. Vistec has developed a new software feature which allows for significantly improved matching of LMS IPRO(x) registration metrology tools of various generations. We will report on the results of the global matching campaign of several of the leading Toppan sites.

  18. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  20. Next generation sequencing: new tools in immunology and hematology.

    PubMed

    Mori, Antonio; Deola, Sara; Xumerle, Luciano; Mijatovic, Vladan; Malerba, Giovanni; Monsurrò, Vladia

    2013-12-01

    One of the hallmarks of the adaptive immune system is the specificity of B and T cell receptors. Thanks to somatic recombination, a large repertoire of receptors can be generated within an individual that guarantees the recognition of a vast number of antigens. Given the high degree of diversity among these receptors, monoclonal antibodies have limited applicability in BCR and TCR monitoring. Furthermore, with regard to cancer, better characterization of complex genomes and the ability to monitor tumor-specific cryptic mutations or translocations are needed to develop better tailored therapies. Novel technologies, by enhancing the ability to monitor BCR and TCR repertoires, can help in the search for minimal residual disease during hematological malignancy diagnosis and follow-up, and can aid in improving bone marrow transplantation techniques. Recently, a novel technology known as next generation sequencing has been developed; this allows the recognition of unique sequences and provides depth of coverage, heterogeneity, and accuracy of sequencing. This provides a powerful tool that, along with microarray analysis for gene expression, may become integral in resolving the remaining key problems in hematology. This review describes the state of the art of this novel technology, its application in the immunological and hematological fields, and the possible benefits it will provide for the hematology and immunology community.

  1. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep-waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, either at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow pattern and flow behavior (pressure gradient and phase fractions) prediction modeling are separated. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate design that leads to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach.
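    For a feel of the quantity such predictive tools produce, the sketch below computes a pressure gradient from phase rates, properties, and pipe geometry using a homogeneous no-slip mixture with Blasius friction. This is a deliberately crude textbook simplification of exactly the kind the project moves beyond (it ignores flow patterns and slip entirely); all numbers are illustrative.

```python
# Sketch: homogeneous no-slip pressure-gradient estimate for three-phase
# pipe flow. A textbook simplification (no flow patterns, no slip) used
# only to show the kind of output such tools predict; values illustrative.
import math

def pressure_gradient(q, rho, mu, diameter, inclination_rad, g=9.81):
    """q (m3/s), rho (kg/m3), mu (Pa.s): dicts keyed 'gas'/'oil'/'water'."""
    area = math.pi * diameter**2 / 4
    q_total = sum(q.values())
    frac = {p: q[p] / q_total for p in q}              # no-slip fractions
    rho_m = sum(frac[p] * rho[p] for p in frac)        # mixture density
    mu_m = sum(frac[p] * mu[p] for p in frac)          # crude mixture viscosity
    v = q_total / area                                 # mixture velocity
    re = rho_m * v * diameter / mu_m
    f = 64 / re if re < 2300 else 0.316 * re**-0.25    # laminar or Blasius
    dp_friction = f * rho_m * v**2 / (2 * diameter)
    dp_gravity = rho_m * g * math.sin(inclination_rad)
    return dp_friction + dp_gravity                    # Pa/m

dp = pressure_gradient(
    q={"gas": 0.05, "oil": 0.01, "water": 0.005},
    rho={"gas": 50.0, "oil": 850.0, "water": 1000.0},
    mu={"gas": 1.5e-5, "oil": 2e-3, "water": 1e-3},
    diameter=0.1, inclination_rad=0.0,
)
print(f"dp/dx = {dp:.0f} Pa/m")
```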

  2. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K diskette.
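    The task hierarchy TARGET maintains can be sketched as a small tree structure with a text rendering alongside the graphical one; the class and the sample task below are invented illustrations.

```python
# Sketch: a TARGET-style task hierarchy with decomposition into action
# kernels and a text (outline) rendering. Class and task names invented.
class Task:
    def __init__(self, label):
        self.label = label
        self.subtasks = []

    def decompose(self, *labels):
        """Split this step into smaller action kernels."""
        self.subtasks = [Task(l) for l in labels]
        return self.subtasks

    def outline(self, depth=0):
        lines = ["  " * depth + self.label]
        for sub in self.subtasks:
            lines.extend(sub.outline(depth + 1))
        return lines

root = Task("Replace pump seal")
_, swap = root.decompose("Drain housing", "Swap seal")
swap.decompose("Remove retainer", "Seat new seal")
print("\n".join(root.outline()))
```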

  4. Performance Evaluation Tools for Next Generation Scalable Computing Platforms

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Sarukkai, Sekhar; Craw, James (Technical Monitor)

    1995-01-01

    The Federal High Performance Computing and Communications (HPCC) Program continues to focus on R&D in a wide range of high performance computing and communications technologies. Using its accomplishments in the past four years as building blocks towards a Global Information Infrastructure (GII), an Implementation Plan that identifies six Strategic Focus Areas for R&D has been proposed. This white paper argues that a new generation of system software and programming tools must be developed to support these focus areas, so that the R&D we invest in today can lead to technology pay-offs a decade from now. The Global Computing Infrastructure (GCI) in the Year 2000 and Beyond would consist of thousands of powerful computing nodes connected via high-speed networks across the globe. Users will be able to obtain computing and information services from the GCI as easily as plugging a toaster into an electrical outlet anywhere in the country. Developing and managing the GCI requires performance prediction and monitoring capabilities that do not yet exist. Various accomplishments in this field today must be integrated and expanded to support this vision.

  5. Deciding where to Stop Speaking

    ERIC Educational Resources Information Center

    Tydgat, Ilse; Stevens, Michael; Hartsuiker, Robert J.; Pickering, Martin J.

    2011-01-01

    This study investigated whether speakers strategically decide where to interrupt their speech once they need to stop. We conducted four naming experiments in which pictures of colored shapes occasionally changed in color or shape. Participants then merely had to stop (Experiment 1); or they had to stop and resume speech (Experiments 2-4). They…

  6. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    SciTech Connect

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether employed at topside, seabed or bottom-hole, to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas, oil and water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict gas-oil-water flow, and experimental studies of gas-oil-water flow, including flow behavior description.

  7. Developing the Next Generation of Tools for Simulating Galaxy Outflows

    NASA Astrophysics Data System (ADS)

    Scannapieco, Evan

    Outflows are observed in starbursting galaxies of all masses and at all cosmological epochs. They play a key role throughout the history of the Universe: shaping the galaxy mass-metallicity relation, drastically affecting the content and number density of dwarf galaxies, and transforming the chemical composition of the intergalactic medium. Yet, a complete model of galaxy outflows has proven to be elusive, as it requires both a better understanding of the evolution of the turbulent, multiphase gas in and around starbursting galaxies, and better tools to reproduce this evolution in galaxy-scale simulations. Here we propose to conduct a detailed series of numerical simulations designed to help develop such next-generation tools for the simulation of galaxy outflows. The program will consist of three types of direct numerical simulations, each of which will be targeted to allow galaxy-scale simulations to more accurately model key microphysical processes and their observational consequences. Our first set of simulations will be targeted at better modeling the starbursting interstellar medium (ISM) from which galaxy outflows are driven. The surface densities in starbursting galaxies are much larger than those in the Milky Way, resulting in larger gravitational accelerations and random velocities exceeding 30 or even 100 km/s. Under these conditions, the thermal stability of the ISM is changed dramatically, due to the sharp peak in gas cooling efficiency near 200,000 K. Our simulations will carefully quantify the key ways in which this medium differs from the local ISM, and the consequences of these differences for when, where, and how outflows are driven. A second set of simulations will be targeted at better modeling the observed properties of rapidly cooling, highly turbulent gas. Because gas cooling in and around starbursts is extremely efficient, turbulent motions are often supersonic, which leads to a distribution of ionization states that is vastly different than

  8. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion between tool and blank. In this way, it is possible to determine the geometrical generating error, as a component of the total error. Modelling the generation process highlights potential errors of the generating tool, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, the method of "relative generating trajectories". The analytical foundation is presented, together with applications for known models of rack-gear type tools used on Maag gear-cutting machines.

  9. Lyo code generator: A model-based code generator for the development of OSLC-compliant tool interfaces

    NASA Astrophysics Data System (ADS)

    El-khoury, Jad

    To promote the newly emerging OSLC (Open Services for Lifecycle Collaboration) tool interoperability standard, an open source code generator has been developed that allows for the specification of OSLC-compliant tool interfaces, and from which almost complete Java code of the interface can be generated. The software takes a model-based development approach to tool interoperability, with the aim of providing modeling support for the complete development cycle of a tool interface. The software targets both OSLC developers and the interoperability research community, and has proven capable of being extended to support their respective needs.
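    The model-to-code idea can be reduced to template expansion over a resource model. The toy sketch below emits a Java-style class stub from a Python dictionary; the model format and the emitted stub are invented, and the real Lyo generator works from full OSLC specifications.

```python
# Sketch: toy model-to-code expansion. A Python dict stands in for the
# interface model; the emitted Java-style stub is invented, not Lyo output.
RESOURCE_TEMPLATE = """public class {name} extends AbstractResource {{
{fields}
}}"""

def generate_resource(name: str, properties: dict) -> str:
    fields = "\n".join(
        f"    private {jtype} {prop};" for prop, jtype in properties.items()
    )
    return RESOURCE_TEMPLATE.format(name=name, fields=fields)

print(generate_resource("ChangeRequest", {"title": "String", "severity": "int"}))
```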

  10. Ontodog: a web-based ontology community view generation tool.

    PubMed

    Zheng, Jie; Xiang, Zuoshuang; Stoeckert, Christian J; He, Yongqun

    2014-05-01

    Biomedical ontologies are often very large and complex. Only a subset of the ontology may be needed for a specified application or community. For ontology end users, it is desirable to have community-based labels rather than the labels generated by ontology developers. Ontodog is a web-based system that can generate an ontology subset based on Excel input, and support generation of an ontology community view, which is defined as the whole or a subset of the source ontology with user-specified annotations including user-preferred labels. Ontodog allows users to easily generate community views with minimal ontology knowledge and no programming skills or installation required. Currently >100 ontologies including all OBO Foundry ontologies are available to generate the views based on user needs. We demonstrate the application of Ontodog for the generation of community views using the Ontology for Biomedical Investigations as the source ontology.
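    An Ontodog-like community view amounts to extracting a term subset and overriding labels with user-preferred ones. A minimal sketch, with a toy ontology dictionary standing in for OWL input and a plain dict standing in for the Excel mapping:

```python
# Sketch: build a community view = term subset + preferred labels.
ONTOLOGY = {   # toy stand-in for a parsed OWL ontology
    "OBI:0000070": {"label": "assay", "parent": None},
    "OBI:0000443": {"label": "mass spectrometry assay", "parent": "OBI:0000070"},
}

def community_view(ontology: dict, keep: set, preferred: dict) -> dict:
    view = {}
    for term_id in keep:
        entry = dict(ontology[term_id])                  # copy source term
        entry["label"] = preferred.get(term_id, entry["label"])
        view[term_id] = entry
    return view

view = community_view(ONTOLOGY, {"OBI:0000443"}, {"OBI:0000443": "MS assay"})
print(view)   # {'OBI:0000443': {'label': 'MS assay', 'parent': 'OBI:0000070'}}
```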

  11. Virtual Tool Mark Generation for Efficient Striation Analysis

    SciTech Connect

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
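    The two computational steps described, projecting the tip's scanned geometry along the travel direction to form a virtual mark and then comparing it against cross-sections of the marked plate, can be sketched as below. Plain normalized correlation stands in for the Chumbley et al. likelihood statistic, and the array layout is an assumption.

```python
# Sketch: project a 3D tip scan along the travel direction (y) to a 2D
# virtual mark profile, then compare profiles by normalized correlation.
import numpy as np

def virtual_mark(tip_points: np.ndarray, n_bins: int = 200) -> np.ndarray:
    """tip_points: (N, 3) array; the profile is the deepest z per x bin."""
    x, z = tip_points[:, 0], tip_points[:, 2]
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, bins) - 1, 0, n_bins - 1)
    profile = np.full(n_bins, np.nan)
    for i in range(n_bins):
        sel = z[idx == i]
        if sel.size:
            profile[i] = sel.min()
    return profile

def similarity(p1: np.ndarray, p2: np.ndarray) -> float:
    ok = ~np.isnan(p1) & ~np.isnan(p2)                  # compare shared bins
    a, b = p1[ok] - p1[ok].mean(), p2[ok] - p2[ok].mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```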

  13. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  14. Bioethics in America: Who decides?

    SciTech Connect

    Yesley, M.S.

    1992-05-01

    This paper is concerned with the process by which bioethics decisions are made as well as the actual decisions that are reached. The process commonly is one of "shared decision-making," that is, decisionmaking at several levels, beginning with the government and ending with the individual. After the government has defined a scope of permissible activity, the research or health care institution may further limit what activities are permitted. Finally, the individual patient, or, if the patient is incompetent, the patient's legal representative decides whether or not to participate in the activity. Because bioethics in general, and bioethics related to genetics in particular, evolves through this process of decisionmaking at several levels, this paper briefly traces the process, to see how it works in several areas of bioethics, in order to provide a perspective on the way in which ethical decisions related to genetics are or will be made.

  16. JVM: Java Visual Mapping tool for next generation sequencing read.

    PubMed

    Yang, Ye; Liu, Juan

    2015-01-01

    We developed a program JVM (Java Visual Mapping) for mapping next generation sequencing reads to a reference sequence. The program is implemented in Java and is designed to deal with millions of short reads generated by the Illumina sequencing technology. It employs a seed index strategy and octal encoding operations for sequence alignment. JVM is useful for DNA-Seq and RNA-Seq when dealing with single-end resequencing. JVM is a desktop application, which supports read volumes from 1 MB to 10 GB.
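    The seed-index strategy the abstract mentions is, at its core, a k-mer hash table over the reference with seed lookup followed by full verification. A generic sketch follows (JVM's actual index layout and octal encoding are not detailed in the abstract, so nothing here is specific to it):

```python
# Sketch: k-mer seed index over the reference, then seed-and-verify mapping.
def build_index(reference: str, k: int = 11) -> dict:
    index = {}
    for i in range(len(reference) - k + 1):
        index.setdefault(reference[i:i + k], []).append(i)
    return index

def map_read(read: str, reference: str, index: dict, k: int = 11):
    """Seed with the read's first k-mer, then verify the full read."""
    hits = []
    for pos in index.get(read[:k], []):
        window = reference[pos:pos + len(read)]
        mismatches = sum(a != b for a, b in zip(read, window))
        if len(window) == len(read) and mismatches <= 2:
            hits.append((pos, mismatches))
    return hits

ref = "ACGTACGTGGTCCAGTTAACGTACGT"
idx = build_index(ref)
print(map_read("GGTCCAGTTAA", ref, idx))   # [(8, 0)]
```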

  17. Computer Generated Optical Illusions: A Teaching and Research Tool.

    ERIC Educational Resources Information Center

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  18. Financial management: a necessary tool for generating cash.

    PubMed

    Humphrey, E; Cilwik, C J

    1994-01-01

    This article is an introduction to four types of financial analysis and a foundation for additional exposure to financial analysis. If you don't like working with numbers, consider hiring an accountant or a qualified industry consultant to help you analyze your business. Eventually, you will learn what financial clues to look for when analyzing your business and how to reach your objectives and generate cash to reinvest in your business.

  19. HALOGEN: a tool for fast generation of mock halo catalogues

    NASA Astrophysics Data System (ADS)

    Avila, Santiago; Murray, Steven G.; Knebe, Alexander; Power, Chris; Robotham, Aaron S. G.; Garcia-Bellido, Juan

    2015-06-01

    We present a simple method of generating approximate synthetic halo catalogues: HALOGEN. This method uses a combination of second-order Lagrangian Perturbation Theory (2LPT) in order to generate the large-scale matter distribution, analytical mass functions to generate halo masses, and a single-parameter stochastic model for halo bias to position haloes. HALOGEN represents a simplification of similar recently published methods. Our method is constrained to recover the two-point function at intermediate (10 h-1 Mpc < r < 50 h-1 Mpc) scales, which we show is successful to within 2 per cent. Larger scales (˜100 h-1 Mpc) are reproduced to within 15 per cent. We compare several other statistics (e.g. power spectrum, point distribution function, redshift space distortions) with results from N-body simulations to determine the validity of our method for different purposes. One of the benefits of HALOGEN is its flexibility, and we demonstrate this by showing how it can be adapted to varying cosmologies and simulation specifications. A driving motivation for the development of such approximate schemes is the need to compute covariance matrices and study the systematic errors for large galaxy surveys, which requires thousands of simulated realizations. We discuss the applicability of our method in this context, and conclude that it is well suited to mass production of appropriate halo catalogues. The code is publicly available at https://github.com/savila/halogen.
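    HALOGEN's halo-placement ingredients, analytic masses plus single-parameter density-weighted positioning, can be caricatured in a few lines. The power-law mass draw and the bias exponent alpha below are illustrative stand-ins, not the paper's calibrated forms.

```python
# Sketch: draw halo masses from a power law and place them in cells with
# probability weighted by density**alpha (the single bias parameter).
import random

def sample_halos(density_cells, n_halos, alpha=2.0, seed=42):
    """density_cells: list of (cell_id, matter_density), e.g. from a 2LPT field."""
    rng = random.Random(seed)
    weights = [rho**alpha for _, rho in density_cells]
    halos = []
    for _ in range(n_halos):
        mass = 1e12 * rng.paretovariate(1.9)            # toy power-law masses
        cell_id, _ = rng.choices(density_cells, weights=weights)[0]
        halos.append((cell_id, mass))
    return halos

cells = [("c0", 0.2), ("c1", 1.0), ("c2", 3.5)]         # invented densities
print(sample_halos(cells, n_halos=3))                   # mostly lands in 'c2'
```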

  20. An Educational Tool for a New Generation of Tsunami Scientists

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.; Robinson, A. R.

    2008-12-01

    What emerges from the 2004 Indian Ocean tsunami and society's response is a call for research that will mitigate the effects of the next tsunami on society. The scale of the 2004 tsunami's impact (227,000 deaths, $10B damage), and the world's compassionate response ($13.5B), requires that tsunami research focus on applications that benefit society. Tsunami science will be expected to develop standards that ensure mitigation products are based on the state of the science. Standards based on scientifically endorsed procedures assure the highest quality application of this science. Community educational activities will be expected to focus on preparing society for the next tsunami. An excellent starting point for the challenges ahead is education, at all levels, including practitioners, the public, and a new generation of tsunami scientists. To educate the new generation of scientists, Volume 15 of The Sea: Tsunamis has been written to capture the technical elements of tsunami state-of-the-science today. The volume includes: the recorded and geologic history of tsunamis and how to assess the probability of tsunami risk; the generation of tsunamis; the measurement and modeling of tsunami propagation and inundation; the impacts of tsunamis on coastlines; and tsunami forecasts and warnings. Together, this volume gives a technical foundation for applying tsunami science to community-based tsunami preparedness. The editors of The Sea: Tsunamis will present an overview of the volume with emphasis on its value to higher education.

  1. UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Degani, Asaf; Heymann, Michael

    2004-01-01

    In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.
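    One simple flavor of the adequacy check described, that the interface abstraction must not hide behavioral differences of the machine, is sketched below: if two machine states share an interface state yet the same user action leads to different interface states, the interface is flagged. This is a simplification of the formal criterion used in such verification, with an invented toy machine.

```python
# Sketch: flag interfaces where one (interface state, action) pair can lead
# to different interface outcomes depending on hidden machine state.
def adequate(transitions: dict, abstraction: dict) -> bool:
    """transitions: (machine_state, action) -> machine_state;
    abstraction: machine_state -> interface_state."""
    observed = {}
    for (state, action), nxt in transitions.items():
        key = (abstraction[state], action)
        target = abstraction[nxt]
        if observed.setdefault(key, target) != target:
            return False   # same interface state + action, different outcome
    return True

machine = {("s1", "brew"): "s2", ("s3", "brew"): "s4"}
iface = {"s1": "READY", "s2": "BUSY", "s3": "READY", "s4": "ERROR"}
print(adequate(machine, iface))   # False: READY + brew is ambiguous
```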

  3. E-DECIDER Decision Support Gateway For Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.

    2013-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that
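    Producing a standards-compliant map layer of the simplest kind, a KML placemark file, takes only a few lines; the coordinates and layer content below are invented sample data rather than actual E-DECIDER output.

```python
# Sketch: emit a minimal KML layer; placemark data here is invented.
KML_DOC = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
{placemarks}
</Document></kml>"""

def to_kml(points):
    """points: iterable of (name, lon, lat) tuples."""
    pm = "\n".join(
        f"<Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat in points
    )
    return KML_DOC.format(placemarks=pm)

print(to_kml([("max deformation", -118.25, 34.05)]))
```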

  4. Architecture for the Next Generation System Management Tools

    SciTech Connect

    Gallard, Jerome; Scott, Stephen L; Vallee, Geoffroy R

    2011-01-01

    To get more results or greater accuracy, computational scientists execute their applications on distributed computing platforms such as Clusters, Grids and Clouds. These platforms are different in terms of hardware and software resources as well as locality: some span across multiple sites and multiple administrative domains whereas others are limited to a single site/domain. As a consequence, in order to scale their applications up, the scientists have to manage technical details for each target platform. From our point of view, this complexity should be hidden from the scientists who, in most cases, would prefer to focus on their research rather than spending time dealing with platform configuration concerns. In this article, we advocate for a system management framework that aims to automatically set up the whole run-time environment according to the application's needs. The main difference with regard to usual approaches is that they generally only focus on the software layer whereas we address both the hardware and the software expectations through a unique system. For each application, scientists describe their requirements through the definition of a Virtual Platform (VP) and a Virtual System Environment (VSE). Relying on the VP/VSE definitions, the framework is in charge of: (i) the configuration of the physical infrastructure to satisfy the VP requirements, (ii) the setup of the VP, and (iii) the customization of the execution environment (VSE) upon the former VP. We propose a new formalism that the system can rely upon to successfully perform each of these three steps without burdening the user with the specifics of the configuration for the physical resources and system management tools. This formalism leverages Goldberg's theory for recursive virtual machines by introducing new concepts based on system virtualization (identity, partitioning, aggregation) and emulation (simple, abstraction). This enables the definition of complex VP/VSE configurations.

  5. Functional viral metagenomics and the next generation of molecular tools.

    PubMed

    Schoenfeld, Thomas; Liles, Mark; Wommack, K Eric; Polson, Shawn W; Godiska, Ronald; Mead, David

    2010-01-01

    The enzymes of bacteriophages and other viruses have been essential research tools since the first days of molecular biology. However, the current repertoire of viral enzymes only hints at their overall potential. The most commonly used enzymes are derived from a surprisingly small number of cultivated viruses, which is remarkable considering the extreme abundance and diversity of viruses revealed over the past decade by metagenomic analysis. To access the treasure trove of enzymes hidden in the global virosphere and develop them for research, therapeutic and diagnostic uses, improvements are needed in our ability to rapidly and efficiently discover, express and characterize viral genes to produce useful proteins. In this paper, we discuss improvements to sampling and cloning methods, functional and genomics-based screens, and expression systems, which should accelerate discovery of new enzymes and other viral proteins for use in research and medicine.

  6. A distributed computing tool for generating neural simulation databases.

    PubMed

    Calin-Jageman, Robert J; Katz, Paul S

    2006-12-01

    After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a "screen-saver" cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net . It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.
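    The client half of such a screen-saver cluster is a poll-run-report loop. The sketch below assumes a hypothetical HTTP/JSON protocol (the endpoint URL and JSON fields are invented; NeuronPM's actual exchange format may differ) and reduces the NEURON invocation to a placeholder subprocess call.

```python
# Sketch: NeuronPM-style client loop - poll for a job, run the simulator,
# post results back. Endpoint, fields, and command line are assumptions.
import json, subprocess, time, urllib.request

SERVER = "http://example.org/neuronpm"   # hypothetical server endpoint

def work_loop():
    while True:
        with urllib.request.urlopen(f"{SERVER}/next_job") as resp:
            job = json.load(resp)
        if not job:                       # no work available; back off
            time.sleep(60)
            continue
        # Run the simulation for the assigned parameter set (placeholder call).
        result = subprocess.run(
            ["nrniv", job["model_file"], "-c", job["params"]],
            capture_output=True, text=True,
        )
        payload = json.dumps({"job_id": job["id"], "output": result.stdout})
        req = urllib.request.Request(
            f"{SERVER}/submit", data=payload.encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)       # report results to the server
```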

  7. Multiscale Toxicology - Building the Next Generation Tools for Toxicology

    SciTech Connect

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.; Waters, Katrina M.

    2012-09-01

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus), to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISDD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  8. Generating mammalian sirtuin tools for protein-interaction analysis.

    PubMed

    Hershberger, Kathleen A; Motley, Jonathan; Hirschey, Matthew D; Anderson, Kristin A

    2013-01-01

    The sirtuins are a family of NAD(+)-dependent deacylases with important effects on aging, cancer, and metabolism. Sirtuins exert their biological effects by catalyzing deacetylation and/or deacylation reactions in which acyl groups are removed from lysine residues of specific proteins. A current challenge is to identify specific sirtuin target proteins against the high background of acetylated proteins recently identified by proteomic surveys. New evidence indicates that bona fide sirtuin substrate proteins form stable physical associations with their sirtuin regulator. Therefore, identification of sirtuin-interacting proteins could be a useful aid in focusing the search for substrates. Described here is a method for identifying sirtuin protein interactors. Employing basic techniques of molecular cloning and immunochemistry, the method describes the generation of mammalian sirtuin protein expression plasmids and their use to overexpress and immunoprecipitate sirtuins with their interacting partners. Also described is the use of the Database for Annotation, Visualization, and Integrated Discovery for interpreting the sirtuin protein-interaction data obtained.

  9. The clinical case report: a tool for hypothesis generation.

    PubMed

    Sniderman, A D

    1996-10-01

    The clinical case report is generally limited to a description of unusual examples of the complications of disease or responses to therapy. However, it can also be used to present novel hypotheses which have been derived from individual cases. Two examples of this latter genre are presented and updated. These are Syndrome X and the stiff left atrial syndrome. In both instances, general and novel formulations were derived from single cases. With respect to Syndrome X, a hypothesis was generated that the chest pain and ST abnormalities in these patients represent excess activation of adenosine A1 receptors in the absence of myocardial ischemia. With respect to the stiff left atrial syndrome, recognition of the first case led to the recognition of the problem in many others. Now, a variant of the syndrome has been recognized in which mitral regurgitation is also present. In addition, the possibility that tricuspid annuloplasty may rescue patients dying of cardiac cachexia due to right heart failure caused by combined pressure and volume overload of the right ventricle is outlined.

  10. Next generation sequencing technologies: tool to study avian virus diversity.

    PubMed

    Kapgate, S S; Barbuddhe, S B; Kumanan, K

    2015-03-01

    Increased globalisation, climatic change, and the wildlife-livestock interface have led to the emergence of novel viral pathogens and zoonoses that have become a serious concern for avian, animal, and human health. High biodiversity and bird migration facilitate spread of pathogens and provide reservoirs for emerging infectious diseases. Classical diagnostic methods, designed to be virus-specific or limited to a group of viral agents, hinder the identification of novel viruses or viral variants. Recently developed next-generation sequencing (NGS) approaches provide culture-independent methods that are useful for understanding viral diversity and for discovery of novel viruses, thereby enabling better diagnosis and disease control. This review discusses the possible steps of an NGS study utilizing sequence-independent amplification, high-throughput sequencing, and bioinformatics approaches to identify novel avian viruses and their diversity. NGS has led to the identification of a wide range of new viruses, such as picobirnavirus, picornavirus, orthoreovirus, and an avian gammacoronavirus associated with fulminating disease in guinea fowl, and is also used to describe viral diversity among avian species. The review also briefly discusses viral-host interactions and disease-associated causalities involving newly identified avian viruses.

  11. Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research

    PubMed Central

    Surati, Mosmi; Robinson, Matthew; Nandi, Suvobroto; Faoro, Leonardo; Demchuk, Carley; Kanteti, Rajani; Ferguson, Benjamin; Gangadhar, Tara; Hensing, Thomas; Hasina, Rifat; Husain, Aliya; Ferguson, Mark; Karrison, Theodore; Salgia, Ravi

    2011-01-01

    The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data, available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and described in a standard operating manual to ensure consistency of data annotation. Using one protocol for prospective tissue banking and another for retrospective banking, tumor and normal tissue samples were collected from patients who consented to these protocols. Clinical information such as demographics, cancer characterization, and treatment plans for these patients was abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and linked to the clinical information for patients described within the database. The data from each table are linked using the relationships function in Microsoft Access, allowing the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation. PMID:21304468
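
    The linkage described, relating patient, specimen, and assay tables through shared identifiers so that one query returns merged clinical and laboratory records, is the standard relational-join pattern. A minimal sketch using SQLite in place of Access (schema and values are hypothetical):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Hypothetical, greatly simplified schema; the real database holds
    # many more clinical, proteomic, and genomic variables.
    cur.executescript("""
    CREATE TABLE patients  (patient_id INTEGER PRIMARY KEY, age INTEGER, stage TEXT);
    CREATE TABLE specimens (specimen_id INTEGER PRIMARY KEY, patient_id INTEGER,
                            tissue_type TEXT);
    CREATE TABLE assays    (specimen_id INTEGER, gene TEXT, expression REAL);
    """)
    cur.execute("INSERT INTO patients  VALUES (1, 63, 'IIIA')")
    cur.execute("INSERT INTO specimens VALUES (10, 1, 'tumor')")
    cur.execute("INSERT INTO assays    VALUES (10, 'MET', 4.2)")

    # One query connects clinical and laboratory information, analogous to
    # the Access relationships workflow, ready for export to statistics.
    rows = cur.execute("""
        SELECT p.patient_id, p.stage, s.tissue_type, a.gene, a.expression
        FROM patients p
        JOIN specimens s ON s.patient_id = p.patient_id
        JOIN assays    a ON a.specimen_id = s.specimen_id
    """).fetchall()
    print(rows)   # [(1, 'IIIA', 'tumor', 'MET', 4.2)]
    ```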

  12. Profiting from competition: Financial tools for electric generation companies

    NASA Astrophysics Data System (ADS)

    Richter, Charles William, Jr.

    Regulations governing the operation of electric power systems in North America and many other areas of the world are undergoing major changes designed to promote competition, a process often referred to as deregulation. Participants in deregulated electricity systems may find that their profits benefit greatly from successful bidding strategies. While the goal of regulators may be to create rules that balance reliable power system operation with maximization of the total benefit to society, the goal of generation companies is to maximize their profit, i.e., the return to their shareholders. The majority of the research described here is conducted from the point of view of generation companies (GENCOs) wishing to maximize their expected utility function, which generally comprises expected profit and risk. Strategies that help a GENCO maximize its objective function must consider the impact of (and aid in making) operating decisions on time scales ranging from a few seconds to multiple years. The work described here assumes an environment in which energy service companies (ESCOs) buy and GENCOs sell power via double auctions in regional commodity exchanges. Power is transported on wires owned by transmission companies (TRANSCOs) and distribution companies (DISTCOs). The proposed market framework allows participants to trade electrical energy contracts via the spot, futures, options, planning, and swap markets. An important method of studying these proposed markets and the behavior of participating agents is the field of experimental/computational economics. For much of the research reported here, the market simulator developed by Kumar and Sheble, along with similar simulators, has been adapted to allow computerized agents to trade energy. Creating computerized agents that can react as rationally or irrationally as a human trader is a difficult problem, for which we have turned to the field of artificial intelligence. Some of our
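
    An expected utility function combining expected profit and risk is commonly written in mean-variance form; a toy sketch for scoring candidate offers across price scenarios (the risk-aversion weight and all market numbers are invented for illustration, not taken from the thesis):

    ```python
    import numpy as np

    def expected_utility(profits, risk_aversion):
        """Mean-variance utility: E[profit] - lambda * Var[profit]."""
        profits = np.asarray(profits, dtype=float)
        return profits.mean() - risk_aversion * profits.var()

    # Hourly profit of a candidate offer quantity q under price scenarios;
    # all numbers are illustrative.
    prices = np.array([18.0, 25.0, 32.0, 40.0])   # $/MWh scenarios
    cost = 22.0                                    # marginal cost, $/MWh

    for q in (50.0, 100.0, 150.0):                 # MW offered
        profits = (prices - cost) * q              # $/h in each scenario
        print(q, round(expected_utility(profits, risk_aversion=1e-3), 1))
    # Larger offers raise expected profit but variance grows with q^2,
    # so a risk-averse GENCO does not always offer its full capacity.
    ```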

  13. Multiscale Toxicology- Building the Next Generation Tools for Toxicology

    SciTech Connect

    Retterer, S. T.; Holsapple, M. P.

    2013-10-31

    A Cooperative Research and Development Agreement (CRADA) was established between Battelle Memorial Institute (BMI), Pacific Northwest National Laboratory (PNNL), Oak Ridge National Laboratory (ORNL), Brookhaven National Laboratory (BNL), and Lawrence Livermore National Laboratory (LLNL), with the goal of combining the analytical and synthetic strengths of the National Laboratories with BMI's expertise in basic and translational medical research to develop a collaborative pipeline and suite of high-throughput and imaging technologies that could be used to provide a more comprehensive understanding of material and drug toxicology in humans. The Multi-Scale Toxicity Initiative (MSTI), consisting of the team members above, was established to coordinate cellular-scale, high-throughput in vitro testing, computational modeling, and whole-animal in vivo toxicology studies between MSTI team members. Development of a common, well-characterized set of materials for testing was identified as a crucial need for the initiative. Two research tracks were established by BMI during the course of the CRADA. The first research track focused on the development of tools and techniques for understanding the toxicity of nanomaterials, specifically inorganic nanoparticles (NPs). ORNL's work focused primarily on the synthesis, functionalization, and characterization of a common set of NPs for dissemination to the participating laboratories. These particles were synthesized to retain the same surface characteristics and size, but to allow visualization using the variety of imaging technologies present across the team. Characterization included the quantitative analysis of physical and chemical properties of the materials as well as the preliminary assessment of NP toxicity using commercially available toxicity screens and emerging optical imaging strategies. Additional efforts examined the development of high-throughput microfluidic and imaging assays for measuring NP uptake, localization, and

  14. PC graphics generation and management tool for real-time applications

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1992-01-01

    A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.

  15. Skateboards or Wildlife? Kids Decide!

    ERIC Educational Resources Information Center

    Thomas, Julie; Cooper, Sandra; Haukos, David

    2004-01-01

    How can teachers make science learning relevant to today's technology savvy students? They can incorporate the Internet and use it as a tool to help solve real-life problems. A group of university professors, a field biologist, and classroom teachers teamed up to create an exciting, interactive Web-based learning environment for students and…

  17. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.
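
    A schedule-to-timeline transformation of the kind MEGG performs can be sketched with matplotlib; the event names and times below are invented for illustration:

    ```python
    import matplotlib.pyplot as plt

    # Hypothetical schedule entries: (event, start hour, duration in hours).
    events = [
        ("Target acquisition", 0.0, 1.5),
        ("WFPC2 exposure",     1.5, 3.0),
        ("Downlink",           4.5, 1.0),
    ]

    fig, ax = plt.subplots(figsize=(8, 2.5))
    for row, (name, start, dur) in enumerate(events):
        ax.broken_barh([(start, dur)], (row - 0.4, 0.8))
        ax.text(start + dur / 2, row, name, ha="center", va="center")

    ax.set_yticks([])            # event labels are drawn inside the bars
    ax.set_xlabel("Mission elapsed time (hours)")
    ax.set_title("Mission events timeline (illustrative)")
    plt.tight_layout()
    plt.show()
    ```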

  18. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  19. Rover Team Decides: Safety First

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA's Mars Exploration Rover Spirit recorded this view while approaching the northwestern edge of 'Home Plate,' a circular plateau-like area of bright, layered outcrop material roughly 80 meters (260 feet) in diameter. The images combined into this mosaic were taken by Spirit's navigation camera during the rover's 746th, 748th and 750th Martian days, or sols (Feb. 7, 9 and 11, 2006).

    With Martian winter closing in, engineers and scientists working with NASA's Mars Exploration Rover Spirit decided to play it safe for the time being rather than attempt to visit the far side of Home Plate in search of rock layers that might show evidence of a past watery environment. This feature has been one of the major milestones of the mission. Though it's conceivable that rock layers might be exposed on the opposite side, sunlight is diminishing on the rover's solar panels and team members chose not to travel in a counterclockwise direction that would take the rover to the west and south slopes of the plateau. Slopes in that direction are hidden from view and team members chose, following a long, thorough discussion, to have the rover travel clockwise and remain on north-facing slopes rather than risk sending the rover deeper into unknown terrain.

    In addition to studying numerous images from Spirit's cameras, team members studied three-dimensional models created with images from the Mars Orbiter Camera on NASA's Mars Global Surveyor orbiter. The models showed a valley on the southern side of Home Plate, the slopes of which might cause the rover's solar panels to lose power for unknown lengths of time. In addition, images from Spirit's cameras showed a nearby, talus-covered section of slope on the west side of Home Plate, rather than the exposed rock layers scientists eventually hope to investigate.

    Home Plate has been on the rover's potential itinerary since the early days of the mission, when it stood out in images taken by the Mars Orbiter Camera shortly after

  1. Generation X Teaches College: Generation Construction as Pedagogical Tool in the Writing Classroom.

    ERIC Educational Resources Information Center

    Hassel, Holly; Epp, Dawn Vernooy

    In the 1996 book "Generation X Goes to College: An Eye-Opening Account of Teaching in Post-Modern America," Peter Sacks probes the "decay" of higher education in the United States; a decay he attributes to listless, entitled students. This paper interrogates the paradigm of Boomers and Generation Xers poised in opposition to…

  3. Generative Text Sets: Tools for Negotiating Critically Inclusive Early Childhood Teacher Education Pedagogical Practices

    ERIC Educational Resources Information Center

    Souto-Manning, Mariana

    2017-01-01

    Through a case study, this article sheds light onto generative text sets as tools for developing and enacting critically inclusive early childhood teacher education pedagogies. In doing so, it positions teaching and learning processes as sociocultural, historical, and political acts as it inquires into the use of generative text sets in one early…

  4. Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST): A web tool for generating protein sequence similarity networks

    PubMed Central

    Gerlt, John A.; Bouvier, Jason T.; Davidson, Daniel B.; Imker, Heidi J.; Sadkhin, Boris; Slater, David R.; Whalen, Katie L.

    2015-01-01

    The Enzyme Function Initiative, an NIH/NIGMS-supported Large-Scale Collaborative Project (EFI; U54GM093342; http://enzymefunction.org/), is focused on devising and disseminating bioinformatics and computational tools as well as experimental strategies for the prediction and assignment of functions (in vitro activities and in vivo physiological/metabolic roles) to uncharacterized enzymes discovered in genome projects. Protein sequence similarity networks (SSNs) are visually powerful tools for analyzing sequence relationships in protein families (H.J. Atkinson, J.H. Morris, T.E. Ferrin, and P.C. Babbitt, PLoS One 2009, 4, e4345). However, the members of the biological/biomedical community have not had access to the capability to generate SSNs for their “favorite” protein families. In this article we announce the EFI-EST (Enzyme Function Initiative-Enzyme Similarity Tool) web tool (http://efi.igb.illinois.edu/efi-est/) that is available without cost for the automated generation of SSNs by the community. The tool can create SSNs for the “closest neighbors” of a user-supplied protein sequence from the UniProt database (Option A) or of members of any user-supplied Pfam and/or InterPro family (Option B). We provide an introduction to SSNs, a description of EFI-EST, and a demonstration of the use of EFI-EST to explore sequence-function space in the OMP decarboxylase superfamily (PF00215). This article is designed as a tutorial that will allow members of the community to use the EFI-EST web tool for exploring sequence/function space in protein families. PMID:25900361
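
    An SSN is simply a graph whose nodes are sequences and whose edges connect pairs scoring above a similarity threshold; a toy sketch with networkx (the scores are invented stand-ins for the all-vs-all alignment scores EFI-EST computes):

    ```python
    import networkx as nx

    # Hypothetical pairwise alignment scores between five sequences;
    # in EFI-EST these would come from all-vs-all BLAST of a family.
    seqs = ["seqA", "seqB", "seqC", "seqD", "seqE"]
    score = {
        ("seqA", "seqB"): 310, ("seqA", "seqC"): 290, ("seqB", "seqC"): 305,
        ("seqC", "seqD"): 80,  ("seqD", "seqE"): 320, ("seqA", "seqE"): 60,
    }

    THRESHOLD = 200   # only edges at or above this score are kept

    g = nx.Graph()
    g.add_nodes_from(seqs)
    for (a, b), s in score.items():
        if s >= THRESHOLD:
            g.add_edge(a, b, score=s)

    # At a well-chosen cutoff, connected components approximate
    # isofunctional clusters within the family.
    for cluster in nx.connected_components(g):
        print(sorted(cluster))
    # ['seqA', 'seqB', 'seqC'] and ['seqD', 'seqE']
    ```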

  5. Deciding as Intentional Action: Control over Decisions

    PubMed Central

    Shepherd, Joshua

    2015-01-01

    Common-sense folk psychology and mainstream philosophy of action agree about decisions: these are under an agent's direct control, and are thus intentional actions for which agents can be held responsible. I begin this paper by presenting a problem for this view. In short, since the content of the motivational attitudes that drive deliberation and decision remains open-ended until the moment of decision, it is unclear how agents can be thought to exercise control over what they decide at the moment of deciding. I note that this problem might motivate a non-actional view of deciding—a view that decisions are not actions, but are instead passive events of intention acquisition. For without an understanding of how an agent might exercise control over what is decided at the moment of deciding, we lack a good reason for maintaining commitment to an actional view of deciding. However, I then offer the required account of how agents exercise control over decisions at the moment of deciding. Crucial to this account is an understanding of the relation of practical deliberation to deciding, an understanding of skilled deliberative activity, and the role of attention in the mental action of deciding. PMID:26321765

  6. E-DECIDER Rapid Response to the M 6.0 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    E-DECIDER initiated rapid response mode when the California Earthquake Clearinghouse was activated the morning following the M6 Napa earthquake. Data products, including: 1) rapid damage and loss estimates, 2) deformation magnitude and slope change maps, and 3) aftershock forecasts were provided to the Clearinghouse partners within 24 hours of the event via XchangeCore Web Service Data Orchestration sharing. NASA data products were provided to end-users via XchangeCore, EERI and Clearinghouse websites, and ArcGIS online for Napa response, reaching a wide response audience. The E-DECIDER team helped facilitate rapid delivery of NASA products to stakeholders and participated in Clearinghouse Napa earthquake briefings to update stakeholders on product information. Rapid response products from E-DECIDER can be used to help prioritize response efforts shortly after the event has occurred. InLET (Internet Loss Estimation Tool) post-event damage and casualty estimates were generated quickly after the Napa earthquake. InLET provides immediate post-event estimates of casualties and building damage by performing loss/impact simulations using USGS ground motion data and FEMA HAZUS damage estimation technology. These results were provided to E-DECIDER by their collaborators, ImageCat, Inc. and the Community Stakeholder Network (CSN). Strain magnitude and slope change maps were automatically generated when the Napa earthquake appeared on the USGS feed. These maps provide an early estimate of where the deformation has occurred and where damage may be localized. Using E-DECIDER critical infrastructure overlays with damage estimates, decision makers can direct response effort that can be verified later with field reconnaissance and remote sensing-based observations. Earthquake aftershock forecast maps were produced within hours of the event. These maps highlight areas where aftershocks are likely to occur and can also be coupled with infrastructure overlays to help direct response

  7. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on computational tools and facilities for the analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  8. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    SciTech Connect

    Etingov, Pavel; Makarov, PNNL Yuri; Subbarao, PNNL Kris; PNNL,

    2014-03-03

    RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.
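
    In its simplest form, sizing balancing capacity to a user-specified confidence level amounts to taking quantiles of the combined forecast-error distribution; a Monte Carlo sketch under assumed normal errors (all numbers illustrative, not from the tool):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 10_000          # Monte Carlo samples for the next operating hour

    # Illustrative forecast-error models (MW), one per uncertainty source.
    load_err  = rng.normal(0.0,  90.0, N)   # load forecast error
    wind_err  = rng.normal(0.0, 120.0, N)   # wind forecast error
    solar_err = rng.normal(0.0,  40.0, N)   # solar forecast error

    # Net additional balancing need: load surprise minus renewable surplus.
    net_need = load_err - wind_err - solar_err

    confidence = 0.95   # user-specified confidence level
    up_capacity   = np.quantile(net_need, confidence)
    down_capacity = np.quantile(net_need, 1.0 - confidence)

    print(f"extra up-capacity needed   : {up_capacity:7.1f} MW")
    print(f"extra down-capacity needed : {down_capacity:7.1f} MW")
    ```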

  9. The Sequence of Events generator: A powerful tool for mission operations

    NASA Technical Reports Server (NTRS)

    Wobbe, Hubertus; Braun, Armin

    1994-01-01

    The functions and features of the sequence of events (SOE) and flight operations procedures (FOP) generator developed and used at DLR/GSOC for the positioning of EUTELSAT 2 satellites are presented. The SOE and FOP are the main operational documents that are prepared for nominal as well as for non-nominal mission execution. Their structure and application are described. Both of these documents are generated, validated, and maintained by a common software tool. Its main features and advantages are demonstrated. The tool has been improved continuously over the last 5 years. Due to its flexibility it can easily be applied to other projects and new features may be added.

  10. Description of an on-line decision-aid tool for generation-load balance control

    SciTech Connect

    Jourdin, P.; Vintache, P.; Heilbronn, B.; Lagrange, V. . Direction des Etudes et Recherches); Cartignies, E.; Millot, P. . Lab. d'Automatique Industrielle et Humaine)

    1994-02-01

    This paper presents the preliminary results of a design study, carried out at the Research Center of Electricite de France, of an on-line decision-aid tool for real-time operation. The tool aims to provide operators with an aid for on-line generation rescheduling when disturbances occur in the generation-load balance. Using artificial intelligence techniques, a method has been developed (based on a design-aid mock-up) which combines a knowledge base, heuristic rules, and classical algorithms. The first results, obtained from simulations of cases derived from actual operation, are very promising.

  11. Ring-tool profiling - graphical method in CATIA based on Generating trajectories theorem

    NASA Astrophysics Data System (ADS)

    Frumuşanu, G.; Teodor, V.; Oancea, N.

    2016-11-01

    Machining threads of large dimensions and multiple starts by turning is challenging. An alternative is to machine them by milling, and the most productive milling solution uses tools with an inner active surface, namely ring tools. For threads with multiple starts, the reciprocal enwrapped profile of the ring tool differs considerably from the shape of the thread's axial (normal) section. In this paper, we suggest a methodology for profiling the generating ring tool, based on a complementary theorem from the field of enwrapped surfaces. A graphical algorithm for finding the ring-tool profile, developed in the CATIA graphical environment, is applied to the concrete case of a trapezoidal thread. The graphical profiling solution is compared against an analytical solution in order to test the precision of the results. The graphical profiling method proves rigorous, easy to apply, and highly intuitive.

  12. A comparison of tools for the simulation of genomic next-generation sequencing data.

    PubMed

    Escalona, Merly; Rocha, Sara; Posada, David

    2016-08-01

    Computer simulation of genomic data has become increasingly popular for assessing and validating biological models or for gaining an understanding of specific data sets. Several computational tools for the simulation of next-generation sequencing (NGS) data have been developed in recent years, which could be used to compare existing and new NGS analytical pipelines. Here we review 23 of these tools, highlighting their distinct functionality, requirements and potential applications. We also provide a decision tree for the informed selection of an appropriate NGS simulation tool for the specific question at hand.

  13. A comparison of tools for the simulation of genomic next-generation sequencing data

    PubMed Central

    Escalona, Merly; Rocha, Sara; Posada, David

    2017-01-01

    Computer simulation of genomic data has become increasingly popular for assessing and validating biological models or to gain understanding about specific datasets. Multiple computational tools for the simulation of next-generation sequencing (NGS) data have been developed in recent years, which could be used to compare existing and new NGS analytical pipelines. Here we review 23 of these tools, highlighting their distinct functionality, requirements and potential applications. We also provide a decision tree for the informed selection of an appropriate NGS simulation tool for the specific question at hand. PMID:27320129

  14. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the
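
    The operator-overloading approach the study contrasts with source transformation can be shown in a few lines: a dual number carries a value and a derivative through overloaded arithmetic, which is why it adapts easily to modern language features even though it trails generated source code in speed. A minimal forward-mode sketch (the adjoint/reverse mode used by tools like TAF is more involved):

    ```python
    class Dual:
        """Forward-mode AD via operator overloading: tracks (value, derivative)."""

        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)

        __rmul__ = __mul__

    def f(x):
        # Any model code built from overloaded operations is differentiated
        # automatically -- no source transformation step is required.
        return 3 * x * x + 2 * x + 1

    x = Dual(2.0, 1.0)           # seed dx/dx = 1
    y = f(x)
    print(y.val, y.dot)          # 17.0 14.0  (f(2) = 17, f'(2) = 14)
    ```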

  15. A decision support tool for landfill methane generation and gas collection.

    PubMed

    Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart

    2015-09-01

    This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date no such tool has been available to provide landfill decision makers with clear and simplified information for evaluating biochemical processes within a landfill site, assessing gas production performance, and identifying potential remedies to any issues. The current lack of understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) a methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first-order decay model LandGEM; and (2) a landfill gas indicators score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is provided using multi-criteria analysis to calculate the sum of weighted scores for each indicator, with the weights calculated using an analytic hierarchy process. The tool is tested against five real scenarios for landfill sites in the UK, spanning good, average and poor landfill methane generation over a one-year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for enhancing the methane output rate. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site.
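
    LandGEM-style first-order decay assigns each year's buried waste an exponentially declining methane yield; a compact sketch of the annualized form (LandGEM proper subdivides acceptance years into tenths, and the waste masses and parameter defaults below are illustrative, not site data):

    ```python
    import math

    def landgem_methane(annual_waste, k=0.05, L0=170.0, year=None):
        """First-order decay estimate of methane generation (m^3/yr).

        annual_waste: waste masses (Mg) accepted in years 0, 1, 2, ...
        k: decay rate constant (1/yr); L0: methane potential (m^3/Mg).
        year: evaluation year (defaults to the year after last acceptance).
        """
        if year is None:
            year = len(annual_waste)
        total = 0.0
        for i, mass in enumerate(annual_waste):
            age = year - i
            if age > 0:
                total += k * L0 * mass * math.exp(-k * age)
        return total

    waste = [80_000] * 10        # 80,000 Mg/yr accepted for 10 years
    for t in (10, 20, 40):
        print(t, round(landgem_methane(waste, year=t)), "m^3/yr")
    # Generation declines after closure; the tool's methane output score
    # compares measured output against this kind of prediction.
    ```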

  16. Case-Based Pedagogy Using Student-Generated Vignettes: A Pre-Service Intercultural Awareness Tool

    ERIC Educational Resources Information Center

    Cournoyer, Amy

    2010-01-01

    This qualitative study investigated the effectiveness of case-based pedagogy as an instructional tool aimed at increasing cultural awareness and competence in the preparation of 18 pre-service and in-service students enrolled in an Intercultural Education course. Each participant generated a vignette based on an instructional challenge identified…

  17. Genomic tools to improve progress and preserve variation for future generations

    USDA-ARS?s Scientific Manuscript database

    Use of genomic tools has greatly decreased generation intervals and increased genetic progress in dairy cattle, but faster selection cycles can also increase rates of inbreeding per unit of time. Average pedigree inbreeding of Holstein cows increased from 4.6% in 2000 to 5.6% in 2009 to 6.6% in 201...

  18. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning

    ERIC Educational Resources Information Center

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa

    2011-01-01

    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  19. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...

  3. Analytical Tools and Databases for Metagenomics in the Next-Generation Sequencing Era

    PubMed Central

    Kim, Mincheol; Lee, Ki-Hyun; Yoon, Seok-Whan; Kim, Bong-Soo; Chun, Jongsik

    2013-01-01

    Metagenomics has become one of the indispensable tools in microbial ecology for the last few decades, and a new revolution in metagenomic studies is now about to begin, with the help of recent advances of sequencing techniques. The massive data production and substantial cost reduction in next-generation sequencing have led to the rapid growth of metagenomic research both quantitatively and qualitatively. It is evident that metagenomics will be a standard tool for studying the diversity and function of microbes in the near future, as fingerprinting methods did previously. As the speed of data accumulation is accelerating, bioinformatic tools and associated databases for handling those datasets have become more urgent and necessary. To facilitate the bioinformatics analysis of metagenomic data, we review some recent tools and databases that are used widely in this field and give insights into the current challenges and future of metagenomics from a bioinformatics perspective. PMID:24124405

  4. Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations

    NASA Technical Reports Server (NTRS)

    Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy

    2011-01-01

    This paper reviews three Next Generation Air Transportation System (NextGen) high-fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement for enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high-altitude environment, utilizing high-altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification, and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits, including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.

  5. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
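
    An explicitly stated objective with explicit constraints, as in the landing-gear layout example, maps directly onto a numerical optimizer; a toy stand-in with invented geometry limits (not the paper's actual formulation):

    ```python
    from scipy.optimize import minimize

    # Toy layout: choose gear position x (m aft of nose) and strut length L (m)
    # to minimize a weight-like cost while meeting clearance/stability limits.
    def cost(v):
        x, L = v
        return 0.8 * L**2 + 0.1 * x          # heavier for longer struts

    constraints = [
        # tip-back margin: gear must sit aft of the CG (assumed at 14.0 m)
        {"type": "ineq", "fun": lambda v: v[0] - 14.5},
        # tail-strike clearance improves with strut length and aft position
        {"type": "ineq", "fun": lambda v: 0.3 * v[1] + 0.05 * v[0] - 1.2},
    ]
    bounds = [(13.0, 18.0), (1.0, 4.0)]       # allowable x and L ranges

    res = minimize(cost, x0=[15.0, 2.5], bounds=bounds, constraints=constraints)
    print(res.x, round(res.fun, 3))           # optimal (x, L) and cost
    ```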

  6. General application of rapid 3-D digitizing and tool path generation for complex shapes

    SciTech Connect

    Kwok, K.S.; Loucks, C.S.; Driessen, B.J.

    1997-09-01

    A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation and experimental results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm in simulation studies. In actual experiments, a nose cone and a turbine blade were successfully scanned. A complex shaped turbine blade was successfully scanned and finished machined using these algorithms.

  7. Adaptive scallop height tool path generation for robot-based incremental sheet metal forming

    NASA Astrophysics Data System (ADS)

    Seim, Patrick; Möllensiep, Dennis; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd

    2016-10-01

    Incremental sheet metal forming is an emerging process for the production of individualized products or prototypes in low batch sizes and with short times to market. In these processes, the desired shape is produced by the incremental inward motion of the workpiece-independent forming tool in depth direction and its movement along the contour in lateral direction. Based on this shape production, the tool path generation is a key factor on e.g. the resulting geometric accuracy, the resulting surface quality, and the working time. This paper presents an innovative tool path generation based on a commercial milling CAM package considering the surface quality and working time. This approach offers the ability to define a specific scallop height as an indicator of the surface quality for specific faces of a component. Moreover, it decreases the required working time for the production of the entire component compared to the use of a commercial software package without this adaptive approach. Different forming experiments have been performed to verify the newly developed tool path generation. Mainly, this approach serves to solve the existing conflict of combining the working time and the surface quality within the process of incremental sheet metal forming.
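
    A scallop-height constraint translates into a maximum lateral stepover through simple chord geometry; a sketch using the common flat-surface approximation (an assumption here, since the abstract does not give the exact formulation used):

    ```python
    import math

    def max_stepover(tool_radius, scallop_height):
        """Largest lateral stepover (mm) keeping scallops below the target.

        Uses the flat-surface chord relation s = 2*sqrt(h*(2R - h)),
        often simplified to s ~ sqrt(8*R*h) for h << R.
        """
        h, r = scallop_height, tool_radius
        return 2.0 * math.sqrt(h * (2.0 * r - h))

    # Illustrative: 5 mm radius hemispherical forming tool, 0.02 mm target.
    print(round(max_stepover(5.0, 0.02), 3), "mm")   # ~0.894 mm
    ```

    Tightening the scallop target on selected faces shrinks the allowed stepover there, which is exactly the surface-quality/working-time trade-off the adaptive approach manages.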

  8. A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles

    NASA Technical Reports Server (NTRS)

    Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.

    2015-01-01

    Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
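
    At its core, a scenario-driven load profile is the sum, at each time step, of the power drawn by every component in its scheduled mode; a minimal sketch (components, modes, and the schedule are invented, and SPLAT itself produces symbolic Maple constraints rather than fixed schedules):

    ```python
    # Hypothetical power equipment list: component -> {mode: watts}.
    pel = {
        "flight_computer": {"idle": 10.0, "active": 25.0},
        "radio":           {"off": 0.0, "rx": 8.0, "tx": 45.0},
        "instrument":      {"off": 0.0, "survey": 60.0},
    }

    # Scenario: each component's mode at hourly time steps 0..5.
    scenario = {
        "flight_computer": ["idle", "active", "active", "active", "idle", "idle"],
        "radio":           ["rx",   "rx",     "off",    "tx",     "rx",   "rx"],
        "instrument":      ["off",  "off",    "survey", "survey", "off",  "off"],
    }

    # Electric load profile: total spacecraft draw (W) at each hour.
    profile = [sum(pel[c][scenario[c][t]] for c in pel) for t in range(6)]
    print(profile)   # [18.0, 33.0, 85.0, 130.0, 18.0, 18.0]
    ```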

  9. Tool path generation and back-off error analyze for robot milling process

    NASA Astrophysics Data System (ADS)

    Zhang, Bin; Tang, Chen; Wang, Ju; Wang, Qian

    2017-06-01

    An improved CC-route tool path generation method is presented for the robot milling process. A corresponding back-off error model is established based on the robot static elastic model and the ball-end cutter milling force model. Compared with the traditional CC-route method, the distance between adjacent constraint surfaces is adjusted dynamically, improving milling accuracy. According to the back-off error model, tool posture can be optimized using genetic algorithms, which is significant for reducing back-off error during the robot milling process.

  10. Tool-assisted mesh generation based on a tissue-growth model.

    PubMed

    Smirnov, A V

    2003-07-01

    A heuristic mesh generation technique is proposed that is based on the model of forced particle motion, an edgewise cell-splitting algorithm and a moving tool concept. The method differs from conventional mesh generators in that it uses outward growth of the mesh, in contrast to the inward growth used in traditional meshing techniques. The method does not require prior meshing and patching of two-dimensional (2D) boundary surfaces. Instead, it uses a pre-defined skeleton of one-dimensional segments, or an arbitrary tool motion in three-dimensional (3D) space. In this respect, the technique can be considered as a 3D extension of a 2D drawing tool and can find applications in virtual reality systems. The method also guarantees the smoothness of the outer boundary of the mesh at each step of mesh generation, which is not the case with traditional propagating-front methods. The approach is based on the model of tissue growth and is suitable for meshing complex networks of bifurcating branches commonly found in biological structures: blood vessels, lungs, neural networks, plants etc. The generated meshes were used in solving unsteady flow and particle transport problems in lungs.

  11. Letting defective babies die: who decides?

    PubMed

    Ellis, T S

    1982-01-01

    This article explores who, in the first instance, should decide whether to withhold or withdraw treatment from a defective newborn. The Article begins by defining the term "severely defective newborn" and discussing potential sources of liability for persons who decide to withhold or withdraw treatment. It next analyzes the ability of parents, physicians, and courts to make these treatment decisions. The Article concludes that, although parents and physicians may eventually make the specific determination, the legislature should at least set guidelines so that the decisions will be, in some measure, consistent, predictable, adequately informed, and in accord with community values.

  12. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    PubMed

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools and reliable support for decision-making processes. In this paper, indicators such as number of residents, population age, urban life expectancy, and total municipal solid waste were used as input variables in prognostic models in order to predict the amounts of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition, considering the Iasi, Romania case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated, and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction.
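
    The S-curve trend model found most suitable can be fit directly with standard tools; a sketch using a logistic curve and invented yearly tonnages standing in for the Iasi data:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def s_curve(t, cap, rate, t_mid):
        """Logistic S-curve: generation saturating toward `cap`."""
        return cap / (1.0 + np.exp(-rate * (t - t_mid)))

    # Illustrative annual MSW generation (kilotonnes), not the Iasi data.
    years = np.arange(2000, 2012)
    msw = np.array([61, 64, 68, 73, 79, 85, 90, 94, 97, 99, 100, 101], float)

    popt, _ = curve_fit(s_curve, years, msw, p0=[110.0, 0.3, 2005.0])
    cap, rate, t_mid = popt
    print(f"saturation ~{cap:.0f} kt/yr, midpoint {t_mid:.1f}")
    print("2015 forecast:", round(s_curve(2015, *popt), 1), "kt")
    ```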

  13. A tool for accelerating material calculations through the generation of highly efficient k-point grids

    NASA Astrophysics Data System (ADS)

    Mueller, Tim; Wisesa, Pandu

    The calculation of many material properties requires the evaluation of an integral over the Brillouin zone, which is commonly approximated by sampling a regular grid of points, known as k-points, in reciprocal space. We have developed an automated tool for generating k-point grids that significantly accelerates the calculation of material properties compared to commonly used methods. Our tool, which is being made freely available to the public, is capable of generating highly efficient k-point grids in a fraction of a second for any crystalline material. We present an overview of our method, benchmark results, and a discussion of how it can be integrated into a high-throughput computing environment.
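
    The baseline these efficient grids improve on is the uniform (Monkhorst-Pack-style) grid, which is simply a regular lattice of fractional reciprocal-space coordinates; a minimal sketch:

    ```python
    import numpy as np

    def regular_kpoint_grid(n1, n2, n3, shift=(0.0, 0.0, 0.0)):
        """Uniform n1 x n2 x n3 grid in fractional reciprocal coordinates."""
        i, j, k = np.meshgrid(np.arange(n1), np.arange(n2), np.arange(n3),
                              indexing="ij")
        frac = np.stack([(i + shift[0]) / n1,
                         (j + shift[1]) / n2,
                         (k + shift[2]) / n3], axis=-1)
        return frac.reshape(-1, 3)

    kpts = regular_kpoint_grid(4, 4, 4, shift=(0.5, 0.5, 0.5))
    weights = np.full(len(kpts), 1.0 / len(kpts))   # equal quadrature weights
    print(len(kpts))   # 64 points; symmetry reduction would shrink this
    ```

    Generalized grids of the kind the tool produces achieve comparable integration error with far fewer symmetry-distinct points than this uniform baseline.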

  14. Development tool for generating optimized HL7 libraries for embedded medical devices.

    PubMed

    Kim, Hang-Chan; Yi, Byoung-Kee; Kim, Ilkon

    2008-11-06

    Embedded medical devices with constraints on CPU power and memory capacity must communicate with a hospital information system (HIS) using only a few HL7 message types. It is therefore not desirable to deploy one big library that processes all HL7 messages to all types of devices. We present a development tool that automatically generates an optimized library with a small memory footprint that processes only the subset of HL7 messages needed by each target device type.
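
    HL7 v2 messages are pipe-delimited text, which is why a device-specific library can be small: it needs to handle only the segments its few message types use. A minimal sketch of such a subset parser (the sample message and segment choices are synthetic):

    ```python
    # Synthetic HL7 v2 observation message (segments separated by carriage returns).
    raw = (
        "MSH|^~\\&|DEVICE|WARD|HIS|HOSP|202401011200||ORU^R01|42|P|2.3\r"
        "PID|1||12345||DOE^JOHN\r"
        "OBX|1|NM|8867-4^HEART RATE||72|bpm\r"
    )

    def parse_subset(message):
        """Parse only the segments a small device cares about (MSH, PID, OBX)."""
        wanted = {"MSH", "PID", "OBX"}
        parsed = {}
        for segment in filter(None, message.split("\r")):
            fields = segment.split("|")
            if fields[0] in wanted:
                parsed.setdefault(fields[0], []).append(fields)
        return parsed

    msg = parse_subset(raw)
    # Field 5 of the first OBX segment holds the observation value.
    print(msg["OBX"][0][5], msg["OBX"][0][6])   # 72 bpm
    ```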

  15. Developing Next-Generation Telehealth Tools and Technologies: Patients, Systems, and Data Perspectives

    PubMed Central

    Filart, Rosemarie; Burgess, Lawrence P.; Lee, Insup; Poropatich, Ronald K.

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth? PMID:20043711

  16. Developing next-generation telehealth tools and technologies: patients, systems, and data perspectives.

    PubMed

    Ackerman, Michael J; Filart, Rosemarie; Burgess, Lawrence P; Lee, Insup; Poropatich, Ronald K

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth?

  17. Automatic generation of conceptual database design tools from data model specifications

    SciTech Connect

    Hong, Shuguang.

    1989-01-01

    The problems faced in the design and implementation of database software systems based on object-oriented data models are similar to those of other software design: difficult, complex, yet redundant effort. Automatic generation of database software systems has been proposed as a solution to these problems. In order to generate database software systems for a variety of object-oriented data models, two critical issues must be addressed: data model specification and software generation. SeaWeed is a software system that automatically generates conceptual database design tools from data model specifications. A meta model has been defined for the specification of a class of object-oriented data models. This meta model provides a set of primitive modeling constructs that can be used to express the semantics, or unique characteristics, of specific data models. Software reusability has been adopted for the software generation. The technique of design reuse is utilized to derive the requirement specification of the software to be generated from data model specifications. The mechanism of code reuse is used to produce the necessary reusable software components. This dissertation presents the research results of SeaWeed, including the meta model, data model specification, a formal representation of design reuse and code reuse, and the software generation paradigm.

  18. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    SciTech Connect

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel; Peake, Tom; Hayes, Colin

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
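
    A back-of-the-envelope version of the first-order estimate that WEST automates is sketched below in Python; the waste factors, construction types, and function names are hypothetical placeholders for illustration, not values or interfaces from the tool itself.

        # First-order waste estimate for a contaminated building stock.
        # All factors below are hypothetical placeholders, not EPA/WEST values.
        DEBRIS_M3_PER_SQFT = {"wood_frame": 0.08, "concrete": 0.12}  # demolition debris
        STRIP_M3_PER_SQFT = 0.004                                    # surface-strip waste

        def estimate_waste(buildings, demolish_fraction):
            """buildings: list of (construction_type, square_feet) tuples.
            demolish_fraction: share of stock demolished; the rest is
            decontaminated by stripping. Returns (demolition_m3, decon_m3)."""
            demo = decon = 0.0
            for ctype, sqft in buildings:
                demo += demolish_fraction * sqft * DEBRIS_M3_PER_SQFT[ctype]
                decon += (1.0 - demolish_fraction) * sqft * STRIP_M3_PER_SQFT
            return demo, decon

        stock = [("wood_frame", 120_000), ("concrete", 450_000)]
        print(estimate_waste(stock, demolish_fraction=0.3))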

  19. Criteria for deciding about forestry research programs

    Treesearch

    Robert Z. Callaham

    1981-01-01

    In early 1979, the Forest Service, U.S. Department of Agriculture, was required to decide several significant issues affecting its future research program. These decisions were in response to requirements of the Forest and Rangeland Renewable Resources Planning Act of 1974 (RPA). The decisions required information that was neither available nor assembled. Most...

  20. Helping Youth Decide: A Workshop Guide.

    ERIC Educational Resources Information Center

    Duquette, Donna Marie; Boo, Katherine

    This guide was written to complement the publication "Helping Youth Decide," a manual designed to help parents develop effective parent-child communication and help their children make responsible decisions during the adolescent years. The workshop guide is intended to assist people who work with families to provide additional information and…

  1. Automatic generation of bioinformatics tools for predicting protein–ligand binding sites

    PubMed Central

    Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-01-01

    Motivation: Predictive tools that model protein–ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein–ligand binding predictive tools would be useful. Results: We developed a system for automatically generating protein–ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5–1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. Availability and implementation: The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26545824

  2. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  3. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.
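
    The diagnosis step described above can be illustrated with a toy questionnaire scorer that maps weighted answers to a migration plan; the business areas, weights, and threshold below are invented for the sketch and are not those of the published tool.

        # Toy diagnosis: score answers (0-5) per business area and recommend
        # SaaS adoption where the weighted score clears a threshold.
        # Areas, weights, and the 3.0 cutoff are illustrative assumptions.
        WEIGHTS = {"email": 1.0, "crm": 1.5, "accounting": 1.2}

        def cloud_road(answers, threshold=3.0):
            """answers: {area: [scores]} -> ordered list of areas to migrate."""
            scores = {a: WEIGHTS[a] * sum(v) / len(v) for a, v in answers.items()}
            return [a for a, s in sorted(scores.items(), key=lambda kv: -kv[1])
                    if s >= threshold]

        print(cloud_road({"email": [5, 4], "crm": [2, 3], "accounting": [4, 4]}))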

  4. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    PubMed

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
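
    A per-ligand model-building step of the kind such a pipeline automates might look roughly like the following scikit-learn sketch; the synthetic feature matrix stands in for real residue-environment descriptors, and all names here are assumptions rather than the authors' implementation.

        # Train a per-ligand binding-site classifier, one model per ligand.
        # Features are random placeholders for residue-environment descriptors.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 20))        # residue feature vectors (synthetic)
        y = rng.integers(0, 2, size=500)      # 1 = residue binds the target ligand

        def build_ligand_tool(X, y):
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
            return clf.fit(X, y)

        model = build_ligand_tool(X, y)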

  5. Simulation Tool to Assess Mechanical and Electrical Stresses on Wind Turbine Generators: Preprint

    SciTech Connect

    Singh, M.; Muljadi, E.; Gevorgian, V.; Jonkman, J.

    2013-10-01

    Wind turbine generators (WTGs) consist of many different components to convert kinetic energy of the wind into electrical energy for end users. Wind energy is harnessed to provide mechanical torque for driving the shaft of the electrical generator. The conversion from wind power to mechanical power is governed by the aerodynamic conversion process. The aerodynamic-electrical conversion efficiency of a WTG is influenced by the efficiency of the blades, the gearbox, the generator, and the power converter. This paper describes the use of MATLAB/Simulink to simulate the electrical and grid-related aspects of a WTG, coupled with the FAST aero-elastic wind turbine computer-aided engineering tool to simulate the aerodynamic and mechanical aspects. The combination of the two enables studies involving both the electrical and mechanical aspects of a WTG. This digest includes some examples of the capabilities of the FAST and MATLAB coupling, namely the effects of electrical faults on the blade moments.

  6. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of information and knowledge. However, the impact of such standards has been limited by the scarce availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been designed generically, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. These properties stem from combining standards for the representation of generic user interfaces with model-driven engineering techniques.

  7. What can next generation sequencing do for you? Next generation sequencing as a valuable tool in plant research.

    PubMed

    Bräutigam, A; Gowik, U

    2010-11-01

    Next generation sequencing (NGS) technologies have opened fascinating opportunities for the analysis of plants with and without a sequenced genome on a genomic scale. During the last few years, NGS methods have become widely available and cost effective. They can be applied to a wide variety of biological questions, from the sequencing of complete eukaryotic genomes and transcriptomes, to the genome-scale analysis of DNA-protein interactions. In this review, we focus on the use of NGS for plant transcriptomics, including gene discovery, transcript quantification and marker discovery for non-model plants, as well as transcript annotation and quantification, small RNA discovery and antisense transcription analysis for model plants. We discuss the experimental design for analysis of plants with and without a sequenced genome, including considerations on sampling, RNA preparation, sequencing platforms and bioinformatics tools for data analysis. NGS technologies offer exciting new opportunities for the plant sciences, especially for work on plants without a sequenced genome, since large sequence resources can be generated at moderate cost.

  8. Future application of e-beam repair tool beyond 3X generation

    NASA Astrophysics Data System (ADS)

    Kanamitsu, Shingo; Hirano, Takashi

    2010-05-01

    Repair technology is currently one of the key factors in the mask-making process for reducing turnaround time (TAT) and enhancing yield. Since its commercial release, the electron-beam (EB) repair tool has been widely used in production lines and has contributed to high-quality repairs. It is not guaranteed, however, that conventional machines can keep up with the continuing pattern-shrink trend. At the 2X-nm generation node, advanced exposure techniques are expected to be adopted, which will inevitably demand higher repair-tool specifications. A simple lithography simulation predicts that a repair accuracy of 5 nm is indispensable for 2X-nm generation patterns, implying the need for a higher-class machine. Generally, the error budget of an EB repair tool comprises three to four components: mechanical stability, electrical (charging) uniformity, process stability, and graphical quality including software capability. If the errors from these components are reduced, overall repair accuracy improves. A proposal for reducing these errors, including a new machine concept, was issued last year by the tool vendor. We have conducted several kinds of experiments to confirm the performance of the new machine. In this paper, we report the experimental results and consider which components contribute most effectively to repair accuracy. We have also evaluated the machine's practical utility for the 2X-nm node by verifying its application to several 3X-nm production masks.

  9. Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data

    PubMed Central

    Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin

    2017-01-01

    Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, on a different platform, and by expert-based review. In addition, we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve reproducibility of the results in the context of multithreading. PMID:28233799

  10. Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data

    NASA Astrophysics Data System (ADS)

    Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin

    2017-02-01

    Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, on a different platform, and by expert-based review. In addition, we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve reproducibility of the results in the context of multithreading.
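
    The sensitivity/precision trade-off reported above can be computed directly by comparing a caller's output with the validated mutation set; a minimal sketch follows, with illustrative variant tuples.

        # Score one caller's variant list against a validated truth set.
        def evaluate_caller(called, truth):
            """called, truth: sets of (chrom, pos, ref, alt) tuples."""
            tp = len(called & truth)
            sensitivity = tp / len(truth) if truth else 0.0   # recall
            precision = tp / len(called) if called else 0.0
            return sensitivity, precision

        truth = {("chr3", 128204951, "C", "T"), ("chr21", 36252877, "G", "A")}
        called = {("chr3", 128204951, "C", "T"), ("chr7", 101, "A", "G")}
        print(evaluate_caller(called, truth))  # (0.5, 0.5)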

  11. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  12. Metagenomics: Tools and Insights for Analyzing Next-Generation Sequencing Data Derived from Biodiversity Studies

    PubMed Central

    Oulas, Anastasis; Pavloudi, Christina; Polymenakou, Paraskevi; Pavlopoulos, Georgios A; Papanikolaou, Nikolas; Kotoulas, Georgios; Arvanitidis, Christos; Iliopoulos, Ioannis

    2015-01-01

    Advances in next-generation sequencing (NGS) have allowed significant breakthroughs in microbial ecology studies. This has led to the rapid expansion of research in the field and the establishment of “metagenomics”, often defined as the analysis of DNA from microbial communities in environmental samples without prior need for culturing. Many metagenomics statistical/computational tools and databases have been developed in order to allow the exploitation of the huge influx of data. In this review article, we provide an overview of the sequencing technologies and how they are uniquely suited to various types of metagenomic studies. We focus on the currently available bioinformatics techniques, tools, and methodologies for performing each individual step of a typical metagenomic dataset analysis. We also provide future trends in the field with respect to tools and technologies currently under development. Moreover, we discuss data management, distribution, and integration tools that are capable of performing comparative metagenomic analyses of multiple datasets using well-established databases, as well as commonly used annotation standards. PMID:25983555

  13. A graphical solution in CATIA for profiling end mill tool which generates a helical surface

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Berbinschi, S.; Susac, F.; Oancea, N.

    2017-08-01

    A helical flute, which belongs to a helical cylindrical surface with constant pitch, can be generated using end mill tools. Tools of this type are easier to make than side mills and represent a less expensive solution. End mill profiling may be carried out using the classical theorems of surface enveloping in their analytical form, such as the Olivier theorem or the Nikolaev method. In this paper, an algorithm developed in the CATIA design environment is proposed for profiling this type of tool. The proposed solution is intuitive, rigorous, and fast owing to the capabilities of the graphical design environment. Numerical examples are considered in order to validate the quality of the method.

  14. Automated protein motif generation in the structure-based protein function prediction tool ProMOL.

    PubMed

    Osipovitch, Mikhail; Lambrecht, Mitchell; Baker, Cameron; Madha, Shariq; Mills, Jeffrey L; Craig, Paul A; Bernstein, Herbert J

    2015-12-01

    ProMOL, a plugin for the PyMOL molecular graphics system, is a structure-based protein function prediction tool. ProMOL includes a set of routines for building motif templates that are used for screening query structures for enzyme active sites. Previously, each motif template was generated manually and required supervision in the optimization of parameters for sensitivity and selectivity. We developed an algorithm and workflow for the automation of motif building and testing routines in ProMOL. The algorithm uses a set of empirically derived parameters for optimization and requires little user intervention. The automated motif generation algorithm was first tested in a performance comparison with a set of manually generated motifs based on identical active sites from the same 112 PDB entries. The two sets of motifs were equally effective in identifying alignments with homologs and in rejecting alignments with unrelated structures. A second set of 296 active site motifs were generated automatically, based on Catalytic Site Atlas entries with literature citations, as an expansion of the library of existing manually generated motif templates. The new motif templates exhibited comparable performance to the existing ones in terms of hit rates against native structures, homologs with the same EC and Pfam designations, and randomly selected unrelated structures with a different EC designation at the first EC digit, as well as in terms of RMSD values obtained from local structural alignments of motifs and query structures. This research is supported by NIH grant GM078077.
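
    RMSD values for motif/query alignments of this kind are conventionally computed after least-squares superposition; the generic Kabsch-style sketch below illustrates that calculation with numpy and is not ProMOL's own routine (coordinates are made up).

        # Least-squares superposition (Kabsch) + RMSD between two point sets.
        import numpy as np

        def kabsch_rmsd(P, Q):
            """P, Q: (N, 3) arrays of matched atom coordinates."""
            P = P - P.mean(axis=0)
            Q = Q - Q.mean(axis=0)
            U, _, Vt = np.linalg.svd(P.T @ Q)
            d = np.sign(np.linalg.det(U @ Vt))        # avoid improper rotation
            R = U @ np.diag([1.0, 1.0, d]) @ Vt
            diff = P @ R - Q
            return np.sqrt((diff ** 2).sum() / len(P))

        P = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
        Q = P @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]])  # rotated copy
        print(round(kabsch_rmsd(P, Q), 6))  # ~0.0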

  15. Customised 3D Printing: An Innovative Training Tool for the Next Generation of Orbital Surgeons.

    PubMed

    Scawn, Richard L; Foster, Alex; Lee, Bradford W; Kikkawa, Don O; Korn, Bobby S

    2015-01-01

    Additive manufacturing or 3D printing is the process by which three dimensional data fields are translated into real-life physical representations. 3D printers create physical printouts using heated plastics in a layered fashion resulting in a three-dimensional object. We present a technique for creating customised, inexpensive 3D orbit models for use in orbital surgical training using 3D printing technology. These models allow trainee surgeons to perform 'wet-lab' orbital decompressions and simulate upcoming surgeries on orbital models that replicate a patient's bony anatomy. We believe this represents an innovative training tool for the next generation of orbital surgeons.

  16. Synthetic biology in mammalian cells: Next generation research tools and therapeutics

    PubMed Central

    Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A

    2014-01-01

    Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884

  17. A web based tool for storing and visualising data generated within a smart home.

    PubMed

    McDonald, H A; Nugent, C D; Moore, G; Finlay, D D; Hallberg, J

    2011-01-01

    There is a growing need to re-assess the current approaches available to researchers for storing and managing heterogeneous data generated within a smart home environment. In our current work we have developed the homeML Application, a web-based tool to support researchers engaged in smart home research as they perform experiments. Within this paper the homeML Application is presented, including its fundamental components: the homeML Repository and the homeML Toolkit. Results from a usability study conducted with 10 computer science researchers are presented; the initial results have been positive.

  18. ODG: Omics database generator - a tool for generating, querying, and analyzing multi-omics comparative databases to facilitate biological understanding.

    PubMed

    Guhlin, Joseph; Silverstein, Kevin A T; Zhou, Peng; Tiffin, Peter; Young, Nevin D

    2017-08-10

    Rapid generation of omics data in recent years has resulted in vast amounts of disconnected datasets without systemic integration and knowledge building, while individual groups have made customized, annotated datasets available on the web with few ways to link them to in-lab datasets. With so many research groups generating their own data, the ability to relate it to the larger genomic and comparative genomic context is becoming increasingly crucial to make full use of the data. The Omics Database Generator (ODG) allows users to create customized databases that utilize published genomics data integrated with experimental data which can be queried using a flexible graph database. When provided with omics and experimental data, ODG will create a comparative, multi-dimensional graph database. ODG can import definitions and annotations from other sources such as InterProScan, the Gene Ontology, ENZYME, UniPathway, and others. This annotation data can be especially useful for studying new or understudied species for which transcripts have only been predicted, and rapidly give additional layers of annotation to predicted genes. In better studied species, ODG can perform syntenic annotation translations or rapidly identify characteristics of a set of genes or nucleotide locations, such as hits from an association study. ODG provides a web-based user-interface for configuring the data import and for querying the database. Queries can also be run from the command-line and the database can be queried directly through programming language hooks available for most languages. ODG supports most common genomic formats as well as generic, easy to use tab-separated value format for user-provided annotations. ODG is a user-friendly database generation and query tool that adapts to the supplied data to produce a comparative genomic database or multi-layered annotation database. ODG provides rapid comparative genomic annotation and is therefore particularly useful for non-model or
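
    The kind of graph traversal a flexible graph database supports can be imitated on a toy property graph; the networkx sketch below uses invented node and edge labels purely to illustrate linking genes to annotations, and is not ODG's actual schema or query API.

        # Toy multi-omics graph: genes -> GO terms / pathways, queried by traversal.
        import networkx as nx

        g = nx.DiGraph()
        g.add_node("GeneA", kind="gene", species="M. truncatula")
        g.add_node("GO:0006355", kind="go_term", name="regulation of transcription")
        g.add_node("UPA00223", kind="pathway")
        g.add_edge("GeneA", "GO:0006355", relation="annotated_with")
        g.add_edge("GeneA", "UPA00223", relation="member_of")

        def annotations(graph, gene, relation):
            """All targets linked to `gene` by `relation`."""
            return [v for _, v, d in graph.out_edges(gene, data=True)
                    if d["relation"] == relation]

        print(annotations(g, "GeneA", "annotated_with"))  # ['GO:0006355']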

  19. A Web Tool for Generating High Quality Machine-readable Biological Pathways

    PubMed Central

    Ramirez-Gaona, Miguel; Marcu, Ana; Pon, Allison; Grant, Jason; Wu, Anthony; Wishart, David S.

    2017-01-01

    PathWhiz is a web server built to facilitate the creation of colorful, interactive, visually pleasing pathway diagrams that are rich in biological information. The pathways generated by this online application are machine-readable and fully compatible with essentially all web-browsers and computer operating systems. It uses a specially developed, web-enabled pathway drawing interface that permits the selection and placement of different combinations of pre-drawn biological or biochemical entities to depict reactions, interactions, transport processes and binding events. This palette of entities consists of chemical compounds, proteins, nucleic acids, cellular membranes, subcellular structures, tissues, and organs. All of the visual elements in it can be interactively adjusted and customized. Furthermore, because this tool is a web server, all pathways and pathway elements are publicly accessible. This kind of pathway "crowd sourcing" means that PathWhiz already contains a large and rapidly growing collection of previously drawn pathways and pathway elements. Here we describe a protocol for the quick and easy creation of new pathways and the alteration of existing pathways. To further facilitate pathway editing and creation, the tool contains replication and propagation functions. The replication function allows existing pathways to be used as templates to create or edit new pathways. The propagation function allows one to take an existing pathway and automatically propagate it across different species. Pathways created with this tool can be "re-styled" into different formats (KEGG-like or text-book like), colored with different backgrounds, exported to BioPAX, SBGN-ML, SBML, or PWML data exchange formats, and downloaded as PNG or SVG images. The pathways can easily be incorporated into online databases, integrated into presentations, posters or publications, or used exclusively for online visualization and exploration. This protocol has been successfully applied to

  20. A Web Tool for Generating High Quality Machine-readable Biological Pathways.

    PubMed

    Ramirez-Gaona, Miguel; Marcu, Ana; Pon, Allison; Grant, Jason; Wu, Anthony; Wishart, David S

    2017-02-08

    PathWhiz is a web server built to facilitate the creation of colorful, interactive, visually pleasing pathway diagrams that are rich in biological information. The pathways generated by this online application are machine-readable and fully compatible with essentially all web-browsers and computer operating systems. It uses a specially developed, web-enabled pathway drawing interface that permits the selection and placement of different combinations of pre-drawn biological or biochemical entities to depict reactions, interactions, transport processes and binding events. This palette of entities consists of chemical compounds, proteins, nucleic acids, cellular membranes, subcellular structures, tissues, and organs. All of the visual elements in it can be interactively adjusted and customized. Furthermore, because this tool is a web server, all pathways and pathway elements are publicly accessible. This kind of pathway "crowd sourcing" means that PathWhiz already contains a large and rapidly growing collection of previously drawn pathways and pathway elements. Here we describe a protocol for the quick and easy creation of new pathways and the alteration of existing pathways. To further facilitate pathway editing and creation, the tool contains replication and propagation functions. The replication function allows existing pathways to be used as templates to create or edit new pathways. The propagation function allows one to take an existing pathway and automatically propagate it across different species. Pathways created with this tool can be "re-styled" into different formats (KEGG-like or text-book like), colored with different backgrounds, exported to BioPAX, SBGN-ML, SBML, or PWML data exchange formats, and downloaded as PNG or SVG images. The pathways can easily be incorporated into online databases, integrated into presentations, posters or publications, or used exclusively for online visualization and exploration. This protocol has been successfully applied to

  1. Guideline for Deciding Physical Parameters for Acrobot

    NASA Astrophysics Data System (ADS)

    Baba, Atsushi; Watanabe, Ryo

    This paper presents a guideline for deciding physical parameters for the Acrobot, a typical example of two-link underactuated robots. A set of physical parameters for the Acrobot is determined using an evaluation function, which is derived by analyzing the robustness of an existing swing-up controller for the Acrobot. A typical set of physical parameters used in many papers is then evaluated using our evaluation function, and the validity and performance of our proposed guideline are studied with a series of simulations.

  2. Experiences with the application of the ADIC automatic differentiation tool to the CSCMDO 3-D volume grid generation code

    SciTech Connect

    Bischof, C.H.; Mauer, A.; Jones, W.T.

    1995-12-31

    Automatic differentiation (AD) is a methodology for developing reliable sensitivity-enhanced versions of arbitrary computer programs with little human effort. It can vastly accelerate the use of advanced simulation codes in multidisciplinary design optimization, since the time for generating and verifying derivative codes is greatly reduced. In this paper, we report on the application of the recently developed ADIC automatic differentiation tool for ANSI C programs to the CSCMDO multiblock three-dimensional volume grid generator. The ADIC-generated code can easily be interfaced with Fortran derivative codes generated with the ADIFOR AD tool for FORTRAN 77 programs, thus providing efficient sensitivity-enhancement techniques for multilanguage, multidiscipline problems.
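
    The core idea of automatic differentiation, propagating exact derivative values through each arithmetic operation rather than approximating by finite differences, can be shown with a tiny forward-mode dual-number class; this is a generic illustration of the technique, unrelated to the C code ADIC generates.

        # Forward-mode AD with dual numbers: each value carries (value, derivative).
        class Dual:
            def __init__(self, val, dot=0.0):
                self.val, self.dot = val, dot
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.dot + o.dot)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val,
                            self.dot * o.val + self.val * o.dot)  # product rule
            __rmul__ = __mul__

        def f(x):            # f(x) = 3x^2 + 2x, so f'(2) = 14
            return 3 * x * x + 2 * x

        x = Dual(2.0, 1.0)   # seed dx/dx = 1
        print(f(x).val, f(x).dot)  # 16.0 14.0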

  3. Gravity Waves Generated by Convection: A New Idealized Model Tool and Direct Validation with Satellite Observations

    NASA Astrophysics Data System (ADS)

    Alexander, M. Joan; Stephan, Claudia

    2015-04-01

    In climate models, gravity waves remain too poorly resolved to be directly modelled. Instead, simplified parameterizations are used to include gravity wave effects on model winds. A few climate models link some of the parameterized waves to convective sources, providing a mechanism for feedback between changes in convection and gravity wave-driven changes in circulation in the tropics and above high-latitude storms. These convective wave parameterizations are based on limited case studies with cloud-resolving models, but they are poorly constrained by observational validation, and tuning parameters have large uncertainties. Our new work distills results from complex, full-physics cloud-resolving model studies to essential variables for gravity wave generation. We use the Weather Research and Forecasting (WRF) model to study relationships between precipitation, latent heating/cooling and other cloud properties and the spectrum of gravity wave momentum flux above midlatitude storm systems. Results show the gravity wave spectrum is surprisingly insensitive to the representation of microphysics in WRF. This is good news for use of these models for gravity wave parameterization development since microphysical properties are a key uncertainty. We further use the full-physics cloud-resolving model as a tool to directly link observed precipitation variability to gravity wave generation. We show that waves in an idealized model forced with radar-observed precipitation can quantitatively reproduce instantaneous satellite-observed features of the gravity wave field above storms, which is a powerful validation of our understanding of waves generated by convection. The idealized model directly links observations of surface precipitation to observed waves in the stratosphere, and the simplicity of the model permits deep/large-area domains for studies of wave-mean flow interactions. This unique validated model tool permits quantitative studies of gravity wave driving of regional

  4. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
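
    Inductive rule generation from expert-classified data, as the SDB performs, can be imitated with a decision tree whose learned structure is dumped as human-readable rules; the sensor features and labels below are fabricated for the sketch.

        # Learn rules from expert-classified sensor data, then print them.
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Columns: pressure (kPa), temperature (C); label: expert classification.
        X = [[101, 20], [250, 85], [240, 90], [99, 22], [255, 88], [100, 19]]
        y = ["nominal", "fault", "fault", "nominal", "fault", "nominal"]

        tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
        print(export_text(tree, feature_names=["pressure", "temperature"]))
        # e.g. "|--- pressure <= 170.5 ... class: nominal" -- a rule base that
        # a monitoring system can apply to new sensor readings.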

  5. Developing a tool to estimate water withdrawal and consumption in electricity generation in the United States.

    SciTech Connect

    Wu, M.; Peng, J.

    2011-02-24

    Freshwater consumption for electricity generation is projected to increase dramatically in the next couple of decades in the United States. The increased demand is likely to further strain freshwater resources in regions where water has already become scarce. Meanwhile, the automotive industry has stepped up its research, development, and deployment efforts on electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs). Large-scale, escalated production of EVs and PHEVs nationwide would require increased electricity production, and so meeting the water demand becomes an even greater challenge. The goal of this study is to provide a baseline assessment of freshwater use in electricity generation in the United States and at the state level. Freshwater withdrawal and consumption requirements for power generated from fossil, nonfossil, and renewable sources via various technologies and by use of different cooling systems are examined. A data inventory has been developed that compiles data from government statistics, reports, and literature issued by major research institutes. A spreadsheet-based model has been developed to conduct the estimates by means of a transparent and interactive process. The model further allows us to project future water withdrawal and consumption in electricity production under the forecasted increases in demand. This tool is intended to provide decision makers with the means to make a quick comparison among various fuel, technology, and cooling system options. The model output can be used to address water resource sustainability when considering new projects or expansion of existing plants.
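
    The model's core calculation is essentially generation multiplied by technology- and cooling-specific water factors; the sketch below uses invented factor values, since the real withdrawal and consumption factors come from the study's data inventory.

        # Water use = generation (MWh) x factor (gal/MWh), per fuel+cooling pair.
        # Factor values below are placeholders, not the report's inventory data.
        FACTORS = {  # (fuel, cooling): (withdrawal_gal_per_MWh, consumption_gal_per_MWh)
            ("coal", "once_through"): (27000, 300),
            ("coal", "recirculating"): (600, 500),
            ("ngcc", "recirculating"): (250, 200),
        }

        def water_use(mix):
            """mix: {(fuel, cooling): MWh generated} -> totals in gallons."""
            w = sum(FACTORS[k][0] * mwh for k, mwh in mix.items())
            c = sum(FACTORS[k][1] * mwh for k, mwh in mix.items())
            return {"withdrawal_gal": w, "consumption_gal": c}

        print(water_use({("coal", "recirculating"): 1e6, ("ngcc", "recirculating"): 5e5}))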

  6. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    NASA Astrophysics Data System (ADS)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunamis because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of population and infrastructure on water-body shores and in downstream valleys could lead to catastrophic consequences. In order to assess this phenomenon and the induced risks comprehensively, we have developed a tool that allows the construction of the landslide geometry and is able to simulate its propagation, the generation and propagation of the wave, and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab® environment, with a graphical user interface (GUI) for selecting the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is simulated using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is handled by the combination of the two latter sets of equations. The intensity map is based on the flooding criterion used in Switzerland, provided by the OFEG, and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less well-known cases, various failure-plane geometries can be built automatically within a given range, and thus a multi-scenario approach is used. In any case, less well-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi
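
    The wave-propagation step rests on the shallow water equations advanced with the Lax-Friedrichs scheme; a one-dimensional, flat-bottom toy version of a single time step is sketched below (the actual tool is two-dimensional and handles wetting and drying, which this sketch omits).

        # One Lax-Friedrichs step for the 1-D shallow water equations
        # U = (h, hu), F(U) = (hu, hu^2 + g h^2 / 2), flat bottom, g = 9.81.
        import numpy as np

        def lax_friedrichs_step(h, hu, dx, dt, g=9.81):
            u = np.where(h > 0, hu / np.maximum(h, 1e-12), 0.0)
            F1, F2 = hu, hu * u + 0.5 * g * h ** 2
            def lf(q, F):  # periodic boundaries, for the sketch only
                qp, qm = np.roll(q, -1), np.roll(q, 1)
                Fp, Fm = np.roll(F, -1), np.roll(F, 1)
                return 0.5 * (qp + qm) - dt / (2 * dx) * (Fp - Fm)
            return lf(h, F1), lf(hu, F2)

        x = np.linspace(0, 100, 200)
        h = 1.0 + 0.1 * np.exp(-((x - 50) ** 2) / 20)   # initial water hump
        hu = np.zeros_like(h)                           # water initially at rest
        h, hu = lax_friedrichs_step(h, hu, dx=0.5, dt=0.01)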

  7. Decidability problems for meta-R-functions

    SciTech Connect

    Losovik, L.P.; Drobyshev, P.V.

    1995-09-01

    In this article we consider classes of functions defined by R-transformers, which are machines that sequentially process real numbers represented in the binary number system. The class of real functions defined by R-transformers includes all continuous and some discontinuous functions. The closure of this class under superposition produces the wider class of real meta-R-functions. Such functions are defined by finite sequences of R-transformers. Here we examine the class of meta-R-functions and specifically the class of finite meta-R-functions. The latter are defined by finite sequences of finite R-transformers. Decidability of the equivalence problem in the class of functions defined by finite R-transformers, i.e., the class of finite R-functions, is proved elsewhere. Here we generalize this result to the class of finite meta-R-functions. We investigate not only the equivalence problem, but also the monotonicity and continuity problems. The proof is by reduction to the decidable nonemptiness problem for nondeterministic bounded-mode finite transformers with finite-turnaround counters on labeled trees. We also consider the general properties of the class of meta-R-functions.

  8. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    PubMed Central

    Braisted, John C; Kuntumalla, Srilatha; Vogel, Christine; Marcotte, Edward M; Rodrigues, Alan R; Wang, Rong; Huang, Shih-Ting; Ferlanti, Erik S; Saeed, Alexander I; Fleischmann, Robert D; Peterson, Scott N; Pieper, Rembert

    2008-01-01

    Background: Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results: The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a utility to merge multiple
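
    The APEX abundance computation itself reduces to normalizing each protein's observed spectral count by its expected-detection factor Oi; a minimal sketch of that formula follows, with invented counts and Oi values (the real Oi come from the tool's classifier).

        # APEX-style abundance: n_i / O_i, normalized across proteins and scaled
        # by an assumed total of C molecules per cell (C and O_i are placeholders).
        def apex_abundances(spectral_counts, o_values, C=2e6):
            """spectral_counts, o_values: {protein: value} -> {protein: molecules}."""
            ratios = {p: spectral_counts[p] / o_values[p] for p in spectral_counts}
            total = sum(ratios.values())
            return {p: C * r / total for p, r in ratios.items()}

        counts = {"P1": 120, "P2": 30, "P3": 75}       # observed MS/MS spectra
        oi = {"P1": 12.0, "P2": 2.5, "P3": 7.5}        # expected peptides detected
        print(apex_abundances(counts, oi))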

  9. Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools

    PubMed Central

    Ozcan, Aydogan

    2014-01-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550

  10. Tool for Generation of MAC/GMC Representative Unit Cell for CMC/PMC Analysis

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Pineda, Evan J.

    2016-01-01

    This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) 4.0. This tool is especially useful in analyzing ceramic matrix composites (CMCs), where higher fidelity with improved accuracy of local response is needed. The tool, however, can be used for analyzing polymer matrix composites (PMCs) as well. MAC/GMC 4.0 is a composite material and laminate analysis software developed at NASA Glenn Research Center. The software package has been built around the concept of the generalized method of cells (GMC). The computer code is developed with a user friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermomechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that generates a number of different user-defined repeating unit cells (RUCs). In addition, the code has provisions for generation of a MAC/GMC-compatible input text file that can be merged with any MAC/GMC input file tailored to analyze composite materials. Although the primary intention was to address the three different constituents and phases that are usually present in CMCs, namely fibers, matrix, and interphase, it can be easily modified to address two-phase polymer matrix composite (PMC) materials where an interphase is absent. Currently, the

  11. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data.

    PubMed

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-12-15

    Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth from which genotypes cannot be called with high certainty. We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype likelihoods instead of genotypes and thereby takes the inherent uncertainty of the genotypes into account. Using both simulated and real data, we show that NgsRelate provides markedly better estimates for low-depth NGS data than two state-of-the-art genotype-based methods. NgsRelate is implemented in C++ and is available under the GNU license at www.popgen.dk/software. © The Author 2015. Published by Oxford University Press.

  12. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data

    PubMed Central

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    Motivation: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth from which genotypes cannot be called with high certainty. Results: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype likelihoods instead of genotypes and thereby takes the inherent uncertainty of the genotypes into account. Using both simulated and real data, we show that NgsRelate provides markedly better estimates for low-depth NGS data than two state-of-the-art genotype-based methods. Availability: NgsRelate is implemented in C++ and is available under the GNU license at www.popgen.dk/software. Contact: ida@binf.ku.dk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26323718
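
    The genotype-likelihood idea can be illustrated with a toy maximum-likelihood search over IBD coefficients; the sketch below is a didactic re-implementation of the principle, not NgsRelate's algorithm, and assumes known population allele frequencies.

        # Toy genotype-likelihood relatedness: choose IBD probabilities
        # k = (k0, k1, k2) maximizing the likelihood of two samples' genotype
        # likelihoods. Didactic only; not NgsRelate's actual algorithm.
        import math

        def pair_prob(g1, g2, k, p):
            """P(g1, g2 | k) for genotypes coded 0/1/2 (ref-hom/het/alt-hom),
            ref-allele frequency p, assuming Hardy-Weinberg equilibrium."""
            q = 1 - p
            gp = [p * p, 2 * p * q, q * q]
            ibd0 = gp[g1] * gp[g2]                     # no alleles shared IBD
            T1 = {(0, 0): p**3, (0, 1): p*p*q, (1, 0): p*p*q, (1, 1): p*q,
                  (1, 2): p*q*q, (2, 1): p*q*q, (2, 2): q**3}
            ibd1 = T1.get((g1, g2), 0.0)               # one allele shared IBD
            ibd2 = gp[g1] if g1 == g2 else 0.0         # both alleles shared
            return k[0] * ibd0 + k[1] * ibd1 + k[2] * ibd2

        def loglik(gl1, gl2, freqs, k):
            return sum(math.log(sum(gl1[s][a] * gl2[s][b] * pair_prob(a, b, k, p)
                                    for a in range(3) for b in range(3)))
                       for s, p in enumerate(freqs))

        def estimate(gl1, gl2, freqs, n=20):           # grid search on the simplex
            grid = [(i / n, j / n, (n - i - j) / n)
                    for i in range(n + 1) for j in range(n + 1 - i)]
            return max(grid, key=lambda k: loglik(gl1, gl2, freqs, k))

        gls = [[0.9, 0.1, 0.0], [0.05, 0.9, 0.05]]     # per-site P(data | genotype)
        print(estimate(gls, gls, freqs=[0.3, 0.5]))    # self-comparison -> high k2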

  13. Numerical study of electromagnetic waves generated by a prototype dielectric logging tool

    USGS Publications Warehouse

    Ellefsen, K.J.; Abraham, J.D.; Wright, D.L.; Mazzella, A.T.

    2004-01-01

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than that in the formation (e.g., an air-filled borehole in the unsaturated zone), only a guided wave propagated along the borehole. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave radiated electromagnetic energy into the formation, causing its amplitude to decrease. When the propagation velocity in the borehole was less than that in the formation (e.g., a water-filled borehole in the saturated zone), both a refracted wave and a guided wave propagated along the borehole. The velocity of the refracted wave equaled the phase velocity of a plane wave in the formation, and the refracted wave preceded the guided wave. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave did not radiate electromagnetic energy into the formation. To analyze traces recorded by the prototype tool during laboratory tests, they were compared to traces calculated with the finite-difference method. The first parts of both the recorded and the calculated traces were similar, indicating that guided and refracted waves indeed propagated along the prototype tool. © 2004 Society of Exploration Geophysicists. All rights reserved.
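
    The finite-difference time-domain half of such a study advances Maxwell's equations on a staggered (Yee) grid; a one-dimensional vacuum sketch of the update loop is given below, with arbitrary toy grid and source parameters.

        # 1-D FDTD (Yee) update: Ez and Hy staggered in space and time, vacuum.
        import numpy as np

        c0, dz = 3e8, 1e-3                       # speed of light, cell size (m)
        dt = dz / (2 * c0)                       # Courant number 0.5, stable in 1-D
        n, steps = 400, 600
        ez, hy = np.zeros(n), np.zeros(n)

        for t in range(steps):
            hy[:-1] += dt / (4e-7 * np.pi * dz) * (ez[1:] - ez[:-1])   # mu0 update
            ez[1:] += dt / (8.854e-12 * dz) * (hy[1:] - hy[:-1])       # eps0 update
            ez[n // 4] += np.exp(-((t - 60) / 20.0) ** 2)              # soft source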

  14. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    SciTech Connect

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan; Saha, Pradip; Loewen, Eric

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model in Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: Status of analysis model development; Improvements made to older simulations; and Comparison to experimental data.

  15. Recombineering strategies for developing next generation BAC transgenic tools for optogenetics and beyond

    PubMed Central

    Ting, Jonathan T.; Feng, Guoping

    2014-01-01

    The development and application of diverse BAC transgenic rodent lines has enabled rapid progress for precise molecular targeting of genetically-defined cell types in the mammalian central nervous system. These transgenic tools have played a central role in the optogenetic revolution in neuroscience. Indeed, an overwhelming proportion of studies in this field have made use of BAC transgenic Cre driver lines to achieve targeted expression of optogenetic probes in the brain. In addition, several BAC transgenic mouse lines have been established for direct cell-type specific expression of Channelrhodopsin-2 (ChR2). While the benefits of these new tools largely outweigh any accompanying challenges, many available BAC transgenic lines may suffer from confounds due in part to increased gene dosage of one or more “extra” genes contained within the large BAC DNA sequences. Here we discuss this under-appreciated issue and propose strategies for developing the next generation of BAC transgenic lines that are devoid of extra genes. Furthermore, we provide evidence that these strategies are simple, reproducible, and do not disrupt the intended cell-type specific transgene expression patterns for several distinct BAC clones. These strategies may be widely implemented for improved BAC transgenesis across diverse disciplines. PMID:24772073

  16. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    2017-09-01

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
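
    The capabilities enumerated above translate into a compact scripted workflow. The sketch below follows the function and class names in the trident documentation (make_simple_ray, SpectrumGenerator); the dataset path, ray endpoints, and line list are hypothetical stand-ins.

    ```python
    # Minimal trident workflow sketch: shoot a sight line through a simulation
    # snapshot and synthesize a COS-like absorption spectrum. The dataset path
    # and ray endpoints below are placeholders, not real data.
    import trident

    ray = trident.make_simple_ray(
        "snapshot/RD0009/RD0009",          # hypothetical simulation output
        start_position=[0.0, 0.0, 0.0],    # code units; one sight line
        end_position=[1.0, 1.0, 1.0],
        data_filename="ray.h5",
        lines=["H I 1216", "O VI"],        # absorption lines to deposit
    )

    sg = trident.SpectrumGenerator("COS-G130M")   # instrument preset
    sg.make_spectrum(ray, lines=["H I 1216", "O VI"])
    sg.save_spectrum("spectrum.txt")
    sg.plot_spectrum("spectrum.png")
    ```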

  17. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in management of systems. For monitoring, diagnosis, and/or controlling purposes, circuit or systems diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus status of the system itself can be displayed. The Living Color Frame Maker is user friendly with graphical interfaces, and provides on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette. The contents of the diskette are compressed using the PKWARE archiving tools

  19. Unexpected benefits of deciding by mind wandering

    PubMed Central

    Giblin, Colleen E.; Morewedge, Carey K.; Norton, Michael I.

    2013-01-01

    The mind wanders, even when people are attempting to make complex decisions. We suggest that mind wandering—allowing one's thoughts to wander until the “correct” choice comes to mind—can positively impact people's feelings about their decisions. We compare post-choice satisfaction from choices made by mind wandering to reason-based choices and randomly assigned outcomes. Participants chose a poster by mind wandering or deliberating, or were randomly assigned a poster. Whereas forecasters predicted that participants who chose by mind wandering would evaluate their outcome as inferior to participants who deliberated (Experiment 1), participants who used mind wandering as a decision strategy evaluated their choice just as positively as did participants who used deliberation (Experiment 2). In some cases, it appears that people can spare themselves the effort of deliberation and instead “decide by mind wandering,” yet experience no decrease in satisfaction. PMID:24046760

  20. Next generation analytic tools for large scale genetic epidemiology studies of complex diseases.

    PubMed

    Mechanic, Leah E; Chen, Huann-Sheng; Amos, Christopher I; Chatterjee, Nilanjan; Cox, Nancy J; Divi, Rao L; Fan, Ruzong; Harris, Emily L; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M; McAllister, Kimberly; Moore, Jason H; Paltoo, Dina N; Province, Michael A; Ramos, Erin M; Ritchie, Marylyn D; Roeder, Kathryn; Schaid, Daniel J; Stephens, Matthew; Thomas, Duncan C; Weinberg, Clarice R; Witte, John S; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J; Gillanders, Elizabeth M

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled "Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases" on September 15-16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. © 2011 Wiley Periodicals, Inc.

  1. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    NASA Astrophysics Data System (ADS)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    Last generation of digital printer is usually characterized by a spatial resolution enough high to allow the designer to realize a binary CGH directly on a transparent film avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing company provide an inexpensive method to rapidly verify the validity of the design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGH's has been developed. The guidelines inspiring the work have been the following ones: (1) ray-tracing approach, considering the object to be reproduced as source of spherical waves; (2) Optimization and speed-up of the algorithms used, in order to produce a portable code, runnable on several hardware platforms. In this paper calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitives functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
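
    The ray-tracing approach described above can be condensed into a short numerical sketch: each object point is treated as a source of spherical waves, the summed field interferes with a tilted plane reference wave on the hologram plane, and the intensity is binarized for printing. This is a generic illustration, not the authors' code; all parameters are arbitrary.

    ```python
    # Binary CGH sketch: superpose spherical waves from object points, add a
    # tilted plane reference wave, and threshold the interference pattern.
    import numpy as np

    wavelength = 633e-9                    # HeNe laser, metres
    k = 2 * np.pi / wavelength
    N, pitch = 1024, 10e-6                 # samples per side, pixel pitch

    x = (np.arange(N) - N / 2) * pitch
    X, Y = np.meshgrid(x, x)

    # Object: a few point sources (x, y, z) behind the hologram plane.
    points = [(0.0, 0.0, 0.05), (1e-3, 5e-4, 0.05)]

    field = np.zeros((N, N), dtype=complex)
    for px, py, pz in points:
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += np.exp(1j * k * r) / r    # spherical wave contribution

    reference = np.exp(1j * k * X * np.sin(np.deg2rad(1.0)))  # tilted plane wave
    intensity = np.abs(field + reference) ** 2

    binary_cgh = (intensity > np.median(intensity)).astype(np.uint8)
    ```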

  2. STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation

    PubMed Central

    2013-01-01

    Background Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results As a consequence, we developed the method Statistical Tracking of Ontological Phrases (STOP), which expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion Multiple ontologies have been developed for gene and protein annotation; by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text, we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
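
    The statistical core of such term-enrichment analyses, whether over GO terms or the additional ontology concepts that STOP contributes, is typically a per-term hypergeometric test. A minimal sketch with invented counts:

    ```python
    # Hypergeometric enrichment test for one ontology term: does the study
    # list contain more genes annotated to the term than chance predicts?
    # All counts below are illustrative.
    from scipy.stats import hypergeom

    population = 20000   # annotated genes in the background
    annotated = 150      # background genes carrying the term
    study_set = 400      # genes in the experimental list
    hits = 12            # study genes carrying the term

    # P(X >= hits) when drawing study_set genes without replacement.
    p_value = hypergeom.sf(hits - 1, population, annotated, study_set)
    print(f"enrichment p-value: {p_value:.3g}")
    ```

    In a full analysis this test is repeated for every term and the results are corrected for multiple testing.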

  3. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  4. Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool

    NASA Astrophysics Data System (ADS)

    Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin

    2016-02-01

    The majority of existing scheduling techniques are based on static demand and deterministic processing time, while most job shop scheduling problems are concerned with dynamic demand and stochastic processing time. As a consequence, the solutions obtained from traditional scheduling techniques are ineffective wherever changes occur to the system. Therefore, this research intends to develop a decision support tool (DST) based on promising artificial intelligence techniques that is able to accommodate the dynamics that regularly occur in the job shop scheduling problem. The DST was designed through three phases, i.e. (i) look-up table generation, (ii) inverse model development and (iii) integration of DST components. This paper reports the generation of look-up tables for various scenarios as a part of the development of the DST. A discrete event simulation model was used to compare the performance among SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measures of various scheduling scenarios and the job order requirements will be mapped using an ANN inverse model.
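
    As a toy illustration of how one look-up table entry could be scored, the sketch below evaluates three of the dispatch rules named above on a single-machine job set with all jobs released at time zero; the real tables come from a full discrete event simulation of the job shop, and the job data here are invented.

    ```python
    # Score FCFS, SPT, and EDD on a toy job set: (name, processing time,
    # due date). Mean flow time and mean tardiness are the kinds of measures
    # compiled into the look-up tables.
    jobs = [("J1", 5, 12), ("J2", 2, 6), ("J3", 8, 20), ("J4", 3, 9)]

    rules = {
        "FCFS": lambda seq: list(seq),
        "SPT":  lambda seq: sorted(seq, key=lambda j: j[1]),
        "EDD":  lambda seq: sorted(seq, key=lambda j: j[2]),
    }

    for name, order in rules.items():
        t = flow = tardy = 0
        for _, proc, due in order(jobs):
            t += proc                     # completion time on one machine
            flow += t
            tardy += max(0, t - due)
        n = len(jobs)
        print(f"{name}: mean flow={flow / n:.2f}, mean tardiness={tardy / n:.2f}")
    ```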

  5. Decidability of classes of algebraic systems in polynomial time

    SciTech Connect

    Anokhin, M I

    2002-02-28

    For some classes of algebraic systems, several kinds of polynomial-time decidability are considered, each using an oracle that performs signature operations and computes predicates. Relationships between the various kinds of decidability are studied. Several results on decidability and undecidability in polynomial time are proved for certain finitely based varieties of universal algebras.

  6. 34 CFR 85.942 - ED Deciding Official.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Definitions § 85.942 ED Deciding Official. The ED Deciding Official is an ED officer who has delegated authority under the procedures of the Department of Education to decide whether to affirm a suspension or enter a debarment. Authority: E.O. 12549 (3 CFR, 1986 Comp., p. 189), E.O. 12689 ( 3 CFR, 1989 Comp., p...

  7. 25 CFR 2.4 - Officials who may decide appeals.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false Officials who may decide appeals. 2.4 Section 2.4 Indians... ACTIONS § 2.4 Officials who may decide appeals. The following officials may decide appeals: (a) An Area...) An Area Education Programs Administrator, Agency Superintendent for Education, President of a Post...

  8. 2 CFR 3485.937 - ED Deciding Official.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false ED Deciding Official. 3485.937 Section 3485.937 Grants and Agreements Federal Agency Regulations for Grants and Agreements DEPARTMENT OF EDUCATION NONPROCUREMENT DEBARMENT AND SUSPENSION Definitions § 3485.937 ED Deciding Official. The ED Deciding Official...

  9. 34 CFR 85.942 - ED Deciding Official.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false ED Deciding Official. 85.942 Section 85.942 Education Office of the Secretary, Department of Education GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 85.942 ED Deciding Official. The ED Deciding Official is an ED officer who has...

  10. 2 CFR 3485.937 - ED Deciding Official.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 2 Grants and Agreements 1 2013-01-01 2013-01-01 false ED Deciding Official. 3485.937 Section 3485.937 Grants and Agreements Federal Agency Regulations for Grants and Agreements DEPARTMENT OF EDUCATION NONPROCUREMENT DEBARMENT AND SUSPENSION Definitions § 3485.937 ED Deciding Official. The ED Deciding Official...

  11. Next generation human skin constructs as advanced tools for drug development.

    PubMed

    Abaci, H E; Guo, Zongyou; Doucet, Yanne; Jacków, Joanna; Christiano, Angela

    2017-01-01

    Many diseases, as well as side effects of drugs, manifest themselves through skin symptoms. Skin is a complex tissue that hosts various specialized cell types and performs many roles including physical barrier, immune and sensory functions. Therefore, modeling skin in vitro presents technical challenges for tissue engineering. Since the first attempts at engineering human epidermis in 1970s, there has been a growing interest in generating full-thickness skin constructs mimicking physiological functions by incorporating various skin components, such as vasculature and melanocytes for pigmentation. Development of biomimetic in vitro human skin models with these physiological functions provides a new tool for drug discovery, disease modeling, regenerative medicine and basic research for skin biology. This goal, however, has long been delayed by the limited availability of different cell types, the challenges in establishing co-culture conditions, and the ability to recapitulate the 3D anatomy of the skin. Recent breakthroughs in induced pluripotent stem cell (iPSC) technology and microfabrication techniques such as 3D-printing have allowed for building more reliable and complex in vitro skin models for pharmaceutical screening. In this review, we focus on the current developments and prevailing challenges in generating skin constructs with vasculature, skin appendages such as hair follicles, pigmentation, immune response, innervation, and hypodermis. Furthermore, we discuss the promising advances that iPSC technology offers in order to generate in vitro models of genetic skin diseases, such as epidermolysis bullosa and psoriasis. We also discuss how future integration of the next generation human skin constructs onto microfluidic platforms along with other tissues could revolutionize the early stages of drug development by creating reliable evaluation of patient-specific effects of pharmaceutical agents. Impact statement Skin is a complex tissue that hosts various

  12. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damages depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take into account different biological conditions. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in the Monte Carlo simulation. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels representing heterochromatin and euchromatin. The fibres were imported in a Geant4 application where computations were performed to estimate the influence of the DNA compaction on the amount of calculated DNA damage. The relative difference of the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, then the relative difference is larger for low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular levels.

  13. Evaluating an image-fusion algorithm with synthetic-image-generation tools

    NASA Astrophysics Data System (ADS)

    Gross, Harry N.; Schott, John R.

    1996-06-01

    An algorithm that combines spectral mixing and nonlinear optimization is used to fuse multiresolution images. Image fusion merges images of different spatial and spectral resolutions to create a high spatial resolution multispectral combination. High spectral resolution allows identification of materials in the scene, while high spatial resolution locates those materials. In this algorithm, conventional spectral mixing estimates the percentage of each material (called endmembers) within each low resolution pixel. Three spectral mixing models are compared: unconstrained, partially constrained, and fully constrained. In the partially constrained application, the endmember fractions are required to sum to one. In the fully constrained application, all fractions are additionally required to lie between zero and one. While negative fractions seem inappropriate, they can arise from random spectral realizations of the materials. In the second part of the algorithm, the low resolution fractions are used as inputs to a constrained nonlinear optimization that calculates the endmember fractions for the high resolution pixels. The constraints mirror the low resolution constraints and maintain consistency with the low resolution fraction results. The algorithm can use one or more higher resolution sharpening images to locate the endmembers to high spatial accuracy. The algorithm was evaluated with synthetic image generation (SIG) tools. A SIG developed image can be used to control the various error sources that are likely to impair the algorithm performance. These error sources include atmospheric effects, mismodeled spectral endmembers, and variability in topography and illumination. By controlling the introduction of these errors, the robustness of the algorithm can be studied and improved upon. The motivation for this research is to take advantage of the next generation of multi/hyperspectral sensors. Although the hyperspectral images will be of modest to low resolution
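
    The fully constrained mixing stage described above can be written as a bounded least-squares problem; a common trick is to append the sum-to-one constraint as a heavily weighted extra row. The endmember spectra and pixel below are synthetic.

    ```python
    # Fully constrained linear unmixing sketch: fractions bounded to [0, 1]
    # and softly forced to sum to one via a weighted constraint row.
    import numpy as np
    from scipy.optimize import lsq_linear

    endmembers = np.array([     # 5 bands x 3 materials, synthetic spectra
        [0.10, 0.60, 0.30],
        [0.15, 0.55, 0.35],
        [0.20, 0.40, 0.45],
        [0.40, 0.20, 0.55],
        [0.55, 0.10, 0.60],
    ])
    pixel = endmembers @ np.array([0.5, 0.3, 0.2])   # mixed low-res pixel

    w = 1e3                                          # constraint weight
    A = np.vstack([endmembers, w * np.ones(3)])
    b = np.append(pixel, w * 1.0)

    result = lsq_linear(A, b, bounds=(0.0, 1.0))     # non-negativity bounds
    print("estimated fractions:", np.round(result.x, 3))
    ```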

  14. Diagnostic tool for red blood cell membrane disorders: Assessment of a new generation ektacytometer

    PubMed Central

    Da Costa, Lydie; Suner, Ludovic; Galimand, Julie; Bonnel, Amandine; Pascreau, Tiffany; Couque, Nathalie; Fenneteau, Odile; Mohandas, Narla

    2016-01-01

    Inherited red blood cell (RBC) membrane disorders, such as hereditary spherocytosis, elliptocytosis and hereditary ovalocytosis, result from mutations in genes encoding various RBC membrane and skeletal proteins. The RBC membrane, a composite structure composed of a lipid bilayer linked to a spectrin/actin-based membrane skeleton, confers upon the RBC unique features of deformability and mechanical stability. The disease severity is primarily dependent on the extent of membrane surface area loss. RBC membrane disorders can be readily diagnosed by various laboratory approaches that include RBC cytology, flow cytometry, ektacytometry, electrophoresis of RBC membrane proteins and genetics. The reference technique for diagnosis of RBC membrane disorders is osmotic gradient ektacytometry. However, in spite of its recognition as the reference technique, it is rarely used as a routine diagnostic tool for RBC membrane disorders due to its limited availability. This may soon change as a new generation of ektacytometer has been recently engineered. In this review, we describe the workflow of the samples shipped to our Hematology laboratory for RBC membrane disorder analysis and the data obtained for a large cohort of French patients presenting with RBC membrane disorders using a newly available version of the ektacytometer. PMID:26603718

  15. Diagnostic tool for red blood cell membrane disorders: Assessment of a new generation ektacytometer.

    PubMed

    Da Costa, Lydie; Suner, Ludovic; Galimand, Julie; Bonnel, Amandine; Pascreau, Tiffany; Couque, Nathalie; Fenneteau, Odile; Mohandas, Narla

    2016-01-01

    Inherited red blood cell (RBC) membrane disorders, such as hereditary spherocytosis, elliptocytosis and hereditary ovalocytosis, result from mutations in genes encoding various RBC membrane and skeletal proteins. The RBC membrane, a composite structure composed of a lipid bilayer linked to a spectrin/actin-based membrane skeleton, confers upon the RBC unique features of deformability and mechanical stability. The disease severity is primarily dependent on the extent of membrane surface area loss. RBC membrane disorders can be readily diagnosed by various laboratory approaches that include RBC cytology, flow cytometry, ektacytometry, electrophoresis of RBC membrane proteins and genetics. The reference technique for diagnosis of RBC membrane disorders is osmotic gradient ektacytometry. However, in spite of its recognition as the reference technique, it is rarely used as a routine diagnostic tool for RBC membrane disorders due to its limited availability. This may soon change as a new generation of ektacytometer has been recently engineered. In this review, we describe the workflow of the samples shipped to our Hematology laboratory for RBC membrane disorder analysis and the data obtained for a large cohort of French patients presenting with RBC membrane disorders using a newly available version of the ektacytometer.
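
    For illustration, the key indices usually read off an osmotic gradient ektacytometry curve (the hypotonic minimum Omin, the maximum elongation index EImax, and the hypertonic half-maximum point Ohyper) can be extracted numerically. The curve below is synthetic, and the index definitions follow common usage rather than this paper.

    ```python
    # Extract Omin, EImax, and Ohyper from a synthetic osmoscan curve.
    import numpy as np

    osm = np.linspace(100, 500, 200)                  # mOsm/kg
    ei = 0.6 * np.exp(-((osm - 290) / 120) ** 2)      # toy elongation index

    i_max = int(np.argmax(ei))
    ei_max = ei[i_max]
    o_min = osm[np.argmin(ei[:i_max])]                # hypotonic-arm minimum
    o_hyper = osm[i_max:][np.argmin(np.abs(ei[i_max:] - ei_max / 2))]

    print(f"EImax={ei_max:.2f}, Omin={o_min:.0f}, Ohyper={o_hyper:.0f} mOsm/kg")
    ```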

  16. SED-ML web tools: generate, modify and export standard-compliant simulation studies.

    PubMed

    Bergmann, Frank T; Nickerson, David; Waltemath, Dagmar; Scharm, Martin

    2017-04-15

    The Simulation Experiment Description Markup Language (SED-ML) is a standardized format for exchanging simulation studies independently of software tools. We present the SED-ML Web Tools, an online application for creating, editing, simulating and validating SED-ML documents. The Web Tools implement all current SED-ML specifications and, thus, support complex modifications and co-simulation of models in SBML and CellML formats. Ultimately, the Web Tools lower the bar on working with SED-ML documents and help users create valid simulation descriptions. Available at http://sysbioapps.dyndns.org/SED-ML_Web_Tools/. Contact: fbergman@caltech.edu.

  17. Humans and Insects Decide in Similar Ways

    PubMed Central

    Louâpre, Philippe; van Alphen, Jacques J. M.; Pierre, Jean-Sébastien

    2010-01-01

    Behavioral ecologists assume that animals use a motivational mechanism for decisions such as action selection and time allocation, allowing the maximization of their fitness. They consider both the proximate and ultimate causes of behavior in order to understand this type of decision-making in animals. Experimental psychologists and neuroeconomists also study how agents make decisions but they consider the proximate causes of the behavior. In the case of patch-leaving, motivation-based decision-making remains simple speculation. In contrast to other animals, human beings can assess and evaluate their own motivation by an introspection process. It is then possible to study the declared motivation of humans during decision-making and discuss the mechanism used as well as its evolutionary significance. In this study, we combine both the proximate and ultimate causes of behavior for a better understanding of the human decision-making process. We show for the first time ever that human subjects use a motivational mechanism similar to small insects such as parasitoids [1] and bumblebees [2] to decide when to leave a patch. This result is relevant for behavioral ecologists as it supports the biological realism of this mechanism. Humans seem to use a motivational mechanism of decision making known to be adaptive to a heterogeneously distributed resource. As hypothesized by Hutchinson et al. [3] and Wilke and Todd [4], our results are consistent with the evolutionary shaping of decision making because hominoids were hunters and gatherers on food patches for more than two million years. We discuss the plausibility of a neural basis for the motivation mechanism highlighted here, bridging the gap between behavioral ecology and neuroeconomy. Thus, both the motivational mechanism observed here and the neuroeconomy findings are most likely adaptations that were selected for during ancestral times. PMID:21170378

  18. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically it aims to build a high fidelity tabletop model that can be used for the purpose of risk mitigation, failure mode analysis, contamination tracking, and testing reliability. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. It envisions a 10% scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched index of refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities to be 67% of full-scale and thereby the time scale of the model to represent 15% of the full-scale system; meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index matching fluid (fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model (with complex internal geometry) transparent and hence conducive to non-intrusive optical diagnostics. So using such a system one can test environment control parameters such as core flows (axial flows), cross flows (from registers and diffusers), potential problem areas such as flow short circuits, inadequate oxygen content, build up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments and assess the overall system performance. The system allows quantitative measurements of contaminants introduced in the system and allows testing and optimizing the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration change and subsequent testing. The data
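
    The quoted 67% velocity ratio and 15% time ratio follow from Reynolds-number similitude between the air-filled full-scale room and the water-filled 10% model. A worked check, assuming nominal room-temperature fluid properties:

    ```python
    # Reynolds matching: V_m * L_m / nu_water = V_f * L_f / nu_air.
    nu_air, nu_water = 1.5e-5, 1.0e-6   # kinematic viscosities, m^2/s
    scale = 0.10                        # model length / full-scale length

    velocity_ratio = (1.0 / scale) * (nu_water / nu_air)   # V_m / V_f
    time_ratio = scale / velocity_ratio                    # (L/V) ratio

    print(f"model velocity = {velocity_ratio:.0%} of full scale")  # ~67%
    print(f"model time     = {time_ratio:.0%} of full scale")      # ~15%
    ```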

  20. An accurate tool for the fast generation of dark matter halo catalogues

    NASA Astrophysics Data System (ADS)

    Monaco, P.; Sefusatti, E.; Borgani, S.; Crocce, M.; Fosalba, P.; Sheth, R. K.; Theuns, T.

    2013-08-01

    We present a new parallel implementation of the PINpointing Orbit Crossing-Collapsed HIerarchical Objects (PINOCCHIO) algorithm, a quick tool, based on Lagrangian Perturbation Theory, for the hierarchical build-up of dark matter (DM) haloes in cosmological volumes. To assess its ability to predict halo correlations on large scales, we compare its results with those of an N-body simulation of a 3 h⁻¹ Gpc box sampled with 2048³ particles taken from the MICE suite, matching the same seeds for the initial conditions. Thanks to the Fastest Fourier Transforms in the West (FFTW) libraries and to the relatively simple design, the code shows very good scaling properties. The CPU time required by PINOCCHIO is a tiny fraction (~1/2000) of that required by the MICE simulation. Varying some of PINOCCHIO's numerical parameters allows one to produce a universal mass function that lies in the range allowed by published fits, although it underestimates the MICE mass function of Friends-of-Friends (FoF) haloes in the high-mass tail. We compare the matter-halo and the halo-halo power spectra with those of the MICE simulation and find that these two-point statistics are well recovered on large scales. In particular, when catalogues are matched in number density, agreement within 10 per cent is achieved for the halo power spectrum. At scales k > 0.1 h Mpc⁻¹, the inaccuracy of the Zel'dovich approximation in locating halo positions causes an underestimate of the power spectrum that can be modelled as a Gaussian factor with a damping scale of d = 3 h⁻¹ Mpc at z = 0, decreasing at higher redshift. Finally, a remarkable match is obtained for the reduced halo bispectrum, showing a good description of non-linear halo bias. Our results demonstrate the potential of PINOCCHIO as an accurate and flexible tool for generating large ensembles of mock galaxy surveys, with interesting applications for the analysis of large galaxy redshift surveys.
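
    The small-scale suppression described above amounts to multiplying the predicted halo power spectrum by a Gaussian damping factor. The sketch below assumes the form P_obs(k) = P_true(k) exp(-k²d²); the exact expression used by the authors may differ, and the input spectrum here is a toy power law.

    ```python
    # Apply a Gaussian damping factor with scale d to a toy power spectrum.
    import numpy as np

    d = 3.0                                  # damping scale, h^-1 Mpc, z = 0
    k = np.logspace(-2, 0, 50)               # wavenumbers, h Mpc^-1

    p_true = 1e4 * (k / 0.1) ** -1.5         # toy halo power spectrum
    p_obs = p_true * np.exp(-(k * d) ** 2)   # assumed damping form
    ```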

  1. Tools for Generating Useful Time-series Data from PhenoCam Images

    NASA Astrophysics Data System (ADS)

    Milliman, T. E.; Friedl, M. A.; Frolking, S.; Hufkens, K.; Klosterman, S.; Richardson, A. D.; Toomey, M. P.

    2012-12-01

    The PhenoCam project (http://phenocam.unh.edu/) is tasked with acquiring, processing, and archiving digital repeat photography to be used for scientific studies of vegetation phenological processes. Over the past 5 years the PhenoCam project has collected over 2 million time series images for a total of over 700 GB of image data. Several papers have been published describing derived "vegetation indices" (such as the green chromatic coordinate, or gcc) which can be compared to standard measures such as NDVI or EVI. Imagery from our archive is available for download, but converting a series of images from a particular camera into useful scientific data, while simple in principle, is complicated by a variety of factors. Cameras are often exposed to harsh weather conditions (high wind, rain, ice, snow pile-up), which result in images where the field of view (FOV) is partially obscured or completely blocked for periods of time. The FOV can also change for other reasons (mount failures, tower maintenance, etc.). Some of the relatively inexpensive cameras that are being used can also temporarily lose color balance or exposure control, resulting in loss of imagery. All these factors negatively influence the automated analysis of the image time series, making this a non-trivial task. Here we discuss the challenges of processing PhenoCam image time series for vegetation monitoring and the associated data management tasks. We describe our current processing framework and a simple standardized output format for the resulting time-series data. The time-series data in this format will be generated for specific "regions of interest" (ROIs) for each of the cameras in the PhenoCam network. This standardized output (which will be updated daily) can be considered 'the pulse' of a particular camera and will provide a default phenological dynamic for said camera. The time-series data can also be viewed as a higher level product which can be used to generate "vegetation indices", like gcc, for
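
    The gcc index mentioned above is straightforward to compute once a region of interest is fixed: gcc = G / (R + G + B), averaged over the ROI. A minimal sketch for a single image (the file name and ROI bounds are hypothetical); repeating this per image yields the time series:

    ```python
    # Compute the green chromatic coordinate over one image ROI.
    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("site_20120601_1200.jpg"), dtype=float)
    roi = img[200:600, 300:900, :3]       # rows, cols covering the canopy

    r = roi[..., 0].mean()
    g = roi[..., 1].mean()
    b = roi[..., 2].mean()
    gcc = g / (r + g + b)                 # gcc = G / (R + G + B)
    print(f"gcc = {gcc:.4f}")
    ```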

  2. Evaluation of machine learning tools for inspection of steam generator tube structures using pulsed eddy current

    NASA Astrophysics Data System (ADS)

    Buck, J. A.; Underhill, P. R.; Morelli, J.; Krause, T. W.

    2017-02-01

    Degradation of nuclear steam generator (SG) tubes and support structures can result in a loss of reactor efficiency. Regular in-service inspection, by conventional eddy current testing (ECT), permits detection of cracks, measurement of wall loss, and identification of other SG tube degradation modes. However, ECT is challenged by overlapping degradation modes, such as might occur for SG tube fretting accompanied by tube offset within a corroding ferromagnetic support structure. Pulsed eddy current (PEC) is an emerging technology examined here for inspection of Alloy-800 SG tubes and associated carbon steel drilled support structures. Support structure hole size was varied to simulate uniform corrosion, while the SG tube was offset relative to the hole axis. PEC measurements were performed using a single driver with an 8 pick-up coil configuration in the presence of flat-bottom rectangular frets as an overlapping degradation mode. A modified principal component analysis (MPCA) was performed on the time-voltage data in order to reduce data dimensionality. The MPCA scores were then used to train a support vector machine (SVM) that simultaneously targeted four independent parameters: support structure hole size, tube off-centering in two dimensions, and fret depth. The support vector machine was trained, tested, and validated on experimental data. Results were compared with a previously developed artificial neural network (ANN) trained on the same data. Estimates of tube position showed comparable results between the two machine learning tools. However, the ANN produced better estimates of hole inner diameter and fret depth. The better results from the ANN analysis were attributed to challenges associated with the SVM when non-constant variance is present in the data.
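
    The processing chain described above, dimensionality reduction of the multi-coil time-voltage traces followed by a multi-target support vector machine, can be sketched with scikit-learn, with ordinary PCA standing in for the authors' modified PCA. The data below are random placeholders.

    ```python
    # PCA + multi-output SVM regression sketch for the four PEC targets:
    # hole size, two tube-offset coordinates, and fret depth.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.multioutput import MultiOutputRegressor
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8 * 500))  # 200 scans, 8 coils x 500 samples
    y = rng.normal(size=(200, 4))        # the four target parameters

    scores = PCA(n_components=10).fit_transform(X)   # reduce dimensionality
    model = MultiOutputRegressor(SVR(kernel="rbf")).fit(scores, y)
    print(model.predict(scores[:3]))
    ```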

  3. A wrapper generation tool for the creation of scriptable scientific applications

    NASA Astrophysics Data System (ADS)

    Beazley, David Martin

    In recent years, there has been considerable interest in the use of scripting languages as a mechanism for controlling and developing scientific software. Scripting languages allow scientific applications to be encapsulated in an interpreted environment similar to that found in commercial scientific packages such as MATLAB, Mathematica, and IDL. This improves the usability of scientific software by providing a powerful mechanism for specifying and controlling complex problems as well as giving users an interactive and exploratory problem solving environment. Scripting languages also provide a framework for building and integrating software components that allows tools to be used in a more efficient manner. This streamlines the problem solving process and enables scientists to be more productive. One of the most powerful features of modern scripting languages is their ability to be extended with code written in C, C++, or Fortran. This allows scientists to integrate existing scientific applications into a scripting language environment. Unfortunately, this integration is not easily accomplished due to the complexity of combining scripting languages with compiled code. To simplify the use of scripting languages, a compiler, SWIG (Simplified Wrapper and Interface Generator), has been developed. SWIG automates the construction of scripting language extension modules and allows existing programs written in C or C++ to be easily transformed into scriptable applications. This, in turn, improves the usability and organization of those programs. The design and implementation of SWIG are described as well as strategies for building scriptable scientific applications. A detailed case study is presented in which SWIG has been used to transform a high performance molecular dynamics code at Los Alamos National Laboratory into a highly flexible scriptable application. This transformation revolutionized the use of this application and allowed scientists to perform large

  4. Systems Prototyping with Fourth Generation Tools: One Answer to the Productivity Puzzle? AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis A.

    The development of information systems using an engineering approach employing both traditional programming techniques and nonprocedural languages is described. A fourth generation application tool is used to develop a prototype system that is revised and expanded as the user clarifies individual requirements. When fully defined, a combination of…

  5. Planetary Spectrum Generator (PSG): An Online Tool to Synthesize Spectra of Comets, Small Bodies, and (Exo)Planets

    NASA Astrophysics Data System (ADS)

    Villanueva, G. L.; Mandell, A.; Protopapa, S.; Faggi, S.; Smith, M. D.; Wolff, M.; Hewagama, T.; Mumma, M. J.

    2017-02-01

    The Planetary Spectrum Generator is an online tool for synthesizing planetary spectra (atmospheres and surfaces) in a broad range of wavelengths (0.1 μm to 100 mm, UV/Vis/near-IR/IR/far-IR/THz/sub-mm/radio) for any observatory, orbiter, or lander.
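
    PSG can also be driven programmatically over HTTP. The sketch below assumes an endpoint that accepts a PSG configuration file and returns the synthesized spectrum as text; the endpoint URL, payload field, and configuration keys are all assumptions to be checked against the PSG documentation.

    ```python
    # Hypothetical programmatic PSG request: POST a small configuration and
    # print the start of the returned spectrum.
    import requests

    config = "\n".join([
        "<OBJECT>Mars",              # assumed configuration keys
        "<GENERATOR-RANGE1>1.0",
        "<GENERATOR-RANGE2>5.0",
        "<GENERATOR-RANGEUNIT>um",
    ])

    resp = requests.post("https://psg.gsfc.nasa.gov/api.php",
                         data={"file": config})
    print(resp.text[:400])
    ```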

  6. HAPCAD: An open-source tool to detect PCR crossovers in next-generation sequencing generated HLA data

    PubMed Central

    McDevitt, Shana L.; Bredeson, Jessen V.; Roy, Scott W.; Lane, Julie A.; Noble, Janelle A.

    2016-01-01

    Next-generation sequencing (NGS) based HLA genotyping can generate PCR artifacts corresponding to IMGT/HLA Database alleles, for which multiple examples have been observed, including sequence corresponding to the HLA-DRB1*03:42 allele. Repeat genotyping of 131 samples, previously genotyped as DRB1*03:01 homozygotes using probe-based methods, resulted in the heterozygous call DRB1*03:01+DRB1*03:42. The apparent rare DRB1*03:42 allele is hypothesized to be a “hybrid amplicon” generated by PCR crossover, a process in which a partial PCR product denatures from its template, anneals to a different allele template, and extends to completion. Unlike most PCR crossover products, a “hybrid amplicon” always corresponds to an IMGT/HLA Database allele, necessitating a case-by-case analysis of whether its occurrence reflects the actual allele or is simply the result of PCR crossover. The Hybrid Amplicon/PCR Crossover Artifact Detector (HAPCAD) program mimics jumping PCR in silico and flags allele sequences that may also be generated as hybrid amplicons. PMID:26802209
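
    A toy version of "jumping PCR in silico" makes the mechanism concrete: a partial product from one allele re-anneals to a second allele at a shared stretch and extends to completion, and any database allele identical to such a hybrid gets flagged as a possible artifact. The sequences, anchor length, and allele names below are invented; HAPCAD's actual algorithm may differ in detail.

    ```python
    # Flag database alleles that could arise as single-crossover hybrids.
    def hybrid_amplicons(allele_a, allele_b, anchor_len=9):
        """Yield hybrids formed by switching templates at shared anchors."""
        hybrids = set()
        for i in range(len(allele_a) - anchor_len + 1):
            anchor = allele_a[i:i + anchor_len]
            j = allele_b.find(anchor)
            if j != -1:
                hybrids.add(allele_a[:i] + allele_b[j:])  # template switch
        return hybrids

    database = {
        "allele-A": "GGGGGCATCATCATAAAAA",
        "allele-B": "CCCCCCATCATCATTTTTT",
        "allele-C": "GGGGGCATCATCATTTTTT",  # identical to an A/B hybrid
    }

    hybrids = hybrid_amplicons(database["allele-A"], database["allele-B"])
    flagged = {name for name, seq in database.items() if seq in hybrids}
    print("possible PCR-crossover artifacts:", flagged)   # {'allele-C'}
    ```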

  7. Re-Imagining Specialized STEM Academies: Igniting and Nurturing "Decidedly Different Minds", by Design

    ERIC Educational Resources Information Center

    Marshall, Stephanie Pace

    2010-01-01

    This article offers a personal vision and conceptual design for reimagining specialized science, technology, engineering, and mathematics (STEM) academies designed to nurture "decidedly different" STEM minds and ignite a new generation of global STEM talent, innovation, and entrepreneurial leadership. This design enables students to engage…

  9. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
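
    In practice the suite is usually driven from the command line; the sketch below wraps one representative deepTools command, bamCoverage, in a Python script. File names are placeholders, and flags beyond -b/-o/--binSize should be checked against the deepTools documentation.

    ```python
    # Convert an indexed BAM file into a bigWig coverage track.
    import subprocess

    subprocess.run(
        [
            "bamCoverage",
            "-b", "sample.bam",          # indexed BAM (sample.bam.bai present)
            "-o", "sample_coverage.bw",  # bigWig output
            "--binSize", "50",
        ],
        check=True,
    )
    ```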

  10. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation, which we call Bone Conduction (BC) sound. Several investigations have examined the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is segmentation of CT medical images (DICOM) to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. This tool uses the PAK solver, an open-source solver implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.
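
    The first step of the pipeline described above, from a CT volume to an STL surface layer, can be sketched with common Python imaging libraries (pydicom, scikit-image, numpy-stl). This is not the authors' implementation; the file names, slice count, and threshold are illustrative.

    ```python
    # CT stack -> thresholded isosurface -> STL layer file.
    import numpy as np
    import pydicom
    from skimage import measure
    from stl import mesh

    # Load a stack of DICOM slices into a 3D volume (paths are hypothetical).
    slices = [pydicom.dcmread(f"ct/slice_{i:03d}.dcm") for i in range(120)]
    volume = np.stack([s.pixel_array for s in slices]).astype(float)

    # Segment bone with a Hounsfield-style threshold, extract an isosurface.
    verts, faces, _, _ = measure.marching_cubes(volume, level=300.0)

    # Write the triangulated surface as one STL layer of the head model.
    surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
    surface.vectors = verts[faces]
    surface.save("skull_layer.stl")
    ```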

  11. Generated spiral bevel gears: Optimal machine-tool settings and tooth contact analysis

    NASA Technical Reports Server (NTRS)

    Litvin, F. L.; Tsung, W. J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  14. WordGen: a tool for word selection and nonword generation in Dutch, English, German, and French.

    PubMed

    Duyck, Wouter; Desmet, Timothy; Verbeke, Lieven P C; Brysbaert, Marc

    2004-08-01

    WordGen is an easy-to-use program that uses the CELEX and Lexique lexical databases for word selection and nonword generation in Dutch, English, German, and French. Items can be generated in these four languages, specifying any combination of seven linguistic constraints: number of letters, neighborhood size, frequency, summated position-nonspecific bigram frequency, minimum position-nonspecific bigram frequency, position-specific frequency of the initial and final bigram, and orthographic relatedness. The program also has a module to calculate the respective values of these variables for items that have already been constructed, either with the program or taken from earlier studies. Stimulus queries can be entered through WordGen's graphical user interface or by means of batch files. WordGen is especially useful for (1) Dutch and German item generation, because no such stimulus-selection tool exists for these languages, (2) the generation of nonwords for all four languages, because our program has some important advantages over previous nonword generation approaches, and (3) psycholinguistic experiments on bilingualism, because the possibility of using the same tool for different languages increases the cross-linguistic comparability of the generated item lists. WordGen is free and available at http://expsy.ugent.be/wordgen.htm.
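
    A toy version of constraint-based nonword generation in the spirit of WordGen: sample random letter strings and keep those that are not real words and whose summated bigram frequency falls inside a target band. The lexicon and bigram counts are tiny stand-ins for the CELEX and Lexique databases.

    ```python
    # Generate nonwords subject to a lexicon check and a bigram-frequency band.
    import random

    lexicon = {"DOG", "CAT", "SUN", "MAP"}          # stand-in for CELEX
    bigram_freq = {"DO": 50, "OG": 40, "CA": 60, "AT": 80,
                   "SU": 30, "UN": 70, "MA": 55, "AP": 45}

    def summated_bigram_freq(word):
        return sum(bigram_freq.get(word[i:i + 2], 0)
                   for i in range(len(word) - 1))

    def make_nonwords(length=3, n=5, band=(20, 200), seed=1):
        rng = random.Random(seed)
        out = []
        while len(out) < n:
            cand = "".join(rng.choice("ABCDEFGHIJKLMNOPQRSTUVWXYZ")
                           for _ in range(length))
            if cand not in lexicon and band[0] <= summated_bigram_freq(cand) <= band[1]:
                out.append(cand)
        return out

    print(make_nonwords())
    ```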

  15. miRNAsong: a web-based tool for generation and testing of miRNA sponge constructs in silico.

    PubMed

    Barta, Tomas; Peskova, Lucie; Hampl, Ales

    2016-11-18

    MicroRNA (miRNA) sponges are RNA transcripts containing multiple high-affinity binding sites that associate with and sequester specific miRNAs to prevent them from interacting with their target messenger (m)RNAs. Due to the high specificity of miRNA sponges and strong inhibition of target miRNAs, these molecules have become increasingly applied in miRNA loss-of-function studies. However, improperly designed sponge constructs may sequester off-target miRNAs; thus, it has become increasingly important to develop a tool for miRNA sponge construct design and testing. In this study, we introduce microRNA sponge generator and tester (miRNAsong), a freely available web-based tool for generation and in silico testing of miRNA sponges. This tool generates miRNA sponge constructs for specific miRNAs and miRNA families/clusters and tests them for potential binding to miRNAs in selected organisms. Currently, miRNAsong allows for testing of sponge constructs in 219 species covering 35,828 miRNA sequences. Furthermore, we also provide an example, supplemented with experimental data, of how to use this tool. Using miRNAsong, we designed and tested a sponge for miR-145 inhibition, and cloned the sequence into an inducible lentiviral vector. We found that established cell lines expressing miR-145 sponge strongly inhibited miR-145, thus demonstrating the usability of miRNAsong tool for sponge generation. URL: http://www.med.muni.cz/histology/miRNAsong/.
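
    The construct-generation step can be illustrated in a few lines: repeat the reverse complement of the mature miRNA, disrupting pairing opposite miRNA positions 9-12 so the sponge is bound but not sliced, and join the repeats with spacers. miRNAsong adds database-wide off-target testing on top of this idea. The miR-145-5p sequence below should be verified against miRBase before any real use.

    ```python
    # Build a bulged miRNA sponge sequence from a mature miRNA (RNA alphabet).
    COMPLEMENT = str.maketrans("AUCG", "UAGC")

    def reverse_complement(rna):
        return rna.translate(COMPLEMENT)[::-1]

    def sponge(mirna, repeats=4, spacer="UUUU"):
        site = reverse_complement(mirna)
        # Replace the bases opposite miRNA positions 9-12 to create a bulge.
        bulged = site[:len(site) - 12] + "ACAA" + site[len(site) - 8:]
        return spacer.join([bulged] * repeats)

    mir145 = "GUCCAGUUUUCCCAGGAAUCCCU"   # hsa-miR-145-5p (verify in miRBase)
    print(sponge(mir145))
    ```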

  16. miRNAsong: a web-based tool for generation and testing of miRNA sponge constructs in silico

    PubMed Central

    Barta, Tomas; Peskova, Lucie; Hampl, Ales

    2016-01-01

    MicroRNA (miRNA) sponges are RNA transcripts containing multiple high-affinity binding sites that associate with and sequester specific miRNAs to prevent them from interacting with their target messenger (m)RNAs. Due to the high specificity of miRNA sponges and strong inhibition of target miRNAs, these molecules have become increasingly applied in miRNA loss-of-function studies. However, improperly designed sponge constructs may sequester off-target miRNAs; thus, it has become increasingly important to develop a tool for miRNA sponge construct design and testing. In this study, we introduce microRNA sponge generator and tester (miRNAsong), a freely available web-based tool for generation and in silico testing of miRNA sponges. This tool generates miRNA sponge constructs for specific miRNAs and miRNA families/clusters and tests them for potential binding to miRNAs in selected organisms. Currently, miRNAsong allows for testing of sponge constructs in 219 species covering 35,828 miRNA sequences. Furthermore, we also provide an example, supplemented with experimental data, of how to use this tool. Using miRNAsong, we designed and tested a sponge for miR-145 inhibition, and cloned the sequence into an inducible lentiviral vector. We found that established cell lines expressing miR-145 sponge strongly inhibited miR-145, thus demonstrating the usability of miRNAsong tool for sponge generation. URL: http://www.med.muni.cz/histology/miRNAsong/. PMID:27857164

  17. Conceptual Systems Model as a Tool for Hypothesis Generation and Testing in Ecotoxicological Research

    EPA Science Inventory

    Microarray, proteomic, and metabonomic technologies are becoming increasingly accessible as tools for ecotoxicology research. Effective use of these technologies will depend, at least in part, on the ability to apply these techniques within a paradigm of hypothesis driven researc...

  18. Development of Next Generation Synthetic Biology Tools for Use in Streptomyces venezuelae

    SciTech Connect

    Phelan, Ryan M.; Sachs, Daniel; Petkiewicz, Shayne J.; Barajas, Jesus F.; Blake-Hedges, Jacquelyn M.; Thompson, Mitchell G.; Reider Apel, Amanda; Rasor, Blake J.; Katz, Leonard; Keasling, Jay D.

    2016-09-07

    Streptomyces have a rich history as producers of important natural products and this genus of bacteria has recently garnered attention for its potential applications in the broader context of synthetic biology. However, the dearth of genetic tools available to control and monitor protein production precludes rapid and predictable metabolic engineering that is possible in hosts such as Escherichia coli or Saccharomyces cerevisiae. In an effort to improve genetic tools for Streptomyces venezuelae, we developed a suite of standardized, orthogonal integration vectors and an improved method to monitor protein production in this host. These tools were applied to characterize heterologous promoters and various attB chromosomal integration sites. A final study leveraged the characterized toolset to demonstrate its use in producing the biofuel precursor bisabolene using a chromosomally integrated expression system. In conclusion, these tools advance S. venezuelae to be a practical host for future metabolic engineering efforts.

  19. Development of Next Generation Synthetic Biology Tools for Use in Streptomyces venezuelae.

    PubMed

    Phelan, Ryan M; Sachs, Daniel; Petkiewicz, Shayne J; Barajas, Jesus F; Blake-Hedges, Jacquelyn M; Thompson, Mitchell G; Reider Apel, Amanda; Rasor, Blake J; Katz, Leonard; Keasling, Jay D

    2017-01-20

    Streptomyces have a rich history as producers of important natural products and this genus of bacteria has recently garnered attention for its potential applications in the broader context of synthetic biology. However, the dearth of genetic tools available to control and monitor protein production precludes rapid and predictable metabolic engineering that is possible in hosts such as Escherichia coli or Saccharomyces cerevisiae. In an effort to improve genetic tools for Streptomyces venezuelae, we developed a suite of standardized, orthogonal integration vectors and an improved method to monitor protein production in this host. These tools were applied to characterize heterologous promoters and various attB chromosomal integration sites. A final study leveraged the characterized toolset to demonstrate its use in producing the biofuel precursor bisabolene using a chromosomally integrated expression system. These tools advance S. venezuelae to be a practical host for future metabolic engineering efforts.

  1. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  2. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 2 2011-07-01 2011-07-01 false Appeal Deciding Officer. 215.8 Section 215.8 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES § 215.8 Appeal Deciding...

  3. Career Cruising Impact on the Self Efficacy of Deciding Majors

    ERIC Educational Resources Information Center

    Smother, Anthony William

    2012-01-01

    The purpose of this study was to analyze the impact of "Career Cruising"© on the self-efficacy of deciding majors in a university setting. The self-assessment instrument, "Career Cruising"©, was used to measure career decision-making self-efficacy in a pre- and post-test with deciding majors. The independent…

  4. The "Cannot Decide" Option in Thurstone-Type Attitude Scales.

    ERIC Educational Resources Information Center

    Madden, Theodore M.; Klopfer, Frederick J.

    1978-01-01

    Sociology students were administered two Thurstone-type attitude scales under two conditions (with or without a "cannot decide" option), and a measure of ambiguity tolerance. The "cannot decide" option was used by a slight majority of students when available, but usage was not related to ambiguity tolerance. (Author/JKS)

  5. 13 CFR 142.30 - How is the case decided?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false How is the case decided? 142.30 Section 142.30 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION PROGRAM FRAUD CIVIL REMEDIES ACT REGULATIONS Decisions and Appeals § 142.30 How is the case decided? (a) The ALJ will issue...

  6. A Study on Tooling and Its Effect on Heat Generation and Mechanical Properties of Welded Joints in Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Tikader, Sujoy; Biswas, Pankaj; Puri, Asit Baran

    2016-06-01

    Friction stir welding (FSW) has become one of the most attractive solid-state welding processes, as it offers numerous advantages such as good mechanical and metallurgical properties. Aluminium alloys that are difficult to fusion weld, such as the 5XXX and 7XXX series, can be joined simply by this process. In the present study, a mathematical model was developed and experiments were performed to evaluate the mechanical properties of FSW joints in a single aluminium alloy (AA1100) for different process parameters and two main tool geometries: a straight cylindrical pin and a conical (tapered cylindrical) pin, each with a flat shoulder. Tensile strength and microhardness of the welded samples are reported for the different process parameters. In FSW of similar alloys with a tool made of SS-310 tool steel, friction was found to be the major contributor to heat generation. Tool geometry, tool rotational speed, plunge force, and traverse speed were all seen to have significant effects on the tensile strength and hardness of the friction stir welded joints.
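
    The abstract does not state which heat-generation model was used; a standard first-order estimate for frictional heating under a flat circular shoulder (sliding contact with friction coefficient mu, contact pressure p, angular speed omega, shoulder radius R) integrates the moment of the friction stress over the contact area:

      % Frictional heating under a flat circular shoulder (sliding condition);
      % a generic textbook estimate, not necessarily the model of this paper.
      \begin{align*}
        dQ &= \omega \, r \, \tau \, dA, \qquad \tau = \mu p, \quad dA = 2\pi r \, dr \\
        Q  &= \int_{0}^{R} 2\pi \mu p \omega \, r^{2} \, dr
            = \tfrac{2}{3}\, \pi \mu p \omega R^{3}
      \end{align*}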

  7. BEST: Next-Generation Biomedical Entity Search Tool for Knowledge Discovery from Biomedical Literature

    PubMed Central

    Lee, Kyubum; Choi, Jaehoon; Kim, Seongsoon; Jeon, Minji; Lim, Sangrak; Choi, Donghee; Kim, Sunkyu; Tan, Aik-Choon

    2016-01-01

    As the volume of publications rapidly increases, searching for relevant information from the literature becomes more challenging. To complement standard search engines such as PubMed, it is desirable to have an advanced search tool that directly returns relevant biomedical entities such as targets, drugs, and mutations rather than a long list of articles. Some existing tools submit a query to PubMed and process retrieved abstracts to extract information at query time, resulting in a slow response time and limited coverage of only a fraction of the PubMed corpus. Other tools preprocess the PubMed corpus to speed up the response time; however, they are not constantly updated, and thus produce outdated results. Further, most existing tools cannot process sophisticated queries such as searches for mutations that co-occur with query terms in the literature. To address these problems, we introduce BEST, a biomedical entity search tool. BEST returns, as a result, a list of 10 different types of biomedical entities including genes, diseases, drugs, targets, transcription factors, miRNAs, and mutations that are relevant to a user’s query. To the best of our knowledge, BEST is the only system that processes free text queries and returns up-to-date results in real time including mutation information in the results. BEST is freely accessible at http://best.korea.ac.kr. PMID:27760149

  8. BEST: Next-Generation Biomedical Entity Search Tool for Knowledge Discovery from Biomedical Literature.

    PubMed

    Lee, Sunwon; Kim, Donghyeon; Lee, Kyubum; Choi, Jaehoon; Kim, Seongsoon; Jeon, Minji; Lim, Sangrak; Choi, Donghee; Kim, Sunkyu; Tan, Aik-Choon; Kang, Jaewoo

    2016-01-01

    As the volume of publications rapidly increases, searching for relevant information from the literature becomes more challenging. To complement standard search engines such as PubMed, it is desirable to have an advanced search tool that directly returns relevant biomedical entities such as targets, drugs, and mutations rather than a long list of articles. Some existing tools submit a query to PubMed and process retrieved abstracts to extract information at query time, resulting in a slow response time and limited coverage of only a fraction of the PubMed corpus. Other tools preprocess the PubMed corpus to speed up the response time; however, they are not constantly updated, and thus produce outdated results. Further, most existing tools cannot process sophisticated queries such as searches for mutations that co-occur with query terms in the literature. To address these problems, we introduce BEST, a biomedical entity search tool. BEST returns, as a result, a list of 10 different types of biomedical entities including genes, diseases, drugs, targets, transcription factors, miRNAs, and mutations that are relevant to a user's query. To the best of our knowledge, BEST is the only system that processes free text queries and returns up-to-date results in real time including mutation information in the results. BEST is freely accessible at http://best.korea.ac.kr.

  9. BEAT: A Web-Based Boolean Expression Fault-Based Test Case Generation Tool

    ERIC Educational Resources Information Center

    Chen, T. Y.; Grant, D. D.; Lau, M. F.; Ng, S. P.; Vasa, V. R.

    2006-01-01

    BEAT is a Web-based system that generates fault-based test cases from Boolean expressions. It is based on the integration of several of our fault-based test case selection strategies. The generated test cases are considered fault-based because they aim at the detection of particular faults. For example, when the Boolean expression is in…
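
    The core notion of fault-based test generation can be sketched in a few lines of Python: a test case is useful if the original expression and a faulty mutant evaluate differently on it. The expressions and the fault below are illustrative only; BEAT integrates several specific fault-class strategies not shown here.

      # Hypothetical sketch: find inputs that distinguish a Boolean expression
      # from a faulty mutant (here, a literal-negation fault on `a`).
      from itertools import product

      def detecting_tests(original, mutant, n_vars):
          """All truth assignments on which the mutant behaves differently."""
          return [
              bits for bits in product([False, True], repeat=n_vars)
              if original(*bits) != mutant(*bits)
          ]

      orig = lambda a, b, c: a and (b or c)
      mut  = lambda a, b, c: (not a) and (b or c)
      print(detecting_tests(orig, mut, 3))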

  11. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. In this context, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
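
    The input format described (unique reads with their copy numbers) can be produced from a raw small-RNA FASTQ file in a few lines of Python; the snippet below is a hypothetical preprocessing sketch, not part of miRanalyzer itself.

      # Collapse a FASTQ file into unique read sequences with copy numbers.
      from collections import Counter

      def collapse_fastq(path):
          counts = Counter()
          with open(path) as fh:
              for i, line in enumerate(fh):
                  if i % 4 == 1:          # in FASTQ, every 4th line is the sequence
                      counts[line.strip()] += 1
          return counts

      # for seq, n in collapse_fastq("reads.fastq").most_common():
      #     print(f"{seq}\t{n}")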

  12. Virtual tool mark generation for efficient striation analysis in forensic science

    SciTech Connect

    Ekstrand, Laura

    2012-01-01

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles.
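
    A minimal NumPy sketch of the virtual-marking step is given below: rotate the scanned tip point cloud to a candidate angle, then take its lower envelope across the mark width; that profile plays the role of the virtual tool mark. The array layout, binning scheme, and function names are assumptions for illustration, not the thesis code.

      import numpy as np

      def virtual_mark(points, angle_deg, n_bins=500):
          """points: (N, 3) array (x across the mark, y along travel, z depth)."""
          t = np.radians(angle_deg)
          rot = np.array([[1, 0, 0],                    # rotate about the x-axis
                          [0, np.cos(t), -np.sin(t)],
                          [0, np.sin(t),  np.cos(t)]])
          p = points @ rot.T
          bins = np.linspace(p[:, 0].min(), p[:, 0].max(), n_bins + 1)
          idx = np.clip(np.digitize(p[:, 0], bins) - 1, 0, n_bins - 1)
          profile = np.full(n_bins, np.nan)
          for b in range(n_bins):                       # deepest point per bin
              z = p[idx == b, 2]
              if z.size:
                  profile[b] = z.min()
          return profile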

  13. The DECIDE evidence to recommendation framework adapted to the public health field in Sweden.

    PubMed

    Guldbrandsson, Karin; Stenström, Nils; Winzer, Regina

    2016-12-01

    Organizations worldwide compile results from scientific studies, and grade the evidence of interventions, in order to assist policy makers. However, quality of evidence alone is seldom sufficient to make a recommendation. The Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (DECIDE) framework aims to facilitate decision making and to improve dissemination and implementation of recommendations in the healthcare and public health sector. The aim of this study was to investigate whether the DECIDE framework is applicable in the public health field in Sweden. The DECIDE framework was presented and discussed in interviews with stakeholders and governmental organizations and tested in panels. Content analyses were performed. In general, the informants were positive toward the DECIDE framework. However, two questions, the first regarding individual autonomy and the second regarding method sustainability, were felt by the stakeholders to be missing from the framework. The importance of the composition of the DECIDE stakeholder panel was raised by the informants, as was the significant role of the chair. Further, the informants raised concerns about the general lack of research evidence based on RCT designs regarding universal methods in the public health sector. Finally, the local, regional and national levels' responsibility for dissemination and implementation of recommendations was also raised by the informants. The DECIDE framework might be useful as a tool for dissemination and implementation of recommendations in the public health field in Sweden. Important questions for further research are whether these findings hold for other public health topics and in other public health settings. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. MAKER2: an annotation pipeline and genome-database management tool for second-generation genome projects.

    PubMed

    Holt, Carson; Yandell, Mark

    2011-12-22

    Second-generation sequencing technologies are precipitating major shifts with regards to what kinds of genomes are being sequenced and how they are annotated. While the first generation of genome projects focused on well-studied model organisms, many of today's projects involve exotic organisms whose genomes are largely terra incognita. This complicates their annotation, because unlike first-generation projects, there are no pre-existing 'gold-standard' gene-models with which to train gene-finders. Improvements in genome assembly and the wide availability of mRNA-seq data are also creating opportunities to update and re-annotate previously published genome annotations. Today's genome projects are thus in need of new genome annotation tools that can meet the challenges and opportunities presented by second-generation sequencing technologies. We present MAKER2, a genome annotation and data management tool designed for second-generation genome projects. MAKER2 is a multi-threaded, parallelized application that can process second-generation datasets of virtually any size. We show that MAKER2 can produce accurate annotations for novel genomes where training-data are limited, of low quality or even non-existent. MAKER2 also provides an easy means to use mRNA-seq data to improve annotation quality; and it can use these data to update legacy annotations, significantly improving their quality. We also show that MAKER2 can evaluate the quality of genome annotations, and identify and prioritize problematic annotations for manual review. MAKER2 is the first annotation engine specifically designed for second-generation genome projects. MAKER2 scales to datasets of any size, requires little in the way of training data, and can use mRNA-seq data to improve annotation quality. It can also update and manage legacy genome annotation datasets.

  15. MAKER2: an annotation pipeline and genome-database management tool for second-generation genome projects

    PubMed Central

    2011-01-01

    Background Second-generation sequencing technologies are precipitating major shifts with regards to what kinds of genomes are being sequenced and how they are annotated. While the first generation of genome projects focused on well-studied model organisms, many of today's projects involve exotic organisms whose genomes are largely terra incognita. This complicates their annotation, because unlike first-generation projects, there are no pre-existing 'gold-standard' gene-models with which to train gene-finders. Improvements in genome assembly and the wide availability of mRNA-seq data are also creating opportunities to update and re-annotate previously published genome annotations. Today's genome projects are thus in need of new genome annotation tools that can meet the challenges and opportunities presented by second-generation sequencing technologies. Results We present MAKER2, a genome annotation and data management tool designed for second-generation genome projects. MAKER2 is a multi-threaded, parallelized application that can process second-generation datasets of virtually any size. We show that MAKER2 can produce accurate annotations for novel genomes where training-data are limited, of low quality or even non-existent. MAKER2 also provides an easy means to use mRNA-seq data to improve annotation quality; and it can use these data to update legacy annotations, significantly improving their quality. We also show that MAKER2 can evaluate the quality of genome annotations, and identify and prioritize problematic annotations for manual review. Conclusions MAKER2 is the first annotation engine specifically designed for second-generation genome projects. MAKER2 scales to datasets of any size, requires little in the way of training data, and can use mRNA-seq data to improve annotation quality. It can also update and manage legacy genome annotation datasets. PMID:22192575

  16. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing

    PubMed Central

    2014-01-01

    Background The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. Results Here, we provide a cost effective method for the production of 96-well plate compatible zebrafish orientation tools using a desktop 3D printer. The printed tools enable the positioning and orientation of zebrafish embryos within cavities formed in agarose. Their applicability is demonstrated by acquiring lateral and dorsal views of zebrafish embryos arrayed within microtiter plates using an automated screening microscope. This enables the consistent visualization of morphological phenotypes and reporter gene expression patterns. Conclusions The designs are refined versions of previously demonstrated devices with added functionality and strongly reduced production costs. All corresponding 3D models are freely available and digital design can be easily shared electronically. In combination with the increasingly widespread usage of 3D printers, this provides access to the developed tools to a wide range of zebrafish users. Finally, the design files can serve as templates for other additive and subtractive fabrication methods. PMID:24886511

  17. Eastern Regional Technical Advisory Committee (ERTAC) Electricity Generating Unit (EGU) Emission Projection Tool (2015 EIC)

    EPA Pesticide Factsheets

    Class content will include examples of how the Tool may be applied to calculate the impacts of various air pollution control regulations (for example, the Mercury and Air Toxics Rule) on future-year activity as well as NOx, SO2, and CO2 emissions.

  18. WORD STATISTICS IN THE GENERATION OF SEMANTIC TOOLS FOR INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    STONE, DON C.

    One of the problems in information storage and retrieval systems of technical documents is the interpretation of words used to index documents. Semantic tools, defined as channels for the communication of word meanings between technical experts, document indexers, and searchers, provide one method of dealing with the problem of multiple…

  19. DRIVE Analysis Tool Generates Custom Vehicle Drive Cycles Based on Real-World Data (Fact Sheet)

    SciTech Connect

    Not Available

    2013-04-01

    This fact sheet from the National Renewable Energy Laboratory describes the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) analysis tool, which uses GPS and controller area network data to characterize vehicle operation and produce custom vehicle drive cycles, analyzing thousands of hours of data in a matter of minutes.

  20. Generating and Analyzing Visual Representations of Conic Sections with the Use of Technological Tools

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Espinosa-Perez, Hugo; Reyes-Rodriguez, Aaron

    2006-01-01

    Technological tools have the potential to offer students the possibility to represent information and relationships embedded in problems and concepts in ways that involve numerical, algebraic, geometric, and visual approaches. In this paper, the authors present and discuss an example in which an initial representation of a mathematical object…

  2. EDCATS: An Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Heard, Pamala D.

    1998-01-01

    The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system is equipped to provide on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data that are tailored to Marshall Space Flight Center Unique Programs. The research also attempts to establish how, why, and where information is derived. As a result, a user will be better prepared to decide which available tool is the most feasible for their reports.

  3. US Greenhouse Gas (GHG) Emissions and Avoided Emissions and Generation Tool Training (AVERT) (2015 EIC)

    EPA Pesticide Factsheets

    AVERT captures the actual historical behavior of electricity generating units' (EGUs') operation on an hourly basis to predict how EGUs will operate with additional EE/RE delivered to the electricity grid.

  4. NullSeq: A Tool for Generating Random Coding Sequences with Desired Amino Acid and GC Contents

    PubMed Central

    Liu, Sophia S.; Hockenberry, Adam J.; Lancichinetti, Andrea; Jewett, Michael C.

    2016-01-01

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. In order to accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. While many tools have been developed to create random nucleotide sequences, protein coding sequences are subject to a unique set of constraints that complicates the process of generating appropriate null models. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content for the purpose of hypothesis testing. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content, which we have developed into a python package. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. Furthermore, this approach can easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes as well as more effective engineering of biological systems. PMID:27835644
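
    The maximum-entropy construction mentioned above can be sketched by exponential tilting: weight each synonymous codon by exp(lambda * GC) and tune lambda until the expected GC content hits the target. The miniature codon table and function names below are illustrative assumptions, not NullSeq's implementation.

      import math, random

      CODONS = {"M": ["ATG"], "K": ["AAA", "AAG"],
                "G": ["GGT", "GGC", "GGA", "GGG"]}      # toy table, not the full code

      def gc(codon):
          return sum(b in "GC" for b in codon) / 3

      def expected_gc(protein, lam):
          """Mean GC of the tilted codon distribution, averaged over the protein."""
          tot = 0.0
          for aa in protein:
              w = [math.exp(lam * gc(c)) for c in CODONS[aa]]
              tot += sum(wi * gc(c) for wi, c in zip(w, CODONS[aa])) / sum(w)
          return tot / len(protein)

      def fit_lambda(protein, target, lo=-20.0, hi=20.0):
          for _ in range(60):        # bisection; E[GC] is monotone in lambda
              mid = (lo + hi) / 2
              lo, hi = (mid, hi) if expected_gc(protein, mid) < target else (lo, mid)
          return (lo + hi) / 2

      protein = "MKG"
      lam = fit_lambda(protein, target=0.55)
      print("".join(random.choices(CODONS[aa],
                    weights=[math.exp(lam * gc(c)) for c in CODONS[aa]])[0]
                    for aa in protein))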

  5. Free Tools and Strategies for the Generation of 3D Finite Element Meshes: Modeling of the Cardiac Structures

    PubMed Central

    Pavarino, E.; Neves, L. A.; Machado, J. M.; de Godoy, M. F.; Shiyou, Y.; Momente, J. C.; Zafalon, G. F. D.; Pinto, A. R.; Valêncio, C. R.

    2013-01-01

    The Finite Element Method (FEM) is a well-known technique that is extensively applied in different areas. Studies using FEM are targeted at improving cardiac ablation procedures. For such simulations, the finite element meshes should consider the size and histological features of the target structures. However, some methods or tools used to generate meshes of human body structures are still limited, owing to nondetailed models, nontrivial preprocessing, or, above all, restrictive conditions of use. In this paper, alternatives are demonstrated for solid modeling and automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context. PMID:23762031

  6. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Technical Reports Server (NTRS)

    Boyer, Jeffrey S.

    1994-01-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs were developed under technological, political, and fiscal constraints which limited their adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or on-going observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video presentation demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  7. Microfluidic dissolved oxygen gradient generator biochip as a useful tool in bacterial biofilm studies.

    PubMed

    Skolimowski, Maciej; Nielsen, Martin Weiss; Emnéus, Jenny; Molin, Søren; Taboryski, Rafael; Sternberg, Claus; Dufva, Martin; Geschke, Oliver

    2010-08-21

    A microfluidic chip for the generation of dissolved oxygen gradients was designed, fabricated and tested. A novel approach, active oxygen depletion through a gas-permeable membrane, was applied. Numerical simulations of O2 gradient generation were correlated with measured oxygen concentrations. The developed microsystem was used to study growth patterns of the bacterium Pseudomonas aeruginosa in medium with different oxygen concentrations. The results showed that attachment of Pseudomonas aeruginosa to the substrate changed with oxygen concentration. This demonstrates that the device can be used for studies requiring controlled oxygen levels and for future studies of microaerobic and anaerobic conditions.

  8. Optically active sum-frequency generation as an advanced tool for chiral metallopolymer material

    NASA Astrophysics Data System (ADS)

    Taupier, Grégory; Torres-Werlé, Maria; Boeglin, Alex; Maisse-François, Aline; Achard, Thierry; Bellemin-Laponnaz, Stéphane; Dorkenoo, Kokou Dodzi Honorat

    2017-01-01

    Metallopolymers incorporating metal ions and polytopic ligands offer the advantage of being easily obtained through a self-assembly process in solution and hold great prospects for the development of multifunctional, smart, even self-healing materials. We have found that chiral enantiopure ligands containing bis(oxazoline) units, in combination with Ni(II) salts, generate well-defined thin films either by drop casting or by spin-coating, and we demonstrate that the condensation process of these chiral metallosupramolecular assemblies can be characterized through optically active sum-frequency generation.

  9. ORIGEN-ARP, A Fast and Easy-to-Use Source Term Generation Tool

    SciTech Connect

    Bowman, S.M.; Hermann, O.W.; Leal, L.C.; Parks, C.V.

    1999-10-17

    ORIGEN-ARP is a new SCALE analytical sequence for spent fuel characterization and source term generation that serves as a faster alternative to the SAS2H sequence by using the Automatic Rapid Processing (ARP) methodology for generating problem-dependent ORIGEN-S cross-section libraries. ORIGEN-ARP provides an easy-to-use menu-driven input processor. This new sequence is two orders of magnitude faster than SAS2H while conserving the rigor and accuracy of the SAS2H methodology. ORIGEN-ARP has been validated against pressurized water reactor (PWR) and boiling water reactor (BWR) spent fuel chemical assay data.

  10. Considerations in Deciding to Treat Contaminated Unsaturated Soils In Situ

    EPA Pesticide Factsheets

    The purpose of this Issue Paper is to assist the user in deciding if in situ treatment of contaminated soil is a potentially feasible remedial alternative and to assist in the process of reviewing and screening in situ technologies.

  11. Quality of Judgment and Deciding Rightness: Ethics and Educational Administration.

    ERIC Educational Resources Information Center

    Corson, David

    1985-01-01

    Challenges certain theoretical assumptions underlying educational administration. Demands critical thinking about ethical aspects, particularly the relationship between "quality of judgement" and "deciding rightness." Proposes an ethics program for school administrators incorporating values reflected in schools as a social…

  12. Deciding If Double Knee Replacement Is Right for You

    MedlinePlus

    Deciding If Double Knee Replacement Is Right for You. For certain patients, the 2-in- ... to their normal lives faster, "and for the right patient, it's a good option," said Westrich. He ...

  13. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    NASA Astrophysics Data System (ADS)

    Faqih, A.

    2017-03-01

    Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias corrected in order to obtain scenario data with a better spatial resolution that matches the characteristics of the observed data. Generating these downscaled data is often difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex data from GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data that can be used for their climate change-related studies. In this paper, we introduce a tool called "Statistical Bias Correction for Climate Scenarios (SiBiaS)". The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and process their statistical bias corrections relative to reference observations. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
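
    The generic form of such a correction is empirical quantile mapping: each future model value is mapped through the model-historical quantiles and read off against the observed quantiles. The sketch below illustrates this idea with synthetic data; SiBiaS's exact method and interface may differ.

      import numpy as np

      def quantile_map(model_hist, obs, model_future):
          """Empirical quantile mapping of future model values onto the observed scale."""
          q = np.linspace(0, 1, 101)
          mh_q, ob_q = np.quantile(model_hist, q), np.quantile(obs, q)
          ranks = np.interp(model_future, mh_q, q)   # value -> quantile in the model
          return np.interp(ranks, q, ob_q)           # quantile -> observed scale

      rng = np.random.default_rng(0)
      obs = rng.gamma(2.0, 5.0, 1000)    # "observed" series
      mh  = rng.gamma(2.0, 7.0, 1000)    # biased model output, historical period
      mf  = rng.gamma(2.2, 7.0, 1000)    # model output, future scenario
      print(quantile_map(mh, obs, mf)[:5])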

  14. Understanding the Generative Capacity of Analogies as a Tool for Explanation.

    ERIC Educational Resources Information Center

    Wong, David E.

    1993-01-01

    Examines analogical reasoning in contexts where understanding is generated from loosely organized, incomplete prior knowledge rather than transferred from a well-structured domain of understanding. Participants, consisting of 11 secondary school science teacher candidates, were presented with a piston/cylinder device and asked to explain the…

  15. Self-Generated Analogies as a Tool for Constructing and Evaluating Explanations of Scientific Phenomena.

    ERIC Educational Resources Information Center

    Wong, E. David

    1993-01-01

    Investigates whether students can use a series of self-generated analogies to bring about change in their understanding of a given scientific phenomenon and the nature of the change in the understanding. Eleven individuals in a teacher education program were audiotaped and videotaped as they attempted to understand a piston/cylinder device. (PR)

  16. Arkose: A Prototype Mechanism and Tool for Collaborative Information Generation and Distillation

    ERIC Educational Resources Information Center

    Nam, Kevin Kyung

    2010-01-01

    The goals of this thesis have been to gain a better understanding of collaborative knowledge sharing and distilling and to build a prototype collaborative system that supports flexible knowledge generation and distillation. To reach these goals, I have conducted two user studies and built two systems. The first system, Arkose 1.0, is a…

  17. HGT-Gen: a tool for generating a phylogenetic tree with horizontal gene transfer.

    PubMed

    Horiike, Tokumasa; Miyata, Daisuke; Tateno, Yoshio; Minai, Ryoichi

    2011-01-01

    Horizontal gene transfer (HGT) is a common event in prokaryotic evolution. Therefore, it is very important to consider HGT in the study of the molecular evolution of prokaryotes. This is true also for conducting computer simulations of their molecular phylogeny, because HGT is known to be a serious disturbing factor for estimating their correct phylogeny. To the best of our knowledge, no existing computer program has generated a phylogenetic tree with HGT from an original phylogenetic tree. We developed a program called HGT-Gen that generates a phylogenetic tree with HGT on the basis of an original phylogenetic tree of a protein or gene. HGT-Gen moves an operational taxonomic unit or a clade from one place to another in a given phylogenetic tree. We have also devised an algorithm to compute the average length between any pair of branches in the tree. It defines and computes the relative evolutionary time to normalize evolutionary time for each lineage. The algorithm can generate an HGT between a pair of donor and acceptor lineages at the same relative evolutionary time. HGT-Gen is used with a sequence-generating program to evaluate the influence of HGT on the molecular phylogeny of prokaryotes in computer simulation studies. The program is freely available at http://www.grl.shizuoka.ac.jp/˜thoriike/HGT-Gen.html.
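
    The tree operation at the heart of such a tool, detaching a donor clade and re-attaching it next to an acceptor lineage, can be sketched compactly. The toy Node class below is a hypothetical illustration that ignores branch lengths and the relative-evolutionary-time constraint HGT-Gen enforces.

      class Node:
          def __init__(self, name=None, children=None):
              self.name, self.children = name, children or []

          def newick(self):
              if not self.children:
                  return self.name
              return "(" + ",".join(c.newick() for c in self.children) + ")"

      def transfer(root, donor, acceptor):
          """Move `donor` so it becomes sister to `acceptor` (mimicking an HGT)."""
          def detach(node):
              for c in node.children:
                  if c is donor:
                      node.children.remove(c)
                      return True
                  if detach(c):
                      return True
              return False

          def graft(node):
              for i, c in enumerate(node.children):
                  if c is acceptor:
                      node.children[i] = Node(children=[donor, acceptor])
                      return True
                  if graft(c):
                      return True
              return False

          detach(root)
          graft(root)

      a, b, c, d = (Node(x) for x in "ABCD")
      tree = Node(children=[Node(children=[a, b]), Node(children=[c, d])])
      transfer(tree, donor=a, acceptor=d)
      print(tree.newick() + ";")    # ((B),(C,(A,D)));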

  19. Photography as a Data Generation Tool for Qualitative Inquiry in Education.

    ERIC Educational Resources Information Center

    Cappello, Marva

    This paper discusses the ways in which photography was used for data generation in a 9-month qualitative study on a mixed-age elementary school classroom. Through a review of the research literature in anthropology, sociology, and education, and an analysis of the research data, the usefulness of photography for educational research with young…

  20. Messaging, Gaming, Peer-to-Peer Sharing: Language Learning Strategies & Tools for the Millennial Generation

    ERIC Educational Resources Information Center

    Godwin-Jones, Bob

    2005-01-01

    The next generation's enthusiasm for instant messaging, videogames, and peer-to-peer file swapping is likely to be dismissed by their elders as so many ways to waste time and avoid the real worlds of work or school. But these activities may not be quite as vapid as they may seem from the perspective of outsiders--or educators. Researchers point…

  2. Rational protein design: developing next-generation biological therapeutics and nanobiotechnological tools.

    PubMed

    Wilson, Corey J

    2015-01-01

    Proteins are the most functionally diverse macromolecules observed in nature, participating in a broad array of catalytic, biosensing, transport, scaffolding, and regulatory functions. Fittingly, proteins have become one of the most promising nanobiotechnological tools to date, and through the use of recombinant DNA and other laboratory methods we have produced a vast number of biological therapeutics derived from human genes. Our emerging ability to rationally design proteins (e.g., via computational methods) holds the promise of significantly expanding the number and diversity of protein therapies and has opened the gateway to realizing true and uncompromised personalized medicine. In the last decade computational protein design has been transformed from a set of fundamental strategies to stringently test our understanding of the protein structure-function relationship, to practical tools for developing useful biological processes, nano-devices, and novel therapeutics. As protein design strategies improve (i.e., in terms of accuracy and efficiency) clinicians will be able to leverage individual genetic data and biological metrics to develop and deliver personalized protein therapeutics with minimal delay. © 2014 Wiley Periodicals, Inc.

  3. Environmental epigenetics: A promising venue for developing next-generation pollution biomonitoring tools in marine invertebrates.

    PubMed

    Suarez-Ulloa, Victoria; Gonzalez-Romero, Rodrigo; Eirin-Lopez, Jose M

    2015-09-15

    Environmental epigenetics investigates the cause-effect relationships between specific environmental factors and the subsequent epigenetic modifications triggering adaptive responses in the cell. Given the dynamic and potentially reversible nature of the different types of epigenetic marks, environmental epigenetics constitutes a promising venue for developing fast and sensitive biomonitoring programs. Indeed, several epigenetic biomarkers have been successfully developed and applied in traditional model organisms (e.g., human and mouse). Nevertheless, the lack of epigenetic knowledge in other ecologically and environmentally relevant organisms has hampered the application of these tools in a broader range of ecosystems, most notably in the marine environment. Fortunately, that scenario is now changing thanks to the growing availability of complete reference genome sequences along with the development of high-throughput DNA sequencing and bioinformatic methods. Altogether, these resources make the epigenetic study of marine organisms (and more specifically marine invertebrates) a reality. By building on this knowledge, the present work provides a timely perspective highlighting the extraordinary potential of environmental epigenetic analyses as a promising source of rapid and sensitive tools for pollution biomonitoring, using marine invertebrates as sentinel organisms. This strategy represents an innovative, groundbreaking approach, improving the conservation and management of natural resources in the oceans. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. An Automated and Minimally Invasive Tool for Generating Autologous Viable Epidermal Micrografts

    PubMed Central

    Osborne, Sandra N.; Schmidt, Marisa A.; Harper, John R.

    2016-01-01

    ABSTRACT OBJECTIVE: A new epidermal harvesting tool (CelluTome; Kinetic Concepts, Inc, San Antonio, Texas) created epidermal micrografts with minimal donor site damage, increased expansion ratios, and did not require the use of an operating room. The tool, which applies both heat and suction concurrently to normal skin, was used to produce epidermal micrografts that were assessed for uniform viability, donor-site healing, and discomfort during and after the epidermal harvesting procedure. DESIGN: This study was a prospective, noncomparative institutional review board–approved healthy human study to assess epidermal graft viability, donor-site morbidity, and patient experience. SETTING: These studies were conducted at the multispecialty research facility, Clinical Trials of Texas, Inc, San Antonio. PATIENTS: The participants were 15 healthy human volunteers. RESULTS: The average viability of epidermal micrografts was 99.5%. Skin assessment determined that 76% to 100% of the area of all donor sites was the same in appearance as the surrounding skin within 14 days after epidermal harvest. A mean pain of 1.3 (on a scale of 1 to 5) was reported throughout the harvesting process. CONCLUSIONS: Use of this automated, minimally invasive harvesting system provided a simple, low-cost method of producing uniformly viable autologous epidermal micrografts with minimal patient discomfort and superficial donor-site wound healing within 2 weeks. PMID:26765157

  5. Guiding health care transformation: A next-generation, diagnostic remediation tool for leveraging polarities.

    PubMed

    Wesorick, Bonnie; Shaha, Steve

    2015-01-01

    Health care reform is optimized through the Polarity Thinking Model to achieve and sustain improvements in cost, safety, quality, and efficiency. Traditional problem-solving "fix-it" approaches have histories of inadequacy and failure in addressing the multiple polarities inherent in health care transformation. The Polarity Thinking Model is reviewed followed by a study conducted to establish validity and reliability for diagnostic assessment and remediation through leveraging polarities. Thirteen common health care polarities were identified by an International Consortium, each needing to be leveraged or managed within an organization engaged in health care transformation. A Web-based survey tool was designed to provide leaders with readily interpretable diagnostic information for organizational evaluation regarding how well each polarity is being managed. Four hundred ninety-seven volunteers from two American and two Canadian acute care organizations participated. Content and context validity were established, and statistically significant reliabilities for the survey instrument were verified. Interpretations of study findings verify the assessment accuracy and interpretive value of the information for leading organizational optimization. Employing the Polarity Thinking Model and tools to evaluate how well organizations are managing polarities will enhance the organization's ability to self-diagnose and then succeed in achieving transformation toward sustainable desired outcomes. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. M.mode.ify: A Free Online Tool to Generate Post Hoc M-Mode Images From Any Ultrasound Clip.

    PubMed

    Smith, Benjamin C; Avila, Jacob

    2016-02-01

    We present a software tool designed to generate an M-mode image post hoc from any B-mode ultrasound clip, along any possible axis. M.mode.ify works by breaking down an ultrasound clip into individual frames. It then rotates and crops these frames by using a user-selected M-mode line. The post hoc M-mode image is created by splicing these frames together. Users can measure time and distance after proper calibration through the M.mode.ify interface. This tool opens up new possibilities for clinical application, quality assurance, and research. It is available free for public use at http://www.ultrasoundoftheweek.com/M.mode.ify/.
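
    The frame-splicing procedure described above translates almost directly into code: sample the pixel intensities under the chosen M-mode line in every frame and stack them as columns over time. The OpenCV sketch below is a hypothetical re-implementation of the idea, not M.mode.ify's source.

      import cv2
      import numpy as np

      def post_hoc_mmode(video_path, p0, p1, n_samples=256):
          """p0, p1: (x, y) endpoints of the user-selected M-mode line."""
          xs = np.linspace(p0[0], p1[0], n_samples).astype(int)
          ys = np.linspace(p0[1], p1[1], n_samples).astype(int)
          cap, columns = cv2.VideoCapture(video_path), []
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
              columns.append(gray[ys, xs])      # intensity profile along the line
          cap.release()
          return np.stack(columns, axis=1)      # rows = depth, columns = time

      # mmode = post_hoc_mmode("clip.mp4", p0=(120, 40), p1=(120, 300))
      # cv2.imwrite("mmode.png", mmode)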

  7. Generation of histo-anatomically representative models of the individual heart: tools and application

    PubMed Central

    Plank, Gernot; Burton, Rebecca A. B.; Hales, Patrick; Bishop, Martin; Mansoori, Tahir; Bernabeu, Miguel; Garny, Alan; Prassl, Anton J.; Bollensdorff, Christian; Mason, Fleur; Mahmood, Fahd; Rodriguez, Blanca; Grau, Vicente; Schneider, Jürgen E.; Gavaghan, David; Kohl, Peter

    2010-01-01

    This paper presents methods to build histo-anatomically detailed individualised cardiac models. The models are based on high-resolution 3D anatomical and/or diffusion tensor magnetic resonance images, combined with serial histological sectioning data, and are used to investigate individualised cardiac function. The current state-of-the-art is reviewed, and its limitations are discussed. We assess the challenges associated with the generation of histo-anatomically representative individualised in-silico models of the heart. The entire processing pipeline including image acquisition, image processing, mesh generation, model set-up and execution of computer simulations, and the underlying methods are described. The multi-faceted challenges associated with these goals are highlighted, suitable solutions are proposed, and an important application of developed high-resolution structure-function models in elucidating the effect of individual structural heterogeneity upon wavefront dynamics is demonstrated. PMID:19414455

  8. Protein engineering for metabolic engineering: current and next-generation tools

    PubMed Central

    Marcheschi, Ryan J.; Gronenberg, Luisa S.; Liao, James C.

    2014-01-01

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically-produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. This article reviews advances of selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use, produce non-natural amino acids, alcohols, and carboxylic acids, and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes. PMID:23589443

  9. Protein engineering for metabolic engineering: Current and next-generation tools

    SciTech Connect

    Marcheschi, RJ; Gronenberg, LS; Liao, JC

    2013-04-16

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. We review advances in selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use; produce non-natural amino acids, alcohols, and carboxylic acids; and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes.

  10. In Vitro Generated Hepatocyte-Like Cells: A Novel Tool in Regenerative Medicine and Drug Discovery.

    PubMed

    Zakikhan, Kobra; Pournasr, Behshad; Vosough, Massoud; Nassiri-Asl, Marjan

    2017-01-01

    Hepatocyte-like cells (HLCs) are generated either from various human pluripotent stem cells (hPSCs), including induced pluripotent stem cells (iPSCs) and embryonic stem cells (ESCs), or by direct cell conversion from mesenchymal stem cells as well as other stem cell sources such as gestational tissues. They provide potential cell sources for biomedical applications. Liver transplantation is the gold standard treatment for patients with end-stage liver disease, but there are many obstacles limiting this process, such as an insufficient number of donated healthy livers. Meanwhile, the number of patients receiving a liver organ transplant for a better life is increasing. In this regard, HLCs may provide an adequate cell source to overcome these shortages. New molecular engineering approaches such as the CRISPR/Cas system applied in iPSC technology provide the basic principles of gene correction for monogenic inherited metabolic liver diseases, as another application of HLCs. It has been shown that HLCs could replace primary human hepatocytes in drug discovery and hepatotoxicity tests. However, generation of fully functional HLCs is still a big challenge; several research groups have been trying to improve current differentiation protocols to achieve better HLCs in terms of cell morphology and function. Large-scale generation of functional HLCs in bioreactors could create a new opportunity to produce enough hepatocytes for treating end-stage liver patients, as well as for other biomedical applications such as drug studies. In this review, regarding the biomedical value of HLCs, we focus on current and efficient approaches for generating hepatocyte-like cells in vitro and discuss their applications in regenerative medicine and drug discovery.

  11. In Vitro Generated Hepatocyte-Like Cells: A Novel Tool in Regenerative Medicine and Drug Discovery

    PubMed Central

    Zakikhan, Kobra; Pournasr, Behshad; Vosough, Massoud; Nassiri-Asl, Marjan

    2017-01-01

    Hepatocyte-like cells (HLCs) are generated from various human pluripotent stem cells (hPSCs), including induced pluripotent stem cells (iPSCs) and embryonic stem cells (ESCs), or by direct conversion of other cell types, such as mesenchymal stem cells and stem cells from gestational tissues. They provide potential cell sources for biomedical applications. Liver transplantation is the gold-standard treatment for patients with end-stage liver disease, but many obstacles limit this process, such as the insufficient number of donated healthy livers, while the number of patients awaiting a liver transplant keeps increasing. In this regard, HLCs may provide an adequate cell source to overcome these shortages. New molecular engineering approaches, such as the CRISPR/Cas system applied in iPSC technology, provide the basic principles of gene correction for monogenic inherited metabolic liver diseases, another application of HLCs. It has been shown that HLCs could replace primary human hepatocytes in drug discovery and hepatotoxicity tests. However, generation of fully functional HLCs is still a major challenge; several research groups have been trying to improve current differentiation protocols to achieve better HLCs in terms of cell morphology and function. Large-scale generation of functional HLCs in bioreactors could create a new opportunity to produce enough hepatocytes for treating end-stage liver disease patients as well as for other biomedical applications such as drug studies. In this review, regarding the biomedical value of HLCs, we focus on current, efficient approaches for generating hepatocyte-like cells in vitro and discuss their applications in regenerative medicine and drug discovery. PMID:28670513

  12. Cognitive avionics and watching spaceflight crews think: generation-after-next research tools in functional neuroimaging.

    PubMed

    Genik, Richard J; Green, Christopher C; Graydon, Francis X; Armstrong, Robert E

    2005-06-01

    Confinement and isolation have always confounded the extraordinary endeavor of human spaceflight. Psychosocial health is at the forefront in considering risk factors that imperil missions of 1- to 2-yr duration. Current crewmember selection metrics restricted to behavioral observation by definition observe rather than prevent performance degradation and are thus inadequate when preflight training cannot simulate an entire journey. Nascent techniques to monitor functional and task-related cortical neural activity show promise and can be extended to include whole-brain monitoring. Watching spaceflight crews think can reveal the efficiency of training procedures. Moreover, observing subcortical emotion centers may provide early detection of developing neuropsychiatric disorders. The non-invasive functional neuroimaging modalities electroencephalography (EEG), magnetoencephalography (MEG), magnetic resonance imaging (MRI), and near-infrared spectroscopy (NIRS) are detailed, with highlights of how they may be engineered for spacecraft. Preflight and in-flight applications to crewmember behavioral health from current generation, next generation, and generation-after-next neuroscience research studies are also described. The emphasis is on preventing the onset of neuropsychiatric dysfunctions, thus reducing the risk of mission failure due to human error.

  13. Next-Generation Sequencing: A Review of Technologies and Tools for Wound Microbiome Research

    PubMed Central

    Hodkinson, Brendan P.; Grice, Elizabeth A.

    2015-01-01

    Significance: The colonization of wounds by specific microbes or communities of microbes may delay healing and/or lead to infection-related complications. Studies of wound-associated microbial communities (microbiomes) to date have primarily relied upon culture-based methods, which are known to have extreme biases and are not reliable for the characterization of microbiomes. Biofilms are very resistant to culture and are therefore especially difficult to study with techniques that remain standard in clinical settings. Recent Advances: Culture-independent approaches employing next-generation DNA sequencing have provided researchers and clinicians a window into wound-associated microbiomes that could not be achieved before and have begun to transform our view of wound-associated biodiversity. Within the past decade, many platforms have arisen for performing this type of sequencing, with various types of applications for microbiome research being possible on each. Critical Issues: Wound care incorporating knowledge of microbiomes gained from next-generation sequencing could guide clinical management and treatments. The purpose of this review is to outline the current platforms, their applications, and the steps necessary to undertake microbiome studies using next-generation sequencing. Future Directions: As DNA sequencing technology progresses, platforms will continue to produce longer reads and more reads per run at lower costs. A major future challenge is to implement these technologies in clinical settings for more precise and rapid identification of wound bioburden. PMID:25566414

  14. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation Electricore, Inc.

    SciTech Connect

    Daye, Tony

    2013-09-30

    This project enabled utilities to develop long-term strategic plans that integrate high levels of renewable energy generation and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated distribution management system (DMS). This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers; it extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings in energy marketing for day-ahead generation commitments, real-time operations, load forecasting (at an aggregate system level for day ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  15. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    NASA Astrophysics Data System (ADS)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars has the ability to map the water column along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data have the potential to address a number of research areas, including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals, and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water-column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter, and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The mid-water tools developed in this project address this problem by using a unified way of storing the water-column data in a generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water-column packets with time-based navigation and attitude, such that downstream in the workflow the tools have access to all relevant data of any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize throughput.
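
    The per-file index mentioned above — fast, non-linear lookup of any packet type — can be pictured as a one-pass scan that records the byte offset of every packet. Below is a minimal Python sketch; the packet header layout (a 1-byte type plus a 4-byte payload length) is a hypothetical stand-in, since the abstract does not specify the GWC format's internals.

```python
import struct
from collections import defaultdict

def index_packets(path):
    """Build {packet_type: [byte offsets]} over a packet-stream file in one pass.

    Assumes a hypothetical per-packet header (1-byte type + 4-byte
    little-endian payload length); the real GWC layout is not given here.
    """
    index = defaultdict(list)
    with open(path, "rb") as f:
        offset = 0
        while True:
            header = f.read(5)
            if len(header) < 5:
                break
            ptype, length = struct.unpack("<BI", header)
            index[ptype].append(offset)
            f.seek(length, 1)      # skip the payload without reading it
            offset += 5 + length
    return index

# Usage: jump straight to the third packet of (say) type 0x02
# idx = index_packets("survey.gwc"); third_offset = idx[0x02][2]
```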

  16. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    NASA Astrophysics Data System (ADS)

    Rőszer, Tamás; Pintye, Éva; Benkő, Ilona

    2008-12-01

    Transgenic mice can be used either as models of known inherited human diseases or to perform phenotypic tests of genes with unknown function. Some applications of gene modification require creating a tissue-specific mutation of a given gene. In some cases, however, the gene modification can be lethal in intrauterine life, so the mutated cells must be engrafted postnatally. Transplantation of bone marrow cells after total-body irradiation can be a way to introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is thus a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes, and the many metabolic alterations recently linked to leukocyte functions.

  17. Bone Marrow Transplantation in Mice as a Tool to Generate Genetically Modified Animals

    SciTech Connect

    Rőszer, Tamás; Pintye, Éva; Benkő, Ilona

    2008-12-08

    Transgenic mice can be used either as models of known inherited human diseases or to perform phenotypic tests of genes with unknown function. Some applications of gene modification require creating a tissue-specific mutation of a given gene. In some cases, however, the gene modification can be lethal in intrauterine life, so the mutated cells must be engrafted postnatally. Transplantation of bone marrow cells after total-body irradiation can be a way to introduce mutant hematopoietic stem cells into a mature animal. Bone marrow transplantation is thus a useful and novel tool to study the role of hematopoietic cells in the pathogenesis of inflammation, autoimmune syndromes, and the many metabolic alterations recently linked to leukocyte functions.

  18. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
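
    The inductive rule-generation step described here — learning human-readable rules from expert-classified data — can be approximated with an off-the-shelf decision tree, whose root-to-leaf paths are if-then rules. A minimal scikit-learn sketch (a generic stand-in, not the SDB's actual algorithm, which the abstract does not detail):

```python
from sklearn.datasets import load_iris              # stand-in for classified telemetry
from sklearn.tree import DecisionTreeClassifier, export_text

# Expert-classified examples: feature vectors plus class labels.
X, y = load_iris(return_X_y=True)

# Induce a compact tree; a shallow depth keeps the extracted rules readable.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Each root-to-leaf path prints as an if-then rule over the input features.
print(export_text(tree, feature_names=[f"f{i}" for i in range(X.shape[1])]))
```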

  20. Mapping Active Stream Lengths as a Tool for Understanding Spatial Variations in Runoff Generation

    NASA Astrophysics Data System (ADS)

    Erwin, E. G.; Gannon, J. P.; Zimmer, M. A.

    2016-12-01

    Recent studies have shown that temporary stream channels respond in complex ways to precipitation. By investigating how stream networks expand and recede throughout rain events, we may further develop our understanding of runoff generation. This study focused on mapping the expansion and contraction of the stream network in two headwater catchments characterized by differing soil depths and slopes, located in North Carolina, USA. The first is a 43 ha catchment located in the Southern Appalachian region, characterized by incised, steep slopes and soils of varying thickness. The second is a 3.3 ha catchment located in the Piedmont region, characterized as low relief with deep, highly weathered soils. Over a variety of flow conditions, surveys of the entire stream network were conducted at 10 m intervals to determine the presence or absence of surface water. These surveys revealed several reaches within the networks that were intermittent, with perennial flow upstream and downstream. Furthermore, in some tributaries the active stream head moved up the channel in response to precipitation, while in others it remained anchored in place. Moreover, when repeat surveys were performed during the same storm, hysteresis was observed in active stream length variations: stream length was not the same on the rising limb and falling limb of the hydrograph. These observations suggest there are different geomorphological controls or runoff generation processes occurring spatially throughout these catchments. The wide spatial and temporal variability of active stream length over a variety of flow conditions suggests that runoff dynamics, generation mechanisms, and contributing flowpath depths producing streamflow may be highly variable and not easily predicted from streamflow observations at a fixed point. Finally, the observation of similar patterns in differing geomorphic regions suggests these processes extend beyond unique site characterizations.

  1. Measuring cell-generated forces: a guide to the available tools.

    PubMed

    Polacheck, William J; Chen, Christopher S

    2016-04-28

    Forces generated by cells are critical regulators of cell adhesion, signaling, and function, and they are also essential drivers in the morphogenetic events of development. Over the past 20 years, several methods have been developed to measure these forces. However, despite recent substantial interest in understanding the contribution of these forces in biology, implementation and adoption of the developed methods by the broader biological community remain challenging because of the inherently multidisciplinary expertise required to conduct and interpret the measurements. In this review, we introduce the established methods and highlight the technical challenges associated with implementing each technique in a biological laboratory.

  2. Measuring cell-generated forces: a guide to the available tools

    PubMed Central

    Polacheck, William J.; Chen, Christopher S.

    2017-01-01

    Forces generated by cells are critical regulators of cell adhesion, signaling and function, and are essential drivers in the morphogenetic events of development. Over the past 20 years, several methods have been developed to measure these forces. Despite recent substantial interest in understanding the contribution of these forces in biology, implementation and adoption in the broader biological community remains challenging due to the inherently multidisciplinary expertise required to conduct and interpret these measurements. In this review, we introduce the established methods, and highlight the technical challenges associated with implementing each technique in a biological laboratory. PMID:27123817

  3. Fill My Datebook: a software tool to generate and handle lists of events.

    PubMed

    Lewejohann, Lars

    2008-05-01

    Electronic calendars, and especially Internet-based calendars, are becoming more and more popular. Their advantages over paper calendars include being able to easily share events with others, gain remote access, organize multiple calendars, and receive visible and audible reminders. Scientific experiments often include a huge number of events that have to be organized. Experimental schedules that follow a fixed scheme can be described as lists of events. The software application presented here allows for the easy generation, management, and storage of lists of events using the Internet-based application Google Calendar.
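
    Google Calendar can bulk-import events from a CSV file with a simple header row, so the core of such a tool — expanding a fixed experimental scheme into a dated list of events — fits in a few lines. A sketch under that assumption (the scheme, labels, and dates are illustrative; the header follows Google's CSV import convention):

```python
import csv
from datetime import date, timedelta

def schedule(start, pattern, n_cycles):
    """Expand a repeating experimental scheme into (date, label) events.

    pattern: (day_offset, label) pairs within one cycle; the cycle length
    is taken as the largest offset + 1.
    """
    cycle_len = max(off for off, _ in pattern) + 1
    for c in range(n_cycles):
        for off, label in pattern:
            yield start + timedelta(days=c * cycle_len + off), label

# Illustrative scheme: behavioral test on day 0, blood sample on day 2, 10 cycles.
events = schedule(date(2024, 1, 8), [(0, "behavioral test"), (2, "blood sample")], 10)

with open("datebook.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["Subject", "Start Date", "All Day Event"])  # Google CSV header
    for day, label in events:
        w.writerow([label, day.strftime("%m/%d/%Y"), "True"])
```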

  4. Protein engineering for metabolic engineering: current and next-generation tools.

    PubMed

    Marcheschi, Ryan J; Gronenberg, Luisa S; Liao, James C

    2013-05-01

    Protein engineering in the context of metabolic engineering is increasingly important to the field of industrial biotechnology. As the demand for biologically produced food, fuels, chemicals, food additives, and pharmaceuticals continues to grow, the ability to design and modify proteins to accomplish new functions will be required to meet the high productivity demands for the metabolism of engineered organisms. We review advances in selecting, modeling, and engineering proteins to improve or alter their activity. Some of the methods have only recently been developed for general use and are just beginning to find greater application in the metabolic engineering community. We also discuss methods of generating random and targeted diversity in proteins to generate mutant libraries for analysis. Recent uses of these techniques to alter cofactor use; produce non-natural amino acids, alcohols, and carboxylic acids; and alter organism phenotypes are presented and discussed as examples of the successful engineering of proteins for metabolic engineering purposes. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Large disclosing the nature of computational tools for the analysis of next generation sequencing data.

    PubMed

    Cordero, Francesca; Beccuti, Marco; Donatelli, Susanna; Calogero, Raffaele A

    2012-01-01

    Next-generation sequencing (NGS) technologies are rapidly changing the approach to complex genomic studies, opening the way to personalized drug development and personalized medicine. NGS technologies are characterized by massive throughput for relatively short sequences (30-100), and they are currently the most reliable and accurate method for grouping individuals on the basis of their genetic profiles. The first and crucial step in sequence analysis is the conversion of millions of short sequences (reads) into valuable genetic information by mapping them to a known (reference) genome. New computational methods, specifically designed for the type and the amount of data generated by NGS technologies, are replacing earlier widespread genome alignment algorithms, which are unable to cope with such massive amounts of data. This review provides an overview of the bioinformatics techniques that have been developed for the mapping of NGS data onto a reference genome, with a special focus on polymorphism rate and sequence error detection. The different techniques were tested on an appropriately defined dataset to investigate their relative computational costs and usability from a user perspective. Since NGS platforms interrogate the genome using either the conventional nucleotide space or the more recent color space, this review considers techniques in both nucleotide and color space, emphasizing similarities and diversities.
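
    At its simplest, the mapping step this review surveys — placing each read at its best position on a reference while tolerating a few mismatches — is a brute-force scan; production aligners replace the scan with indexed search structures (hash tables, Burrows-Wheeler indexes) to cope with billions of reads. A toy Python sketch, only to make the problem concrete:

```python
def map_read(read, reference, max_mismatches=2):
    """Return (best_position, mismatch_count) for a read against a reference,
    or None if no placement has <= max_mismatches. Brute force on purpose:
    real NGS aligners replace this O(len(reference)) scan with an index."""
    best = None
    for pos in range(len(reference) - len(read) + 1):
        mm = sum(1 for a, b in zip(read, reference[pos:pos + len(read)]) if a != b)
        if mm <= max_mismatches and (best is None or mm < best[1]):
            best = (pos, mm)
            if mm == 0:               # a perfect hit cannot be improved
                break
    return best

print(map_read("GATTA", "CCGATTACAGATTC"))   # -> (2, 0)
```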

  6. CAGEN: a Modern, PC-Based Computer Modeling Tool for Explosive MCG Generators and Attached Loads

    NASA Astrophysics Data System (ADS)

    Chase, J. B.; Chato, D.; Peterson, G.; Pincosy, P.; Kiuttu, G. F.

    2004-11-01

    We describe the PC-based computer program CAGEN. CAGEN models the performance of many varieties of magneto-cumulative generators (MCG), or magnetic flux compression generators (FCG), that are energized with high explosive (HE). CAGEN models helically wound or coaxial types, which have HE on the interior. Any materials and any HE types may be used. The cylindrical radius of the windings (or outer conductor) and the radius of the armature may vary with axial position. Variable winding width, thickness, and pitch can be represented, and divided windings are allowed. The MHD equations are used to advance the diffusion of magnetic field into the conductors in order to compute resistance, melting, and contact effects. Magnetic pressure effects are included. The MCG model is treated as part of a lumped circuit, which includes the priming circuit, an opening fuse switch, an inline storage inductance, a transformer or a voltage-dividing fuse, a peaking circuit, and several interesting load models. A typical problem completes in a few seconds to a few minutes. Graphical input, run control, and analysis of results are provided by MathGraf, a CARE'N CO. application.
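
    The lumped-circuit physics of such generators reduces, in the ideal case, to flux conservation: the explosive drive collapses the generator inductance L(t), and the current rises in proportion. A schematic form of the governing relations (losses folded into a single resistance R; this is textbook FCG behavior, not CAGEN's exact equation set):

```latex
% Lumped-circuit equation for a flux-compression generator feeding a load:
\frac{d}{dt}\!\left[\bigl(L(t) + L_{\mathrm{load}}\bigr)\, I(t)\right] + R\, I(t) = 0
% In the lossless limit (R = 0) the flux (L + L_load) I is conserved, so
I_{\mathrm{final}} = I_0\,\frac{L_0 + L_{\mathrm{load}}}{L_{\mathrm{final}} + L_{\mathrm{load}}}
```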

  7. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows fast, error-proof, and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI for the automated creation of Matlab scripts suitable for analyzing such data with Fourier and wavelet transforms as well as user-defined operations.
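
    The Mif-generation step can be pictured as template filling: the GUI collects parameters (material constants, field values, sweep ranges) and emits one configuration file per sweep point. A schematic Python sketch — the Mif fragment is abbreviated and is not a complete OOMMF problem description; parameter names and values are illustrative:

```python
# Schematic parameter-sweep Mif emitter; the Mif body below is abbreviated,
# not a complete OOMMF problem specification.
MIF_TEMPLATE = """# MIF 2.1
# auto-generated: Ms = {ms} A/m, applied field = {field} mT
Specify Oxs_UniformExchange {{ A 13e-12 }}
# ... geometry, evolver, and driver blocks would follow ...
"""

def write_sweep(ms_values, field_values):
    for ms in ms_values:
        for field in field_values:
            name = f"sim_Ms{ms:.0f}_B{field:.0f}.mif"
            with open(name, "w") as f:
                f.write(MIF_TEMPLATE.format(ms=ms, field=field))

write_sweep(ms_values=[8e5], field_values=[0, 50, 100])
```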

  8. Acoustic solitons: A robust tool to investigate the generation and detection of ultrafast acoustic waves

    NASA Astrophysics Data System (ADS)

    Péronne, Emmanuel; Chuecos, Nicolas; Thevenard, Laura; Perrin, Bernard

    2017-02-01

    Solitons are self-preserving traveling waves of great interest in nonlinear physics but hard to observe experimentally. In this report an experimental setup is designed to observe and characterize acoustic solitons in a GaAs(001) substrate. It is based on careful temperature control of the sample and an interferometric detection scheme. Ultrashort acoustic solitons, such as the one predicted by the Korteweg-de Vries equation, are observed and fully characterized. Their particlelike nature is clearly evidenced and their unique properties are thoroughly checked. The spatial averaging of the soliton wave front is shown to account for the differences between the theoretical and experimental soliton profile. It appears that ultrafast acoustic experiments provide a precise measurement of the soliton velocity. It allows for absolute calibration of the setup as well as the response function analysis of the detection layer. Moreover, the temporal distribution of the solitons is also analyzed with the help of the inverse scattering method. It shows how the initial acoustic pulse profile which gives birth to solitons after nonlinear propagation can be retrieved. Such investigations provide a new tool to probe transient properties of highly excited matter through the study of the emitted acoustic pulse after laser excitation.
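
    For reference, the Korteweg-de Vries soliton invoked above has a closed form: a sech-squared pulse whose speed fixes both its amplitude and its width, which is the particlelike signature such experiments check. In one standard normalization:

```latex
% Korteweg--de Vries equation and its one-soliton solution
u_t + 6\,u\,u_x + u_{xxx} = 0,
\qquad
u(x,t) = \frac{c}{2}\,\operatorname{sech}^2\!\left[\frac{\sqrt{c}}{2}\,\bigl(x - c\,t - x_0\bigr)\right]
```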

  9. DynamiX, numerical tool for design of next-generation x-ray telescopes.

    PubMed

    Chauvin, Maxime; Roques, Jean-Pierre

    2010-07-20

    We present a new code aimed at the simulation of grazing-incidence x-ray telescopes subject to deformations and demonstrate its ability with two test cases: the Simbol-X and the International X-ray Observatory (IXO) missions. The code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, accounting for the x-ray interactions and for the telescope motion and deformation. The simulation produces images and spectra for any telescope configuration using Wolter I mirrors and semiconductor detectors. This numerical tool allows us to study the telescope performance in terms of angular resolution, effective area, and detector efficiency, accounting for the telescope behavior. We have implemented an image reconstruction method based on the measurement of the detector drifts by an optical sensor metrology. Using an accurate metrology, this method allows us to recover the loss of angular resolution induced by the telescope instability. In the framework of the Simbol-X mission, this code was used to study the impacts of the parameters on the telescope performance. In this paper we present detailed performance analysis of Simbol-X, taking into account the satellite motions and the image reconstruction. To illustrate the versatility of the code, we present an additional performance analysis with a particular configuration of IXO.

  10. Streptomyces venezuelae TX-TL - a next generation cell-free synthetic biology tool.

    PubMed

    Moore, Simon J; Lai, Hung-En; Needham, Hannah; Polizzi, Karen M; Freemont, Paul S

    2017-01-31

    Streptomyces venezuelae is a promising chassis in synthetic biology for fine chemical and secondary metabolite pathway engineering. The potential of S. venezuelae could be further realized by expanding its capability with the introduction of its own in vitro transcription-translation (TX-TL) system. TX-TL is a fast and expanding technology for bottom-up design of complex gene expression tools, biosensors, and protein manufacturing. Herein, we introduce an S. venezuelae TX-TL platform by reporting a streamlined protocol for cell-extract preparation, demonstrating high-yield synthesis of a codon-optimized sfGFP reporter and the prototyping of a synthetic tetracycline-inducible promoter in S. venezuelae TX-TL based on the tetO-TetR repressor system. The aim of this system is to provide a host for the homologous production of exotic enzymes from Actinobacteria secondary metabolism in vitro. As an example, we demonstrate the soluble synthesis of a selection of enzymes (12-70 kDa) from the Streptomyces rimosus oxytetracycline pathway.

  11. Understanding the mobile internet to develop the next generation of online medical teaching tools

    PubMed Central

    Christiano, Cynthia; Ferris, Maria

    2011-01-01

    Healthcare providers (HCPs) use online medical information for self-directed learning and patient care. Recently, the mobile internet has emerged as a new platform for accessing medical information, as it allows mobile devices to access online information in a manner compatible with their restricted storage. We investigated mobile internet usage parameters to direct the future development of mobile internet teaching websites. Nephrology On-Demand Mobile (NODM) (http://www.nephrologyondemand.org) was made accessible to all mobile devices. From February 1 to December 31, 2010, HCP use of NODM was tracked using code inserted into the root files. Nephrology On-Demand received 15,258 visits, of which approximately 10% were made to NODM, with the majority coming from the USA. Most access to NODM was through the Apple iOS family of devices, and cellular connections were the most frequently used. These findings provide a basis for the future development of mobile nephrology and medical teaching tools. PMID:21659443

  12. The Characteristics and Generating Mechanism of Large Precipitates in Ti-Containing H13 Tool Steel

    NASA Astrophysics Data System (ADS)

    Xie, You; Cheng, Guoguang; Chen, Lie; Zhang, Yandong; Yan, Qingzhong

    2017-02-01

    The characteristics of large precipitates in H13 tool steel with 0.015 wt% Ti were studied. The results show that three types of phases larger than 1 μm exist in the as-cast ingot: the (Ti, V)(C, N)-type phase, the (V, Mo, Cr)C-type phase, and sulfide. The (Ti, V)(C, N)-type phase can be further classified into a homogeneous Ti-rich one and a Ti-V-rich one in which the Ti/V ratio gradually changes. The (V, Mo, Cr)C-type phase comprises a V-rich one and a Mo-Cr-rich one. The compositional characteristics of all of them have little relation to the cutting position or cooling rate. The precipitation process is well described by calculations with Thermo-Calc software. During solidification, the primary (Ti, V)(C, N) phase first starts to precipitate in the form of Ti-rich carbonitride. As solidification proceeds, the Ti fraction decreases and the V fraction increases. The primary Ti-V-rich (Ti, V)(C, N) and V-rich (V, Mo, Cr)C phases then appear successively. The Mo-Cr-rich (V, Mo, Cr)C phase does not precipitate until solidification nears its end. Sulfide precipitates before the (V, Mo, Cr)C-type phase and can act as a nucleus for it.

  13. NASA's Learning Technology Project: Developing Educational Tools for the Next Generation of Explorers

    NASA Astrophysics Data System (ADS)

    Federman, A. N.; Hogan, P. J.

    2003-12-01

    Since 1996, NASA's Learning Technology project has pioneered the use of innovative technology to inspire students to pursue careers in STEM (Science, Technology, Engineering, and Math). In the past this has included Web sites like Quest and the Observatorium, webcasts and distance learning courses, and even interactive television broadcasts. Our current focus is on the development of several mission-oriented software packages, targeted primarily at the middle-school population but flexible enough to be used by elementary to graduate students. These products include contributions to an open-source solar system simulator, a 3D planetary encyclopedia, a planetary surface viewer (atlas), and others. Whenever possible these software products are written to be open source and multi-platform, for the widest use and easiest access for developers. Along with the software products, we are developing activities and lesson plans that are tested and used by educators in the classroom, and the products are reviewed by professional educators. Together these products constitute the NASA Experiential Platform for learning, in which the tools used by the public are similar (and in some respects identical) to those used by professional investigators. Efforts are now underway to incorporate actual MODIS and other real-time data uplink capabilities.

  14. DynamiX, numerical tool for design of next-generation x-ray telescopes

    SciTech Connect

    Chauvin, Maxime; Roques, Jean-Pierre

    2010-07-20

    We present a new code aimed at the simulation of grazing-incidence x-ray telescopes subject to deformations and demonstrate its ability with two test cases: the Simbol-X and the International X-ray Observatory (IXO) missions. The code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, accounting for the x-ray interactions and for the telescope motion and deformation. The simulation produces images and spectra for any telescope configuration using Wolter I mirrors and semiconductor detectors. This numerical tool allows us to study the telescope performance in terms of angular resolution, effective area, and detector efficiency, accounting for the telescope behavior. We have implemented an image reconstruction method based on the measurement of the detector drifts by an optical sensor metrology. Using an accurate metrology, this method allows us to recover the loss of angular resolution induced by the telescope instability. In the framework of the Simbol-X mission, this code was used to study the impacts of the parameters on the telescope performance. In this paper we present detailed performance analysis of Simbol-X, taking into account the satellite motions and the image reconstruction. To illustrate the versatility of the code, we present an additional performance analysis with a particular configuration of IXO.

  15. Next-generation sequencing and metagenomic analysis: a universal diagnostic tool in plant virology.

    PubMed

    Adams, Ian P; Glover, Rachel H; Monger, Wendy A; Mumford, Rick; Jackeviciene, Elena; Navalinskiene, Meletele; Samuitiene, Marija; Boonham, Neil

    2009-07-01

    A novel, unbiased approach to plant viral disease diagnosis has been developed which requires no a priori knowledge of the host or pathogen. Next-generation sequencing coupled with metagenomic analysis was used to produce large quantities of cDNA sequence in a model system of tomato infected with Pepino mosaic virus. The method was then applied to a sample of Gomphrena globosa infected with an unknown pathogen originally isolated from the flowering plant Liatris spicata. This plant was found to contain a new cucumovirus, for which we suggest the name 'Gayfeather mild mottle virus'. In both cases, the full viral genome was sequenced. This method expedites the entire process of novel virus discovery, identification, viral genome sequencing and, subsequently, the development of more routine assays for new viral pathogens.

  16. Fractal analysis of experimentally generated pyroclasts: A tool for volcanic hazard assessment

    NASA Astrophysics Data System (ADS)

    Perugini, Diego; Kueppers, Ulrich

    2012-06-01

    Rapid decompression experiments on natural volcanic rocks mimic explosive eruptions. Fragment size distributions (FSD) of such experimentally generated pyroclasts are investigated using fractal geometry. The fractal dimension of fragmentation, D, of the FSD is measured for samples from Unzen (Japan) and Popocatépetl (Mexico) volcanoes. Results show that: (i) FSD are fractal and can be quantified by measuring D values; (ii) D increases linearly with potential energy for fragmentation (PEF) and, thus, with increasing applied pressure; (iii) the rate of increase of D with PEF depends on open porosity: the higher the open porosity, the lower the increase of D with PEF; (iv) at comparable open porosity, samples display a similar behavior for any rock composition. The method proposed here has the potential to become a standard routine to estimate the eruptive energy of past and recent eruptions using values of D and open porosity, providing an important step towards volcanic hazard assessment.
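
    The fractal dimension of fragmentation is defined through the power-law tail of the cumulative fragment-size distribution, N(>r) ∝ r^(-D), so D is the negative slope of a log-log fit. A minimal sketch with synthetic fragment sizes (illustrative data, not the Unzen or Popocatépetl samples):

```python
import numpy as np

def fractal_dimension(sizes):
    """Estimate D from N(>r) ~ r**-D via a log-log least-squares fit.
    sizes: fragment sizes (e.g., sieve diameters) in any consistent unit."""
    r = np.sort(np.asarray(sizes, dtype=float))
    n_gt = np.arange(len(r), 0, -1)        # N(>= r) for each sorted size
    slope, _ = np.polyfit(np.log(r), np.log(n_gt), 1)
    return -slope

rng = np.random.default_rng(0)
sizes = 1 + rng.pareto(2.5, 5000)          # synthetic FSD with true D = 2.5
print(f"D ~ {fractal_dimension(sizes):.2f}")
```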

  17. Clustering of cochlear oscillations in frequency plateaus as a tool to investigate SOAE generation

    NASA Astrophysics Data System (ADS)

    Epp, Bastian; Wit, Hero; van Dijk, Pim

    2015-12-01

    Spontaneous otoacoustic emissions (SOAE) reflect the net effect of self-sustained activity in the cochlea, but do not directly provide information about the underlying mechanism and place of origin within the cochlea. The present study investigates whether frequency plateaus as found in a linear array of coupled oscillators (OAM) [7] are also found in a transmission line model (TLM) that is able to generate realistic SOAEs [2], and whether these frequency plateaus can be used to explain the formation of SOAEs. The simulations showed a clustering of oscillators along the simulated basilar membrane. Both the OAM and the TLM show traveling-wave-like behavior along the oscillators coupled into one frequency plateau. While in the TLM roughness is required in order to produce SOAEs, no roughness is required to trigger frequency plateaus in the linear array of oscillators. The formation of frequency plateaus as a consequence of coupling between neighboring active oscillators might be the mechanism underlying SOAEs.
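
    A generic form of such an oscillator-array model — each cochlear section a self-sustained oscillator with a graded characteristic frequency, coupled to its neighbors — is sketched below; frequency plateaus appear where coupling locks a run of neighboring oscillators to a common frequency. (A Van der Pol-type nonlinearity is one common choice; the cited models' exact equations may differ.)

```latex
% Array of self-sustained oscillators, characteristic frequencies \omega_j
% graded along the array, with nearest-neighbor coupling strength d:
\ddot{x}_j - \mu\,\bigl(1 - x_j^2\bigr)\,\dot{x}_j + \omega_j^2\, x_j
  = d\,\bigl(x_{j+1} - 2x_j + x_{j-1}\bigr), \qquad j = 1,\dots,N
```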

  18. Quantum Field Theory Tools: a Mechanism of Mass Generation of Gauge Fields

    NASA Astrophysics Data System (ADS)

    Flores-Baez, F. V.; Godina-Nava, J. J.; Ordaz-Hernandez, G.

    We present a simple mechanism for mass generation of gauge fields in Yang-Mills theory, where two gauge SU(N) connections are introduced to incorporate the mass term. Variations of these two sets of gauge fields compensate each other under local gauge transformations with the local gauge transformations of the matter fields, preserving gauge invariance. In this way the mass term of the gauge fields is introduced without violating the local gauge symmetry of the Lagrangian. Because the Lagrangian has strict local gauge symmetry, the model is a renormalizable quantum model. This model, in the appropriate limit, comes from a class of universal Lagrangians which define a new class of massive Yang-Mills theories without Higgs bosons.
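
    Schematically, the invariance works because two fields that both transform as connections differ by a field that transforms covariantly (in the adjoint representation), so a quadratic term in their difference is gauge invariant. In compact notation (a sketch consistent with the abstract, not necessarily the paper's exact Lagrangian):

```latex
% If A_\mu and B_\mu both transform as SU(N) connections,
%   A_\mu \to U A_\mu U^{-1} + \tfrac{i}{g}\, U \partial_\mu U^{-1}   (likewise B_\mu),
% the inhomogeneous pieces cancel in the difference:
%   (A_\mu - B_\mu) \to U\,(A_\mu - B_\mu)\,U^{-1},
% so a mass term built from the difference preserves local gauge symmetry:
\mathcal{L}_m = m^2\,\mathrm{tr}\!\left[(A_\mu - B_\mu)(A^\mu - B^\mu)\right]
```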

  19. STRait Razor: a length-based forensic STR allele-calling tool for use with second generation sequencing data.

    PubMed

    Warshauer, David H; Lin, David; Hari, Kumar; Jain, Ravi; Davis, Carey; Larue, Bobby; King, Jonathan L; Budowle, Bruce

    2013-07-01

    Recent studies have demonstrated the capability of second generation sequencing (SGS) to provide coverage of short tandem repeats (STRs) found within the human genome. However, there are relatively few bioinformatic software packages capable of detecting these markers in the raw sequence data. The extant STR-calling tools are sophisticated, but are not always applicable to the analysis of the STR loci commonly used in forensic analyses. STRait Razor is a newly developed Perl-based software tool that runs on the Linux/Unix operating system and is designed to detect forensically relevant STR alleles in FASTQ sequence data, based on allelic length. It is capable of analyzing STR loci with repeat motifs ranging from simple to complex without the need for extensive allelic sequence data. STRait Razor is designed to interpret both single-end and paired-end data and relies on intelligent parallel processing to reduce analysis time. Users are presented with a number of customization options, including variable mismatch detection parameters, as well as the ability to easily allow for the detection of alleles at new loci. In its current state, the software detects alleles for 44 autosomal and Y-chromosome STR loci. The study described herein demonstrates that STRait Razor is capable of detecting STR alleles in data generated by multiple library preparation methods and two Illumina® sequencing instruments, with 100% concordance. The data also reveal noteworthy concepts related to the effect of different preparation chemistries and sequencing parameters on the bioinformatic detection of STR alleles.
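
    Length-based STR calling reduces to locating a locus's two constant flanking sequences in each read and counting how many motif lengths fit between them. A toy exact-match sketch (flanks and motif are hypothetical; STRait Razor's real matching, mismatch tolerance, and locus table are more involved):

```python
def call_allele(read, flank5, flank3, motif):
    """Return the repeat count between the two flanks, or None if the locus
    structure is not found in this read. Toy exact-match version."""
    i = read.find(flank5)
    if i < 0:
        return None
    j = read.find(flank3, i + len(flank5))
    if j < 0:
        return None
    region = read[i + len(flank5):j]
    n, rem = divmod(len(region), len(motif))
    if rem != 0 or region != motif * n:
        return None          # interrupted/complex repeat: needs real logic
    return n

read = "ACGTTC" + "TAGA" * 9 + "GGATC"   # synthetic read carrying 9 repeats
print(call_allele(read, "ACGTTC", "GGATC", "TAGA"))   # -> 9
```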

  20. THREDDS Second Generation (THematic Real-time Environmental Distributed Data Services): Engaging the GIS Community and Tools

    NASA Astrophysics Data System (ADS)

    Domenico, B.; Caron, J.; Davis, E.; Edelson, D.; Kambic, R.; Pandya, R.; Nativi, S.

    2003-12-01

    The central mission of the THREDDS (THematic Real-time Environmental Distributed Data Services) project is to make it possible for educators and researchers to publish, locate, analyze, and visualize data in a wide variety of educational settings. In the initial phase, THREDDS established a solid, working prototype of services and tools that enable data providers to create inventory catalogs of the data holdings at their site and educational module builders to author compound documents with embedded pointers to environmental datasets and analysis tools. These catalogs and data-interactive documents can then be harvested into digital libraries using standard protocols. THREDDS Second Generation (THREDDS2G) will further enhance collaborations among data providers, tool builders, researchers, and educators. It will do so by expanding the team of contributors and the breadth of data in the collections, taking advantage of recent technological advancements, and integrating THREDDS technologies with emerging standards and related environmental data systems. Since much of this expansion will involve Geographic Information Systems (GIS), THREDDS will actively engage the GIS community with the disciplines and tools that make the end products more useful at all educational levels, for decision makers, and for the general public.

  1. An automated graphics tool for comparative genomics: the Coulson plot generator

    PubMed Central

    2013-01-01

    Background Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. Results We devised the Coulson Plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is delineated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, with a tab- or comma-delimited text file as input, generating an editable PDF or SVG file. Conclusions CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format, making comparisons and identification of patterns significantly clearer. While the Coulson plot format is highly useful in

  2. An automated graphics tool for comparative genomics: the Coulson plot generator.

    PubMed

    Field, Helen I; Coulson, Richard M R; Field, Mark C

    2013-04-27

    Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. We devised the Coulson Plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is delineated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, with a tab- or comma-delimited text file as input, generating an editable PDF or SVG file. CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format, making comparisons and identification of patterns significantly clearer. While the Coulson plot format is highly useful in comparative genomics, its
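
    The plot format itself — a taxa-by-complexes matrix of pie charts, one sector per subunit, filled or left empty by presence or absence — is straightforward to sketch. CPG itself is a Java application; the Python/matplotlib sketch below (with made-up presence data) only illustrates the format:

```python
import matplotlib.pyplot as plt

# presence[taxon][complex] = one 0/1 entry per subunit
presence = {
    "Taxon A": {"Complex I": [1, 1, 1, 0], "Complex II": [1, 1]},
    "Taxon B": {"Complex I": [1, 0, 0, 0], "Complex II": [1, 1]},
}
taxa = list(presence)
complexes = list(next(iter(presence.values())))

fig, axes = plt.subplots(len(taxa), len(complexes), figsize=(4, 4))
for r, taxon in enumerate(taxa):
    for c, cplx in enumerate(complexes):
        ax = axes[r][c]
        subunits = presence[taxon][cplx]
        # equal sectors; present subunits colored, absent ones left white
        colors = ["tab:blue" if s else "white" for s in subunits]
        ax.pie([1] * len(subunits), colors=colors,
               wedgeprops={"edgecolor": "black"})
        if r == 0:
            ax.set_title(cplx, fontsize=8)
        if c == 0:
            ax.set_ylabel(taxon, fontsize=8)
fig.savefig("coulson_plot.png", dpi=150)
```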

  3. Deciding to Seek Emergency Care for Acute Myocardial Infarction.

    PubMed

    Noureddine, Samar; Dumit, Nuhad Y; Saab, Mohammad

    2015-10-01

    The purpose of this qualitative descriptive study was to explore how patients who experience acute myocardial infarction (AMI) decide to seek emergency care. Fifty patients with AMI were interviewed at two hospitals in Lebanon. The perspectives of 22 witnesses of the attack were also sought regarding the cardiac event. The themes that transpired from the data were as follows: making sense of the symptoms, waiting to see what happens, deciding to come to the hospital, and the family influenced the decision to seek care. The witnesses of the cardiac event, mostly family members, supported the decision to seek emergency care. Deciding to seek emergency care for AMI is complex. Nurses must solicit their patients' perception of the cardiac event to provide them with tailored education and counseling about heart attack symptoms and how to respond to them in case they recur. Family members must be included in the education process.

  4. Development and implementation of an electronic health record generated surgical handoff and rounding tool.

    PubMed

    Raval, Mehul V; Rust, Laura; Thakkar, Rajan K; Kurtovic, Kelli J; Nwomeh, Benedict C; Besner, Gail E; Kenney, Brian D

    2015-02-01

    Electronic health records (EHR) have been adopted across the nation at tremendous effort and expense. The purpose of this study was to assess improvements in accuracy, efficiency, and patient safety for a high-volume pediatric surgical service with adoption of an EHR-generated handoff and rounding list. The quality and quantity of errors were compared pre- and post-EHR-based list implementation. A survey was used to determine time spent by team members using the two versions of the list. Perceived utility, safety, and quality of the list were reported. Serious safety events determined by the hospital were also compared for the two periods. The EHR-based list eliminated clerical errors while improving efficiency by automatically providing data such as vital signs. Survey respondents reported 43 min saved per week per team member, translating to 372 work hours of time saved annually for a single service. EHR-based list users reported higher satisfaction and perceived improvement in efficiency, accuracy, and safety. Serious safety events remained unchanged. In conclusion, creation of an EHR-based list to assist with daily handoffs, rounding, and patient management demonstrated improved accuracy, increased efficiency, and assisted in maintaining a high level of safety.

  5. Human Microtumors Generated in 3D: Novel Tools for Integrated In Situ Studies of Cancer Immunotherapies.

    PubMed

    Hambach, Lothar; Buser, Andreas; Vermeij, Marcel; Pouw, Nadine; van der Kwast, Theo; Goulmy, Els

    2016-01-01

    Cellular immunotherapy targeting human tumor antigens is a promising strategy to treat solid tumors. Yet clinical results of cellular immunotherapy are disappointing. Moreover, the currently available in vitro human tumor models are not designed to study the optimization of T-cell therapies of solid tumors. Here, we describe a novel assay for multiparametric in situ analysis of therapeutic effects on individual human three-dimensional (3D) tumors. In this assay, tumors of several millimeter diameter are generated from human cancer cell lines of different tumor entities in a collagen type I microenvironment. A newly developed approach for efficient morphological analysis reveals that these in vitro tumors resemble many characteristics of the corresponding clinical cancers such as histological features, immunohistochemical staining patterns, distinct tumor growth compartments and heterogeneous protein expression. To assess the response to therapy with tumor antigen specific T-cells, standardized protocols are described to determine T-cell infiltration and tumor destruction by monitoring soluble factors and tumor growth. Human tumors engineered in 3D collagen scaffolds are excellent in vitro surrogates for avascular tumor stages allowing integrated analyses of the antitumor efficacy of cancer specific immunotherapy in situ.

  6. Configurational Entropy in Ice Nanosystems: Tools for Structure Generation and Screening.

    PubMed

    Parkkinen, P; Riikonen, S; Halonen, L

    2014-03-11

    Recently, a number of experimental and theoretical studies of low-temperature ice and water in nanoscale systems have emerged. Any theoretical study trying to model such systems will encounter the proton-disorder problem, i.e., there exist many configurations differing by water-molecule rotations for a fixed oxygen atom structure. An extensive search within the allowed proton-disorder space should always be performed to ensure a reasonable low-energy isomer and to address the effect of proton-configurational entropy that may affect experimental observables. In the present work, an efficient general-purpose program for finite, semiperiodic, and periodic systems of hydrogen-bonded molecules is presented, which can be used in searching and enumerating the proton-configurational ensemble. Benchmarking tests are performed for ice nanotubes and finite slabs. Finally, the program is applied to experimentally appropriate ice nanosystems. An ice nanodot supported on a boron nitride film is studied in detail. Using a systematic generation of its proton-configurational ensemble, we find an isomer that is ∼1 eV lower in total energy than one previously studied. This isomer features a considerable dipole moment and implies that ice nanodots are inherently ferroelectric parallel to the surface. We conclude by demonstrating how the so-called hydrogen-bond connectivity parameters can be used to screen low-energy isomers.
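
    At bottom, the proton-disorder search is enumeration under the Bernal-Fowler ice rules: each hydrogen bond carries exactly one proton, and each (4-coordinated) oxygen donates exactly two. A brute-force sketch for a tiny hydrogen-bond graph — real codes prune the exponential space with symmetry and the connectivity parameters mentioned above:

```python
from itertools import product

def ice_configurations(bonds, n_oxygens):
    """Enumerate bond orientations obeying the ice rules on a small graph.
    bonds: (i, j) oxygen pairs; orientation 0 means O_i donates, 1 means O_j.
    Keeps states where every oxygen donates exactly two protons (for
    4-coordinated oxygens this also implies accepting exactly two)."""
    valid = []
    for orient in product((0, 1), repeat=len(bonds)):
        donated = [0] * n_oxygens
        for (i, j), o in zip(bonds, orient):
            donated[i if o == 0 else j] += 1
        if all(d == 2 for d in donated):
            valid.append(orient)
    return valid

# Toy test: 5 oxygens, each bonded to all others (every vertex 4-coordinated)
bonds = [(i, j) for i in range(5) for j in range(i + 1, 5)]
states = ice_configurations(bonds, 5)
print(len(states), "ice-rule configurations out of", 2 ** len(bonds))
```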

  7. Second harmonic generation imaging as a potential tool for staging pregnancy and predicting preterm birth

    NASA Astrophysics Data System (ADS)

    Akins, Meredith L.; Luby-Phelps, Katherine; Mahendroo, Mala

    2010-03-01

    We use second harmonic generation (SHG) microscopy to assess changes in collagen structure of the murine cervix during cervical remodeling of normal pregnancy and in a preterm birth model. Visual inspection of SHG images revealed substantial changes in collagen morphology throughout normal gestation. SHG images collected in both the forward and backward directions were analyzed quantitatively for changes in overall mean intensity, forward-to-backward intensity ratio, collagen fiber size, and porosity. Changes in mean SHG intensity and intensity ratio take place in early pregnancy, suggesting that submicroscopic changes in collagen fibril size and arrangement occur before macroscopic changes become evident. Fiber size progressively increased from early to late pregnancy, while pores between collagen fibers became larger and farther apart. Analysis of collagen features in premature cervical remodeling shows that changes in collagen structure are dissimilar from normal remodeling. The ability to quantify multiple morphological features of collagen that characterize normal cervical remodeling and distinguish abnormal remodeling in preterm birth models supports future studies aimed at the development of SHG endoscopic devices for clinical assessment of collagen changes during pregnancy in women and for predicting the risk of preterm labor, which occurs in 12.5% of all pregnancies.

  8. A New Generation of Predictive Analytics Tools for Spacecraft and Petascale Simulation Data

    NASA Astrophysics Data System (ADS)

    Sipes, T.; Karimabadi, H.; Gosling, J. T.; Imber, S. M.; Slavin, J. A.; Roberts, D.

    2012-12-01

    Extraction of knowledge from spacecraft measurements and petascale simulation data presents a major hurdle to scientific progress in space physics today. These datasets are not only immense, but can also be highly complex. Our recent 3D kinetic simulation of reconnection, for instance, included over 3 trillion particles and generated well over 200 TB of data. Analysis by hand is time consuming and is of limited utility in uncovering complex features and their relationships. We propose a new approach to solving this problem based on the development of specialized data mining algorithms in combination with an innovative feature-extraction technique. Accordingly, we have adapted a multivariate time series analysis data mining technique to handle streaming spacecraft and simulation data. The technique extracts global features and metafeatures in the data in order to capture the necessary time-lapse information. The features are then used to create a static, intermediate dataset that is suitable for analysis using standard supervised data mining techniques. Here we present our latest results on the use of these algorithms in the analysis of (i) 3D kinetic simulations of reconnection, (ii) modeling and identification of flux ropes in the magnetotail, and (iii) streaming data analysis of reconnection events in solar wind data.
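
    The feature-extraction step described here — collapsing each multivariate time series into a fixed-length vector of global features so that standard supervised learners apply — can be sketched in a few lines. The particular features below are generic choices, not the authors' exact set:

```python
import numpy as np

def global_features(series):
    """Collapse a (timesteps, channels) series into one static feature vector.
    Generic global features per channel: mean, std, min, max, linear trend."""
    series = np.asarray(series, dtype=float)
    t = np.arange(series.shape[0])
    feats = []
    for ch in series.T:
        slope = np.polyfit(t, ch, 1)[0]    # linear trend of this channel
        feats.extend([ch.mean(), ch.std(), ch.min(), ch.max(), slope])
    return np.array(feats)

# One row per event window -> a static dataset ready for any classifier
windows = [np.random.randn(200, 3) for _ in range(10)]   # e.g., B-field x, y, z
X = np.vstack([global_features(w) for w in windows])
print(X.shape)   # (10, 15)
```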

  9. Generation of comprehensive thoracic oncology database--tool for translational research.

    PubMed

    Surati, Mosmi; Robinson, Matthew; Nandi, Suvobroto; Faoro, Leonardo; Demchuk, Carley; Kanteti, Rajani; Ferguson, Benjamin; Gangadhar, Tara; Hensing, Thomas; Hasina, Rifat; Husain, Aliya; Ferguson, Mark; Karrison, Theodore; Salgia, Ravi

    2011-01-22

    The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data to be available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined, and their descriptions were written within a standard operating manual to ensure consistency of data annotation. Using one protocol for prospective tissue banking and another for retrospective banking, tumor and normal tissue samples were collected from patients who consented to these protocols. Clinical information such as demographics, cancer characterization, and treatment plans for these patients was abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and have been linked to clinical information for patients described within the database. The data from each table were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
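
    The linking described — joining clinical annotation to assay values on a shared patient key, then exporting rows for statistics — is an ordinary relational join. A minimal sqlite3 sketch with hypothetical table and column names (the project itself used Microsoft Access):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE clinical (patient_id INTEGER PRIMARY KEY,
                           stage TEXT, treatment TEXT);
    CREATE TABLE proteomics (patient_id INTEGER, marker TEXT, level REAL);
    INSERT INTO clinical VALUES (1, 'IIIa', 'chemo'), (2, 'Ib', 'surgery');
    INSERT INTO proteomics VALUES (1, 'MET', 2.4), (2, 'MET', 0.9);
""")

# Join clinical and laboratory tables on the shared key, then export
# the result for statistical analysis.
rows = con.execute("""
    SELECT c.patient_id, c.stage, c.treatment, p.marker, p.level
    FROM clinical AS c JOIN proteomics AS p USING (patient_id)
""").fetchall()
print(rows)
```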

  10. Second harmonic generation imaging as a potential tool for staging pregnancy and predicting preterm birth

    PubMed Central

    Akins, Meredith L.; Luby-Phelps, Katherine; Mahendroo, Mala

    2010-01-01

    We use second harmonic generation (SHG) microscopy to assess changes in collagen structure of the murine cervix during cervical remodeling of normal pregnancy and in a preterm birth model. Visual inspection of SHG images revealed substantial changes in collagen morphology throughout normal gestation. SHG images collected in both the forward and backward directions were analyzed quantitatively for changes in overall mean intensity, forward-to-backward intensity ratio, collagen fiber size, and porosity. Changes in mean SHG intensity and intensity ratio take place in early pregnancy, suggesting that submicroscopic changes in collagen fibril size and arrangement occur before macroscopic changes become evident. Fiber size progressively increased from early to late pregnancy, while pores between collagen fibers became larger and farther apart. Analysis of collagen features in premature cervical remodeling shows that changes in collagen structure are dissimilar from normal remodeling. The ability to quantify multiple morphological features of collagen that characterize normal cervical remodeling and distinguish abnormal remodeling in preterm birth models supports future studies aimed at development of SHG endoscopic devices for clinical assessment of collagen changes during pregnancy in women and for predicting risk of preterm labor, which occurs in 12.5% of all pregnancies. PMID:20459265
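
    The metrics quantified in this study (mean intensity, forward-to-backward ratio, porosity) reduce to simple image statistics once fiber pixels are segmented. A toy sketch on synthetic images; the threshold rule separating fiber from pore pixels is an assumption, and the published analysis uses more careful segmentation and fiber-size estimation:

      import numpy as np

      def shg_metrics(forward, backward, fiber_thresh=None):
          # Toy metrics for paired forward/backward SHG images (2-D arrays).
          if fiber_thresh is None:
              fiber_thresh = forward.mean() + forward.std()   # assumed rule
          mean_intensity = forward.mean()
          fb_ratio = forward.sum() / max(backward.sum(), 1e-12)
          fiber_mask = forward > fiber_thresh
          porosity = 1.0 - fiber_mask.mean()   # fraction of non-fiber pixels
          return mean_intensity, fb_ratio, porosity

      rng = np.random.default_rng(1)
      fwd = rng.gamma(2.0, 50.0, (512, 512))   # synthetic forward image
      bwd = rng.gamma(2.0, 25.0, (512, 512))   # synthetic backward image
      print(shg_metrics(fwd, bwd))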

  11. Pharmacy career deciding: making choice a "good fit".

    PubMed

    Willis, Sarah Caroline; Shann, Phillip; Hassell, Karen

    2009-01-01

    The purpose of this article is to explore factors influencing career deciding amongst pharmacy students and graduates in the UK. Group interviews were used to devise a topic guide for five subsequent focus groups with pharmacy students and graduates. Focus groups were tape-recorded, recordings transcribed, and transcripts analysed. Key themes and interlinking factors relating to pharmacy career deciding were identified in the transcripts, following a constructivist approach. Participants described making a "good fit" between themselves (their experiences, social networks, etc.) and pharmacy. Central to a coherent career deciding narrative were having a job on graduation and the instrumental advantage of studying a vocational course. Focusing on the career deciding of UK pharmacy students and graduates may limit the study's generalisability to other countries. However, our findings are relevant to those interested in understanding students' motivations for healthcare careers, since our results suggest that making a "good fit" describes a general process of matching between a healthcare career and personal experience. As we found that pharmacy career deciding was not usually a planned activity, career advisors and those involved in higher education recruitment should take into account the roles played by personal preferences and values in choosing a degree course. A qualitative study like this can illustrate how career deciding occurs and provide insight into the process from a student's perspective. This can help inform guidance processes and selection to healthcare professions courses within the higher education sector, and stimulate debate amongst those involved with recruitment of healthcare workers about desirable motivators for healthcare careers.

  12. Correction: Introducing dip pen nanolithography as a tool for controlling stem cell behaviour: unlocking the potential of the next generation of smart materials in regenerative medicine.

    PubMed

    Curran, Judith M; Stokes, Robert; Irvine, Eleanore; Graham, Duncan; Amro, N A; Sanedrin, R G; Jamil, H; Hunt, John A

    2017-06-13

    Correction for 'Introducing dip pen nanolithography as a tool for controlling stem cell behaviour: unlocking the potential of the next generation of smart materials in regenerative medicine' by Judith M. Curran et al., Lab Chip, 2010, 10, 1662-1670.

  13. repgenHMM: a dynamic programming tool to infer the rules of immune receptor generation from sequence data.

    PubMed

    Elhanati, Yuval; Marcou, Quentin; Mora, Thierry; Walczak, Aleksandra M

    2016-07-01

    The diversity of the immune repertoire is initially generated by random rearrangements of the receptor gene during early T and B cell development. Rearrangement scenarios are composed of random events (choices of gene templates, base pair deletions and insertions) described by probability distributions. Not all scenarios are equally likely, and the same receptor sequence may be obtained in several different ways. Quantifying the distribution of these rearrangements is an essential baseline for studying the immune system diversity. Inferring the properties of the distributions from receptor sequences is a computationally hard problem, requiring enumerating every possible scenario for every sampled receptor sequence. We present a Hidden Markov model, which accounts for all plausible scenarios that can generate the receptor sequences. We developed and implemented a method based on the Baum-Welch algorithm that can efficiently infer the parameters for the different events of the rearrangement process. We tested our software tool on sequence data for both the alpha and beta chains of the T cell receptor. To test the validity of our algorithm, we also generated synthetic sequences produced by a known model, and confirmed that its parameters could be accurately inferred back from the sequences. The inferred model can be used to generate synthetic sequences, to calculate the probability of generation of any receptor sequence, as well as the theoretical diversity of the repertoire. We estimate this diversity to be ≈10^23 for human T cells. The model gives a baseline to investigate the selection and dynamics of immune repertoires. Source code and sample sequence files are available at https://bitbucket.org/yuvalel/repgenhmm/downloads. Contact: elhanati@lpt.ens.fr or tmora@lps.ens.fr or awalczak@lpt.ens.fr. © The Author 2016. Published by Oxford University Press.
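
    The inference step named here is Baum-Welch (expectation-maximization) fitting of a hidden Markov model. A self-contained toy version for a discrete two-state HMM, mirroring the paper's sanity check of re-inferring a known model from synthetic sequences; the model size and all numbers are illustrative, not the receptor rearrangement model itself:

      import numpy as np

      def forward_backward(obs, A, B, pi):
          T, N = len(obs), len(pi)
          alpha, beta = np.zeros((T, N)), np.zeros((T, N))
          alpha[0] = pi * B[:, obs[0]]
          for t in range(1, T):
              alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
          beta[-1] = 1.0
          for t in range(T - 2, -1, -1):
              beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
          return alpha, beta

      def baum_welch(seqs, n_states, n_symbols, iters=100, seed=0):
          rng = np.random.default_rng(seed)
          A = rng.dirichlet(np.ones(n_states), n_states)    # transitions
          B = rng.dirichlet(np.ones(n_symbols), n_states)   # emissions
          pi = np.full(n_states, 1.0 / n_states)
          for _ in range(iters):
              A_num = np.zeros_like(A)
              B_num = np.zeros_like(B)
              pi_num = np.zeros(n_states)
              for obs in seqs:
                  alpha, beta = forward_backward(obs, A, B, pi)
                  like = alpha[-1].sum()
                  gamma = alpha * beta / like                # state posteriors
                  xi = (alpha[:-1, :, None] * A[None] *
                        (B[:, obs[1:]].T * beta[1:])[:, None, :]) / like
                  A_num += xi.sum(axis=0)
                  for k in range(n_symbols):
                      B_num[:, k] += gamma[obs == k].sum(axis=0)
                  pi_num += gamma[0]
              A = A_num / A_num.sum(axis=1, keepdims=True)
              B = B_num / B_num.sum(axis=1, keepdims=True)
              pi = pi_num / pi_num.sum()
          return A, B, pi

      def simulate(T, A, B, pi, rng):
          s, out = rng.choice(len(pi), p=pi), []
          for _ in range(T):
              out.append(rng.choice(B.shape[1], p=B[s]))
              s = rng.choice(len(A), p=A[s])
          return np.array(out)

      true_A = np.array([[0.9, 0.1], [0.2, 0.8]])
      true_B = np.array([[0.8, 0.2], [0.3, 0.7]])
      rng = np.random.default_rng(1)
      seqs = [simulate(200, true_A, true_B, np.array([0.5, 0.5]), rng)
              for _ in range(50)]
      print(baum_welch(seqs, 2, 2)[0])   # approaches true_A up to relabeling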

  14. repgenHMM: a dynamic programming tool to infer the rules of immune receptor generation from sequence data

    PubMed Central

    Elhanati, Yuval; Marcou, Quentin; Mora, Thierry; Walczak, Aleksandra M.

    2016-01-01

    Motivation: The diversity of the immune repertoire is initially generated by random rearrangements of the receptor gene during early T and B cell development. Rearrangement scenarios are composed of random events—choices of gene templates, base pair deletions and insertions—described by probability distributions. Not all scenarios are equally likely, and the same receptor sequence may be obtained in several different ways. Quantifying the distribution of these rearrangements is an essential baseline for studying the immune system diversity. Inferring the properties of the distributions from receptor sequences is a computationally hard problem, requiring enumerating every possible scenario for every sampled receptor sequence. Results: We present a Hidden Markov model, which accounts for all plausible scenarios that can generate the receptor sequences. We developed and implemented a method based on the Baum–Welch algorithm that can efficiently infer the parameters for the different events of the rearrangement process. We tested our software tool on sequence data for both the alpha and beta chains of the T cell receptor. To test the validity of our algorithm, we also generated synthetic sequences produced by a known model, and confirmed that its parameters could be accurately inferred back from the sequences. The inferred model can be used to generate synthetic sequences, to calculate the probability of generation of any receptor sequence, as well as the theoretical diversity of the repertoire. We estimate this diversity to be ≈10^23 for human T cells. The model gives a baseline to investigate the selection and dynamics of immune repertoires. Availability and implementation: Source code and sample sequence files are available at https://bitbucket.org/yuvalel/repgenhmm/downloads. Contact: elhanati@lpt.ens.fr or tmora@lps.ens.fr or awalczak@lpt.ens.fr PMID:27153709

  15. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next-generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Arora, N. S., Russell, S., and Sudderth, E., Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.
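
    NetVISA's association step weighs competing explanations of detections under a generative model. A deliberately tiny sketch of that idea: compare the log-likelihood that a set of travel-time residuals comes from one common event against the hypothesis that every detection is background noise. The Gaussian residual width and uniform noise window are placeholder assumptions, far simpler than NetVISA's full model of propagation, station capability, and noise statistics:

      import numpy as np
      from scipy.stats import norm

      def association_log_odds(residuals_s, sigma_s=1.5, window_s=3600.0):
          # Event hypothesis: residuals (observed minus predicted arrival
          # time, in seconds) are Gaussian around zero. Noise hypothesis:
          # arrival times are uniform over the analysis window.
          log_event = norm.logpdf(residuals_s, scale=sigma_s).sum()
          log_noise = len(residuals_s) * np.log(1.0 / window_s)
          return log_event - log_noise

      tight = np.array([0.4, -1.1, 0.9, 0.2])   # mutually consistent picks
      loose = np.array([250.0, -900.0, 40.0])   # likely unrelated picks
      print(association_log_odds(tight), association_log_odds(loose))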

  16. Patient Generated Subjective Global Assessment as a prognosis tool in women with gynecologic cancer.

    PubMed

    Rodrigues, Camila Santos; Lacerda, Marina Seraphim; Chaves, Gabriela Villaça

    2015-01-01

    The aim of this study was to assess the nutritional status (NS) of women hospitalized for gynecologic tumors and relate it to such outcomes as hospital length of stay and 1-y mortality. We assessed 146 women diagnosed with gynecologic tumors who were admitted to a referral oncologic hospital in November 2012. Data collected included medical history, duration of and reason for admission, and cases of death within 1 y. NS was assessed using the Patient-Generated Subjective Global Assessment (PG-SGA). The receiver operating characteristic curve was used to define the best cutoff point for discriminating individuals who did or did not die. We used proportional hazards regression to assess associations between malnutrition and 1-y mortality. According to the PG-SGA, 62.4% of the women were classified as being at nutritional risk or having moderate or severe malnutrition. When patients were sorted by stage of cancer, there was no statistical difference in NS classification across the different cancer sites. The median hospital stay, in days, was statistically lower in patients classified as well nourished. Individuals with a score above the cutoff point of 10 were 30.7 times more likely (95% confidence interval, 11.8-79.4) to die. There was a 52.1% rate of mortality within 1 y. Patients classed as having some degree of malnutrition had a significantly lower median survival rate. A diagnosis of cervical cancer and severe malnourishment increases the likelihood of death. Our findings suggest that the PG-SGA can be considered not just an indicator of nutritional risk, but also a major predictor of prognosis and mortality in this population. Copyright © 2015 Elsevier Inc. All rights reserved.
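
    The cutoff of 10 was derived from a receiver operating characteristic curve. One common criterion for choosing such a cutoff is Youden's J (sensitivity + specificity - 1); whether the authors used exactly this rule is not stated, and the scores below are synthetic stand-ins for PG-SGA values:

      import numpy as np
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(2)
      scores = np.concatenate([rng.normal(6, 3, 70),    # survivors
                               rng.normal(14, 4, 76)])  # deceased
      died = np.concatenate([np.zeros(70), np.ones(76)]).astype(int)

      fpr, tpr, thresholds = roc_curve(died, scores)
      youden = tpr - fpr                       # Youden's J at each threshold
      best = thresholds[np.argmax(youden)]     # cutoff maximizing J
      print(f"best cutoff ~ {best:.1f}")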

  17. Astronomy as a Tool for Training the Next Generation Technical Workforce

    NASA Astrophysics Data System (ADS)

    Romero, V.; Walsh, G.; Ryan, W.; Ryan, E.

    A major challenge for today's institutes of higher learning is training the next generation of scientists, engineers, and optical specialists to be proficient in the latest technologies they will encounter when they enter the workforce. Although research facilities can offer excellent hands-on instructional opportunities, integrating such experiential learning into academic coursework without disrupting normal operations at such facilities can be difficult. Also, motivating entry-level students to increase their skill levels by undertaking and successfully completing difficult coursework can require more creative instructional approaches, including fostering a fun, non-threatening environment for enhancing basic abilities. Astronomy is a universally appealing subject area, and can be very effective as a foundation for cultivating advanced competencies. We report on a project underway at the New Mexico Institute of Mining and Technology (NM Tech), a science and engineering school in Socorro, NM, to incorporate a state-of-the-art optical telescope and laboratory experiments into an entry-level course in basic engineering. Students enrolled in an explosive engineering course were given a topical problem in Planetary Astronomy: they were asked to develop a method to energetically mitigate a potentially hazardous impact between our planet and a Near-Earth asteroid that might occur sometime in the future. They were first exposed to basic engineering training in the areas of fracture and material response to failure under different environmental conditions through lectures and traditional laboratory exercises. The students were then given access to NM Tech's Magdalena Ridge Observatory (MRO) 2.4-meter telescope to collect physical characterization data (specifically shape information) on two potentially hazardous asteroids (one roughly spherical, the other an elongated ellipsoid). Finally, the students used NM Tech's Energetic Materials Research and Testing Center (EMRTC) to

  18. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Section 215.8, Parks, Forests, and Public Property; FOREST SERVICE, DEPARTMENT OF AGRICULTURE; NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES. § 215.8 Appeal Deciding Officer (table excerpt; Responsible Official / Appeal Deciding Officer): Chief / Secretary of Agriculture; Regional Forester or Station Director / Chief of the Forest Service; ...

  19. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Section 215.8, Parks, Forests, and Public Property; FOREST SERVICE, DEPARTMENT OF AGRICULTURE; NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES. § 215.8 Appeal Deciding Officer (table excerpt; Responsible Official / Appeal Deciding Officer): Chief / Secretary of Agriculture; Regional Forester or Station Director / Chief of the Forest Service; ...

  20. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Section 215.8, Parks, Forests, and Public Property; FOREST SERVICE, DEPARTMENT OF AGRICULTURE; NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES. § 215.8 Appeal Deciding Officer (table excerpt; Responsible Official / Appeal Deciding Officer): Chief / Secretary of Agriculture; Regional Forester or Station Director / Chief of the Forest Service; ...

  1. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Section 215.8, Parks, Forests, and Public Property; FOREST SERVICE, DEPARTMENT OF AGRICULTURE; NOTICE, COMMENT, AND APPEAL PROCEDURES FOR NATIONAL FOREST SYSTEM PROJECTS AND ACTIVITIES. § 215.8 Appeal Deciding Officer (table excerpt; Responsible Official / Appeal Deciding Officer): Chief / Secretary of Agriculture; Regional Forester or Station Director / Chief of the Forest Service; ...

  2. Validity of Measured Interest for Decided and Undecided Students.

    ERIC Educational Resources Information Center

    Bartling, Herbert C.; Hood, Albert B.

    The usefulness of vocational interest measures has been questioned by those who have studied the predictive validity of expressed choice. The predictive validities of measured interest for decided and undecided students, expressed choice and measured interest, and expressed choice and measured interest when they are congruent and incongruent were…

  3. Who Should Decide How Machines Make Morally Laden Decisions?

    PubMed

    Martin, Dominic

    2017-08-01

    Who should decide how a machine will decide what to do when it is driving a car, performing a medical procedure, or, more generally, when it is facing any kind of morally laden decision? More and more, machines are making complex decisions with a considerable level of autonomy. We should be much more preoccupied by this problem than we currently are. After a series of preliminary remarks, this paper will go over four possible answers to the question raised above. First, we may claim that it is the maker of a machine that gets to decide how it will behave in morally laden scenarios. Second, we may claim that the users of a machine should decide. Third, that decision may have to be made collectively or, fourth, by other machines built for this special purpose. The paper argues that each of these approaches suffers from its own shortcomings, and it concludes by showing, among other things, which approaches should be emphasized for different types of machines, situations, and/or morally laden decisions.

  4. Deciding when It's Time to Buy a New PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    How to best decide when it's time to replace your PC, whether at home or at work, is always tricky. Spending on computers can make you more productive, but it's money you otherwise cannot spend, invest or save, and faster systems always await you in the future. What is clear is that the computer industry really wants you to buy, and the computer…

  5. Project DECIDE. Business Enterprise Approach to Career Exploration. Implementation Handbook.

    ERIC Educational Resources Information Center

    Post, John O., Jr.; And Others

    The purpose of this document is to describe project DECIDE, a business enterprise career exploration program, in the form of an implementation handbook. Chapter 1 presents the major characteristics of the model, which focuses on providing special needs students and regular junior high students the opportunity to improve their personal, social, and…

  6. Deciding in Democracies: A Role for Thinking Skills?

    ERIC Educational Resources Information Center

    Gardner, Peter

    2014-01-01

    In societies that respect our right to decide many things for ourselves, exercising that right can be a source of anxiety. We want to make the right decisions, which is difficult when we are confronted with complex issues that are usually the preserve of specialists. But is help at hand? Are thinking skills the very things that non-specialists…

  7. Affirmative Action in Higher Education: Why Grutter Was Correctly Decided.

    ERIC Educational Resources Information Center

    Sunstein, Cass R.

    2003-01-01

    Asserts that conservatives in the U.S. Supreme Court failed to see that affirmative action in higher education is an important and constitutionally protected institutional liberty, suggesting that Grutter v. Bollinger was correctly decided, and Gratz v. Bollinger was a mistake (but not a disaster). Suggests that such difficult issues should not be…

  8. Consumer Economics, Book I [and] Book II. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Consumer Economics, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  9. Deciding What to Research: An Overview of a Participatory Workshop

    ERIC Educational Resources Information Center

    Northway, Ruth; Hurley, Karen; O'Connor, Chris; Thomas, Helen; Howarth, Joyce; Langley, Emma; Bale, Sue

    2014-01-01

    While recent years have seen an increase in the number of participatory and inclusive research studies being undertaken where people with learning disabilities are active members of the research team, little has been published about how teams decide what to research. This paper aims to fill this gap by discussing how in one area of Wales a…

  10. 45 CFR 2554.39 - How is the case decided?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 45, Public Welfare (Regulations Relating to Public Welfare, Continued); CORPORATION FOR NATIONAL AND COMMUNITY SERVICE; PROGRAM FRAUD CIVIL REMEDIES ACT REGULATIONS; Decisions and Appeals. § 2554.39 How is the case decided? ...

  11. Generation of an ABCG2^GFPn-puro transgenic line - A tool to study ABCG2 expression in mice

    SciTech Connect

    Orford, Michael; Mean, Richard; Lapathitis, George; Genethliou, Nicholas; Panayiotou, Elena; Panayi, Helen; Malas, Stavros

    2009-06-26

    The ATP-binding cassette (ABC) transporter 2 (ABCG2) is expressed by stem cells in many organs and in stem cells of solid tumors. These cells are isolated based on the side population (SP) phenotype, a Hoechst 33342 dye efflux property believed to be conferred by ABCG2. Because of the limitations of this approach, we generated transgenic mice that express nuclear GFP (GFPn) coupled to the puromycin-resistance gene, under the control of ABCG2 promoter/enhancer sequences. We show that ABCG2 is expressed in neural progenitors of the developing forebrain and spinal cord and in embryonic and adult endothelial cells of the brain. Using the neurosphere assay, we isolated tripotent ABCG2-expressing neural stem cells from embryonic mouse brain. This transgenic line is a powerful tool for studying the expression of ABCG2 in many tissues and for performing functional studies in different experimental settings.

  12. Latest generation of flat detector CT as a peri-interventional diagnostic tool: a comparative study with multidetector CT.

    PubMed

    Leyhe, Johanna Rosemarie; Tsogkas, Ioannis; Hesse, Amélie Carolina; Behme, Daniel; Schregel, Katharina; Papageorgiou, Ismini; Liman, Jan; Knauth, Michael; Psychogios, Marios-Nikos

    2016-12-20

    Flat detector CT (FDCT) has been used as a peri-interventional diagnostic tool in numerous studies with mixed results regarding image quality and detection of intracranial lesions. We compared the diagnostic aspects of the latest generation FDCT with standard multidetector CT (MDCT). 102 patients were included in our retrospective study. All patients had undergone interventional procedures. FDCT was acquired peri-interventionally and compared with postinterventional MDCT regarding depiction of ventricular/subarachnoidal spaces, detection of intracranial hemorrhage, and delineation of ischemic lesions using an ordinal scale. Ischemic lesions were quantified with the Alberta Stroke Program Early CT Score (ASPECTS) on both examinations. Two neuroradiologists with varying grades of experience and a medical student scored the anonymized images separately, blinded to the clinical history. The two methods were of equal diagnostic value regarding evaluation of the ventricular system and the subarachnoidal spaces. Subarachnoidal, intraventricular, and parenchymal hemorrhages were detected with a sensitivity of 95%, 97%, and 100% and specificity of 97%, 100%, and 99%, respectively, using FDCT. Gray-white differentiation was feasible in the majority of FDCT scans, and ischemic lesions were detected with a sensitivity of 71% on FDCT, compared with MDCT scans. The mean difference in ASPECTS values on FDCT and MDCT was 0.5 points (95% CI 0.12 to 0.88). The latest generation of FDCT is a reliable and accurate tool for the detection of intracranial hemorrhage. Gray-white differentiation is feasible in the supratentorial region. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
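
    The accuracy figures quoted here reduce to confusion-matrix arithmetic, and the ASPECTS agreement to a paired mean difference with a t-based confidence interval. A small sketch with hypothetical counts and scores, not the study's raw data:

      import numpy as np
      from scipy import stats

      def sens_spec(tp, fn, tn, fp):
          # Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
          return tp / (tp + fn), tn / (tn + fp)

      # Hypothetical counts for one hemorrhage type on FDCT vs MDCT:
      print(sens_spec(tp=19, fn=1, tn=95, fp=3))

      # t-based 95% CI for a paired mean ASPECTS difference (toy scores):
      diff = np.array([0, 1, 0, 2, 0, 1, -1, 1, 0, 1])
      lo, hi = stats.t.interval(0.95, len(diff) - 1,
                                loc=diff.mean(), scale=stats.sem(diff))
      print(diff.mean(), (lo, hi))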

  13. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  14. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
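
    GeneAnalytics' own scoring is proprietary and evidence-weighted, but gene set analysis tools of this kind build on a common statistical core: testing whether the overlap between a query gene list and a curated set is larger than chance. A minimal sketch of that generic hypergeometric test; the gene names and universe size are illustrative:

      from scipy.stats import hypergeom

      def enrichment_p(query_genes, gene_set, universe_size):
          # One-sided hypergeometric P-value for the observed overlap.
          query, path = set(query_genes), set(gene_set)
          k = len(query & path)                    # observed overlap
          return hypergeom.sf(k - 1, universe_size, len(path), len(query))

      query = ["TP53", "MET", "EGFR", "KRAS"]
      pathway = ["MET", "EGFR", "ERBB2", "KRAS", "BRAF"]
      print(enrichment_p(query, pathway, universe_size=20000))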

  15. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    PubMed

    Zhang, Yuan; Sun, Yanni; Cole, James R

    2014-08-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. When quality reference genomes are lacking, de novo assembly is commonly used for RNA-Seq data of non-model organisms and for metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented or chimeric contigs, or have a high memory footprint. In this work, we introduce a targeted gene assembly program, SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and a lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material.
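
    SAT-Assembler's central data structure is a homology-guided overlap graph. A toy sketch of the overlap step only: computing suffix-prefix overlaps between reads and recording them as weighted edges. In the real tool, reads are first restricted to a gene family by homology search and the traversal is far more careful; the reads and minimum overlap below are illustrative:

      def suffix_prefix_overlap(a, b, min_len):
          # Longest suffix of a equal to a prefix of b (>= min_len), else 0.
          for k in range(min(len(a), len(b)), min_len - 1, -1):
              if a[-k:] == b[:k]:
                  return k
          return 0

      def overlap_graph(reads, min_len):
          edges = {}
          for i, a in enumerate(reads):
              for j, b in enumerate(reads):
                  if i != j:
                      k = suffix_prefix_overlap(a, b, min_len)
                      if k:
                          edges[(i, j)] = k   # edge weight = overlap length
          return edges

      reads = ["ATGGCGTACGTTAGC", "CGTTAGCCGATACGA", "GATACGATTTGCAGT"]
      print(overlap_graph(reads, min_len=5))   # {(0, 1): 7, (1, 2): 7}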

  16. MATURE: A Model Driven bAsed Tool to Automatically Generate a langUage That suppoRts CMMI Process Areas spEcification

    NASA Astrophysics Data System (ADS)

    Musat, David; Castaño, Víctor; Calvo-Manzano, Jose A.; Garbajosa, Juan

    Many companies have achieved higher quality in their processes by using CMMI. Process definition may be efficiently supported by software tools. A higher automation level will make process improvement and assessment activities easier to adapt to customer needs. At present, automation of CMMI is based on tools that support practice definition in a textual way. These tools are often enhanced spreadsheets. In this paper, following the Model Driven Development (MDD) paradigm, a tool that supports automatic generation of a language that can be used to specify process area practices is presented. The generation is performed from a metamodel that represents CMMI. Unlike other available tools, this one can be customized according to user needs. Guidelines to specify the CMMI metamodel are also provided. The paper also shows how this approach can support other assessment methods.
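
    The paper's central move is generating a specification language from a CMMI metamodel instead of hand-maintaining spreadsheets. A minimal sketch of that model-driven step, emitting a toy textual language from a dictionary metamodel; the metamodel fragment and the generated syntax are both invented for illustration:

      # Hypothetical metamodel fragment: process areas own goals, which own
      # practices; the real tool derives this from a full CMMI metamodel.
      metamodel = {
          "ProjectPlanning": {
              "SG1 Establish Estimates": ["SP1.1 Estimate Scope",
                                          "SP1.2 Define Project Lifecycle"],
          },
      }

      def emit_language(mm):
          lines = []
          for area, goals in mm.items():
              lines.append(f"processArea {area} {{")
              for goal, practices in goals.items():
                  lines.append(f'  goal "{goal}" {{')
                  lines += [f'    practice "{p}";' for p in practices]
                  lines.append("  }")
              lines.append("}")
          return "\n".join(lines)

      print(emit_language(metamodel))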

  17. DNA barcoding, microarrays and next generation sequencing: recent tools for genetic diversity estimation and authentication of medicinal plants.

    PubMed

    Sarwat, Maryam; Yamdagni, Manu Mayank

    2016-01-01

    DNA barcoding, microarray technology and next generation sequencing have emerged as promising tools for the elucidation of plant genetic diversity and its conservation. They are proving immensely helpful in authenticating useful medicinal plants for herbal drug preparations. These newer molecular marker technologies use short genetic markers in the genome to characterize an organism to a particular species. This has the potential not only to classify known and as-yet-unknown species but also to link medicinally important plants to their properties. The newer trends in DNA chips and barcoding pave the way for a future with many different possibilities. Several of these possibilities might be: characterization of unknown species in considerably less time than usual, identification of newer medicinal properties possessed by a species, and updating the data on already existing but unnoticed properties. This can assist in curing many different diseases and will also generate novel opportunities in medicinal drug delivery and targeting.

  18. A software tool for interactive generation, representation, and systematical storage of transfer functions for 3D medical images.

    PubMed

    Alper Selver, M; Fischer, Felix; Kuntalp, Mehmet; Hillen, Walter

    2007-06-01

    Transfer functions assign the optical parameters (color and transparency) used in interactive visualization, so they strongly affect the quality of volume-rendered medical images. However, finding accurate transfer functions is a difficult, tedious, and time-consuming task because of the vast space of possibilities. To address this problem, a software module that can be easily plugged into any visualization program was developed based on the specific expectations of medical experts. Its design includes both a new user interface to ease the interactive generation of volume-rendered medical images and a volumetric-histogram-based method for the initial generation of transfer functions. In addition, a novel file system has been implemented to represent 3D medical images using transfer functions based on the DICOM standard. For evaluation by various medical experts, the software was installed into a DICOM viewer. Based on the feedback obtained from the medical experts, several improvements were made, especially to increase the flexibility of the program. The final version of the implemented system shortens the transfer function design process and is applicable to various application areas.
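
    A common way to bootstrap an initial transfer function from a volumetric histogram, as the module's initializer does, is to make frequent intensities (background) transparent and rare ones (boundaries, small structures) opaque. A minimal sketch under that assumption; the exact initialization rule of the module is not given in the abstract:

      import numpy as np

      def histogram_transfer_function(volume, n_bins=256):
          # Map each intensity bin to an opacity and a gray value.
          hist, edges = np.histogram(volume, bins=n_bins)
          weight = 1.0 - hist / hist.max()       # emphasize rare intensities
          opacity = weight ** 2                  # assumed shaping curve
          gray = np.linspace(0.0, 1.0, n_bins)   # simple gray color ramp
          return edges[:-1], opacity, gray

      vol = np.random.default_rng(3).normal(100.0, 20.0, (64, 64, 64))
      bin_starts, opacity, gray = histogram_transfer_function(vol)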

  1. Evaluation of pulsed laser ablation in liquids generated gold nanoparticles as novel transfection tools: efficiency and cytotoxicity

    NASA Astrophysics Data System (ADS)

    Willenbrock, Saskia; Durán, María Carolina; Barchanski, Annette; Barcikowski, Stephan; Feige, Karsten; Nolte, Ingo; Murua Escobar, Hugo

    2014-03-01

    Varying transfection efficiencies and cytotoxicity are crucial aspects in cell manipulation. The utilization of gold nanoparticles (AuNP) has lately attracted special interest as a means to enhance transfection efficiency. Conventional AuNP are usually generated by chemical reactions or gas pyrolysis, often requiring cell-toxic stabilizers or coatings to conserve their characteristics. Alternatively, stabilizer- and coating-free, highly pure, colloidal AuNP can be generated by pulsed laser ablation in liquids (PLAL). Mammalian cells have been transfected efficiently by addition of PLAL-AuNP, but data systematically evaluating the cell-toxic potential are lacking. Herein, the transfection efficiency and cytotoxicity of PLAL-AuNP were evaluated by transfection of a mammalian cell line with a recombinant HMGB1/GFP DNA expression vector. Different methods were compared using two sizes of PLAL-AuNP, commercialized AuNP, two magnetic NP-based protocols and a conventional transfection reagent (FuGENE HD; FHD). PLAL-AuNP were generated using a Spitfire Pro femtosecond laser system delivering 120 fs laser pulses at a wavelength of 800 nm, focusing the fs-laser beam on a 99.99% pure gold target placed in ddH2O. Transfection efficiencies were analyzed after 24 h using fluorescence microscopy and flow cytometry. Toxicity was assessed by measuring cell proliferation and the percentage of necrotic, propidium iodide positive cells (PI %). The addition of PLAL-AuNP significantly enhanced transfection efficiencies (FHD: 31 %; PLAL-AuNP size-1: 46 %; size-2: 50 %) with increased PI % but no reduction in cell proliferation. Commercial AuNP transfection showed significantly lower efficiency (23 %), slightly increased PI % and reduced cell proliferation. Magnetic NP-based methods were less effective but also showed the lowest cytotoxicity. In conclusion, addition of PLAL-AuNP provides a novel tool for enhancing transfection efficiency with acceptable cytotoxic side-effects.

  2. How do you decide what education programs to attend?

    PubMed

    Camp, J H

    1981-01-01

    Increasingly, constraints on resources require that any decision to participate in a continuing education program be scrutinized to ensure the most effective and efficient use of time, money, and personnel. In addition to these constraints, and complicating the decision-making process, are professional pressures to keep abreast of developments in the field. But deciding whether to attend a seminar or workshop can be facilitated if the program is evaluated according to the suggestions in this article.

  3. The Society-Deciders Model and Fairness in Nations

    NASA Astrophysics Data System (ADS)

    Flomenbom, Ophir

    2015-05-01

    Modeling the dynamics in nations from economic and sociological perspectives is a central theme in economics and sociology. Accurate models can predict and therefore help all the world's citizens. Yet recent years have shown that current models fall short. Here, we develop a dynamical society-deciders model that can explain the stability of a nation, based on concepts from dynamics, ecology and socio-econo-physics; a nation has two interconnected groups, the deciders and the society. We show that a nation is either stable or it collapses. This depends on just two coefficients, which we relate to sociological and economical indicators. We define a new socio-economic indicator, fairness. Fairness can measure the stability of a nation and how probable a change favoring the society is. We compute fairness for all the world's nations. Interestingly, in comparison with other indicators, fairness shows that the USA loses its rank among Western democracies, India is the best among the 15 most populated nations, and Egypt, Libya and Tunisia have significantly improved their rankings as a result of recent revolutions, further increasing the probability of additional positive changes. Within the model, long-lasting crises are solved not with increased governmental spending or cuts, but with regulations that reduce the stability of the deciders, namely, by increasing fairness while, for example, shifting wealth in the direction of the people, thereby further increasing opportunities.
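
    A minimal sketch of the stability question the model poses, assuming the society-deciders coupling can be linearized as a two-variable system whose off-diagonal terms are the two coefficients mentioned; the paper's actual equations may differ, so this is only the generic eigenvalue test for such two-group models:

      import numpy as np

      def stable(a_sd, a_ds):
          # x' = C x for x = (society, deciders); each group self-damps
          # (-1 diagonal) and responds to the other via a_sd, a_ds.
          C = np.array([[-1.0, a_sd],
                        [a_ds, -1.0]])
          return bool(np.all(np.linalg.eigvals(C).real < 0))

      print(stable(0.5, 0.5))   # True: weak coupling, nation stable
      print(stable(2.0, 2.0))   # False: strong coupling, collapse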

  4. Generations.

    PubMed

    Chambers, David W

    2005-01-01

    Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession.

  5. The abridged patient-generated subjective global assessment is a useful tool for early detection and characterization of cancer cachexia.

    PubMed

    Vigano, Antonio L; di Tomasso, Jonathan; Kilgour, Robert D; Trutschnigg, Barbara; Lucar, Enriqueta; Morais, José A; Borod, Manuel

    2014-07-01

    Cancer cachexia (CC) is a syndrome characterized by wasting of lean body mass and fat, often driven by decreased food intake, hypermetabolism, and inflammation, resulting in decreased lifespan and quality of life. Classification of cancer cachexia has improved, but few clinically relevant diagnostic tools exist for its early identification and characterization. The abridged Patient-Generated Subjective Global Assessment (aPG-SGA) is a modification of the original Patient-Generated Subjective Global Assessment, and consists of a four-part questionnaire that scores patients' weight history, food intake, appetite, and performance status. The purpose of this study was to determine whether the aPG-SGA is associated with both features and clinical sequelae of cancer cachexia. In this prospective cohort study, 207 advanced lung and gastrointestinal cancer patients completed the following tests: aPG-SGA, Edmonton Symptom Assessment System, handgrip strength, a complete blood count, albumin, apolipoprotein A and B, and C-reactive protein. Ninety-four participants with good performance status as assessed by the Eastern Cooperative Oncology Group Performance Status completed additional questionnaires and underwent body composition testing. Of these, 68 patients were tested for quadriceps strength and completed a 3-day food recall. Multivariable regression models revealed that higher aPG-SGA scores (≥9 vs 0 to 1) are significantly associated (P<0.05) with the following: unfavorable biological markers of cancer cachexia, such as higher white blood cell counts (10.0 vs 6.7×10^9/L), lower hemoglobin (115.6 vs 127.7 g/L), and elevated C-reactive protein (42.7 vs 18.2 mg/L [406.7 vs 173.3 nmol/L]); decreased anthropometric and physical measures, such as body mass index (22.5 vs 27.1), fat mass (14.4 vs 26.0 kg), handgrip (24.7 vs 34.9 kg) and leg strength; an average 12% greater length of hospital stay; a dose reduction in chemotherapy; and increased mortality. Given its association with
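
    The aPG-SGA is a four-part additive questionnaire, and the study's findings apply a risk threshold of >=9 to the total. A minimal sketch of that scoring logic; the point values passed below are placeholders, not the validated instrument weights:

      # Hypothetical component points for weight history, food intake,
      # appetite, and performance status (NOT the validated aPG-SGA weights).
      def apgsga_total(weight_history, food_intake, appetite, performance):
          return weight_history + food_intake + appetite + performance

      score = apgsga_total(weight_history=3, food_intake=2,
                           appetite=2, performance=2)
      print(score, "elevated risk" if score >= 9 else "lower risk")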

  6. Dynamic combinatorial/covalent chemistry: a tool to read, generate and modulate the bioactivity of compounds and compound mixtures.

    PubMed

    Herrmann, Andreas

    2014-03-21

    Reversible covalent bond formation under thermodynamic control adds reactivity to self-assembled supramolecular systems, and is therefore an ideal tool to assess complexity of chemical and biological systems. Dynamic combinatorial/covalent chemistry (DCC) has been used to read structural information by selectively assembling receptors with the optimum molecular fit around a given template from a mixture of reversibly reacting building blocks. This technique allows access to efficient sensing devices and the generation of new biomolecules, such as small molecule receptor binders for drug discovery, but also larger biomimetic polymers and macromolecules with particular three-dimensional structural architectures. Adding a kinetic factor to a thermodynamically controlled equilibrium results in dynamic resolution and in self-sorting and self-replicating systems, all of which are of major importance in biological systems. Furthermore, the temporary modification of bioactive compounds by reversible combinatorial/covalent derivatisation allows control of their release and facilitates their transport across amphiphilic self-assembled systems such as artificial membranes or cell walls. The goal of this review is to give a conceptual overview of how the impact of DCC on supramolecular assemblies at different levels can allow us to understand, predict and modulate the complexity of biological systems.

  7. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes

    PubMed Central

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A.; Albrectsen, Benedicte R.; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high throughput assay for characterization and monitoring the microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of the PM data challenging. For the best result, we recommend that the growth conditions for the fungi are carefully standardized. In addition, rigorous replication and control strategies should be employed whether using pre-configured, commercial microwell-plates or in-house designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool to characterize the metabolic capabilities of individual endophyte isolates, or successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees. PMID:26441951

  8. Phenotype MicroArrays as a complementary tool to next generation sequencing for characterization of tree endophytes.

    PubMed

    Blumenstein, Kathrin; Macaya-Sanz, David; Martín, Juan A; Albrectsen, Benedicte R; Witzell, Johanna

    2015-01-01

    There is an increasing need to calibrate microbial community profiles obtained through next generation sequencing (NGS) with relevant taxonomic identities of the microbes, and to further associate these identities with phenotypic attributes. Phenotype MicroArray (PM) techniques provide a semi-high throughput assay for characterization and monitoring the microbial cellular phenotypes. Here, we present detailed descriptions of two different PM protocols used in our recent studies on fungal endophytes of forest trees, and highlight the benefits and limitations of this technique. We found that the PM approach enables effective screening of substrate utilization by endophytes. However, the technical limitations are multifaceted and the interpretation of the PM data challenging. For the best result, we recommend that the growth conditions for the fungi are carefully standardized. In addition, rigorous replication and control strategies should be employed whether using pre-configured, commercial microwell-plates or in-house designed PM plates for targeted substrate analyses. With these precautions, the PM technique is a valuable tool to characterize the metabolic capabilities of individual endophyte isolates, or successional endophyte communities identified by NGS, allowing a functional interpretation of the taxonomic data. Thus, PM approaches can provide valuable complementary information for NGS studies of fungal endophytes in forest trees.

  9. Do sunbirds use taste to decide how much to drink?

    PubMed

    Bailey, Ida E; Nicolson, Susan W

    2016-03-01

    Nectarivorous birds typically consume smaller meals of more concentrated than of less concentrated sugar solutions. It is not clear, however, whether they use taste to decide how much to consume or whether they base this decision on post-ingestive feedback. Taste, a cue to nectar concentration, is available to nectarivores during ingestion whereas post-ingestive information about resource quality becomes available only after a meal. When conditions are variable, we would expect nectarivorous birds to base their decisions on how much to consume on taste, as post-ingestive feedback from previous meals would not be a reliable cue to current resource quality. Here, we tested whether white-bellied sunbirds (Cinnyris talatala), foraging from an array of artificial flowers, use taste to decide how much to consume per meal when nectar concentration is highly variable: they did not. Instead, how much they chose to consume per meal appeared to depend on the energy intake at the previous meal, that is how hungry they were. Our birds did, however, appear to use taste to decide how much to consume per flower visited within a meal. Unexpectedly, some individuals preferred to consume more from flowers with lower concentration rewards and some preferred to do the opposite. We draw attention to the fact that many studies perhaps misleadingly claim that birds use sweet taste to inform their foraging decisions, as they analyse mean data for multiple meals over which post-ingestive feedback will have become available rather than data for individual meals when only sensory information is available. We discuss how conflicting foraging rules could explain why sunbirds do not use sweet taste to inform their meal size decisions.

  10. FANSe2: A Robust and Cost-Efficient Alignment Tool for Quantitative Next-Generation Sequencing Applications

    PubMed Central

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of deep sequencing data is inevitably dependent on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To have both advantages, we developed an algorithm, FANSe2, with an iterative mapping strategy based on the statistics of real-world sequencing error distribution to substantially accelerate the mapping without compromising the accuracy. Its sensitivity and accuracy are higher than the BWT-based algorithms in the tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms have false positives and false negatives. FANSe2 showed remarkably better consistency with the microarray than most other algorithms in terms of gene expression quantification. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/. PMID:24743329

  11. FANSe2: a robust and cost-efficient alignment tool for quantitative next-generation sequencing applications.

    PubMed

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-Jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of deep sequencing data is inevitably dependent on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To have both advantages, we developed an algorithm, FANSe2, with an iterative mapping strategy based on the statistics of real-world sequencing error distribution to substantially accelerate the mapping without compromising the accuracy. Its sensitivity and accuracy are higher than the BWT-based algorithms in the tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms have false positives and false negatives. FANSe2 showed remarkably better consistency with the microarray than most other algorithms in terms of gene expression quantification. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/.
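
    FANSe2's own strategy is iterative and driven by measured error statistics; the sketch below shows only the basic seed-and-extend pattern that seed-based mappers share, with a hypothetical genome string, seed length, and mismatch budget:

      def hamming_within(a, b, max_mm):
          mm = 0
          for x, y in zip(a, b):
              mm += x != y
              if mm > max_mm:
                  return False
          return True

      def map_read(read, genome, k=12, max_mm=2):
          # Toy seed-and-extend: exact k-mer seeds, then a full-length
          # comparison under a mismatch budget.
          index = {}
          for i in range(len(genome) - k + 1):
              index.setdefault(genome[i:i + k], []).append(i)
          hits = set()
          for off in range(0, len(read) - k + 1, k):   # non-overlapping seeds
              for pos in index.get(read[off:off + k], []):
                  start = pos - off
                  if (0 <= start <= len(genome) - len(read) and
                          hamming_within(read, genome[start:start + len(read)],
                                         max_mm)):
                      hits.add(start)
          return sorted(hits)

      genome = "ACGTACGTTAGCGGATCCATCGTACGATCGGATATAT"
      print(map_read("GGATCCATCGTA", genome, k=6, max_mm=1))   # [12]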

  12. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    S, Kyriacou; E, Kontoleontos; S, Weissenberger; L, Mangani; E, Casartelli; I, Skouteropoulou; M, Gattringer; A, Gehrer; M, Buchmayr

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (the EASY software), a fast solver (block-coupled CFD) and a flexible geometry generation tool. The EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA(PCA)) that can be used in both single-objective (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. Besides being robust and fast, this method also provides a large reduction in computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
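
    A compact sketch of the metamodel-assisted loop described here: offspring are pre-screened by a cheap surrogate in a PCA-rotated design space, and only the most promising are sent to the expensive evaluation (a stand-in function below rather than a CFD solve). EASY's actual MAEA(PCA) machinery, surrogates and operators are considerably more sophisticated:

      import numpy as np

      def expensive_f(x):                  # stand-in for a CFD evaluation
          return float(np.sum((x - 0.3) ** 2))

      rng = np.random.default_rng(4)
      dim, pop_n = 8, 24
      pop = rng.uniform(-1.0, 1.0, (pop_n, dim))
      fit = np.array([expensive_f(x) for x in pop])

      for gen in range(30):
          parents = pop[rng.integers(0, pop_n, 3 * pop_n)]
          kids = parents + rng.normal(0.0, 0.1, parents.shape)  # mutation
          # PCA (via SVD) captures dependences among design variables.
          mean = pop.mean(axis=0)
          _, _, vt = np.linalg.svd(pop - mean, full_matrices=False)
          z_pop, z_kids = (pop - mean) @ vt.T, (kids - mean) @ vt.T
          # Cheap surrogate: nearest neighbor in PCA space predicts fitness.
          d = np.linalg.norm(z_kids[:, None, :] - z_pop[None, :, :], axis=2)
          pred = fit[d.argmin(axis=1)]
          # Exactly evaluate only the most promising third of the offspring.
          elite = kids[np.argsort(pred)[:pop_n]]
          elite_fit = np.array([expensive_f(x) for x in elite])
          merged = np.vstack([pop, elite])
          merged_fit = np.concatenate([fit, elite_fit])
          keep = np.argsort(merged_fit)[:pop_n]
          pop, fit = merged[keep], merged_fit[keep]

      print(fit.min())   # approaches 0 as the population nears x = 0.3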

  13. Wireless sensor systems for sense/decide/act/communicate.

    SciTech Connect

    Berry, Nina M.; Cushner, Adam; Baker, James A.; Davis, Jesse Zehring; Stark, Douglas P.; Ko, Teresa H.; Kyker, Ronald D.; Stinnett, Regan White; Pate, Ronald C.; Van Dyke, Colin; Kyckelhahn, Brian

    2003-12-01

    After 9/11, the United States (U.S.) was suddenly pushed into challenging situations it could no longer ignore as a mere spectator. The War on Terrorism (WoT) was ignited, and no one knows when this war will end. While the government is exploring many existing and potential technologies, the area of wireless sensor networks (WSN) has emerged as a foundation for establishing future national security. Unlike other technologies, WSN could provide the virtual-presence capabilities needed for precision awareness and response in military, intelligence, and homeland security applications. The Advanced Concepts Group (ACG) vision of a Sense/Decide/Act/Communicate (SDAC) sensor system is an instantiation of the WSN concept that takes a 'system of systems' view. Each sensing node will exhibit the ability to: sense the environment around it, decide as a collective what the situation in the environment is, act in an intelligent and coordinated manner in response to this situational determination, and communicate its actions amongst its peers and to a human command. This LDRD report provides a review of the research and development done to bring the SDAC vision closer to reality.

  14. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    SciTech Connect

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-06-15

    Purpose: To automate the creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open-source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment, plus additional functions programmed into the tool, insulates users from the tedium of the underlying schema and allows easy calculation, parameterization, graphical visualization, validation and, finally, automatic generation of Developer Mode XML scripts that are directly loadable on a TrueBeam linac. Methods: TrueBeam Developer Mode provides a robotic control platform that allows coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or a combination thereof). Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability of a spreadsheet an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows parameterization of trajectories, enabling the creation of entire families of trajectories from only a few variables. Standard spreadsheet functionality has been extended for movie-like dynamic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, so that motions can be analyzed before any actual linac time is required. Results: We used the tool to generate and deliver extended-SAD “virtual isocenter” trajectories of various shapes, such as parameterized circles and ellipses. We also demonstrated its use in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource for experimenting with families of complex geometric trajectories on a TrueBeam linac. It makes Developer Mode more accessible as a vehicle to quickly
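
    As an illustration of what a parameterized trajectory means here, the following Python sketch samples a circular virtual-isocenter path as axis positions versus control point. The axis names are assumptions, and the exact Developer Mode XML schema is deliberately not reproduced:

      import math

      def circle_trajectory(radius_cm=5.0, points=36, total_mu=100.0):
          # one control point per sample; MU and axis values at each point
          for i in range(points + 1):
              theta = 2 * math.pi * i / points
              yield {"mu": total_mu * i / points,
                     "gantry_deg": math.degrees(theta) % 360,
                     "couch_lat_cm": radius_cm * math.cos(theta),
                     "couch_lng_cm": radius_cm * math.sin(theta)}

      for cp in circle_trajectory(points=4):
          print(cp)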

  15. Profiling biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets using gastrointestinal simulation technology.

    PubMed

    Wu, Chunnuan; Sun, Le; Sun, Jin; Yang, Yajun; Ren, Congcong; Ai, Xiaoyu; Lian, He; He, Zhonggui

    2013-09-10

    The aim of the present study was to correlate the in vitro properties of a drug formulation with its in vivo performance, and to elucidate the deciding properties of oral absorption. Gastrointestinal simulation technology (GST), implemented in the GastroPlus™ software, was used to simulate the in vivo plasma concentration-time curve. Lansoprazole, a typical BCS class II drug, was chosen as the model drug. First, physicochemical and pharmacokinetic parameters of lansoprazole were determined or collected from the literature to construct the model. The developed model was validated by comparing predicted against experimental plasma concentration data; the predicted curve was in good agreement with the experimental data. Then, parameter sensitivity analysis (PSA) was performed to find the key parameters of oral absorption. For lansoprazole enteric-coated tablets, absorption was particularly sensitive to dose, solubility and particle size. With a single dose of 30 mg and a solubility of 0.04 mg/ml, absorption was complete, and good absorption could be achieved once the particle radius was reduced to about 25 μm. In summary, GST is a useful tool for profiling the biopharmaceutical deciding properties of absorption of lansoprazole enteric-coated tablets and guiding formulation optimization.
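
    A one-at-a-time sensitivity sweep of the kind PSA performs can be sketched with a toy dissolution-limited absorption model; the rate constant and its scaling below are invented for illustration and are not the GastroPlus model:

      import math

      def fraction_absorbed(radius_um, solubility_mg_ml=0.04, transit_h=3.0):
          # toy dissolution-limited model: rate falls with the square of radius
          k = 20.0 * solubility_mg_ml / (radius_um / 25.0) ** 2
          return 1.0 - math.exp(-k * transit_h)

      for r in (5, 25, 50, 100):      # one-at-a-time sweep of particle radius
          print(f"radius {r:>3} um -> fraction absorbed {fraction_absorbed(r):.2f}")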

  16. E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and to emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making, and then directly to the mobile devices of responders. By adopting a service-oriented architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain a situational awareness

  17. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. Termination criteria of this kind have been criticized for their lack of automation. In an earlier paper we showed how to construct an automated termination criterion if the recursion is aligned with the rewrite relation, and demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.
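
    To fix intuition about the property being decided, the following Python sketch searches for a rewrite loop by depth-bounded exploration; a string recurring along a single rewrite path witnesses non-termination. The paper's decision procedure for ancestor match-bounded systems is of course far subtler than this bounded search:

      def successors(s, rules):
          for lhs, rhs in rules:
              i = s.find(lhs)
              while i != -1:
                  yield s[:i] + rhs + s[i + len(lhs):]
                  i = s.find(lhs, i + 1)

      def has_loop(s, rules, path=(), depth=20):
          if s in path:
              return True                    # s rewrites back to itself: a loop
          if depth == 0:
              return None                    # undecided within the search bound
          results = [has_loop(t, rules, path + (s,), depth - 1)
                     for t in successors(s, rules)]
          if any(r is True for r in results):
              return True
          return None if None in results else False

      print(has_loop("ab", [("ab", "ba")]))                  # False: terminates
      print(has_loop("ab", [("ab", "ba"), ("ba", "ab")]))    # True: loops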

  18. 13 CFR 126.804 - Will SBA decide all HUBZone status protests?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Section 126.804, Business Credit and Assistance, SMALL BUSINESS ADMINISTRATION, HUBZONE PROGRAM, Protests. § 126.804 Will SBA decide all HUBZone status protests? SBA will decide all protests not dismissed as premature, untimely...

  19. 32 CFR 1653.4 - File to be returned after appeal to the President is decided.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Section 1653.4, National Defense, Other Regulations Relating to National Defense, SELECTIVE SERVICE SYSTEM, APPEAL TO THE PRESIDENT. § 1653.4 File to be returned after appeal to the President is decided. When the appeal to the President has been decided, the file shall be returned as...

  20. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    Section 39.133, Indians, INDIAN SCHOOL EQUALIZATION PROGRAM, Indian School Equalization Formula, Language Development Programs. § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school boards decide...

  1. ELF/VLF emissions generated in the ionosphere by heating facilities - a new tool for ionospheric and magnetospheric research

    SciTech Connect

    Kotik, D.S.

    1994-12-01

    A brief summary of ELF/VLF generation experiments using the SURA heating facility is presented. The possibilities of applications of the measured ionospherically generated low frequency signal parameters for diagnosing the physical phenomena in the ionosphere and the magnetosphere are discussed.

  2. Volunteers Help Decide Where to Point Mars Camera

    NASA Image and Video Library

    2015-07-22

    This series of images from NASA's Mars Reconnaissance Orbiter successively zooms into "spider" features -- or channels carved in the surface in radial patterns -- in the south polar region of Mars. In a new citizen-science project, volunteers will identify features like these using wide-scale images from the orbiter. Their input will then help mission planners decide where to point the orbiter's high-resolution camera for more detailed views of interesting terrain. Volunteers will start with images from the orbiter's Context Camera (CTX), which provides wide views of the Red Planet. The first two images in this series are from CTX; the top right image zooms into a portion of the image at left. The top right image highlights the geological spider features, which are carved into the terrain in the Martian spring when dry ice turns to gas. By identifying unusual features like these, volunteers will help the mission team choose targets for the orbiter's High Resolution Imaging Science Experiment (HiRISE) camera, which can reveal more detail than any other camera ever put into orbit around Mars. The final image in this series (bottom right) shows a HiRISE close-up of one of the spider features. http://photojournal.jpl.nasa.gov/catalog/PIA19823

  3. Cardiac crossroads: deciding between mechanical or bioprosthetic heart valve replacement

    PubMed Central

    Tillquist, Maggie N; Maddox, Thomas M

    2011-01-01

    Nearly 15 million people in the United States suffer from either aortic or mitral valvular disease. For patients with severe and symptomatic valvular heart disease, valve replacement surgery improves morbidity and mortality outcomes. In 2009, 90,000 valve replacement surgeries were performed in the United States. This review evaluates the advantages and disadvantages of mechanical and bioprosthetic heart valves as well as the factors to consider in deciding the appropriate valve type for an individual patient. Although many caveats exist, the general recommendation is for patients younger than 60 to 65 years to receive mechanical valves, due to their longer durability, and for patients older than 60 to 65 years to receive a bioprosthetic valve to avoid complications with anticoagulants. Situations that warrant special consideration include patient co-morbidities, the need for anticoagulation, and the potential for pregnancy. Once these characteristics have been considered, patients' values, anxieties, and expectations for their lifestyle and quality of life should be incorporated into final valve selection. Decision aids can be useful in integrating preferences in the valve decision. Finally, future directions in valve technology, anticoagulation, and medical decision-making are discussed. PMID:21448466

  4. Dormancy and germination: How does the crop seed decide?

    PubMed

    Shu, K; Meng, Y J; Shuai, H W; Liu, W G; Du, J B; Liu, J; Yang, W Y

    2015-11-01

    Whether seeds germinate or maintain dormancy is decided through very intricate physiological processes, and correct timing of these processes is critical for the plant's life cycle. If moist conditions are encountered, a low dormancy level causes pre-harvest sprouting in various crop species, such as wheat, corn and rice; this decreases crop yield and negatively impacts downstream industrial processing. In contrast, a deep level of seed dormancy prevents normal germination even under favourable conditions, resulting in a low emergence rate during agricultural production. Therefore, an optimal seed dormancy level is valuable for modern mechanised agricultural systems. Over the past several years, numerous studies have demonstrated that diverse endogenous and environmental factors, such as light, temperature, water status, soil bacteria, and the phytohormones ABA (abscisic acid) and GA (gibberellic acid), regulate the balance between dormancy and germination. In this updated review, we highlight recent advances in the molecular mechanisms underlying the regulation of seed dormancy and germination, including the external environmental and internal hormonal cues, focusing primarily on the staple crop species. Furthermore, future challenges and research directions for developing a full understanding of crop seed dormancy and germination are also discussed.

  5. Positioning and deciding: key factors for talent development in soccer.

    PubMed

    Kannekens, R; Elferink-Gemser, M T; Visscher, C

    2011-12-01

    Talent identification and development involve recognizing youth players who will be successful in the future and guiding them to the top. A major determinant of this success is tactical skill. To identify possible key factors that help predict success over time, this study assessed the tactical skills of 105 elite youth soccer players who had participated in a talent development program at an earlier stage of their sport career (mean age 17.8±0.9). These skills were related to their adult performance level, specifically whether they became professionals (n=52) or amateurs (n=53). Defenders, midfielders and attackers completed the Tactical Skills Inventory for Sports, with scales for declarative and procedural knowledge in attacking and defensive situations. A logistic regression analysis was performed to identify the tactical skills that contribute to professional performance level in adulthood. Positioning and deciding appeared to be the tactical skill that best predicts adult performance level (P<0.05). This is especially true for midfielders, for whom the correct classification of elite youth players was in the range of 80%. For players scoring high on this skill, the odds ratios indicated a 6.60 times greater chance of becoming a professional than for players scoring low (P<0.05). © 2010 John Wiley & Sons A/S.

  6. Consent and assessment of capacity to decide or refuse treatment.

    PubMed

    Simpson, Owena

    Consent protects the right of patients to decide what happens to them. Before any medical intervention, adults must give valid consent, which must be voluntary, informed and given free of undue influence. When consent is being obtained, patients must be informed about the intervention, why it is being done and its risks; the information they are given must be recorded. Every effort should be made to explain the issues in terms the patient can understand and to provide support and aids to communication. Consent can be expressed, where patients say they consent or put it in writing, or implied, where a healthcare professional infers from their behaviour that they consent. While different types of consent are valid, some provide stronger proof in court that valid consent was given. Competent adults have the right to refuse treatment, regardless of the reasons they give for refusal and even if the refusal will result in death; clinicians must respect their decision. In some circumstances, such as when an unconscious person is admitted as an emergency, healthcare professionals can make decisions on behalf of patients, and must do so in patients' best interests.

  7. Deciding with Thresholds: Importance Measures and Value of Information.

    PubMed

    Borgonovo, Emanuele; Cillo, Alessandra

    2017-10-01

    Risk-informed decision making is often accompanied by the specification of an acceptable level of risk. Such a target level is compared against the value of a risk metric, usually computed through a probabilistic safety assessment model, to decide on the acceptability of a given design, the launch of a space mission, etc. Importance measures complement the decision process with information about the risk/safety significance of events. However, importance measures do not tell us whether the occurrence of an event can change the overarching decision. By linking value of information and importance measures for probabilistic risk assessment models, this work obtains a value-of-information-based importance measure that brings together the risk metric, risk importance measures, and the risk threshold in one expression. The new importance measure imposes no additional computational burden because it can be calculated from the risk achievement worth and the risk reduction worth, and it complements the insights delivered by these importance measures. Several properties are discussed, including the joint decision worth of groups of basic events. The application to the large loss-of-coolant accident sequence of the Advanced Test Reactor illustrates the risk analysis insights. © 2017 Society for Risk Analysis.
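
    The ingredients the new measure combines can be illustrated with the standard definitions of risk achievement worth (RAW) and risk reduction worth (RRW) on a toy risk metric; the paper's value-of-information expression itself is not reproduced here, only the arithmetic it builds on:

      def risk(q1, q2, q3=1e-3):
          # toy risk metric with minimal cut sets {1,2} and {3}
          return q1 * q2 + q3

      q1, q2 = 0.01, 0.05
      base = risk(q1, q2)
      raw1 = risk(1.0, q2) / base     # risk achievement worth of event 1
      rrw1 = base / risk(0.0, q2)     # risk reduction worth of event 1
      threshold = 2e-3                # acceptable-risk target

      print(f"base {base:.1e}  RAW_1 {raw1:.1f}  RRW_1 {rrw1:.2f}")
      print("event 1 occurring breaches the threshold:",
            risk(1.0, q2) > threshold)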

  8. NeuroTessMesh: A Tool for the Generation and Visualization of Neuron Meshes and Adaptive On-the-Fly Refinement

    PubMed Central

    Garcia-Cantero, Juan J.; Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis

    2017-01-01

    Gaining a better understanding of the human brain continues to be one of the greatest challenges for science, largely because of the overwhelming complexity of the brain and the difficulty of analyzing the features and behavior of dense neural networks. Regarding analysis, 3D visualization has proven to be a useful tool for the evaluation of complex systems. However, the large number of neurons in non-trivial circuits, together with their intricate geometry, makes the visualization of a neuronal scenario an extremely challenging computational problem. Previous work in this area dealt with the generation of 3D polygonal meshes that approximated the cells' overall anatomy but did not attempt to deal with the extremely high storage and computational cost required to manage a complex scene. This paper presents NeuroTessMesh, a tool specifically designed to cope with many of the problems associated with the visualization of neural circuits comprising large numbers of cells. In addition, this method facilitates the recovery and visualization of the 3D geometry of cells included in databases, such as NeuroMorpho, and provides the tools needed to approximate missing information such as the soma's morphology. The method takes as its only input the available compact, yet incomplete, morphological tracings of the cells as acquired by neuroscientists. It uses a multiresolution approach that combines an initial, coarse mesh generation with subsequent on-the-fly adaptive mesh refinement stages using tessellation shaders. For the coarse mesh generation, a novel approach based on the Finite Element Method allows approximation of the 3D shape of the soma from its incomplete description. Subsequently, the adaptive refinement process performed on the graphics card generates meshes with good visual quality at a reasonable computational cost, both in terms of memory and rendering time. All the described techniques have been integrated into NeuroTessMesh.

  9. Development of a 2nd Generation Decision Support Tool to Optimize Resource and Energy Recovery for Municipal Solid Waste

    EPA Science Inventory

    In 2012, EPA’s Office of Research and Development released the MSW decision support tool (MSW-DST) to help identify strategies for more sustainable MSW management. Depending upon local infrastructure, energy grid mix, population density, and waste composition and quantity, the m...

  10. The Circuit of Culture as a Generative Tool of Contemporary Analysis: Examining the Construction of an Education Commodity

    ERIC Educational Resources Information Center

    Leve, Annabelle M.

    2012-01-01

    Contemporary studies in the field of education cannot afford to neglect the ever present interrelationships between power and politics, economics and consumption, representation and identity. In studying a recent cultural phenomenon in government schools, it became clear that a methodological tool that made sense of these interlinked processes was…

  11. The Effectiveness of Virtual Learning Tools for Millennial Generation Students in a Community College Criminal Justice Degree Program

    ERIC Educational Resources Information Center

    Snyder, Lawrence

    2013-01-01

    An analysis of data from the Community College Survey of Student Engagement and multiyear analysis of pretest/posttest scores in introductory criminal justice courses revealed there was a systemic decline in student engagement and achievement. Because of this analysis, a commercial virtual learning tool (CJI) that purported great success in…

  12. Generation Y, Learner Autonomy and the Potential of Web 2.0 Tools for Language Learning and Teaching

    ERIC Educational Resources Information Center

    Morgan, Liam

    2012-01-01

    Purpose: The purpose of this paper is to examine the relationship between the development of learner autonomy and the application of Web 2.0 tools in the language classroom. Design/methodology/approach: The approach taken is that of qualitative action research within an explicit theoretical framework and the data were collected via surveys and…

  15. Canute Rules the Waves?: Hope for E-Library Tools Facing the Challenge of the "Google Generation"

    ERIC Educational Resources Information Center

    Myhill, Martin

    2007-01-01

    Purpose: To consider the findings of a recent e-resources survey at the University of Exeter Library in the context of the dominance of web search engines in academia, balanced by the development of e-library tools such as the library OPAC, OpenURL resolvers, metasearch engines, LDAP and proxy servers, and electronic resource management modules.…

  16. Evaluating Student-Generated Film as a Learning Tool for Qualitative Methods: Geographical "Drifts" and the City

    ERIC Educational Resources Information Center

    Anderson, Jon

    2013-01-01

    Film as a tool for learning offers considerable opportunity for enhancing student understanding. This paper reflects on the experiences of a project that required students to make a short film demonstrating their practical understanding of qualitative methods. In the psychogeographical tradition, students were asked to "drift" across the…

  19. Green Tool

    EPA Pesticide Factsheets

    The Green Tool represents infiltration-based stormwater control practices. It allows modelers to select a BMP type, channel shape and BMP unit dimensions, outflow control devices, and infiltration method. The program generates an HSPF-formatted FTABLE.
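
    The computation behind such a table can be sketched for a trapezoidal channel using Manning's equation; the column set (depth, surface area, volume, outflow) follows the usual FTABLE convention, but the exact HSPF layout, units and channel parameters below are assumptions, not a drop-in FTABLE writer:

      def ftable_rows(depths_ft, bottom_ft=10.0, side_slope=2.0,
                      length_ft=1000.0, n=0.035, slope=0.001):
          rows = []
          for d in depths_ft:
              top = bottom_ft + 2 * side_slope * d            # top width, ft
              xsec = (bottom_ft + side_slope * d) * d         # flow area, ft^2
              wetted = bottom_ft + 2 * d * (1 + side_slope ** 2) ** 0.5
              r = xsec / wetted if wetted else 0.0            # hydraulic radius
              q = (1.49 / n) * xsec * r ** (2 / 3) * slope ** 0.5  # Manning, cfs
              rows.append((d, top * length_ft / 43560.0,      # surface area, acres
                           xsec * length_ft / 43560.0,        # volume, acre-ft
                           q))
          return rows

      for row in ftable_rows([0.0, 1.0, 2.0, 4.0]):
          print("%6.2f %9.4f %9.4f %10.2f" % row)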

  20. SynBioSS designer: a web-based tool for the automated generation of kinetic models for synthetic biological constructs

    PubMed Central

    Weeding, Emma; Houle, Jason

    2010-01-01

    Modeling tools can play an important role in synthetic biology in the same way that modeling helps in other engineering disciplines: simulations can quickly probe mechanisms and provide a clear picture of how different components influence the behavior of the whole. We give a brief review of available tools and present SynBioSS Designer. The Synthetic Biology Software Suite (SynBioSS) is used for the generation, storage, retrieval and quantitative simulation of synthetic biological networks. SynBioSS consists of three distinct components: the Desktop Simulator, the Wiki, and the Designer. SynBioSS Designer takes as input molecular parts involved in gene expression and regulation (e.g. promoters, transcription factors, ribosome binding sites, etc.), and automatically generates complete networks of reactions that represent transcription, translation, regulation, induction and degradation of those parts. Effectively, Designer uses DNA sequences as input and generates networks of biomolecular reactions as output. In this paper we describe how Designer uses universal principles of molecular biology to generate models of any arbitrary synthetic biological system. These models are useful as they explain biological phenotypic complexity in mechanistic terms. In turn, such mechanistic explanations can assist in designing synthetic biological systems. We also discuss, giving practical guidance to users, how Designer interfaces with the Registry of Standard Biological Parts, the de facto compendium of parts used in synthetic biology applications. PMID:20639523

  1. SynBioSS designer: a web-based tool for the automated generation of kinetic models for synthetic biological constructs.

    PubMed

    Weeding, Emma; Houle, Jason; Kaznessis, Yiannis N

    2010-07-01

    Modeling tools can play an important role in synthetic biology in the same way that modeling helps in other engineering disciplines: simulations can quickly probe mechanisms and provide a clear picture of how different components influence the behavior of the whole. We give a brief review of available tools and present SynBioSS Designer. The Synthetic Biology Software Suite (SynBioSS) is used for the generation, storage, retrieval and quantitative simulation of synthetic biological networks. SynBioSS consists of three distinct components: the Desktop Simulator, the Wiki, and the Designer. SynBioSS Designer takes as input molecular parts involved in gene expression and regulation (e.g. promoters, transcription factors, ribosome binding sites, etc.), and automatically generates complete networks of reactions that represent transcription, translation, regulation, induction and degradation of those parts. Effectively, Designer uses DNA sequences as input and generates networks of biomolecular reactions as output. In this paper we describe how Designer uses universal principles of molecular biology to generate models of any arbitrary synthetic biological system. These models are useful as they explain biological phenotypic complexity in mechanistic terms. In turn, such mechanistic explanations can assist in designing synthetic biological systems. We also discuss, giving practical guidance to users, how Designer interfaces with the Registry of Standard Biological Parts, the de facto compendium of parts used in synthetic biology applications.
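
    The parts-to-reactions expansion can be caricatured in a few lines of Python: walk the ordered part list and emit transcription, translation and degradation reactions. The rules below are a toy reduction of what Designer actually encodes, and the part and species names are invented:

      def reactions_for(parts):
          rxns = []
          protein = next((name for kind, name in parts if kind == "cds"), "protein")
          for kind, name in parts:
              if kind == "promoter":
                  rxns.append(f"{name}_DNA -> {name}_DNA + mRNA")   # transcription
              elif kind == "rbs":
                  rxns.append(f"mRNA -> mRNA + {protein}")          # translation
          rxns += ["mRNA -> 0", f"{protein} -> 0"]                  # degradation
          return rxns

      construct = [("promoter", "Ptet"), ("rbs", "RBS1"), ("cds", "GFP")]
      print("\n".join(reactions_for(construct)))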

  2. The determination of waste generation and composition as an essential tool to improve the waste management plan of a university.

    PubMed

    Gallardo, A; Edo-Alcón, N; Carlos, M; Renau, M

    2016-07-01

    Institutions and enterprises where many people work become large gathering places with their own energy, water and resource needs. One of these needs is the correct management of the waste produced daily by these communities. Universities are a good example of an institution where a great number of people go every day to work or to study. Independently of their task, they use the different services of the university, such as cafeterias, canteens and photocopying, and their activity also requires a cleaning service. All these activities generate an environmental impact. Nowadays, many universities have accepted the challenge of minimizing this impact by applying several measures. One of the impacts to be reduced is waste generation. The first step in implementing a waste management plan at a university is to know the composition, amount and distribution of the waste generated in its facilities. As waste composition and generation depend, among other things, on the climate, these variables should be analysed over a full year. This research work estimates the waste generation and composition of a Spanish university, the Universitat Jaume I, during a school year. To this end, all the waste streams generated at the university were identified and quantified, with emphasis on those that are not controlled. Furthermore, several statistical analyses were carried out to determine whether the season of the year or the day of the week affects waste generation and composition. All this information will allow the university authorities to propose a set of minimization measures to improve the current management.
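
    The seasonal-effect question lends itself to a one-way ANOVA. A minimal sketch with invented daily tonnages (the study's actual figures are not reproduced here):

      from scipy.stats import f_oneway

      autumn = [412, 388, 405, 397]      # kg/day, invented figures
      winter = [365, 372, 380, 358]
      spring = [401, 415, 390, 408]

      stat, p = f_oneway(autumn, winter, spring)
      print(f"F = {stat:.2f}, p = {p:.4f}")   # a small p-value: season matters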

  3. Ewing sarcoma mimicking atypical carcinoid tumor: detection of unexpected genomic alterations demonstrates the use of next generation sequencing as a diagnostic tool.

    PubMed

    Doyle, Leona A; Wong, Kwok-Kin; Bueno, Raphael; Dal Cin, Paola; Fletcher, Jonathan A; Sholl, Lynette M; Kuo, Frank

    2014-01-01

    Increasingly, tumors are being analyzed for a variety of mutations and other genomic changes, with the goals of guiding personalized therapy and directing patients to appropriate clinical trials based on genotype, as well as identifying previously unknown genomic changes in different tumor types and thereby providing new insights into the pathogenesis of human cancers. Next generation sequencing is a powerful research tool now gaining traction in the clinic. In this report, we demonstrate the utility of next generation sequencing assays in providing diagnostic information when evaluating tumor specimens. This is illustrated by a case previously thought to represent an atypical carcinoid tumor, in which an EWSR1-ERG translocation was detected during next generation sequencing using a hybrid capture approach, leading to a revised diagnosis of Ewing sarcoma. The role of translocation detection in these assays is also discussed.

  4. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging.

    PubMed

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-07-30

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now no universal software program has been available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
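
    The core per-pixel operation such a tool automates can be sketched as a log-linear least-squares fit of S = S0 * exp(-TE/T2) across a multi-echo series; this is a simplification of the curve fitting MRmap performs, with invented echo times:

      import numpy as np

      def t2_map(echoes, tes):
          # echoes: (n_te, ny, nx) magnitude images; tes: echo times in ms
          n_te, ny, nx = echoes.shape
          logs = np.log(np.clip(echoes, 1e-6, None)).reshape(n_te, -1)
          # linear model log S = log S0 - TE/T2, solved for all pixels at once
          A = np.stack([np.ones(n_te), -np.asarray(tes, float)], axis=1)
          coef, *_ = np.linalg.lstsq(A, logs, rcond=None)
          with np.errstate(divide="ignore"):
              return (1.0 / coef[1]).reshape(ny, nx)          # T2 map in ms

      tes = [10.0, 20.0, 40.0, 80.0]
      truth = 55.0                                            # simulated T2, ms
      sim = np.exp(-np.asarray(tes)[:, None, None] / truth) * np.ones((4, 2, 2))
      print(t2_map(sim, tes))                                 # ~55 everywhere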

  5. A New Browser-based, Ontology-driven Tool for Generating Standardized, Deep Descriptions of Geoscience Models

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.

    2016-12-01

    Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, and this is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF files (Resource Description Framework). This ontology is based on core concepts
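
    Recording a "deep description" as RDF triples might look like the following sketch using the rdflib library; the namespace and property names are invented stand-ins, not the actual CSDMS Standard Names ontology, and the variable name is only CSDMS-style:

      from rdflib import Graph, Literal, Namespace

      EX = Namespace("http://example.org/model-metadata#")    # invented namespace
      g = Graph()
      model = EX["MyHydroModel"]
      g.add((model, EX.timeSteppingScheme, Literal("explicit Euler")))
      g.add((model, EX.spatialGrid, Literal("uniform rectilinear")))
      g.add((model, EX.outputVariable,
             Literal("land_surface_water__runoff_volume_flux")))
      print(g.serialize(format="turtle"))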

  6. Creating User-Friendly Tools for Data Analysis and Visualization in K-12 Classrooms: A Fortran Dinosaur Meets Generation Y

    NASA Technical Reports Server (NTRS)

    Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.

    2008-01-01

    During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), briefly joined by a local high school student fulfilling a mentorship requirement. The Clouds group was tasked with exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, together with the corresponding satellite data. The project began in 1997, and its primary analysis tools were written in FORTRAN, a computer language none of the students were familiar with. While they persevered through computing challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer, the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google(tm) Earth. The result of their efforts so far is two Excel spreadsheets and a Google(tm) Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses comparing the ground cloud observation reports with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. This provides a statistical counterpart to the existing S'COOL data visualization tool
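
    The ground-versus-satellite comparison the spreadsheets automate reduces to agreement statistics over matched observations. A minimal sketch with invented cloud-cover categories:

      from collections import Counter

      # matched (ground report, satellite retrieval) cloud-cover categories
      pairs = [("broken", "broken"), ("overcast", "broken"),
               ("clear", "clear"), ("scattered", "broken")]

      agree = sum(g == s for g, s in pairs)
      print(f"exact agreement: {agree}/{len(pairs)} = {agree / len(pairs):.0%}")
      print(Counter(g for g, _ in pairs))     # ground-report category tallies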

  7. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run on a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
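
    The depth and allele-frequency filtering such a pipeline applies can be sketched as follows; the field names and thresholds are illustrative, not the tool's actual schema or defaults, and the records are invented:

      def filter_variants(calls, min_depth=30, min_vaf=0.15):
          kept = []
          for v in calls:
              vaf = v["alt_reads"] / v["depth"] if v["depth"] else 0.0
              if v["depth"] >= min_depth and vaf >= min_vaf:
                  kept.append({**v, "vaf": round(vaf, 3)})
          return kept

      calls = [  # invented records, not real coordinates
          {"pos": 1042, "ref": "C", "alt": "T", "depth": 512, "alt_reads": 240},
          {"pos": 2318, "ref": "G", "alt": "A", "depth": 18, "alt_reads": 9},
      ]
      print(filter_variants(calls))   # the shallow call is screened out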

  8. Effectiveness of Student-Generated Video as a Teaching Tool for an Instrumental Technique in the Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Jordan, Jeremy T.; Box, Melinda C.; Eguren, Kristen E.; Parker, Thomas A.; Saraldi-Gallardo, Victoria M.; Wolfe, Michael I.; Gallardo-Williams, Maria T.

    2016-01-01

    Multimedia instruction has been shown to serve as an effective learning aid for chemistry students. In this study, the viability of student-generated video instruction for organic chemistry laboratory techniques and procedure was examined, and its effectiveness was compared to that of instruction provided by a teaching assistant (TA). After…

  9. Next generation sequencing technology: a powerful tool for the genome characterization of sugarcane mosaic virus from Sorghum almum

    USDA-ARS?s Scientific Manuscript database

    Next generation sequencing (NGS) technology was used to analyze the occurrence of viruses in Sorghum almum plants in Florida exhibiting mosaic symptoms. Total RNA was extracted from symptomatic leaves and used as a template for cDNA library preparation. The resulting library was sequenced on an Illu...

  10. Using Tablets as Tools for Learner-Generated Drawings in the Context of Teaching the Kinetic Theory of Gases

    ERIC Educational Resources Information Center

    Lehtinen, A.; Viiri, J.

    2014-01-01

    Even though research suggests that the use of drawings could be an important part of learning science, learner-generated drawings have not received much attention in physics classrooms. This paper presents a method for recording students' drawings and group discussions using tablets. Compared to pen and paper, tablets offer unique benefits, which…

  13. The C3 Framework: A Powerful Tool for Preparing Future Generations for Informed and Engaged Civic Life

    ERIC Educational Resources Information Center

    Croddy, Marshall; Levine, Peter

    2014-01-01

    As the C3 Framework for the social studies rolls out, it is hoped that its influence will grow, offering a vision and guidance for the development of a new generation of state social studies standards that promote deeper student learning and the acquisition of essentials skills for college, career, and civic life. In the interim, it can be an…

  14. A Useful Laboratory Tool

    ERIC Educational Resources Information Center

    Johnson, Samuel A.; Tutt, Tye

    2008-01-01

    Recently, a high school Science Club generated a large number of questions involving temperature. Therefore, they decided to construct a thermal gradient apparatus in order to conduct a wide range of experiments beyond the standard "cookbook" labs. They felt that this apparatus could be especially useful in future ninth-grade biology classes, in…

  16. Restoring the ON Switch in Blind Retinas: Opto-mGluR6, a Next-Generation, Cell-Tailored Optogenetic Tool

    PubMed Central

    van Wyk, Michiel; Pielecka-Fortuna, Justyna; Löwel, Siegrid; Kleinlogel, Sonja

    2015-01-01

    Photoreceptor degeneration is one of the most prevalent causes of blindness. Despite photoreceptor loss, the inner retina and central visual pathways remain intact over an extended time period, which has led to creative optogenetic approaches to restore light sensitivity in the surviving inner retina. The major drawbacks of all optogenetic tools recently developed and tested in mouse models are their low light sensitivity and lack of physiological compatibility. Here we introduce a next-generation optogenetic tool, Opto-mGluR6, designed for retinal ON-bipolar cells, which overcomes these limitations. We show that Opto-mGluR6, a chimeric protein consisting of the intracellular domains of the ON-bipolar cell–specific metabotropic glutamate receptor mGluR6 and the light-sensing domains of melanopsin, reliably recovers vision at the retinal, cortical, and behavioral levels under moderate daylight illumination. PMID:25950461

  17. ESO Council Decides to Continue VLT Project at Paranal

    NASA Astrophysics Data System (ADS)

    1994-08-01

    The Council [1] of the European Southern Observatory met in extraordinary session at the ESO Headquarters in Garching near Munich on August 8 and 9, 1994. The main agenda items concerned the recent developments in ESO's relations with the host state, the Republic of Chile, as well as the status of the organisation's main project, the 16-metre-equivalent Very Large Telescope (VLT), which will become the world's largest optical telescope. Council had decided to hold this special meeting [2] because of various uncertainties that had arisen in connection with the implementation of the VLT Project at Cerro Paranal, approx. 130 kilometres south of Antofagasta, capital of the II Region in Chile. Following continued consultations at different levels within the ESO member states, and after careful consideration of all aspects of the current situation, including various supportive actions by the Chilean Government as well as the incessant attacks on this international organisation from certain sides reported in the media of that country, Council took the important decision to continue the construction of the VLT Observatory at Paranal, while at the same time requesting the ESO Management to pursue the ongoing studies of alternative solutions. THE COUNCIL DECISIONS: In particular, the ESO Council took note of recent positive developments which have occurred since the May 1994 round of discussions with the Chilean authorities in Santiago. The confirmation of ESO's immunities as an International Organization in Chile, contained in a number of important statements and documents, is considered a significant step by the Chilean Government to ensure ESO's unhindered erection and later operation of the VLT on Paranal. Under these circumstances, and in order to maintain progress on the VLT project, the ESO Council authorized the ESO Management to continue the on-site work at Paranal. Council also took note of the desire expressed by the Chilean Government

  18. A comprehensive tool for image-based generation of fetus and pregnant women mesh models for numerical dosimetry studies

    NASA Astrophysics Data System (ADS)

    Dahdouh, S.; Varsier, N.; Serrurier, A.; De la Plata, J.-P.; Anquez, J.; Angelini, E. D.; Wiart, J.; Bloch, I.

    2014-08-01

    Fetal dosimetry studies require the development of accurate numerical 3D models of the pregnant woman and the fetus. This paper proposes a 3D articulated fetal growth model covering the main phases of pregnancy and a pregnant woman model combining the utero-fetal structures and a deformable non-pregnant woman body envelope. The structures of interest were automatically or semi-automatically (depending on the stage of pregnancy) segmented from a database of images and surface meshes were generated. By interpolating linearly between fetal structures, each one can be generated at any age and in any position. A method is also described to insert the utero-fetal structures in the maternal body. A validation of the fetal models is proposed, comparing a set of biometric measurements to medical reference charts. The usability of the pregnant woman model in dosimetry studies is also investigated, with respect to the influence of the abdominal fat layer.
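
    The linear interpolation described above amounts to a weighted blend of corresponding mesh vertices at two bracketing gestational ages, as in this sketch (the coordinates and ages are invented):

      import numpy as np

      def interpolate_mesh(verts_a, age_a, verts_b, age_b, age):
          # verts_*: (n, 3) vertex arrays with vertex-to-vertex correspondence
          t = (age - age_a) / (age_b - age_a)
          return (1.0 - t) * np.asarray(verts_a) + t * np.asarray(verts_b)

      v20 = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])    # 20-week structure
      v30 = np.array([[0.0, 1.0, 0.0], [14.0, 0.0, 0.0]])    # 30-week structure
      print(interpolate_mesh(v20, 20.0, v30, 30.0, 25.0))    # 25-week blend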

  19. Risk stratification in acute heart failure: rationale and design of the STRATIFY and DECIDE studies.

    PubMed

    Collins, Sean P; Lindsell, Christopher J; Jenkins, Cathy A; Harrell, Frank E; Fermann, Gregory J; Miller, Karen F; Roll, Sue N; Sperling, Matthew I; Maron, David J; Naftilan, Allen J; McPherson, John A; Weintraub, Neal L; Sawyer, Douglas B; Storrow, Alan B

    2012-12-01

    A critical challenge for physicians facing patients presenting with signs and symptoms of acute heart failure (AHF) is how and where to best manage them. Currently, most patients evaluated for AHF are admitted to the hospital, yet not all warrant inpatient care. Up to 50% of admissions could potentially be avoided, and many admitted patients could be discharged after a short period of observation and treatment. Methods for identifying patients who can be sent home early are lacking. Improving the physician's ability to identify and safely manage low-risk patients is essential to avoiding unnecessary use of hospital beds. Two studies (STRATIFY and DECIDE) have been funded by the National Heart, Lung, and Blood Institute with the goal of developing prediction rules to facilitate early decision making in AHF. Using prospectively gathered evaluation and treatment data from the acute setting (STRATIFY) and early inpatient stay (DECIDE), rules will be generated to predict risk for death and serious complications. Subsequent studies will be designed to test the external validity, utility, generalizability and cost-effectiveness of these prediction rules in different acute care environments representing racially and socioeconomically diverse patient populations. A major innovation is the prediction of 5-day as well as 30-day outcomes, overcoming the limitation that 30-day outcomes are highly dependent on unpredictable, post-visit patient and provider behavior. A novel aspect of the proposed project is the use of a comprehensive cardiology review to correctly assign post-treatment outcomes to the acute presentation. Finally, a rigorous analysis plan has been developed to construct the prediction rules that will maximally extract both the statistical and clinical properties of every data element. Upon completion of this study, we will externally test the prediction rules in a heterogeneous patient cohort. Copyright © 2012 Mosby, Inc. All rights reserved.

  20. electronic Ligand Builder and Optimization Workbench (eLBOW): a tool for ligand coordinate and restraint generation

    PubMed Central

    Moriarty, Nigel W.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.

    2009-01-01

    The electronic Ligand Builder and Optimization Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed to be a flexible procedure that uses simple and fast quantum-chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow the attainment of a number of diverse goals including geometry optimization and generation of restraints. PMID:19770504

  1. electronic Ligand Builder and Optimisation Workbench (eLBOW): A tool for ligand coordinate and restraint generation

    SciTech Connect

    Moriarty, Nigel; Grosse-Kunstleve, Ralf; Adams, Paul

    2009-07-01

    The electronic Ligand Builder and Optimisation Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed to be a flexible procedure that uses simple and fast quantum-chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow for the attainment of a number of diverse goals including geometry optimisation and generation of restraints.

  2. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    PubMed

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

    New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
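
    What "benchmarking data with ground truth" means can be shown with a toy far simpler than ViSAPy's biophysical forward model: superimpose known spike waveforms at known times on noise, hand the trace to a sorter, and keep the times for scoring. All parameters below are invented:

      import numpy as np

      rng = np.random.default_rng(0)
      fs, dur = 30000, 1.0                         # 30 kHz sampling, 1 s
      trace = rng.normal(0.0, 5.0, int(fs * dur))  # background noise, uV
      waveform = -40.0 * np.exp(-np.arange(30) / 8.0)   # crude spike shape, uV

      truth = np.sort(rng.choice(len(trace) - 30, size=50, replace=False))
      for t in truth:
          trace[t:t + 30] += waveform              # insert spikes at known times

      np.save("trace.npy", trace)                  # hand this to the sorter...
      np.save("truth.npy", truth)                  # ...and score against this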

  3. Functional connectivity associated with hand shape generation: Imitating novel hand postures and pantomiming tool grips challenge different nodes of a shared neural network.

    PubMed

    Vingerhoets, Guy; Clauwaert, Amanda

    2015-09-01

    Clinical research suggests that imitating meaningless hand postures and pantomiming tool-related hand shapes rely on different neuroanatomical substrates. We investigated the BOLD responses to different tasks of hand posture generation in 14 right-handed volunteers. Conjunction and contrast analyses were applied to select regions that were either common or sensitive to imitation and/or pantomime tasks. The selection included bilateral areas of medial and lateral extrastriate cortex, superior and inferior regions of the lateral and medial parietal lobe, primary motor and somatosensory cortex, and left dorsolateral prefrontal and ventral and dorsal premotor cortices. Functional connectivity analysis revealed that during hand shape generation the BOLD response of every region correlated significantly with that of every other area regardless of the hand posture task performed, although some regions were more involved in some hand posture tasks than others. Based on between-task differences in functional connectivity, we predict that imitation of novel hand postures would suffer most from left superior parietal disruption and that pantomiming hand postures for tools would be impaired following left frontal damage, whereas both tasks would be sensitive to inferior parietal dysfunction. We also found that posterior temporal cortex is committed to pantomiming tool grips, but that the involvement of this region in the execution of hand postures in general appears limited. We conclude that the generation of hand postures is subserved by a highly interconnected task-general neural network. Depending on task requirements, some nodes/connections will be more engaged than others, and these task-sensitive findings are in general agreement with recent lesion studies.

  4. Wikis, blogs and podcasts: a new generation of Web-based tools for virtual collaborative clinical practice and education

    PubMed Central

    Boulos, Maged N Kamel; Maramba, Inocencio; Wheeler, Steve

    2006-01-01

    Background We have witnessed a rapid increase in the use of Web-based 'collaborationware' in recent years. These Web 2.0 applications, particularly wikis, blogs and podcasts, have been increasingly adopted by many online health-related professional and educational services. Because of their ease of use and rapidity of deployment, they offer the opportunity for powerful information sharing and ease of collaboration. Wikis are Web sites that can be edited by anyone who has access to them. The word 'blog' is a contraction of 'Web Log' – an online Web journal that can offer a resource-rich multimedia environment. Podcasts are repositories of audio and video materials that can be "pushed" to subscribers, even without user intervention. These audio and video files can be downloaded to portable media players that can be taken anywhere, providing the potential for "anytime, anywhere" learning experiences (mobile learning). Discussion Wikis, blogs and podcasts are all relatively easy to use, which partly accounts for their proliferation. The fact that there are many free and Open Source versions of these tools may also be responsible for their explosive growth. Thus it would be relatively easy to implement any or all within a Health Professions' Educational Environment. Paradoxically, some of their disadvantages also relate to their openness and ease of use. With virtually anybody able to alter, edit or otherwise contribute to the collaborative Web pages, it can be problematic to gauge the reliability and accuracy of such resources. While, arguably, the very process of collaboration leads to a Darwinian-type 'survival of the fittest' content within a Web page, the veracity of these resources can be assured through careful monitoring, moderation, and operation of the collaborationware in a closed and secure digital environment. Empirical research is still needed to build our pedagogic evidence base about the different aspects of these tools in the context of medical

  5. Wikis, blogs and podcasts: a new generation of Web-based tools for virtual collaborative clinical practice and education.

    PubMed

    Boulos, Maged N Kamel; Maramba, Inocencio; Wheeler, Steve

    2006-08-15

    We have witnessed a rapid increase in the use of Web-based 'collaborationware' in recent years. These Web 2.0 applications, particularly wikis, blogs and podcasts, have been increasingly adopted by many online health-related professional and educational services. Because of their ease of use and rapidity of deployment, they offer the opportunity for powerful information sharing and ease of collaboration. Wikis are Web sites that can be edited by anyone who has access to them. The word 'blog' is a contraction of 'Web Log' - an online Web journal that can offer a resource-rich multimedia environment. Podcasts are repositories of audio and video materials that can be "pushed" to subscribers, even without user intervention. These audio and video files can be downloaded to portable media players that can be taken anywhere, providing the potential for "anytime, anywhere" learning experiences (mobile learning). Wikis, blogs and podcasts are all relatively easy to use, which partly accounts for their proliferation. The fact that there are many free and Open Source versions of these tools may also be responsible for their explosive growth. Thus it would be relatively easy to implement any or all within a Health Professions' Educational Environment. Paradoxically, some of their disadvantages also relate to their openness and ease of use. With virtually anybody able to alter, edit or otherwise contribute to the collaborative Web pages, it can be problematic to gauge the reliability and accuracy of such resources. While, arguably, the very process of collaboration leads to a Darwinian-type 'survival of the fittest' content within a Web page, the veracity of these resources can be assured through careful monitoring, moderation, and operation of the collaborationware in a closed and secure digital environment. Empirical research is still needed to build our pedagogic evidence base about the different aspects of these tools in the context of medical/health education. If

  6. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    SciTech Connect

    Cormier, Dallas; Edra, Sherwin; Espinoza, Michael; Daye, Tony; Kostylev, Vladimir; Pavlovski, Alexandre; Jelen, Deborah

    2014-12-29

    This project enabled utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. The toolset focused on real-time simulation of distributed power generation within utility grids, with emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated distribution management system (DMS). This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers; it extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for energy marketing in day-ahead generation commitments, real-time operations, load forecasting (at an aggregate system level for day ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  7. Using a Resource Effect Study Pre-Pilot to Inform a Large Randomized Trial: The Decide2Quit.Org Web-Assisted Tobacco Intervention

    PubMed Central

    Sadasivam, Rajani S.; Allison, Jeroan J; Ray, Midge N.; Ford, Daniel E; Houston, Thomas K.

    2012-01-01

    Resource effect studies can be useful in highlighting areas of improvement in informatics tools. Before a large randomized trial, we tested the functions of the Decide2Quit.org Web-assisted tobacco intervention using smokers (N=204) recruited via Google advertisements. These smokers were given access to Decide2Quit.org for six months, and we tracked their usage and assessed their six-month cessation using a rigorous follow-up. Multiple interesting findings were identified: we found that tailored emails dramatically increased participation for a short period. We also found varied effects of the different functions. Functions supporting "seeking social support" (Your Online Community and Family Tools), Healthcare Provider Tools, and the Library had positive effects on quit outcomes. One surprising finding, which needs further investigation, was that writing to our Tobacco Treatment Specialists was negatively associated with quit outcomes. PMID:23304353

  8. The GEISA system in 1996: towards an operational tool for the second generation vertical sounders radiance simulation.

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, N.; Scott, N. A.; Chedin, A.; Bonnet, B.; Barbe, A.; Tyuterev, V. G.; Champion, J. P.; Winnewisser, M.; Brown, L. R.; Gamache, R.; Golovko, V. F.; Chursin, A. A.

    1998-05-01

    Since its creation in 1974, the GEISA (Gestion et Etude des Informations Spectroscopiques Atmospheriques: Management and Study of Atmospheric Spectroscopic Information) database system (more than 730,000 entries between 0 and 22,656 cm-1, corresponding to 40 molecules and 86 isotopic species in its 1992 edition) and its associated software have been widely used for forward atmospheric radiative transfer modelling, with maximum reliability, tractability and efficiency. For the upcoming high-spectral-resolution sounders such as IASI (Infrared Atmospheric Sounding Interferometer) and AIRS (Atmospheric InfraRed Sounder), more complete and accurate laboratory measurements of the spectroscopic parameters presently included in the database are required, and more sophisticated theoretical radiative transfer modelling should be developed. Consequently, it is intended to elaborate the GEISA database into an interactive tool, named GEISA/IASI, designed to provide spectroscopic information tailored to IASI sounding radiative transfer modelling.

  9. InfiniCharges: A tool for generating partial charges via the simultaneous fit of multiframe electrostatic potential (ESP) and total dipole fluctuations (TDF)

    NASA Astrophysics Data System (ADS)

    Sant, Marco; Gabrieli, Andrea; Demontis, Pierfranco; Suffritti, Giuseppe B.

    2016-03-01

    The InfiniCharges computer program, for generating reliable partial charges for molecular simulations in periodic systems, is presented here. This tool is an efficient implementation of the recently developed DM-REPEAT method, in which the stability of the resulting charges over a large set of fitting regions is obtained through the simultaneous fit of multiple electrostatic potential (ESP) configurations together with the total dipole fluctuations (TDF). Besides DM-REPEAT, the program can also perform the standard REPEAT fit and its multiframe extension (M-REPEAT), with the possibility of restraining the charges to arbitrary values. Finally, the code is employed to generate partial charges for ZIF-90, a microporous material of the metal-organic framework (MOF) family, and an extensive analysis of the results is carried out.
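
    For context, the ESP part of a REPEAT-style fit is a constrained least-squares problem: choose atomic charges that reproduce the electrostatic potential on a set of grid points while summing to the total charge. The sketch below shows that core in its simplest non-periodic form (atomic units, invented names); DM-REPEAT itself additionally fits multiple frames and the total dipole fluctuations and handles periodicity, all omitted here.

        import numpy as np

        def fit_esp_charges(atom_xyz, grid_xyz, grid_esp, total_charge=0.0):
            """Least-squares charges q minimizing ||A q - V||^2 s.t. sum(q) = Q."""
            diff = grid_xyz[:, None, :] - atom_xyz[None, :, :]
            A = 1.0 / np.linalg.norm(diff, axis=2)   # A[i, j] = 1 / |r_i - R_j|
            n = atom_xyz.shape[0]
            kkt = np.zeros((n + 1, n + 1))           # KKT system with one
            kkt[:n, :n] = 2.0 * A.T @ A              # Lagrange multiplier for
            kkt[:n, n] = kkt[n, :n] = 1.0            # the total-charge constraint
            rhs = np.concatenate([2.0 * A.T @ grid_esp, [total_charge]])
            return np.linalg.solve(kkt, rhs)[:n]

        # Hypothetical diatomic and ESP samples (all values invented)
        atoms = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]])
        grid = np.array([[3.0, 0.0, 0.0], [0.0, 3.0, 2.0],
                         [0.0, 0.0, -3.0], [3.0, 0.0, 2.0]])
        esp = np.array([0.05, 0.05, 0.07, 0.05])
        print(fit_esp_charges(atoms, grid, esp))     # two charges summing to 0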

  10. Appendix 2. Guide for Running AgMIP Climate Scenario Generation Tools with R in Windows, Version 2.3

    NASA Technical Reports Server (NTRS)

    Hudson, Nicholas; Ruane, Alexander Clark

    2013-01-01

    This Guide explains how to create climate series and climate change scenarios using the AgMIP Climate team's methodology, as outlined in the AgMIP Guide for Regional Assessment: Handbook of Methods and Procedures. It details how to install R and the required packages to run the AgMIP Climate Scenario Generation scripts, and how to create climate scenarios from CMIP5 GCMs using a 30-year baseline daily weather dataset. The Guide also outlines a workflow that can be modified for application to your own climate data.
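
    At its core, the scenario-generation step is a delta method: monthly mean changes taken from a CMIP5 GCM are imposed on the 30-year daily baseline. The Python sketch below illustrates the idea only; the actual AgMIP tools are R scripts, they also offer variability ("stretched") scenarios, and all names and numbers here are invented.

        import numpy as np

        def delta_scenario(tmax, precip, month_idx, dT, pr_ratio):
            """Impose monthly GCM changes on a daily baseline (delta method).

            tmax, precip : daily baseline series (deg C, mm)
            month_idx    : month (1-12) of each day
            dT           : 12 monthly temperature changes (deg C)
            pr_ratio     : 12 monthly precipitation ratios (future/baseline)
            """
            m = np.asarray(month_idx) - 1
            return tmax + np.take(dT, m), precip * np.take(pr_ratio, m)

        # Toy one-week January slice of a baseline with invented GCM deltas
        t_fut, p_fut = delta_scenario(
            np.array([30.1, 29.5, 31.0, 28.9, 30.4, 29.8, 31.2]),
            np.array([0.0, 4.2, 0.0, 12.5, 0.0, 0.0, 2.1]),
            np.ones(7, dtype=int),
            dT=np.full(12, 1.8), pr_ratio=np.full(12, 0.95))
        print(t_fut.round(1), p_fut.round(2))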

  11. Radical generating coordination complexes as tools for rapid and effective fragmentation and fluorescent labeling of nucleic acids for microchip hybridization.

    SciTech Connect

    Kelly, J. J.; Chernov, B. N.; Mirzabekov, A. D.; Bavykin, S. G.; Biochip Technology Center; Northwestern Univ.; Engelhardt Inst. of Molecular Biology

    2002-01-01

    DNA microchip technology is a rapid, high-throughput method for nucleic acid hybridization reactions. This technology requires random fragmentation and fluorescent labeling of target nucleic acids prior to hybridization. Radical-generating coordination complexes, such as 1,10-phenanthroline-Cu(II) (OP-Cu) and Fe(II)-EDTA (Fe-EDTA), have been commonly used as sequence-nonspecific 'chemical nucleases' to introduce single-strand breaks in nucleic acids. Here we describe a new method based on these radical-generating complexes for random fragmentation and labeling of both single- and double-stranded forms of RNA and DNA. Nucleic acids labeled with the OP-Cu and the Fe-EDTA protocols revealed high specificity in hybridization with DNA microchips containing oligonucleotide probes selected for identification of 16S rRNA sequences of the Bacillus group microorganisms. We also demonstrated cDNA and cRNA labeling and fragmentation with this method. Both the OP-Cu and Fe-EDTA fragmentation and labeling procedures are quick and inexpensive compared to other commonly used methods. A column-based version of the described method does not require centrifugation and is therefore promising for the automation of sample preparation in DNA microchip technology as well as in other nucleic acid hybridization studies.

  12. First-generation versus third-generation comprehensive geriatric assessment instruments in the acute hospital setting: a comparison of the Minimum Geriatric Screening Tools (MGST) and the interRAI Acute Care (interRAI AC).

    PubMed

    Wellens, N I H; Deschodt, M; Flamaing, J; Moons, P; Boonen, S; Boman, X; Gosset, C; Petermans, J; Milisen, K

    2011-08-01

    Comparison of the first-generation Minimum Geriatric Screening Tools (MGST) and the third-generation interRAI Acute Care (interRAI AC). Based on a qualitative multiphase exchange of expert opinion, published evidence was critically analyzed and translated into a consensus. Both methods are intended for multi-domain geriatric assessment in acute hospital settings, but each has a different scope and goal. The MGST is a collection of single-domain, internationally validated instruments. Assessment is usually triggered by caregivers' clinical impression based on geriatric expertise. A limited selection of domains is usually assessed only once, by disciplines with domain-specific expertise. Clinical use results in improved screening of geriatric problems. The interRAI AC, tailored for acute settings, is intended to screen a large number of geriatric domains. Based on systematic observational data, risk domains are triggered and clinical guidelines are suggested. Multiple observation periods outline the evolution of the patient's functioning over the stay in comparison to the premorbid situation. The method is appropriate for application on geriatric and non-geriatric wards, filling geriatric knowledge gaps. The interRAI Suite contains a common set of standardized items across settings, facilitating data transfer in transitional care. The third-generation interRAI AC has advantages over the first-generation MGST. A cascade system is proposed to integrate both complementary methods in practice: the systematic interRAI AC assessment detects risk domains, and clinical protocols then suggest components of the MGST as additional assessment. This cascade approach unites the strength of the exhaustive interRAI AC assessment with the domain-specific tools of the MGST.

  13. Evolution of the design methodologies for the next generation of RPV: Extensive role of the thermal-hydraulics numerical tools

    SciTech Connect

    Goreaud, Nicolas; Nicaise, Norbert; Stoudt, Roger

    2004-07-01

    The thermal-hydraulic design of the first PWRs was mainly based on an experimental approach, with a large series of tests on the main equipment (control rod guide tubes, RPV plenums, etc.) to check its performance. Development of CFD codes and computers now allows for complex simulations of hydraulic phenomena. Provided adequate qualification, these numerical tools are efficient means to determine hydraulics in a given design and to perform sensitivities for optimization of new designs. Experiments always play their role, first for qualification, and then for validation at the last stage of the design. The design of the European Pressurized water Reactor (EPR) is based on both hydraulic calculations and experiments, handled in a complementary approach. This paper describes the effort launched by Framatome-ANP on hydraulic calculations for the Reactor Pressure Vessel (RPV) of the EPR reactor. It concerns 3D calculations of the RPV inlet including the cold legs, the RPV downcomer and lower plenum, and the RPV upper plenum up to and including the hot legs. It covers normal operating conditions, but also accidental conditions such as PTS (pressurized thermal shock) in a small-break loss-of-coolant accident (SB-LOCA). These hydraulic studies have provided much useful information for the mechanical design of the RPV internals. (authors)

  14. COV2HTML: A Visualization and Analysis Tool of Bacterial Next Generation Sequencing (NGS) Data for Postgenomics Life Scientists

    PubMed Central

    Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-01-01

    COV2HTML is an interactive web interface, addressed to biologists, that allows both coverage visualization and analysis of NGS alignments performed on prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts the huge NGS mapping or coverage files into light, specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first data set was from an RNA-seq analysis of Campylobacter jejuni, based on a comparison of two conditions with two replicates. We were able to recover 26 out of 27 genes highlighted in the publication using COV2HTML. The second data set comprised stranded TSS and RNA-seq data on the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and allows biologists to visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is that it makes NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/. This website is free and open to users without any login requirement. PMID:24512253
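
    The first of the two processes, reducing huge per-base coverage files to light per-element files, amounts to summarizing coverage over annotated genetic elements. The sketch below is a toy illustration of that reduction (gene names and coordinates are invented), not COV2HTML's actual code.

        import numpy as np

        def reduce_coverage(per_base_cov, genes):
            """Collapse a per-base coverage vector to mean depth per element.

            genes: dict mapping element name -> (start, end), 0-based half-open.
            """
            return {name: float(per_base_cov[start:end].mean())
                    for name, (start, end) in genes.items()}

        cov = np.random.poisson(lam=40, size=10_000)           # mock mapping depth
        genes = {"geneA": (100, 1300), "geneB": (2500, 3100)}  # hypothetical loci
        print(reduce_coverage(cov, genes))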

  15. Evolution of Design Methodologies for Next Generation of Reactor Pressure Vessels and Extensive Role of Thermal-Hydraulic Numerical Tools

    SciTech Connect

    Bellet, Serge; Goreaud, Nicolas; Nicaise, Norbert

    2005-11-15

    The thermal-hydraulic design of the first pressurized water reactors was mainly based on an experimental approach, with a large series of tests on the main equipment [control rod guide tubes, reactor pressure vessel (RPV) plenums, etc.] to check performance. Development of computational fluid dynamics codes and computers now allows for complex simulations of hydraulics phenomena. Provided adequate qualification, these numerical tools are an efficient means to determine hydraulics in the given design and to perform sensitivities for optimization of new designs. Experiments always play their role, first for qualification and then for validation at the last stage of the design. The design of the European Pressurized Water Reactor (EPR), jointly developed by Framatome ANP, Electricite de France (EDF), and the German utilities, is based on both hydraulics calculations and experiments handled in a complementary approach. This paper describes the collective effort launched by Framatome ANP and EDF on hydraulics calculations for the RPV of the EPR. It concerns three-dimensional calculations of RPV inlets, including the cold legs, the RPV downcomer and lower plenum, and the RPV upper plenum up to and including the hot legs. It covers normal operating conditions but also accidental conditions such as pressurized thermal shock in a small-break loss-of-coolant accident. Those hydraulics studies have provided much useful information for the mechanical design of RPV internals.

  16. COV2HTML: a visualization and analysis tool of bacterial next generation sequencing (NGS) data for postgenomics life scientists.

    PubMed

    Monot, Marc; Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-03-01

    COV2HTML is an interactive web interface, addressed to biologists, that allows both coverage visualization and analysis of NGS alignments performed on prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts the huge NGS mapping or coverage files into light, specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first data set was from an RNA-seq analysis of Campylobacter jejuni, based on a comparison of two conditions with two replicates. We were able to recover 26 out of 27 genes highlighted in the publication using COV2HTML. The second data set comprised stranded TSS and RNA-seq data on the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and allows biologists to visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is that it makes NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/. This website is free and open to users without any login requirement.

  17. The Michigan Healthy School Action Tools process generates improvements in school nutrition policies and practices, and student dietary intake.

    PubMed

    Alaimo, Katherine; Oleksyk, Shannon; Golzynski, Diane; Drzal, Nick; Lucarelli, Jennifer; Reznar, Melissa; Wen, Yalu; Krabill Yoder, Karen

    2015-05-01

    The Michigan Healthy School Action Tools (HSAT) is an online self-assessment and action planning process for schools seeking to improve their health policies and practices. The School Nutrition Advances Kids study, a 2-year quasi-experimental intervention with low-income middle schools, evaluated whether completing the HSAT with facilitator assistance and small grant funding resulted in (1) improvements in school nutrition practices and policies and (2) improvements in student dietary intake. A total of 65 low-income Michigan middle schools participated in the study. The Block Youth Food Frequency Questionnaire was completed by 1,176 seventh-grade students at baseline and in eighth grade (during the intervention). Schools reported nutrition-related policies and practices/education using the School Environment and Policy Survey. Schools completing the HSAT were compared to schools that did not complete the HSAT with regard to the number of policy and practice changes and student dietary intake. Schools that completed the HSAT made significantly more nutrition practice/education changes than schools that did not complete the HSAT, and students in those schools made dietary improvements in fruit, fiber, and cholesterol intake. The Michigan HSAT process is an effective strategy to initiate improvements in nutrition policies and practices within schools and to improve student dietary intake.

  18. Developmental dysplasia of the hip: usefulness of next generation genomic tools for characterizing the underlying genes - a mini review.

    PubMed

    Basit, S; Hannan, M A; Khoshhal, K I

    2016-07-01

    Developmental dysplasia of the hip (DDH) is one of the most common skeletal anomalies. DDH encompasses a spectrum of the disorder, ranging from minor acetabular dysplasia to irreducible dislocation, which may lead to premature arthritis in later life. Involvement of genetic factors underlying DDH became evident when several studies reported chromosomal loci linked to DDH in families with multiple affected individuals. Moreover, using association studies, variants in genes involved in chondrogenesis and joint formation have been shown to be associated with DDH. At least one study identified a pathogenic variant in a chemokine receptor gene in DDH. No genetic analysis has been reported or carried out in DDH patients from the Middle East. Here, we review the literature related to the genetics of DDH and emphasize the usefulness of next-generation technologies in identifying genetic variants underlying DDH in consanguineous families. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. Generation of Nanobodies against SlyD and development of tools to eliminate this bacterial contaminant from recombinant proteins.

    PubMed

    Hu, Yaozhong; Romão, Ema; Vertommen, Didier; Vincke, Cécile; Morales-Yánez, Francisco; Gutiérrez, Carlos; Liu, Changxiao; Muyldermans, Serge

    2017-09-01

    The gene for a protein domain derived from a tumor marker, fused to His-tag codons and under the control of a T7 promoter, was expressed in E. coli strain BL21 (DE3). The recombinant protein was purified from cell lysates through immobilized metal affinity chromatography and size-exclusion chromatography. A contaminating bacterial protein was consistently co-purified, even using stringent washing solutions containing 50 or 100 mM imidazole. Immunization of a dromedary with this contaminated protein preparation and the subsequent generation and panning of the immune Nanobody library yielded several Nanobodies, two-thirds of which were directed against the bacterial contaminant, reflecting the immunodominance of this protein in steering the dromedary immune response. Affinity adsorption of this contaminant using one of our specific Nanobodies, followed by mass spectrometry, identified the bacterial contaminant as the FKBP-type peptidyl-prolyl cis-trans isomerase (SlyD) from E. coli. This SlyD protein contains in its C-terminal region 14 histidines in a stretch of 31 amino acids, which explains its co-purification on Ni-NTA resin. This protein is most likely present, to varying extents, in all recombinant protein preparations after immobilized metal affinity chromatography. Using our SlyD-specific Nb 5, we generated an immune complex that could be removed either by immunocapturing or by size-exclusion chromatography. Both methods allowed us to prepare a recombinant protein sample from which the SlyD contaminant was quantitatively eliminated. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Instrument for scoring clinical outcome of research for epidermolysis bullosa: a consensus-generated clinical research tool.

    PubMed

    Schwieger-Briel, Agnes; Chakkittakandiyil, Ajith; Lara-Corrales, Irene; Aujla, Nimrita; Lane, Alfred T; Lucky, Anne W; Bruckner, Anna L; Pope, Elena

    2015-01-01

    Epidermolysis bullosa (EB) is a genetic condition characterized by skin fragility and blistering. No instrument is available for clinical outcome research measurements. Our aim was to develop a comprehensive instrument that is easy to use in the context of interventional studies. Item collection was accomplished using a two-step Delphi Internet survey process for practitioners and qualitative content analysis of patient and family interviews. Items were reduced based on frequency and importance using a 4-point Likert scale and were subject to consensus (>80% agreement) using the nominal group technique. Pilot data testing was performed in 21 consecutive patients attending an EB clinic. The final score, the Instrument for Scoring Clinical Outcome of Research for Epidermolysis Bullosa (iscorEB), is a combined score that contains clinician items grouped in five domains (skin, mucosa, organ involvement, laboratory abnormalities, and complications and procedures; maximum score 114) and patient-derived items (pain, itch, functional limitations, sleep, mood, and effect on daily and leisure activities; maximum score 120). Pilot testing revealed that the combined score and subscores were able to differentiate between EB subtypes and degrees of clinical severity (EB simplex 21.7 ± 16.5, junctional EB 28.0 ± 20.7, dystrophic EB 57.3 ± 24.6, p = 0.007; mild 17.3 ± 9.6, moderate 41.0 ± 19.4, and severe 64.5 ± 22.6, p < 0.001). There was high correlation between clinician and patient subscores (correlation coefficient = 0.79, p < 0.001). iscorEB appears to be a sensitive tool in differentiating between EB types and across the clinical spectrum of severity. Further validation studies are needed. © 2014 Wiley Periodicals, Inc.

  1. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic-wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the automatic discrimination between local earthquakes occurring within the network and regional/teleseismic events occurring outside the network is performed. Finally, for the largest events, if a consistent number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database, and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis, and it may represent a relevant tool not only for seismologists, but also for non
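
    Two of the derived quantities mentioned above follow from standard relations, sketched below under stated assumptions: the IASPEI-standard moment magnitude from the seismic moment, and the Brune (1970) source radius from the corner frequency. The near-source shear-wave speed is an assumed value, and the abstract does not specify which source model the software itself uses.

        import numpy as np

        def moment_magnitude(M0_Nm):
            """IASPEI-standard Mw from seismic moment M0 in N*m."""
            return (2.0 / 3.0) * (np.log10(M0_Nm) - 9.1)

        def brune_source_radius(fc_hz, beta_ms=3500.0):
            """Brune (1970) source radius (m) from corner frequency (Hz),
            assuming a near-source shear-wave speed beta in m/s."""
            return 2.34 * beta_ms / (2.0 * np.pi * fc_hz)

        print(round(moment_magnitude(1.1e17), 1))   # ~5.3
        print(round(brune_source_radius(2.0)))      # ~652 m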

  2. Development and evaluation of the DECIDE to move! Physical activity educational video.

    PubMed

    Majid, Haseeb M; Schumann, Kristina P; Doswell, Angela; Sutherland, June; Hill Golden, Sherita; Stewart, Kerry J; Hill-Briggs, Felicia

    2012-01-01

    To develop a video that provides accessible and usable information about the importance of physical activity to type 2 diabetes self-management and ways of incorporating physical activity into everyday life. A 15-minute physical activity educational video narrated by US Surgeon General Dr Regina Benjamin was developed and evaluated. The video addresses the following topics: the effects of exercise on diabetes, preparations for beginning physical activity, types of physical activity, safety considerations (eg, awareness of symptoms of hypoglycemia during activity), and goal setting. Two patient screening groups were held for evaluation and revision of the video. Patient satisfaction ratings ranged from 4.6 to 4.9 out of a possible 5.0 on dimensions of overall satisfaction, how informative they found the video to be, how well the video held their interest and attention, how easy the video was to understand, and how easy the video was to see and hear. Patients reported that the educational video was effective in empowering them to take strides toward increasing and maintaining physical activity in their lives. The tool is currently used in a clinical research trial, Project DECIDE, as one component of a diabetes and cardiovascular disease self-management program.

  3. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest after a fact-finding proceeding. 890.1041 Section 890.1041 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a...

  4. 5 CFR 890.1036 - Information considered in deciding a contest.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... Imposed Against Health Care Providers Suspension § 890.1036 Information considered in deciding a...

  5. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding whether to propose a permissive debarment. 890.1013 Section 890.1013 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose...

  6. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest after a fact-finding proceeding. 890.1029 Section 890.1029 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a...

  7. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Deciding a contest without additional fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Sanctions Imposed Against Health Care Providers Suspension § 890.1038 Deciding a contest without...

  8. 13 CFR 126.802 - Who decides a HUBZone status protest?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Who decides a HUBZone status protest? 126.802 Section 126.802 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION HUBZONE PROGRAM Protests § 126.802 Who decides a HUBZone status protest? The D/HUB or designee will determine...

  9. 13 CFR 126.802 - Who decides a HUBZone status protest?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who decides a HUBZone status protest? 126.802 Section 126.802 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION HUBZONE PROGRAM Protests § 126.802 Who decides a HUBZone status protest? The D/HUB or designee will determine...

  10. 13 CFR 126.802 - Who decides a HUBZone status protest?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Who decides a HUBZone status protest? 126.802 Section 126.802 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION HUBZONE PROGRAM Protests § 126.802 Who decides a HUBZone status protest? The D/HUB or designee will determine...

  11. 13 CFR 126.802 - Who decides a HUBZone status protest?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Who decides a HUBZone status protest? 126.802 Section 126.802 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION HUBZONE PROGRAM Protests § 126.802 Who decides a HUBZone status protest? The D/HUB or designee will determine...

  12. 13 CFR 126.802 - Who decides a HUBZone status protest?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Who decides a HUBZone status protest? 126.802 Section 126.802 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION HUBZONE PROGRAM Protests § 126.802 Who decides a HUBZone status protest? The D/HUB or designee will determine...

  13. Decided and Undecided Students: Career Self-Efficacy, Negative Thinking, and Decision-Making Difficulties

    ERIC Educational Resources Information Center

    Bullock-Yowell, Emily; McConnell, Amy E.; Schedin, Emily A.

    2014-01-01

    The career concern differences between undecided and decided college students (N = 223) are examined. Undecided college students (n = 83) reported lower career decision-making self-efficacy, higher incidences of negative career thoughts, and more career decision-making difficulties than their decided peers (n = 143). Results reveal that undecided…

  14. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881 Deciding whether someone is your parent or stepparent. (a) We consider your parent to be— (1) Your...

  15. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881 Deciding whether someone is your parent or stepparent. (a) We consider your parent to be— (1) Your...

  16. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881 Deciding whether someone is your parent or stepparent. (a) We consider your parent to be— (1) Your...

  17. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881 Deciding whether someone is your parent or stepparent. (a) We consider your parent to be— (1) Your...

  18. 20 CFR 404.708 - How we decide what is enough evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false How we decide what is enough evidence. 404.708 Section 404.708 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Evidence General § 404.708 How we decide what is enough evidence. When you give...

  19. 20 CFR 404.708 - How we decide what is enough evidence.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false How we decide what is enough evidence. 404.708 Section 404.708 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Evidence General § 404.708 How we decide what is enough evidence. When you give...

  20. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Who decides where Job Corps centers will be... LABOR THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located? (a) The Secretary...

  1. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false Who decides where Job Corps centers will be... LABOR (CONTINUED) THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located? (a...

  2. 45 CFR 150.417 - Issues to be heard and decided by ALJ.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Issues to be heard and decided by ALJ. 150.417... issues: (1) Whether a basis exists to assess a civil money penalty against the respondent. (2) Whether the amount of the assessed civil money penalty is reasonable. (b) In deciding whether the amount of...

  3. 49 CFR 40.377 - Who decides whether to issue a PIE?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Who decides whether to issue a PIE? 40.377 Section... a PIE? (a) The ODAPC Director, or his or her designee, decides whether to issue a PIE. If a designee... determination about whether to start a PIE proceeding. (c) There is a “firewall” between the initiating...

  4. 34 CFR 646.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary decide which new grants to make... Secretary Make a Grant? § 646.20 How does the Secretary decide which new grants to make? (a) The Secretary... criterion is indicated in parentheses with the criterion. (b) The Secretary makes new grants in rank order...

  5. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  6. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH..., ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding an Appeal before...

  7. 42 CFR 423.2016 - Timeframes for deciding an Appeal before an ALJ.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Timeframes for deciding an Appeal before an ALJ. 423.2016 Section 423.2016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... BENEFIT Reopening, ALJ Hearings, MAC review, and Judicial Review § 423.2016 Timeframes for deciding...

  8. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Who decides where Job Corps centers will be located? 670.200 Section 670.200 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF... Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located? (a) The...

  9. 76 FR 59578 - List of Nonconforming Vehicles Decided To Be Eligible for Importation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-27

    ... National Highway Traffic Safety Administration 49 CFR Part 593 List of Nonconforming Vehicles Decided To Be... rule. SUMMARY: This document revises the list of vehicles not originally manufactured to conform to the Federal Motor Vehicle Safety Standards (FMVSS) that NHTSA has decided to be eligible for importation....

  10. 75 FR 57396 - List of Nonconforming Vehicles Decided To Be Eligible for Importation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-21

    ... National Highway Traffic Safety Administration 49 CFR Part 593 List of Nonconforming Vehicles Decided To Be... rule. SUMMARY: This document revises the list of vehicles not originally manufactured to conform to the Federal Motor Vehicle Safety Standards (FMVSS) that NHTSA has decided to be eligible for importation....

  11. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve...

  12. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether...

  13. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether...

  14. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve...

  15. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2011-04-01 2011-04-01 false Who decides how Language Development funds can be used...

  16. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2014-04-01 2014-04-01 false Who decides how Language Development funds can be used...

  17. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2013-04-01 2013-04-01 false Who decides how Language Development funds can be used...

  18. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Deciding a contest after a fact-finding... Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... official's findings of fact, unless they are arbitrary, capricious, or clearly erroneous. If the...

  19. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Who decides how Language Development funds can be used... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school...

  1. DNA resection in eukaryotes: deciding how to fix the break.

    PubMed

    Huertas, Pablo

    2010-01-01

    DNA double-strand breaks (DSBs) are repaired by different mechanisms, including homologous recombination and nonhomologous end-joining. DNA-end resection, the first step in recombination, is a key step that contributes to the choice of DSB repair pathway. Resection, an evolutionarily conserved process that generates single-stranded DNA, is linked to checkpoint activation and is critical for survival. Failure to regulate and execute this process results in defective recombination and can contribute to human disease. Here I review recent findings on the mechanisms of resection in eukaryotes, from yeast to vertebrates, provide insights into the regulatory strategies that control it, and highlight the consequences of both its impairment and its deregulation.

  2. Use of GIS and Data Visualization Tools for Modeling Aquifer Architecture and Generating Aquifer Vulnerability Maps at a Regional Scale

    NASA Astrophysics Data System (ADS)

    Scibek, J.; Allen, D. M.; Bishop, T. W.; Wei, M.

    2003-12-01

    The Grand Forks aquifer is one of the first aquifers in the province of British Columbia to undergo a full hydrogeologic characterization because of its importance as a water supply. It is also being used as a case study area for modelling the impact of climate change on groundwater. The aquifer consists of a layered sequence of glacial and alluvial sediments overlying bedrock, which from the top down comprises gravel, sand, silt and clay. Aquifer hydrostratigraphy was defined based on over 300 water well records contained in the BC Ministry of Water, Land and Air Protection WELL Database. Lithology data were first standardized to correct errors in syntax, grammar and spelling, recognize equivalent terms, and classify the materials into dominant types so that calculations involving the database could be more easily undertaken. Standardized data were then used to construct an aquifer architecture model that can be used as input to a numerical groundwater flow model and to construct a vulnerability map. The three-dimensional aquifer architecture model was developed in the data visualization software GMS by first constructing cross-sections and later generating a solid model that represents the layering and spatial heterogeneity of the aquifer. The bedrock surface was modeled using geostatistical techniques to produce a bedrock digital elevation model (DEM) that better constrains the lower bound of the model. Layers were imported into the numerical groundwater flow code, Visual MODFLOW, and are being used to model current climate conditions and climate change scenarios for the Grand Forks region. A GIS was also used to capture the spatial variability in the input parameters used to construct vulnerability maps. Using the DRASTIC approach, indices were assigned to each of seven hydrogeologic parameters, and a raster map was generated for each. A digitized soils map was used to assign soil material and soil topography
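
    The DRASTIC step combines the seven hydrogeologic parameters into a single vulnerability index as a weighted sum of per-cell ratings. The sketch below uses the standard DRASTIC weights; the ratings are invented, and a real map applies this calculation cell by cell over rasters.

        # Standard DRASTIC weights (the capital letters spell the acronym)
        DRASTIC_WEIGHTS = {
            "Depth_to_water": 5, "net_Recharge": 4, "Aquifer_media": 3,
            "Soil_media": 2, "Topography": 1, "Impact_of_vadose_zone": 5,
            "hydraulic_Conductivity": 3,
        }

        def drastic_index(ratings):
            """Weighted sum of 1-10 ratings; higher = more vulnerable."""
            return sum(DRASTIC_WEIGHTS[k] * r for k, r in ratings.items())

        print(drastic_index({
            "Depth_to_water": 9, "net_Recharge": 6, "Aquifer_media": 8,
            "Soil_media": 7, "Topography": 10, "Impact_of_vadose_zone": 8,
            "hydraulic_Conductivity": 6}))   # 175 for this illustrative cell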

  3. miRMOD: a tool for identification and analysis of 5' and 3' miRNA modifications in Next Generation Sequencing small RNA data.

    PubMed

    Kaushik, Abhinav; Saraf, Shradha; Mukherjee, Sunil K; Gupta, Dinesh

    2015-01-01

    In the past decade, microRNAs (miRNAs) have emerged as important regulators of gene expression across various species. Several studies have confirmed different types of post-transcriptional modifications at the terminal ends of miRNAs. The reports indicate that miRNA modifications are conserved and functionally significant, as they may affect miRNA stability and the ability to bind mRNA targets, hence affecting target gene repression. Next Generation Sequencing (NGS) of small RNA (sRNA) provides an efficient and reliable method to explore miRNA modifications. The need for dedicated software, especially for users with little knowledge of computers, to determine and analyze miRNA modifications in sRNA NGS data motivated us to develop miRMOD. miRMOD is a user-friendly, Microsoft Windows and Graphical User Interface (GUI) based tool for identification and analysis of 5' and 3' miRNA modifications (non-templated nucleotide additions and trimming) in sRNA NGS data. In addition to identifying miRNA modifications, the tool also predicts and compares the targets of query and modified miRNAs. In order to compare binding affinities for the same target, miRMOD utilizes the minimum free energies of the miRNA:target and modified-miRNA:target interactions. Comparisons of the binding energies may guide experimental exploration of miRNA post-transcriptional modifications. The tool is available as a stand-alone package to overcome the large data transfer problems commonly faced in web-based high-throughput (HT) sequencing data analysis tools. The miRMOD package is freely available at http://bioinfo.icgeb.res.in/miRMOD.
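
    The basic classification such a tool performs, deciding whether a read's 3' end carries non-templated nucleotides or has been trimmed relative to the canonical mature miRNA, can be shown with a toy comparison. A real pipeline such as miRMOD aligns reads against the genome and precursor; the function below is invented for illustration only.

        def tail_modification(read, mature):
            """Classify the 3' end of a read against its mature miRNA sequence.

            Returns ('addition', tail), ('trimming', n_bases) or ('none', '').
            5'-mismatching reads are out of scope for this toy and return None.
            """
            if not read.startswith(mature[:min(len(read), len(mature))]):
                return None
            if len(read) > len(mature):
                return ("addition", read[len(mature):])   # e.g. 3' adenylation
            if len(read) < len(mature):
                return ("trimming", len(mature) - len(read))
            return ("none", "")

        # let-7a-like example (DNA alphabet): a read with a 3' 'AA' addition
        print(tail_modification("TGAGGTAGTAGGTTGTATAGTTAA",
                                "TGAGGTAGTAGGTTGTATAGTT"))  # ('addition', 'AA')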

  4. To select the best tool for generating 3D maintenance data and to set the detailed process for obtaining the 3D maintenance data

    NASA Astrophysics Data System (ADS)

    Prashanth, B. N.; Roy, Kingshuk

    2017-07-01

    Three-dimensional (3D) maintenance data provide a link between design and technical documentation, creating interactive 3D graphical training and maintenance material. It is difficult for an operator to page through huge paper manuals or keep returning to a computer while maintaining a machine, which makes the maintenance work fatiguing. A 3D animation, by contrast, makes maintenance work very simple, since there is no language barrier. The research deals with the generation of 3D maintenance data for any given machine. The best tool for obtaining the 3D maintenance data is selected and analyzed, and using that tool a detailed process for extracting the 3D maintenance data for any machine is set. This project aims at selecting the best tool for obtaining 3D maintenance data and at setting the detailed process for obtaining such data. 3D maintenance reduces the use of large volumes of manuals, which create human errors and make the work of an operator fatiguing; it would therefore help in training and maintenance and would increase productivity. When compared with Cortona 3D and Deep Exploration, 3Dvia proves to be the better tool: it is good in data translation and has the best renderings of the three 3D maintenance packages. 3Dvia is very user friendly and has various options for creating 3D animations. Its Interactive Electronic Technical Publication (IETP) integration is also better than that of the other two packages. Hence 3Dvia proves to be the best software for obtaining 3D maintenance data for any machine.

  5. Generation of growth arrested Leishmania amastigotes: a tool to develop live attenuated vaccine candidates against visceral leishmaniasis.

    PubMed

    Selvapandiyan, Angamuthu; Dey, Ranadhir; Gannavaram, Sreenivas; Solanki, Sumit; Salotra, Poonam; Nakhasi, Hira L

    2014-06-30

    Visceral leishmaniasis (VL) is fatal if not treated and is widely prevalent in the tropical and sub-tropical regions of the world. VL is caused by the protozoan parasite Leishmania donovani or Leishmania infantum. Although several second-generation vaccines have been licensed to protect dogs against VL, there are no effective vaccines against human VL [1]. Since people cured of leishmaniasis develop lifelong protection, the development of live attenuated Leishmania parasites as vaccines, producing a controlled infection, may be a close surrogate to leishmanization. This can be achieved by deletion of genes involved in the regulation of growth and/or virulence of the parasite. Such mutant parasites generally do not revert to virulence in animal models, even under conditions of induced immune suppression, owing to the complete deletion of the essential gene(s). In the Leishmania life cycle, the intracellular amastigote form is the virulent form and causes disease in the mammalian hosts. We developed centrin gene-deleted L. donovani parasites that displayed attenuated growth only in the amastigote stage and were found safe and efficacious against virulent challenge in experimental animal models. Thus, targeting genes differentially expressed in the amastigote stage would potentially attenuate only the amastigote stage, and hence controlled infectivity may be effective in developing immunity. This review lays out the strategies for attenuating the growth of the amastigote form of Leishmania for use as a live vaccine against leishmaniasis, with a focus on visceral leishmaniasis.

  6. Generation of chimeric bispecific G250/anti-CD3 monoclonal antibody, a tool to combat renal cell carcinoma.

    PubMed Central

    Luiten, R. M.; Coney, L. R.; Fleuren, G. J.; Warnaar, S. O.; Litvinov, S. V.

    1996-01-01

    The monoclonal antibody (MAb) G250 binds to a tumour-associated antigen, expressed in renal cell carcinoma (RCC), which has been demonstrated to be a suitable target for antibody-mediated immunotherapy. A bispecific antibody having both G250 and anti-CD3 specificity can cross-link G250 antigen-expressing RCC target cells with T cells and can mediate lysis of such targets. Therapy studies with murine antibodies are limited by immune responses to the injected antibodies (the HAMA response), which can be decreased by using chimeric antibodies. We generated a chimeric bispecific G250/anti-CD3 MAb by transfecting chimeric genes of heavy and light chains for both the G250 MAb and the anti-CD3 MAb into a myeloma cell line. Cytotoxicity assays revealed that the chimeric bispecific MAb was capable of mediating lysis of RCC cell lines by cloned human CD8+ T cells or by IL-2-stimulated peripheral blood lymphocytes (PBLs). Lysis mediated by the MAb was specific for target cells that expressed the G250 antigen and was effective at concentrations as low as 0.01 microgram ml-1. The chimeric bispecific G250/anti-CD3 MAb produced may be an effective adjuvant to the currently used IL-2-based therapy of advanced renal cell carcinoma. PMID:8795576

  7. CHARACTERIZATION OF SALT PARTICLE INDUCED CORROSION PROCESSES BY SYNCHROTRON GENERATED X-RAY FLUORESCENCE AND COMPLEMENTARY SURFACE ANALYSIS TOOLS.

    SciTech Connect

    NEUFELD, A.K.; COLE, I.S.; BOND, A.M.; ISAACS, H.S.; FURMAN, S.A.

    2001-03-25

    The benefits of using synchrotron-generated X-rays and X-ray fluorescence analysis in combination with other surface analysis techniques have been demonstrated. In studies of salt-induced corrosion, for example, the detection of Rb ions in the area of secondary spreading when salt-containing micro-droplets are placed on zinc surfaces, further supports a mechanism involving cation transport during the corrosion and spreading of corrosive salt on exposed metal surfaces. Specifically, the new analytical data shows that: (a) cations are transported radially from a primary drop formed from a salt deposit in a thin film of secondary spreading around the drop; (b) subsequently, micro-pools are formed in the area of secondary spreading, and it is likely that cations transported within the thin film accumulate in these micro-pools until the area is dehydrated; (c) the mechanism of cation transport into the area of secondary spreading does not include transport of the anions; and (d) hydroxide is the counter ion formed from oxygen reduction at the metal surface within the spreading layer. Data relevant to iron corrosion is also presented and the distinct differences relative to the zinc situation are discussed.

  8. Metabolomics as a Hypothesis-Generating Functional Genomics Tool for the Annotation of Arabidopsis thaliana Genes of “Unknown Function”

    PubMed Central

    Quanbeck, Stephanie M.; Brachova, Libuse; Campbell, Alexis A.; Guan, Xin; Perera, Ann; He, Kun; Rhee, Seung Y.; Bais, Preeti; Dickerson, Julie A.; Dixon, Philip; Wohlgemuth, Gert; Fiehn, Oliver; Barkan, Lenore; Lange, Iris; Lange, B. Markus; Lee, Insuk; Cortes, Diego; Salazar, Carolina; Shuman, Joel; Shulaev, Vladimir; Huhman, David V.; Sumner, Lloyd W.; Roth, Mary R.; Welti, Ruth; Ilarslan, Hilal; Wurtele, Eve S.; Nikolau, Basil J.

    2012-01-01

    Metabolomics is the methodology that identifies and measures global pools of small molecules (of less than about 1,000 Da) of a biological sample, which are collectively called the metabolome. Metabolomics can therefore reveal the metabolic outcome of a genetic or environmental perturbation of a metabolic regulatory network, and thus provide insights into the structure and regulation of that network. Because of the chemical complexity of the metabolome and limitations associated with individual analytical platforms for determining the metabolome, it is currently difficult to capture the complete metabolome of an organism or tissue, in contrast to genomics and transcriptomics. This paper describes the analysis of Arabidopsis metabolomics data sets acquired by a consortium that includes five analytical laboratories, bioinformaticists, and biostatisticians, which aims to develop and validate metabolomics as a hypothesis-generating functional genomics tool. The consortium is determining the metabolomes of Arabidopsis T-DNA mutant stocks, grown in a standardized, controlled environment optimized to minimize environmental impacts on the metabolomes. Metabolomics data were generated with seven analytical platforms, and the combined data are being provided to the research community to formulate initial hypotheses about genes of unknown function (GUFs). A public database (www.PlantMetabolomics.org) has been developed to provide the scientific community with access to the data along with tools to allow for its interactive analysis. Exemplary datasets are discussed to validate the approach, illustrating how initial hypotheses can be generated from the consortium-produced metabolomics data and integrated with prior knowledge to provide testable hypotheses concerning the functionality of GUFs. PMID:22645570

  9. Using soil properties as a tool to differentiate landslide generations and constrain their ages - Rogowiec landslide, Sudetes (SW Poland)

    NASA Astrophysics Data System (ADS)

    Kacprzak, Andrzej; Migoń, Piotr

    2013-04-01

    profiles in the landslide body do not show evidence of protracted soil evolution under the contemporary climate and hence are interpreted as having formed during a fraction of the Holocene. This implies a Holocene age for the landslide. In addition, an older shallow translational landslide has been recognized on the valley side, with its toe buried by the main Rogowiec landslide. The depletion area was identified through the occurrence of thin, truncated soils (compared with those on neighbouring slopes). This, and the occurrence of weakly horizonated and poorly structured soils in the landslide body itself, suggests that this valley-side landslide is of Holocene age too. Thus, soils proved a powerful tool for establishing the relative chronology of the landslides and give strong evidence of their Holocene age. Soil research is recommended as a part of landslide hazard and risk assessment for landslides of unknown age.

  10. Bioanalytical tools for the evaluation of organic micropollutants during sewage treatment, water recycling and drinking water generation.

    PubMed

    Macova, Miroslava; Toze, Simon; Hodgers, Leonie; Mueller, Jochen F; Bartkow, Michael; Escher, Beate I

    2011-08-01

    performed that allows direct comparison of different treatment technologies and covers several orders of magnitude of TEQ, from highly contaminated sewage to drinking water with TEQ close to or below the limit of detection. Detection limits of the bioassays were decreased in comparison with earlier studies by optimizing sample preparation and test protocols, and were comparable to or lower than the quantification limits of the routine chemical analysis, which allowed monitoring of the presence and removal of micropollutants post Barrier 2 and in drinking water. The results obtained with the bioanalytical tools were reproducible, robust and consistent with previous studies assessing the effectiveness of the wastewater and advanced water treatment plants. The results of this study indicate that bioanalytical results expressed as TEQ are useful for assessing the removal efficiency of micropollutants throughout all treatment steps of water recycling.
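
    The abstract reports results as toxic equivalent concentrations (TEQ). One common way to derive a bioassay TEQ, sketched below with made-up numbers, is to relate the effect concentration of a reference compound to that of the sample extract; the function name and values are illustrative assumptions, not the study's actual calculation.

```python
# Illustrative only: bioassay-derived toxic equivalent concentration (TEQ),
# relating the effect concentration (EC50) of a reference compound to that
# of a sample extract, corrected for the sample enrichment factor.

def bioassay_teq(ec50_reference: float, ec50_sample: float,
                 enrichment_factor: float = 1.0) -> float:
    """TEQ = EC50(reference) / EC50(sample), scaled by enrichment (assumed form)."""
    return ec50_reference / (ec50_sample * enrichment_factor)

# Hypothetical numbers: reference EC50 0.5 ng/L, sample EC50 25 relative
# enrichment units, extract concentrated 10-fold during preparation.
print(bioassay_teq(ec50_reference=0.5, ec50_sample=25.0, enrichment_factor=10.0))
```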

  11. Mechanism of generation of large (Ti,Nb,V)(C,N)-type precipitates in H13 + Nb tool steel

    NASA Astrophysics Data System (ADS)

    Xie, You; Cheng, Guo-guang; Chen, Lie; Zhang, Yan-dong; Yan, Qing-zhong

    2016-11-01

    The characteristics and generation mechanism of (Ti,Nb,V)(C,N) precipitates larger than 2 μm in Nb-containing H13 bar steel were studied. The results show that two types of (Ti,Nb,V)(C,N) phases exist—a Ti-V-rich one and an Nb-rich one—in the form of single or complex precipitates. The sizes of the single Ti-V-rich (Ti,Nb,V)(C,N) precipitates are mostly within 5 to 10 μm, whereas the sizes of the single Nb-rich precipitates are mostly 2-5 μm. The complex precipitates are larger and contain an inner Ti-V-rich layer and an outer Nb-rich layer. The compositional distribution of (Ti,Nb,V)(C,N) is concentrated. The average composition of the single Ti-V-rich phase is (Ti0.511V0.356Nb0.133)(CxNy), whereas that of the single Nb-rich phase is (Ti0.061V0.263Nb0.676)(CxNy). Calculations based on the Scheil-Gulliver model in the Thermo-Calc software, combined with thermal stability experiments, show that the large phases precipitate during the solidification process. As solidification proceeds, the Ti-V-rich phase precipitates first and becomes homogeneous during the subsequent temperature reduction and heat treatment processes. The Nb-rich phase appears later.
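
    For context, the Scheil-Gulliver model assumes no diffusion in the solid and complete mixing in the liquid, so the residual liquid is progressively enriched in solute, which is why such carbonitrides can precipitate late in solidification. A minimal sketch of the relation, with an assumed partition coefficient and alloy content rather than the paper's Thermo-Calc data:

```python
# Sketch of the Scheil-Gulliver relation: liquid composition C_L as a
# function of solid fraction fs, C_L = C0 * (1 - fs)**(k - 1).
# k and C0 below are illustrative values, not the paper's inputs.
import numpy as np

def scheil_liquid_concentration(c0: float, k: float, fs: np.ndarray) -> np.ndarray:
    """Residual-liquid solute content for partition coefficient k < 1."""
    return c0 * (1.0 - fs) ** (k - 1.0)

fs = np.linspace(0.0, 0.99, 5)
# e.g. a Nb-like solute, 0.08 wt% nominal, assumed k = 0.2:
print(scheil_liquid_concentration(c0=0.08, k=0.2, fs=fs))
```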

  12. Hepatic vein transit time of second-generation ultrasound contrast agent: new tool in the assessment of portal hypertension.

    PubMed

    Luisa, Siciliani; Vitale, Giovanna; Sorbo, Anna Rita; Maurizio, Pompili; Lodovico, Rapaccini Gian

    2017-03-01

    It has been demonstrated that the Doppler waveform of the hepatic vein (normally triphasic) is transformed into a biphasic or monophasic waveform in cirrhotic patients. The compressive mechanism of liver tissue has until now been considered the cause of this change. Moreover, cirrhotics show a much earlier hepatic vein transit time (HVTT) after injection of an ultrasound contrast agent (USCA), due to intrahepatic shunts. Our aim was to prospectively evaluate the correlation between the Doppler pattern of the hepatic vein and the HVTT of a second-generation USCA; we also correlated HVTT with the most common indexes of portal hypertension. We enrolled 38 participants: 33 cirrhotics and 5 healthy controls. Doppler shift signals were obtained from the right hepatic vein. To characterize the hepatic vein pattern, we used the hepatic vein waveform index (HVWI). This index becomes >1 with the appearance of the triphasic waveform. We recorded a clip from 20 s before to 2 min after a peripheral intravenous bolus injection of 2.4 ml of USCA (sulfur hexafluoride). The time taken by the USCA to cross the liver from the hepatic artery or portal vein to the hepatic vein was defined as HA-HVTT or PV-HVTT, respectively. Cirrhotics with a low HVWI showed an earlier transit time; participants with a higher HVWI had a longer transit time (p < 0.001). HVTT was earlier as MELD, Child-Pugh score and spleen diameter increased. Patients with ascites and large varices had significantly shorter transit times. The abnormal hepatic vein Doppler waveform in cirrhotic patients could be due to intrahepatic shunts. HVTT could be useful in the non-invasive evaluation of portal hypertension.
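
    A transit time of this kind is typically read off time-intensity curves as the difference between contrast arrival times at two sites. The toy sketch below, on synthetic curves, takes arrival as the first sample exceeding the baseline by three standard deviations; the threshold rule and all numbers are assumptions for illustration, not the study's protocol.

```python
# Toy transit-time estimate from contrast time-intensity curves.
import numpy as np

def arrival_time(t: np.ndarray, intensity: np.ndarray, n_baseline: int = 20) -> float:
    """First time the signal exceeds baseline mean + 3 SD (assumed rule)."""
    base = intensity[:n_baseline]
    threshold = base.mean() + 3.0 * base.std()
    return float(t[np.argmax(intensity > threshold)])

t = np.linspace(0, 120, 1201)          # 2 min sampled at 10 Hz
artery = np.where(t > 14, 1.0, 0.0)    # synthetic arterial arrival at 14 s
vein = np.where(t > 36, 1.0, 0.0)      # synthetic hepatic-vein arrival at 36 s
print("HA-HVTT ~", arrival_time(t, vein) - arrival_time(t, artery), "s")
```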

  13. LSG: An External-Memory Tool to Compute String Graphs for Next-Generation Sequencing Data Assembly.

    PubMed

    Bonizzoni, Paola; Vedova, Gianluca Della; Pirola, Yuri; Previtali, Marco; Rizzi, Raffaella

    2016-03-01

    The large amount of short read data that has to be assembled in future applications, such as metagenomics or cancer genomics, strongly motivates the investigation of disk-based approaches to indexing next-generation sequencing (NGS) data. Positive results in this direction stimulate the investigation of efficient external-memory algorithms for de novo assembly from NGS data. Our article is also motivated by the open problem of designing a space-efficient algorithm to compute a string graph using an indexing procedure based on the Burrows-Wheeler transform (BWT). We have developed a disk-based algorithm for computing string graphs in external memory: the light string graph (LSG). LSG relies on a new representation of the FM-index that is exploited to keep the main-memory requirement independent of the size of the data set. Moreover, we have developed a pipeline for genome assembly from NGS data that integrates LSG with the assembly step of SGA (Simpson and Durbin, 2012), a state-of-the-art string graph-based assembler, and uses BEETL for indexing the input data. LSG is open source software and is available online. We have analyzed our implementation on an 875-million-read whole-genome dataset, on which LSG built the string graph using only 1 GB of main memory (reducing the memory occupation by a factor of 50 with respect to SGA), while requiring slightly more than twice the time of SGA. The analysis of the entire pipeline shows an important decrease in memory usage, with only a moderate increase in running time.
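
    To make the central object concrete, here is a naive Burrows-Wheeler transform. Real NGS indexes such as the FM-index build this column for huge read sets in external memory (as BEETL does); this toy version sorts all rotations explicitly and is only meant to show what the transform is.

```python
# Naive BWT: last column of the sorted rotation matrix of text + sentinel.
# Roughly O(n^2 log n) time and O(n^2) space; fine for tiny examples only.

def bwt(text: str) -> str:
    s = text + "$"                            # unique end-of-string sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

print(bwt("ACGTACGT"))   # groups equal characters, enabling FM-index queries
```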

  14. Constraint-Based Model of Shewanella oneidensis MR-1 Metabolism: A Tool for Data Analysis and Hypothesis Generation

    PubMed Central

    Hill, Eric A.; Geydebrekht, Oleg V.; De Ingeniis, Jessica; Zhang, Xiaolin; Osterman, Andrei; Scott, James H.; Reed, Samantha B.; Romine, Margaret F.; Konopka, Allan E.; Beliaev, Alexander S.; Fredrickson, Jim K.

    2010-01-01

    Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and utilization of electron transfer networks, as well as central and peripheral carbon metabolism. To understand how Shewanella oneidensis MR-1 utilizes its resources, the metabolic network was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify cycles (such as futile cycles and circulations), (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism at a systems level.
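
    Constraint-based modeling of this kind reduces, at its core, to a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and bounds on each flux. A toy three-reaction example (the network is invented for illustration, not part of the MR-1 reconstruction):

```python
# Toy flux balance analysis: maximize a "biomass" flux under S.v = 0.
import numpy as np
from scipy.optimize import linprog

# Columns: v1 (uptake: -> A), v2 (A -> B), v3 (B -> biomass, the objective).
S = np.array([[ 1, -1,  0],    # mass balance for metabolite A
              [ 0,  1, -1]])   # mass balance for metabolite B
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 flux units
c = np.array([0, 0, -1])                   # maximize v3 == minimize -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", res.x[2])   # limited by the uptake bound: 10
```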

  15. Deciding for imperilled newborns: medical authority or parental autonomy?

    PubMed

    McHaffie, H E; Laing, I A; Parker, M; McMillan, J

    2001-04-01

    The ethical issues around decision making on behalf of infants have been illuminated by two empirical research studies carried out in Scotland. In-depth interviews with 176 medical and nursing staff and with 108 parents of babies for whom there was discussion of treatment withholding/withdrawal, generated a wealth of data on both the decision making process and the management of cases. Both staff and parents believe that parents should be involved in treatment limitation decisions on behalf of their babies. However, whilst many doctors and nurses consider the ultimate responsibility too great for families to carry, the majority of parents wish to be the final arbiters. We offer explanations for the differences in perception found in the two groups. The results of these empirical studies provide both aids to ethical reflection and guidance for clinicians dealing with these vulnerable families. They demonstrate the value of empirical data in the philosophical debate.

  16. Correlation-Based Network Generation, Visualization, and Analysis as a Powerful Tool in Biological Studies: A Case Study in Cancer Cell Metabolism

    PubMed Central

    Toubiana, David; Fait, Aaron

    2016-01-01

    In the last decade, vast data sets have been generated in biological and medical studies. The challenge lies in their summary, complexity reduction, and interpretation. Correlation-based networks and the graph-theoretical properties of such networks can be used successfully during this process. However, the procedure has its pitfalls and requires specific knowledge that often lies beyond classical biology and includes many computational tools and software. Here we introduce one of a series of methods for correlation-based network generation and analysis using freely available software. The pipeline allows the user to control each step of the network generation and provides flexibility in the selection of correlation methods and thresholds. The pipeline was implemented on published metabolomics data of a population of human breast carcinoma MDA-MB-231 cell lines under two conditions: normal and hypoxia. The analysis revealed significant differences between the metabolic networks in response to the tested conditions. The network under hypoxia had 1.7 times more significant correlations between metabolites than under normal conditions. Unique metabolic interactions were identified which could lead to the identification of improved markers or aid in elucidating the mechanism of regulation between distantly related metabolites induced by cancer growth. PMID:27840831
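
    A minimal version of such a pipeline is sketched below using common scientific-Python packages on random placeholder data; the |rho| >= 0.3 threshold and p-value cut-off are arbitrary choices for illustration, not the paper's settings.

```python
# Correlation-based network generation: Spearman correlations between
# "metabolites", thresholded into a graph whose properties can be compared
# across conditions. Data here are random placeholders.
import numpy as np
import networkx as nx
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
data = rng.normal(size=(50, 8))              # 50 samples x 8 metabolites
rho, pval = spearmanr(data)                  # 8 x 8 correlation/p-value matrices

G = nx.Graph()
names = [f"m{i}" for i in range(data.shape[1])]
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(rho[i, j]) >= 0.3 and pval[i, j] < 0.05:
            G.add_edge(names[i], names[j], weight=rho[i, j])

print(G.number_of_edges(), "significant correlations kept")
```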

  17. Deciding for imperilled newborns: medical authority or parental autonomy?

    PubMed Central

    McHaffie, H.; Laing, I.; Parker, M.; McMillan, J.

    2001-01-01

    The ethical issues around decision making on behalf of infants have been illuminated by two empirical research studies carried out in Scotland. In-depth interviews with 176 medical and nursing staff and with 108 parents of babies for whom there was discussion of treatment withholding/withdrawal, generated a wealth of data on both the decision making process and the management of cases. Both staff and parents believe that parents should be involved in treatment limitation decisions on behalf of their babies. However, whilst many doctors and nurses consider the ultimate responsibility too great for families to carry, the majority of parents wish to be the final arbiters. We offer explanations for the differences in perception found in the two groups. The results of these empirical studies provide both aids to ethical reflection and guidance for clinicians dealing with these vulnerable families. They demonstrate the value of empirical data in the philosophical debate. Key Words: Empirical ethics • treatment limitation • parental autonomy • decision making PMID:11314152

  18. Slow-oscillatory Transcranial Direct Current Stimulation Modulates Memory in Temporal Lobe Epilepsy by Altering Sleep Spindle Generators: A Possible Rehabilitation Tool.

    PubMed

    Del Felice, Alessandra; Magalini, Alessandra; Masiero, Stefano

    2015-01-01

    Temporal lobe epilepsy (TLE) is often associated with memory deficits. Given the putative role of sleep spindles in memory consolidation, spindle generators skewed toward the affected lobe in TLE subjects may be a neurophysiological marker of defective memory. Slow-oscillatory transcranial direct current stimulation (sotDCS) during slow-wave sleep (SWS) has previously been shown to enhance sleep-dependent memory consolidation by increasing slow-wave sleep and modulating sleep spindles. We tested, in a randomized controlled cross-over study, whether anodal sotDCS over the affected temporal lobe prior to a nap affects sleep spindles and whether this improves memory consolidation. Twelve people with TLE underwent sotDCS (0.75 Hz; 0-250 μV, 30 min) or sham stimulation before a daytime nap. Declarative verbal and visuospatial learning were tested. Fast and slow spindle signals were recorded by 256-channel EEG during sleep. In both study arms, electrical source imaging (ESI) localized the cortical generators. Neuropsychological data were analyzed with general linear model statistics or the Kruskal-Wallis test (P or Z < 0.05), and neurophysiological data were tested with the Mann-Whitney test and the binomial distribution test (P or Z < 0.05). An improvement in declarative (P = 0.05) and visuospatial memory performance (P = 0.048) emerged after sotDCS. SotDCS increased the current density of slow spindle generators (Z = 0.001), with a shift to more anterior cortical areas. Anodal sotDCS over the affected temporal lobe improves declarative and visuospatial memory performance by modulating the cortical source generators of slow sleep spindles. SotDCS appears a promising tool for memory rehabilitation in people with TLE.

  19. Electric pulses: a flexible tool to manipulate cytosolic calcium concentrations and generate spontaneous-like calcium oscillations in mesenchymal stem cells

    PubMed Central

    de Menorval, Marie-Amelie; Andre, Franck M.; Silve, Aude; Dalmay, Claire; Français, Olivier; Le Pioufle, Bruno; Mir, Lluis M.

    2016-01-01

    Human adipose mesenchymal stem cells (haMSCs) are multipotent adult stem cells of great interest in regenerative medicine and oncology. They present spontaneous calcium oscillations related to cell cycle progression or differentiation, but the correlation between these events is still unclear. Indeed, it is difficult to mimic haMSC spontaneous calcium oscillations by chemical means. Pulsed electric fields (PEFs) can permeabilise the plasma and/or organelle membranes, depending on the applied pulses, and can therefore generate cytosolic calcium peaks by recruiting calcium from the external medium or from internal stores. We show that it is possible to mimic haMSC spontaneous calcium oscillations (same amplitude, duration and shape) using 100 μs PEFs or 10 ns PEFs. We propose a model that explains the experimental situations reported. PEFs can therefore be a flexible tool to manipulate cytosolic calcium concentrations. This tool, which can be switched on and off instantaneously, unlike chemical agents, can be very useful for investigating the role of calcium oscillations in cell physiology and/or for manipulating cell fate. PMID:27561994

  20. Graphical contig analyzer for all sequencing platforms (G4ALL): a new stand-alone tool for finishing and draft generation of bacterial genomes.

    PubMed

    Ramos, Rommel Thiago Jucá; Carneiro, Adriana R; Caracciolo, Pablo H; Azevedo, Vasco; Schneider, Maria Paula C; Barh, Debmalya; Silva, Artur

    2013-01-01

    Genome assembly has always been complicated due to the inherent difficulties of sequencing technologies, as well as the computational methods used to process sequences. Although many of the problems in generating contigs from reads are well known, especially those involving short reads, the orientation and ordering of contigs in the finishing stages is still very challenging and time consuming, as it requires manual curation of the contigs to guarantee their correct identification and to prevent misassembly. Due to the large numbers of sequences that are produced, especially by next generation sequencers, this process demands considerable manual effort, and there are few software options available to facilitate it. To address this problem, we have developed the Graphic Contig Analyzer for All Sequencing Platforms (G4ALL): a stand-alone multi-user tool that facilitates the editing of the contigs produced in the assembly process. Besides providing information on the gene products contained in each contig, obtained through a search of the available biological databases, G4ALL produces a scaffold of the genome based on the overlap of the contigs after curation. The software is available at: http://www.genoma.ufpa.br/rramos/softwares/g4all.xhtml.
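
    Overlap-based scaffolding of the kind described rests on suffix-prefix matching between contigs. A naive version for two contigs is sketched below; it is illustrative only, since G4ALL itself works on curated assembler output and database annotations.

```python
# Naive suffix-prefix overlap: length of the longest suffix of `a` that
# matches a prefix of `b`, subject to a minimum overlap length.

def overlap(a: str, b: str, min_len: int = 20) -> int:
    start = 0
    while True:
        start = a.find(b[:min_len], start)   # candidate anchor within `a`
        if start == -1:
            return 0                         # no overlap of at least min_len
        if b.startswith(a[start:]):
            return len(a) - start            # suffix of `a` == prefix of `b`
        start += 1

print(overlap("GATTACAGATTACA", "GATTACACCC", min_len=7))   # -> 7
```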

  1. Graphical contig analyzer for all sequencing platforms (G4ALL): a new stand-alone tool for finishing and draft generation of bacterial genomes

    PubMed Central

    Ramos, Rommel Thiago Jucá; Carneiro, Adriana R; Caracciolo, Pablo H; Azevedo, Vasco; Schneider, Maria Paula C; Barh, Debmalya; Silva, Artur

    2013-01-01

    Genome assembly has always been complicated due to the inherent difficulties of sequencing technologies, as well as the computational methods used to process sequences. Although many of the problems in generating contigs from reads are well known, especially those involving short reads, the orientation and ordering of contigs in the finishing stages is still very challenging and time consuming, as it requires manual curation of the contigs to guarantee their correct identification and to prevent misassembly. Due to the large numbers of sequences that are produced, especially by next generation sequencers, this process demands considerable manual effort, and there are few software options available to facilitate it. To address this problem, we have developed the Graphic Contig Analyzer for All Sequencing Platforms (G4ALL): a stand-alone multi-user tool that facilitates the editing of the contigs produced in the assembly process. Besides providing information on the gene products contained in each contig, obtained through a search of the available biological databases, G4ALL produces a scaffold of the genome based on the overlap of the contigs after curation. Availability: The software is available at: http://www.genoma.ufpa.br/rramos/softwares/g4all.xhtml PMID:23888102

  2. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... opportunity for SDVO competition exists? 125.17 Section 125.17 Business Credit and Assistance SMALL BUSINESS... opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists. ...

  3. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... opportunity for SDVO competition exists? 125.17 Section 125.17 Business Credit and Assistance SMALL BUSINESS... opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists....

  4. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... opportunity for SDVO competition exists? 125.17 Section 125.17 Business Credit and Assistance SMALL BUSINESS... opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists....

  5. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... opportunity for SDVO competition exists? 125.17 Section 125.17 Business Credit and Assistance SMALL BUSINESS... opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a contract opportunity for SDVO competition exists....

  6. Vaginal Birth After Cesarean Delivery: Deciding on a Trial of Labor After a Cesarean Delivery (TOLAC)

    MedlinePlus

    ... ASKED QUESTIONS FAQ070 LABOR, DELIVERY, AND POSTPARTUM CARE Vaginal Birth After Cesarean Delivery: Deciding on a Trial of Labor After Cesarean Delivery • What is a vaginal birth after cesarean delivery (VBAC)? • What is a ...

  7. DRACO-STEM: An Automatic Tool to Generate High-Quality 3D Meshes of Shoot Apical Meristem Tissue at Cell Resolution.

    PubMed

    Cerutti, Guillaume; Ali, Olivier; Godin, Christophe

    2017-01-01

    Context: The shoot apical meristem (SAM), origin of all aerial organs of the plant, is a restricted niche of stem cells whose growth is regulated by a complex network of genetic, hormonal and mechanical interactions. Studying the development of this area at cell level using 3D microscopy time-lapse imaging is a newly emerging key to understanding the processes controlling plant morphogenesis. Computational models have been proposed to simulate those mechanisms; however, their validation on real-life data is an essential step that requires an adequate representation of the growing tissue. Achievements: The tool we introduce is a two-stage computational pipeline that generates a complete 3D triangular mesh of the tissue volume based on a segmented tissue image stack. DRACO (Dual Reconstruction by Adjacency Complex Optimization) is designed to retrieve the underlying 3D topological structure of the tissue and compute its dual geometry, while STEM (SAM Tissue Enhanced Mesh) returns a faithful triangular mesh optimized along several quality criteria (intrinsic quality, tissue reconstruction, visual adequacy). Quantitative evaluation tools measuring the performance of the method along those different dimensions are also provided. The resulting meshes can be used as input and validation for biomechanical simulations. Availability: DRACO-STEM is supplied as a package of the open-source multi-platform plant modeling library OpenAlea (http://openalea.github.io/) implemented in Python, and is freely distributed on GitHub (https://github.com/VirtualPlants/draco-stem) along with guidelines for installation and use.
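
    One standard notion of the "intrinsic quality" of a mesh triangle, shown below, is the inradius-to-circumradius ratio scaled so an equilateral triangle scores 1. This particular metric is an illustrative assumption, not necessarily the criterion DRACO-STEM optimizes.

```python
# Triangle quality as 2 * inradius / circumradius: 1.0 for equilateral,
# approaching 0 for degenerate "sliver" triangles.
import math

def triangle_quality(a: float, b: float, c: float) -> float:
    s = (a + b + c) / 2.0                              # semi-perimeter
    area = math.sqrt(s * (s - a) * (s - b) * (s - c))  # Heron's formula
    inradius = area / s
    circumradius = a * b * c / (4.0 * area)
    return 2.0 * inradius / circumradius

print(triangle_quality(1, 1, 1))     # equilateral: 1.0
print(triangle_quality(1, 1, 1.9))   # sliver: close to 0
```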

  8. DRACO-STEM: An Automatic Tool to Generate High-Quality 3D Meshes of Shoot Apical Meristem Tissue at Cell Resolution

    PubMed Central

    Cerutti, Guillaume; Ali, Olivier; Godin, Christophe

    2017-01-01

    Context: The shoot apical meristem (SAM), origin of all aerial organs of the plant, is a restricted niche of stem cells whose growth is regulated by a complex network of genetic, hormonal and mechanical interactions. Studying the development of this area at cell level using 3D microscopy time-lapse imaging is a newly emerging key to understanding the processes controlling plant morphogenesis. Computational models have been proposed to simulate those mechanisms; however, their validation on real-life data is an essential step that requires an adequate representation of the growing tissue. Achievements: The tool we introduce is a two-stage computational pipeline that generates a complete 3D triangular mesh of the tissue volume based on a segmented tissue image stack. DRACO (Dual Reconstruction by Adjacency Complex Optimization) is designed to retrieve the underlying 3D topological structure of the tissue and compute its dual geometry, while STEM (SAM Tissue Enhanced Mesh) returns a faithful triangular mesh optimized along several quality criteria (intrinsic quality, tissue reconstruction, visual adequacy). Quantitative evaluation tools measuring the performance of the method along those different dimensions are also provided. The resulting meshes can be used as input and validation for biomechanical simulations. Availability: DRACO-STEM is supplied as a package of the open-source multi-platform plant modeling library OpenAlea (http://openalea.github.io/) implemented in Python, and is freely distributed on GitHub (https://github.com/VirtualPlants/draco-stem) along with guidelines for installation and use. PMID:28424704

  9. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
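
    The hazard-curve idea is simple to state computationally: for each intensity level, one has a distribution of exceedance probabilities across epistemic samples, from which a mean curve and percentile curves are drawn. A synthetic-numbers sketch (not PyBetVH code; the beta-distributed samples and intensity levels are invented):

```python
# Hazard curves with epistemic uncertainty from Monte Carlo samples.
import numpy as np

rng = np.random.default_rng(1)
intensities = np.array([1.0, 10.0, 100.0, 300.0])   # e.g. tephra load, kg/m^2
# 1000 epistemic samples of P(load >= intensity) at each intensity level:
samples = rng.beta(a=2, b=np.array([5, 20, 200, 900]), size=(1000, 4))

mean_curve = samples.mean(axis=0)
p10, p90 = np.percentile(samples, [10, 90], axis=0)
for x, m, lo, hi in zip(intensities, mean_curve, p10, p90):
    print(f"P(load >= {x:>5} kg/m2): mean={m:.4f}  10th={lo:.4f}  90th={hi:.4f}")
```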

  10. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: • Each next-generation station measures all parameters needed for flux computations • The field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc. • Final fluxes, radiation, weather and soil data are merged into a single quality-controlled file • Multiple flux stations are linked into an automated time-synchronized network • The flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts • The PI can assign rights, and allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions • Researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from

  11. Modeling of low-temperature plasmas generated using laser-induced breakdown spectroscopy: the ChemCam diagnostic tool on the Mars Science Laboratory Rover

    NASA Astrophysics Data System (ADS)

    Colgan, James

    2016-05-01

    We report on efforts to model the low-temperature plasmas generated using laser-induced breakdown spectroscopy (LIBS). LIBS is a minimally invasive technique that can quickly and efficiently determine the elemental composition of a target and is employed in an extremely wide range of applications due to its ease of use and fast turnaround. In particular, LIBS is the diagnostic tool used by the ChemCam instrument on the Mars Science Laboratory rover Curiosity. In this talk, we report on the use of the Los Alamos plasma modeling code ATOMIC to simulate LIBS plasmas, which are typically at temperatures of order 1 eV and electron densities of order 10^16-10^17 cm^-3. At such conditions, these plasmas are usually in local thermodynamic equilibrium (LTE) and normally contain neutral and singly ionized species only, which then requires that modeling use accurate atomic structure data for the element under investigation. Since LIBS devices are employed in a very wide range of applications, it is desirable to have accurate data for most of the elements in the periodic table, ideally including actinides. Here, we discuss some recent applications of our modeling using ATOMIC that have explored the plasma physics aspects of LIBS-generated plasmas, and in particular discuss the modeling of a plasma formed from a basalt sample used as a ChemCam standard [1]. We also highlight some of the more general atomic physics challenges that are encountered when attempting to model low-temperature plasmas. The Los Alamos National Laboratory is operated by Los Alamos National Security, LLC for the National Nuclear Security Administration of the U.S. Department of Energy under Contract No. DE-AC5206NA25396. Work performed in conjunction with D. P. Kilcrease, H. M. Johns, E. J. Judge, J. E. Barefield, R. C. Wiens, S. M. Clegg.
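
    In LTE, the balance between neutral and singly ionized stages follows the Saha equation; the sketch below evaluates it at the quoted LIBS conditions for iron (first ionization potential 7.9 eV). The single-element, two-stage treatment and unit statistical-weight ratio are simplifications of what a code like ATOMIC actually solves.

```python
# Saha ionization balance, n_{i+1}/n_i, for an LTE plasma at LIBS conditions.
import math

def saha_ratio(n_e_cm3: float, T_eV: float, chi_eV: float,
               g_ratio: float = 1.0) -> float:
    """n_{i+1}/n_i = 2 g_ratio (2 pi m_e k T / h^2)^{3/2} / n_e * exp(-chi/kT)."""
    prefactor = 3.02e21 * T_eV ** 1.5          # (2 pi m_e k T / h^2)^{3/2}, cm^-3
    return 2.0 * g_ratio * prefactor / n_e_cm3 * math.exp(-chi_eV / T_eV)

# Iron at T = 1 eV, n_e = 1e17 cm^-3: ratio >> 1, i.e. mostly singly ionized,
# consistent with plasmas containing neutral and singly ionized species only.
print(saha_ratio(n_e_cm3=1e17, T_eV=1.0, chi_eV=7.9))
```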

  12. New generation pharmacogenomic tools: a SNP linkage disequilibrium Map, validated SNP assay resource, and high-throughput instrumentation system for large-scale genetic studies.

    PubMed

    De La Vega, Francisco M; Dailey, David; Ziegle, Janet; Williams, Julie; Madden, Dawn; Gilbert, Dennis A

    2002-06-01

    Since public and private efforts announced the first draft of the human genome last year, researchers have reported great numbers of single nucleotide polymorphisms (SNPs). We believe that the availability of well-mapped, quality SNP markers constitutes the gateway to a revolution in genetics and personalized medicine that will lead to better diagnosis and treatment of common complex disorders. A new generation of tools and public SNP resources for pharmacogenomic and genetic studies--specifically for candidate-gene, candidate-region, and whole-genome association studies--will form part of the new scientific landscape. This will only be possible through the greater accessibility of SNP resources and superior high-throughput instrumentation-assay systems that enable affordable, highly productive large-scale genetic studies. We are contributing to this effort by developing a high-quality linkage disequilibrium SNP marker map and an accompanying set of ready-to-use, validated SNP assays across every gene in the human genome. This effort incorporates both the public sequence and SNP data sources, and Celera Genomics' human genome assembly and enormous resource of physically mapped SNPs (approximately 4,000,000 unique records). This article discusses our approach and methodology for designing the map, choosing quality SNPs, designing and validating these assays, and obtaining the population frequencies of the polymorphisms. We also discuss an advanced, high-performance SNP assay chemistry--a new generation of the TaqMan probe-based, 5' nuclease assay--and a high-throughput instrumentation-software system for large-scale genotyping. We provide the new SNP map and validation information, validated SNP assays and reagents, and instrumentation systems as a novel resource for genetic discoveries.
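
    Linkage disequilibrium between two SNPs is conventionally summarized by D, D', and r^2, computed from haplotype and allele frequencies. A toy computation on invented phased haplotypes (the real map was of course built from genome-scale data):

```python
# Pairwise LD statistics for two biallelic SNPs from phased haplotypes.
from collections import Counter

haplotypes = [("A", "G"), ("A", "G"), ("A", "T"), ("C", "T"),
              ("C", "T"), ("C", "T"), ("A", "G"), ("C", "G")]
n = len(haplotypes)
pA = sum(h[0] == "A" for h in haplotypes) / n          # allele freq, SNP 1
pG = sum(h[1] == "G" for h in haplotypes) / n          # allele freq, SNP 2
pAG = Counter(haplotypes)[("A", "G")] / n              # haplotype freq

D = pAG - pA * pG                                      # raw disequilibrium
Dmax = (min(pA * (1 - pG), (1 - pA) * pG) if D > 0
        else min(pA * pG, (1 - pA) * (1 - pG)))
r2 = D ** 2 / (pA * (1 - pA) * pG * (1 - pG))
print(f"D={D:.3f}  D'={D / Dmax:.3f}  r^2={r2:.3f}")
```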

  13. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…

  14. The Clinical Next‐Generation Sequencing Database: A Tool for the Unified Management of Clinical Information and Genetic Variants to Accelerate Variant Pathogenicity Classification

    PubMed Central

    Nishio, Shin‐ya

    2017-01-01

    Recent advances in next‐generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease‐specific databases. Here, we report a new database development tool, named the “Clinical NGS Database,” for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two‐feature approach to variant pathogenicity classification. The first of these approaches is a phenotype similarity‐based approach. This database allows the easy comparison of the detailed phenotype of each patient with the average phenotype of the same gene mutation at the variant or gene level. It is also possible to browse patients with the same gene mutation quickly. The other approach is a statistical approach to variant pathogenicity classification based on the use of the odds ratio for comparisons between the case and the control for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database. PMID:28008688

  15. The Clinical Next-Generation Sequencing Database: A Tool for the Unified Management of Clinical Information and Genetic Variants to Accelerate Variant Pathogenicity Classification.

    PubMed

    Nishio, Shin-Ya; Usami, Shin-Ichi

    2017-03-01

    Recent advances in next-generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease-specific databases. Here, we report a new database development tool, named the "Clinical NGS Database," for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two-feature approach to variant pathogenicity classification. The first of these approaches is a phenotype similarity-based approach. This database allows the easy comparison of the detailed phenotype of each patient with the average phenotype of the same gene mutation at the variant or gene level. It is also possible to browse patients with the same gene mutation quickly. The other approach is a statistical approach to variant pathogenicity classification based on the use of the odds ratio for comparisons between the case and the control for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database.
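
    The statistical approach described reduces to a 2x2 odds ratio per inheritance mode (carrier vs. non-carrier, case families vs. controls). A minimal sketch with invented counts and a Haldane-Anscombe correction for zero cells; this is not the database's actual code.

```python
# Odds ratio from a 2x2 table of variant carriers in cases vs. controls.

def odds_ratio(case_carriers: int, case_total: int,
               ctrl_carriers: int, ctrl_total: int) -> float:
    a, b = case_carriers, case_total - case_carriers
    c, d = ctrl_carriers, ctrl_total - ctrl_carriers
    if 0 in (a, b, c, d):                    # Haldane-Anscombe correction
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    return (a * d) / (b * c)

# e.g. a variant seen in 6/200 autosomal-dominant families and 1/1000 controls:
print(odds_ratio(6, 200, 1, 1000))           # ~30.9, suggestive of pathogenicity
```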

  16. A feasibility randomised controlled trial of the DECIDE intervention: dementia carers making informed decisions

    PubMed Central

    Lord, Kathryn; Livingston, Gill

    2017-01-01

    Family carers report high levels of decisional conflict when deciding whether their relative with dementia can continue to be cared for in their own home. We tested, in a feasibility randomised controlled trial, the first decision aid (the DECIDE manual) aiming to reduce such conflict. Twenty family carers received the DECIDE intervention, and 21 received usual treatment. The intervention group had reduced decisional conflict compared with controls (mean difference −11.96, 95% confidence interval −20.10 to −3.83, P=0.005). All carers receiving the intervention completed and valued it, despite some still reporting difficulties with family conflict and problems negotiating services. PMID:28243460

  17. A preconsultation web-based tool to generate an agenda for discussion in diabetes outpatient clinics to improve patient outcomes (DIAT): a feasibility study

    PubMed Central

    Vaidya, Bijay; Frost, Julia; Anderson, Rob; Argyle, Catherine; Daly, Mark; Harris-Golesworthy, Faith; Harris, Jim; Gibson, Andy; Ingram, Wendy; Pinkney, Jon; Vickery, Jane; Britten, Nicky

    2017-01-01

    Objective: To test the feasibility of running a randomised controlled trial of a preconsultation web-based intervention (Presenting Asking Checking Expressing (PACE-D)) to improve the quality of care and clinical outcomes in patients with diabetes. Design and setting: A feasibility study (with randomisation) conducted at outpatient diabetes clinics at two secondary care hospitals in Devon, UK. Participants: People with diabetes (type 1 and type 2) attending secondary care general diabetes outpatient clinics. Intervention: PACE-D, a web-based tool adapted for patients with diabetes to use before their consultation to generate an agenda of topics to discuss with their diabetologist. Outcomes: The percentage of eligible patients who were recruited, and the percentage of participants for whom routine glycosylated haemoglobin (HbA1c) data (the putative primary outcome) could be extracted from medical notes and who completed secondary outcome assessments via questionnaire at follow-up. Results: In contrast with the planned recruitment of 120 participants, only 71 participants were randomised during the 7-month recruitment period. This comprised 18.7% (95% CI 14.9% to 23.0%) of those who were eligible. Mean (SD) age of the participants was 56.5 (12.4) years and 66.2% had type 1 diabetes. Thirty-eight patients were randomised to the intervention arm and 33 to the control arm. HbA1c data were available for only 73% (95% CI 61% to 83%) of participants at the 6-month follow-up. Questionnaire-based data were collected for 66% (95% CI 54% to 77%) of the participants at the 6-month follow-up. Participants reported that the PACE-D tool was easy to use. Conclusions: A randomised controlled trial of the preconsultation web-based intervention as set out in our current protocol is not feasible without significant modification to improve recruitment and follow-up of participants. The study also provides insights into the feasibility and challenges of conducting complex

  18. Examining the impact of larval source management and insecticide-treated nets using a spatial agent-based model of Anopheles gambiae and a landscape generator tool.

    PubMed

    Arifin, S M Niaz; Madey, Gregory R; Collins, Frank H

    2013-08-21

    Agent-based models (ABMs) have been used to estimate the effects of malaria-control interventions. Early studies have shown the efficacy of larval source management (LSM) and insecticide-treated nets (ITNs) as vector-control interventions, applied both in isolation and in combination. However, the robustness of results can be affected by several important modelling assumptions, including the type of boundary used for landscapes, and the number of replicated simulation runs reported in results. Selection of the ITN coverage definition may also affect the predictive findings. Hence, by replication, independent verification of prior findings of published models bears special importance. A spatially-explicit entomological ABM of Anopheles gambiae is used to simulate the resource-seeking process of mosquitoes in grid-based landscapes. To explore LSM and replicate results of an earlier LSM study, the original landscapes and scenarios are replicated by using a landscape generator tool, and 1,800 replicated simulations are run using absorbing and non-absorbing boundaries. To explore ITNs and evaluate the relative impacts of the different ITN coverage schemes, the settings of an earlier ITN study are replicated, the coverage schemes are defined and simulated, and 9,000 replicated simulations for three ITN parameters (coverage, repellence and mortality) are run. To evaluate LSM and ITNs in combination, landscapes with varying densities of houses and human populations are generated, and 12,000 simulations are run. General agreement with an earlier LSM study is observed when an absorbing boundary is used. However, using a non-absorbing boundary produces significantly different results, which may be attributed to the unrealistic killing effect of an absorbing boundary. Abundance cannot be completely suppressed by removing aquatic habitats within 300 m of houses. Also, with density-dependent oviposition, removal of insufficient number of aquatic habitats may prove counter
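
    The boundary-condition issue the authors test can be seen in a toy random-walk ABM: an absorbing boundary silently removes agents that step off the grid, while a non-absorbing (here, toroidal) boundary keeps them in play. Grid size, step count, and agent numbers below are arbitrary choices for illustration.

```python
# Absorbing vs. toroidal (non-absorbing) boundaries in a grid random walk.
import random

def step(pos, size, absorbing: bool):
    x, y = pos
    dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
    x, y = x + dx, y + dy
    if absorbing:
        return (x, y) if 0 <= x < size and 0 <= y < size else None
    return (x % size, y % size)              # torus: wrap to the far side

random.seed(0)
for absorbing in (True, False):
    agents = [(5, 5)] * 1000                 # all start at the grid centre
    for _ in range(200):
        agents = [p for p in (step(a, 11, absorbing) for a in agents) if p]
    print("absorbing" if absorbing else "toroidal", "survivors:", len(agents))
```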

  19. Examining the impact of larval source management and insecticide-treated nets using a spatial agent-based model of Anopheles gambiae and a landscape generator tool

    PubMed Central

    2013-01-01

    Background: Agent-based models (ABMs) have been used to estimate the effects of malaria-control interventions. Early studies have shown the efficacy of larval source management (LSM) and insecticide-treated nets (ITNs) as vector-control interventions, applied both in isolation and in combination. However, the robustness of results can be affected by several important modelling assumptions, including the type of boundary used for landscapes, and the number of replicated simulation runs reported in results. Selection of the ITN coverage definition may also affect the predictive findings. Hence, by replication, independent verification of prior findings of published models bears special importance. Methods: A spatially-explicit entomological ABM of Anopheles gambiae is used to simulate the resource-seeking process of mosquitoes in grid-based landscapes. To explore LSM and replicate results of an earlier LSM study, the original landscapes and scenarios are replicated by using a landscape generator tool, and 1,800 replicated simulations are run using absorbing and non-absorbing boundaries. To explore ITNs and evaluate the relative impacts of the different ITN coverage schemes, the settings of an earlier ITN study are replicated, the coverage schemes are defined and simulated, and 9,000 replicated simulations for three ITN parameters (coverage, repellence and mortality) are run. To evaluate LSM and ITNs in combination, landscapes with varying densities of houses and human populations are generated, and 12,000 simulations are run. Results: General agreement with an earlier LSM study is observed when an absorbing boundary is used. However, using a non-absorbing boundary produces significantly different results, which may be attributed to the unrealistic killing effect of an absorbing boundary. Abundance cannot be completely suppressed by removing aquatic habitats within 300 m of houses. Also, with density-dependent oviposition, removal of insufficient number of aquatic

  20. Analysis of the Vaginal Microbiome by Next-Generation Sequencing and Evaluation of its Performance as a Clinical Diagnostic Tool in Vaginitis.

    PubMed

    Hong, Ki Ho; Hong, Sung Kuk; Cho, Sung Im; Ra, Eunkyung; Han, Kyung Hee; Kang, Soon Beom; Kim, Eui Chong; Park, Sung Sup; Seong, Moon Woo

    2016-09-01

    Next-generation sequencing (NGS) can detect many more microorganisms of a microbiome than traditional methods. This study aimed to analyze the vaginal microbiomes of Korean women by using NGS, covering bacteria and other microorganisms. The NGS results were compared with the results of other assays, and NGS was evaluated for its feasibility for predicting vaginitis. In total, 89 vaginal swab specimens were collected. Microscopic examinations of Gram staining and microbiological cultures were conducted on 67 specimens. NGS was performed with the GS Junior system on all of the vaginal specimens for the 16S rRNA, internal transcribed spacer (ITS), and Tvk genes to detect bacteria, fungi, and Trichomonas vaginalis. In addition, DNA probe assays for Candida spp., Gardnerella vaginalis, and Trichomonas vaginalis were performed. Various diversity predictors obtained from the NGS data were analyzed to predict vaginitis. ITS sequences were obtained in most of the specimens (56.2%). The compositions of the intermediate and vaginitis Nugent score groups were similar to each other but differed from the composition of the normal score group. The fraction of Lactobacillus spp. showed the highest area under the curve value (0.8559) in ROC curve analysis. The NGS and DNA probe assay results showed good agreement (range, 86.2-89.7%). Fungi, as well as bacteria, should be considered in investigations of the vaginal microbiome. The intermediate and vaginitis Nugent score groups were indistinguishable by NGS. NGS is a promising diagnostic tool for the vaginal microbiome and vaginitis, although some problems remain to be resolved.
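
    Scoring a single marker such as the Lactobacillus fraction as a vaginitis predictor is a one-line ROC computation once labels and fractions are in hand. The data below are synthetic, and the direction of the score (lower fraction indicating disease) is an assumption consistent with the reported AUC, not the study's dataset.

```python
# Single-feature ROC analysis: Lactobacillus fraction vs. vaginitis status.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
healthy = rng.beta(8, 2, size=40)            # high Lactobacillus fractions
vaginitis = rng.beta(2, 5, size=30)          # depleted Lactobacillus fractions

y_true = np.r_[np.zeros(40), np.ones(30)]    # 1 = vaginitis
scores = np.r_[healthy, vaginitis]
print("AUC:", roc_auc_score(y_true, 1.0 - scores))  # invert: low fraction = case
```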