Sample records for broadly applicable tools

  1. Broad-range PCR: past, present, or future of bacteriology?

    PubMed

    Renvoisé, A; Brossier, F; Sougakoff, W; Jarlier, V; Aubry, A

    2013-08-01

    PCR targeting the gene encoding 16S ribosomal RNA (commonly called broad-range PCR or 16S PCR) has been used for 20 years as a polyvalent tool to study prokaryotes. Broad-range PCR was first used as a taxonomic tool, then in clinical microbiology. We describe the use of broad-range PCR in clinical microbiology. The first application was the identification of bacterial strains obtained by culture but whose phenotypic or proteomic identification remained difficult or impossible. This changed bacterial taxonomy and allowed the discovery of many new species. The second application of broad-range PCR in clinical microbiology is the detection of bacterial DNA in clinical samples; we review the clinical settings in which the technique proved useful (such as endocarditis) and those in which it did not (such as characterization of bacteria in ascites in cirrhotic patients). This technique allowed identification of the etiological agents of several diseases, such as Whipple disease. This review is a synthesis of data concerning the applications, assets, and drawbacks of broad-range PCR in clinical microbiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  2. SWAT Check: A screening tool to assist users in the identification of potential model application problems

    USDA-ARS's Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) is a basin scale hydrologic model developed by the US Department of Agriculture-Agricultural Research Service. SWAT's broad applicability, user friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new u...

  3. Gas-phase broadband spectroscopy using active sources: progress, status, and applications

    PubMed Central

    Cossel, Kevin C.; Waxman, Eleanor M.; Finneran, Ian A.; Blake, Geoffrey A.; Ye, Jun; Newbury, Nathan R.

    2017-01-01

    Broadband spectroscopy is an invaluable tool for measuring multiple gas-phase species simultaneously. In this work we review basic techniques, implementations, and current applications for broadband spectroscopy. We discuss components of broad-band spectroscopy including light sources, absorption cells, and detection methods and then discuss specific combinations of these components in commonly-used techniques. We finish this review by discussing potential future advances in techniques and applications of broad-band spectroscopy. PMID:28630530

  4. REMOTE SENSING FOR BIOENGINEERED CROPS

    EPA Science Inventory

    Increasing interest in the responsible management of technology in the industrial and agricultural sectors of the economy has been met through the development of broadly applicable tools to assess the "sustainability" of new technologies. An arena ripe for application of such ana...

  5. SUSTAINABILITY OF INSECT RESISTANCE MANAGEMENT STRATEGIES FOR TRANSGENIC BT CORN

    EPA Science Inventory

    Increasing interest in the responsible management of technology in the industrial and agricultural sectors of the economy has been met through the development of broadly applicable tools to assess the "sustainability" of new technologies. An arena ripe for application of such ana...

  6. ANALYSIS OF INSECT RESISTANCE MANAGEMENT OPTIONS FOR TRANSGENIC BT CORN,

    EPA Science Inventory

    Increasing interest in the responsible management of technology in the industrial and agricultural sectors of the economy has been met through the development of broadly applicable tools to assess the "sustainability" of new technologies. An arena ripe for application of such ana...

  7. SATURN (Situational Awareness Tool for Urban Responder Networks)

    DTIC Science & Technology

    2012-07-01

    timeline. SATURN is applicable to a broad set of law enforcement, security, and counterterrorism missions typically addressed by urban responders... Keywords: video analytics; sensor fusion; video; urban responders... Urban authorities have a broad set of missions. Duties vary in... both the frequency of occurrence and in the complexity of execution. They include everyday public safety missions such as traffic enforcement as

  8. Applications of Adaptive Quantum Control to Research Questions in Solar Energy Conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damrauer, Niels

    2017-02-07

    This award supported a broad research effort at the University of Colorado at Boulder comprising synthesis, applications of computational chemistry, development of theory, exploration of material properties, and advancement of spectroscopic tools including femtosecond pulse shaping techniques. It funded six graduate students and two postdoctoral researchers.

  9. Phylogenetic Reconstruction as a Broadly Applicable Teaching Tool in the Biology Classroom: The Value of Data in Estimating Likely Answers

    ERIC Educational Resources Information Center

    Julius, Matthew L.; Schoenfuss, Heiko L.

    2006-01-01

    This laboratory exercise introduces students to a fundamental tool in evolutionary biology--phylogenetic inference. Students are required to create a data set via observation and through mining preexisting data sets. These student data sets are then used to develop and compare competing hypotheses of vertebrate phylogeny. The exercise uses readily…
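
    As an illustrative aside, the kind of inference the exercise builds toward can be sketched in a few lines of Python: observed character states are scored into a pairwise distance matrix, and the taxa differing least are grouped first. The taxa and characters below are invented for illustration, not the exercise's actual dataset.

```python
# Hypothetical character observations: 1 = trait present, 0 = absent.
characters = {
    "lamprey": (0, 0, 0, 0),
    "trout":   (1, 0, 0, 0),
    "frog":    (1, 1, 0, 0),
    "lizard":  (1, 1, 1, 0),
    "mouse":   (1, 1, 1, 1),
}

def hamming(a, b):
    """Number of characters in which two taxa differ."""
    return sum(x != y for x, y in zip(a, b))

def distance_matrix(taxa):
    names = sorted(taxa)
    return {(i, j): hamming(taxa[i], taxa[j]) for i in names for j in names}

def closest_pair(dist):
    """The two distinct taxa sharing the most character states."""
    return min((p for p in dist if p[0] < p[1]), key=dist.get)

dm = distance_matrix(characters)
print(closest_pair(dm))  # the first pair students would group together
```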

  10. Approaches to Fungal Genome Annotation

    PubMed Central

    Haas, Brian J.; Zeng, Qiandong; Pearson, Matthew D.; Cuomo, Christina A.; Wortman, Jennifer R.

    2011-01-01

    Fungal genome annotation is the starting point for analysis of genome content. This generally involves the application of diverse methods to identify features on a genome assembly such as protein-coding and non-coding genes, repeats and transposable elements, and pseudogenes. Here we describe tools and methods leveraged for eukaryotic genome annotation with a focus on the annotation of fungal nuclear and mitochondrial genomes. We highlight the application of the latest technologies and tools to improve the quality of predicted gene sets. The Broad Institute eukaryotic genome annotation pipeline is described as one example of how such methods and tools are integrated into a sequencing center’s production genome annotation environment. PMID:22059117

  11. A bacterial type III secretion-based protein delivery tool for broad applications in cell biology.

    PubMed

    Ittig, Simon J; Schmutz, Christoph; Kasper, Christoph A; Amstutz, Marlise; Schmidt, Alexander; Sauteur, Loïc; Vigano, M Alessandra; Low, Shyan Huey; Affolter, Markus; Cornelis, Guy R; Nigg, Erich A; Arrieumerlou, Cécile

    2015-11-23

    Methods enabling the delivery of proteins into eukaryotic cells are essential to address protein functions. Here we propose broad applications to cell biology for a protein delivery tool based on bacterial type III secretion (T3S). We show that bacterial, viral, and human proteins, fused to the N-terminal fragment of the Yersinia enterocolitica T3S substrate YopE, are effectively delivered into target cells in a fast and controllable manner via the injectisome of extracellular bacteria. This method enables functional interaction studies by the simultaneous injection of multiple proteins and allows the targeting of proteins to different subcellular locations by use of nanobody-fusion proteins. After delivery, proteins can be freed from the YopE fragment by a T3S-translocated viral protease or fusion to ubiquitin and cleavage by endogenous ubiquitin proteases. Finally, we show that this delivery tool is suitable to inject proteins in living animals and combine it with phosphoproteomics to characterize the systems-level impact of proapoptotic human truncated BID on the cellular network. © 2015 Ittig et al.

  12. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  13. UPIC + GO: Zeroing in on informative markers

    USDA-ARS's Scientific Manuscript database

    Microsatellites/SSRs (simple sequence repeats) have become a powerful tool in genomic biology because of their broad range of applications and availability. An efficient method recently developed to generate microsatellite-enriched libraries used in combination with high throughput DNA pyrosequencin...

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voigt, Christopher

    SEED2014 focused on advances in the science and technology emerging from the field of synthetic biology. We broadly define this as technologies that accelerate the process of genetic engineering. It highlighted new tool development, as well as the application of these tools to diverse problems in biotechnology, including therapeutics, industrial chemicals and fuels, natural products, and agriculture. Systems spanned from in vitro experiments and viruses, through diverse bacteria, to eukaryotes (yeast, mammalian cells, plants).

  15. Broad-scale assessments of ecological landscapes: developing methods and applications

    USGS Publications Warehouse

    Carr, Natasha B.; Wood, David J. A.; Bowen, Zachary H.; Haby, Travis S.

    2015-01-01

    A major component of the BLM Landscape Approach is the Rapid Ecoregional Assessment (REA) program. REAs identify important ecosystems and wildlife habitats at broad spatial scales and determine where these resources are at risk from environmental stressors that can affect the integrity of ecological systems. Building on the lessons learned from completed or current REAs, the BLM, in partnership with the U.S. Geological Survey, will perform systematic comparisons of REA methods to identify the most promising suite of landscape-level analysis tools. In addition, the BLM and USGS will develop practical applications that demonstrate how to incorporate assessment information to address existing management issues, such as cumulative effects of proposed management actions. The outcome of these efforts will be a set of comprehensive technical guidance documents for conducting and applying broad-scale assessments.

  16. Improving exposure assessment in environmental epidemiology: Application of spatio-temporal visualization tools

    NASA Astrophysics Data System (ADS)

    Meliker, Jaymie R.; Slotnick, Melissa J.; Avruskin, Gillian A.; Kaufmann, Andrew; Jacquez, Geoffrey M.; Nriagu, Jerome O.

    2005-05-01

    A thorough assessment of human exposure to environmental agents should incorporate mobility patterns and temporal changes in human behaviors and concentrations of contaminants; yet the temporal dimension is often under-emphasized in exposure assessment endeavors, due in part to insufficient tools for visualizing and examining temporal datasets. Spatio-temporal visualization tools are valuable for integrating a temporal component, thus allowing for examination of continuous exposure histories in environmental epidemiologic investigations. An application of these tools to a bladder cancer case-control study in Michigan illustrates continuous exposure life-lines and maps that display smooth, continuous changes over time. Preliminary results suggest increased risk of bladder cancer from combined exposure to arsenic in drinking water (>25 μg/day) and heavy smoking (>30 cigarettes/day) in the 1970s and 1980s, and a possible cancer cluster around automotive, paint, and organic chemical industries in the early 1970s. These tools have broad application for examining spatially- and temporally-specific relationships between exposures to environmental risk factors and disease.

  17. Internal Flow

    NASA Astrophysics Data System (ADS)

    Greitzer, E. M.; Tan, C. S.; Graf, M. B.

    2004-06-01

    Focusing on phenomena important in implementing the performance of a broad range of fluid devices, this work describes the behavior of internal flows encountered in propulsion systems, fluid machinery (compressors, turbines, and pumps) and ducts (diffusers, nozzles and combustion chambers). The book equips students and practicing engineers with a range of new analytical tools. These tools offer enhanced interpretation and application of both experimental measurements and the computational procedures that characterize modern fluids engineering.

  18. DiscoverySpace: an interactive data analysis application

    PubMed Central

    Robertson, Neil; Oveisi-Fordorei, Mehrdad; Zuyderduyn, Scott D; Varhol, Richard J; Fjell, Christopher; Marra, Marco; Jones, Steven; Siddiqui, Asim

    2007-01-01

    DiscoverySpace is a graphical application for bioinformatics data analysis. Users can seamlessly traverse references between biological databases and draw together annotations in an intuitive tabular interface. Datasets can be compared using a suite of novel tools to aid in the identification of significant patterns. DiscoverySpace is of broad utility and its particular strength is in the analysis of serial analysis of gene expression (SAGE) data. The application is freely available online. PMID:17210078

  19. Map reading tools for map libraries.

    USGS Publications Warehouse

    Greenberg, G.L.

    1982-01-01

    Engineers, navigators and military strategists employ a broad array of mechanical devices to facilitate map use. A larger number of map users such as educators, students, tourists, journalists, historians, politicians, economists and librarians are unaware of the available variety of tools which can be used with maps to increase the speed and efficiency of their application and interpretation. This paper identifies map reading tools such as coordinate readers, protractors, dividers, planimeters, and symbol-templets according to a functional classification. Particularly, arrays of tools are suggested for use in determining position, direction, distance, area and form (perimeter-shape-pattern-relief). -from Author

  20. BASTet: Shareable and Reproducible Analysis and Visualization of Mass Spectrometry Imaging Data via OpenMSI.

    PubMed

    Rubel, Oliver; Bowen, Benjamin P

    2018-01-01

    Mass spectrometry imaging (MSI) is a transformative imaging method that supports the untargeted, quantitative measurement of the chemical composition and spatial heterogeneity of complex samples with broad applications in life sciences, bioenergy, and health. While MSI data can be routinely collected, its broad application is currently limited by the lack of easily accessible analysis methods that can process data of the size, volume, diversity, and complexity generated by MSI experiments. The development and application of cutting-edge analytical methods is a core driver in MSI research for new scientific discoveries, medical diagnostics, and commercial innovation. However, the lack of means to share, apply, and reproduce analyses hinders the broad application, validation, and use of novel MSI analysis methods. To address this central challenge, we introduce the Berkeley Analysis and Storage Toolkit (BASTet), a novel framework for shareable and reproducible data analysis that supports standardized data and analysis interfaces, integrated data storage, data provenance, workflow management, and a broad set of integrated tools. Based on BASTet, we describe the extension of the OpenMSI mass spectrometry imaging science gateway to enable web-based sharing, reuse, analysis, and visualization of data analyses and derived data products. We demonstrate the application of BASTet and OpenMSI in practice to identify and compare characteristic substructures in the mouse brain based on their chemical composition measured via MSI.

  21. The Use of Life Cycle Tools to Support Decision Making for Sustainable Nanotechnologies

    EPA Science Inventory

    Nanotechnology is a broad-impact technology with applications ranging from materials and electronics to analytical methods and metrology. The many benefits that can be realized through the utilization of nanotechnology are intended to lead to an improved quality of life. However,...

  22. Instructional Software and Attention Disorders: A Tool for Teachers.

    ERIC Educational Resources Information Center

    Bice, Joe E.; And Others

    This handbook provides information on 31 software programs designed to instruct students with attention disorders in individual and group settings. The most successful applications of instructional software are identified, and six broad categories of instructional software are discussed. Twenty-one strategies for teaching students with attention…

  23. The soil health tool - theory and initial broad-scale application

    USDA-ARS's Scientific Manuscript database

    Soil health has traditionally been judged in terms of production; however, it recently has gained a wider focus with a global audience, as soil condition is becoming an environmental quality, human health, and political issue. A crucial initial step in evaluating soil health is properly assessing t...

  24. Optical trapping

    PubMed Central

    Neuman, Keir C.; Block, Steven M.

    2006-01-01

    Since their invention just over 20 years ago, optical traps have emerged as a powerful tool with broad-reaching applications in biology and physics. Capabilities have evolved from simple manipulation to the application of calibrated forces on—and the measurement of nanometer-level displacements of—optically trapped objects. We review progress in the development of optical trapping apparatus, including instrument design considerations, position detection schemes and calibration techniques, with an emphasis on recent advances. We conclude with a brief summary of innovative optical trapping configurations and applications. PMID:16878180

  25. Modular HPC I/O characterization with Darshan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Shane; Carns, Philip; Harms, Kevin

    2016-11-13

    Contemporary high-performance computing (HPC) applications encompass a broad range of distinct I/O strategies and are often executed on a number of different compute platforms in their lifetime. These large-scale HPC platforms employ increasingly complex I/O subsystems to provide a suitable level of I/O performance to applications. Tuning I/O workloads for such a system is nontrivial, and the results generally are not portable to other HPC systems. I/O profiling tools can help to address this challenge, but most existing tools only instrument specific components within the I/O subsystem, which provides a limited perspective on I/O performance. The increasing diversity of scientific applications and computing platforms calls for greater flexibility and scope in I/O characterization.
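
    Darshan instruments applications to record per-file I/O statistics. As a rough conceptual sketch only (not Darshan's actual API), the following Python wrapper tallies read/write calls and bytes moved, the kind of counters Darshan's POSIX module collects:

```python
# Conceptual illustration of application-level I/O characterization:
# wrap a file object and count operations and bytes, as an I/O profiler
# like Darshan does transparently for POSIX calls.
import os
import tempfile
from collections import Counter

class CountingFile:
    """Wrap a file object and tally read/write calls and bytes."""
    def __init__(self, f, stats):
        self._f, self._stats = f, stats
    def write(self, data):
        self._stats["write_calls"] += 1
        self._stats["bytes_written"] += len(data)
        return self._f.write(data)
    def read(self, n=-1):
        data = self._f.read(n)
        self._stats["read_calls"] += 1
        self._stats["bytes_read"] += len(data)
        return data
    def close(self):
        self._f.close()

stats = Counter()
path = os.path.join(tempfile.mkdtemp(), "out.dat")
f = CountingFile(open(path, "wb"), stats)
f.write(b"x" * 1024)
f.close()
f = CountingFile(open(path, "rb"), stats)
f.read()
f.close()
print(dict(stats))  # counts characterize the workload's I/O behavior
```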

  26. The quantitative theory of within-host viral evolution

    NASA Astrophysics Data System (ADS)

    Rouzine, Igor M.; Weinberger, Leor S.

    2013-01-01

    During the 1990s, a group of virologists and physicists began development of a quantitative theory to explain the rapid evolution of human immunodeficiency virus type 1 (HIV-1). This theory also proved to be instrumental in understanding the rapid emergence of drug resistance in patients. Over the past two decades, this theory expanded to account for a broad array of factors important to viral evolution and propelled development of a generalized theory applicable to a broad range of asexual and partly sexual populations with many evolving sites. Here, we discuss the conceptual and theoretical tools developed to calculate the speed and other parameters of evolution, with a particular focus on the concept of ‘clonal interference’ and its applications to untreated patients.

  27. Blurring the Lines: Leveraging Internet Technology for Successful Blending of Secondary/Post-Secondary Technical Education

    ERIC Educational Resources Information Center

    Ryan, Kenneth; Kopischke, Kevin

    2008-01-01

    The Remote Automation Management Platform (RAMP) is a real-time, interactive teaching tool which leverages common off-the-shelf internet technologies to provide high school learners extraordinary access to advanced technical education opportunities. This outreach paradigm is applicable to a broad range of advanced technical skills from automation…

  28. The Curriculum Workshop: A Place for Deliberative Inquiry and Teacher Professional Learning

    ERIC Educational Resources Information Center

    Hansen, Klaus-Henning

    2008-01-01

    In this article, the curriculum workshop (CW) is elaborated as an approach to professional learning, deliberation and inquiry. It offers a comprehensive framework for school-based deliberation and inquiry, is rooted in curriculum theory, promises a broad range of applications in teacher education and provides tools to assess the trustworthiness of…

  29. Theme-Based Tests: Teaching in Context

    ERIC Educational Resources Information Center

    Anderson, Gretchen L.; Heck, Marsha L.

    2005-01-01

    Theme-based tests provide an assessment tool that instructs as well as provides a single general context for a broad set of biochemical concepts. A single story line connects the questions on the tests and models applications of scientific principles and biochemical knowledge in an extended scenario. Theme-based tests are based on a set of…

  30. A review and evaluation of numerical tools for fractional calculus and fractional order controls

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Lu; Dehghan, Sina; Chen, YangQuan; Xue, Dingyü

    2017-06-01

    In recent years, as fractional calculus becomes more and more broadly used in research across different academic disciplines, there are increasing demands for numerical tools for the computation of fractional integration/differentiation and the simulation of fractional-order systems. Being asked from time to time which tool is suitable for a specific application, the authors decided to carry out this survey to present recapitulative information on the available tools in the literature, in the hope of benefiting researchers with different academic backgrounds. With this motivation, the present article collects the scattered tools into a dashboard view, briefly introduces their usage and algorithms, evaluates the accuracy, compares the performance, and provides informative comments for selection.
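
    Most tools of this kind implement discretizations such as the Grünwald–Letnikov definition. As a minimal sketch (not taken from any of the reviewed toolboxes), the approximation D^a f(t) ~ h**(-a) * sum_j w_j * f(t - j*h), with w_0 = 1 and w_j = w_{j-1} * (1 - (a + 1)/j), can be coded directly:

```python
# Grunwald-Letnikov numerical fractional derivative, first-order accurate.
# The binomial weights are generated by the standard recurrence.

def gl_derivative(f, t, alpha, h=1e-3):
    """Grunwald-Letnikov fractional derivative of f at t, order alpha."""
    n = int(t / h)
    w, acc = 1.0, f(t)                     # j = 0 term, w_0 = 1
    for j in range(1, n + 1):
        w *= 1.0 - (alpha + 1.0) / j       # w_j from w_{j-1}
        acc += w * f(t - j * h)
    return acc / h ** alpha

# Sanity checks: alpha = 1 recovers the ordinary derivative of f(t) = t,
# and alpha = 0.5 approaches Gamma(2)/Gamma(1.5) * sqrt(t) ~ 1.1284 at t = 1.
print(gl_derivative(lambda t: t, 1.0, 1.0))
print(gl_derivative(lambda t: t, 1.0, 0.5))
```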

  31. Photography applications

    USGS Publications Warehouse

    Cochran, Susan A.; Goodman, James A.; Purkis, Samuel J.; Phinn, Stuart R.

    2013-01-01

    Photographic imaging is the oldest form of remote sensing used in coral reef studies. This chapter briefly explores the history of photography from the 1850s to the present, and delves into its application for coral reef research. The investigation focuses on both photographs collected from low-altitude fixed-wing and rotary aircraft, and those collected from space by astronauts. Different types of classification and analysis techniques are discussed, and several case studies are presented as examples of the broad use of photographs as a tool in coral reef research.

  32. Simulation of Etching in Chlorine Discharges Using an Integrated Feature Evolution-Plasma Model

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    To better utilize its vast collection of heterogeneous resources that are geographically distributed across the United States, NASA is constructing a computational grid called the Information Power Grid (IPG). This paper describes various tools and techniques that we are developing to measure and improve the performance of a broad class of NASA applications when run on the IPG. In particular, we are investigating the areas of grid benchmarking, grid monitoring, user-level application scheduling, and decentralized system-level scheduling.

  33. Frontiers of two-dimensional correlation spectroscopy. Part 2. Perturbation methods, fields of applications, and types of analytical probes

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    Noteworthy experimental practices, which are advancing the frontiers of the field of two-dimensional (2D) correlation spectroscopy, are reviewed with a focus on the various perturbation methods currently practiced to induce spectral changes, pertinent examples of applications in various fields, and the types of analytical probes employed. The types of perturbation methods found in the published literature are very diverse, encompassing both dynamic and static effects. Although a sizable portion of publications report the use of dynamic perturbations, a much greater number of studies employ static effects, especially that of temperature. Fields of application covered by the literature are also very broad, ranging from fundamental research to practical applications in a number of physical, chemical, and biological systems, such as synthetic polymers, composites, and biomolecules. Aside from IR spectroscopy, which is the most commonly used tool, many other analytical probes are used in 2D correlation analysis. The ever-expanding trend in depth, breadth, and versatility of 2D correlation spectroscopy techniques and their broad applications all point to the robust and healthy state of the field.
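
    The core computation shared by these 2D correlation methods is Noda's synchronous spectrum: the covariance of the perturbation-induced spectral variations across channel pairs. A small numpy sketch with synthetic data (shapes and signals assumed for illustration only):

```python
# Synchronous 2D correlation spectrum: Phi = Y~.T @ Y~ / (m - 1),
# where Y~ is the mean-centered set of m spectra collected along the
# perturbation (e.g. temperature) axis.
import numpy as np

def synchronous_2d(spectra):
    """spectra: (m perturbation steps, n spectral channels) array."""
    dyn = spectra - spectra.mean(axis=0)   # "dynamic" spectra
    m = spectra.shape[0]
    return dyn.T @ dyn / (m - 1)

# Two channels responding in phase to the perturbation give a positive
# cross peak; an unresponsive channel gives none.
t = np.linspace(0, 1, 50)
spectra = np.stack([1.0 * t, 2.0 * t, 0 * t], axis=1)
phi = synchronous_2d(spectra)
print(phi[0, 1] > 0, abs(phi[0, 2]) < 1e-12)
```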

  34. Reliability analysis in the Office of Safety, Environmental, and Mission Assurance (OSEMA)

    NASA Astrophysics Data System (ADS)

    Kauffmann, Paul J.

    1994-12-01

    The technical personnel in the SEMA office are working to provide the highest degree of value-added activities to their support of the NASA Langley Research Center mission. Management perceives that reliability analysis tools and an understanding of a comprehensive systems approach to reliability will be a foundation of this change process. Since the office is involved in a broad range of activities supporting space mission projects and operating activities (such as wind tunnels and facilities), it was not clear what reliability tools the office should be familiar with and how these tools could serve as a flexible knowledge base for organizational growth. Interviews and discussions with the office personnel (both technicians and engineers) revealed that job responsibilities ranged from incoming inspection to component or system analysis to safety and risk. It was apparent that a broad base in applied probability and reliability along with tools for practical application was required by the office. A series of ten class sessions with a duration of two hours each was organized and scheduled. Hand-out materials were developed and practical examples based on the type of work performed by the office personnel were included. Topics covered were: Reliability Systems - a broad system oriented approach to reliability; Probability Distributions - discrete and continuous distributions; Sampling and Confidence Intervals - random sampling and sampling plans; Data Analysis and Estimation - Model selection and parameter estimates; and Reliability Tools - block diagrams, fault trees, event trees, FMEA. In the future, this information will be used to review and assess existing equipment and processes from a reliability system perspective. An analysis of incoming materials sampling plans was also completed. This study looked at the issues associated with Mil Std 105 and changes for a zero defect acceptance sampling plan.

  35. Reliability analysis in the Office of Safety, Environmental, and Mission Assurance (OSEMA)

    NASA Technical Reports Server (NTRS)

    Kauffmann, Paul J.

    1994-01-01

    The technical personnel in the SEMA office are working to provide the highest degree of value-added activities to their support of the NASA Langley Research Center mission. Management perceives that reliability analysis tools and an understanding of a comprehensive systems approach to reliability will be a foundation of this change process. Since the office is involved in a broad range of activities supporting space mission projects and operating activities (such as wind tunnels and facilities), it was not clear what reliability tools the office should be familiar with and how these tools could serve as a flexible knowledge base for organizational growth. Interviews and discussions with the office personnel (both technicians and engineers) revealed that job responsibilities ranged from incoming inspection to component or system analysis to safety and risk. It was apparent that a broad base in applied probability and reliability along with tools for practical application was required by the office. A series of ten class sessions with a duration of two hours each was organized and scheduled. Hand-out materials were developed and practical examples based on the type of work performed by the office personnel were included. Topics covered were: Reliability Systems - a broad system oriented approach to reliability; Probability Distributions - discrete and continuous distributions; Sampling and Confidence Intervals - random sampling and sampling plans; Data Analysis and Estimation - Model selection and parameter estimates; and Reliability Tools - block diagrams, fault trees, event trees, FMEA. In the future, this information will be used to review and assess existing equipment and processes from a reliability system perspective. An analysis of incoming materials sampling plans was also completed. This study looked at the issues associated with Mil Std 105 and changes for a zero defect acceptance sampling plan.
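
    The reliability block diagram arithmetic covered in the course sessions reduces to two rules: series components must all work, while redundant (parallel) components fail only if every one fails. A short sketch with purely hypothetical component reliabilities:

```python
# Reliability block diagram arithmetic:
#   series:   R = product of component reliabilities (all must work)
#   parallel: R = 1 - product of failure probabilities (one suffices)
from math import prod

def series(reliabilities):
    return prod(reliabilities)

def parallel(reliabilities):
    return 1.0 - prod(1.0 - r for r in reliabilities)

# Two redundant fans (each 0.90) feeding a drive of reliability 0.95;
# the numbers are made up for illustration.
system = series([parallel([0.90, 0.90]), 0.95])
print(round(system, 4))  # 0.9405
```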

  36. API REST Web service and backend system Of Lecturer’s Assessment Information System on Politeknik Negeri Bali

    NASA Astrophysics Data System (ADS)

    Manuaba, I. B. P.; Rudiastini, E.

    2018-01-01

    Assessment of lecturers is a tool used to measure lecturer performance. Lecturer assessment variables can be measured from three aspects: teaching activities, research, and community service. The broad range of aspects used to measure lecturer performance requires a special framework, so that the system can be developed in a sustainable manner. The goal of this research is to create an API web service for the assessment data, so that the lecturer assessment system can be developed in various frameworks. The system was developed as a web service in the PHP programming language, with JSON as the output data format. The conclusion of this research is that the API web service can be consumed by applications developed on several platforms, such as web and mobile applications.
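
    As a hypothetical illustration (in Python rather than the paper's PHP, with invented field names and route), a REST resource for such a system might aggregate the three assessment aspects into one JSON document, e.g. served at GET /lecturers/<id>/assessment:

```python
# Sketch of the JSON payload such an API endpoint might return; the
# fields, weights, and route are assumptions, not the paper's schema.
import json

def assessment_payload(lecturer_id, teaching, research, service):
    """Aggregate the three assessment aspects into one JSON document."""
    scores = {"teaching": teaching, "research": research,
              "community_service": service}
    return json.dumps({
        "lecturer_id": lecturer_id,
        "scores": scores,
        "overall": round(sum(scores.values()) / len(scores), 2),
    })

doc = assessment_payload("L-042", teaching=3.4, research=3.8, service=3.0)
print(doc)
```

    Because the resource is plain JSON over HTTP, web and mobile clients can consume it independently of the backend framework, which is the portability argument the abstract makes.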

  37. A putative mesenchymal stem cells population isolated from adult human testes.

    PubMed

    Gonzalez, R; Griparic, L; Vargas, V; Burgee, K; Santacruz, P; Anderson, R; Schiewe, M; Silva, F; Patel, A

    2009-08-07

    Mesenchymal stem cells (MSCs) isolated from several adult human tissues are reported to be a promising tool for regenerative medicine. In order to broaden the array of tools for therapeutic application, we isolated a new population of cells from adult human testis termed gonadal stem cells (GSCs). GSCs express CD105, CD166, CD73, CD90, STRO-1 and lack hematopoietic markers CD34, CD45, and HLA-DR which are characteristic identifiers of MSCs. In addition, GSCs express pluripotent markers Oct4, Nanog, and SSEA-4. GSCs propagated for at least 64 population doublings and exhibited clonogenic capability. GSCs have a broad plasticity and the potential to differentiate into adipogenic, osteogenic, and chondrogenic cells. These studies demonstrate that GSCs are easily obtainable stem cells, have growth kinetics and marker expression similar to MSCs, and differentiate into mesodermal lineage cells. Therefore, GSCs may be a valuable tool for therapeutic applications.

  18. CRISPR-Cas9; an efficient tool for precise plant genome editing.

    PubMed

    Islam, Waqar

    2018-06-01

    Efficient plant genome editing depends on the induction of double-stranded DNA breaks (DSBs) by site-specific nucleases. These DSBs initiate DNA repair, which can proceed either by homologous recombination (HR) or by non-homologous end joining (NHEJ). Recently, the CRISPR-Cas9 mechanism has been highlighted as a revolutionary genetic tool owing to its simple framework and its broad range of adaptability and applications. In this review, I sum up the application of this biotechnological tool in plant genome editing. Furthermore, I explain the successful adaptation of CRISPR in various plant species, where it has been used to generate stable mutations through NHEJ in a steadily growing number of species. The review also sheds light on other biotechnological approaches relying on the induction of single DNA lesions, such as genomic deletion or paired nickases for the avoidance of off-target effects. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Incorporating unnatural amino acids to engineer biocatalysts for industrial bioprocess applications.

    PubMed

    Ravikumar, Yuvaraj; Nadarajan, Saravanan Prabhu; Hyeon Yoo, Tae; Lee, Chong-Soon; Yun, Hyungdon

    2015-12-01

    Bioprocess engineering with biocatalysts broadly spans the development of enzymes and their actual application in an industrial context. Recently, both bioprocess engineering and the development and use of enzyme engineering techniques have been growing rapidly. Importantly, engineering techniques that incorporate unnatural amino acids (UAAs) in vivo have begun to produce enzymes with greater stability and altered catalytic properties. Despite the growth of this technique, its potential value in bioprocess applications remains to be fully exploited. In this review, we explore the methodologies involved in UAA incorporation as well as ways to synthesize these UAAs. In addition, we summarize recent efforts to increase the yield of UAA-engineered proteins in Escherichia coli and the application of this tool in enzyme engineering. Furthermore, this protein engineering tool based on the incorporation of UAAs can be used to develop immobilized enzymes that are ideal for bioprocess applications. Considering the potential of this tool, and by exploiting these engineered enzymes, we expect the field of bioprocess engineering to open up new opportunities for biocatalysis in the near future. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Spectral detection of near-surface moisture content and water-table position in northern peatland ecosystems

    Treesearch

    Karl M. Meingast; Michael J. Falkowski; Evan S. Kane; Lynette R. Potvin; Brian W. Benscoter; Alistair M.S. Smith; Laura L. Bourgeau-Chavez; Mary Ellen Miller

    2014-01-01

    Wildland fire occurrence has been increasing in peatland ecosystems during recent decades. As such, there is a need for broadly applicable tools to detect and monitor controls on combustion such as surface peat moisture and water-table position. A field portable spectroradiometer was used to measure surface reflectance of two Sphagnum moss-dominated...

  1. Cooperative problem solving with personal mobile information tools in hospitals.

    PubMed

    Buchauer, A; Werner, R; Haux, R

    1998-01-01

    Health-care professionals have a broad range of needs for information and cooperation while working at different points of care (e.g., outpatient departments, wards, and functional units such as operating theaters). Patient-related data and medical knowledge have to be widely available to support high-quality patient care. Furthermore, due to the increased specialization of health-care professionals, efficient collaboration is required. Personal mobile information tools have considerable potential to realize almost ubiquitous information and collaborative support. They make it possible to unite the functionality of conventional tools such as paper forms, dictating machines, and pagers into one tool. Moreover, they can extend the support already provided by clinical workstations. An approach is described for the integration of mobile information tools with heterogeneous hospital information systems. This approach includes identification of functions which should be provided on mobile tools. Major functions are the presentation of medical records and reports, electronic mailing to support interpersonal communication, and the provision of editors for structured clinical documentation. To realize those functions on mobile tools, we propose a document-based client-server architecture that enables mobile information tools to interoperate with existing computer-based application systems. Open application systems and powerful, partially wireless, hospital-wide networks are the prerequisites for the introduction of mobile information tools.

  2. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
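    The report's abstract does not show CREATE-SCHEMA's template format, but the core idea of generating the SQL commands that define the database can be sketched in a few lines; the table and column names below are hypothetical:

```python
# Minimal sketch in the spirit of CREATE-SCHEMA: emit the SQL that creates
# a table from a declarative description of its columns.
def create_table_sql(table, columns):
    """Render a CREATE TABLE statement from (name, type) pairs."""
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

sql = create_table_sql("sample_tracking",
                       [("sample_id", "INTEGER PRIMARY KEY"),
                        ("received", "DATE"),
                        ("analysis", "VARCHAR(40)")])
print(sql)
```

    A real tool like the one described would also emit the matching FORTRAN declarations and precompiled SQL calls from the same description, keeping schema and interface code in sync.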

  3. National Aeronautics and Space Administration (NASA) Earth Science Research for Energy Management. Part 1; Overview of Energy Issues and an Assessment of the Potential for Application of NASA Earth Science Research

    NASA Technical Reports Server (NTRS)

    Zell, E.; Engel-Cox, J.

    2005-01-01

    Effective management of energy resources is critical for the U.S. economy, the environment, and, more broadly, for sustainable development and alleviating poverty worldwide. The scope of energy management is broad, ranging from energy production and end use to emissions monitoring and mitigation and long-term planning. Given the extensive NASA Earth science research on energy and related weather and climate-related parameters, and rapidly advancing energy technologies and applications, there is great potential for increased application of NASA Earth science research to selected energy management issues and decision support tools. The NASA Energy Management Program Element is already involved in a number of projects applying NASA Earth science research to energy management issues, with a focus on solar and wind renewable energy and developing interests in energy modeling, short-term load forecasting, energy efficient building design, and biomass production.

  4. The development and validation of a meta-tool for quality appraisal of public health evidence: Meta Quality Appraisal Tool (MetaQAT).

    PubMed

    Rosella, L; Bowman, C; Pach, B; Morgan, S; Fitzpatrick, T; Goel, V

    2016-07-01

    Most quality appraisal tools were developed for clinical medicine and tend to be study-specific with a strong emphasis on risk of bias. In order to be more relevant to public health, an appropriate quality appraisal tool needs to be less reliant on the evidence hierarchy and consider practice applicability. Given the broad range of study designs used in public health, the objective of this study was to develop and validate a meta-tool that combines public health-focused principles of appraisal coupled with a set of design-specific companion tools. Several design methods were used to develop and validate the tool including literature review, synthesis, and validation with a reference standard. A search of critical appraisal tools relevant to public health was conducted; core concepts were collated. The resulting framework was piloted during three feedback sessions with public health practitioners. Following subsequent revisions, the final meta-tool, the Meta Quality Appraisal Tool (MetaQAT), was then validated through a content analysis of appraisals conducted by two groups of experienced public health researchers (MetaQAT vs generic appraisal form). The MetaQAT framework consists of four domains: relevancy, reliability, validity, and applicability. In addition, a companion tool was assembled from existing critical appraisal tools to provide study design-specific guidance on validity appraisal. Content analysis showed similar methodological and generalizability concerns were raised by both groups; however, the MetaQAT appraisers commented more extensively on applicability to public health practice. Critical appraisal tools designed for clinical medicine have limitations for use in the context of public health. The meta-tool structure of the MetaQAT allows for rigorous appraisal, while allowing users to simultaneously appraise the multitude of study designs relevant to public health research and assess non-standard domains, such as applicability. 
Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  5. Device and method for generating a beam of acoustic energy from a borehole, and applications thereof

    DOEpatents

    Vu, Cung Khac; Sinha, Dipen N; Pantea, Cristian; Nihei, Kurt T; Schmitt, Denis P; Skelt, Christopher

    2013-10-01

    In some aspects of the invention, a method of generating a beam of acoustic energy in a borehole is disclosed. The method includes generating a first broad-band acoustic pulse at a first broad-band frequency range having a first central frequency and a first bandwidth spread; generating a second broad-band acoustic pulse at a second broad-band frequency range different than the first frequency range having a second central frequency and a second bandwidth spread, wherein the first acoustic pulse and second acoustic pulse are generated by at least one transducer arranged on a tool located within the borehole; and transmitting the first and the second broad-band acoustic pulses into an acoustically non-linear medium, wherein the composition of the non-linear medium produces a collimated pulse by a non-linear mixing of the first and second acoustic pulses, wherein the collimated pulse has a frequency equal to the difference in frequencies between the first central frequency and the second central frequency and a bandwidth spread equal to the sum of the first bandwidth spread and the second bandwidth spread.
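    The mixing rule stated in the claim reduces to simple arithmetic: the collimated pulse takes the difference of the two central frequencies and the sum of the two bandwidth spreads. A quick check with illustrative numbers (not taken from the patent):

```python
# Difference-frequency mixing rule from the claim: the collimated pulse has
# frequency |f1 - f2| and bandwidth spread (b1 + b2). Values are illustrative.
def difference_pulse(f1, b1, f2, b2):
    """Return (central frequency, bandwidth spread) of the mixed pulse, in Hz."""
    return abs(f1 - f2), b1 + b2

f_c, bw = difference_pulse(f1=1.00e6, b1=0.10e6, f2=0.90e6, b2=0.08e6)
print(f_c, bw)  # 100 kHz difference frequency, 180 kHz bandwidth spread
```

    The practical point is that two high-frequency transducer pulses can synthesize a much lower-frequency beam than the transducers could radiate directly.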

  6. Localized Overheating Phenomena and Optimization of Spark-Plasma Sintering Tooling Design

    PubMed Central

    Giuntini, Diletta; Olevsky, Eugene A.; Garcia-Cardona, Cristina; Maximenko, Andrey L.; Yurlova, Maria S.; Haines, Christopher D.; Martin, Darold G.; Kapoor, Deepak

    2013-01-01

    The present paper shows the application of a three-dimensional coupled electrical, thermal, mechanical finite element macro-scale modeling framework of Spark Plasma Sintering (SPS) to an actual problem of SPS tooling overheating, encountered during SPS experimentation. The overheating phenomenon is analyzed by varying the geometry of the tooling that exhibits the problem, namely by modeling various tooling configurations involving sequences of disk-shape spacers with step-wise increasing radii. The analysis is conducted by means of finite element simulations, intended to obtain temperature spatial distributions in the graphite press-forms, including punches, dies, and spacers; to identify the temperature peaks and their respective timing; and to propose a more suitable SPS tooling configuration with the avoidance of overheating as the final aim. Electric current-based Joule heating, heat transfer, mechanical conditions, and densification are embedded in the model, utilizing the finite-element software COMSOL™, which is distinguished by its ability to couple multiple physics. Thereby the implementation of a finite element method applicable to a broad range of SPS procedures is carried out, together with the more specific optimization of the SPS tooling design when dealing with excessive heating phenomena. PMID:28811398

  7. Femtosecond Lasers and Corneal Surgical Procedures.

    PubMed

    Marino, Gustavo K; Santhiago, Marcony R; Wilson, Steven E

    2017-01-01

    Our purpose is to present a broad review about the principles, early history, evolution, applications, and complications of femtosecond lasers used in refractive and nonrefractive corneal surgical procedures. Femtosecond laser technology added not only safety, precision, and reproducibility to established corneal surgical procedures such as laser in situ keratomileusis (LASIK) and astigmatic keratotomy, but it also introduced new promising concepts such as the intrastromal lenticule procedures with refractive lenticule extraction (ReLEx). Over time, the refinements in laser optics and the overall design of femtosecond laser platforms led to it becoming an essential tool for corneal surgeons. In conclusion, femtosecond laser is a heavily utilized tool in refractive and nonrefractive corneal surgical procedures, and further technological advances are likely to expand its applications. Copyright 2017 Asia-Pacific Academy of Ophthalmology.

  8. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    PubMed

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  9. Application of CFD in aeronautics at NASA Ames Research Center

    NASA Astrophysics Data System (ADS)

    Maksymiuk, Catherine M.; Enomoto, Francis Y.; Vandalsem, William R.

    1995-03-01

    The role of Computational Fluid Dynamics (CFD) at Ames Research Center has expanded to address a broad range of aeronautical problems, including wind tunnel support, flight test support, design, and analysis. Balancing the requirements of each new problem against the available resources - software, hardware, time, and expertise - is critical to the effective use of CFD. Several case studies of recent applications highlight the depth of CFD capability at Ames, the tradeoffs involved in various approaches, and lessons learned in the use of CFD as an engineering tool.

  10. Scalable data management, analysis and visualization (SDAV) Institute. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk

    The purpose of the SDAV institute is to provide tools and expertise in scientific data management, analysis, and visualization to DOE’s application scientists. Our goal is to actively work with application teams to assist them in achieving breakthrough science, and to provide technical solutions in the data management, analysis, and visualization regimes that are broadly used by the computational science community. Over the last 5 years members of our institute worked directly with application scientists and DOE leadership-class facilities to assist them by applying the best tools and technologies at our disposal. We also enhanced our tools based on input from scientists on their needs. Many of the applications we have been working with are based on connections with scientists established in previous years. However, we contacted additional scientists through our outreach activities, as well as engaging application teams running on leading DOE computing systems. Our approach is to employ an evolutionary development and deployment process: first considering the application of existing tools, followed by the customization necessary for each particular application, and then the deployment in real frameworks and infrastructures. The institute is organized into three areas, each with area leaders, who keep track of progress, engagement of application scientists, and results. The areas are: (1) Data Management, (2) Data Analysis, and (3) Visualization. Kitware has been involved in the Visualization area. This report covers Kitware’s contributions over the last 5 years (February 2012 – February 2017). For details on the work performed by the SDAV institute as a whole, please see the SDAV final report.

  11. Software automation tools for increased throughput metabolic soft-spot identification in early drug discovery.

    PubMed

    Zelesky, Veronica; Schneider, Richard; Janiszewski, John; Zamora, Ismael; Ferguson, James; Troutman, Matthew

    2013-05-01

    The ability to supplement high-throughput metabolic clearance data with structural information defining the site of metabolism should allow design teams to streamline their synthetic decisions. However, broad application of metabolite identification in early drug discovery has been limited, largely due to the time required for data review and structural assignment. The advent of mass defect filtering and its application toward metabolite scouting paved the way for the development of software automation tools capable of rapidly identifying drug-related material in complex biological matrices. Two semi-automated commercial software applications, MetabolitePilot™ and Mass-MetaSite™, were evaluated to assess the relative speed and accuracy of structural assignments using data generated on a high-resolution MS platform. Review of these applications has demonstrated their utility in providing accurate results in a time-efficient manner, leading to acceleration of metabolite identification initiatives while highlighting the continued need for biotransformation expertise in the interpretation of more complex metabolic reactions.
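    Mass defect filtering, which the abstract credits with paving the way for these automation tools, keeps only ions whose mass defect (the fractional part of m/z) lies near that of the parent drug, since common biotransformations shift the defect only slightly. A minimal sketch with illustrative values; the 40 mDa window is an assumption:

```python
# Minimal mass defect filter: retain ions whose mass defect falls within a
# fixed window of the parent drug's defect. All m/z values are illustrative.
def mass_defect(mz):
    """Fractional part of an m/z value."""
    return mz - int(mz)

def mass_defect_filter(ions, parent_mz, window=0.040):
    """Keep ions within `window` Da of the parent's mass defect."""
    target = mass_defect(parent_mz)
    return [mz for mz in ions if abs(mass_defect(mz) - target) <= window]

ions = [310.1547, 326.1496, 451.9012, 342.1445]   # matrix + drug-related ions
hits = mass_defect_filter(ions, parent_mz=310.1547)
print(hits)
```

    In this toy example the 451.9012 matrix ion is rejected because its defect (0.9012) is far from the parent's (0.1547), while the hydroxylation- and oxidation-like shifts pass.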

  12. Model-based reasoning in the physics laboratory: Framework and initial results

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  13. Research-tool patents: issues for health in the developing world.

    PubMed Central

    Barton, John H.

    2002-01-01

    The patent system is now reaching into the tools of medical research, including gene sequences themselves. Many of the new patents can potentially preempt large areas of medical research and lay down legal barriers to the development of a broad category of products. Researchers must therefore consider redesigning their research to avoid use of patented techniques, or expending the effort to obtain licences from those who hold the patents. Even if total licence fees can be kept low, there are enormous negotiation costs, and one "hold-out" may be enough to lead to project cancellation. This is making it more difficult to conduct research within the developed world, and poses important questions for the future of medical research for the benefit of the developing world. Probably the most important implication for health in the developing world is the possible general slowing down and complication of medical research. To the extent that these patents do slow down research, they weaken the contribution of the global research community to the creation and application of medical technology for the benefit of developing nations. The patents may also complicate the granting of concessional prices to developing nations - for pharmaceutical firms that seek to offer a concessional price may have to negotiate arrangements with research-tool firms, which may lose royalties as a result. Three kinds of response are plausible. One is to develop a broad or global licence to permit the patented technologies to be used for important applications in the developing world. The second is to change technical patent law doctrines. Such changes could be implemented in developed and developing nations and could be quite helpful while remaining consistent with TRIPS. The third is to negotiate specific licence arrangements, under which specific research tools are used on an agreed basis for specific applications. 
These negotiations are difficult and expensive, requiring both scientific and legal skills. But they will be an unavoidable part of international medical research. PMID:11953790

  14. Applications of yeast surface display for protein engineering

    PubMed Central

    Cherf, Gerald M.; Cochran, Jennifer R.

    2015-01-01

    The method of displaying recombinant proteins on the surface of Saccharomyces cerevisiae via genetic fusion to an abundant cell wall protein, a technology known as yeast surface display, or simply, yeast display, has become a valuable protein engineering tool for a broad spectrum of biotechnology and biomedical applications. This review focuses on the use of yeast display for engineering protein affinity, stability, and enzymatic activity. Strategies and examples for each protein engineering goal are discussed. Additional applications of yeast display are also briefly presented, including protein epitope mapping, identification of protein-protein interactions, and uses of displayed proteins in industry and medicine. PMID:26060074

  15. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.

  16. Effects of Field and Job Oriented Technical Retraining on Manpower Utilization of the Unemployed. Vocational-Industrial Education Research Report. Final Report.

    ERIC Educational Resources Information Center

    Bjorkquist, David C.

    A job-oriented program emphasizing application to the specific occupation of tool design was compared with a field-oriented program intended to give a broad basic preparation for a variety of jobs in the field of mechanical technology. Both programs were conducted under the Manpower Development and Training Act (MDTA) for a period of 52 weeks.…

  17. Micro-ultrasound for preclinical imaging

    PubMed Central

    Foster, F. Stuart; Hossack, John; Adamson, S. Lee

    2011-01-01

    Over the past decade, non-invasive preclinical imaging has emerged as an important tool to facilitate biomedical discovery. Not only have the markets for these tools accelerated, but the numbers of peer-reviewed papers in which imaging end points and biomarkers have been used have grown dramatically. High frequency ‘micro-ultrasound’ has steadily evolved in the post-genomic era as a rapid, comparatively inexpensive imaging tool for studying normal development and models of human disease in small animals. One of the fundamental barriers to this development was the technological hurdle associated with high-frequency array transducers. Recently, new approaches have enabled the upper limits of linear and phased arrays to be pushed from about 20 to over 50 MHz enabling a broad range of new applications. The innovations leading to the new transducer technology and scanner architecture are reviewed. Applications of preclinical micro-ultrasound are explored for developmental biology, cancer, and cardiovascular disease. With respect to the future, the latest developments in high-frequency ultrasound imaging are described. PMID:22866232

  18. CEOS visualization environment (COVE) tool for intercalibration of satellite instruments

    USGS Publications Warehouse

    Kessler, P.D.; Killough, B.D.; Gowda, S.; Williams, B.R.; Chander, G.; Qu, Min

    2013-01-01

    Increasingly, data from multiple instruments are used to gain a more complete understanding of land surface processes at a variety of scales. Intercalibration, comparison, and coordination of satellite instrument coverage areas is a critical effort of international and domestic space agencies and organizations. The Committee on Earth Observation Satellites Visualization Environment (COVE) is a suite of browser-based applications that leverage Google Earth to display past, present, and future satellite instrument coverage areas and coincident calibration opportunities. This forecasting and ground coverage analysis and visualization capability greatly benefits the remote sensing calibration community in preparation for multisatellite ground calibration campaigns or individual satellite calibration studies. COVE has been developed for use by a broad international community to improve the efficiency and efficacy of such calibration planning efforts, whether those efforts require past, present, or future predictions. This paper provides a brief overview of the COVE tool, its validation, accuracies, and limitations with emphasis on the applicability of this visualization tool for supporting ground field campaigns and intercalibration of satellite instruments.

  19. CEOS Visualization Environment (COVE) Tool for Intercalibration of Satellite Instruments

    NASA Technical Reports Server (NTRS)

    Kessler, Paul D.; Killough, Brian D.; Gowda, Sanjay; Williams, Brian R.; Chander, Gyanesh; Qu, Min

    2013-01-01

    Increasingly, data from multiple instruments are used to gain a more complete understanding of land surface processes at a variety of scales. Intercalibration, comparison, and coordination of satellite instrument coverage areas is a critical effort of space agencies and of international and domestic organizations. The Committee on Earth Observation Satellites Visualization Environment (COVE) is a suite of browser-based applications that leverage Google Earth to display past, present, and future satellite instrument coverage areas and coincident calibration opportunities. This forecasting and ground coverage analysis and visualization capability greatly benefits the remote sensing calibration community in preparation for multisatellite ground calibration campaigns or individual satellite calibration studies. COVE has been developed for use by a broad international community to improve the efficiency and efficacy of such calibration efforts. This paper provides a brief overview of the COVE tool, its validation, accuracies and limitations with emphasis on the applicability of this visualization tool for supporting ground field campaigns and intercalibration of satellite instruments.

  20. Analyzing Human-Landscape Interactions: Tools That Integrate

    NASA Astrophysics Data System (ADS)

    Zvoleff, Alex; An, Li

    2014-01-01

    Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. 
Further development of new approaches to data fusion and integration across sites and disciplines poses an important challenge for future work in integrating human and landscape components.
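
To make the simulation category above concrete, here is a minimal cellular-automaton sketch of land-use conversion spreading from farmed cells. The grid size, the two states, and the transition probabilities are illustrative assumptions, not values from the paper; real land-use models calibrate such rules against observed data.

```python
import random

random.seed(42)

SIZE = 20
FOREST, FARM = 0, 1  # two hypothetical land-use states

def neighbors(grid, r, c):
    """Count farmed cells among the 8 neighbors of (r, c), wrapping at edges."""
    total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            total += grid[(r + dr) % SIZE][(c + dc) % SIZE]
    return total

def step(grid, base_p=0.01, neighbor_p=0.05):
    """One time step: a forest cell's conversion probability rises with farmed neighbors."""
    new = [row[:] for row in grid]
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] == FOREST:
                p = base_p + neighbor_p * neighbors(grid, r, c)
                if random.random() < min(p, 1.0):
                    new[r][c] = FARM
    return new

grid = [[FOREST] * SIZE for _ in range(SIZE)]
grid[SIZE // 2][SIZE // 2] = FARM  # a single farmed seed cell
for _ in range(10):
    grid = step(grid)
farmed = sum(map(sum, grid))
print(f"farmed cells after 10 steps: {farmed} of {SIZE * SIZE}")
```

Agent-based models extend this idea by replacing the uniform transition rule with heterogeneous, decision-making agents.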

  1. Printing Technologies for Medical Applications.

    PubMed

    Shafiee, Ashkan; Atala, Anthony

    2016-03-01

Over the past 15 years, printers have been increasingly utilized for biomedical applications in various areas of medicine and tissue engineering. This review discusses the current and future applications of 3D bioprinting. Several 3D printing tools with broad applications, from surgical planning to 3D models such as liver replicas and intermediate splints, are being created. Numerous researchers are exploring this technique to pattern cells or fabricate several different tissues and organs, such as blood vessels or cardiac patches. Current investigations in bioprinting applications are yielding further advances. As one of the fastest-growing areas of industry, 3D additive manufacturing will change techniques across biomedical applications, from research and testing models to surgical planning, device manufacturing, and tissue or organ replacement. Copyright © 2016. Published by Elsevier Ltd.

  2. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.
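
The analysis technique described above takes statistical distributions of stochastic energy inputs and outputs plus device characteristics (efficiency and ratings) and evaluates energy-management outcomes. A minimal Monte Carlo sketch in that spirit is shown below; the array output, load distribution, battery capacity, and efficiencies are all hypothetical placeholders, not values from the report.

```python
import random

random.seed(1)

BATTERY_KWH = 10.0   # assumed rated storage capacity
CHARGE_EFF = 0.9     # assumed charge-leg efficiency
DISCHARGE_EFF = 0.9  # assumed discharge-leg efficiency

def simulate_day(soc):
    """Advance battery state of charge through 24 hourly energy balances."""
    unmet = 0.0
    for hour in range(24):
        # crude stochastic insolation: production during daylight hours only
        pv = max(0.0, random.gauss(1.5, 0.5)) if 6 <= hour < 18 else 0.0
        load = max(0.1, random.gauss(0.6, 0.2))  # stochastic demand, kWh
        net = pv - load
        if net >= 0:
            soc = min(BATTERY_KWH, soc + net * CHARGE_EFF)  # charge, clip at rating
        else:
            need = -net / DISCHARGE_EFF  # energy the battery must supply
            drawn = min(soc, need)
            soc -= drawn
            unmet += (need - drawn) * DISCHARGE_EFF  # load not served
    return soc, unmet

soc, total_unmet = BATTERY_KWH / 2, 0.0
for day in range(365):
    soc, day_unmet = simulate_day(soc)
    total_unmet += day_unmet
print(f"unserved load over a year: {total_unmet:.1f} kWh")
```

Comparing the unserved-load statistic across candidate control algorithms (e.g., load deferral rules) is the kind of relative-performance question the technique is meant to answer.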

  3. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  4. Collaborative medical informatics research using the Internet and the World Wide Web.

    PubMed Central

    Shortliffe, E. H.; Barnett, G. O.; Cimino, J. J.; Greenes, R. A.; Huff, S. M.; Patel, V. L.

    1996-01-01

    The InterMed Collaboratory is an interdisciplinary project involving six participating medical institutions. There are two broad mandates for the effort. The first is to further the development, sharing, and demonstration of numerous software and system components, data sets, procedures and tools that will facilitate the collaborations and support the application goals of these projects. The second is to provide a distributed suite of clinical applications, guidelines, and knowledge-bases for clinical, educational, and administrative purposes. To define the interactions among the components, datasets, procedures, and tools that we are producing and sharing, we have identified a model composed of seven tiers, each of which supports the levels above it. In this paper we briefly describe those tiers and the nature of the collaborative process with which we have experimented. PMID:8947641

  5. Understanding and managing fish populations: keeping the toolbox fit for purpose.

    PubMed

    Paris, J R; Sherman, K D; Bell, E; Boulenger, C; Delord, C; El-Mahdi, M B M; Fairfield, E A; Griffiths, A M; Gutmann Roberts, C; Hedger, R D; Holman, L E; Hooper, L H; Humphries, N E; Katsiadaki, I; King, R A; Lemopoulos, A; Payne, C J; Peirson, G; Richter, K K; Taylor, M I; Trueman, C N; Hayden, B; Stevens, J R

    2018-03-01

    Wild fish populations are currently experiencing unprecedented pressures, which are projected to intensify in the coming decades. Developing a thorough understanding of the influences of both biotic and abiotic factors on fish populations is a salient issue in contemporary fish conservation and management. During the 50th Anniversary Symposium of The Fisheries Society of the British Isles at the University of Exeter, UK, in July 2017, scientists from diverse research backgrounds gathered to discuss key topics under the broad umbrella of 'Understanding Fish Populations'. Below, the output of one such discussion group is detailed, focusing on tools used to investigate natural fish populations. Five main groups of approaches were identified: tagging and telemetry; molecular tools; survey tools; statistical and modelling tools; tissue analyses. The appraisal covered current challenges and potential solutions for each of these topics. In addition, three key themes were identified as applicable across all tool-based applications. These included data management, public engagement, and fisheries policy and governance. The continued innovation of tools and capacity to integrate interdisciplinary approaches into the future assessment and management of fish populations is highlighted as an important focus for the next 50 years of fisheries research. © 2018 The Fisheries Society of the British Isles.

  6. Helping solve Georgia's water problems - the USGS Cooperative Water Program

    USGS Publications Warehouse

    Clarke, John S.

    2006-01-01

    The U.S. Geological Survey (USGS) addresses a wide variety of water issues in the State of Georgia through the Cooperative Water Program (CWP). As the primary Federal science agency for water-resource information, the USGS monitors the quantity and quality of water in the Nation's rivers and aquifers, assesses the sources and fate of contaminants in aquatic systems, collects and analyzes data on aquatic ecosystems, develops tools to improve the application of hydrologic information, and ensures that its information and tools are available to all potential users. This broad, diverse mission cannot be accomplished effectively without the contributions of the CWP.

  7. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    PubMed Central

    Grüning, Björn A.; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu). PMID:24109552

  8. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Casto, Gordon V.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; hide

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a toolbox format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  9. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; Hetherington, Samuel E.; hide

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a "toolbox" format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  10. iVirus: facilitating new insights in viral ecology with software and community data sets imbedded in a cyberinfrastructure.

    PubMed

    Bolduc, Benjamin; Youens-Clark, Ken; Roux, Simon; Hurwitz, Bonnie L; Sullivan, Matthew B

    2017-01-01

    Microbes affect nutrient and energy transformations throughout the world's ecosystems, yet they do so under viral constraints. In complex communities, viral metagenome (virome) sequencing is transforming our ability to quantify viral diversity and impacts. Although some bottlenecks, for example, few reference genomes and nonquantitative viromics, have been overcome, the void of centralized data sets and specialized tools now prevents viromics from being broadly applied to answer fundamental ecological questions. Here we present iVirus, a community resource that leverages the CyVerse cyberinfrastructure to provide access to viromic tools and data sets. The iVirus Data Commons contains both raw and processed data from 1866 samples and 73 projects derived from global ocean expeditions, as well as existing and legacy public repositories. Through the CyVerse Discovery Environment, users can interrogate these data sets using existing analytical tools (software applications known as 'Apps') for assembly, open reading frame prediction and annotation, as well as several new Apps specifically developed for analyzing viromes. Because Apps are web based and powered by CyVerse supercomputing resources, they enable scalable analyses for a broad user base. Finally, a use-case scenario documents how to apply these advances toward new data. This growing iVirus resource should help researchers utilize viromics as yet another tool to elucidate viral roles in nature.

  11. iVirus: facilitating new insights in viral ecology with software and community data sets imbedded in a cyberinfrastructure

    PubMed Central

    Bolduc, Benjamin; Youens-Clark, Ken; Roux, Simon; Hurwitz, Bonnie L; Sullivan, Matthew B

    2017-01-01

    Microbes affect nutrient and energy transformations throughout the world's ecosystems, yet they do so under viral constraints. In complex communities, viral metagenome (virome) sequencing is transforming our ability to quantify viral diversity and impacts. Although some bottlenecks, for example, few reference genomes and nonquantitative viromics, have been overcome, the void of centralized data sets and specialized tools now prevents viromics from being broadly applied to answer fundamental ecological questions. Here we present iVirus, a community resource that leverages the CyVerse cyberinfrastructure to provide access to viromic tools and data sets. The iVirus Data Commons contains both raw and processed data from 1866 samples and 73 projects derived from global ocean expeditions, as well as existing and legacy public repositories. Through the CyVerse Discovery Environment, users can interrogate these data sets using existing analytical tools (software applications known as ‘Apps') for assembly, open reading frame prediction and annotation, as well as several new Apps specifically developed for analyzing viromes. Because Apps are web based and powered by CyVerse supercomputing resources, they enable scalable analyses for a broad user base. Finally, a use-case scenario documents how to apply these advances toward new data. This growing iVirus resource should help researchers utilize viromics as yet another tool to elucidate viral roles in nature. PMID:27420028

  12. The Hyperwall

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A. (Technical Monitor); Sandstrom, Timothy A.; Henze, Chris; Levit, Creon

    2003-01-01

This paper presents the hyperwall, a visualization cluster that uses coordinated visualizations for interactive exploration of multidimensional data and simulations. The system strongly leverages the human eye-brain system with a generous 7x7 array of flat-panel LCD screens powered by a Beowulf cluster. With each screen backed by a workstation-class PC, graphics- and compute-intensive applications can be applied to a broad range of data. Navigational tools are presented that allow for investigation of high-dimensional spaces.

  13. Towards a theoretical clarification of biomimetics using conceptual tools from engineering design.

    PubMed

    Drack, M; Limpinsel, M; de Bruyn, G; Nebelsick, J H; Betz, O

    2017-12-13

Many successful examples of biomimetic products are available, and most research efforts in this emerging field are directed towards the development of specific applications. The theoretical and conceptual underpinnings of the knowledge transfer between biologists, engineers and architects are, however, poorly investigated. The present article addresses this gap. We use a 'technomorphic' approach, i.e. the application of conceptual tools derived from engineering design, to better understand the processes operating during a typical biomimetic research project. This helps to elucidate the formal connections between functions, working principles and constructions (in a broad sense), because the 'form-function relationship' is a recurring issue in both biology and engineering. The presented schema also serves as a conceptual framework that can be implemented for future biomimetic projects. The concepts of 'function' and 'working principle' are identified as the core elements in the biomimetic knowledge transfer towards applications. This schema not only facilitates the development of a common language in the emerging science of biomimetics, but also promotes interdisciplinary dialogue among its subdisciplines.

  14. Microsystem technology as a road from macro to nanoworld.

    PubMed

    Grabiec, Piotr; Domański, Krzysztof; Janus, Paweł; Zaborowski, Michał; Jaroszewicz, Bogdan

    2005-04-01

Tremendous progress in microelectronic technology over the last 40 years is closely related to even more remarkable progress in technological tools. It is important to note, however, that these new tools may also be used for the fabrication of diverse multifunctional structures. Such devices, called MEMS (Micro-Electro-Mechanical System) and MOEMS (Micro-Electro-Opto-Mechanical System), integrate microelectronic and micromechanical structures in one system, enabling interdisciplinary applications, with bio-medical investigations being among the most interesting and prospective. Development of these applications requires, however, the cooperation of a multidisciplinary team of specialists covering a broad range of physics, (bio)chemistry and electronics, not to mention medical doctors and other medical specialists. Thus, dissemination of knowledge about existing processing capabilities is of key importance. In this paper, examples of various applications of microelectronic technology for the fabrication of microsystems usable in medicine and chemistry will be presented. In addition, information concerning the design and technology potential available in Poland and new, emerging opportunities will be given.

  15. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology

    PubMed Central

    Latendresse, Mario; Paley, Suzanne M.; Krummenacker, Markus; Ong, Quang D.; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M.; Caspi, Ron

    2016-01-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms. PMID:26454094

  16. Adjustable Autonomy and Human-Agent Teamwork in Practice: An Interim Report on Space Applications

    NASA Technical Reports Server (NTRS)

Bradshaw, Jeffrey M.; Feltovich, Paul; Hoffman, Robert; Jeffers, Renia; Suri, Niranjan; Uszok, Andrzej; VanHoof, Ron; Acquisti, Alessandro; Prescott, Debbie

    2003-01-01

    We give a preliminary perspective on the basic principles and pitfalls of adjustable autonomy and human-centered teamwork. We then summarize the interim results of our study on the problem of work practice modeling and human-agent collaboration in space applications, the development of a broad model of human-agent teamwork grounded in practice, and the integration of the Brahms, KAoS, and NOMADS agent frameworks. We hope our work will benefit those who plan and participate in work activities in a wide variety of space applications, as well as those who are interested in design and execution tools for teams of robots that can function as effective assistants to humans.

  17. Diamond Coatings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Advances in materials technology have demonstrated that it is possible to get the advantages of diamond in a number of applications without the cost penalty, by coating and chemically bonding an inexpensive substrate with a thin film of diamond-like carbon (DLC). Diamond films offer tremendous technical and economic potential in such advances as chemically inert protective coatings; machine tools and parts capable of resisting wear 10 times longer; ball bearings and metal cutting tools; a broad variety of optical instruments and systems; and consumer products. Among the American companies engaged in DLC commercialization is Diamonex, Inc., a diamond coating spinoff of Air Products and Chemicals, Inc. Along with its own proprietary technology for both polycrystalline diamond and DLC coatings, Diamonex is using, under an exclusive license, NASA technology for depositing DLC on a substrate. Diamonex is developing, and offering commercially, under the trade name Diamond Aegis, a line of polycrystalline diamond-coated products that can be custom tailored for optical, electronic and engineering applications. Diamonex's initial focus is on optical products and the first commercial product is expected in late 1990. Other target applications include electronic heat sink substrates, x-ray lithography masks, metal cutting tools and bearings.

  18. Fundamentals of Clinical Outcomes Assessment for Spinal Disorders: Clinical Outcome Instruments and Applications

    PubMed Central

    Vavken, Patrick; Ganal-Antonio, Anne Kathleen B.; Quidde, Julia; Shen, Francis H.; Chapman, Jens R.; Samartzis, Dino

    2015-01-01

    Study Design A broad narrative review. Objectives Outcome assessment in spinal disorders is imperative to help monitor the safety and efficacy of the treatment in an effort to change the clinical practice and improve patient outcomes. The following article, part two of a two-part series, discusses the various outcome tools and instruments utilized to address spinal disorders and their management. Methods A thorough review of the peer-reviewed literature was performed, irrespective of language, addressing outcome research, instruments and tools, and applications. Results Numerous articles addressing the development and implementation of health-related quality-of-life, neck and low back pain, overall pain, spinal deformity, and other condition-specific outcome instruments have been reported. Their applications in the context of the clinical trial studies, the economic analyses, and overall evidence-based orthopedics have been noted. Additional issues regarding the problems and potential sources of bias utilizing outcomes scales and the concept of minimally clinically important difference were discussed. Conclusion Continuing research needs to assess the outcome instruments and tools used in the clinical outcome assessment for spinal disorders. Understanding the fundamental principles in spinal outcome assessment may also advance the field of “personalized spine care.” PMID:26225283

  19. Phenology Data Products to Support Assessment and Forecasting of Phenology on Multiple Spatiotemporal Scales

    NASA Astrophysics Data System (ADS)

    Gerst, K.; Enquist, C.; Rosemartin, A.; Denny, E. G.; Marsh, L.; Moore, D. J.; Weltzin, J. F.

    2014-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database maintained by USA-NPN now has over 3.7 million records for plants and animals for the period 1954-2014, with the majority of these observations collected since 2008 as part of a broad, national contributory science strategy. These data have been used in a number of science, conservation and resource management applications, including national assessments of historical and potential future trends in phenology, regional assessments of spatio-temporal variation in organismal activity, and local monitoring for invasive species detection. Customizable data downloads are freely available, and data are accompanied by FGDC-compliant metadata, data-use and data-attribution policies, vetted and documented methodologies and protocols, and version control. While users are free to develop custom algorithms for data cleaning, winnowing and summarization prior to analysis, the National Coordinating Office of USA-NPN is developing a suite of standard data products to facilitate use and application by a diverse set of data users. This presentation provides a progress report on data product development, including: (1) Quality controlled raw phenophase status data; (2) Derived phenometrics (e.g. onset, duration) at multiple scales; (3) Data visualization tools; (4) Tools to support assessment of species interactions and overlap; (5) Species responsiveness to environmental drivers; (6) Spatially gridded phenoclimatological products; and (7) Algorithms for modeling and forecasting future phenological responses. The prioritization of these data products is a direct response to stakeholder needs related to informing management and policy decisions. 
We anticipate that these products will contribute to broad understanding of plant and animal phenology across scientific disciplines.

  20. High-throughput screening of a CRISPR/Cas9 library for functional genomics in human cells.

    PubMed

    Zhou, Yuexin; Zhu, Shiyou; Cai, Changzu; Yuan, Pengfei; Li, Chunmei; Huang, Yanyi; Wei, Wensheng

    2014-05-22

    Targeted genome editing technologies are powerful tools for studying biology and disease, and have a broad range of research applications. In contrast to the rapid development of toolkits to manipulate individual genes, large-scale screening methods based on the complete loss of gene expression are only now beginning to be developed. Here we report the development of a focused CRISPR/Cas-based (clustered regularly interspaced short palindromic repeats/CRISPR-associated) lentiviral library in human cells and a method of gene identification based on functional screening and high-throughput sequencing analysis. Using knockout library screens, we successfully identified the host genes essential for the intoxication of cells by anthrax and diphtheria toxins, which were confirmed by functional validation. The broad application of this powerful genetic screening strategy will not only facilitate the rapid identification of genes important for bacterial toxicity but will also enable the discovery of genes that participate in other biological processes.
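
The screen-analysis idea above, ranking genes by how strongly their knockout guides are enriched in the toxin-surviving population versus the unselected control, can be sketched in a few lines. The read counts below are fabricated for illustration (ANTXR1, a known anthrax toxin receptor, stands in as a plausible hit), and real screens use dedicated statistics such as negative-binomial models rather than raw fold-changes.

```python
import math

# Hypothetical sgRNA read counts: unselected control vs. toxin-selected cells.
control = {"ANTXR1_sg1": 120, "ANTXR1_sg2": 90, "SAFE_sg1": 500, "SAFE_sg2": 480}
selected = {"ANTXR1_sg1": 2400, "ANTXR1_sg2": 1500, "SAFE_sg1": 510, "SAFE_sg2": 470}

def log2_enrichment(ctrl, sel, pseudo=1.0):
    """log2 fold-change of library-size-normalized counts, with a pseudocount."""
    ctrl_total = sum(ctrl.values())
    sel_total = sum(sel.values())
    scores = {}
    for guide in ctrl:
        c = ctrl[guide] / ctrl_total
        s = sel[guide] / sel_total
        scores[guide] = math.log2((s + pseudo / sel_total) / (c + pseudo / ctrl_total))
    return scores

scores = log2_enrichment(control, selected)
for guide, lfc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{guide}: log2FC = {lfc:+.2f}")
```

Guides targeting genes essential for intoxication rise to the top of the ranking, while guides hitting irrelevant genes hover near or below zero.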

  1. Role of Metal and Metal Oxide Nanoparticles as Diagnostic and Therapeutic Tools for Highly Prevalent Viral Infections

    PubMed Central

    Yadavalli, Tejabhiram; Shukla, Deepak

    2016-01-01

Nanotechnology is increasingly playing important roles in various fields, including virology. The emerging use of metal or metal oxide nanoparticles in virus-targeting formulations shows the promise of improved diagnostic or therapeutic ability of the agents while uniquely enhancing the prospects of targeted drug delivery. Although a number of nanoparticles varying in composition, size, shape, and surface properties have been approved for human use, relatively few candidates have been tested or approved for the clinical diagnosis and treatment of viral infections. Challenges remain in this domain due to a lack of essential knowledge regarding the in vivo behavior of nanoparticles during viral infections. This review provides a broad overview of recent advances in diagnostic, prophylactic and therapeutic applications of metal and metal oxide nanoparticles in Human Immunodeficiency Virus, Hepatitis virus, influenza virus and Herpes virus infections. The types of nanoparticles commonly used and their broad applications are explained in this review. PMID:27575283

  2. Application of Digital Anthropometry for Craniofacial Assessment

    PubMed Central

    Jayaratne, Yasas S. N.; Zwahlen, Roger A.

    2014-01-01

    Craniofacial anthropometry is an objective technique based on a series of measurements and proportions, which facilitate the characterization of phenotypic variation and quantification of dysmorphology. With the introduction of stereophotography, it is possible to acquire a lifelike three-dimensional (3D) image of the face with natural color and texture. Most of the traditional anthropometric landmarks can be identified on these 3D photographs using specialized software. Therefore, it has become possible to compute new digital measurements, which were not feasible with traditional instruments. The term “digital anthropometry” has been used by researchers based on such systems to separate their methods from conventional manual measurements. Anthropometry has been traditionally used as a research tool. With the advent of digital anthropometry, this technique can be employed in several disciplines as a noninvasive tool for quantifying facial morphology. The aim of this review is to provide a broad overview of digital anthropometry and discuss its clinical applications. PMID:25050146

  3. 10 CFR 33.12 - Applications for specific licenses of broad scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Applications for specific licenses of broad scope. 33.12 Section 33.12 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.12 Applications for specific licenses of broad scope. A...

  4. 10 CFR 33.12 - Applications for specific licenses of broad scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Applications for specific licenses of broad scope. 33.12 Section 33.12 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.12 Applications for specific licenses of broad scope. A...

  5. 10 CFR 33.12 - Applications for specific licenses of broad scope.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Applications for specific licenses of broad scope. 33.12 Section 33.12 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.12 Applications for specific licenses of broad scope. A...

  6. 10 CFR 33.12 - Applications for specific licenses of broad scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Applications for specific licenses of broad scope. 33.12 Section 33.12 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.12 Applications for specific licenses of broad scope. A...

  7. 10 CFR 33.12 - Applications for specific licenses of broad scope.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Applications for specific licenses of broad scope. 33.12 Section 33.12 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.12 Applications for specific licenses of broad scope. A...

  8. Azelaic Acid: Evidence-based Update on Mechanism of Action and Clinical Application.

    PubMed

    Schulte, Brian C; Wu, Wesley; Rosen, Ted

    2015-09-01

    Azelaic acid is a complex molecule with many diverse activities. The latter include anti-infective and anti-inflammatory action. The agent also inhibits follicular keratinization and epidermal melanogenesis. Due to the wide variety of biological activities, azelaic acid has been utilized as a management tool in a broad spectrum of disease states and cutaneous disorders. This paper reviews the clinical utility of azelaic acid, noting the quality of the evidence supporting each potential use.

  9. Regular dislocation networks in silicon as a tool for nanostructure devices used in optics, biology, and electronics.

    PubMed

    Kittler, M; Yu, X; Mchedlidze, T; Arguirov, T; Vyvenko, O F; Seifert, W; Reiche, M; Wilhelm, T; Seibt, M; Voss, O; Wolff, A; Fritzsche, W

    2007-06-01

    Well-controlled fabrication of dislocation networks in Si using direct wafer bonding opens broad possibilities for nanotechnology applications. Concepts of dislocation-network-based light emitters, manipulators of biomolecules, gettering and insulating layers, and three-dimensional buried conductive channels are presented and discussed. A prototype of a Si-based light emitter working at a wavelength of about 1.5 μm with an efficiency potential estimated at 1% is demonstrated.

  10. IT Data Mining Tool Uses in Aerospace

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.; Freeman, Kenneth; Jones, Kevin L.

    2012-01-01

    Data mining has a broad spectrum of uses throughout the realms of aerospace and information technology. Each of these areas has useful methods for processing, distributing, and storing its corresponding data. This paper focuses on ways to leverage the data mining tools and resources used in NASA's information technology area to meet the similar data mining needs of aviation and aerospace domains. This paper details the searching, alerting, reporting, and application functionalities of the Splunk system, used by NASA's Security Operations Center (SOC), and their potential shared solutions to address aircraft and spacecraft flight and ground systems data mining requirements. This paper also touches on capacity and security requirements when addressing sizeable amounts of data across a large data infrastructure.

  11. Virtual surgery in a (tele-)radiology framework.

    PubMed

    Glombitza, G; Evers, H; Hassfeld, S; Engelmann, U; Meinzer, H P

    1999-09-01

    This paper presents telemedicine as an extension of a teleradiology framework through tools for virtual surgery. To classify the described methods and applications, the research field of virtual reality (VR) is broadly reviewed. Differences with respect to technical equipment, methodological requirements and areas of application are pointed out. Desktop VR, augmented reality, and virtual reality are differentiated and discussed in some typical contexts of diagnostic support, surgical planning, therapeutic procedures, simulation and training. Visualization techniques are compared as a prerequisite for virtual reality and assigned to distinct levels of immersion. The advantage of a hybrid visualization kernel is emphasized with respect to the desktop VR applications that are subsequently shown. Moreover, software design aspects are considered by outlining functional openness in the architecture of the host system. Here, a teleradiology workstation was extended by dedicated tools for surgical planning through a plug-in mechanism. Examples of recent areas of application are introduced such as liver tumor resection planning, diagnostic support in heart surgery, and craniofacial surgery planning. In the future, surgical planning systems will become more important. They will benefit from improvements in image acquisition and communication, new image processing approaches, and techniques for data presentation. This will facilitate preoperative planning and intraoperative applications.

  12. Eucalyptus applied genomics: from gene sequences to breeding tools.

    PubMed

    Grattapaglia, Dario; Kirst, Matias

    2008-01-01

    Eucalyptus is the most widely planted hardwood crop in the tropical and subtropical world because of its superior growth, broad adaptability and multipurpose wood properties. Plantation forestry of Eucalyptus supplies high-quality woody biomass for several industrial applications while reducing the pressure on tropical forests and associated biodiversity. This review links current eucalypt breeding practices with existing and emerging genomic tools. A brief discussion provides a background to modern eucalypt breeding together with some current applications of molecular markers in support of operational breeding. Quantitative trait locus (QTL) mapping and genetical genomics are reviewed and an in-depth perspective is provided on the power of association genetics to dissect quantitative variation in this highly diverse organism. Finally, some challenges and opportunities to integrate genomic information into directional selective breeding are discussed in light of the upcoming draft of the Eucalyptus grandis genome. Given the extraordinary genetic variation that exists in the genus Eucalyptus, the ingenuity of most breeders, and the powerful genomic tools that have become available, the prospects of applied genomics in Eucalyptus forest production are encouraging.

  13. Broadband single-mode operation of standard optical fibers by using a sub-wavelength optical wire filter.

    PubMed

    Jung, Yongmin; Brambilla, Gilberto; Richardson, David J

    2008-09-15

    We report the use of a sub-wavelength optical wire (SOW) with a specifically designed transition region as an efficient tool to filter higher-order modes in multimode waveguides. Higher-order modes are effectively suppressed by controlling the transition taper profile and the diameter of the sub-wavelength optical wire. As a practical example, single-mode operation of a standard telecom optical fiber over a broad spectral window (400–1700 nm) was demonstrated with a 1 μm SOW. The ability to obtain robust and stable single-mode operation over a very broad range of wavelengths offers new possibilities for mode control within fiber devices and is relevant to a range of application sectors including high performance fiber lasers, sensors, photolithography, and optical coherence tomography systems.
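
    The single-mode behavior above can be sanity-checked with the textbook V-parameter for a step-index guide; the paper's filtering actually relies on the engineered taper transition, which this simple criterion does not capture. A rough sketch with assumed values (silica core n ≈ 1.45, air cladding, dispersion ignored); the function name and indices are illustrative, not taken from the paper:

```python
import math

def v_number(diameter_m, wavelength_m, n_core, n_clad=1.0):
    """Normalized frequency of a step-index cylindrical waveguide;
    V < 2.405 means only the fundamental mode is guided."""
    numerical_aperture = math.sqrt(n_core**2 - n_clad**2)
    return math.pi * diameter_m / wavelength_m * numerical_aperture

# A 1 um silica wire in air: below cutoff at 1550 nm, multimode at 633 nm.
v_1550 = v_number(1e-6, 1550e-9, 1.45)  # ~2.13
v_633 = v_number(1e-6, 633e-9, 1.45)    # ~5.21
```

    This is only a heuristic for the bare wire itself, not a model of the taper-based mode stripping the paper demonstrates.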

  14. Irena : tool suite for modeling and analysis of small-angle scattering.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
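
    Irena performs these fits inside the Igor Pro environment; as a language-neutral illustration of the Guinier analysis it automates (ln I = ln I0 - Rg²q²/3 at low q), here is a minimal NumPy sketch on synthetic dilute-limit data. All names are hypothetical:

```python
import numpy as np

def guinier_rg(q, intensity):
    """Fit ln I = ln I0 - (Rg**2 / 3) * q**2 and return (Rg, I0).
    Only valid in the Guinier regime, roughly q * Rg < 1.3."""
    slope, intercept = np.polyfit(q**2, np.log(intensity), 1)
    return float(np.sqrt(-3.0 * slope)), float(np.exp(intercept))

# Synthetic dilute-limit data for a particle with Rg = 20 (inverse-q units):
q = np.linspace(0.01, 0.05, 25)
i_obs = 1000.0 * np.exp(-(q * 20.0) ** 2 / 3.0)
rg, i0 = guinier_rg(q, i_obs)  # recovers Rg = 20, I0 = 1000
```

    Real data would of course carry noise and require restricting the fit window to the Guinier regime.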

  15. Scientific Computation Application Partnerships in Materials and Chemical Sciences, Charge Transfer and Charge Transport in Photoactivated Systems, Developing Electron-Correlated Methods for Excited State Structure and Dynamics in the NWChem Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cramer, Christopher J.

    Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.

  16. Advancing biotechnology with CRISPR/Cas9: recent applications and patent landscape.

    PubMed

    Ferreira, Raphael; David, Florian; Nielsen, Jens

    2018-01-24

    Clustered regularly interspaced short palindromic repeats (CRISPR) is poised to become one of the key scientific discoveries of the twenty-first century. Originating from prokaryotic and archaeal immune systems to counter phage invasions, CRISPR-based applications have been tailored for manipulating a broad range of living organisms. From the different elucidated types of CRISPR mechanisms, the type II system adapted from Streptococcus pyogenes has been the most exploited as a tool for genome engineering and gene regulation. In this review, we describe the different applications of CRISPR/Cas9 technology in the industrial biotechnology field. Next, we detail the current status of the patent landscape, highlighting its exploitation through different companies, and conclude with future perspectives of this technology.

  17. A Web-based Google-Earth Coincident Imaging Tool for Satellite Calibration and Validation

    NASA Astrophysics Data System (ADS)

    Killough, B. D.; Chander, G.; Gowda, S.

    2009-12-01

    The Group on Earth Observations (GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS) to meet the needs of its nine “Societal Benefit Areas”, of which the most demanding, in terms of accuracy, is climate. To accomplish this vision, satellite on-orbit and ground-based data calibration and validation (Cal/Val) of Earth observation measurements are critical to our scientific understanding of the Earth system. Existing tools supporting space mission Cal/Val are often developed for specific campaigns or events with little regard for broad application. This paper describes a web-based Google Earth tool that calculates coincident satellite observations, with the intention of supporting a diverse international group of satellite missions to improve data continuity, interoperability and data fusion. The Committee on Earth Observing Satellites (CEOS), whose membership includes 28 space agencies and 20 other national and international organizations, is currently operating and planning over 240 Earth observation satellites for the next 15 years. The technology described here will better enable the use of multiple sensors to promote increased coordination toward a GEOSS. The CEOS Systems Engineering Office (SEO) and the Working Group on Calibration and Validation (WGCV) support the development of the CEOS Visualization Environment (COVE) tool to enhance international coordination of data exchange, mission planning and Cal/Val events. The objective is to develop a simple and intuitive application that leverages Google Earth to display satellite sensor coverage areas and identify coincident scene locations, with dynamic menus for flexibility and content display. Key features and capabilities include user-defined evaluation periods (start and end dates), regions of interest (rectangular areas), and multi-user collaboration.
Users can select two or more CEOS missions from a database including Satellite Tool Kit (STK) generated orbit information and perform rapid calculations to identify coincident scenes where the groundtracks of the CEOS mission instrument fields-of-view intersect. Calculated results are displayed on a customized Google-Earth web interface to view location and time information along with optional output to EXCEL table format. In addition, multiple viewports can be used for comparisons. COVE was first introduced to the CEOS WGCV community in May 2009. Since that time, the development of a prototype version has progressed. It is anticipated that the capabilities and applications of COVE can support a variety of international Cal/Val activities as well as provide general information on Earth observation coverage for education and societal benefit. This project demonstrates the utility of a systems engineering tool with broad international appeal for enhanced communication and data evaluation opportunities among international CEOS agencies. The COVE tool is publicly accessible via NASA servers.
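
    COVE derives its coincidences from STK-propagated orbit data; the core matching idea can nonetheless be sketched in a few lines. A toy version, with hypothetical names, that pairs ground-track samples close in both time and great-circle distance:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    a = math.sin((p2 - p1) / 2.0) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2.0) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def coincident_scenes(track_a, track_b, max_dt_s=3600.0, max_km=50.0):
    """Brute-force pairing of (epoch_seconds, lat, lon) ground-track samples;
    a real tool would propagate orbits and intersect instrument swaths."""
    return [(sa, sb)
            for sa in track_a for sb in track_b
            if abs(sa[0] - sb[0]) <= max_dt_s
            and haversine_km(sa[1], sa[2], sb[1], sb[2]) <= max_km]

# Toy tracks: only the first sample of track_b is close in both time and space.
hits = coincident_scenes([(0.0, 10.0, 20.0)],
                         [(600.0, 10.1, 20.1), (600.0, 50.0, 60.0)])
```

    The brute-force pairing is O(n·m); a production system would index samples by time before comparing positions.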

  18. How to support forest management in a world of change: results of some regional studies.

    PubMed

    Fürst, C; Lorz, C; Vacik, H; Potocic, N; Makeschin, F

    2010-12-01

    This article presents the results of several studies in Middle, Eastern and Southeastern Europe on the needs, application areas, desirable attributes and marketing potential of forest management support tools. A comparison of present and future application areas reveals an emerging trend from sectoral planning towards landscape planning and the integration of multiple stakeholder needs. Regarding the conflicts in which management support tools might provide benefit, no clear tendencies were found at either the local or the regional level. In contrast, at the national and European levels, support for the implementation of laws, directives, and regulations was found to be of highest importance. In the user-requirements analysis, electronic tools supporting communication were preferred over paper-based instruments. The users identified the most important attributes of optimized management support tools: (i) broad accessibility for all users at any time should be guaranteed, (ii) it should be possible to integrate experiences from case studies and from regional experts iteratively into the knowledge base (a learning system), and (iii) a self-explanatory user interface is demanded, suitable even for users with little experience of electronic tools. However, a market potential analysis revealed that the willingness to pay for management tools is very limited, although the participants specified realistic maximum amounts they would invest if the products were suitable and payment unavoidable. To bridge the discrepancy between the unwillingness to pay and the need to use management support tools, optimized financing or cooperation models between practice and science must be found.

  19. How to Support Forest Management in a World of Change: Results of Some Regional Studies

    NASA Astrophysics Data System (ADS)

    Fürst, C.; Lorz, C.; Vacik, H.; Potocic, N.; Makeschin, F.

    2010-12-01

    This article presents the results of several studies in Middle, Eastern and Southeastern Europe on the needs, application areas, desirable attributes and marketing potential of forest management support tools. A comparison of present and future application areas reveals an emerging trend from sectoral planning towards landscape planning and the integration of multiple stakeholder needs. Regarding the conflicts in which management support tools might provide benefit, no clear tendencies were found at either the local or the regional level. In contrast, at the national and European levels, support for the implementation of laws, directives, and regulations was found to be of highest importance. In the user-requirements analysis, electronic tools supporting communication were preferred over paper-based instruments. The users identified the most important attributes of optimized management support tools: (i) broad accessibility for all users at any time should be guaranteed, (ii) it should be possible to integrate experiences from case studies and from regional experts iteratively into the knowledge base (a learning system), and (iii) a self-explanatory user interface is demanded, suitable even for users with little experience of electronic tools. However, a market potential analysis revealed that the willingness to pay for management tools is very limited, although the participants specified realistic maximum amounts they would invest if the products were suitable and payment unavoidable. To bridge the discrepancy between the unwillingness to pay and the need to use management support tools, optimized financing or cooperation models between practice and science must be found.

  20. Reconstructing Spatial Distributions from Anonymized Locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horey, James L; Forrest, Stephanie; Groat, Michael

    2012-01-01

    Devices such as mobile phones, tablets, and sensors are often equipped with GPS that accurately report a person's location. Combined with wireless communication, these devices enable a wide range of new social tools and applications. These same qualities, however, leave location-aware applications vulnerable to privacy violations. This paper introduces the Negative Quad Tree, a privacy protection method for location-aware applications. The method is broadly applicable to applications that use spatial density information, such as social applications that measure the popularity of social venues. The method employs a simple anonymization algorithm running on mobile devices, and a more complex reconstruction algorithm on a central server. This strategy is well suited to low-powered mobile devices. The paper analyzes the accuracy of the reconstruction method in a variety of simulated and real-world settings and demonstrates that the method is accurate enough to be used in many real-world scenarios.
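
    The abstract does not spell out the Negative Quad Tree algorithm itself, so no attempt is made to reproduce it here; for context, the following is a plain, non-private quadtree density count, the kind of spatial-density summary such privacy mechanisms aim to protect. Names and the depth parameter are illustrative:

```python
from collections import Counter

def quad_cell(lat, lon, depth):
    """Quadtree cell id for a point: successive halving of the lat/lon box."""
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    path = []
    for _ in range(depth):
        lat_mid = (lat_lo + lat_hi) / 2.0
        lon_mid = (lon_lo + lon_hi) / 2.0
        path.append(str((2 if lat >= lat_mid else 0) + (1 if lon >= lon_mid else 0)))
        if lat >= lat_mid:
            lat_lo = lat_mid
        else:
            lat_hi = lat_mid
        if lon >= lon_mid:
            lon_lo = lon_mid
        else:
            lon_hi = lon_mid
    return "".join(path)

def density(points, depth=4):
    """Per-cell point counts: the spatial-density summary a server would build."""
    return Counter(quad_cell(lat, lon, depth) for lat, lon in points)
```

    The paper's contribution is to let the server recover such a density map from anonymized reports rather than raw coordinates.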

  1. Application of Stable Isotope-Assisted Metabolomics for Cell Metabolism Studies

    PubMed Central

    You, Le; Zhang, Baichen; Tang, Yinjie J.

    2014-01-01

    The applications of stable isotopes in metabolomics have facilitated the study of cell metabolism. Stable isotope-assisted metabolomics requires: (1) properly designed tracer experiments; (2) stringent sampling and quenching protocols to minimize isotopic alterations; (3) efficient metabolite separations; (4) high resolution mass spectrometry to resolve overlapping peaks and background noise; and (5) data analysis methods and databases to decipher isotopic clusters over a broad m/z range (mass-to-charge ratio). This paper reviews mass spectrometry based techniques for precise determination of metabolites and their isotopologues. It also discusses applications of isotopic approaches to track substrate utilization, identify unknown metabolites and their chemical formulas, measure metabolite concentrations, determine putative metabolic pathways, and investigate microbial community populations and their carbon assimilation patterns. In addition, 13C-metabolite fingerprinting and metabolic models can be integrated to quantify carbon fluxes (enzyme reaction rates). The fluxome, in combination with other “omics” analyses, may give systems-level insights into regulatory mechanisms underlying gene functions. More importantly, 13C-tracer experiments significantly improve the potential of low-resolution gas chromatography-mass spectrometry (GC-MS) for broad-scope metabolism studies. We foresee isotope-assisted metabolomics becoming an indispensable tool in industrial biotechnology, environmental microbiology, and medical research. PMID:24957020
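
    Item (5) above can be made concrete: if only carbon is considered, the M+k isotopologue abundances of an n-carbon metabolite follow a binomial distribution in the 13C abundance. A deliberately simplified sketch (H, N, O and S isotopes ignored; names are illustrative):

```python
from math import comb

def c13_isotopologues(n_carbons, p13=0.0107):
    """M+k abundances from carbon-13 alone, assuming each of the n carbons
    is independently 13C with probability p13 (natural abundance by default)."""
    return [comb(n_carbons, k) * p13**k * (1.0 - p13)**(n_carbons - k)
            for k in range(n_carbons + 1)]

# Natural-abundance cluster for a six-carbon metabolite (e.g. a hexose skeleton):
dist = c13_isotopologues(6)  # dist[0] ~ 0.937 (M+0), dist[1] ~ 0.061 (M+1)
```

    In a 13C-tracer experiment, p13 rises far above natural abundance, which is what makes the labeling patterns informative.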

  2. Electrostatic force microscopy as a broadly applicable method for characterizing pyroelectric materials.

    PubMed

    Martin-Olmos, Cristina; Stieg, Adam Z; Gimzewski, James K

    2012-06-15

    A general method based on the combination of electrostatic force microscopy with thermal cycling of the substrate holder is presented for direct, nanoscale characterization of the pyroelectric effect in a range of materials and sample configurations using commercial atomic force microscope systems. To provide an example of its broad applicability, the technique was applied to the examination of natural tourmaline gemstones. The method was validated using thermal cycles similar to those experienced in ambient conditions, where the induced pyroelectric response produced localized electrostatic surface charges whose magnitude demonstrated a correlation with the iron content and heat dissipation of each gemstone variety. In addition, the surface charge was shown to persist even at thermal equilibrium. This behavior is attributed to constant, stochastic cooling of the gemstone surface through turbulent contact with the surrounding air and indicates a potential utility for energy harvesting in applications including environmental sensors and personal electronics. In contrast to previously reported methods, ours has the capacity to carry out such precise nanoscale measurements with little or no restriction on the sample of interest, and represents a powerful new tool for the characterization of pyroelectric materials and devices.
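
    The magnitude of the surface charge being imaged can be estimated from the defining relation of pyroelectricity, dQ = p · A · dT. A back-of-envelope sketch; the coefficient below is an order-of-magnitude value commonly quoted for tourmaline, not a figure from this paper:

```python
def pyroelectric_charge(p_coeff, area_m2, delta_t_k):
    """Charge released on a polar face by a uniform temperature change:
    dQ = p * A * dT, with p the pyroelectric coefficient in C m^-2 K^-1."""
    return p_coeff * area_m2 * delta_t_k

# Illustrative tourmaline coefficient (~4 uC m^-2 K^-1), a 10 x 10 um scan
# area, and a 1 K temperature swing:
dq = pyroelectric_charge(4e-6, (10e-6) ** 2, 1.0)  # ~4e-16 C
```

    Charges on this scale are well within the force sensitivity of electrostatic force microscopy, which is why the technique can resolve the effect at the nanoscale.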

  3. Electrostatic force microscopy as a broadly applicable method for characterizing pyroelectric materials

    NASA Astrophysics Data System (ADS)

    Martin-Olmos, Cristina; Stieg, Adam Z.; Gimzewski, James K.

    2012-06-01

    A general method based on the combination of electrostatic force microscopy with thermal cycling of the substrate holder is presented for direct, nanoscale characterization of the pyroelectric effect in a range of materials and sample configurations using commercial atomic force microscope systems. To provide an example of its broad applicability, the technique was applied to the examination of natural tourmaline gemstones. The method was validated using thermal cycles similar to those experienced in ambient conditions, where the induced pyroelectric response produced localized electrostatic surface charges whose magnitude demonstrated a correlation with the iron content and heat dissipation of each gemstone variety. In addition, the surface charge was shown to persist even at thermal equilibrium. This behavior is attributed to constant, stochastic cooling of the gemstone surface through turbulent contact with the surrounding air and indicates a potential utility for energy harvesting in applications including environmental sensors and personal electronics. In contrast to previously reported methods, ours has the capacity to carry out such precise nanoscale measurements with little or no restriction on the sample of interest, and represents a powerful new tool for the characterization of pyroelectric materials and devices.

  4. Exposing the Strategies that can Reduce the Obstacles: Improving the Science User Experience

    NASA Astrophysics Data System (ADS)

    Lindsay, F. E.; Brennan, J.; Behnke, J.; Lynnes, C.

    2017-12-01

    It is now well established that pursuing generic solutions to what seem to be common problems in Earth science data access and use can often lead to disappointing results for both system developers and the intended users. This presentation focuses on real-world experience of managing a large and complex data system, NASA's Earth Observing System Data and Information System (EOSDIS), whose mission is to serve both broad user communities and those in smaller niche applications of Earth science data and services. In the talk, we focus on our experiences with known data user obstacles characterizing EOSDIS approaches, including various technological techniques, for engaging and bolstering, where possible, user experiences with EOSDIS. For improving how existing and prospective users discover and access NASA data from EOSDIS we introduce our cross-archive tool: Earthdata Search. This new search and order tool further empowers users to quickly access data sets using clever and intuitive features. The Worldview data visualization tool is also discussed highlighting how many users are now performing extensive data exploration without necessarily downloading data. Also, we explore our EOSDIS data discovery and access webinars, data recipes and short tutorials, targeted technical and data publications, user profiles and social media as additional tools and methods used for improving our outreach and communications to a diverse user community. These efforts have paid substantial dividends for our user communities by allowing us to target discipline specific community needs. The desired take-away from this presentation will be an improved understanding of how EOSDIS has approached, and in several instances achieved, removing or lowering the barriers to data access and use. 
As we look ahead to more complex Earth science missions, EOSDIS will continue to focus on our user communities, both broad and specialized, so that our overall data system can continue to serve the needs of science and applications users.

  5. Exposing the Strategies that Can Reduce the Obstacles: Improving the Science User Experience

    NASA Technical Reports Server (NTRS)

    Lindsay, Francis E.; Brennan, Jennifer; Behnke, Jeanne; Lynnes, Chris

    2017-01-01

    It is now well established that pursuing generic solutions to what seem to be common problems in Earth science data access and use can often lead to disappointing results for both system developers and the intended users. This presentation focuses on real-world experience of managing a large and complex data system, NASA's Earth Observing System Data and Information System (EOSDIS), whose mission is to serve both broad user communities and those in smaller niche applications of Earth science data and services. In the talk, we focus on our experiences with known data user obstacles characterizing EOSDIS approaches, including various technological techniques, for engaging and bolstering, where possible, user experiences with EOSDIS. For improving how existing and prospective users discover and access NASA data from EOSDIS we introduce our cross-archive tool: Earthdata Search. This new search and order tool further empowers users to quickly access data sets using clever and intuitive features. The Worldview data visualization tool is also discussed highlighting how many users are now performing extensive data exploration without necessarily downloading data. Also, we explore our EOSDIS data discovery and access webinars, data recipes and short tutorials, targeted technical and data publications, user profiles and social media as additional tools and methods used for improving our outreach and communications to a diverse user community. These efforts have paid substantial dividends for our user communities by allowing us to target discipline specific community needs. The desired take-away from this presentation will be an improved understanding of how EOSDIS has approached, and in several instances achieved, removing or lowering the barriers to data access and use. 
As we look ahead to more complex Earth science missions, EOSDIS will continue to focus on our user communities, both broad and specialized, so that our overall data system can continue to serve the needs of science and applications users.

  6. Analysis of ERTS-1 imagery of Wyoming and its application to evaluation of Wyoming's natural resources

    NASA Technical Reports Server (NTRS)

    Marrs, R. W.; Breckenridge, R. M.

    1973-01-01

    The author has identified the following significant results. The Wyoming investigation progressed according to schedule during the Jan. - Feb. 1973 reporting period. A map of the maximum extent of Pleistocene glaciation was compiled for northwest Wyoming from interpretations of glacial features seen on ERTS-1 imagery. Using isodensitometry as an image-enhancement tool, techniques were developed that allowed accurate delineation of small urban areas and distinguished broad classes within these small urban centers.

  7. The visible human and digital anatomy learning initiative.

    PubMed

    Dev, Parvati; Senger, Steven

    2005-01-01

    A collaborative initiative is starting within the Internet2 Health Science community to explore the development of a framework for providing access to digital anatomical teaching resources over Internet2. This is a cross-cutting initiative with broad applicability and will require the involvement of a diverse collection of communities. It will seize an opportunity created by a convergence of needs and technical capabilities to identify the technologies and standards needed to support a sophisticated collection of tools for teaching anatomy.

  8. Web-Based Geographic Information System Tool for Accessing Hanford Site Environmental Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Triplett, Mark B.; Seiple, Timothy E.; Watson, David J.

    Data volume, complexity, and access issues pose severe challenges for analysts, regulators and stakeholders attempting to efficiently use legacy data to support decision making at the U.S. Department of Energy’s (DOE) Hanford Site. DOE has partnered with the Pacific Northwest National Laboratory (PNNL) on the PHOENIX (PNNL-Hanford Online Environmental Information System) project, which seeks to address data access, transparency, and integration challenges at Hanford to provide effective decision support. PHOENIX is a family of spatially-enabled web applications providing quick access to decades of valuable scientific data and insight through intuitive query, visualization, and analysis tools. PHOENIX realizes broad, public accessibility by relying only on ubiquitous web browsers, eliminating the need for specialized software. It accommodates a wide range of users with intuitive user interfaces that require little or no training to quickly obtain and visualize data. Currently, PHOENIX is actively hosting three applications focused on groundwater monitoring, groundwater clean-up performance reporting, and in-tank monitoring. PHOENIX-based applications are being used to streamline investigative and analytical processes at Hanford, saving time and money. But more importantly, by integrating previously isolated datasets and developing relevant visualization and analysis tools, PHOENIX applications are enabling DOE to discover new correlations hidden in legacy data, allowing them to more effectively address complex issues at Hanford.

  9. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.

    PubMed

    Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron

    2016-09-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  10. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  11. SHERPA: an image segmentation and outline feature extraction tool for diatoms and other objects

    PubMed Central

    2014-01-01

    Background Light microscopic analysis of diatom frustules is widely used both in basic and applied research, notably taxonomy, morphometrics, water quality monitoring and paleo-environmental studies. In these applications, usually large numbers of frustules need to be identified and/or measured. Although there is a need for automation in these applications, and image processing and analysis methods supporting these tasks have previously been developed, they did not become widespread in diatom analysis. While methodological reports for a wide variety of methods for image segmentation, diatom identification and feature extraction are available, no single implementation combining a subset of these into a readily applicable workflow accessible to diatomists exists. Results The newly developed tool SHERPA offers a versatile image processing workflow focused on the identification and measurement of object outlines, handling all steps from image segmentation over object identification to feature extraction, and providing interactive functions for reviewing and revising results. Special attention was given to ease of use, applicability to a broad range of data and problems, and supporting high throughput analyses with minimal manual intervention. Conclusions Tested with several diatom datasets from different sources and of various compositions, SHERPA proved its ability to successfully analyze large amounts of diatom micrographs depicting a broad range of species. 
SHERPA is unique in combining the following features: application of multiple segmentation methods and selection of the one giving the best result for each individual object; identification of shapes of interest based on outline matching against a template library; quality scoring and ranking of resulting outlines supporting quick quality checking; extraction of a wide range of outline shape descriptors widely used in diatom studies and elsewhere; minimizing the need for, but enabling manual quality control and corrections. Although primarily developed for analyzing images of diatom valves originating from automated microscopy, SHERPA can also be useful for other object detection, segmentation and outline-based identification problems. PMID:24964954

  12. SHERPA: an image segmentation and outline feature extraction tool for diatoms and other objects.

    PubMed

    Kloster, Michael; Kauer, Gerhard; Beszteri, Bánk

    2014-06-25

    Light microscopic analysis of diatom frustules is widely used both in basic and applied research, notably taxonomy, morphometrics, water quality monitoring and paleo-environmental studies. In these applications, usually large numbers of frustules need to be identified and/or measured. Although there is a need for automation in these applications, and image processing and analysis methods supporting these tasks have previously been developed, they did not become widespread in diatom analysis. While methodological reports for a wide variety of methods for image segmentation, diatom identification and feature extraction are available, no single implementation combining a subset of these into a readily applicable workflow accessible to diatomists exists. The newly developed tool SHERPA offers a versatile image processing workflow focused on the identification and measurement of object outlines, handling all steps from image segmentation over object identification to feature extraction, and providing interactive functions for reviewing and revising results. Special attention was given to ease of use, applicability to a broad range of data and problems, and supporting high throughput analyses with minimal manual intervention. Tested with several diatom datasets from different sources and of various compositions, SHERPA proved its ability to successfully analyze large amounts of diatom micrographs depicting a broad range of species. SHERPA is unique in combining the following features: application of multiple segmentation methods and selection of the one giving the best result for each individual object; identification of shapes of interest based on outline matching against a template library; quality scoring and ranking of resulting outlines supporting quick quality checking; extraction of a wide range of outline shape descriptors widely used in diatom studies and elsewhere; minimizing the need for, but enabling manual quality control and corrections. 
Although primarily developed for analyzing images of diatom valves originating from automated microscopy, SHERPA can also be useful for other object detection, segmentation and outline-based identification problems.
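    Outline shape descriptors of the kind SHERPA extracts are computed from segmented object contours. As a minimal, dependency-free illustration of one classic descriptor (circularity; this is a generic computation, not SHERPA's implementation), assuming the contour is supplied as an ordered list of (x, y) points:

    ```python
    import math

    def polygon_area(points):
        """Area of a closed polygon via the shoelace formula."""
        area = 0.0
        n = len(points)
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            area += x1 * y2 - x2 * y1
        return abs(area) / 2.0

    def perimeter(points):
        """Perimeter of a closed polygon."""
        n = len(points)
        return sum(math.dist(points[i], points[(i + 1) % n]) for i in range(n))

    def circularity(points):
        """4*pi*A / P^2: equals 1.0 for a circle and drops toward 0
        for elongated outlines such as pennate diatom valves."""
        p = perimeter(points)
        return 4.0 * math.pi * polygon_area(points) / (p * p)

    # A coarse 360-point circle approximation scores very close to 1.0.
    circle = [(math.cos(t * math.pi / 180), math.sin(t * math.pi / 180))
              for t in range(360)]
    print(round(circularity(circle), 4))  # → 1.0
    ```

    Tools like SHERPA compute many such descriptors per outline (elliptic Fourier coefficients, rectangularity, and so on); circularity is shown here only because it needs nothing beyond the contour itself.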

  13. Defining a Computational Framework for the Assessment of ...

    EPA Pesticide Factsheets

    The Adverse Outcome Pathway (AOP) framework describes the effects of environmental stressors across multiple scales of biological organization and function. This includes an evaluation of the potential for each key event to occur across a broad range of species in order to determine the taxonomic applicability of each AOP. Computational tools are needed to facilitate this process. Recently, we developed a tool that uses sequence homology to evaluate the applicability of molecular initiating events across species (Lalone et al., Toxicol. Sci., 2016). To extend our ability to make computational predictions at higher levels of biological organization, we have created the AOPdb. This database links molecular targets associated with key events in the AOPwiki to publicly available data (e.g. gene-protein, pathway, species orthology, ontology, chemical, disease) including ToxCast assay information. The AOPdb combines different data types in order to characterize the impacts of chemicals on human health and the environment and serves as a decision support tool for case study development in the area of taxonomic applicability. As a proof of concept, the AOPdb allows identification of relevant molecular targets, biological pathways, and chemical and disease associations across species for four AOPs from the AOP-Wiki (https://aopwiki.org): Estrogen receptor antagonism leading to reproductive dysfunction (Aop:30); Aromatase inhibition leading to reproductive dysfunction

  14. Portable Diagnostics Technology Assessment for Space Missions. Part 1; General Technology Capabilities for NASA Exploration Missions

    NASA Technical Reports Server (NTRS)

    Nelson, Emily S.; Chait, Arnon

    2010-01-01

    The changes in the scope of NASA's mission in the coming decade are profound and demand nimble, yet insightful, responses. On-board clinical and environmental diagnostics must be available for both mid-term lunar and long-term Mars exploration missions in an environment marked by scarce resources. Miniaturization has become an obvious focus. Despite solid achievements in lab-based devices, broad-based, robust tools for application in the field are not yet on the market. The confluence of rapid, wide-ranging technology evolution and internal planning needs is the impetus behind this work. This report presents an analytical tool for the ongoing evaluation of promising technology platforms based on mission- and application-specific attributes. It is not meant to assess specific devices, but rather to provide objective guidelines for a rational down-select of general categories of technology platforms. In this study, we have employed our expertise in the microgravity operation of fluidic devices, laboratory diagnostics for space applications, and terrestrial research in biochip development. A rating of the current state of technology development is presented using the present tool. Two mission scenarios are also investigated: a 30-day lunar mission using proven, tested technology in 5 years; and a 2- to 3-year mission to Mars in 10 to 15 years.

  15. The application of a novel optical SPM in biomedicine

    NASA Astrophysics Data System (ADS)

    Li, Yinli; Chen, Haibo; Wu, Shifa; Song, Linfeng; Zhang, Jian

    2005-01-01

    Scanning probe microscopy (SPM) techniques such as AFM and SNOM have been broadly used as analysis tools in biomedicine in recent years; they are effective instruments for detecting biological nanostructures at the atomic level. The atomic force and photon scanning tunneling microscope (AF/PSTM) is one member of the SPM family; it can acquire a sample's optical and atomic force images in a single scan, including transmissivity, refractive index, and topography images. This report mainly introduces the application of AF/PSTM to red blood cell membranes and the effect of different sample preparation processes on the experimental results. The materials for preparing red cell membrane samples were anticoagulated blood, isotonic phosphate buffer solution (PBS), and freshly double-distilled water. The AF/PSTM images faithfully represented the biological samples regardless of the preparation process, demonstrating that AF/PSTM is well suited to imaging biological samples. At the same time, the optical images and the topography image of the same sample are complementary to each other, making AF/PSTM a facile tool for analyzing the nanostructure of biological samples. As a further example, this paper presents the application of AF/PSTM to immunoassay; the results show that AF/PSTM is suitable for analyzing biological samples and will become a new tool for biomedical testing.

  16. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring

    PubMed Central

    Fisher, Michael B.; Mann, Benjamin H.; Cronk, Ryan D.; Shields, Katherine F.; Klug, Tori L.; Ramaswamy, Rohit

    2016-01-01

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs. PMID:27563916

  17. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring.

    PubMed

    Fisher, Michael B; Mann, Benjamin H; Cronk, Ryan D; Shields, Katherine F; Klug, Tori L; Ramaswamy, Rohit

    2016-08-23

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs.

  18. Survey of Ambient Air Pollution Health Risk Assessment Tools.

    PubMed

    Anenberg, Susan C; Belova, Anna; Brandt, Jørgen; Fann, Neal; Greco, Sue; Guttikunda, Sarath; Heroux, Marie-Eve; Hurley, Fintan; Krzyzanowski, Michal; Medina, Sylvia; Miller, Brian; Pandey, Kiran; Roos, Joachim; Van Dingenen, Rita

    2016-09-01

    Designing air quality policies that improve public health can benefit from information about air pollution health risks and impacts, which include respiratory and cardiovascular diseases and premature death. Several computer-based tools help automate air pollution health impact assessments and are being used for a variety of contexts. Expanding information gathered for a May 2014 World Health Organization expert meeting, we survey 12 multinational air pollution health impact assessment tools, categorize them according to key technical and operational characteristics, and identify limitations and challenges. Key characteristics include spatial resolution, pollutants and health effect outcomes evaluated, and method for characterizing population exposure, as well as tool format, accessibility, complexity, and degree of peer review and application in policy contexts. While many of the tools use common data sources for concentration-response associations, population, and baseline mortality rates, they vary in the exposure information source, format, and degree of technical complexity. We find that there is an important tradeoff between technical refinement and accessibility for a broad range of applications. Analysts should apply tools that provide the appropriate geographic scope, resolution, and maximum degree of technical rigor for the intended assessment, within resource constraints. A systematic intercomparison of the tools' inputs, assumptions, calculations, and results would be helpful to determine the appropriateness of each for different types of assessment. Future work would benefit from accounting for multiple uncertainty sources and integrating ambient air pollution health impact assessment tools with those addressing other related health risks (e.g., smoking, indoor pollution, climate change, vehicle accidents, physical activity). © 2016 Society for Risk Analysis.
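    The concentration-response calculation at the core of most of these tools can be made concrete. A minimal sketch of the log-linear health impact function that many such tools apply in some form (the population, baseline mortality rate, and coefficient below are illustrative placeholders, not values taken from any surveyed tool):

    ```python
    import math

    def attributable_deaths(baseline_rate, population, beta, delta_c):
        """Log-linear health impact function: cases = y0 * P * (1 - exp(-beta * dC)).

        baseline_rate -- annual baseline mortality rate (deaths per person)
        population    -- exposed population
        beta          -- concentration-response coefficient per ug/m3
        delta_c       -- change in pollutant concentration (ug/m3)
        """
        return baseline_rate * population * (1.0 - math.exp(-beta * delta_c))

    # Illustrative numbers only: 1,000,000 people, 0.8% baseline annual
    # mortality, beta = ln(1.06)/10 (a commonly cited all-cause PM2.5
    # slope per 10 ug/m3), and a 5 ug/m3 reduction in annual mean PM2.5.
    beta = math.log(1.06) / 10.0
    print(round(attributable_deaths(0.008, 1_000_000, beta, 5.0)))  # ≈ 230
    ```

    Operational tools layer gridded concentration fields, age-stratified baseline rates, and uncertainty propagation over the concentration-response coefficient onto this core calculation.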

  19. SSRPrimer and SSR Taxonomy Tree: Biome SSR discovery

    PubMed Central

    Jewell, Erica; Robinson, Andrew; Savage, David; Erwin, Tim; Love, Christopher G.; Lim, Geraldine A. C.; Li, Xi; Batley, Jacqueline; Spangenberg, German C.; Edwards, David

    2006-01-01

    Simple sequence repeat (SSR) molecular genetic markers have become important tools for a broad range of applications such as genome mapping and genetic diversity studies. SSRs are readily identified within DNA sequence data and PCR primers can be designed for their amplification. These PCR primers frequently cross amplify within related species. We report a web-based tool, SSR Primer, that integrates SPUTNIK, an SSR repeat finder, with Primer3, a primer design program, within one pipeline. On submission of multiple FASTA formatted sequences, the script screens each sequence for SSRs using SPUTNIK. Results are then parsed to Primer3 for locus specific primer design. We have applied this tool for the discovery of SSRs within the complete GenBank database, and have designed PCR amplification primers for over 13 million SSRs. The SSR Taxonomy Tree server provides web-based searching and browsing of species and taxa for the visualisation and download of these SSR amplification primers. These tools are available at http://bioinformatics.pbcbasc.latrobe.edu.au/ssrdiscovery.html. PMID:16845092

  20. SSRPrimer and SSR Taxonomy Tree: Biome SSR discovery.

    PubMed

    Jewell, Erica; Robinson, Andrew; Savage, David; Erwin, Tim; Love, Christopher G; Lim, Geraldine A C; Li, Xi; Batley, Jacqueline; Spangenberg, German C; Edwards, David

    2006-07-01

    Simple sequence repeat (SSR) molecular genetic markers have become important tools for a broad range of applications such as genome mapping and genetic diversity studies. SSRs are readily identified within DNA sequence data and PCR primers can be designed for their amplification. These PCR primers frequently cross amplify within related species. We report a web-based tool, SSR Primer, that integrates SPUTNIK, an SSR repeat finder, with Primer3, a primer design program, within one pipeline. On submission of multiple FASTA formatted sequences, the script screens each sequence for SSRs using SPUTNIK. Results are then parsed to Primer3 for locus specific primer design. We have applied this tool for the discovery of SSRs within the complete GenBank database, and have designed PCR amplification primers for over 13 million SSRs. The SSR Taxonomy Tree server provides web-based searching and browsing of species and taxa for the visualisation and download of these SSR amplification primers. These tools are available at http://bioinformatics.pbcbasc.latrobe.edu.au/ssrdiscovery.html.
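    The repeat-screening step that SPUTNIK performs in this pipeline can be illustrated with a toy regular-expression scan (this is not SPUTNIK's algorithm; the motif lengths and minimum repeat count are arbitrary choices, and real finders also handle imperfect and compound repeats):

    ```python
    import re

    def find_ssrs(seq, motif_lengths=(2, 3, 4), min_repeats=5):
        """Scan a DNA sequence for perfect tandem repeats.

        Returns (start, motif, repeat_count) tuples. Mononucleotide runs
        are not searched for directly, and a long repeat may be reported
        once per qualifying motif length.
        """
        seq = seq.upper()
        hits = []
        for k in motif_lengths:
            # Group 2 captures the motif; the backreference requires it
            # to recur at least min_repeats - 1 more times in tandem.
            pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (k, min_repeats - 1))
            for m in pattern.finditer(seq):
                hits.append((m.start(), m.group(2), len(m.group(1)) // k))
        return hits

    print(find_ssrs("GG" + "AT" * 6 + "CC"))  # → [(2, 'AT', 6)]
    ```

    In the actual pipeline, hits like these would be handed to Primer3 along with flanking sequence so that locus-specific amplification primers can be designed around each repeat.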

  1. Luminescence materials for pH and oxygen sensing in microbial cells - structures, optical properties, and biological applications.

    PubMed

    Zou, Xianshao; Pan, Tingting; Chen, Lei; Tian, Yanqing; Zhang, Weiwen

    2017-09-01

    Luminescence sensors, including fluorescence and phosphorescence sensors, have been demonstrated to be important for studying cell metabolism and for diagnosing diseases and cancer. Various design principles have been employed for the development of sensors in different formats, such as organic molecules, polymers, polymeric hydrogels, and nanoparticles. The integration of sensing with fluorescence imaging provides valuable tools for biomedical research and applications, not only at the bulk-cell level but also at the single-cell level. In this article, we critically review recent progress on pH, oxygen, and dual pH and oxygen sensors, specifically for their application in microbial cells. In addition, we focus not only on sensor materials with different chemical structures, but also on the design and application of sensors for better understanding the cellular metabolism of microbial cells. Finally, we also provide an outlook on future materials design and the key challenges to reaching broad application in microbial cells.

  2. Application of whole genome re-sequencing data in the development of diagnostic DNA markers tightly linked to a disease-resistance locus for marker-assisted selection in lupin (Lupinus angustifolius).

    PubMed

    Yang, Huaan; Jian, Jianbo; Li, Xuan; Renshaw, Daniel; Clements, Jonathan; Sweetingham, Mark W; Tan, Cong; Li, Chengdao

    2015-09-02

    Molecular marker-assisted breeding provides an efficient tool to develop improved crop varieties. A major challenge for the broad application of markers in marker-assisted selection is that the marker phenotypes must match plant phenotypes in a wide range of breeding germplasm. In this study, we used the legume crop species Lupinus angustifolius (lupin) to demonstrate the utility of whole genome sequencing and re-sequencing on the development of diagnostic markers for molecular plant breeding. Nine lupin cultivars released in Australia from 1973 to 2007 were subjected to whole genome re-sequencing. The re-sequencing data together with the reference genome sequence data were used in marker development, which revealed 180,596 to 795,735 SNP markers from pairwise comparisons among the cultivars. A total of 207,887 markers were anchored on the lupin genetic linkage map. Marker mining obtained an average of 387 SNP markers and 87 InDel markers for each of the 24 genome sequence assembly scaffolds bearing markers linked to 11 genes of agronomic interest. Using the R gene PhtjR conferring resistance to phomopsis stem blight disease as a test case, we discovered 17 candidate diagnostic markers by genotyping and selecting markers on a genetic linkage map. A further 243 candidate diagnostic markers were discovered by marker mining on a scaffold bearing non-diagnostic markers linked to the PhtjR gene. Nine of the ten tested candidate diagnostic markers were confirmed as truly diagnostic on a broad range of commercial cultivars. Markers developed using these strategies meet the requirements for broad application in molecular plant breeding. We demonstrated that low-cost genome sequencing and re-sequencing data were sufficient and very effective in the development of diagnostic markers for marker-assisted selection. The strategies used in this study may be applied to any trait or plant species. Whole genome sequencing and re-sequencing provides a powerful tool to overcome current limitations in molecular plant breeding, which will enable plant breeders to precisely pyramid favourable genes to develop super crop varieties to meet future food demands.

  3. Fluorescent nucleobases as tools for studying DNA and RNA

    NASA Astrophysics Data System (ADS)

    Xu, Wang; Chan, Ke Min; Kool, Eric T.

    2017-11-01

    Understanding the diversity of dynamic structures and functions of DNA and RNA in biology requires tools that can selectively and intimately probe these biomolecules. Synthetic fluorescent nucleobases that can be incorporated into nucleic acids alongside their natural counterparts have emerged as a powerful class of molecular reporters of location and environment. They are enabling new basic insights into DNA and RNA, and are facilitating a broad range of new technologies with chemical, biological and biomedical applications. In this Review, we will present a brief history of the development of fluorescent nucleobases and explore their utility as tools for addressing questions in biophysics, biochemistry and biology of nucleic acids. We provide chemical insights into the two main classes of these compounds: canonical and non-canonical nucleobases. A point-by-point discussion of the advantages and disadvantages of both types of fluorescent nucleobases is made, along with a perspective into the future challenges and outlook for this burgeoning field.

  4. Artificial Neural Networks Applications: from Aircraft Design Optimization to Orbiting Spacecraft On-board Environment Monitoring

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Lin, Paul P.

    2002-01-01

    This paper reviews some of the recent applications of artificial neural networks taken from various works performed by the authors over the last four years at the NASA Glenn Research Center. The paper focuses mainly on two areas. The first is the application of artificial neural networks in the design and optimization of aircraft/engine propulsion systems to shorten the overall design cycle; out of that specific application, a generic design tool was developed, which can be used for most design optimization processes. The second is the application of artificial neural networks in monitoring the microgravity quality on board the International Space Station, using on-board accelerometers for data acquisition. These two different applications are reviewed in this paper to show the broad applicability of artificial intelligence in various disciplines. The intent of this paper is not to give in-depth details of these two applications, but to show the need to combine different artificial intelligence techniques or algorithms in order to design an optimized or versatile system.

  5. Genomics of foodborne pathogens for microbial food safety.

    PubMed

    Allard, Marc W; Bell, Rebecca; Ferreira, Christina M; Gonzalez-Escalona, Narjol; Hoffmann, Maria; Muruvanda, Tim; Ottesen, Andrea; Ramachandran, Padmini; Reed, Elizabeth; Sharma, Shashi; Stevens, Eric; Timme, Ruth; Zheng, Jie; Brown, Eric W

    2018-02-01

    Whole genome sequencing (WGS) has been broadly used to provide detailed characterization of foodborne pathogens. These genomes for diverse species including Salmonella, Escherichia coli, Listeria, Campylobacter and Vibrio have provided great insight into the genetic make-up of these pathogens. Numerous government agencies, industry and academia have developed new applications in food safety using WGS approaches such as outbreak detection and characterization, source tracking, determining the root cause of a contamination event, profiling of virulence and pathogenicity attributes, antimicrobial resistance monitoring, quality assurance for microbiology testing, as well as many others. The future looks bright for additional applications that come with the new technologies and tools in genomics and metagenomics. Published by Elsevier Ltd.

  6. Low Cost and Flexible UAV Deployment of Sensors

    PubMed Central

    Sørensen, Lars Yndal; Jacobsen, Lars Toft; Hansen, John Paulin

    2017-01-01

    This paper presents a platform for airborne sensor applications using low-cost, open-source components carried by an easy-to-fly unmanned aerial vehicle (UAV). The system, available as open source, is designed for researchers, students and makers for a broad range of exploration and data-collection needs. The main contribution is the extensible architecture for modularized airborne sensor deployment and real-time data visualisation. Our open-source Android application provides data collection, flight path definition and map tools. Total cost of the system is below 800 dollars. The flexibility of the system is illustrated by mapping the location of Bluetooth beacons (iBeacons) on a ground field and by measuring water temperature in a lake. PMID:28098819

  7. Low Cost and Flexible UAV Deployment of Sensors.

    PubMed

    Sørensen, Lars Yndal; Jacobsen, Lars Toft; Hansen, John Paulin

    2017-01-14

    This paper presents a platform for airborne sensor applications using low-cost, open-source components carried by an easy-to-fly unmanned aerial vehicle (UAV). The system, available as open source, is designed for researchers, students and makers for a broad range of exploration and data-collection needs. The main contribution is the extensible architecture for modularized airborne sensor deployment and real-time data visualisation. Our open-source Android application provides data collection, flight path definition and map tools. Total cost of the system is below 800 dollars. The flexibility of the system is illustrated by mapping the location of Bluetooth beacons (iBeacons) on a ground field and by measuring water temperature in a lake.

  8. NASA CYGNSS Mission Applications Workshop

    NASA Technical Reports Server (NTRS)

    Amin, Aimee V. (Compiler); Murray, John J. (Editor); Stough, Timothy M. (Editor); Molthan, Andrew (Editor)

    2015-01-01

    NASA's Cyclone Global Navigation Satellite System, (CYGNSS), mission is a constellation of eight microsatellites that will measure surface winds in and near the inner cores of hurricanes, including regions beneath the eyewall and intense inner rainbands that could not previously be measured from space. The CYGNSS-measured wind fields, when combined with precipitation fields (e.g., produced by the Global Precipitation Measurement [GPM] core satellite and its constellation of precipitation imagers), will provide coupled observations of moist atmospheric thermodynamics and ocean surface response, enabling new insights into hurricane inner core dynamics and energetics. The outcomes of this workshop, which are detailed in this report, comprise two primary elements: (1) A report of workshop proceedings, and; (2) Detailed Applications Traceability Matrices with requirements and operational considerations to serve broadly for development of value-added tools, applications, and products.

  9. The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification.

    PubMed

    Wang, Hui; Shi, Tujin; Qian, Wei-Jun; Liu, Tao; Kagan, Jacob; Srivastava, Sudhir; Smith, Richard D; Rodland, Karin D; Camp, David G

    2016-01-01

    Mass spectrometry (MS)-based proteomics has become an indispensable tool with broad applications in systems biology and biomedical research. With recent advances in liquid chromatography (LC) and MS instrumentation, LC-MS is making increasingly significant contributions to clinical applications, especially in the area of cancer biomarker discovery and verification. To overcome challenges associated with analyses of clinical samples (for example, a wide dynamic range of protein concentrations in bodily fluids and the need to perform high throughput and accurate quantification of candidate biomarker proteins), significant efforts have been devoted to improve the overall performance of LC-MS-based clinical proteomics platforms. Reviewed here are the recent advances in LC-MS and its applications in cancer biomarker discovery and quantification, along with the potentials, limitations and future perspectives.

  10. A Vision for Better Health: Mass Spectrometry Imaging for Clinical Diagnostics

    PubMed Central

    Ye, Hui; Gemperline, Erin; Li, Lingjun

    2012-01-01

    Background Mass spectrometry imaging (MSI) is a powerful tool that grants the ability to investigate a broad mass range of molecules from small molecules to large proteins by creating detailed distribution maps of selected compounds. Its usefulness in biomarker discovery towards clinical applications has obtained success by correlating the molecular expression of tissues acquired from MSI with well-established histology. Results To date, MSI has demonstrated its versatility in clinical applications, such as biomarker diagnostics of different diseases, prognostics of disease severities and metabolic response to drug treatment, etc. These studies have provided significant insight in clinical studies over the years and current technical advances are further facilitating the improvement of this field. Although the underlying concept is simple, factors such as choice of ionization method, sample preparation, instrumentation and data analysis must be taken into account for successful applications of MSI. Herein, we briefly reviewed these key elements yet focused on the clinical applications of MSI that cannot be addressed by other means. Conclusions Challenges and future perspectives in this field are also discussed to conclude that the ever-growing applications with continuous development of this powerful analytical tool will lead to a better understanding of the biology of diseases and improvements in clinical diagnostics. PMID:23078851

  11. Decision support system development at the Upper Midwest Environmental Sciences Center

    USGS Publications Warehouse

    Fox, Timothy J.; Nelson, J. C.; Rohweder, Jason J.

    2014-01-01

    A Decision Support System (DSS) can be defined in many ways. The working definition used by the U.S. Geological Survey Upper Midwest Environmental Sciences Center (UMESC) is, “A spatially based computer application or data that assists a researcher or manager in making decisions.” This is quite a broad definition—and it needs to be, because the possibilities for types of DSSs are limited only by the user group and the developer’s imagination. There is no one DSS; the types of DSSs are as diverse as the problems they help solve. This diversity requires that DSSs be built in a variety of ways, using the most appropriate methods and tools for the individual application. The skills of potential DSS users vary widely as well, further necessitating multiple approaches to DSS development. Some small, highly trained user groups may want a powerful modeling tool with extensive functionality at the expense of ease of use. Other user groups less familiar with geographic information system (GIS) and spatial data may want an easy-to-use application for a nontechnical audience. UMESC has been developing DSSs for almost 20 years. Our DSS developers offer our partners a wide variety of technical skills and development options, ranging from the most simple Web page or small application to complex modeling application development.

  12. The NASA In-Space Propulsion Technology Project, Products, and Mission Applicability

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Pencil, Eric; Liou, Larry; Dankanich, John; Munk, Michelle M.; Kremic, Tibor

    2009-01-01

    The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved: guidance, navigation, and control models of blunt-body rigid aeroshells; atmospheric models for Earth, Titan, Mars, and Venus; and models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6 to 7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling, for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.

  13. NASA's In-Space Propulsion Technology Project Overview, Near-term Products and Mission Applicability

    NASA Technical Reports Server (NTRS)

    Dankanich, John; Anderson, David J.

    2008-01-01

    The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved (1) guidance, navigation, and control models of blunt-body rigid aeroshells, (2) atmospheric models for Earth, Titan, Mars, and Venus, and (3) models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling, for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.

  14. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involve wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mismatch (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies.
This analytical toolbox allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales, from the region to the stream segment, and include the ability to predict species-specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide a better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large- and small-scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  15. Novel synthesis and structural characterization of a high-affinity paramagnetic kinase probe for the identification of non-ATP site binders by nuclear magnetic resonance.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moy, Franklin J.; Lee, Arthur; Gavrin, Lori Krim

    2010-07-23

    To aid in the pursuit of selective kinase inhibitors, we have developed a unique ATP site binder tool for the detection of binders outside the ATP site by nuclear magnetic resonance (NMR). We report here the novel synthesis that led to this paramagnetic spin-labeled pyrazolopyrimidine probe (1), which exhibits nanomolar inhibitory activity against multiple kinases. We demonstrate the application of this probe by performing NMR binding experiments with Lck and Src kinases and utilize it to detect the binding of two compounds proximal to the ATP site. The complex structure of the probe with Lck is also presented, revealing how the probe fits in the ATP site and the specific interactions it has with the protein. We believe that this spin-labeled probe is a valuable tool that holds broad applicability in a screen for non-ATP site binders.

  16. Google Earth and Geo Applications: A Toolset for Viewing Earth's Geospatial Information

    NASA Astrophysics Data System (ADS)

    Tuxen-Bettman, K.

    2016-12-01

    Earth scientists measure and derive fundamental data that can be of broad general interest to the public and policy makers. Yet, one of the challenges that has always faced the Earth science community is how to present their data and findings in an easy-to-use and compelling manner. Google's Geo Tools offer an efficient and dynamic way for scientists, educators, journalists, and others to both access data and view or tell stories in a dynamic three-dimensional geospatial context. Google Earth in particular provides a dense canvas of satellite imagery on which rich vector and raster datasets can be viewed using the medium of Keyhole Markup Language (KML). Through KML, Google Earth can combine the analytical capabilities of Earth Engine, the collaborative mapping of My Maps, and the storytelling of Tour Builder and more to make Google's Geo Applications a coherent suite of tools for exploring our planet.
    https://earth.google.com/
    https://earthengine.google.com/
    https://mymaps.google.com/
    https://tourbuilder.withgoogle.com/
    https://www.google.com/streetview/
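    As the abstract notes, KML is the medium for layering vector data on the Google Earth imagery canvas. A minimal KML file with a single placemark (the name, description, and coordinates here are purely illustrative) can look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Hypothetical field site</name>
    <description>Illustrative point layered on the imagery canvas</description>
    <Point>
      <!-- KML coordinate order is longitude,latitude,altitude -->
      <coordinates>-122.08,37.42,0</coordinates>
    </Point>
  </Placemark>
</kml>
```

    Opening such a file in Google Earth drops the point onto the globe; richer documents add styles, polygons, ground overlays, and tours with the same element vocabulary.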

  17. Bridging the Gap between RF and Optical Patch Antenna Analysis via the Cavity Model.

    PubMed

    Unal, G S; Aksun, M I

    2015-11-02

    Although optical antennas with a variety of shapes and for a variety of applications have been proposed and studied, they are still in their infancy compared to their radio frequency (rf) counterparts. Optical antennas have mainly utilized the geometrical attributes of rf antennas rather than the analysis tools that have been the source of intuition for antenna engineers in rf. This study intends to narrow the gap of experience and intuition in the design of optical patch antennas by introducing an easy-to-understand and easy-to-implement analysis tool from rf, namely, the cavity model, into the optical regime. The importance of this approach lies not only in its simplicity of understanding and implementation but also in its applicability to a broad class of patch antennas and, more importantly, its ability to provide the intuition needed to predict the outcome without going through trial-and-error simulations that offer little or no intuitive guidance to the user.
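    A taste of why the cavity model builds intuition: it yields closed-form estimates that rf engineers reason with directly. The sketch below computes the dominant-mode (TM10) resonant frequency of a rectangular patch under the crudest cavity-model relation, ignoring fringing-field corrections; it illustrates the general rf technique, not the authors' optical formulation:

```python
from math import sqrt

C = 299_792_458.0  # speed of light in vacuum, m/s

def patch_resonance_hz(length_m, eps_r):
    """Dominant-mode (TM10) resonant frequency of a rectangular patch
    from the simplest cavity-model relation f = c / (2 L sqrt(eps_r)),
    with no fringing-field (effective-length) correction."""
    return C / (2.0 * length_m * sqrt(eps_r))

# A 30 mm patch on an eps_r = 4.4 substrate resonates near 2.4 GHz.
f = patch_resonance_hz(0.030, 4.4)
```

    In practice the fringing fields electrically lengthen the patch, so measured resonances sit a few percent below this estimate; the cavity model supplies that correction too, which is exactly the kind of guided reasoning the study ports into the optical regime.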

  18. Simple sequence repeat marker loci discovery using SSR primer.

    PubMed

    Robinson, Andrew J; Love, Christopher G; Batley, Jacqueline; Barker, Gary; Edwards, David

    2004-06-12

    Simple sequence repeats (SSRs) have become important molecular markers for a broad range of applications, such as genome mapping and characterization, phenotype mapping, marker-assisted selection of crop plants, and a range of molecular ecology and diversity studies. With the increase in the availability of DNA sequence information, an automated process to identify and design PCR primers for amplification of SSR loci would be a useful tool in plant breeding programs. We report an application that integrates SPUTNIK, an SSR repeat finder, with Primer3, a PCR primer design program, into one pipeline tool, SSR Primer. On submission of multiple FASTA-formatted sequences, the script screens each sequence for SSRs using SPUTNIK. The results are parsed to Primer3 for locus-specific primer design. The script makes use of a Web-based interface, enabling remote use. This program has been written in PERL and is freely available for non-commercial users by request from the authors. The Web-based version may be accessed at http://hornbill.cspp.latrobe.edu.au/
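    The pipeline delegates repeat finding to SPUTNIK, but the core detection step is easy to illustrate. The sketch below (a simplified stand-in, not the SPUTNIK algorithm) finds perfect tandem repeats using a back-reference regex:

```python
import re

def find_ssrs(seq, min_repeats=4, max_motif=6):
    """Return (start, motif, repeat_count) for each perfect tandem
    repeat of a 1..max_motif base motif repeated >= min_repeats times."""
    seq = seq.upper()
    ssrs = []
    for motif_len in range(1, max_motif + 1):
        # The \1 back-reference forces exact repetition of the captured motif.
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (motif_len, min_repeats - 1))
        for m in pattern.finditer(seq):
            ssrs.append((m.start(), m.group(1), len(m.group(0)) // motif_len))
    return sorted(ssrs)

# The (AT)4 microsatellite in this fragment is reported at offset 2.
hits = find_ssrs("GGATATATATGG")
```

    A full pipeline would then pass the repeat coordinates, together with flanking sequence, to Primer3 for locus-specific primer design.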

  19. Numerical simulation of controlled directional solidification under microgravity conditions

    NASA Astrophysics Data System (ADS)

    Holl, S.; Roos, D.; Wein, J.

    The computer-assisted simulation of solidification processes influenced by gravity has gained importance in recent years in ground-based as well as microgravity research. Depending on the specific needs of the investigator, the simulation model ideally covers a broad spectrum of applications. These primarily include the optimization of furnace design in interaction with selected process parameters to meet the desired crystallization conditions. Different approaches concerning the complexity of the simulation models as well as their dedicated applications will be discussed in this paper. Special emphasis will be put on the potential of software tools to increase the scientific quality and cost-efficiency of microgravity experimentation. The results gained so far in the context of TEXUS, FSLP, D-1, and D-2 (preparatory program) experiments will be discussed, highlighting their simulation-supported preparation and evaluation. An outlook will then be given on the possibilities to enhance the efficiency of pre-industrial research in the Columbus era through the incorporation of suitable simulation methods and tools.

  20. Ground-water models as a management tool in Florida

    USGS Publications Warehouse

    Hutchinson, C.B.

    1984-01-01

    Highly sophisticated computer models provide powerful tools for analyzing historic data and for simulating future water levels, water movement, and water chemistry under stressed conditions throughout the ground-water system in Florida. Models that simulate the movement of heat and subsidence of land in response to aquifer pumping also have potential for application to hydrologic problems in the State. Florida, with 20 ground-water modeling studies reported since 1972, has applied computer modeling techniques to a variety of water-resources problems. Models in Florida generally have been used to provide insight into problems of water supply, contamination, and impact on the environment. The model applications range from site-specific studies, such as estimating contamination by wastewater injection at St. Petersburg, to a regional model of the entire State that may be used to assess broad-scale environmental impact of water-resources development. Recently, ground-water models have been used as management tools by the State regulatory authority to permit or deny development of water resources. As modeling precision, knowledge, and confidence increase, the use of ground-water models will shift more and more toward regulation of development and enforcement of environmental laws. (USGS)

  1. Wittgenstein and the limits of empathic understanding in psychopathology.

    PubMed

    Thornton, Tim

    2004-08-01

    The aim of this paper is three-fold. Firstly, to briefly set out how strategic choices made about theorising about intentionality or content have actions at a distance for accounting for delusion. Secondly, to investigate how successfully a general difficulty facing a broadly interpretative approach to delusions might be eased by the application of any of three Wittgensteinian interpretative tools. Thirdly, to draw a general moral about how the later Wittgenstein gives more reason to be pessimistic than optimistic about the prospects of a philosophical psychopathology aimed at empathic understanding of delusions.

  2. Blood Pressure Control

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Engineering Development Lab., Inc.'s E-2000 Neck Baro Reflex System was developed for cardiovascular studies of astronauts. It is regularly used on Space Shuttle Missions, and a parallel version has been developed as a research tool to facilitate studies of blood pressure reflex controls in patients with congestive heart failure, diabetes, etc. An advanced version, the PPC-1000, was developed in 1991, and the technology has been refined substantially. The PPC provides an accurate means of generating pressure for a broad array of laboratory applications. An improved version, the E2010 Barosystem, is anticipated.

  3. Enhanced image capture through fusion

    NASA Technical Reports Server (NTRS)

    Burt, Peter J.; Hanna, Keith; Kolczynski, Raymond J.

    1993-01-01

    Image fusion may be used to combine images from different sensors, such as IR and visible cameras, to obtain a single composite with extended information content. Fusion may also be used to combine multiple images from a given sensor to form a composite image in which information of interest is enhanced. We present a general method for performing image fusion and show that this method is effective for diverse fusion applications. We suggest that fusion may provide a powerful tool for enhanced image capture with broad utility in image processing and computer vision.

  4. TQM: A bibliography with abstracts. [total quality management

    NASA Technical Reports Server (NTRS)

    Gottlich, Gretchen L. (Editor)

    1992-01-01

    This document is designed to function as a special resource for NASA Langley scientists, engineers, and managers during the introduction and development of total quality management (TQM) practices at the Center. It lists approximately 300 bibliographic citations for articles and reports dealing with various aspects of TQM. Abstracts are also available for the majority of the citations. Citations are organized by broad subject areas, including case studies, customer service, senior management, leadership, communication tools, TQM basics, applications, and implementation. An introduction and indexes provide additional information on arrangement and availability of these materials.

  5. Novel scintillators and silicon photomultipliers for nuclear physics and applications

    NASA Astrophysics Data System (ADS)

    Jenkins, David

    2015-06-01

    Until comparatively recently, scintillator detectors were seen as an old-fashioned tool of nuclear physics, with more attention being given to areas such as gamma-ray tracking using high-purity germanium detectors. Next-generation scintillator detectors, such as lanthanum bromide, which were developed for the demands of space science and gamma-ray telescopes, are found to have strong applicability to low-energy nuclear physics. Their excellent timing resolution makes them very suitable for fast timing measurements, and their much improved energy resolution compared to conventional scintillators promises to open up new avenues in nuclear physics research that are presently hard to access. Such "medium-resolution" spectroscopy has broad relevance across several areas of contemporary interest, such as the study of nuclear giant resonances. In addition to the connections to space science, it is striking that the demands of contemporary medical imaging have strong overlap with those of experimental nuclear physics. An example is the interest in PET-MRI combined imaging, which requires putting scintillator detectors in a high-magnetic-field environment. This has led to strong advances in the area of silicon photomultipliers, a solid-state replacement for photomultiplier tubes that is insensitive to magnetic fields. Broad application of this technology to nuclear physics may be foreseen.

  6. GLOBAL REFERENCE ATMOSPHERIC MODELS FOR AEROASSIST APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Duvall, Aleta; Justus, C. G.; Keller, Vernon W.

    2005-01-01

    Aeroassist is a broad category of advanced transportation technology encompassing aerocapture, aerobraking, aeroentry, precision landing, hazard detection and avoidance, and aerogravity assist. The eight destinations in the Solar System with sufficient atmosphere to enable aeroassist technology are Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Saturn's moon Titan. Engineering-level atmospheric models for five of these targets - Earth, Mars, Titan, Neptune, and Venus - have been developed at NASA's Marshall Space Flight Center. These models are useful as tools in mission planning and systems analysis studies associated with aeroassist applications. The series of models is collectively named the Global Reference Atmospheric Model or GRAM series. An important capability of all the models in the GRAM series is their ability to simulate quasi-random perturbations for Monte Carlo analysis in developing guidance, navigation and control algorithms, for aerothermal design, and for other applications sensitive to atmospheric variability. Recent example applications are discussed.

  7. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    PubMed

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides numerical results as a comma-separated value (CSV) file or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.

  8. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis

    PubMed Central

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides numerical results as a comma-separated value (CSV) file or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/ PMID:26571275
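    To make the notion of a centrality index concrete, the sketch below computes two of the simplest of the many indices CentiServer catalogs, degree and closeness centrality, in pure Python (an illustration of the textbook definitions, not the centiserve implementation):

```python
from collections import deque

def degree_centrality(adj):
    """Degree centrality: the fraction of the other n-1 nodes that
    each node is directly connected to."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj):
    """Closeness centrality: (n-1) divided by the sum of shortest-path
    distances from the node to all others (BFS on an unweighted graph)."""
    result = {}
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total = sum(dist.values())
        result[src] = (len(dist) - 1) / total if total else 0.0
    return result

# Star graph: the hub "a" is maximally central by both measures.
star = {"a": ["b", "c", "d"], "b": ["a"], "c": ["a"], "d": ["a"]}
```

    In a biological network the high-scoring nodes under such indices are the candidate essential genes or proteins that the abstract refers to.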

  9. Management and Research Applications of Long-range Surveillance Radar Data for Birds, Bats, and Flying Insects

    USGS Publications Warehouse

    Ruth, Janet M.; Buler, Jeffrey J.; Diehl, Robert H.; Sojda, Richard S.

    2008-01-01

    There is renewed interest in using long-range surveillance radar as a biological research tool due to substantial improvements in the network of radars within the United States. Technical improvements, the digital nature of the radar data, and the availability of computing power and geographic information systems, enable a broad range of biological applications. This publication provides a summary of long-range surveillance radar technology and applications of these data to questions about movement patterns of birds and other flying wildlife. The intended audience is potential radar-data end users, including natural-resource management and regulatory agencies, conservation organizations, and industry. This summary includes a definition of long-range surveillance radar, descriptions of its strengths and weaknesses, information on applications of the data, cost, methods of calibration, and what end users need to do, and some key references and resources.

  10. Ultrasound in athletes: emerging techniques in point-of-care practice.

    PubMed

    Yim, Eugene S; Corrado, Gianmichel

    2012-01-01

    Ultrasound offers sports medicine clinicians the potential to diagnose, treat, and manage a broad spectrum of conditions afflicting athletes. This review article highlights applications of ultrasound that hold promise as point-of-care diagnostics and therapeutic tools that can be used directly by clinicians to direct real-time management of athletes. Point-of-care ultrasound has been examined most in the context of musculoskeletal disorders in athletes, with attention given to Achilles tendinopathy, patellar tendinopathy, hip and thigh pathology, elbow tendinopathy, wrist pathology, and shoulder pain. More research has focused on therapeutic applications than diagnostic, but initial evidence has been generated in both. Preliminary evidence has been published also on abdominal ultrasound for splenic enlargement in mononucleosis, cardiopulmonary processes and hydration status, deep vein thrombosis, and bone mineral density. Further research will be required to validate these applications and to explore further applications of portable ultrasound that can be used in the care of athletes.

  11. TetrUSS Capabilities for S and C Applications

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Parikh, Paresh

    2004-01-01

    TetrUSS is a suite of loosely coupled computational fluid dynamics software that is packaged into a complete flow analysis system. The system components consist of tools for geometry setup, grid generation, flow solution, visualization, and various utility tasks. Development began in 1990, and it has evolved into a proven and stable system for Euler and Navier-Stokes analysis and design of unconventional configurations. It (1) is well developed and validated, (2) has a broad base of support, and (3) is presently a workhorse code because of the level of confidence that has been established through wide use. The entire system can now run on Linux or Mac architectures. In the following slides, I will highlight more of the features of the VGRID and USM3D codes.

  12. Deciphering Phosphotyrosine-Dependent Signaling Networks in Cancer by SH2 Profiling

    PubMed Central

    Machida, Kazuya; Khenkhar, Malik

    2012-01-01

    It has been a decade since the introduction of SH2 profiling, a modular domain-based molecular diagnostics tool. This review covers the original concept of SH2 profiling, different analytical platforms, and their applications, from the detailed analysis of single proteins to broad screening in translational research. Illustrated by practical examples, we discuss the uniqueness and advantages of the approach as well as its limitations and challenges. We provide guidance for basic researchers and oncologists who may consider SH2 profiling in their respective cancer research, especially for those focusing on tyrosine phosphoproteomics. SH2 profiling can serve as an alternative phosphoproteomics tool to dissect aberrant tyrosine kinase pathways responsible for individual malignancies, with the goal of facilitating personalized diagnostics for the treatment of cancer. PMID:23226573

  13. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications.

    PubMed

    Christen, Matthias; Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remains a labor-intensive process. Given the complexity, computer-assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software, implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult-to-synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable for translating DNA designs into ready-to-order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner.
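    The Genome Partitioner computes an optimized retrosynthetic route; the elementary step it automates, fragmenting a long design into fabricable blocks whose shared ends serve as homology regions for assembly, can be sketched as follows (the block size and overlap are illustrative defaults, not the tool's actual parameters):

```python
def partition(seq, block_size=1000, overlap=100):
    """Split a DNA design into synthesizable blocks of at most
    block_size bases; adjacent blocks share `overlap` bases so the
    pieces can be stitched back together by homology-based assembly."""
    if overlap >= block_size:
        raise ValueError("overlap must be smaller than block_size")
    step = block_size - overlap
    blocks, i = [], 0
    while i < len(seq):
        blocks.append(seq[i:i + block_size])
        if i + block_size >= len(seq):
            break
        i += step
    return blocks

# A 2.5 kb design splits into three blocks of 1000, 1000, and 700 bases.
design = "ACGT" * 625
blocks = partition(design)
```

    Dropping the first `overlap` bases of every block after the first reconstructs the original sequence, a handy invariant to assert when validating a partitioning scheme.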

  14. 10 CFR 33.16 - Application for other specific licenses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....16 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.16 Application for other specific licenses. An application filed pursuant to part 30 of this chapter for a specific license other than one of broad scope will be...

  15. 10 CFR 33.16 - Application for other specific licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....16 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.16 Application for other specific licenses. An application filed pursuant to part 30 of this chapter for a specific license other than one of broad scope will be...

  16. 10 CFR 33.16 - Application for other specific licenses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....16 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.16 Application for other specific licenses. An application filed pursuant to part 30 of this chapter for a specific license other than one of broad scope will be...

  17. 10 CFR 33.16 - Application for other specific licenses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....16 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.16 Application for other specific licenses. An application filed pursuant to part 30 of this chapter for a specific license other than one of broad scope will be...

  18. 10 CFR 33.16 - Application for other specific licenses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....16 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.16 Application for other specific licenses. An application filed pursuant to part 30 of this chapter for a specific license other than one of broad scope will be...

  19. Portable optical-resolution photoacoustic microscopy for volumetric imaging of multiscale organisms.

    PubMed

    Jin, Tian; Guo, Heng; Yao, Lei; Xie, Huikai; Jiang, Huabei; Xi, Lei

    2018-04-01

    Photoacoustic microscopy (PAM) provides a fundamentally new tool for a broad range of studies of biological structures and functions. However, the use of PAM has been largely limited to small vertebrates due to the size, weight, and inconvenience of the equipment. Here, we describe a portable optical-resolution photoacoustic microscopy (pORPAM) system for 3-dimensional (3D) imaging of small-to-large rodents and humans with a high spatiotemporal resolution and a large field of view. We show extensive applications of pORPAM to multiscale animals including mice and rabbits. In addition, we image the 3D vascular networks of human lips, and demonstrate the feasibility of pORPAM to observe the recovery process of oral ulcers and cancer-associated capillary loops in human oral cavities. This technology is promising for broad biomedical studies from fundamental biology to clinical diseases. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. The Race To X-ray Microbeam and Nanobeam Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ice, Gene E; Budai, John D; Pang, Judy

    2011-01-01

    X-ray microbeams are an emerging characterization tool with transformational implications for broad areas of science ranging from materials structure and dynamics, geophysics and environmental science to biophysics and protein crystallography. In this review, we discuss the race toward sub-10 nm x-ray beams with the ability to penetrate tens to hundreds of microns into most materials and with the ability to determine local (crystal) structure. Examples of science enabled by current micro/nanobeam technologies are presented and we provide a perspective on future directions. Applications highlighted are chosen to illustrate the important features of various submicron beam strategies and to highlight the directions of current and future research. While it is clear that x-ray microprobes will impact science broadly, the practical limit for hard x-ray beam size, the limit to trace element sensitivity, and the ultimate limitations associated with near-atomic structure determinations are the subject of ongoing research.

  1. Challenges and Opportunities in Interdisciplinary Materials Research Experiences for Undergraduates

    NASA Astrophysics Data System (ADS)

    Vohra, Yogesh; Nordlund, Thomas

    2009-03-01

    The University of Alabama at Birmingham (UAB) offers a broad range of interdisciplinary materials research experiences to undergraduate students with diverse backgrounds in physics, chemistry, applied mathematics, and engineering. The research projects offered cover a broad range of topics including high pressure physics, microelectronic materials, nano-materials, laser materials, bioceramics and biopolymers, cell-biomaterials interactions, planetary materials, and computer simulation of materials. The students welcome the opportunity to work with an interdisciplinary team of basic science, engineering, and biomedical faculty, but the challenge is in learning the key vocabulary for interdisciplinary collaborations, mastering experimental tools, and working in an independent capacity. The career development workshops dealing with the graduate school application process and entrepreneurial business activities were found to be most effective. The interdisciplinary university-wide poster session helped students broaden their horizons in research careers. The synergy of the REU program with other concurrently running high school summer programs on the UAB campus will also be discussed.

  2. Geomagnetically induced currents: Science, engineering, and applications readiness

    NASA Astrophysics Data System (ADS)

    Pulkkinen, A.; Bernabeu, E.; Thomson, A.; Viljanen, A.; Pirjola, R.; Boteler, D.; Eichner, J.; Cilliers, P. J.; Welling, D.; Savani, N. P.; Weigel, R. S.; Love, J. J.; Balch, C.; Ngwira, C. M.; Crowley, G.; Schultz, A.; Kataoka, R.; Anderson, B.; Fugate, D.; Simpson, J. J.; MacAlester, M.

    2017-07-01

    This paper is the primary deliverable of the very first NASA Living With a Star Institute Working Group, Geomagnetically Induced Currents (GIC) Working Group. The paper provides a broad overview of the current status and future challenges pertaining to the science, engineering, and applications of the GIC problem. Science is understood here as the basic space and Earth sciences research that allows improved understanding and physics-based modeling of the physical processes behind GIC. Engineering, in turn, is understood here as the "impact" aspect of GIC. Applications are understood as the models, tools, and activities that can provide actionable information to entities such as power systems operators for mitigating the effects of GIC and government agencies for managing any potential consequences from GIC impact to critical infrastructure. Applications can be considered the ultimate goal of our GIC work. In assessing the status of the field, we quantify the readiness of various applications in the mitigation context. We use the Applications Readiness Level (ARL) concept to carry out the quantification.

  3. Geomagnetically induced currents: Science, engineering, and applications readiness

    USGS Publications Warehouse

    Pulkkinen, Antti; Bernabeu, E.; Thomson, A.; Viljanen, A.; Pirjola, R.; Boteler, D.; Eichner, J.; Cilliers, P.J.; Welling, D.; Savani, N.P.; Weigel, R.S.; Love, Jeffrey J.; Balch, Christopher; Ngwira, C.M.; Crowley, G.; Schultz, Adam; Kataoka, R.; Anderson, B.; Fugate, D.; Simpson, J.J.; MacAlester, M.

    2017-01-01

    This paper is the primary deliverable of the very first NASA Living With a Star Institute Working Group, Geomagnetically Induced Currents (GIC) Working Group. The paper provides a broad overview of the current status and future challenges pertaining to the science, engineering, and applications of the GIC problem. Science is understood here as the basic space and Earth sciences research that allows improved understanding and physics-based modeling of the physical processes behind GIC. Engineering, in turn, is understood here as the “impact” aspect of GIC. Applications are understood as the models, tools, and activities that can provide actionable information to entities such as power systems operators for mitigating the effects of GIC and government agencies for managing any potential consequences from GIC impact to critical infrastructure. Applications can be considered the ultimate goal of our GIC work. In assessing the status of the field, we quantify the readiness of various applications in the mitigation context. We use the Applications Readiness Level (ARL) concept to carry out the quantification.

  4. Vacuum system transient simulator and its application to TFTR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sredniawski, J.

    The vacuum system transient simulator (VSTS) models transient gas transport throughout complex networks of ducts, valves, traps, vacuum pumps, and other related vacuum system components. VSTS is capable of treating gas models of up to 10 species, for all flow regimes from pure molecular to continuum. Viscous interactions between species are considered as well as non-uniform temperature of a system. Although this program was specifically developed for use on the Tokamak Fusion Test Reactor (TFTR) project at Princeton, it is a generalized tool capable of handling a broad range of vacuum system problems. During the TFTR engineering design phase, VSTS has been used in many applications. Two applications selected for presentation are: torus vacuum pumping system performance between 400 Ci tritium pulses and tritium backstreaming to neutral beams during pulses.

  5. Use of StreamStats in the Upper French Broad River Basin, North Carolina: A Pilot Water-Resources Web Application

    USGS Publications Warehouse

    Wagner, Chad R.; Tighe, Kirsten C.; Terziotti, Silvia

    2009-01-01

    StreamStats is a Web-based Geographic Information System (GIS) application that was developed by the U.S. Geological Survey (USGS) in cooperation with Environmental Systems Research Institute, Inc. (ESRI) to provide access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and selected ungaged sites. StreamStats also allows users to identify stream reaches upstream and downstream from user-selected sites and obtain information for locations along streams where activities occur that can affect streamflow conditions. This functionality can be accessed through a map-based interface with the user's Web browser or through individual functions requested remotely through other Web applications.

  6. No perfect tools: trade-offs of sustainability principles and user requirements in designing support tools for land-use decisions between greenfields and brownfields.

    PubMed

    Bartke, Stephan; Schwarze, Reimund

    2015-04-15

    The EU Soil Thematic Strategy calls for the application of sustainability concepts and methods as part of an integrated policy to prevent soil degradation and to increase the re-use of brownfields. Although certain general principles have been proposed for the evaluation of sustainable development, the practical application of sustainability assessment tools (SATs) is contingent on the actual requirements of tool users, e.g. planners or investors, who must take up such instruments in actual decision making. We examine the normative sustainability principles that need to be taken into account in order to make sound land-use decisions between new development on greenfield sites and the regeneration of brownfields, and relate these principles to empirically observed user requirements and the properties of available SATs. In this way we provide an overview of approaches to sustainability assessment. Three stylized approaches, represented in each case by a typical tool selected from the literature, are presented and contrasted with (1) the norm-oriented Bellagio sustainability principles and (2) the requirements of three different stakeholder groups: decision makers, scientists/experts and representatives of the general public. The paper disentangles some of the inevitable trade-offs involved in seeking to implement sustainable land-use planning, i.e. between norm orientation and holism, broad participation and effective communication. It concludes with the controversial assessment that there are no perfect tools and that to be meaningful the user requirements of decision makers must take precedence over those of other interest groups in the design of SATs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. NASA Smart Surgical Probe Project

    NASA Technical Reports Server (NTRS)

    Mah, Robert W.; Andrews, Russell J.; Jeffrey, Stefanie S.; Guerrero, Michael; Papasin, Richard; Koga, Dennis (Technical Monitor)

    2002-01-01

    Information technologies being developed by NASA to assist astronaut-physicians in responding to medical emergencies during long space flights are being employed for the improvement of women's health in the form of a "smart surgical probe". This technology, initially developed for neurosurgery applications, not only has enormous potential for the diagnosis and treatment of breast cancer, but broad applicability to a wide range of medical challenges. For the breast cancer application, the smart surgical probe is being designed to "see" a suspicious lump, determine by its features if it is cancerous, and ultimately predict how the disease may progress. A revolutionary early breast cancer detection tool based on this technology has been developed by a commercial company and is being tested in human clinical trials at the University of California at Davis, School of Medicine. The smart surgical probe technology makes use of adaptive intelligent software (hybrid neural network/fuzzy logic algorithms) with the most advanced physiologic sensors to provide real-time in vivo tissue characterization for the detection, diagnosis and treatment of tumors, including determination of tumor microenvironment and evaluation of tumor margins. The software solutions and tools from these medical applications will lead to the development of better real-time minimally invasive smart surgical probes for emergency medical care and treatment of astronauts on long space flights.

  8. Nanodiscs in Membrane Biochemistry and Biophysics.

    PubMed

    Denisov, Ilia G; Sligar, Stephen G

    2017-03-22

    Membrane proteins play an essential part in metabolism, signaling, cell motility, transport, development, and many other biochemical and biophysical processes that constitute the fundamentals of life at the molecular level. Detailed understanding of these processes is necessary for the progress of life sciences and biomedical applications. Nanodiscs provide a new and powerful tool for a broad spectrum of biochemical and biophysical studies of membrane proteins and are commonly acknowledged as an optimal membrane mimetic system that provides control over size, composition, and specific functional modifications on the nanometer scale. In this review we attempt to combine a comprehensive list of various applications of nanodisc technology with a systematic analysis of the most attractive features of this system and the advantages provided by nanodiscs for structural and mechanistic studies of membrane proteins.

  9. Strategies used to guide the design and implementation of a national river monitoring programme in South Africa.

    PubMed

    Roux, D J

    2001-06-01

    This article explores the strategies that were, and are being, used to facilitate the transition from scientific development to operational application of the South African River Health Programme (RHP). Theoretical models from the field of the management of technology are used to provide insight into the dynamics that influence the relationship between the creation and application of environmental programmes, and the RHP in particular. Four key components of the RHP design are analysed, namely the (a) guiding team, (b) concepts, tools and methods, (c) infra-structural innovations and (d) communication. These key components evolved over three broad life stages of the programme, which are called the design, growth and anchoring stages.

  10. Study of application of adaptive systems to the exploration of the solar system. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The field of artificial intelligence is examined to identify practical applications for unmanned spacecraft used to explore the solar system in the decade of the 1980s. If an unmanned spacecraft can be made to adjust or adapt to the environment, to make decisions about what it measures and how it uses and reports the data, it can become a much more powerful tool for the science community in unlocking the secrets of the solar system. Within this definition of an adaptive spacecraft or system, there is a broad range of variability. In terms of sophistication, an adaptive system can be extremely simple or as complex as a chess-playing machine that learns from its mistakes.

  11. 2nd Congress on applied synthetic biology in Europe (Málaga, Spain, November 2013).

    PubMed

    Vetter, Beatrice V; Pantidos, Nikolaos; Edmundson, Matthew

    2014-05-25

    The second meeting organised by the EFB on the advances of applied synthetic biology in Europe was held in Málaga, Spain in November 2013. The potential for the broad application of synthetic biology was reflected in the five sessions of this meeting: synthetic biology for healthcare applications, tools and technologies for synthetic biology, production of recombinant proteins, synthetic plant biology, and biofuels and other small molecules. Outcomes from the meeting were that synthetic biology offers methods for rapid development of new strains that will result in decreased production costs, sustainable chemical production and new medical applications. It also introduced novel ways to produce sustainable energy and biofuels, to find new alternatives for bioremediation and resource recovery, and environmentally friendly foodstuff production. All the above-mentioned advances could enable biotechnology to solve some of the major problems of society. However, while there are still limitations in terms of lacking tools, standardisation and suitable host organisms, this meeting has laid a foundation providing cutting-edge concepts and techniques to ultimately convert the potential of synthetic biology into practice. Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  12. Modeling and performance assessment in QinetiQ of EO and IR airborne reconnaissance systems

    NASA Astrophysics Data System (ADS)

    Williams, John W.; Potter, Gary E.

    2002-11-01

    QinetiQ are the technical authority responsible for specifying the performance requirements for the procurement of airborne reconnaissance systems, on behalf of the UK MoD. They are also responsible for acceptance of delivered systems, overseeing and verifying the installed system performance as predicted and then assessed by the contractor. Measures of functional capability are central to these activities. The conduct of these activities utilises the broad technical insight and wide range of analysis tools and models available within QinetiQ. This paper focuses on the tools, methods and models that are applicable to systems based on EO and IR sensors. The tools, methods and models are described, and representative output for systems that QinetiQ has been responsible for is presented. The principal capability applicable to EO and IR airborne reconnaissance systems is the STAR (Simulation Tools for Airborne Reconnaissance) suite of models. STAR generates predictions of performance measures such as GRD (Ground Resolved Distance) and GIQE (General Image Quality Equation) NIIRS (National Imagery Interpretation Rating Scales). It also generates images representing sensor output, using the scene generation software CAMEO-SIM and the imaging sensor model EMERALD. The simulated image 'quality' is fully correlated with the predicted non-imaging performance measures. STAR also generates image and table data that is compliant with STANAG 7023, which may be used to test ground station functionality.
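
    As background on the image-quality measures named above: one widely published form of the General Image Quality Equation (version 4) maps sensor and image-chain parameters to a predicted NIIRS rating. The sketch below implements that published form; the coefficients and the example inputs come from the open literature, not from the QinetiQ STAR models, and should be checked against the current GIQE definition before any serious use.

    ```python
    import math

    def giqe4_niirs(gsd_in, rer, h=1.0, g=1.0, snr=50.0):
        """Predicted NIIRS rating using one published form of GIQE 4.
        gsd_in: ground sample distance (inches); rer: relative edge response;
        h: edge overshoot; g: noise gain; snr: signal-to-noise ratio."""
        # In the published form the coefficient pair switches at GSD = 2.0 in.
        a, b = (3.32, 1.559) if gsd_in >= 2.0 else (3.16, 2.817)
        return (10.251 - a * math.log10(gsd_in) + b * math.log10(rer)
                - 0.656 * h - 0.344 * (g / snr))

    # Coarser GSD lowers the predicted interpretability rating:
    print(round(giqe4_niirs(gsd_in=12.0, rer=0.9), 2))
    print(round(giqe4_niirs(gsd_in=1.0, rer=0.9), 2))
    ```

    The dominant term is the log of the ground sample distance, which is why GRD/GSD and NIIRS predictions tend to move together in tools such as STAR.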

  13. Foodomics and Food Safety: Where We Are.

    PubMed

    Andjelković, Uroš; Šrajer Gajdošik, Martina; Gašo-Sokač, Dajana; Martinović, Tamara; Josić, Djuro

    2017-09-01

    The power of foodomics as a discipline that is now broadly used for quality assurance of food products and adulteration identification, as well as for determining the safety of food, is presented. Concerning sample preparation and application, maintenance of highly sophisticated instruments for both high-performance and high-throughput techniques, and analysis and data interpretation, special attention has to be paid to the development of skilled analysts. The obtained data shall be integrated under a strong bioinformatics environment. Modern mass spectrometry is an extremely powerful analytical tool since it can provide direct qualitative and quantitative information about a molecule of interest from only a minute amount of sample. Quality of this information is influenced by the sample preparation procedure, the type of mass spectrometer used and the analyst's skills. Technical advances are bringing new instruments of increased sensitivity, resolution and speed to the market. Other methods presented here give additional information and can be used as complementary tools to mass spectrometry or for validation of obtained results. Genomics and transcriptomics, as well as affinity-based methods, still have a broad use in food analysis. Serious drawbacks of some of them, especially the affinity-based methods, are the cross-reactivity between similar molecules and the influence of complex food matrices. However, these techniques can be used for pre-screening in order to reduce the large number of samples. Great progress has been made in the application of bioinformatics in foodomics. These developments enabled processing of large amounts of generated data for both identification and quantification, and for corresponding modeling.

  14. Foodomics and Food Safety: Where We Are

    PubMed Central

    Andjelković, Uroš

    2017-01-01

    Summary The power of foodomics as a discipline that is now broadly used for quality assurance of food products and adulteration identification, as well as for determining the safety of food, is presented. Concerning sample preparation and application, maintenance of highly sophisticated instruments for both high-performance and high-throughput techniques, and analysis and data interpretation, special attention has to be paid to the development of skilled analysts. The obtained data shall be integrated under a strong bioinformatics environment. Modern mass spectrometry is an extremely powerful analytical tool since it can provide direct qualitative and quantitative information about a molecule of interest from only a minute amount of sample. Quality of this information is influenced by the sample preparation procedure, the type of mass spectrometer used and the analyst’s skills. Technical advances are bringing new instruments of increased sensitivity, resolution and speed to the market. Other methods presented here give additional information and can be used as complementary tools to mass spectrometry or for validation of obtained results. Genomics and transcriptomics, as well as affinity-based methods, still have a broad use in food analysis. Serious drawbacks of some of them, especially the affinity-based methods, are the cross-reactivity between similar molecules and the influence of complex food matrices. However, these techniques can be used for pre-screening in order to reduce the large number of samples. Great progress has been made in the application of bioinformatics in foodomics. These developments enabled processing of large amounts of generated data for both identification and quantification, and for corresponding modeling. PMID:29089845

  15. A user-friendly workflow for analysis of Illumina gene expression bead array data available at the arrayanalysis.org portal.

    PubMed

    Eijssen, Lars M T; Goelela, Varshna S; Kelder, Thomas; Adriaens, Michiel E; Evelo, Chris T; Radonjic, Marijana

    2015-06-30

    Illumina whole-genome expression bead arrays are a widely used platform for transcriptomics. Most of the tools available for the analysis of the resulting data are not easily accessible to less experienced users. ArrayAnalysis.org provides researchers with an easy-to-use and comprehensive interface to the functionality of R and Bioconductor packages for microarray data analysis. As a modular open source project, it allows developers to contribute modules that provide support for additional types of data or extend workflows. To enable data analysis of Illumina bead arrays for a broad user community, we have developed a module for ArrayAnalysis.org that provides a free and user-friendly web interface for quality control and pre-processing for these arrays. This module can be used together with existing modules for statistical and pathway analysis to provide a full workflow for Illumina gene expression data analysis. The module accepts data exported from Illumina's GenomeStudio, and provides the user with quality control plots and normalized data. The outputs are directly linked to the existing statistics module of ArrayAnalysis.org, but can also be downloaded for further downstream analysis in third-party tools. The Illumina bead array analysis module is available at http://www.arrayanalysis.org. A user guide, a tutorial demonstrating the analysis of an example dataset, and R scripts are available. The module can be used as a starting point for statistical evaluation and pathway analysis provided on the website or to generate processed input data for a broad range of applications in life sciences research.
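
    A central pre-processing step in such workflows is normalization across samples. The fragment below sketches plain quantile normalization in Python purely for illustration; the module itself relies on R/Bioconductor packages, and this simplified version ignores ties between probe values.

    ```python
    def quantile_normalize(matrix):
        """Quantile-normalize a probes-x-samples matrix (list of rows):
        every sample (column) is forced onto the same reference distribution,
        the row-wise mean of the per-sample sorted values."""
        nrows, ncols = len(matrix), len(matrix[0])
        cols = [[matrix[i][j] for i in range(nrows)] for j in range(ncols)]
        # Reference distribution: mean of the k-th smallest value per sample.
        ref = [sum(vals) / ncols for vals in zip(*(sorted(c) for c in cols))]
        out = [[0.0] * ncols for _ in range(nrows)]
        for j, col in enumerate(cols):
            for rank, i in enumerate(sorted(range(nrows), key=lambda i: col[i])):
                out[i][j] = ref[rank]   # probe keeps its rank, gets ref value
        return out

    print(quantile_normalize([[5, 4], [2, 1], [3, 6]]))
    # [[5.5, 3.5], [1.5, 1.5], [3.5, 5.5]]
    ```

    After normalization each sample shares an identical value distribution, so remaining between-sample differences reflect probe rank, not array-wide intensity shifts.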

  16. LoRTE: Detecting transposon-induced genomic variants using low coverage PacBio long read sequences.

    PubMed

    Disdero, Eric; Filée, Jonathan

    2017-01-01

    Population genomic analysis of transposable elements has greatly benefited from recent advances of sequencing technologies. However, the short size of the reads and the propensity of transposable elements to nest in highly repeated regions of genomes limit the efficiency of bioinformatic tools when Illumina or 454 technologies are used. Fortunately, long read sequencing technologies generating read lengths that may span the entire length of full transposons are now available. However, existing TE population genomics software was not designed to handle long reads, and new dedicated tools are needed. LoRTE is the first tool able to use PacBio long read sequences to identify transposon deletions and insertions between a reference genome and genomes of different strains or populations. Tested against simulated and genuine Drosophila melanogaster PacBio datasets, LoRTE appears to be a reliable and broadly applicable tool to study the dynamics and evolutionary impact of transposable elements using low coverage, long read sequences. LoRTE is an efficient and accurate tool to identify structural genomic variants caused by TE insertion or deletion. LoRTE is available for download at http://www.egce.cnrs-gif.fr/?p=6422.
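
    The comparison such a tool performs can be illustrated in its simplest case. The toy function below is our own sketch, not LoRTE's implementation: given a long read that spans both flanks of an annotated TE site, the distance between the flanks indicates whether the element is present or deleted in the sequenced strain. Real long-read tools work on alignments and must tolerate sequencing errors, which exact string matching does not.

    ```python
    def classify_te_site(read, left_flank, right_flank, te_len, tol=100):
        """Toy flank-distance test for an annotated TE site (illustrative
        only; real long-read tools use alignments, not exact matching)."""
        i = read.find(left_flank)
        j = read.find(right_flank)
        if i == -1 or j == -1:
            return "no call"                    # read does not span the site
        gap = j - (i + len(left_flank))         # bases between the two flanks
        if abs(gap - te_len) <= tol:
            return "TE present"
        if gap <= tol:
            return "TE absent (deletion)"
        return "ambiguous"

    left, right, te = "ACGTACGTAC", "TTGCAAGGTT", "C" * 500
    print(classify_te_site("NN" + left + te + right + "NN", left, right, 500))
    print(classify_te_site(left + right, left, right, 500))
    ```

    Because a single long read can contain both flanks plus the full element, one read suffices to call presence or absence at a site, which is why low coverage is enough for this class of analysis.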

  17. A data-based conservation planning tool for Florida panthers

    USGS Publications Warehouse

    Murrow, Jennifer L.; Thatcher, Cindy A.; Van Manen, Frank T.; Clark, Joseph D.

    2013-01-01

    Habitat loss and fragmentation are the greatest threats to the endangered Florida panther (Puma concolor coryi). We developed a data-based habitat model and user-friendly interface so that land managers can objectively evaluate Florida panther habitat. We used a geographic information system (GIS) and the Mahalanobis distance statistic (D2) to develop a model based on broad-scale landscape characteristics associated with panther home ranges. Variables in our model were Euclidean distance to natural land cover, road density, distance to major roads, human density, amount of natural land cover, amount of semi-natural land cover, amount of permanent or semi-permanent flooded area–open water, and a cost–distance variable. We then developed a Florida Panther Habitat Estimator tool, which automates and replicates the GIS processes used to apply the statistical habitat model. The estimator can be used by persons with moderate GIS skills to quantify effects of land-use changes on panther habitat at local and landscape scales. Example applications of the tool are presented.
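
    The Mahalanobis distance statistic at the heart of the model measures how far a landscape cell's covariates sit from the mean covariate vector of known panther home ranges, accounting for covariance between variables. Below is a minimal two-variable sketch of the D2 computation; it is our own illustration, not the authors' GIS implementation, and the example numbers are invented.

    ```python
    def mahalanobis_sq(x, mean, cov):
        """Squared Mahalanobis distance D^2 for two covariates.
        x, mean: length-2 vectors; cov: 2x2 covariance matrix."""
        dx = [x[0] - mean[0], x[1] - mean[1]]
        (a, b), (c, d) = cov
        det = a * d - b * c                     # invert the 2x2 covariance
        inv = [[d / det, -b / det], [-c / det, a / det]]
        # D^2 = dx^T cov^-1 dx
        t0 = inv[0][0] * dx[0] + inv[0][1] * dx[1]
        t1 = inv[1][0] * dx[0] + inv[1][1] * dx[1]
        return dx[0] * t0 + dx[1] * t1

    # Cell one unit from the mean in variable 1 and two units in variable 2,
    # where variable 2 has four times the variance:
    print(mahalanobis_sq([1, 2], [0, 0], [[1, 0], [0, 4]]))  # 2.0
    ```

    A lower D2 means the cell's landscape profile is more similar to that of occupied panther habitat; the deviation in the high-variance variable is down-weighted, which is the advantage of D2 over plain Euclidean distance for habitat models.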

  18. A new comprehension and communication tool: a valuable resource for internationally educated occupational therapists.

    PubMed

    Nguyen, Tram; Baptiste, Sue; Jung, Bonny; Wilkins, Seanne

    2014-06-01

    The need was identified for a way to assess internationally educated occupational therapists’ skills in understanding and communicating professional terminology used in occupational therapy practice. The project aim was to develop and validate such a resource. A scenario-based assessment was developed using a three-phase process for tool development. The development process involved completion of a literature scan of professional terminology used in occupational therapy practice; selection of terms and concepts commonly used in occupational therapy practice; and, creation of practice-based scenarios illustrating key concepts complete with rating rubrics. An advisory group provided oversight, and a sample of internationally educated occupational therapists completed pilot and validity testing. The initial findings showed the assessment to be easy to complete and sensitive to testing understanding of the defined terms. The final outcome is an assessment tool that has broad application for occupational therapists wishing to enter professional practice in a new country. © 2013 Occupational Therapy Australia.

  19. eFarm: A Tool for Better Observing Agricultural Land Systems

    PubMed Central

    Yu, Qiangyi; Shi, Yun; Tang, Huajun; Yang, Peng; Xie, Ankun; Liu, Bin; Wu, Wenbin

    2017-01-01

    Currently, observations of an agricultural land system (ALS) largely depend on remotely-sensed images, focusing on its biophysical features. While social surveys capture the socioeconomic features, this information is inadequately integrated with the biophysical features of an ALS, and applications are limited by the cost and efficiency of carrying out detailed and comparable social surveys over a large spatial coverage. In this paper, we introduce a smartphone-based app, called eFarm: a crowdsourcing and human sensing tool to collect geotagged ALS information at the land parcel level, based on high resolution remotely-sensed images. We illustrate its main functionalities, including map visualization, data management, and data sensing. Results of the trial test suggest the system works well. We believe the tool is able to acquire human–land integrated information which is broadly covered and timely updated, thus presenting great potential for improving sensing, mapping, and modeling of ALS studies. PMID:28245554

  20. Case formulation and management using pattern-based formulation (PBF) methodology: clinical case 1.

    PubMed

    Fernando, Irosh; Cohen, Martin

    2014-02-01

    A tool for psychiatric case formulation known as pattern-based formulation (PBF) has recently been introduced. This paper presents an application of this methodology in formulating and managing complex clinical cases. The symptomatology of the clinical presentation has been parsed into individual clinical phenomena and interpreted by selecting explanatory models. The clinical presentation demonstrates how PBF has been used as a clinical tool to guide clinicians' thinking, taking a structured approach to managing multiple issues with a broad range of management strategies. In doing so, the paper also introduces a number of patterns related to the observed clinical phenomena that can be re-used as explanatory models when formulating other clinical cases. It is expected that this paper will assist clinicians, and particularly trainees, to better understand PBF methodology and apply it to improve their formulation skills.

  1. Adapting CRISPR/Cas9 for functional genomics screens.

    PubMed

    Malina, Abba; Katigbak, Alexandra; Cencic, Regina; Maïga, Rayelle Itoua; Robert, Francis; Miura, Hisashi; Pelletier, Jerry

    2014-01-01

    The use of CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein) for targeted genome editing has been widely adopted and is considered a "game changing" technology. The ease and rapidity by which this approach can be used to modify endogenous loci in a wide spectrum of cell types and organisms makes it a powerful tool for customizable genetic modifications as well as for large-scale functional genomics. The development of retrovirus-based expression platforms to simultaneously deliver the Cas9 nuclease and single guide (sg) RNAs provides unique opportunities by which to ensure stable and reproducible expression of the editing tools and a broad cell targeting spectrum, while remaining compatible with in vivo genetic screens. Here, we describe methods and highlight considerations for designing and generating sgRNA libraries in all-in-one retroviral vectors for such applications.
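The retroviral sgRNA libraries described above begin with computational guide selection. As a rough illustration of that first step (not the authors' pipeline), the sketch below scans a sequence for SpCas9 NGG PAM sites and extracts the 20-nt protospacer upstream of each; the demo sequence and function names are invented:

```python
# Minimal sketch of sgRNA candidate discovery for a Cas9 screen library.
# Assumptions: 20-nt protospacer immediately 5' of an NGG PAM (SpCas9
# convention), + strand only; sequence and names are illustrative.
import re

def find_sgrna_candidates(seq, protospacer_len=20):
    """Return (start, protospacer, pam) tuples for every NGG PAM on the + strand."""
    seq = seq.upper()
    candidates = []
    for m in re.finditer(r"(?=([ACGT]GG))", seq):  # lookahead finds overlapping PAMs
        pam_start = m.start()
        if pam_start >= protospacer_len:           # need room for a full protospacer
            proto = seq[pam_start - protospacer_len:pam_start]
            candidates.append((pam_start - protospacer_len, proto, m.group(1)))
    return candidates

demo = "ATGCGTACCGTTAGCTAGCTAACGGTTCCAGGATCCGTTAACGGTAGCTAGG"
for start, proto, pam in find_sgrna_candidates(demo):
    print(start, proto, pam)
```

A real library design would additionally score guides for off-targets and filter by GC content; this sketch shows only the PAM scan.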

  2. Using "big data" to capture overall health status: properties and predictive value of a claims-based health risk score.

    PubMed

    Hamad, Rita; Modrek, Sepideh; Kubo, Jessica; Goldstein, Benjamin A; Cullen, Mark R

    2015-01-01

    Investigators across many fields often struggle with how best to capture an individual's overall health status, with options including both subjective and objective measures. With the increasing availability of "big data," researchers can now take advantage of novel metrics of health status. These predictive algorithms were initially developed to forecast and manage expenditures, yet they represent an underutilized tool that could contribute significantly to health research. In this paper, we describe the properties and possible applications of one such "health risk score," the DxCG Intelligence tool. We link claims and administrative datasets on a cohort of U.S. workers during the period 1996-2011 (N = 14,161). We examine the risk score's association with incident diagnoses of five disease conditions, and we link employee data with the National Death Index to characterize its relationship with mortality. We review prior studies documenting the risk score's association with other health and non-health outcomes, including healthcare utilization, early retirement, and occupational injury. We find that the risk score is associated with outcomes across a variety of health and non-health domains. These examples demonstrate the broad applicability of this tool in multiple fields of research and illustrate its utility as a measure of overall health status for epidemiologists and other health researchers.

  3. Integrating human health and environmental health into the DPSIR framework: a tool to identify research opportunities for sustainable and healthy communities.

    PubMed

    Yee, Susan H; Bradley, Patricia; Fisher, William S; Perreault, Sally D; Quackenboss, James; Johnson, Eric D; Bousquin, Justin; Murphy, Patricia A

    2012-12-01

    The U.S. Environmental Protection Agency has recently realigned its research enterprise around the concept of sustainability. Scientists from across multiple disciplines have a role to play in contributing the information, methods, and tools needed to more fully understand the long-term impacts of decisions on the social and economic sustainability of communities. Success will depend on a shift in thinking to integrate, organize, and prioritize research within a systems context. We used the Driving forces-Pressures-State-Impact-Response (DPSIR) framework as a basis for integrating social, cultural, and economic aspects of environmental and human health into a single framework. To make the framework broadly applicable to sustainability research planning, we provide a hierarchical system of DPSIR keywords and guidelines for use as a communication tool. The applicability of the integrated framework was first tested on a public health issue (asthma disparities) for purposes of discussion. We then applied the framework at a science planning meeting to identify opportunities for sustainable and healthy communities research. We conclude that an integrated systems framework has many potential roles in science planning, including identifying key issues, visualizing interactions within the system, identifying research gaps, organizing information, developing computational models, and identifying indicators.

  4. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in bioinformatics, whether for interpreting the results of a database search or for managing the enormous amounts of information generated by genomics, proteomics, and metabolomics. The goal of this project was to develop a software tool, as simple as possible, for demonstrating the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed in Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphical representation, and general acceptance by the scientific community. The result of these endeavours is a set of utilities accessible at the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework in traditional statistics teaching methods, the overall consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were still needed to ground the theory in practice. Once the basic statistical formulas were mastered, Excel spreadsheets and graphics proved very useful for exploring many parameters rapidly without tedious hand calculation. CSMs should prove valuable for training students and professionals in bioinformatics, and for upcoming applications in self-directed learning and continuing education.
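The same kind of simulation exercise the authors built in Excel can be sketched in a few lines of Python. The example below (parameters and thresholds are illustrative, not taken from the course) estimates by simulation how often two random DNA sequences of length 100 share at least 35 positional matches, i.e. an empirical p-value of the sort such a class might explore:

```python
# Simulated null distribution of positional matches between two random
# DNA sequences; all parameters are illustrative teaching values.
import random

def simulate_match_pvalue(length=100, k=35, trials=2000, seed=1):
    """Fraction of trials in which two random sequences share >= k matches."""
    rng = random.Random(seed)
    bases = "ACGT"
    hits = 0
    for _ in range(trials):
        a = rng.choices(bases, k=length)
        b = rng.choices(bases, k=length)
        matches = sum(x == y for x, y in zip(a, b))
        if matches >= k:
            hits += 1
    return hits / trials

print(simulate_match_pvalue())
```

Since matches follow Binomial(100, 0.25), the simulated tail probability can be checked against the exact distribution, which is exactly the comparison the spreadsheet version invites.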

  5. Methods and potentials for using satellite image classification in school lessons

    NASA Astrophysics Data System (ADS)

    Voss, Kerstin; Goetzke, Roland; Hodam, Henryk

    2011-11-01

    The FIS project - FIS stands for Fernerkundung in Schulen (Remote Sensing in Schools) - aims at a better integration of the topic "satellite remote sensing" in school lessons. According to this, the overarching objective is to teach pupils basic knowledge and fields of application of remote sensing. Despite the growing significance of digital geomedia, the topic "remote sensing" is not broadly supported in schools. Often, the topic is reduced to a short reflection on satellite images and used only for additional illustration of issues relevant for the curriculum. Without addressing the issue of image data, this can hardly contribute to the improvement of the pupils' methodical competences. Because remote sensing covers more than simple, visual interpretation of satellite images, it is necessary to integrate remote sensing methods like preprocessing, classification and change detection. Dealing with these topics often fails because of confusing background information and the lack of easy-to-use software. Based on these insights, the FIS project created different simple analysis tools for remote sensing in school lessons, which enable teachers as well as pupils to be introduced to the topic in a structured way. This functionality as well as the fields of application of these analysis tools will be presented in detail with the help of three different classification tools for satellite image classification.
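As a toy counterpart to the classification tools described (not the FIS software itself), the sketch below clusters the pixels of a synthetic three-band image into land-cover classes with a minimal k-means written in NumPy; the spectra and class count are invented for illustration:

```python
# Unsupervised pixel classification via k-means, the simplest of the
# classification methods a satellite-image lesson might cover.
import numpy as np

def kmeans(pixels, k=3, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        # assign each pixel to its nearest spectral center
        labels = np.argmin(((pixels[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers

# synthetic 3-band image: water-like, vegetation-like, soil-like spectra
rng = np.random.default_rng(42)
classes = np.array([[30, 40, 20], [40, 90, 30], [120, 100, 80]], dtype=float)
pixels = np.concatenate([c + rng.normal(0, 5, (100, 3)) for c in classes])
labels, centers = kmeans(pixels, k=3)
print(np.bincount(labels))
```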

  6. Neuronal Morphology goes Digital: A Research Hub for Cellular and System Neuroscience

    PubMed Central

    Parekh, Ruchi; Ascoli, Giorgio A.

    2013-01-01

    The importance of neuronal morphology in brain function has been recognized for over a century. The broad applicability of “digital reconstructions” of neuron morphology across neuroscience sub-disciplines has stimulated the rapid development of numerous synergistic tools for data acquisition, anatomical analysis, three-dimensional rendering, electrophysiological simulation, growth models, and data sharing. Here we discuss the processes of histological labeling, microscopic imaging, and semi-automated tracing. Moreover, we provide an annotated compilation of currently available resources in this rich research “ecosystem” as a central reference for experimental and computational neuroscience. PMID:23522039

  7. Diazo Compounds: Versatile Tools for Chemical Biology.

    PubMed

    Mix, Kalie A; Aronoff, Matthew R; Raines, Ronald T

    2016-12-16

    Diazo groups have broad and tunable reactivity. That and other attributes endow diazo compounds with the potential to be valuable reagents for chemical biologists. The presence of diazo groups in natural products underscores their metabolic stability and anticipates their utility in a biological context. The chemoselectivity of diazo groups, even in the presence of azido groups, presents many opportunities. Already, diazo compounds have served as chemical probes and elicited novel modifications of proteins and nucleic acids. Here, we review advances that have facilitated the chemical synthesis of diazo compounds, and we highlight applications of diazo compounds in the detection and modification of biomolecules.

  8. A miniaturized optoelectronic system for rapid quantitative label-free detection of harmful species in food

    NASA Astrophysics Data System (ADS)

    Raptis, Ioannis; Misiakos, Konstantinos; Makarona, Eleni; Salapatas, Alexandros; Petrou, Panagiota; Kakabakos, Sotirios; Botsialas, Athanasios; Jobst, Gerhard; Haasnoot, Willem; Fernandez-Alba, Amadeo; Lees, Michelle; Valamontes, Evangelos

    2016-03-01

    Optical biosensors have emerged in the past decade as the most promising candidates for portable, highly sensitive bioanalytical systems that can be employed for in-situ measurements. In this work, a miniaturized optoelectronic system for rapid, quantitative, label-free detection of harmful species in food is presented. The proposed system has four distinctive features that can make it a powerful tool for the next generation of point-of-need applications: it accommodates the light sources and ten interferometric biosensors on a single silicon chip with a footprint of less than 40 mm2; each sensor can be individually functionalized for a specific target analyte; encapsulation can be performed at the wafer scale; and it exploits a new operating principle, Broad-band Mach-Zehnder Interferometry, to improve its analytical capabilities. Multi-analyte evaluation schemes for the simultaneous detection of harmful contaminants, such as mycotoxins, allergens, and pesticides, showed that the proposed system can detect these substances within a short time at concentrations below the limits imposed by regulatory authorities, making it a novel tool for near-future food-safety applications.
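The Broad-band Mach-Zehnder idea can be sketched numerically: an MZI's transmitted spectrum is a cosine in the interferometric phase, so an analyte-induced effective-index change on the sensing arm shifts the fringe pattern across the probed band. All numbers below (arm length, index change, wavelength range) are illustrative assumptions, not the chip's actual parameters:

```python
# Two-beam Mach-Zehnder interference over a broadband probe; an index
# change on the sensing arm shifts the whole fringe pattern.
import numpy as np

def mzi_transmission(wavelengths_nm, d_neff, arm_len_um=500.0):
    """T = (1 + cos(phase))/2 with phase = 2*pi*d_neff*L/lambda."""
    lam = wavelengths_nm * 1e-3               # nm -> um
    phase = 2 * np.pi * d_neff * arm_len_um / lam
    return 0.5 * (1 + np.cos(phase))

lam = np.linspace(600, 700, 1000)             # broadband probe window
before = mzi_transmission(lam, d_neff=0.0100)  # bare sensor
after = mzi_transmission(lam, d_neff=0.0101)   # small index change from binding
print(lam[np.argmax(before)], lam[np.argmax(after)])
```

Tracking the whole spectral fringe rather than a single wavelength is what makes the broadband readout robust; this toy model only shows the fringe shift itself.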

  9. Traditional and modern plant breeding methods with examples in rice (Oryza sativa L.).

    PubMed

    Breseghello, Flavio; Coelho, Alexandre Siqueira Guedes

    2013-09-04

    Plant breeding can be broadly defined as alterations caused in plants as a result of their use by humans, ranging from unintentional changes resulting from the advent of agriculture to the application of molecular tools for precision breeding. The vast diversity of breeding methods can be simplified into three categories: (i) plant breeding based on observed variation by selection of plants based on natural variants appearing in nature or within traditional varieties; (ii) plant breeding based on controlled mating by selection of plants presenting recombination of desirable genes from different parents; and (iii) plant breeding based on monitored recombination by selection of specific genes or marker profiles, using molecular tools for tracking within-genome variation. The continuous application of traditional breeding methods in a given species could lead to the narrowing of the gene pool from which cultivars are drawn, rendering crops vulnerable to biotic and abiotic stresses and hampering future progress. Several methods have been devised for introducing exotic variation into elite germplasm without undesirable effects. Cases in rice are given to illustrate the potential and limitations of different breeding approaches.

  10. The Fishery Performance Indicators: A Management Tool for Triple Bottom Line Outcomes

    PubMed Central

    Anderson, James L.; Anderson, Christopher M.; Chu, Jingjie; Meredith, Jennifer; Asche, Frank; Sylvia, Gil; Smith, Martin D.; Anggraeni, Dessy; Arthur, Robert; Guttormsen, Atle; McCluney, Jessica K.; Ward, Tim; Akpalu, Wisdom; Eggert, Håkan; Flores, Jimely; Freeman, Matthew A.; Holland, Daniel S.; Knapp, Gunnar; Kobayashi, Mimako; Larkin, Sherry; MacLauchlin, Kari; Schnier, Kurt; Soboil, Mark; Tveteras, Sigbjorn; Uchida, Hirotsugu; Valderrama, Diego

    2015-01-01

    Pursuit of the triple bottom line of economic, community and ecological sustainability has increased the complexity of fishery management; fisheries assessments require new types of data and analysis to guide science-based policy in addition to traditional biological information and modeling. We introduce the Fishery Performance Indicators (FPIs), a broadly applicable and flexible tool for assessing performance in individual fisheries, and for establishing cross-sectional links between enabling conditions, management strategies and triple bottom line outcomes. Conceptually separating measures of performance, the FPIs use 68 individual outcome metrics—coded on a 1 to 5 scale based on expert assessment to facilitate application to data poor fisheries and sectors—that can be partitioned into sector-based or triple-bottom-line sustainability-based interpretative indicators. Variation among outcomes is explained with 54 similarly structured metrics of inputs, management approaches and enabling conditions. Using 61 initial fishery case studies drawn from industrial and developing countries around the world, we demonstrate the inferential importance of tracking economic and community outcomes, in addition to resource status. PMID:25946194
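The scoring structure described, expert-coded 1-to-5 metrics rolled up into sector indicators, can be illustrated with a hypothetical miniature; the metric names and groupings below are invented, not the actual FPI metrics:

```python
# Hypothetical miniature of the FPI roll-up: expert-coded 1-5 outcome
# metrics averaged into triple-bottom-line indicators.
from statistics import mean

scores = {
    "stock_status": 4, "harvest_volatility": 3,   # ecology (invented names)
    "asset_value": 2, "price_premium": 3,         # economics
    "labor_safety": 5, "local_ownership": 4,      # community
}
groups = {
    "ecology": ["stock_status", "harvest_volatility"],
    "economics": ["asset_value", "price_premium"],
    "community": ["labor_safety", "local_ownership"],
}
indicators = {dim: mean(scores[m] for m in metrics)
              for dim, metrics in groups.items()}
print(indicators)  # {'ecology': 3.5, 'economics': 2.5, 'community': 4.5}
```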

  11. Generalized schemes for high throughput manipulation of the Desulfovibrio vulgaris Hildenborough genome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chhabra, S.R.; Butland, G.; Elias, D.

    The ability to conduct advanced functional genomic studies of the thousands of sequenced bacteria has been hampered by the lack of available tools for making high-throughput chromosomal manipulations in a systematic manner that can be applied across diverse species. In this work, we highlight the use of synthetic biological tools to assemble custom suicide vectors with reusable and interchangeable DNA “parts” to facilitate chromosomal modification at designated loci. These constructs enable an array of downstream applications including gene replacement and creation of gene fusions with affinity purification or localization tags. We employed this approach to engineer chromosomal modifications in a bacterium that has previously proven difficult to manipulate genetically, Desulfovibrio vulgaris Hildenborough, to generate a library of over 700 strains. Furthermore, we demonstrate how these modifications can be used for examining metabolic pathways, protein-protein interactions, and protein localization. The ubiquity of suicide constructs in gene replacement throughout biology suggests that this approach can be applied to engineer a broad range of species for a diverse array of systems biological applications and is amenable to high-throughput implementation.

  12. Recognizing and exploring the right questions with climate data: An example of better understanding ENSO in climate projections

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.; Buja, L.; Gutowski, W. J., Jr.; Halley-Gotway, J.; Kaatz, L.; Yates, D. N.

    2017-12-01

    Coordinated, multi-model climate change projection archives have already led to a flourishing of new climate impact applications. Collections and online tools for the computation of derived indicators have attracted many non-specialist users and decision-makers and facilitated their exploration of potential future weather and climate changes on their systems. Guided by a set of standardized steps and analyses, many can now use model output and determine basic model-based changes. But because each application and decision context is different, the question remains whether such a small collection of standardized tools can faithfully and comprehensively represent the critical physical context of change. We use the example of the El Niño-Southern Oscillation, the largest and most broadly recognized mode of variability in the climate system, to explore the difference in impact contexts between a quasi-blind, protocol-bound use of climate information and a flexible, scientifically guided one. More use-oriented diagnostics of the model data, as well as different strategies for getting data into decision environments, are explored.

  13. DotMapper: an open source tool for creating interactive disease point maps.

    PubMed

    Smith, Catherine M; Hayward, Andrew C

    2016-04-12

    Molecular strain typing of tuberculosis isolates has led to increased understanding of the epidemiological characteristics of the disease and to improvements in its control, diagnosis, and treatment. However, molecular cluster investigations, which aim to detect previously unidentified cases, remain challenging. Interactive dot mapping is a simple approach that could aid investigations by highlighting cases likely to share epidemiological links. Current tools generally require technical expertise or lack interactivity. We designed a flexible application for producing disease dot maps using Shiny, a web application framework for the statistical software R. The application displays the locations of cases on an interactive map, colour coded according to the levels of categorical variables such as demographics and risk factors. Cases can be filtered by selecting combinations of these characteristics and by notification date. It can be used to rapidly identify spatiotemporal patterns amongst cases in molecular clusters of tuberculosis; generate hypotheses about disease transmission; identify outliers; and guide targeted control measures. DotMapper is a user-friendly application which enables rapid production of maps displaying the locations of cases and their epidemiological characteristics without the need for specialist training in geographic information systems. Enhanced understanding of tuberculosis transmission using this application could facilitate improved detection of cases with epidemiological links and therefore lessen the public health impacts of the disease. It is a flexible system that also has broad potential for international application to other investigations using geocoded health information.
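The core interaction DotMapper offers, filtering geocoded cases by combinations of categorical attributes and a notification-date window before mapping them, reduces to a few lines of plain Python; the records and field names below are invented for illustration:

```python
# Attribute-and-date filtering of geocoded case records, the selection
# step that precedes plotting in a dot-mapping tool.
from datetime import date

cases = [
    {"lat": 51.52, "lon": -0.11, "risk": "homeless", "notified": date(2015, 3, 1)},
    {"lat": 51.50, "lon": -0.13, "risk": "none",     "notified": date(2015, 6, 9)},
    {"lat": 51.53, "lon": -0.10, "risk": "homeless", "notified": date(2016, 1, 20)},
]

def filter_cases(cases, risk=None, start=None, end=None):
    """Keep cases matching the selected risk factor and date window."""
    keep = []
    for c in cases:
        if risk is not None and c["risk"] != risk:
            continue
        if start is not None and c["notified"] < start:
            continue
        if end is not None and c["notified"] > end:
            continue
        keep.append(c)
    return keep

subset = filter_cases(cases, risk="homeless",
                      start=date(2015, 1, 1), end=date(2015, 12, 31))
print(len(subset))  # 1
```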

  14. Combined In Situ Illumination-NMR-UV/Vis Spectroscopy: A New Mechanistic Tool in Photochemistry.

    PubMed

    Seegerer, Andreas; Nitschke, Philipp; Gschwind, Ruth M

    2018-06-18

    Synthetic applications of photochemistry are booming. Despite great progress in the development of new reactions, mechanistic investigations remain challenging. We therefore present a fully automated in situ combination of NMR spectroscopy, UV/Vis spectroscopy, and illumination that allows the simultaneous, time-resolved detection of paramagnetic and diamagnetic species. This optical-fiber-based setup enables the first acquisition of combined UV/Vis and NMR spectra in photocatalysis, as demonstrated on a conPET process. Furthermore, the broad applicability of combined UV/Vis-NMR spectroscopy to light-induced processes is demonstrated by a structural and quantitative analysis of a photoswitch, including rate modulation and the stabilization of transient species by temperature variation. Owing to the flexibility regarding the NMR hardware, temperature, and light sources, we expect wide-ranging applications of this setup in various research fields. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  15. Decision Support in a Changing and Contentious World--Successfully Supporting the Development of a 50-year Comprehensive Coastal Master Plan in Louisiana

    NASA Astrophysics Data System (ADS)

    Groves, D.

    2014-12-01

    After the devastating 2005 hurricane season, Louisiana embarked on an ambitious and daunting effort to develop and implement a comprehensive Coastal Master Plan. The Master Plan sought to achieve two key goals simultaneously: reduce hurricane flood risk and halt the net conversion of its coastal landscape to open ocean. Numerous prior efforts to achieve these goals had been tried without significant success. In 2012, however, the Louisiana Coastal Protection and Restoration Authority (CPRA) produced a 50-year, $50 billion Master Plan. It had broad support from a diverse and often adversarial set of stakeholders, and it was unanimously passed by the Louisiana legislature. In contrast to other efforts, CPRA took an approach to planning that the U.S. National Research Council calls "deliberation with analysis". Specifically, CPRA used data, models, and decision support tools not to define an optimal or best strategy, but instead to support stakeholder dialogue and deliberations over alternative coastal management strategies. RAND researchers, with the support of CPRA and other collaborators, developed the planning tool at the center of this process. The CPRA planning tool synthesized large amounts of information about how the coast might evolve over time with and without different combinations of hundreds of different projects and programs. The tool helped CPRA propose alternative strategies that could achieve the State's goals while also highlighting to stakeholders the key tradeoffs among them. Importantly, this process helped bring diverse communities together to support a single vision and a specific set of projects and programs to meet many of Louisiana's coastal water resources challenges. This presentation will describe the planning approach and decision support tools developed to support the Master Plan's participatory stakeholder process. The presentation will also highlight several key takeaway messages with broad applicability to other water resources planning efforts. Lastly, it will describe several ongoing efforts in other parts of the U.S. that are employing this same approach.

  16. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications

    PubMed Central

    Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, the efficient design and higher-order assembly of genome-scale DNA constructs remains a labor-intensive process. Given this complexity, computer-assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards the streamlined assembly of biological systems. Here, we present the Genome Partitioner software, implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult-to-synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable for translating DNA designs into ready-to-order sequences that can be assembled with standardized protocols, offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner. PMID:28531174
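A minimal sketch of the partitioning idea (not the Genome Partitioner's actual algorithm, which computes an optimal retrosynthetic route): cut a large design into synthesizable blocks that share fixed overlaps with their neighbors so they can be reassembled. Block size and overlap length are assumptions:

```python
# Cut a long DNA design into <=block_size pieces with fixed overlaps
# between neighbors, so the pieces can be stitched back together.
def partition(seq, block_size, overlap):
    """Blocks of at most block_size sharing `overlap` bp with neighbors."""
    blocks, start = [], 0
    while start < len(seq):
        end = min(start + block_size, len(seq))
        blocks.append(seq[start:end])
        if end == len(seq):
            break
        start = end - overlap  # step back so adjacent blocks overlap
    return blocks

design = "ACGT" * 5000          # 20 kb mock segment
subblocks = partition(design, block_size=1000, overlap=40)
print(len(subblocks), len(subblocks[0]))
```

Because every neighboring pair shares the overlap region, concatenating the first block with the non-overlapping tails of the rest reconstructs the original design, which is the invariant any such partitioner must preserve.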

  17. A perspective of nanotechnology in hypersensitivity reactions including drug allergy.

    PubMed

    Montañez, Maria Isabel; Ruiz-Sanchez, Antonio J; Perez-Inestrosa, Ezequiel

    2010-08-01

    We provide an overview of the application of the concepts of nanoscience and nanotechnology, as a novel scientific approach, to the area of nanomedicine related to the immune system. Particular emphasis is placed on studies of drug allergy reactions. Several well-defined chemical structures arranged on the nanoscale are currently being studied for biomedical purposes. By interacting with the immune system, some of these show promising applications as vaccines, diagnostic tools, and activators/effectors of the immune response. Even a brief listing of some key applications of nanostructured materials shows how broad and intense this area of nanomedicine is. As a result of the development of nanoscience and nanotechnology applied to medicine, new approaches can be envisioned for problems related to the modulation of the immune response, as well as in immunodiagnosis, and for designing new tools to solve related medical challenges. Nanoparticles offer unique advantages with which to exploit new properties and allow materials to play a major role in new diagnostic techniques and therapies. Fullerene-C60 and multivalent functionalized gold nanoparticles of various sizes have led to new tools and opened up new ways to study and interact with the immune system. Some of the most versatile nanostructures are dendrimers. In their interaction with the immune system they can mimic naturally occurring macromolecules, taking advantage of the fact that dendrimers can be synthesized as nanosized structures. Their multivalence can be successfully exploited in vaccines and diagnostic tests for allergic reactions.

  18. Turbodrills and innovative PDC bits economically drilled hard formations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, R.C.; Massey, K.

    1994-03-28

    The use of turbodrills and polycrystalline diamond compact (PDC) bits with an innovative, tracking cutting structure has improved drilling economics in medium and hard formations in the Gulf of Mexico. Field results have confirmed that turbodrilling with trackset PDC bits reduced drilling costs, compared to offset wells. The combination of turbodrills and trackset bits has been used successfully in a broad range of applications and with various drilling parameters. Formations ranging from medium shales to hard, abrasive sands have been successfully and economically drilled. The tools have been used in both water-based and oil-based muds. Additionally, the turbodrill and trackset PDC bit combination has been stable in directional drilling applications. The locking effect of the cutting structure helps keep the bit on course.

  19. MDANSE: An Interactive Analysis Environment for Molecular Dynamics Simulations.

    PubMed

    Goret, G; Aoun, B; Pellegrini, E

    2017-01-23

    The MDANSE software, Molecular Dynamics Analysis of Neutron Scattering Experiments, is presented. It is an interactive application for postprocessing molecular dynamics (MD) simulations. Given the widespread use of MD simulations in the materials and biomolecular sciences to gain better insight into experimental techniques such as thermal neutron scattering (TNS), the development of MDANSE has focused on providing a user-friendly, interactive graphical user interface for analyzing many trajectories in the same session and for running several analyses simultaneously, independently of the interface. This first version of MDANSE already offers a broad range of analyses, and the application has been designed to facilitate the introduction of new analyses into the framework. All this makes MDANSE a valuable tool for extracting useful information from trajectories produced by a wide range of MD codes.
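A generic example of the kind of trajectory analysis MDANSE automates, written here directly in NumPy rather than through MDANSE's interface: the mean-squared displacement of a toy random-walk trajectory as a function of lag time. Array shapes and names are illustrative:

```python
# Mean-squared displacement (MSD) vs lag time for an MD-style trajectory,
# demonstrated on a synthetic random walk.
import numpy as np

def msd(traj):
    """traj: (n_frames, n_atoms, 3). MSD averaged over atoms and time origins."""
    n = traj.shape[0]
    out = np.zeros(n)
    for lag in range(1, n):
        disp = traj[lag:] - traj[:-lag]               # displacements at this lag
        out[lag] = np.mean(np.sum(disp**2, axis=-1))  # average over frames & atoms
    return out

rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=(200, 10, 3)), axis=0)  # toy diffusive trajectory
curve = msd(walk)
print(curve[:3])
```

For diffusive motion the MSD grows roughly linearly with lag, which is the signature such an analysis is used to extract (e.g. diffusion coefficients).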

  20. The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hui; Shi, Tujin; Qian, Wei-Jun

    2015-12-04

    Mass spectrometry-based proteomics has become an indispensable tool in biomedical research, with broad applications ranging from fundamental biology and systems biology to biomarker discovery. Recent advances in LC-MS have made it a major technology in clinical applications, especially in cancer biomarker discovery and verification. To overcome the challenges associated with the analysis of clinical samples, such as the extremely wide dynamic range of protein concentrations in biofluids and the need for high-throughput, accurate quantification, significant efforts have been devoted to improving the overall performance of LC-MS based clinical proteomics. In this review, we summarize the recent advances in LC-MS for cancer biomarker discovery and quantification, and discuss its potential, limitations, and future perspectives.

  1. Mathematics and engineering in real life through mathematical competitions

    NASA Astrophysics Data System (ADS)

    More, M.

    2018-02-01

    We describe our experience of organizing mathematical competitions that can be used as a medium to motivate students and teachers toward new directions of thinking. This can contribute to fostering research and innovation, and provides hands-on experience of mathematical concepts in the real world. Mathematical competitions can be used to build curiosity and give an understanding of mathematical applications in real life. Participation in the competition has been classified under four broad categories. Students can showcase their findings in various forms of expression, such as models, posters, presentations, animations, live performances, art, and poetry. The basic focus of the competition is on using open-source computational tools and modern technology to emphasize the relationship of mathematical concepts with engineering applications in real life.

  2. Towards quantum chemistry on a quantum computer.

    PubMed

    Lanyon, B P; Whitfield, J D; Gillett, G G; Goggin, M E; Almeida, M P; Kassal, I; Biamonte, J D; Mohseni, M; Powell, B J; Barbieri, M; Aspuru-Guzik, A; White, A G

    2010-02-01

    Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and basis set size. A solution is to move to a radically different model of computing by building a quantum computer, which is a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.

  3. Streamlining genomes: toward the generation of simplified and stabilized microbial systems.

    PubMed

    Leprince, Audrey; van Passel, Mark W J; dos Santos, Vitor A P Martins

    2012-10-01

    At the junction between systems and synthetic biology, genome streamlining provides a solid foundation both for increased understanding of cellular circuitry, and for the tailoring of microbial chassis towards innovative biotechnological applications. Iterative genomic deletions (targeted and random) help to generate simplified, stabilized and predictable genomes, whereas multiplex genome engineering reveals a broad functional genetic diversity. The decrease in oligo and gene synthesis costs promises effective combinatorial tools for the generation of chassis based on streamlined and tractable genomes. Here we review recent progress in streamlining genomes through recombineering techniques, aiming to generate insights into cellular mechanisms and responses and to guide the design and assembly of streamlined genome chassis together with new cellular modules for diverse biotechnological applications. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Computer-aided drug discovery.

    PubMed

    Bajorath, Jürgen

    2015-01-01

    Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience.

  5. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
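    The multi-tiered classification flow described above can be sketched generically: candidate regions pass through a sequence of binary classifiers, and only candidates accepted at one tier reach the next. The paper's classifiers are SVMs; the nearest-centroid stand-in, toy features, and function names below are illustrative assumptions, not code from the paper.

    ```python
    import numpy as np

    class NearestCentroid:
        """Stand-in for the paper's SVMs: any binary classifier
        exposing fit/predict can serve as a tier."""
        def fit(self, X, y):
            self.c0 = X[y == 0].mean(axis=0)
            self.c1 = X[y == 1].mean(axis=0)
            return self
        def predict(self, X):
            d0 = np.linalg.norm(X - self.c0, axis=1)
            d1 = np.linalg.norm(X - self.c1, axis=1)
            return (d1 < d0).astype(int)

    def tiered_classify(candidates, tiers):
        """Apply classifiers in sequence; only candidates accepted by
        tier k are passed on to tier k+1 (coarse-to-fine filtering)."""
        X = np.asarray(candidates)
        keep = np.arange(len(X))
        for clf in tiers:
            keep = keep[clf.predict(X[keep]) == 1]
            if keep.size == 0:
                break
        return keep  # indices of candidates surviving all tiers

    # Toy example: 20 background regions near (0,0), 20 structures near (1,1).
    rng = np.random.default_rng(0)
    X0 = rng.normal([0, 0], 0.1, (20, 2))
    X1 = rng.normal([1, 1], 0.1, (20, 2))
    X = np.vstack([X0, X1])
    y = np.r_[np.zeros(20), np.ones(20)]
    t1 = NearestCentroid().fit(X, y)
    t2 = NearestCentroid().fit(X, y)
    hits = tiered_classify(X, [t1, t2])
    ```

    Any classifier with the same fit/predict interface (e.g. an SVM from scikit-learn) could be dropped into the tiers list unchanged.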

  6. Spectral mapping tools from the earth sciences applied to spectral microscopy data.

    PubMed

    Harris, A Thomas

    2006-08-01

    Spectral imaging, originating from the field of earth remote sensing, is a powerful tool that is being increasingly used in a wide variety of applications for material identification. Several workers have used techniques like linear spectral unmixing (LSU) to discriminate materials in images derived from spectral microscopy. However, many spectral analysis algorithms rely on assumptions that are often violated in microscopy applications. This study explores algorithms originally developed as improvements on early earth imaging techniques that can be easily translated for use with spectral microscopy. To best demonstrate the application of earth remote sensing spectral analysis tools to spectral microscopy data, earth imaging software was used to analyze data acquired with a Leica confocal microscope with mechanical spectral scanning. For this study, spectral training signatures (often referred to as endmembers) were selected with the ENVI (ITT Visual Information Solutions, Boulder, CO) "spectral hourglass" processing flow, a series of tools that use the spectrally over-determined nature of hyperspectral data to find the most spectrally pure (or spectrally unique) pixels within the data set. This set of endmember signatures was then used in the full range of mapping algorithms available in ENVI to determine locations, and in some cases subpixel abundances, of endmembers. Mapping and abundance images showed a broad agreement between the spectral analysis algorithms, supported through visual assessment of output classification images and through statistical analysis of the distribution of pixels within each endmember class. The powerful spectral analysis algorithms available in COTS software, the result of decades of research in earth imaging, are easily translated to new sources of spectral data. Although the scale between earth imagery and spectral microscopy is radically different, the problem is the same: mapping material locations and abundances based on unique spectral signatures. (c) 2006 International Society for Analytical Cytology.
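    Linear spectral unmixing (LSU), mentioned above, models each pixel spectrum as a linear mixture of endmember signatures and solves for per-endmember abundances. A minimal sketch, assuming a plain least-squares solve with clipped and renormalised abundances (real tools such as ENVI enforce the non-negativity and sum-to-one constraints more carefully):

    ```python
    import numpy as np

    def unmix(pixel, endmembers):
        """Solve pixel ≈ E @ a for abundances a, then clip negatives and
        renormalise to sum to one (a simplified constraint handling)."""
        E = np.asarray(endmembers).T          # bands x endmembers
        a, *_ = np.linalg.lstsq(E, pixel, rcond=None)
        a = np.clip(a, 0, None)
        return a / a.sum()

    # Toy 4-band scene with two endmember signatures (made-up reflectances).
    e1 = np.array([0.9, 0.7, 0.2, 0.1])
    e2 = np.array([0.1, 0.2, 0.8, 0.9])
    pixel = 0.3 * e1 + 0.7 * e2               # known 30/70 mixture
    a = unmix(pixel, [e1, e2])                # recovers ~[0.3, 0.7]
    ```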

  7. Quantification of larval zebrafish motor function in multi-well plates using open-source MATLAB® applications

    PubMed Central

    Zhou, Yangzhong; Cattley, Richard T.; Cario, Clinton L.; Bai, Qing; Burton, Edward A.

    2014-01-01

    This article describes a method to quantify the movements of larval zebrafish in multi-well plates, using the open-source MATLAB® applications LSRtrack and LSRanalyze. The protocol comprises four stages: generation of high-quality, flatly-illuminated video recordings with exposure settings that facilitate object recognition; analysis of the resulting recordings using tools provided in LSRtrack to optimize tracking accuracy and motion detection; analysis of tracking data using LSRanalyze or custom MATLAB® scripts; implementation of validation controls. The method is reliable, automated and flexible, requires less than one hour of hands-on work for completion once optimized, and shows excellent signal:noise characteristics. The resulting data can be analyzed to determine: positional preference; displacement, velocity and acceleration; duration and frequency of movement events and rest periods. This approach is widely applicable to analyze spontaneous or stimulus-evoked zebrafish larval neurobehavioral phenotypes resulting from a broad array of genetic and environmental manipulations, in a multi-well plate format suitable for high-throughput applications. PMID:24901738

  8. Quantification of larval zebrafish motor function in multiwell plates using open-source MATLAB applications.

    PubMed

    Zhou, Yangzhong; Cattley, Richard T; Cario, Clinton L; Bai, Qing; Burton, Edward A

    2014-07-01

    This article describes a method to quantify the movements of larval zebrafish in multiwell plates, using the open-source MATLAB applications LSRtrack and LSRanalyze. The protocol comprises four stages: generation of high-quality, flatly illuminated video recordings with exposure settings that facilitate object recognition; analysis of the resulting recordings using tools provided in LSRtrack to optimize tracking accuracy and motion detection; analysis of tracking data using LSRanalyze or custom MATLAB scripts; and implementation of validation controls. The method is reliable, automated and flexible, requires <1 h of hands-on work for completion once optimized and shows excellent signal:noise characteristics. The resulting data can be analyzed to determine the following: positional preference; displacement, velocity and acceleration; and duration and frequency of movement events and rest periods. This approach is widely applicable to the analysis of spontaneous or stimulus-evoked zebrafish larval neurobehavioral phenotypes resulting from a broad array of genetic and environmental manipulations, in a multiwell plate format suitable for high-throughput applications.
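    The motion measures listed in this abstract (displacement, velocity, movement and rest bouts) can all be derived from a time series of centroid positions. The sketch below is a generic Python illustration, not the LSRtrack/LSRanalyze code; the frame rate and movement threshold are hypothetical values:

    ```python
    import numpy as np

    def motion_metrics(xy, fps, move_thresh):
        """Per-well motion summary from a T x 2 array of larva centroid
        positions: total distance, mean speed, and number of movement
        bouts (runs of consecutive above-threshold frames)."""
        xy = np.asarray(xy, float)
        step = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # px / frame
        speed = step * fps                                   # px / s
        moving = step > move_thresh
        # A bout starts wherever 'moving' switches from 0 to 1.
        bouts = int(np.sum(np.diff(moving.astype(int)) == 1) + moving[:1].sum())
        return {"total_distance": float(step.sum()),
                "mean_speed": float(speed.mean()),
                "movement_bouts": bouts}

    # Toy track: two swim bouts separated by rest frames.
    track = [(0, 0), (0, 0), (3, 4), (6, 8), (6, 8), (6, 8), (9, 12)]
    m = motion_metrics(track, fps=30, move_thresh=1.0)
    ```

    Positional preference could be added by binning the same xy positions against well coordinates.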

  9. Optical Coherence Tomography: Basic Concepts and Applications in Neuroscience Research

    PubMed Central

    2017-01-01

    Optical coherence tomography (OCT) is a micrometer-scale imaging modality that permits label-free, cross-sectional imaging of biological tissue microstructure using tissue backscattering properties. Since its invention in the 1990s, OCT has come to be widely used in several branches of neuroscience as well as in other fields of biomedical science. This review provides an overview of OCT's applications in several branches or sub-branches of neuroscience, such as neuroimaging, neurology, neurosurgery, neuropathology, and neuroembryology. It briefly summarizes recent applications of OCT in neuroscience research, including a comparison, and discusses the remaining challenges and opportunities as well as future directions. The chief aim of the review is to draw the attention of the broad neuroscience community in order to maximize the applications of OCT across its branches, and the study may also serve as a benchmark for future OCT-based neuroscience research. Despite some limitations, OCT proves to be a useful imaging tool in both basic and clinical neuroscience research. PMID:29214158

  10. compomics-utilities: an open-source Java library for computational proteomics.

    PubMed

    Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart

    2011-03-08

    The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. Even though the purpose of these tools can vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software tools, ensuring library stability and continued support and development. As a user-friendly, well-documented and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows developers to focus on the novel aspects of their tools rather than on the basic functions, which can contribute substantially to faster development and better tools for proteomics.
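    One of the basic support features mentioned here, peptide sequence handling, typically includes computing monoisotopic peptide masses and m/z values. The sketch below shows the underlying arithmetic in Python for illustration only; it is not the compomics-utilities API, which is a Java library:

    ```python
    # Monoisotopic amino-acid residue masses in Da; one water is added
    # per peptide to account for the free termini.
    RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
               "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
               "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
               "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
               "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931}
    WATER, PROTON = 18.010565, 1.007276

    def peptide_mz(sequence, charge=1):
        """Monoisotopic m/z of an unmodified peptide at a positive charge."""
        mass = sum(RESIDUE[aa] for aa in sequence) + WATER
        return (mass + charge * PROTON) / charge

    mz = peptide_mz("PEPTIDE", charge=1)   # singly protonated ion, ~800.367
    ```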

  11. Rational Design of Photonic Dust from Nanoporous Anodic Alumina Films: A Versatile Photonic Nanotool for Visual Sensing

    PubMed Central

    Chen, Yuting; Santos, Abel; Wang, Ye; Kumeria, Tushar; Ho, Daena; Li, Junsheng; Wang, Changhai; Losic, Dusan

    2015-01-01

    Herein, we present a systematic study on the development, optimisation and applicability of interferometrically coloured distributed Bragg reflectors based on nanoporous anodic alumina (NAA-DBRs), in the form of films and nanoporous microparticles, as visual/colorimetric analytical tools. Firstly, we synthesise a complete palette of NAA-DBRs by a galvanostatic pulse anodisation approach, in which the current density is altered in a periodic fashion in order to engineer the effective medium of the resulting photonic films in depth. NAA-DBR photonic films feature vivid colours that can be tuned across the UV-visible-NIR spectrum by structural engineering. Secondly, the effective medium of the resulting photonic films is assessed systematically by visual analysis and reflectometric interference spectroscopy (RIfS) in order to establish the optimal nanoporous platforms for developing visual/colorimetric tools. Then, we demonstrate the applicability of NAA-DBR photonic films as a chemically selective sensing platform for visual detection of mercury(II) ions. Finally, we generate a new nanomaterial, so-called photonic dust, by breaking down NAA-DBR films into nanoporous microparticles. The resulting microparticles (μP-NAA-DBRs) display vivid colours and are sensitive towards changes in their effective medium, opening new opportunities for developing advanced photonic nanotools for a broad range of applications. PMID:26245759

  12. Rational Design of Photonic Dust from Nanoporous Anodic Alumina Films: A Versatile Photonic Nanotool for Visual Sensing

    NASA Astrophysics Data System (ADS)

    Chen, Yuting; Santos, Abel; Wang, Ye; Kumeria, Tushar; Ho, Daena; Li, Junsheng; Wang, Changhai; Losic, Dusan

    2015-08-01

    Herein, we present a systematic study on the development, optimisation and applicability of interferometrically coloured distributed Bragg reflectors based on nanoporous anodic alumina (NAA-DBRs), in the form of films and nanoporous microparticles, as visual/colorimetric analytical tools. Firstly, we synthesise a complete palette of NAA-DBRs by a galvanostatic pulse anodisation approach, in which the current density is altered in a periodic fashion in order to engineer the effective medium of the resulting photonic films in depth. NAA-DBR photonic films feature vivid colours that can be tuned across the UV-visible-NIR spectrum by structural engineering. Secondly, the effective medium of the resulting photonic films is assessed systematically by visual analysis and reflectometric interference spectroscopy (RIfS) in order to establish the optimal nanoporous platforms for developing visual/colorimetric tools. Then, we demonstrate the applicability of NAA-DBR photonic films as a chemically selective sensing platform for visual detection of mercury(II) ions. Finally, we generate a new nanomaterial, so-called photonic dust, by breaking down NAA-DBR films into nanoporous microparticles. The resulting microparticles (μP-NAA-DBRs) display vivid colours and are sensitive towards changes in their effective medium, opening new opportunities for developing advanced photonic nanotools for a broad range of applications.

  13. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-02-01

    The System for Automated Geoscientific Analyses (SAGA) is an open-source Geographic Information System (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modularly organized software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, an easily approachable graphical user interface with many visualization options, a command line interpreter, and interfaces to scripting languages such as R and Python. The current version 2.1.4 offers more than 700 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Further, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  14. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-07-01

    The System for Automated Geoscientific Analyses (SAGA) is an open source geographic information system (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, a user friendly graphical user interface with many visualization options, a command line interpreter, and interfaces to interpreted languages like R and Python. The current version 2.1.4 offers more than 600 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Furthermore, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies, with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  15. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies where known patterns can be mined for. With a human in the loop, analysts can also bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics tools for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that must be addressed before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  16. Sensors for observing ecosystem status

    NASA Astrophysics Data System (ADS)

    Kröger, S.; Parker, E. R.; Metcalfe, J. D.; Greenwood, N.; Forster, R. M.; Sivyer, D. B.; Pearce, D. J.

    2009-11-01

    This paper aims to review the availability and application of sensors for observing marine ecosystem status. It gives a broad overview of important ecosystem variables to be investigated, such as biogeochemical cycles, primary and secondary production, species distribution, animal movements, habitats and pollutants. Some relevant legislative drivers are listed, as they provide one context in which ecosystem studies are undertaken. In addition to literature cited within the text, the paper contains some useful web links to assist the reader in making an informed instrument choice, as the authors feel that the topic is so broad that it is impossible to discuss all relevant systems or to provide appropriate detail for those discussed. It is therefore an introduction to how and why ecosystem status is currently observed, what variables are quantified, from what platforms, using remote sensing or in-situ measurements, and gives examples of useful sensor-based tools, starting with those presently available, moving to those under development, and highlighting sensors not yet realised but desirable for future studies.

  17. Sensors for observing ecosystem status

    NASA Astrophysics Data System (ADS)

    Kröger, S.; Parker, E. R.; Metcalfe, J. D.; Greenwood, N.; Forster, R. M.; Sivyer, D. B.; Pearce, D. J.

    2009-04-01

    This paper aims to review the availability and application of sensors for observing marine ecosystem status. It gives a broad overview of important ecosystem variables to be investigated, such as biogeochemical cycles, primary and secondary production, species distribution, animal movements, habitats and pollutants. Some relevant legislative drivers are listed, as they provide one context in which ecosystem studies are undertaken. In addition to literature cited within the text, the paper contains some useful web links to assist the reader in making an informed instrument choice, as the authors feel that the topic is so broad that it is impossible to discuss all relevant systems or to provide appropriate detail for those discussed. This is therefore an introduction to how and why ecosystem status is currently observed, what variables are quantified, from what platforms, using remote sensing or in-situ measurements, and gives examples of useful sensor-based tools, starting with those presently available, moving to those under development, and highlighting sensors not yet realised but desirable for future studies.

  18. The 20th anniversary of EMBnet: 20 years of bioinformatics for the Life Sciences community

    PubMed Central

    D'Elia, Domenica; Gisel, Andreas; Eriksson, Nils-Einar; Kossida, Sophia; Mattila, Kimmo; Klucar, Lubos; Bongcam-Rudloff, Erik

    2009-01-01

    The EMBnet Conference 2008, focusing on 'Leading Applications and Technologies in Bioinformatics', was organized by the European Molecular Biology network (EMBnet) to celebrate its 20th anniversary. Since its foundation in 1988, EMBnet has been working to promote collaborative development of bioinformatics services and tools to serve the European community of molecular biology laboratories. This conference was the first meeting organized by the network that was open to the international scientific community outside EMBnet. The conference covered a broad range of research topics in bioinformatics with a main focus on new achievements and trends in emerging technologies supporting genomics, transcriptomics and proteomics analyses such as high-throughput sequencing and data managing, text and data-mining, ontologies and Grid technologies. Papers selected for publication, in this supplement to BMC Bioinformatics, cover a broad range of the topics treated, providing also an overview of the main bioinformatics research fields that the EMBnet community is involved in. PMID:19534734

  19. Multiple-locus variable number of tandem repeat analysis (MLVA) of Irish verocytotoxigenic Escherichia coli O157 from feedlot cattle: uncovering strain dissemination routes.

    PubMed

    Murphy, Mary; Minihan, Donal; Buckley, James F; O'Mahony, Micheál; Whyte, Paul; Fanning, Séamus

    2008-01-24

    The identification of the routes of dissemination of Escherichia coli (E. coli) O157 through a cohort of cattle is a critical step in controlling this pathogen at farm level. The aim of this study was to identify potential routes of dissemination of E. coli O157 using Multiple-Locus Variable number of tandem repeat Analysis (MLVA). Thirty-eight environmental and sixteen cattle faecal isolates, which were detected in four adjacent pens over a four-month period, were sub-typed. MLVA separated these isolates into broadly defined clusters comprising twelve MLVA types. Strain diversity was observed within pens, individual cattle and the environment. MLVA is a broadly useful and convenient tool for uncovering the dissemination of E. coli O157 in the environment and for supporting improved on-farm management of this important pathogen. These data identified diverse strain types based on amplification of VNTR markers in each case.
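    To illustrate how MLVA sub-typing groups isolates, each MLVA profile can be treated as a tuple of repeat copy numbers at the VNTR loci, with isolates clustered when their profiles differ at only a few loci. The profiles, locus count, and threshold below are hypothetical, chosen only to show the clustering idea, not data from the study:

    ```python
    from itertools import combinations

    def mlva_distance(a, b):
        """Number of VNTR loci at which two MLVA profiles differ."""
        return sum(x != y for x, y in zip(a, b))

    def cluster(profiles, max_diff):
        """Single-linkage clustering via union-find: isolates whose
        profiles differ at <= max_diff loci share a cluster."""
        parent = list(range(len(profiles)))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        for i, j in combinations(range(len(profiles)), 2):
            if mlva_distance(profiles[i], profiles[j]) <= max_diff:
                parent[find(i)] = find(j)
        groups = {}
        for i in range(len(profiles)):
            groups.setdefault(find(i), []).append(i)
        return sorted(groups.values())

    # Hypothetical 5-locus profiles for six isolates from two pens.
    profiles = [(3, 7, 2, 9, 4), (3, 7, 2, 9, 4), (3, 7, 3, 9, 4),
                (6, 1, 2, 5, 8), (6, 1, 2, 5, 8), (6, 2, 2, 5, 8)]
    clusters = cluster(profiles, max_diff=1)   # two broad clusters
    ```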

  20. Basic and functional effects of transcranial Electrical Stimulation (tES)-An introduction.

    PubMed

    Yavari, Fatemeh; Jamil, Asif; Mosayebi Samani, Mohsen; Vidor, Liliane Pinto; Nitsche, Michael A

    2018-02-01

    Non-invasive brain stimulation (NIBS) has been gaining increased popularity in human neuroscience research in recent years. Among the emerging NIBS tools is transcranial electrical stimulation (tES), whose main modalities are transcranial direct and alternating current stimulation (tDCS, tACS). In tES, a small current (usually less than 3 mA) is delivered through the scalp. Depending on its shape, density, and duration, the applied current induces acute or long-lasting effects on the excitability and activity of cerebral regions and brain networks. tES is increasingly applied in different domains to (a) explore human brain physiology with regard to plasticity and brain oscillations, (b) explore the impact of brain physiology on cognitive processes, and (c) treat clinical symptoms in neurological and psychiatric diseases. In this review, we give a broad overview of the main mechanisms and applications of these brain stimulation tools. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Accounting for biodiversity in the dairy industry.

    PubMed

    Sizemore, Grant C

    2015-05-15

    Biodiversity is an essential part of properly functioning ecosystems, yet the loss of biodiversity currently occurs at rates unparalleled in the modern era. One of the major causes of this phenomenon is habitat loss and modification as a result of intensified agricultural practices. This paper provides a starting point for considering biodiversity within dairy production, and, although focusing primarily on the United States, findings are applicable broadly. Biodiversity definitions and assessments (e.g., indicators, tools) are proposed and reviewed. Although no single indicator or tool currently meets all the needs of comprehensive assessment, many sustainable practices are readily adoptable as ways to conserve and promote biodiversity. These practices, as well as potential funding opportunities are identified. Given the state of uncertainty in addressing the complex nature of biodiversity assessments, the adoption of generally sustainable environmental practices may be the best currently available option for protecting biodiversity on dairy lands. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. External validity and anchoring heuristics: application of DUNDRUM-1 to secure service gatekeeping in South Wales.

    PubMed

    Lawrence, Daniel; Davies, Tracey-Lee; Bagshaw, Ruth; Hewlett, Paul; Taylor, Pamela; Watt, Andrew

    2018-02-01

    Aims and method Structured clinical judgement tools provide scope for the standardisation of forensic service gatekeeping and also allow identification of heuristics in this decision process. The DUNDRUM-1 triage tool was completed retrospectively for 121 first-time referrals to forensic services in South Wales. Fifty were admitted to medium security, 49 to low security and 22 remained in open conditions. DUNDRUM-1 total scores differed appropriately between different levels of security. However, regression revealed heuristic anchoring on the 'legal process' and 'immediacy of risk due to mental disorder' items. Clinical implications Patient placement was broadly aligned with DUNDRUM-1 recommendations. However, not all triage items informed gatekeeping decisions. It remains to be seen whether decisions anchored in this way are effective. Declaration of interest Dr Mark Freestone gave permission for AUC values from Freestone et al. (2015) to be presented here for comparison.

  3. Multi-factor energy price models and exotic derivatives pricing

    NASA Astrophysics Data System (ADS)

    Hikspoors, Samuel

    The high pace at which many of the world's energy markets have gradually been opened to competition has generated a significant amount of new financial activity. Academicians and practitioners alike have recently started to develop the tools of energy derivatives pricing/hedging as a quantitative topic of its own. The energy contract structures, as well as their underlying asset properties, set the energy risk management industry apart from its more standard equity and fixed income counterparts. This thesis contributes to these broad market developments by participating in the advancement of the mathematical tools aimed at a better theory of energy contingent claim pricing/hedging. We propose several realistic two-factor and three-factor models for spot and forward price processes that generalize some well-known and standard modeling assumptions. We develop the associated pricing methodologies and propose stable calibration algorithms that motivate the application of the relevant modeling schemes.
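    A minimal example of the two-factor modeling idea referred to here is the classic decomposition of the log spot price into a mean-reverting short-term deviation and a slowly drifting long-term level (as in Schwartz-Smith-type models, which this thesis generalizes). The sketch below simulates such a process; the parameter names and values are illustrative, not taken from the thesis:

    ```python
    import math
    import random

    def simulate_two_factor(s0, kappa, mu, sigma_chi, sigma_xi, dt, n, seed=1):
        """Simulate S_t = exp(chi_t + xi_t): chi is an Ornstein-Uhlenbeck
        short-term deviation reverting to zero at speed kappa, xi an
        arithmetic Brownian long-term log-level with drift mu."""
        rng = random.Random(seed)
        chi, xi = 0.0, math.log(s0)
        path = [s0]
        for _ in range(n):
            chi += -kappa * chi * dt + sigma_chi * math.sqrt(dt) * rng.gauss(0, 1)
            xi += mu * dt + sigma_xi * math.sqrt(dt) * rng.gauss(0, 1)
            path.append(math.exp(chi + xi))
        return path

    # One year of daily prices from an illustrative parameter set.
    path = simulate_two_factor(s0=50.0, kappa=1.5, mu=0.02,
                               sigma_chi=0.3, sigma_xi=0.1, dt=1/252, n=252)
    ```

    Forward prices and calibration would follow from the known closed-form moments of chi and xi; a third factor (e.g. stochastic long-term volatility) extends the same skeleton.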

  4. Mac OS X for Astronomy

    NASA Astrophysics Data System (ADS)

    Pierfederici, F.; Pirzkal, N.; Hook, R. N.

    Mac OS X is the new Unix-based version of the Macintosh operating system. It combines a high-performance DisplayPDF user interface with a standard BSD UNIX subsystem and gives users simultaneous access to a broad range of applications that were not previously available on a single system, such as Microsoft Office and Adobe Photoshop, as well as legacy X11-based scientific tools and packages like IRAF, SuperMongo, MIDAS, etc. The combination of a modern GUI layered on top of a familiar UNIX environment paves the way for new, more flexible and powerful astronomical tools to be developed while ensuring compatibility with existing, older programs. In this paper, we outline the strengths of the Mac OS X platform in a scientific environment, astronomy in particular, and point to the numerous astronomical software packages available for this platform, most notably the Scisoft collection, which we have compiled.

  5. Toward Personalized Control of Human Gut Bacterial Communities.

    PubMed

    David, Lawrence A

    2018-01-01

    A key challenge in microbiology will be developing tools for manipulating human gut bacterial communities. Our ability to predict and control the dynamics of these communities is now in its infancy. To manage human gut microbiota, I am developing methods in three research domains. First, I am refining in vitro tools to experimentally study gut microbes at high throughput and in controlled settings. Second, I am adapting "big data" techniques to overcome statistical challenges confronting microbiota modeling. Third, I am testing study designs that can streamline human testing of microbiota manipulations. Assembling these methods creates new challenges, including training scientists who can work across disciplines such as engineering, ecology, and medicine. Nevertheless, I envision that overcoming these obstacles will enable my group to construct platforms that can personalize microbiota treatments, particularly ones based on diet. More broadly, I anticipate that such platforms will have applications across fields such as agriculture, biotechnology, and environmental management.

  6. Direct push driven in situ color logging tool (CLT): technique, analysis routines, and application

    NASA Astrophysics Data System (ADS)

    Werban, U.; Hausmann, J.; Dietrich, P.; Vienken, T.

    2014-12-01

    Direct push technologies have recently seen broad development, providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil colors - proxy information that relates to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines have existed for handling highly resolved (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daublet4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data on its own remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data to specific layer properties. Hence, we provide exemplary results from a joint interpretation of in situ-obtained soil color data and state-of-the-art direct push based profiling tool data and discuss the benefit of the additional data. The developed routine is capable of transferring colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high-resolution data (e.g., cone penetration testing and soil sampling). The technique thus enables enhanced profiling by providing another reproducible high-resolution parameter for analyzing subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. It is our intention to improve color measurements in terms of both method of application and data interpretation, making them useful for characterizing vadose zone/soil/sediment properties.
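    The Haar step of the wavelet denoising described above can be sketched in a few lines of pure Python. This is a minimal one-level illustration with invented numbers, not the authors' processing chain (which also uses Daubechies-4 wavelets and multilevel transforms):

```python
def haar_step(signal):
    """One level of the Haar transform: pairwise averages
    (approximation) and pairwise differences (detail)."""
    approx = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def haar_denoise(signal, threshold):
    """Zero small detail coefficients, then invert the transform."""
    approx, detail = haar_step(signal)
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])   # inverse Haar: average +/- difference
    return out

# Small pairwise jitter is flattened, while the coarse step between
# the two "layers" (first four vs. last four samples) is preserved.
log = [1.0, 1.2, 0.9, 1.1, 5.0, 5.1, 4.8, 5.2]
print(haar_denoise(log, threshold=0.15))
```

    Thresholding the detail coefficients is what suppresses the small-scale natural variability while retaining lithologically meaningful contrasts.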

  7. LifeWatchGreece Portal development: architecture, implementation and challenges for a biodiversity research e-infrastructure.

    PubMed

    Gougousis, Alexandros; Bailly, Nicolas

    2016-01-01

    Biodiversity data is characterized by its cross-disciplinary character, the extremely broad range of data types and structures, and the plethora of different data sources providing resources for the same piece of information in a heterogeneous way. Since the inception of the web two decades ago, there have been multiple initiatives to connect, aggregate, share, and publish biodiversity data, and to establish data and work flows in order to analyze them. The European program LifeWatch aims at establishing a distributed network of nodes implementing virtual research environments in Europe to facilitate the work of biodiversity researchers and managers. LifeWatchGreece is one of these nodes, where a portal was developed offering access to a suite of virtual laboratories and e-services. Despite its strict definition in information technology, in practice "portal" is a fairly broad term that embraces many web architectures. In the biodiversity domain, the term "portal" is usually used to indicate either a web site that provides access to a single data repository or an aggregation of data repositories (like http://indiabiodiversity.org/, http://www.mountainbiodiversity.org/, http://data.freshwaterbiodiversity.eu), a web site that gathers information about various online biodiversity tools (like http://test-eubon.ebd.csic.es/, http://marine.lifewatch.eu/), or a web site that simply gathers information and news about the biodiversity domain (like http://chm.moew.government.bg). LifeWatchGreece's portal takes the concept of a portal a step further. In strict IT terms, LifeWatchGreece's portal is partly a portal, partly a platform and partly an aggregator. It includes a number of biodiversity-related web tools integrated into a centrally-controlled software ecosystem. This ecosystem includes subsystems for access control, traffic monitoring, user notifications and web tool management.
These subsystems are shared by all the web tools that have been integrated into the portal and are thereby part of this ecosystem. These web tools are not external, completely independent web applications, as is the case in most other portals. A quite obvious (to the user) indication of this is the Single-Sign-On (SSO) functionality for all tools and the common user interface wrapper that most of these tools use. Another, less obvious, example is the common user profile that is shared and can be utilized by all tools (e.g. the user's timezone).

  8. 76 FR 9777 - Recent Postings of Broadly Applicable Alternative Test Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Applicable Alternative Test Methods AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of availability. SUMMARY: This notice announces the broadly applicable alternative test method approval decisions... technical questions about individual alternative test method decisions, refer to the contact person...

  9. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  10. The Individual Basic Facts Assessment Tool

    ERIC Educational Resources Information Center

    Tait-McCutcheon, Sandi; Drake, Michael

    2015-01-01

    There is an identified and growing need for a levelled diagnostic basic facts assessment tool that provides teachers with formative information about students' mastery of a broad range of basic fact sets. The Individual Basic Facts Assessment tool has been iteratively and cumulatively developed, trialled, and refined with input from teachers and…

  11. Ergonomics action research II: a framework for integrating HF into work system design.

    PubMed

    Neumann, W P; Village, J

    2012-01-01

    This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.

  12. GlycoWorkbench: a tool for the computer-assisted annotation of mass spectra of glycans.

    PubMed

    Ceroni, Alessio; Maass, Kai; Geyer, Hildegard; Geyer, Rudolf; Dell, Anne; Haslam, Stuart M

    2008-04-01

    Mass spectrometry is the main analytical technique currently used to address the challenges of glycomics as it offers unrivalled levels of sensitivity and the ability to handle complex mixtures of different glycan variations. Determination of glycan structures from analysis of MS data is a major bottleneck in high-throughput glycomics projects, and robust solutions to this problem are of critical importance. However, all the approaches currently available have inherent restrictions on the types of glycans they can identify, and none of them has proved to be a definitive tool for glycomics. GlycoWorkbench is a software tool developed by the EUROCarbDB initiative to assist the manual interpretation of MS data. The main task of GlycoWorkbench is to evaluate a set of structures proposed by the user by matching the corresponding theoretical list of fragment masses against the list of peaks derived from the spectrum. The tool provides an easy-to-use graphical interface, a comprehensive and growing set of structural constituents, an exhaustive collection of fragmentation types, and a broad list of annotation options. The aim of GlycoWorkbench is to offer complete support for the routine interpretation of MS data. The software is available for download from: http://www.eurocarbdb.org/applications/ms-tools.
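    The core matching step described above - comparing theoretical fragment masses against observed peaks within an m/z tolerance - can be sketched as follows. The fragment names, masses and tolerance here are invented for illustration and do not reflect GlycoWorkbench's actual data structures:

```python
def annotate_peaks(peaks, fragments, tol=0.5):
    """Map each observed peak m/z to the names of theoretical
    fragments whose mass lies within tol of the peak."""
    return {p: [name for name, mass in fragments.items()
                if abs(p - mass) <= tol]
            for p in peaks}

# Hypothetical theoretical fragment masses and observed peak list.
fragments = {"B2": 528.2, "Y3": 893.3, "C2": 545.2}
peaks = [528.4, 700.1, 893.1]
print(annotate_peaks(peaks, fragments))
# 528.4 matches B2, 893.1 matches Y3, and 700.1 stays unassigned
```

    In practice the theoretical list is generated from the user's proposed structures across many fragmentation types, but the annotation reduces to this tolerance-window comparison.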

  13. A broad-spectrum, efficient and nontransgenic approach to control plant viruses by application of salicylic acid and jasmonic acid.

    PubMed

    Shang, Jing; Xi, De-Hui; Xu, Fei; Wang, Shao-Dong; Cao, Sen; Xu, Mo-Yun; Zhao, Ping-Ping; Wang, Jian-Hui; Jia, Shu-Dan; Zhang, Zhong-Wei; Yuan, Shu; Lin, Hong-Hui

    2011-02-01

    Plant viruses cause many diseases that lead to significant economic losses. However, most approaches to control plant viruses, including transgenic methods and drugs, are limited to particular plant or virus species and are not very effective. We introduce an application of jasmonic acid (JA) and salicylic acid (SA), a broad-spectrum, efficient and nontransgenic method, to improve plant resistance to RNA viruses. Applying 0.06 mM JA and then 0.1 mM SA 24 h later enhanced resistance to Cucumber mosaic virus (CMV), Tobacco mosaic virus (TMV) and Turnip crinkle virus (TCV) in Arabidopsis, tobacco, tomato and hot pepper. The inhibition of virus replication usually reached 80-90%. The putative molecular mechanism was investigated. Some possible factors affecting the synergism of JA and SA have been defined, including WRKY53, WRKY70, PDF1.2, MPK4, MPK2, MPK3, MPK5, MPK12, MPK14, MKK1, MKK2, and MKK6, and all genes involved in the synergism of JA and SA were investigated. This approach is safe for human beings and environmentally friendly, and it shows potential as a strong tool for crop protection against plant viruses.

  14. Post hoc support vector machine learning for impedimetric biosensors based on weak protein-ligand interactions.

    PubMed

    Rong, Y; Padron, A V; Hagerty, K J; Nelson, N; Chi, S; Keyhani, N O; Katz, J; Datta, S P A; Gomes, C; McLamore, E S

    2018-04-30

    Impedimetric biosensors for measuring small molecules based on weak/transient interactions between bioreceptors and target analytes are a challenge for detection electronics, particularly in field studies or in the analysis of complex matrices. Protein-ligand binding sensors have enormous potential for biosensing, but achieving accuracy in complex solutions is a major challenge. There is a need for simple post hoc analytical tools that are not computationally expensive, yet provide near real time feedback on data derived from impedance spectra. Here, we show the use of a simple, open source support vector machine learning algorithm for analyzing impedimetric data in lieu of using equivalent circuit analysis. We demonstrate two different protein-based biosensors to show that the tool can be used for various applications. We conclude with a mobile phone-based demonstration focused on the measurement of acetone, an important biomarker related to the onset of diabetic ketoacidosis. In all conditions tested, the open source classifier was capable of performing as well as, or better than, the equivalent circuit analysis for characterizing weak/transient interactions between a model ligand (acetone) and a small chemosensory protein derived from the tsetse fly. In addition, the tool has a low computational requirement, facilitating use in mobile acquisition systems such as mobile phones. The protocol is deployed through Jupyter notebook (an open source computing environment available for mobile phone, tablet or computer use) and the code was written in Python. For each of the applications, we provide step-by-step instructions in English, Spanish, Mandarin and Portuguese to facilitate widespread use. All codes were based on scikit-learn, an open source machine learning library for Python, and were run in Jupyter notebooks.
The tool can easily be integrated with the mobile biosensor equipment for rapid detection, facilitating use by a broad range of impedimetric biosensor users. This post hoc analysis tool can serve as a launchpad for the convergence of nanobiosensors in planetary health monitoring applications based on mobile phone hardware.
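    The post hoc classification approach described above can be sketched with scikit-learn, the library the authors name. This is a minimal illustration, not the published pipeline: the "impedance spectrum" feature vectors and class separations below are invented:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical features per measurement, e.g. impedance magnitude
# sampled at four frequencies. Class 0 = blank, class 1 = acetone.
blank = rng.normal(loc=1.0, scale=0.1, size=(50, 4))
acetone = rng.normal(loc=1.4, scale=0.1, size=(50, 4))
X = np.vstack([blank, acetone])
y = np.array([0] * 50 + [1] * 50)

# Train an SVM classifier and score it on held-out spectra,
# in lieu of fitting an equivalent circuit model.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

    Because the classifier only needs a feature vector per spectrum, this kind of analysis runs comfortably on low-power hardware, which is the point of the mobile phone demonstration.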

  15. Novel Peptide Sequence (“IQ-tag”) with High Affinity for NIR Fluorochromes Allows Protein and Cell Specific Labeling for In Vivo Imaging

    PubMed Central

    McCarthy, Jason R.; Weissleder, Ralph

    2007-01-01

    Background Probes that allow site-specific protein labeling have become critical tools for visualizing biological processes. Methods Here we used phage display to identify a novel peptide sequence with nanomolar affinity for near infrared (NIR) (benz)indolium fluorochromes. The developed peptide sequence (“IQ-tag”) allows detection of NIR dyes in a wide range of assays including ELISA, flow cytometry, high throughput screens, microscopy, and optical in vivo imaging. Significance The described method is expected to have broad utility in numerous applications, namely site-specific protein imaging, target identification, cell tracking, and drug development. PMID:17653285

  16. Peptide-Based Materials for Cartilage Tissue Regeneration.

    PubMed

    Hastar, Nurcan; Arslan, Elif; Guler, Mustafa O; Tekinay, Ayse B

    2017-01-01

    Cartilaginous tissue requires structural and metabolic support after traumatic or chronic injuries because of its limited capacity for regeneration. However, current techniques for cartilage regeneration are either invasive or ineffective for long-term repair, so alternative approaches to regenerate cartilage tissue are needed. Versatile scaffolds formed by biomaterials are therefore promising tools for cartilage regeneration. Bioactive scaffolds further enhance their utility in a broad range of applications, including the treatment of major cartilage defects. This chapter provides an overview of cartilage tissue, tissue defects, and the methods used for regeneration, with emphasis on peptide scaffold materials that can be used to supplement or replace current medical treatment options.

  17. Recent Advances in Chemical Modification of Peptide Nucleic Acids

    PubMed Central

    Rozners, Eriks

    2012-01-01

    Peptide nucleic acid (PNA) has become an extremely powerful tool in chemistry and biology. Although PNA recognizes single-stranded nucleic acids with exceptionally high affinity and sequence selectivity, there is considerable ongoing effort to further improve properties of PNA for both fundamental science and practical applications. The present paper discusses selected recent studies that improve on cellular uptake and binding of PNA to double-stranded DNA and RNA. The focus is on chemical modifications of PNA's backbone and heterocyclic nucleobases. The paper selects representative recent studies and does not attempt to provide comprehensive coverage of the broad and vibrant field of PNA modification. PMID:22991652

  18. DNA testing in neurologic diseases.

    PubMed

    O'Brien, D P; Leeb, T

    2014-01-01

    DNA testing is available for a growing number of hereditary diseases in neurology and other specialties. In addition to guiding breeding decisions, DNA tests are important tools in the diagnosis of diseases, particularly in conditions for which clinical signs are relatively nonspecific. DNA testing also can provide valuable insight into the risk of hereditary disease when decisions about treating comorbidities are being made. Advances in technology and bioinformatics will make broad screening for potential disease-causing mutations available soon. As DNA tests come into more common use, it is critical that clinicians understand the proper application and interpretation of these test results. Copyright © 2014 by the American College of Veterinary Internal Medicine.

  19. How sulphate-reducing microorganisms cope with stress: Lessons from systems biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, J.; He, Q.; Hemme, C.L.

    2011-04-01

    Sulphate-reducing microorganisms (SRMs) are a phylogenetically diverse group of anaerobes encompassing distinct physiologies with a broad ecological distribution. As SRMs have important roles in the biogeochemical cycling of carbon, nitrogen, sulphur and various metals, an understanding of how these organisms respond to environmental stresses is of fundamental and practical importance. In this Review, we highlight recent applications of systems biology tools in studying the stress responses of SRMs, particularly Desulfovibrio spp., at the cell, population, community and ecosystem levels. The syntrophic lifestyle of SRMs is also discussed, with a focus on system-level analyses of adaptive mechanisms. Such information is important for understanding the microbiology of the global sulphur cycle and for developing biotechnological applications of SRMs for environmental remediation, energy production, biocorrosion control, wastewater treatment and mineral recovery.

  20. Unsupervised Tensor Mining for Big Data Practitioners.

    PubMed

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated with each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.
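    The multiaspect modeling idea can be sketched with NumPy: interactions in a social network become entries of a 3-way tensor, and a rank-1 tensor (the building block of CP-style decompositions) is the outer product of one factor vector per aspect. All data here is invented for illustration:

```python
import numpy as np

# A (sender, receiver, channel) interaction tensor for 4 people
# and 3 hypothetical communication channels.
n_people, n_channels = 4, 3
T = np.zeros((n_people, n_people, n_channels))
T[0, 1, 0] = 5.0   # person 0 -> person 1: 5 interactions on channel 0
T[1, 0, 0] = 3.0
T[2, 3, 1] = 7.0

# A rank-1 "concept": a sender community, a receiver community, and
# the channel they favor, combined via an outer product.
a = np.array([1.0, 1.0, 0.0, 0.0])   # senders in the community
b = np.array([1.0, 1.0, 0.0, 0.0])   # receivers in the community
c = np.array([1.0, 0.0, 0.0])        # dominant channel
rank1 = np.einsum("i,j,k->ijk", a, b, c)   # shape (4, 4, 3)
```

    A CP decomposition approximates T by a sum of a few such rank-1 terms, each of which is directly interpretable as a co-occurring group of senders, receivers and channels.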

  1. Molecularly imprinted solid-phase extraction in the analysis of agrochemicals.

    PubMed

    Yi, Ling-Xiao; Fang, Rou; Chen, Guan-Hua

    2013-08-01

    The molecular imprinting technique is a recognition technology with highly predetermined selectivity. Molecularly imprinted polymers (MIPs) can be applied to the cleanup and preconcentration of analytes as the selective adsorbent in solid-phase extraction (SPE). In recent years a new type of SPE has emerged, molecularly imprinted polymer solid-phase extraction (MISPE), which has been widely applied to the extraction of agrochemicals. In this review, the mechanism of the molecular imprinting technique and the methodology of MIP preparation are explained. The extraction modes of MISPE, both offline and online, are discussed, and the applications of MISPE in the analysis of agrochemicals such as herbicides, fungicides and insecticides are summarized. It is concluded that MISPE is a powerful tool for selectively isolating agrochemicals from real samples, with higher extraction and cleanup efficiency than commercial SPE, and that it has great potential for broad application.

  2. Call for papers for special issue of Journal of Molecular Spectroscopy focusing on "Frequency-comb spectroscopy"

    NASA Astrophysics Data System (ADS)

    Foltynowicz, Aleksandra; Picqué, Nathalie; Ye, Jun

    2018-05-01

    Frequency combs are becoming enabling tools for many applications in science and technology, beyond the original purpose of frequency metrology of simple atoms. The precisely evenly spaced narrow lines of a laser frequency comb inspire intriguing approaches to molecular spectroscopy, designed and implemented by a growing community of scientists. Frequency-comb spectroscopy advances the frontiers of molecular physics across the entire electromagnetic spectrum. Used as frequency rulers, frequency combs enable absolute frequency measurements and precise line shape studies of molecular transitions, e.g. for tests of fundamental physics and improved determinations of fundamental constants. As light sources interrogating the molecular samples, they dramatically improve the resolution, precision, sensitivity and acquisition time of broad spectral-bandwidth spectroscopy and open up new opportunities and applications at the leading edge of molecular spectroscopy and sensing.

  3. Progress with modeling activity landscapes in drug discovery.

    PubMed

    Vogt, Martin

    2018-04-19

    Activity landscapes (ALs) are representations and models of compound data sets annotated with a target-specific activity. In contrast to quantitative structure-activity relationship (QSAR) models, ALs aim at characterizing structure-activity relationships (SARs) on a large-scale level encompassing all active compounds for specific targets. The popularity of AL modeling has grown substantially with the public availability of large activity-annotated compound data sets. AL modeling crucially depends on molecular representations and similarity metrics used to assess structural similarity. Areas covered: The concepts of AL modeling are introduced and their basis in quantitatively assessing molecular similarity is discussed. The different types of AL modeling approaches are introduced. AL designs can broadly be divided into three categories: compound-pair based, dimensionality reduction, and network approaches. Recent developments for each of these categories are discussed focusing on the application of mathematical, statistical, and machine learning tools for AL modeling. AL modeling using chemical space networks is covered in more detail. Expert opinion: AL modeling has remained a largely descriptive approach for the analysis of SARs. Beyond mere visualization, the application of analytical tools from statistics, machine learning and network theory has aided in the sophistication of AL designs and provides a step forward in transforming ALs from descriptive to predictive tools. To this end, optimizing representations that encode activity-relevant features of molecules might prove to be a crucial step.
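    The structural similarity assessment that AL modeling depends on is commonly quantified with the Tanimoto coefficient over binary molecular fingerprints. A minimal sketch, with fingerprints represented as sets of "on" bit positions (the bit positions below are invented):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) coefficient of two binary fingerprints,
    each given as the set of its 'on' bit positions."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)

mol_a = {1, 4, 7, 9}
mol_b = {1, 4, 8, 9}
print(tanimoto(mol_a, mol_b))  # 3 shared of 5 distinct bits -> 0.6
```

    Pairs with high Tanimoto similarity but large potency differences are the "activity cliffs" that make landscape representations informative.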

  4. The Insight ToolKit image registration framework

    PubMed Central

    Avants, Brian B.; Tustison, Nicholas J.; Stauffer, Michael; Song, Gang; Wu, Baohua; Gee, James C.

    2014-01-01

    Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to more easily focus on the design/comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices, will provide a more extensive foundation against which to evaluate new work in image registration, and will give application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling. PMID:24817849

  5. The CRISPR/Cas Genome-Editing Tool: Application in Improvement of Crops

    PubMed Central

    Khatodia, Surender; Bhatotia, Kirti; Passricha, Nishat; Khurana, S. M. P.; Tuteja, Narendra

    2016-01-01

    The Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-associated Cas9/sgRNA system is a novel targeted genome-editing technique derived from the bacterial immune system. It is an inexpensive, easy, user-friendly and rapidly adopted genome-editing tool that is becoming a revolutionary paradigm. This technique enables precise genomic modifications in many different organisms and tissues. Cas9 protein is an RNA-guided endonuclease used for creating targeted double-stranded breaks, requiring only a short RNA sequence to confer recognition of the target in animals and plants. The development of genetically edited (GE) crops similar to those produced by conventional or mutation breeding makes this potential technique a promising and extremely versatile tool for sustainable, productive agriculture to better feed a rapidly growing population in a changing climate. The emerging areas of research for genome editing in plants include interrogating gene function, rewiring regulatory signaling networks, and sgRNA libraries for high-throughput loss-of-function screening. In this review, we describe the broad applicability of Cas9 nuclease-mediated targeted plant genome editing for the development of designer crops. The regulatory uncertainty and social acceptance of plant breeding by Cas9 genome editing are also discussed. With this powerful and innovative technique, designer GE non-GM plants could further advance climate-resilient and sustainable agriculture and maximize yield by combating abiotic and biotic stresses. PMID:27148329

  6. Turbomachinery noise

    NASA Astrophysics Data System (ADS)

    Groeneweg, John F.; Sofrin, Thomas G.; Rice, Edward J.; Gliebe, Phillip R.

    1991-08-01

    Summarized here are key advances in experimental techniques and theoretical applications which point the way to a broad understanding and control of turbomachinery noise. On the experimental side, the development of effective inflow control techniques makes it possible to conduct, in ground based facilities, definitive experiments in internally controlled blade row interactions. Results can now be valid indicators of flight behavior and can provide a firm base for comparison with analytical results. Inflow control coupled with detailed diagnostic tools such as blade pressure measurements can be used to uncover the more subtle mechanisms such as rotor strut interaction, which can set tone levels for some engine configurations. Initial mappings of rotor wake-vortex flow fields have provided a data base for a first generation semiempirical flow disturbance model. Laser velocimetry offers a nonintrusive method for validating and improving the model. Digital data systems and signal processing algorithms are bringing mode measurement closer to a working tool that can be frequently applied to a real machine such as a turbofan engine. On the analytical side, models of most of the links in the chain from turbomachine blade source to far field observation point have been formulated. Three dimensional lifting surface theory for blade rows, including source noncompactness and cascade effects, blade row transmission models incorporating mode and frequency scattering, and modal radiation calculations, including hybrid numerical-analytical approaches, are tools which await further application.

  7. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  8. NIRS in clinical neurology - a 'promising' tool?

    PubMed

    Obrig, Hellmuth

    2014-01-15

    Near-infrared spectroscopy (NIRS) has become a relevant research tool in neuroscience. In special populations such as infants, and for special tasks such as walking, NIRS has asserted itself as a low-resolution functional imaging technique which profits from its ease of application, portability and the option to co-register other neurophysiological and behavioral data in a 'near natural' environment. For clinical use in neurology this translates into the option to provide a bedside oximeter for the brain, broadly available at comparatively low cost. However, while some potential for routine brain monitoring during cardiac and vascular surgery and in neonatology has been established, NIRS is largely unknown to clinical neurologists. The article discusses some of the reasons for this lack of use in clinical neurology. Research using NIRS in three major neurologic diseases (cerebrovascular disease, epilepsy and headache) is reviewed. Additionally, the potential to exploit the established position of NIRS as a functional imaging tool with regard to clinical questions such as preoperative functional assessment and neurorehabilitation is discussed. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Complex multidisciplinary systems decomposition for aerospace vehicle conceptual design and technology acquisition

    NASA Astrophysics Data System (ADS)

    Omoragbon, Amen

    Although the Aerospace and Defense (A&D) industry is a significant contributor to the United States' economy, national prestige and national security, it experiences significant cost and schedule overruns. This problem is related to the differences between technology acquisition assessments and aerospace vehicle conceptual design. Acquisition assessments evaluate broad sets of alternatives with mostly qualitative techniques, while conceptual design tools evaluate a narrow set of alternatives with multidisciplinary tools. In order for these two fields to communicate effectively, a common platform for both concerns is desired. This research is an original contribution to a three-part solution to this problem. It discusses the decomposition step of an innovative technology and sizing tool generation framework. It identifies complex multidisciplinary system definitions as a bridge between acquisition and conceptual design. It establishes complex multidisciplinary building blocks that can be used to build synthesis systems as well as technology portfolios. It also describes a graphical user interface designed to aid in the decomposition process. Finally, it demonstrates an application of the methodology to a relevant acquisition and conceptual design problem posed by the US Air Force.

  10. 77 FR 8865 - Recent Postings of Broadly Applicable Alternative Test Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-15

    ... Applicable Alternative Test Methods AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of availability. SUMMARY: This notice announces the broadly applicable alternative test method approval decisions... INFORMATION CONTACT: An electronic copy of each alternative test method approval document is available on the...

  11. 75 FR 7593 - Recent Postings of Broadly Applicable Alternative Test Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-22

    ... Applicable Alternative Test Methods AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of availability. SUMMARY: This notice announces the broadly applicable alternative test method approval decisions... electronic copy of each alternative test method approval document is available on EPA's Web site at http...

  12. 78 FR 11174 - Recent Postings of Broadly Applicable Alternative Test Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-15

    ... Applicable Alternative Test Methods AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of availability. SUMMARY: This notice announces the broadly applicable alternative test method approval decisions... INFORMATION CONTACT: An electronic copy of each alternative test method approval document is available on the...

  13. Isotopic niches support the resource breadth hypothesis

    USGS Publications Warehouse

    Rader, Jonathan A.; Newsome, Seth D.; Sabat, Pablo; Chesser, R. Terry; Dillon, Michael E.; Martinez del Rio, Carlos

    2017-01-01

    Because a broad spectrum of resource use allows species to persist in a wide range of habitat types, and thus permits them to occupy large geographical areas, and because broadly distributed species have access to more diverse resource bases, the resource breadth hypothesis posits that the diversity of resources used by organisms should be positively related with the extent of their geographic ranges. We investigated isotopic niche width in a small radiation of South American birds in the genus Cinclodes. We analysed feathers of 12 species of Cinclodes to test the isotopic version of the resource breadth hypothesis and to examine the correlation between isotopic niche breadth and morphology. We found a positive correlation between the widths of hydrogen and oxygen isotopic niches (which estimate breadth of elevational range) and the widths of the carbon and nitrogen isotopic niches (which estimate the diversity of resources consumed, and hence of habitats used). We also found a positive correlation between broad isotopic niches and wing morphology. Our study not only supports the resource breadth hypothesis but also highlights the usefulness of stable isotope analyses as tools in the exploration of ecological niches. It is an example of a macroecological application of stable isotopes. It also illustrates the importance of scientific collections in ecological studies.

  14. Performance characterization of a combined material identification and screening algorithm

    NASA Astrophysics Data System (ADS)

    Green, Robert L.; Hargreaves, Michael D.; Gardner, Craig M.

    2013-05-01

    Portable analytical devices based on a gamut of technologies (Infrared, Raman, X-Ray Fluorescence, Mass Spectrometry, etc.) are now widely available. These tools have seen increasing adoption for field-based assessment by diverse users including military, emergency response, and law enforcement. Frequently, end-users of portable devices are non-scientists who rely on embedded software and the associated algorithms to convert collected data into actionable information. Two classes of problems commonly encountered in field applications are identification and screening. Identification algorithms are designed to scour a library of known materials and determine whether the unknown measurement is consistent with a stored response (or combination of stored responses). Such algorithms can be used to identify a material from many thousands of possible candidates. Screening algorithms evaluate whether at least a subset of features in an unknown measurement correspond to one or more specific substances of interest and are typically configured to detect potential target analytes from a small list. Thus, screening algorithms are much less broadly applicable than identification algorithms; however, they typically provide higher detection rates, which makes them attractive for specific applications such as chemical warfare agent or narcotics detection. This paper will present an overview and performance characterization of a combined identification/screening algorithm that has recently been developed. It will be shown that the combined algorithm provides enhanced detection capability more typical of screening algorithms while maintaining a broad identification capability. Additionally, we will highlight how this approach can enable users to incorporate situational awareness during a response.
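    The identification/screening distinction can be made concrete with a toy sketch: identification searches a whole library for the best overall match, while screening only checks for the characteristic bands of a short target list. The scoring metric (Pearson correlation), thresholds, and band lists below are illustrative assumptions, not the proprietary algorithm characterized in the paper:

    ```python
    import numpy as np

    def identify(unknown, library, threshold=0.95):
        """Identification: scour the whole library for the best-matching
        reference spectrum, scored by Pearson correlation."""
        def score(a, b):
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float(a @ b) / a.size
        name, ref = max(library.items(), key=lambda kv: score(unknown, kv[1]))
        s = score(unknown, ref)
        return (name, s) if s >= threshold else (None, s)

    def screen(unknown, targets, band_threshold=0.5):
        """Screening: only check whether the unknown shows intensity in the
        characteristic bands of each target on a short list."""
        hits = []
        for name, bands in targets.items():
            # bands: index ranges where the target absorbs strongly
            if all(unknown[lo:hi].max() >= band_threshold for lo, hi in bands):
                hits.append(name)
        return hits
    ```

    Screening inspects only a handful of bands per target, which is why it can be tuned for high detection rates on a short analyte list, whereas identification must discriminate among thousands of candidates.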

  15. Using Internet Based Paraphrasing Tools: Original Work, Patchwriting or Facilitated Plagiarism?

    ERIC Educational Resources Information Center

    Rogerson, Ann M.; McCarthy, Grace

    2017-01-01

    A casual comment by a student alerted the authors to the existence and prevalence of Internet-based paraphrasing tools. A subsequent quick Google search highlighted the broad range and availability of online paraphrasing tools which offer free 'services' to paraphrase large sections of text ranging from sentences, paragraphs, whole articles, book…

  16. A New Approach to Monte Carlo Simulations in Statistical Physics

    NASA Astrophysics Data System (ADS)

    Landau, David P.

    2002-08-01

    Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near 2nd order transitions and to metastability near 1st order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001).
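    The random walk in energy space described here is Wang-Landau sampling (Ref. [2]). A minimal sketch for a small 2D Ising lattice is below; the lattice size, flatness criterion, and stopping threshold are illustrative choices, not values from the talk:

    ```python
    import numpy as np

    def wang_landau_ising(L=4, flatness=0.8, ln_f_final=1e-3, seed=0):
        """Wang-Landau sampling: a random walk in energy space that estimates
        the density of states g(E) of an L x L Ising model (periodic bonds)."""
        rng = np.random.default_rng(seed)
        spins = rng.choice([-1, 1], size=(L, L))
        N = L * L

        def energy(s):
            # Count each nearest-neighbour bond once (periodic boundaries).
            return -int(np.sum(s * np.roll(s, 1, 0)) + np.sum(s * np.roll(s, 1, 1)))

        # Candidate energy levels -2N, -2N+4, ..., 2N (two are unreachable).
        levels = np.arange(-2 * N, 2 * N + 1, 4)
        index = {e: i for i, e in enumerate(levels)}
        log_g = np.zeros(levels.size)   # running estimate of ln g(E)
        hist = np.zeros(levels.size)    # visit histogram for the flatness test
        ln_f = 1.0                      # ln of the modification factor f

        E = energy(spins)
        while ln_f > ln_f_final:
            for _ in range(400 * N):
                i, j = rng.integers(L, size=2)
                # Energy change of a single spin flip.
                dE = 2 * spins[i, j] * (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                                        + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
                # Accept the flip with probability min(1, g(E)/g(E')).
                if np.log(rng.random()) < log_g[index[E]] - log_g[index[E + dE]]:
                    spins[i, j] *= -1
                    E += dE
                log_g[index[E]] += ln_f
                hist[index[E]] += 1
            visited = hist > 0
            if hist[visited].min() > flatness * hist[visited].mean():
                ln_f /= 2.0        # histogram is flat: refine f and restart count
                hist[:] = 0.0
        visited = log_g > 0
        return levels[visited], log_g[visited] - log_g[visited].min()
    ```

    Normalizing exp(ln g) so that the levels sum to 2^N recovers, for example, the twofold degeneracy of the ferromagnetic ground state, and canonical averages at any temperature then follow by direct summation over E.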

  17. Earth Science Data Education through Cooking Up Recipes

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Maskey, M.; Smith, T.; Conover, H.

    2016-12-01

    One of the major challenges in Earth science research and applications is understanding and applying the proper methods, tools, and software for using scientific data. These techniques are often difficult and time consuming to identify, requiring novice users to conduct extensive research, take classes, and reach out for assistance, thus hindering scientific discovery and real-world applications. To address these challenges, the Global Hydrology Resource Center (GHRC) DAAC has developed a series of data recipes that novice users such as students, decision makers, and general Earth scientists can leverage to learn how to use Earth science datasets. Once the data recipe content was finalized, GHRC computer and Earth scientists collaborated with a web and graphic designer to ensure the content is both attractively presented to data users and clearly communicated, to promote the education and use of Earth science data. The completed data recipes include, but are not limited to, tutorials, iPython Notebooks, resources, and tools necessary for addressing key difficulties in data use across a broad user base. The recipes not only enable non-traditional users to learn how to use data, but also curate and communicate common methods and approaches that may be difficult and time consuming for these users to identify.

  18. Clinical Application of Genome and Exome Sequencing as a Diagnostic Tool for Pediatric Patients: a Scoping Review of the Literature.

    PubMed

    Smith, Hadley Stevens; Swint, J Michael; Lalani, Seema R; Yamal, Jose-Miguel; de Oliveira Otto, Marcia C; Castellanos, Stephan; Taylor, Amy; Lee, Brendan H; Russell, Heidi V

    2018-05-14

    Availability of clinical genomic sequencing (CGS) has generated questions about the value of genome and exome sequencing as a diagnostic tool. Analysis of reported CGS application can inform uptake and direct further research. This scoping literature review aims to synthesize evidence on the clinical and economic impact of CGS. PubMed, Embase, and Cochrane were searched for peer-reviewed articles published between 2009 and 2017 on diagnostic CGS for infant and pediatric patients. Articles were classified according to sample size and whether economic evaluation was a primary research objective. Data on patient characteristics, clinical setting, and outcomes were extracted and narratively synthesized. Of 171 included articles, 131 were case reports, 40 were aggregate analyses, and 4 had a primary economic evaluation aim. Diagnostic yield was the only consistently reported outcome. Median diagnostic yield in aggregate analyses was 33.2% but varied by broad clinical categories and test type. Reported CGS use has rapidly increased and spans diverse clinical settings and patient phenotypes. Economic evaluations support the cost-saving potential of diagnostic CGS. Multidisciplinary implementation research, including more robust outcome measurement and economic evaluation, is needed to demonstrate clinical utility and cost-effectiveness of CGS.

  19. Networking—a statistical physics perspective

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho; Saad, David

    2013-03-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.

  20. Generalized Schemes for High Throughput Manipulation of the Desulfovibrio vulgaris Hildenborough Genome.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chhabra, Swapnil; Butland, Gareth; Elias, Dwayne A

    The ability to conduct advanced functional genomic studies of the thousands of sequenced bacteria has been hampered by the lack of available tools for making high-throughput chromosomal manipulations in a systematic manner that can be applied across diverse species. In this work, we highlight the use of synthetic biological tools to assemble custom suicide vectors with reusable and interchangeable DNA parts to facilitate chromosomal modification at designated loci. These constructs enable an array of downstream applications including gene replacement and creation of gene fusions with affinity purification or localization tags. We employed this approach to engineer chromosomal modifications in a bacterium that has previously proven difficult to manipulate genetically, Desulfovibrio vulgaris Hildenborough, to generate a library of 662 strains. Furthermore, we demonstrate how these modifications can be used for examining metabolic pathways, protein-protein interactions, and protein localization. The ubiquity of suicide constructs in gene replacement throughout biology suggests that this approach can be applied to engineer a broad range of species for a diverse array of systems biological applications and is amenable to high-throughput implementation.

  1. [An international neuropsychological assessment tool for children, adolescents, and adults with anorexia nervosa – the German adaptation of the Ravello Profile].

    PubMed

    van Noort, Betteke Maria; Pfeiffer, Ernst; Lehmkuhl, Ulrike; Kappel, Viola

    2013-11-01

    Adults with anorexia nervosa (AN) show weaknesses in several cognitive functions before and after weight restoration. There is a great demand for standardized examinations of executive functioning in the field of child and adolescent AN. Previous studies exhibited methodological inconsistencies regarding test selection and operationalization of cognitive functions, making the interpretation of their findings difficult. In order to overcome these inconsistencies, a neuropsychological assessment tool, the "Ravello Profile," was developed, though previously not available in German. This paper presents a German adaptation of the Ravello Profile and illustrates its applicability in children and adolescents via three case descriptions. The Ravello Profile was adapted for the German-speaking area. The applicability of the Ravello Profile was evaluated in three children and adolescents with AN. The cases presented confirm the feasible implementation of this adaptation of the Ravello Profile, both in children and adolescents. Hence, it enables a methodologically consistent examination of executive functioning in German-speaking children, adolescents, and adults with AN. Using the Ravello Profile, the role of cognitive functions in the development of AN can be systematically examined over a broad age range.

  2. The notes from nature tool for unlocking biodiversity records from museum records through citizen science

    PubMed Central

    Hill, Andrew; Guralnick, Robert; Smith, Arfon; Sallans, Andrew; Gillespie, Rosemary; Denslow, Michael; Gross, Joyce; Murrell, Zack; Conyers, Tim; Oboyski, Peter; Ball, Joan; Thomer, Andrea; Prys-Jones, Robert; de Torre, Javier; Kociolek, Patrick; Fortson, Lucy

    2012-01-01

    Legacy data from natural history collections contain invaluable and irreplaceable information about biodiversity in the recent past, providing a baseline for detecting change and forecasting the future of biodiversity on a human-dominated planet. However, these data are often not available in formats that facilitate use and synthesis. New approaches are needed to enhance the rates of digitization and data quality improvement. Notes from Nature provides one such novel approach by asking citizen scientists to help with transcription tasks. The initial web-based prototype of Notes from Nature is soon widely available and was developed collaboratively by biodiversity scientists, natural history collections staff, and experts in citizen science project development, programming and visualization. This project brings together digital images representing different types of biodiversity records including ledgers, herbarium sheets and pinned insects from multiple projects and natural history collections. Experts in developing web-based citizen science applications then designed and built a platform for transcribing textual data and metadata from these images. The end product is a fully open source web transcription tool built using the latest web technologies. The platform keeps volunteers engaged by initially explaining the scientific importance of the work via a short orientation, and then providing transcription “missions” of well-defined scope, along with dynamic feedback, interactivity and rewards. Transcribed records, along with record-level and process metadata, are provided back to the institutions. While the tool is being developed with new users in mind, it can serve a broad range of needs from novice to trained museum specialist. Notes from Nature has the potential to speed the rate of biodiversity data being made available to a broad community of users. PMID:22859890

  3. The notes from nature tool for unlocking biodiversity records from museum records through citizen science.

    PubMed

    Hill, Andrew; Guralnick, Robert; Smith, Arfon; Sallans, Andrew; Gillespie, Rosemary; Denslow, Michael; Gross, Joyce; Murrell, Zack; Conyers, Tim; Oboyski, Peter; Ball, Joan; Thomer, Andrea; Prys-Jones, Robert; de Torre, Javier; Kociolek, Patrick; Fortson, Lucy

    2012-01-01

    Legacy data from natural history collections contain invaluable and irreplaceable information about biodiversity in the recent past, providing a baseline for detecting change and forecasting the future of biodiversity on a human-dominated planet. However, these data are often not available in formats that facilitate use and synthesis. New approaches are needed to enhance the rates of digitization and data quality improvement. Notes from Nature provides one such novel approach by asking citizen scientists to help with transcription tasks. The initial web-based prototype of Notes from Nature is soon widely available and was developed collaboratively by biodiversity scientists, natural history collections staff, and experts in citizen science project development, programming and visualization. This project brings together digital images representing different types of biodiversity records including ledgers, herbarium sheets and pinned insects from multiple projects and natural history collections. Experts in developing web-based citizen science applications then designed and built a platform for transcribing textual data and metadata from these images. The end product is a fully open source web transcription tool built using the latest web technologies. The platform keeps volunteers engaged by initially explaining the scientific importance of the work via a short orientation, and then providing transcription "missions" of well-defined scope, along with dynamic feedback, interactivity and rewards. Transcribed records, along with record-level and process metadata, are provided back to the institutions. While the tool is being developed with new users in mind, it can serve a broad range of needs from novice to trained museum specialist. Notes from Nature has the potential to speed the rate of biodiversity data being made available to a broad community of users.

  4. Mid-infrared hyperspectral imaging for the detection of explosive compounds

    NASA Astrophysics Data System (ADS)

    Ruxton, K.; Robertson, G.; Miller, W.; Malcolm, G. P. A.; Maker, G. T.

    2012-10-01

    Active hyperspectral imaging is a valuable tool in a wide range of applications. A developing market is the detection and identification of energetic compounds through analysis of the resulting absorption spectrum. This work presents a selection of results from a prototype mid-infrared (MWIR) hyperspectral imaging instrument that has successfully been used for compound detection at a range of standoff distances. Active hyperspectral imaging utilises a broadly tunable laser source to illuminate the scene with light over a range of wavelengths. While there are a number of illumination methods, this work illuminates the scene by raster scanning the laser beam using a pair of galvanometric mirrors. The resulting backscattered light from the scene is collected by the same mirrors and directed and focussed onto a suitable single-point detector, where the image is constructed pixel by pixel. The imaging instrument developed in this work is based around a MWIR optical parametric oscillator (OPO) source with broad tunability, operating from 2.6 μm to 3.7 μm. Due to material handling procedures associated with explosive compounds, experimental work was undertaken initially using simulant compounds. A second set of compounds tested alongside the simulants was a range of confusion compounds. The broad wavelength tunability of the OPO allowed extended absorption spectra of the compounds to be obtained to aid in compound identification. The prototype imager has successfully been used to record the absorption spectra for a range of compounds from the simulant and confusion sets, and current work is investigating actual explosive compounds. The authors see a very promising outlook for the MWIR hyperspectral imager. From an applications point of view, this format of imaging instrument could be used for a range of standoff improvised explosive device (IED) detection applications and potential incident scene forensic investigation.
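    The acquisition scheme described here (tune the source, raster-scan the beam, record the backscatter on a single-point detector, one pixel at a time) can be mimicked with a toy data cube. The scene, band position, grid sizes, and reflectance values below are invented for illustration and do not come from the instrument:

    ```python
    import numpy as np

    # Wavelength grid spanning the stated OPO tuning range (micrometres).
    wavelengths = np.linspace(2.6, 3.7, 56)

    def scene_reflectance(ix, iy, wl):
        """Toy scene: a circular target with an absorption band near 3.4 um
        on a spectrally flat background (all values are invented)."""
        on_target = (ix - 16) ** 2 + (iy - 16) ** 2 < 64
        band = np.exp(-((wl - 3.4) / 0.05) ** 2)
        return 0.9 - (0.6 * band if on_target else 0.0)

    # Build the cube pixel by pixel: tune the source, then raster-scan the
    # beam and record the backscatter on a single-point detector.
    nx = ny = 32
    cube = np.zeros((ny, nx, wavelengths.size))
    for k, wl in enumerate(wavelengths):      # source tuning
        for iy in range(ny):                  # slow scan axis
            for ix in range(nx):              # fast scan axis
                cube[iy, ix, k] = scene_reflectance(ix, iy, wl)

    # Convert backscatter to absorbance against an off-target background pixel.
    reference = cube[0, 0, :]
    absorbance = -np.log10(cube / reference)

    # The target pixel exhibits its absorption feature at the expected band.
    peak_wl = wavelengths[np.argmax(absorbance[16, 16])]
    ```

    The resulting per-pixel absorption spectra are what a matching step would compare against reference spectra for compound identification.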

  5. The Principles of Engineering Immune Cells to Treat Cancer

    PubMed Central

    Lim, Wendell A.; June, Carl H.

    2017-01-01

    Chimeric antigen receptor (CAR) T cells have proven that engineered immune cells can serve as a powerful new class of cancer therapeutics. Clinical experience has helped to define the major challenges that must be met to make engineered T cells a reliable, safe, and effective platform that can be deployed against a broad range of tumors. The emergence of synthetic biology approaches for cellular engineering is providing us with a broadly expanded set of tools for programming immune cells. We discuss how these tools could be used to design the next generation of smart T cell precision therapeutics. PMID:28187291

  6. 10 CFR 33.13 - Requirements for the issuance of a Type A specific license of broad scope.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of broad scope. 33.13 Section 33.13 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.13 Requirements for the issuance of a Type A specific license of broad scope. An application for a Type A specific license of broad...

  7. 10 CFR 33.13 - Requirements for the issuance of a Type A specific license of broad scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of broad scope. 33.13 Section 33.13 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.13 Requirements for the issuance of a Type A specific license of broad scope. An application for a Type A specific license of broad...

  8. 10 CFR 33.14 - Requirements for the issuance of a Type B specific license of broad scope.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of broad scope. 33.14 Section 33.14 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.14 Requirements for the issuance of a Type B specific license of broad scope. An application for a Type B specific license of broad...

  9. 10 CFR 33.13 - Requirements for the issuance of a Type A specific license of broad scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of broad scope. 33.13 Section 33.13 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.13 Requirements for the issuance of a Type A specific license of broad scope. An application for a Type A specific license of broad...

  10. 10 CFR 33.14 - Requirements for the issuance of a Type B specific license of broad scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of broad scope. 33.14 Section 33.14 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.14 Requirements for the issuance of a Type B specific license of broad scope. An application for a Type B specific license of broad...

  11. 10 CFR 33.14 - Requirements for the issuance of a Type B specific license of broad scope.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of broad scope. 33.14 Section 33.14 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.14 Requirements for the issuance of a Type B specific license of broad scope. An application for a Type B specific license of broad...

  12. 10 CFR 33.14 - Requirements for the issuance of a Type B specific license of broad scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of broad scope. 33.14 Section 33.14 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.14 Requirements for the issuance of a Type B specific license of broad scope. An application for a Type B specific license of broad...

  13. 10 CFR 33.14 - Requirements for the issuance of a Type B specific license of broad scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of broad scope. 33.14 Section 33.14 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.14 Requirements for the issuance of a Type B specific license of broad scope. An application for a Type B specific license of broad...

  14. 10 CFR 33.15 - Requirements for the issuance of a Type C specific license of broad scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of broad scope. 33.15 Section 33.15 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.15 Requirements for the issuance of a Type C specific license of broad scope. An application for a Type C specific license of broad...

  15. 10 CFR 33.13 - Requirements for the issuance of a Type A specific license of broad scope.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of broad scope. 33.13 Section 33.13 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.13 Requirements for the issuance of a Type A specific license of broad scope. An application for a Type A specific license of broad...

  16. 10 CFR 33.15 - Requirements for the issuance of a Type C specific license of broad scope.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of broad scope. 33.15 Section 33.15 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.15 Requirements for the issuance of a Type C specific license of broad scope. An application for a Type C specific license of broad...

  17. 10 CFR 33.15 - Requirements for the issuance of a Type C specific license of broad scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of broad scope. 33.15 Section 33.15 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.15 Requirements for the issuance of a Type C specific license of broad scope. An application for a Type C specific license of broad...

  18. 10 CFR 33.13 - Requirements for the issuance of a Type A specific license of broad scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of broad scope. 33.13 Section 33.13 Energy NUCLEAR REGULATORY COMMISSION SPECIFIC DOMESTIC LICENSES OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.13 Requirements for the issuance of a Type A specific license of broad scope. An application for a Type A specific license of broad...

  19. Thermal Isomerization of Hydroxyazobenzenes as a Platform for Vapor Sensing

    PubMed Central

    2018-01-01

    Photoisomerization of azobenzene derivatives is a versatile tool for devising light-responsive materials for a broad range of applications in photonics, robotics, microfabrication, and biomaterials science. Some applications rely on fast isomerization kinetics, while for others, bistable azobenzenes are preferred. However, solid-state materials where the isomerization kinetics depends on the environmental conditions have been largely overlooked. Herein, an approach to utilize the environmental sensitivity of isomerization kinetics is developed. It is demonstrated that thin polymer films containing hydroxyazobenzenes offer a conceptually novel platform for sensing hydrogen-bonding vapors in the environment. The concept is based on accelerating the thermal cis–trans isomerization rate through hydrogen-bond-catalyzed changes in the thermal isomerization pathway, which allows for devising a relative humidity sensor with high sensitivity and quick response to humidity changes. The approach is also applicable for detecting other hydrogen-bonding vapors such as methanol and ethanol. Employing the isomerization kinetics of azobenzenes for vapor sensing opens intriguing new possibilities for using azobenzene molecules in the future. PMID:29607244
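    The sensing principle in this record (first-order thermal cis–trans decay whose rate constant rises when hydrogen-bonding vapor catalyzes the isomerization pathway) can be sketched numerically. The exponential calibration and the parameter values `k_dry` and `alpha` below are hypothetical illustrations, not data from the paper.

    ```python
    import math

    # Hypothetical sketch: thermal cis->trans isomerization follows
    # first-order kinetics, and hydrogen-bonding vapour (e.g. water at high
    # relative humidity) accelerates the decay, raising the rate constant k.

    def cis_fraction(t, k):
        """Fraction of molecules still in the cis state after time t."""
        return math.exp(-k * t)

    def rate_constant(rh, k_dry=1e-3, alpha=0.05):
        """Toy calibration: decay rate grows with relative humidity (percent)."""
        return k_dry * math.exp(alpha * rh)

    def estimate_rh(k_obs, k_dry=1e-3, alpha=0.05):
        """Sensor read-out: invert the calibration from a measured decay rate."""
        return math.log(k_obs / k_dry) / alpha
    ```

    A read-out cycle would photoswitch the film to the cis state, fit k from the absorbance recovery, and invert the calibration to report humidity.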

  20. The Global Precipitation Measurement (GPM) Mission contributions to hydrology and societal applications

    NASA Astrophysics Data System (ADS)

    Kirschbaum, D.; Huffman, G. J.; Skofronick Jackson, G.

    2016-12-01

    Too much or too little rain can serve as a tipping point for triggering catastrophic flooding and landslides or widespread drought. Knowing when, where and how much rain is falling globally is vital to understanding how vulnerable areas may be more or less impacted by these disasters. The Global Precipitation Measurement (GPM) mission provides near real-time precipitation data worldwide that are used by a broad range of end users, from tropical cyclone forecasters to agricultural modelers to researchers evaluating the spread of diseases. The GPM constellation provides merged, multi-satellite data products at three latencies that are critical for research and societal applications around the world. This presentation will outline current capabilities in using accurate and timely precipitation information to directly benefit society, including examples of end user applications within the tropical cyclone forecasting, disaster response, agricultural forecasting, and disease tracking communities, among others. The presentation will also introduce some of the new visualization and access tools developed by the GPM team.

  1. Dry EEG Electrodes

    PubMed Central

    Lopez-Gordo, M. A.; Sanchez-Morillo, D.; Valle, F. Pelayo

    2014-01-01

    Electroencephalography (EEG) emerged in the second decade of the 20th century as a technique for recording the neurophysiological response. Since then, there has been little variation in the physical principles that sustain the signal acquisition probes, otherwise called electrodes. Recently, advances in technology have opened unexpected new fields of application beyond the clinical, for which aspects such as usability and gel-free operation are first-order priorities. Thanks to advances in materials and integrated electronic systems technologies, a new generation of dry electrodes has been developed to fulfill this need. In this manuscript, we review current approaches to developing dry EEG electrodes for clinical and other applications, including information about measurement methods and evaluation reports. We conclude that, although a broad and non-homogeneous diversity of approaches has been evaluated without a consensus in procedures and methodology, their performances are not far from those obtained with wet electrodes, which are considered the gold standard, thus enabling the former to be a useful tool in a variety of novel applications. PMID:25046013

  2. Fused Deposition Modeling 3D Printing for (Bio)analytical Device Fabrication: Procedures, Materials, and Applications

    PubMed Central

    2017-01-01

    In this work, the use of fused deposition modeling (FDM) in a (bio)analytical/lab-on-a-chip research laboratory is described. First, the specifications of this 3D printing method that are important for the fabrication of (micro)devices were characterized for a benchtop FDM 3D printer. These include resolution, surface roughness, leakage, transparency, material deformation, and the possibilities for integration of other materials. Next, the autofluorescence, solvent compatibility, and biocompatibility of 12 representative FDM materials were tested and evaluated. Finally, we demonstrate the feasibility of FDM in a number of important applications. In particular, we consider the fabrication of fluidic channels, masters for polymer replication, and tools for the production of paper microfluidic devices. This work thus provides a guideline for (i) the use of FDM technology by addressing its possibilities and current limitations, (ii) material selection for FDM, based on solvent compatibility and biocompatibility, and (iii) application of FDM technology to (bio)analytical research by demonstrating a broad range of illustrative examples. PMID:28628294

  3. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modelling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne.
GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and non-governmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.

  4. Application of the Aquifer Impact Model to support decisions at a CO2 sequestration site: Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacon, Diana Holford; Locke II, Randall A.; Keating, Elizabeth

    The National Risk Assessment Partnership (NRAP) has developed a suite of tools to assess and manage risk at CO2 sequestration sites (1). The NRAP tool suite includes the Aquifer Impact Model (AIM), based on reduced order models developed using site-specific data from two aquifers (alluvium and carbonate). The models accept aquifer parameters as a range of variable inputs so that they have broader applicability. Guidelines have been developed for determining the aquifer types for which the ROMs should be applicable. This paper considers the applicability of the aquifer models in AIM to predicting the impact of CO2 or brine leakage were it to occur at the Illinois Basin Decatur Project (IBDP). Based on the results of the sensitivity analysis, the hydraulic parameters and leakage source term magnitude are more sensitive than clay fraction or cation exchange capacity. Sand permeability was the only hydraulic parameter measured at the IBDP site. More information on the other hydraulic parameters, such as sand fraction and sand/clay correlation lengths, could reduce uncertainty in risk estimates. Some non-adjustable parameters, such as the initial pH and TDS and the pH no-impact threshold, are significantly different for the ROM than for the observations at the IBDP site. The reduced order model could be made more useful to a wider range of sites if the initial conditions and no-impact threshold values were adjustable parameters.
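    The sensitivity screening described in this record can be illustrated with a one-at-a-time (OAT) sweep over a toy surrogate model. The `rom()` function, its parameters, and the ranges below are hypothetical stand-ins for exposition, not the actual NRAP/AIM reduced-order model.

    ```python
    # Illustrative one-at-a-time (OAT) sensitivity screen of a reduced-order
    # model (ROM). All functions and numbers here are hypothetical.

    def rom(params):
        # Toy surrogate: impacted aquifer volume grows with the leakage source
        # term and sand permeability, and shrinks with clay fraction.
        return (params["leak_rate"] * params["sand_perm"] ** 0.5
                / (1.0 + params["clay_frac"]))

    base = {"leak_rate": 1.0, "sand_perm": 4.0, "clay_frac": 0.2}
    ranges = {
        "leak_rate": (0.5, 2.0),
        "sand_perm": (1.0, 9.0),
        "clay_frac": (0.0, 0.4),
    }

    # Vary one parameter at a time across its range, holding the rest at base.
    sensitivity = {}
    for name, (lo, hi) in ranges.items():
        outputs = []
        for value in (lo, hi):
            p = dict(base, **{name: value})
            outputs.append(rom(p))
        sensitivity[name] = abs(outputs[1] - outputs[0])
    ```

    With these toy numbers the ranking mirrors the record's finding: the leakage source term and hydraulic parameters dominate the clay-fraction effect.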

  5. Habitat classification modelling with incomplete data: Pushing the habitat envelope

    Treesearch

    Phoebe L. Zarnetske; Thomas C. Edwards; Gretchen G. Moisen

    2007-01-01

    Habitat classification models (HCMs) are invaluable tools for species conservation, land-use planning, reserve design, and metapopulation assessments, particularly at broad spatial scales. However, species occurrence data are often lacking and typically limited to presence points at broad scales. This lack of absence data precludes the use of many statistical...

  6. RealityConvert: a tool for preparing 3D models of biochemical structures for augmented and virtual reality.

    PubMed

    Borrel, Alexandre; Fourches, Denis

    2017-12-01

    There is a growing interest in the broad use of Augmented Reality (AR) and Virtual Reality (VR) in the fields of bioinformatics and cheminformatics to visualize complex biological and chemical structures. AR and VR technologies allow for stunning and immersive experiences, offering untapped opportunities for both research and education purposes. However, preparing 3D models ready to use for AR and VR is time-consuming and requires technical expertise, which severely limits the development of new content of potential interest for structural biologists, medicinal chemists, molecular modellers and teachers. Herein we present the RealityConvert software tool and associated website, which allow users to easily convert molecular objects to high quality 3D models directly compatible with AR and VR applications. For chemical structures, in addition to the 3D model generation, RealityConvert also generates image trackers, useful to universally call and anchor that particular 3D model when used in AR applications. The ultimate goal of RealityConvert is to facilitate and boost the development and accessibility of AR and VR content for bioinformatics and cheminformatics applications. http://www.realityconvert.com. dfourch@ncsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  7. Trace saver: A tool for network service improvement and personalised analysis of user centric statistics

    NASA Astrophysics Data System (ADS)

    Bilal, Muhammad; Asfand-e-Yar; Mockford, Steve; Khan, Wasiq; Awan, Irfan

    2012-11-01

    Mobile technology is among the fastest growing technologies in today's world, with low cost and highly effective benefits. The most important and entertaining areas in mobile technology development and usage are location-based services, user-friendly networked applications and gaming applications. However, attention to network operator service provision and improvement has been very low. Portable applications that help improve network operator services, available for a range of mobile operating systems, are desirable to mobile operators. This paper presents Tracesaver, a state-of-the-art mobile application that overcomes the barriers to gathering device- and network-related information that network operators need to improve their service provision. Tracesaver is available for a broad range of mobile devices with different mobile operating systems and computational capabilities. Use of Tracesaver has proliferated in the year since its release. The survey and results show that Tracesaver is used by millions of mobile users and provides novel ways of improving network services through its highly user-friendly interface.

  8. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. 
iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.

  9. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. 
iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu. PMID:18509477

  10. Lattice Boltzmann modeling of transport phenomena in fuel cells and flow batteries

    NASA Astrophysics Data System (ADS)

    Xu, Ao; Shyy, Wei; Zhao, Tianshou

    2017-06-01

    Fuel cells and flow batteries are promising technologies to address climate change and air pollution problems. An understanding of the complex multiscale and multiphysics transport phenomena occurring in these electrochemical systems requires powerful numerical tools. Over the past decades, the lattice Boltzmann (LB) method has attracted broad interest in the computational fluid dynamics and the numerical heat transfer communities, primarily due to its kinetic nature making it appropriate for modeling complex multiphase transport phenomena. More importantly, the LB method fits well with parallel computing due to its locality feature, which is required for large-scale engineering applications. In this article, we review the LB method for gas-liquid two-phase flows, coupled fluid flow and mass transport in porous media, and particulate flows. Examples of applications are provided in fuel cells and flow batteries. Further developments of the LB method are also outlined.
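    The collide-and-stream structure that gives the lattice Boltzmann method its locality, and hence the parallel efficiency noted in this record, can be shown with a minimal sketch: a one-dimensional, two-velocity (D1Q2) LB scheme for pure diffusion. This is an illustrative toy under arbitrary parameters, not a fuel-cell or flow-battery model.

    ```python
    # Minimal illustrative sketch: a 1-D two-velocity (D1Q2) lattice Boltzmann
    # scheme for pure diffusion. Collision is purely local and streaming only
    # touches nearest neighbours, which is what makes LB easy to parallelize.
    N = 64            # lattice sites (periodic domain)
    tau = 0.8         # relaxation time (sets the lattice diffusivity)
    omega = 1.0 / tau

    f_plus = [0.0] * N    # population moving right
    f_minus = [0.0] * N   # population moving left
    f_plus[N // 2] = f_minus[N // 2] = 0.5   # unit-mass pulse at the centre

    def step(fp, fm):
        # Collision: relax both populations toward the local equilibrium rho/2.
        for i in range(N):
            rho = fp[i] + fm[i]
            feq = 0.5 * rho
            fp[i] += omega * (feq - fp[i])
            fm[i] += omega * (feq - fm[i])
        # Streaming: shift each population one site in its direction of travel.
        return [fp[-1]] + fp[:-1], fm[1:] + [fm[0]]

    for _ in range(100):
        f_plus, f_minus = step(f_plus, f_minus)

    rho = [p + m for p, m in zip(f_plus, f_minus)]   # the pulse has diffused
    ```

    Because each collision uses only on-site data and each streaming step only nearest neighbours, the update maps naturally onto domain decomposition for large-scale parallel runs.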

  11. The Application of Adaptive Behaviour Models: A Systematic Review

    PubMed Central

    Price, Jessica A.; Morris, Zoe A.; Costello, Shane

    2018-01-01

    Adaptive behaviour has been viewed broadly as an individual’s ability to meet the standards of social responsibilities and independence; however, this definition has been a source of debate amongst researchers and clinicians. Based on the rich history and the importance of the construct of adaptive behaviour, the current study aimed to provide a comprehensive overview of the application of adaptive behaviour models to assessment tools, through a systematic review. A plethora of assessment measures for adaptive behaviour have been developed in order to adequately assess the construct; however, it appears that the only definition on which authors seem to agree is that adaptive behaviour is what adaptive behaviour scales measure. The importance of the construct for diagnosis, intervention and planning has been highlighted throughout the literature. It is recommended that researchers and clinicians critically review what measures of adaptive behaviour they are utilising and it is suggested that the definition and theory is revisited. PMID:29342927

  12. Unlocking the Potential of Phenacyl Protecting Groups: CO2-Based Formation and Photocatalytic Release of Caged Amines.

    PubMed

    Speckmeier, Elisabeth; Klimkait, Michael; Zeitler, Kirsten

    2018-04-06

    Orthogonal protection and deprotection of amines remain important tools in synthetic design as well as in chemical biology and material research applications. A robust, highly efficient, and sustainable method for the formation of phenacyl-based carbamate esters was developed using CO2 for the in situ preparation of the intermediate carbamates. Our mild and broadly applicable protocol allows for the formation of phenacyl urethanes of anilines, primary amines, including amino acids, and secondary amines in high to excellent yields. Moreover, we demonstrate the utility by a mild and convenient photocatalytic deprotection protocol using visible light. A key feature of the [Ru(bpy)3](PF6)2-catalyzed method is the use of ascorbic acid as reductive quencher in a neutral, buffered, two-phase acetonitrile/water mixture, granting fast and highly selective deprotection for all presented examples.

  13. Synthetic biology platform technologies for antimicrobial applications.

    PubMed

    Braff, Dana; Shis, David; Collins, James J

    2016-10-01

    The growing prevalence of antibiotic resistance calls for new approaches in the development of antimicrobial therapeutics. Likewise, improved diagnostic measures are essential in guiding the application of targeted therapies and preventing the evolution of therapeutic resistance. Discovery platforms are also needed to form new treatment strategies and identify novel antimicrobial agents. By applying engineering principles to molecular biology, synthetic biologists have developed platforms that improve upon, supplement, and will perhaps supplant traditional broad-spectrum antibiotics. Efforts in engineering bacteriophages and synthetic probiotics demonstrate targeted antimicrobial approaches that can be fine-tuned using synthetic biology-derived principles. Further, the development of paper-based, cell-free expression systems holds promise in promoting the clinical translation of molecular biology tools for diagnostic purposes. In this review, we highlight emerging synthetic biology platform technologies that are geared toward the generation of new antimicrobial therapies, diagnostics, and discovery channels. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Ionic liquid thermal stabilities: decomposition mechanisms and analysis tools.

    PubMed

    Maton, Cedric; De Vos, Nils; Stevens, Christian V

    2013-07-07

    The increasing number of papers published on ionic liquids generates an extensive quantity of data. The thermal stability data of divergent ionic liquids are collected in this paper with attention to the experimental set-up. The influence and importance of the latter parameters are broadly addressed. Both ramped-temperature and isothermal thermogravimetric analysis are discussed, along with state-of-the-art methods such as TGA-MS and pyrolysis-GC. The strengths and weaknesses of the different methodologies known to date demonstrate that analysis methods should be in line with the application. The combination of data from advanced analysis methods allows us to obtain in-depth information on the degradation processes. Aided by computational methods, the kinetics and thermodynamics of thermal degradation are revealed piece by piece. A better understanding of the behaviour of ionic liquids at high temperature allows selective and application-driven design, as well as mathematical prediction for engineering purposes.

  15. Broadband high resolution X-ray spectral analyzer

    DOEpatents

    Silver, Eric H.; Legros, Mark; Madden, Norm W.; Goulding, Fred; Landis, Don

    1998-01-01

    A broad bandwidth high resolution x-ray fluorescence spectrometer has a performance that is superior in many ways to those currently available. It consists of an array of 4 large area microcalorimeters with 95% quantum efficiency at 6 keV and it produces x-ray spectra between 0.2 keV and 7 keV with an energy resolution of 7 to 10 eV. The resolution is obtained at input count rates per array element of 10 to 50 Hz in real-time, with analog pulse processing and thermal pile-up rejection. This performance cannot be matched by currently available x-ray spectrometers. The detectors are incorporated into a compact and portable cryogenic refrigerator system that is ready for use in many analytical spectroscopy applications as a tool for x-ray microanalysis or in research applications such as laboratory and astrophysical x-ray and particle spectroscopy.

  16. Broadband high resolution X-ray spectral analyzer

    DOEpatents

    Silver, E.H.; Legros, M.; Madden, N.W.; Goulding, F.; Landis, D.

    1998-07-07

    A broad bandwidth high resolution X-ray fluorescence spectrometer has a performance that is superior in many ways to those currently available. It consists of an array of 4 large area microcalorimeters with 95% quantum efficiency at 6 keV and it produces X-ray spectra between 0.2 keV and 7 keV with an energy resolution of 7 to 10 eV. The resolution is obtained at input count rates per array element of 10 to 50 Hz in real-time, with analog pulse processing and thermal pile-up rejection. This performance cannot be matched by currently available X-ray spectrometers. The detectors are incorporated into a compact and portable cryogenic refrigerator system that is ready for use in many analytical spectroscopy applications as a tool for X-ray microanalysis or in research applications such as laboratory and astrophysical X-ray and particle spectroscopy. 6 figs.

  17. Sex Chromosome Evolution in Amniotes: Applications for Bacterial Artificial Chromosome Libraries

    PubMed Central

    Janes, Daniel E.; Valenzuela, Nicole; Ezaz, Tariq; Amemiya, Chris; Edwards, Scott V.

    2011-01-01

    Variability among sex chromosome pairs in amniotes denotes a dynamic history. Since amniotes diverged from a common ancestor, their sex chromosome pairs and, more broadly, sex-determining mechanisms have changed reversibly and frequently. These changes have been studied and characterized through the use of many tools and experimental approaches but perhaps most effectively through applications for bacterial artificial chromosome (BAC) libraries. Individual BAC clones carry 100–200 kb of sequence from one individual of a target species that can be isolated by screening, mapped onto karyotypes, and sequenced. With these techniques, researchers have identified differences and similarities in sex chromosome content and organization across amniotes and have addressed hypotheses regarding the frequency and direction of past changes. Here, we review studies of sex chromosome evolution in amniotes and the ways in which the field of research has been affected by the advent of BAC libraries. PMID:20981143

  18. History, applications, and challenges of immune repertoire research.

    PubMed

    Liu, Xiao; Wu, Jinghua

    2018-02-27

    The diversity of T- and B-cell receptor sequences in the vertebrate immune system is enormous and provides broad protection against the vast diversity of pathogens. The immune repertoire is defined as the sum of the T cell receptors and B cell receptors (also named immunoglobulins) that make up the organism's adaptive immune system. Before the emergence of high-throughput sequencing, studies of the immune repertoire were limited by underdeveloped methodologies, since it was impossible to capture the whole picture with low-throughput tools. Massively parallel sequencing technology is perfectly suited to immune repertoire research. In this article, we review the history of immune repertoire studies in terms of both technologies and research applications. In particular, we discuss several challenges in this field and highlight efforts to develop potential solutions in the era of high-throughput sequencing of the immune repertoire.

  19. A new application for food customization with additive manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Serenó, L.; Vallicrosa, G.; Delgado, J.; Ciurana, J.

    2012-04-01

    Additive Manufacturing (AM) technologies have emerged as a freeform approach capable of producing almost any complete three-dimensional (3D) object from computer-aided design (CAD) data by successively adding material layer by layer. Despite the broad range of possibilities, commercial AM technologies remain complex and expensive, making them suitable only for niche applications. The development of the Fab@Home system as an open AM technology opened a new range of possibilities for processing different materials, such as edible products. The main objective of this work is to analyze and optimize the manufacturing capacity of this system when producing 3D edible objects. A new heated syringe deposition tool was developed and several process parameters were optimized to adapt this technology to consumers' needs. The results revealed in this study show the potential of this system to produce customized edible objects without requiring specialized knowledge, thereby saving manufacturing costs compared to traditional technologies.

  20. Adeno-associated Virus as a Mammalian DNA Vector

    PubMed Central

    SALGANIK, MAX; HIRSCH, MATTHEW L.; SAMULSKI, RICHARD JUDE

    2015-01-01

    In the nearly five decades since its accidental discovery, adeno-associated virus (AAV) has emerged as a highly versatile vector system for both research and clinical applications. A broad range of natural serotypes, as well as an increasing number of capsid variants, has combined to produce a repertoire of vectors with different tissue tropisms, immunogenic profiles and transduction efficiencies. The story of AAV is one of continued progress and surprising discoveries in a viral system that, at first glance, is deceptively simple. This apparent simplicity has enabled the advancement of AAV into the clinic, where despite some challenges it has provided hope for patients and a promising new tool for physicians. Although a great deal of work remains to be done, both in studying the basic biology of AAV and in optimizing its clinical application, AAV vectors are currently the safest and most efficient platform for gene transfer in mammalian cells. PMID:26350320

  1. High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM

    NASA Astrophysics Data System (ADS)

    Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.

    System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.

  2. Medicine 2.0: social networking, collaboration, participation, apomediation, and openness.

    PubMed

    Eysenbach, Gunther

    2008-08-25

    In a very significant development for eHealth, broad adoption of Web 2.0 technologies and approaches coincides with the more recent emergence of Personal Health Application Platforms and Personally Controlled Health Records such as Google Health, Microsoft HealthVault, and Dossia. "Medicine 2.0" applications, services and tools are defined as Web-based services for health care consumers, caregivers, patients, health professionals, and biomedical researchers, that use Web 2.0 technologies and/or semantic web and virtual reality approaches to enable and facilitate specifically 1) social networking, 2) participation, 3) apomediation, 4) openness and 5) collaboration, within and between these user groups. The Journal of Medical Internet Research (JMIR) publishes a Medicine 2.0 theme issue and sponsors a conference on "How Social Networking and Web 2.0 changes Health, Health Care, Medicine and Biomedical Research", to stimulate and encourage research in these five areas.

  3. Medicine 2.0: Social Networking, Collaboration, Participation, Apomediation, and Openness

    PubMed Central

    2008-01-01

    In a very significant development for eHealth, a broad adoption of Web 2.0 technologies and approaches coincides with the more recent emergence of Personal Health Application Platforms and Personally Controlled Health Records such as Google Health, Microsoft HealthVault, and Dossia. “Medicine 2.0” applications, services, and tools are defined as Web-based services for health care consumers, caregivers, patients, health professionals, and biomedical researchers, that use Web 2.0 technologies and/or semantic web and virtual reality approaches to enable and facilitate specifically 1) social networking, 2) participation, 3) apomediation, 4) openness, and 5) collaboration, within and between these user groups. The Journal of Medical Internet Research (JMIR) publishes a Medicine 2.0 theme issue and sponsors a conference on “How Social Networking and Web 2.0 changes Health, Health Care, Medicine, and Biomedical Research”, to stimulate and encourage research in these five areas. PMID:18725354

  4. Advances in Molecular Rotational Spectroscopy for Applied Science

    NASA Astrophysics Data System (ADS)

    Harris, Brent; Fields, Shelby S.; Pulliam, Robin; Muckle, Matt; Neill, Justin L.

    2017-06-01

    Advances in chemical sensitivity and robust, solid-state designs for microwave/millimeter-wave instrumentation compel the expansion of molecular rotational spectroscopy from a research tool into applied science. Molecular rotational spectroscopy is most familiar in the context of air analysis. Those techniques are included here in our presentation of a broader application space for materials analysis using Fourier Transform Molecular Rotational Resonance (FT-MRR) spectrometers. There are potentially transformative advantages for direct gas analysis of complex mixtures, determination of unknown evolved gases in solid materials with parts-per-trillion detection limits, and unambiguous chiral determination. The introduction of FT-MRR as an alternative detection principle for analytical chemistry has created a ripe research space for the development of new analytical methods and sampling equipment to fully enable FT-MRR. We present the current state of purpose-built FT-MRR instrumentation and the latest application measurements that make use of new sampling methods.

  5. Normal mode analysis and applications in biological physics.

    PubMed

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

    Normal mode analysis has become a popular and often used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics including continuum elastic theory, the elastic network model, and a new all-atom method, recently developed, which is capable of computing a subset of the low frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.
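    The elastic network approach mentioned above is simple enough to sketch. The toy below (our illustration, not code from the review) builds a Gaussian network model for a hypothetical five-residue chain: residues within a cutoff distance are joined by identical springs, and the eigendecomposition of the resulting Kirchhoff matrix yields the modes, with the lowest non-zero eigenvalues corresponding to the slowest collective motions.

    ```python
    import numpy as np

    def gnm_modes(coords, cutoff=1.5):
        """Gaussian network model: build the Kirchhoff (connectivity) matrix
        for springs between residues closer than `cutoff`, then diagonalize.
        Eigenvalues play the role of squared mode frequencies."""
        n = len(coords)
        gamma = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                if np.linalg.norm(coords[i] - coords[j]) < cutoff:
                    gamma[i, j] = gamma[j, i] = -1.0
        np.fill_diagonal(gamma, -gamma.sum(axis=1))
        return np.linalg.eigh(gamma)  # ascending eigenvalues, column eigenvectors

    # Hypothetical straight "chain" of 5 residues spaced 1.0 apart
    coords = np.array([[float(i), 0.0, 0.0] for i in range(5)])
    vals, vecs = gnm_modes(coords)
    # vals[0] ~ 0 is the trivial rigid-body mode; vals[1] belongs to the
    # slowest internal mode of the chain.
    ```

    For a real protein one would use C-alpha coordinates and a cutoff near 7-10 angstroms; the principle is unchanged.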

  6. Artificial Lipid Membranes: Past, Present, and Future

    PubMed Central

    Siontorou, Christina G.; Nikoleli, Georgia-Paraskevi; Nikolelis, Dimitrios P.

    2017-01-01

    The multifaceted role of biological membranes prompted the early development of artificial lipid-based models, with a primary view of reconstituting the natural functions in vitro so as to study and exploit chemoreception for sensor engineering. Over the years, a fair amount of knowledge on artificial lipid membranes, both as suspended or supported lipid films and as liposomes, has been disseminated and has helped to diversify and expand the initial scopes. Artificial lipid membranes can be constructed by several methods, stabilized by various means, functionalized in a variety of ways, experimented upon intensively, and broadly utilized in sensor development, drug testing, and drug discovery, or as molecular tools and research probes for elucidating the mechanics and mechanisms of biological membranes. This paper reviews the state of the art, discusses the diversity of applications, and presents future perspectives. The newly introduced field of artificial cells further broadens the applicability of artificial membranes in studying the evolution of life. PMID:28933723

  7. Antimicrobial compounds from seaweeds-associated bacteria and fungi.

    PubMed

    Singh, Ravindra Pal; Kumari, Puja; Reddy, C R K

    2015-02-01

    In the past decade, microbial communities associated with seaweeds have been extensively evaluated through functional and chemical analyses. Such analyses have led to the conclusion that seaweed-associated microbial communities are highly diverse and are rich sources of bioactive compounds with exceptional molecular structures. Interest in extracting bioactive compounds from seaweed-associated microbial communities has recently increased owing to their broad-spectrum antimicrobial activities, including antibacterial, antifungal, antiviral, anti-settlement, antiprotozoan, antiparasitic, and antitumor activities. These allelochemicals not only protect the host from other surrounding pelagic microorganisms but also ensure their association with the host. Antimicrobial compounds from marine sources are promising and priority targets for biotechnological and pharmaceutical applications. This review describes the bioactive metabolites reported from seaweed-associated bacterial and fungal communities and illustrates their bioactivities. The biotechnological application of metagenomic approaches for identifying novel bioactive metabolites is also dealt with, in view of their future development as a strong tool to discover novel drug targets from seaweed-associated microbial communities.

  8. Unifying Spectral and Timing Studies of Relativistic Reflection in Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Reynolds, Christopher

    X-ray observations of active galactic nuclei (AGN) contain a wealth of information relevant for understanding the structure of AGN, the process of accretion, and the gravitational physics of supermassive black holes. A particularly exciting development over the past four years has been the discovery and subsequent characterization of time delays between variability of the X-ray power-law continuum and the inner disk reflection spectrum, including the broad iron line. The fact that the broad iron line shows this echo, or reverberation, in XMM-Newton, Suzaku, and NuSTAR data is a strong confirmation of the disk reflection paradigm and has already been used to place constraints on the extent and geometry of the X-ray corona. However, current studies of AGN X-ray variability, including broad iron line reverberation, are only scratching the surface of the available data. At present, essentially all studies conduct temporal analyses in a manner that is largely divorced from detailed spectroscopy; consistency between timing results (e.g., conclusions regarding the location of the primary X-ray source) and detailed spectral fits is examined only after the fact. We propose to develop and apply new analysis tools for conducting a truly unified spectral-timing analysis of the X-ray properties of AGN. Operationally, this can be thought of as spectral fitting with additional parameters that access the temporal properties of the dataset. Our first set of tools will be based on Fourier techniques (via the construction and fitting of the energy- and frequency-dependent cross-spectrum) and will be most readily applicable to long observations of AGN with XMM-Newton. Later, we shall develop more general schemes (of a more Bayesian nature) that can operate on irregularly sampled data or quasi-simultaneous data from multiple instruments. These shall be applied to the long joint XMM-Newton/NuSTAR and Suzaku/NuSTAR AGN campaigns as well as Swift monitoring campaigns. 
Another important dimension of our work is the introduction of spectral and spectral-timing models of X-ray reflection from black hole disks that include realistic disk thickness (as opposed to the razor-thin disks assumed in current analysis tools). The astrophysical implications of our work are: (1) the first rigorous decomposition of the time lags into those from reverberation and those from intrinsic continuum processes; (2) a new method for determining the density of photoionized (warm) absorbers in AGN through a measurement of the recombination time lags; (3) AGN black hole mass estimates obtained purely from X-ray data, and hence complementary to (observationally expensive) optical broad-line reverberation campaigns; (4) the best possible characterization of strong-gravity signatures in the reflected disk emission; and (5) detection and characterization of non-trivial accretion disk structure. Each of our tools and data products will be made available to the community upon the publication of the first results with that tool. The proposed work is in direct support of the NASA Science Plan and is of direct relevance and support to NASA's fleet of X-ray observatories.
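The cross-spectral lag idea is easy to illustrate with synthetic data. The sketch below (our toy, not the proposed analysis tools) delays a copy of a light curve by a fixed echo time and recovers that delay from the phase of the cross-spectrum, which is the quantity a unified spectral-timing fit would model directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt = 4096, 1.0

# Hypothetical "continuum" light curve: smoothed white noise
x = np.convolve(rng.standard_normal(n), np.ones(20) / 20, mode="same")
tau = 5.0                            # true echo delay in seconds
y = np.roll(x, int(tau / dt))        # delayed copy stands in for the iron-line band

# Cross-spectrum between the two bands; its phase encodes the time lag
X, Y = np.fft.rfft(x), np.fft.rfft(y)
freq = np.fft.rfftfreq(n, dt)
cross = np.conj(X) * Y
lag = -np.angle(cross[1:]) / (2 * np.pi * freq[1:])
# Below the phase-wrapping frequency 1/(2*tau), the recovered lag equals tau.
```

Real data add Poisson noise and red-noise scatter, so lags are averaged over frequency bins and fitted jointly with the spectrum; the toy only shows where the lag lives in the cross-spectrum.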

  9. An Active Broad Area Cooling Model of a Cryogenic Propellant Tank with a Single Stage Reverse Turbo-Brayton Cycle Cryocooler

    NASA Technical Reports Server (NTRS)

    Guzik, Monica C.; Tomsik, Thomas M.

    2011-01-01

    As focus shifts towards long-duration space exploration missions, an increased interest in active thermal control of cryogenic propellants to achieve zero boil-off of cryogens has emerged. An active thermal control concept of considerable merit is the integration of a broad area cooling system for a cryogenic propellant tank with a combined cryocooler and circulator system that can be used to reduce or even eliminate liquid cryogen boil-off. One prospective cryocooler and circulator combination is the reverse turbo-Brayton cycle cryocooler. This system is unique in that it has the ability to both cool and circulate the coolant gas efficiently in the same loop as the broad area cooling lines, allowing for a single cooling gas loop, with the primary heat rejection occurring by way of a radiator and/or aftercooler. Currently few modeling tools exist that can size and characterize an integrated reverse turbo-Brayton cycle cryocooler in combination with a broad area cooling design. This paper addresses efforts to create such a tool to assist in gaining a broader understanding of these systems, and investigate their performance in potential space missions. The model uses conventional engineering and thermodynamic relationships to predict the preliminary design parameters, including input power requirements, pressure drops, flow rate, cycle performance, cooling lift, broad area cooler line sizing, and component operating temperatures and pressures given the cooling load operating temperature, heat rejection temperature, compressor inlet pressure, compressor rotational speed, and cryogenic tank geometry. In addition, the model allows for the preliminary design analysis of the broad area cooling tubing, to determine the effect of tube sizing on the reverse turbo-Brayton cycle system performance. At the time this paper was written, the model was verified to match existing theoretical documentation within a reasonable margin. 
While further experimental data is needed for full validation, this tool has already made significant steps towards giving a clearer understanding of the performance of a reverse turbo-Brayton cycle cryocooler integrated with broad area cooling technology for zero boil-off active thermal control.

  10. Geomorphic analyses from space imagery

    NASA Technical Reports Server (NTRS)

    Morisawa, M.

    1985-01-01

    One of the most obvious applications of space imagery to geomorphological analyses is in the study of drainage patterns and channel networks. LANDSAT, high altitude photography and other types of remote sensing imagery are excellent for depicting stream networks on a regional scale because of their broad coverage in a single image. They offer a valuable tool for comparing and analyzing drainage patterns and channel networks all over the world. Three aspects considered in this geomorphological study are: (1) the origin, evolution and rates of development of drainage systems; (2) the topological studies of network and channel arrangements; and (3) the adjustment of streams to tectonic events and geologic structure (i.e., the mode and rate of adjustment).

  11. Catalytic Conia-ene and related reactions.

    PubMed

    Hack, Daniel; Blümel, Marcus; Chauhan, Pankaj; Philipps, Arne R; Enders, Dieter

    2015-10-07

    Since its inception, the Conia-ene reaction, known as the intramolecular addition of enols to alkynes or alkenes, has experienced tremendous development, and appealing catalytic protocols have emerged. This review fathoms the underlying mechanistic principles, rationalizing how substrate design, substrate activation, and the nature of the catalyst work hand in hand for the efficient synthesis of carbocycles and heterocycles under mild reaction conditions. Nowadays, Conia-ene reactions can be found as part of tandem reactions, and the road to asymmetric versions has already been paved. Based on their broad applicability, Conia-ene reactions have turned into a highly appreciated synthetic tool, with impressive examples in natural product synthesis reported in recent years.

  12. Engineering Approaches to Illuminating Brain Structure and Dynamics

    PubMed Central

    Deisseroth, Karl; Schnitzer, Mark J.

    2017-01-01

    Historical milestones in neuroscience have come in diverse forms, ranging from the resolution of specific biological mysteries via creative experimentation to broad technological advances allowing neuroscientists to ask new kinds of questions. The continuous development of tools is driven with a special necessity by the complexity, fragility, and inaccessibility of intact nervous systems, such that inventive technique development and application drawing upon engineering and the applied sciences has long been essential to neuroscience. Here we highlight recent technological directions in neuroscience spurred by progress in optical, electrical, mechanical, chemical, and biological engineering. These research areas are poised for rapid growth and will likely be central to the practice of neuroscience well into the future. PMID:24183010

  13. Milestones toward Majorana-based quantum computing

    NASA Astrophysics Data System (ADS)

    Alicea, Jason

    Experiments on nanowire-based Majorana platforms now appear poised to move beyond the preliminary problem of zero-mode detection and towards loftier goals of realizing non-Abelian statistics and quantum information applications. Using an approach that synthesizes recent materials growth breakthroughs with tools long successfully deployed in quantum-dot research, I will outline a number of relatively modest milestones that progressively bridge the gap between the current state of the art and these grand longer-term challenges. The intermediate Majorana experiments surveyed in this talk should be broadly adaptable to other approaches as well. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.

  14. USGS perspectives on an integrated approach to watershed and coastal management

    USGS Publications Warehouse

    Larsen, Matthew C.; Hamilton, Pixie A.; Haines, John W.; Mason, Jr., Robert R.

    2010-01-01

    The writers discuss three critically important steps necessary for achieving the goal of improved integrated approaches to watershed and coastal protection and management. These steps involve modernization of monitoring networks, creation of common data and web services infrastructures, and development of modeling, assessment, and research tools. Long-term monitoring is needed for tracking the effectiveness of approaches for controlling land-based sources of nutrients, contaminants, and invasive species. The integration of mapping and monitoring with conceptual and mathematical models and multidisciplinary assessments is important in making well-informed decisions. Moreover, a better-integrated data network is essential for mapping, statistical, and modeling applications, and for timely dissemination of data and information products to a broad community of users.

  15. An Overview of Key Indicators and Evaluation Tools for Assessing Housing Quality: A Literature Review

    NASA Astrophysics Data System (ADS)

    Sinha, Rajan Chandra; Sarkar, Satyaki; Mandal, Nikhil Ranjan

    2017-09-01

    The issue of housing quality has been addressed for various stakeholders at different levels, and there exist varied opinions about its measurability and possible applications. This study was therefore carried out to gain insight into the concept of housing quality and its relevance amid changing demographic, technological, socio-economic, and socio-cultural conditions. It summarizes the literature on past research into factors related to housing quality and its measurement methodology, and critically examines the broad key indicators identified as having an impact on enhancing housing quality. The work also discusses recent techniques that are extensively used for the analysis of housing quality.

  16. Geospatial analysis based on GIS integrated with LADAR.

    PubMed

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.
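    The line-of-sight step lends itself to a compact sketch. The function below (a hypothetical illustration, not the authors' module) checks visibility between two cells of a gridded height map by sampling terrain heights along the segment and comparing each against the straight sight line joining the endpoint heights:

    ```python
    import numpy as np

    def line_of_sight(dem, a, b):
        """Visibility between two cells of a height grid (DEM): walk the
        segment from a to b and fail if the terrain rises above the
        straight sight line joining the endpoint heights."""
        (r0, c0), (r1, c1) = a, b
        n = max(abs(r1 - r0), abs(c1 - c0))
        for i in range(1, n):
            t = i / n
            r = int(round(r0 + t * (r1 - r0)))
            c = int(round(c0 + t * (c1 - c0)))
            sight = dem[r0, c0] + t * (dem[r1, c1] - dem[r0, c0])
            if dem[r, c] > sight:
                return False
        return True

    dem = np.zeros((5, 5))
    visible_flat = line_of_sight(dem, (0, 0), (4, 4))   # flat terrain: visible
    dem[2, 2] = 10.0                                    # a "building" in the way
    visible_blocked = line_of_sight(dem, (0, 0), (4, 4))
    ```

    A path planner can call such a check pairwise to prune waypoints that would be exposed to an observer; production LADAR pipelines would refine this with sensor height offsets and sub-cell interpolation.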

  17. Computational medicinal chemistry in fragment-based drug discovery: what, how and when.

    PubMed

    Rabal, Obdulia; Urbano-Cuadrado, Manuel; Oyarzabal, Julen

    2011-01-01

    The use of fragment-based drug discovery (FBDD) has increased in the last decade due to the encouraging results obtained to date. In this scenario, computational approaches, together with experimental information, play an important role in guiding and speeding up the process. By default, FBDD is generally considered a constructive approach. However, such additive behavior is not always present; therefore, simple fragment maturation will not always deliver the expected results. In this review, computational approaches utilized in FBDD are reported together with real case studies, where applicability domains are exemplified, in order to analyze them and thus maximize their performance and reliability. A proper use of these computational tools can minimize misleading conclusions, preserving the credibility of the FBDD strategy, as well as achieve higher impact in the drug-discovery process. FBDD goes one step beyond a simple constructive approach. A broad set of computational tools - docking, R-group quantitative structure-activity relationships, fragmentation tools, fragment-management tools, patent analysis, and fragment hopping, for example - can be utilized in FBDD, providing a clear positive impact if they are utilized in the proper scenario: what, how and when. An initial assessment of additive/non-additive behavior is a critical point in defining the most convenient approach for fragment elaboration.

  18. Novel calibration tools and validation concepts for microarray-based platforms used in molecular diagnostics and food safety control.

    PubMed

    Brunner, C; Hoffmann, K; Thiele, T; Schedler, U; Jehle, H; Resch-Genger, U

    2015-04-01

    Commercial platforms consisting of ready-to-use microarrays printed with target-specific DNA probes, a microarray scanner, and software for data analysis are available for different applications in medical diagnostics and food analysis, detecting, e.g., viral and bacteriological DNA sequences. The transfer of these tools from basic research to routine analysis, their broad acceptance in regulated areas, and their use in medical practice require suitable calibration tools for regular control of instrument performance in addition to internal assay controls. Here, we present the development of a novel assay-adapted calibration slide for a commercialized DNA-based assay platform, consisting of precisely arranged fluorescent areas of various intensities obtained by incorporating different concentrations of a "green" dye and a "red" dye in a polymer matrix. These dyes are "Cy3" and "Cy5" analogues with improved photostability, chosen because their spectroscopic properties closely match those of common labels for the green and red channels of microarray scanners. This simple tool allows users to efficiently and regularly assess and control the performance of the microarray scanner provided with the biochip platform and to compare different scanners. It will eventually be used as a fluorescence intensity scale for referencing assay results and to enhance the overall comparability of diagnostic tests.

  19. Adaptive awareness for personal and small group decision making.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perano, Kenneth J.; Tucker, Steve; Pancerella, Carmen M.

    2003-12-01

    Many situations call for the use of sensors monitoring physiological and environmental data. In order to use the large amounts of sensor data to affect decision making, we are coupling heterogeneous sensors with small, light-weight processors, other powerful computers, wireless communications, and embedded intelligent software. The result is an adaptive awareness and warning tool, which provides both situation awareness and personal awareness to individuals and teams. Central to this tool is a sensor-independent architecture, which combines both software agents and a reusable core software framework that manages the available hardware resources and provides services to the agents. Agents can recognize cues from the data, warn humans about situations, and act as decision-making aids. Within the agents, self-organizing maps (SOMs) are used to process physiological data in order to provide personal awareness. We have employed a novel clustering algorithm to train the SOM to discern individual body states and activities. This awareness tool has broad applicability to emergency teams, military squads, military medics, individual exercise and fitness monitoring, health monitoring for sick and elderly persons, and environmental monitoring in public places. This report discusses our hardware decisions, software framework, and a pilot awareness tool, which has been developed at Sandia National Laboratories.
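    The SOM training described above can be sketched in a few lines (a minimal illustration with made-up sensor readings, not the report's clustering algorithm): each node of a small 2-D grid holds a weight vector, and for every sample the best-matching unit and its grid neighbors are pulled toward it, so distinct body states settle on different regions of the map.

    ```python
    import numpy as np

    def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
        """Minimal 2-D self-organizing map trained with a shrinking
        learning rate and Gaussian neighborhood."""
        rng = np.random.default_rng(seed)
        rows, cols = grid
        w = rng.standard_normal((rows, cols, data.shape[1]))
        gy, gx = np.mgrid[0:rows, 0:cols]      # node coordinates on the grid
        for epoch in range(epochs):
            lr = lr0 * (1 - epoch / epochs)
            sigma = sigma0 * (1 - epoch / epochs) + 0.5
            for x in rng.permutation(data):
                d = np.linalg.norm(w - x, axis=2)
                by, bx = np.unravel_index(d.argmin(), d.shape)  # best-matching unit
                h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
                w += lr * h[..., None] * (x - w)
        return w

    # Two hypothetical "body states" seen through three sensor channels
    rng = np.random.default_rng(1)
    resting = rng.normal(0.0, 0.1, (50, 3))
    active = rng.normal(1.0, 0.1, (50, 3))
    som = train_som(np.vstack([resting, active]))
    # After training, resting and active samples map to different units.
    ```

    In an embedded deployment the trained map is frozen and only the cheap best-matching-unit lookup runs on the wearable processor.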

  20. Selection into medical school: from tools to domains.

    PubMed

    Wilkinson, Tom M; Wilkinson, Tim J

    2016-10-03

    Most research into the validity of admissions tools focuses on the isolated correlations of individual tools with later outcomes. Instead, looking at how domains of attributes, rather than tools, predict later success is likely to be more generalizable. We aim to produce a blueprint for an admissions scheme that is broadly relevant across institutions. We broke down all measures used for admissions at one medical school into the smallest possible component scores. We grouped these into domains on the basis of a multicollinearity analysis, and conducted a regression analysis to determine the independent validity of each domain to predict outcomes of interest. We identified four broad domains: logical reasoning and problem solving, understanding people, communication skills, and biomedical science. Each was independently and significantly associated with performance in final medical school examinations. We identified two potential errors in the design of admissions schema that can undermine their validity: focusing on tools rather than outcomes, and including a wide range of measures without objectively evaluating the independent contribution of each. Both could be avoided by following a process of programmatic assessment for selection.

  1. Trends and Issues in Fuzzy Control and Neuro-Fuzzy Modeling

    NASA Technical Reports Server (NTRS)

    Chiu, Stephen

    1996-01-01

    Everyday experience in building and repairing things around the home has taught us the importance of using the right tool for the right job. Although we tend to think of a 'job' in broad terms, such as 'build a bookcase,' we understand well that the 'right job' associated with each 'right tool' is typically a narrowly bounded subtask, such as 'tighten the screws.' Unfortunately, we often lose sight of this principle when solving engineering problems; we treat a broadly defined problem, such as controlling or modeling a system, as a narrow one that has a single 'right tool' (e.g., linear analysis, fuzzy logic, neural network). We need to recognize that a typical real-world problem contains a number of different sub-problems, and that a truly optimal solution (the best combination of cost, performance and feature) is obtained by applying the right tool to the right sub-problem. Here I share some of my perspectives on what constitutes the 'right job' for fuzzy control and describe recent advances in neuro-fuzzy modeling to illustrate and to motivate the synergistic use of different tools.

  2. Molecular Rotors for Universal Quantitation of Nanoscale Hydrophobic Interfaces in Microplate Format.

    PubMed

    Bisso, Paul W; Tai, Michelle; Katepalli, Hari; Bertrand, Nicolas; Blankschtein, Daniel; Langer, Robert

    2018-01-10

    Hydrophobic self-assembly pairs diverse chemical precursors and simple formulation processes to access a vast array of functional colloids. Exploration of this design space, however, is stymied by lack of broadly general, high-throughput colloid characterization tools. Here, we show that a narrow structural subset of fluorescent, zwitterionic molecular rotors, dialkylaminostilbazolium sulfonates [DASS] with intermediate-length alkyl tails, fills this major analytical void by quantitatively sensing hydrophobic interfaces in microplate format. DASS dyes supersede existing interfacial probes by avoiding off-target fluorogenic interactions and dye aggregation while preserving hydrophobic partitioning strength. To illustrate the generality of this approach, we demonstrate (i) a microplate-based technique for measuring mass concentration of small (20-200 nm), dilute (submicrogram sensitivity) drug delivery nanoparticles; (ii) elimination of particle size, surfactant chemistry, and throughput constraints on quantifying the complex surfactant/metal oxide adsorption isotherms critical for environmental remediation and enhanced oil recovery; and (iii) more reliable self-assembly onset quantitation for chemically and structurally distinct amphiphiles. These methods could streamline the development of nanotechnologies for a broad range of applications.

  3. A Plasmonic Mass Spectrometry Approach for Detection of Small Nutrients and Toxins

    NASA Astrophysics Data System (ADS)

    Wu, Shu; Qian, Linxi; Huang, Lin; Sun, Xuming; Su, Haiyang; Gurav, Deepanjali D.; Jiang, Mawei; Cai, Wei; Qian, Kun

    2018-07-01

    Nutriology relies on advanced analytical tools to study the molecular compositions of food and provide key information on sample quality/safety. Small nutrients detection is challenging due to the high diversity and broad dynamic range of molecules in food samples, and a further issue is to track low abundance toxins. Herein, we developed a novel plasmonic matrix-assisted laser desorption/ionization mass spectrometry (MALDI MS) approach to detect small nutrients and toxins in complex biological emulsion samples. Silver nanoshells (SiO2@Ag) with optimized structures were used as matrices and achieved direct analysis of 6 nL of human breast milk without any enrichment or separation. We performed identification and quantitation of small nutrients and toxins with limit-of-detection down to 0.4 pmol (for melamine) and reaction time shortened to minutes, which is superior to the conventional biochemical method currently in use. The developed approach contributes to the near-future application of MALDI MS in a broad field and personalized design of plasmonic materials for real-case bio-analysis.

  4. Solvent mediated hybrid 2D materials: black phosphorus - graphene heterostructured building blocks assembled for sodium ion batteries.

    PubMed

    Li, Mengya; Muralidharan, Nitin; Moyer, Kathleen; Pint, Cary L

    2018-06-07

    Here we demonstrate the broad capability to exploit interactions at different length scales in 2D materials to prepare macroscopic functional materials containing hybrid black phosphorus/graphene (BP/G) heterostructured building blocks. First, heterostructured 2D building blocks are self-assembled during co-exfoliation in the solution phase based on electrostatic attraction of different 2D materials. Second, electrophoretic deposition is used as a tool to assemble these building blocks into macroscopic films containing these self-assembled 2D heterostructures. Characterization of deposits formed using this technique elucidates the presence of stacked and sandwiched 2D heterostructures, and zeta potential measurements confirm the mechanistic interactions driving this assembly. Building on the exceptional sodium alloying capacity of BP, these materials were demonstrated as superior binder-free and additive-free anodes for sodium batteries with specific discharge capacity of 2365 mA h per gram of phosphorus and long stable cycling duration. This study demonstrates how controllable co-processing of 2D materials can enable material control for stacking and building block assembly relevant to broad future applications of 2D materials.

  5. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy to use, analytic tools that are broadly accessible to the biological community need to be developed. While technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator LASSO and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.

  6. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  7. PIPER: Performance Insight for Programmers and Exascale Runtimes: Guiding the Development of the Exascale Software Stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, multi-faceted performance concerns, and to support both post-mortem performance analysis to identify program features that contribute to problematic performance and on-line performance analysis to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible since both hardware and software for exascale systems are still a moving target. For that reason, the project focused broadly on the development of new techniques for measurement and analysis of performance on modern parallel architectures, enhancements to HPCToolkit’s software infrastructure to support our research goals or use on sophisticated applications, engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs, engaging operating system developers with feature requests for enhanced monitoring support, engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms including processors, accelerators and networks, and finally collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.

  8. A preface on advances in diagnostics for infectious and parasitic diseases: detecting parasites of medical and veterinary importance.

    PubMed

    Stothard, J Russell; Adams, Emily

    2014-12-01

    There are many reasons why detection of parasites of medical and veterinary importance is vital and where novel diagnostic and surveillance tools are required. From a medical perspective alone, these originate from a desire for better clinical management and rational use of medications. Diagnosis can be at the individual-level, at close to patient settings in testing a clinical suspicion or at the community-level, perhaps in front of a computer screen, in classification of endemic areas and devising appropriate control interventions. Thus diagnostics for parasitic diseases has a broad remit as parasites are not only tied with their definitive hosts but also in some cases with their vectors/intermediate hosts. Application of current diagnostic tools and decision algorithms in sustaining control programmes, or in elimination settings, can be problematic and even ill-fitting. For example in resource-limited settings, are current diagnostic tools sufficiently robust for operational use at scale or are they confounded by on-the-ground realities; are the diagnostic algorithms underlying public health interventions always understood and well-received within communities which are targeted for control? Within this Special Issue (SI) covering a variety of diseases and diagnostic settings some answers are forthcoming. An important theme, however, throughout the SI is to acknowledge that cross-talk and continuous feedback between development and application of diagnostic tests is crucial if they are to be used effectively and appropriately.

  9. PYCHEM: a multivariate analysis package for python.

    PubMed

    Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston

    2006-10-15

    We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem
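    PyChem's own API is not reproduced here; as a hedged, minimal sketch of the core computation that multivariate toolboxes of this kind wrap (all names below are illustrative, not PyChem's), principal component analysis reduces to a singular value decomposition of the mean-centred data matrix:

    ```python
    import numpy as np

    def pca(X, n_components=2):
        """PCA via SVD of the mean-centred data matrix.
        Returns (scores, loadings, explained_variance_ratio)."""
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        var = s ** 2 / (X.shape[0] - 1)                   # variance along each PC
        scores = U[:, :n_components] * s[:n_components]   # sample coordinates
        loadings = Vt[:n_components].T                    # variable weights
        return scores, loadings, (var / var.sum())[:n_components]

    # Toy data: 6 samples x 4 spectral variables
    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 4))
    scores, loadings, ratio = pca(X)
    ```

    The scores are exactly the projection of the centred data onto the loadings, which is the property downstream methods such as PLS-DA build on.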

  10. Special population planner 4 : an open source release.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuiper, J.; Metz, W.; Tanzman, E.

    2008-01-01

    Emergencies like Hurricane Katrina and the recent California wildfires underscore the critical need to meet the complex challenge of planning for individuals with special needs and for institutionalized special populations. People with special needs and special populations often have difficulty responding to emergencies or taking protective actions, and emergency responders may be unaware of their existence and situations during a crisis. Special Population Planner (SPP) is an ArcGIS-based emergency planning system released as an open source product. SPP provides for easy production of maps, reports, and analyses to develop and revise emergency response plans. It includes tools to manage a voluntary registry of data for people with special needs, integrated links to plans and documents, tools for response planning and analysis, preformatted reports and maps, and data on locations of special populations, facility and resource characteristics, and contacts. The system can be readily adapted for new settings without programming and is broadly applicable. Full documentation and a demonstration database are included in the release.

  11. RootGraph: a graphic optimization tool for automated image analysis of plant roots

    PubMed Central

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.

    2015-01-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880

  12. The role of biogeochemical hotspots, landscape heterogeneity, and hydrological connectivity for minimizing forestry effects on water quality.

    PubMed

    Laudon, Hjalmar; Kuglerová, Lenka; Sponseller, Ryan A; Futter, Martyn; Nordin, Annika; Bishop, Kevin; Lundmark, Tomas; Egnell, Gustaf; Ågren, Anneli M

    2016-02-01

    Protecting water quality in forested regions is increasingly important as pressures from land-use, long-range transport of air pollutants, and climate change intensify. Maintaining forest industry without jeopardizing sustainability of surface water quality therefore requires new tools and approaches. Here, we show how forest management can be optimized by incorporating landscape sensitivity and hydrological connectivity into a framework that promotes the protection of water quality. We discuss how this approach can be operationalized into a hydromapping tool to support forestry operations that minimize water quality impacts. We specifically focus on how hydromapping can be used to support three fundamental aspects of land management planning including how to (i) locate areas where different forestry practices can be conducted with minimal water quality impact; (ii) guide the off-road driving of forestry machines to minimize soil damage; and (iii) optimize the design of riparian buffer zones. While this work has a boreal perspective, these concepts and approaches have broad-scale applicability.

  13. Synthesis and characterization of attosecond light vortices in the extreme ultraviolet

    PubMed Central

    Géneaux, R.; Camper, A.; Auguste, T.; Gobert, O.; Caillat, J.; Taïeb, R.; Ruchon, T.

    2016-01-01

    Infrared and visible light beams carrying orbital angular momentum (OAM) are currently thoroughly studied for their extremely broad applicative prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study for the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. These breakthroughs pave the route for the study of a series of fundamental phenomena and the development of new ultrafast diagnosis tools using either photonic or electronic vortices. PMID:27573787

  14. Synthesis and characterization of attosecond light vortices in the extreme ultraviolet

    DOE PAGES

    Géneaux, R.; Camper, A.; Auguste, T.; ...

    2016-08-30

    Infrared and visible light beams carrying orbital angular momentum (OAM) are currently thoroughly studied for their extremely broad applicative prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study for the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. Furthermore, these breakthroughs pave the route for the study of a series of fundamental phenomena and the development of new ultrafast diagnosis tools using either photonic or electronic vortices.

  15. Optogenetic control of Drosophila using a red-shifted channelrhodopsin reveals experience-dependent influences on courtship.

    PubMed

    Inagaki, Hidehiko K; Jung, Yonil; Hoopfer, Eric D; Wong, Allan M; Mishra, Neeli; Lin, John Y; Tsien, Roger Y; Anderson, David J

    2014-03-01

    Optogenetics allows the manipulation of neural activity in freely moving animals with millisecond precision, but its application in Drosophila melanogaster has been limited. Here we show that a recently described red activatable channelrhodopsin (ReaChR) permits control of complex behavior in freely moving adult flies, at wavelengths that are not thought to interfere with normal visual function. This tool affords the opportunity to control neural activity over a broad dynamic range of stimulation intensities. Using time-resolved activation, we show that the neural control of male courtship song can be separated into (i) probabilistic, persistent and (ii) deterministic, command-like components. The former, but not the latter, neurons are subject to functional modulation by social experience, which supports the idea that they constitute a locus of state-dependent influence. This separation is not evident using thermogenetic tools, a result underscoring the importance of temporally precise control of neuronal activation in the functional dissection of neural circuits in Drosophila.

  16. Spatiotemporal control of opioid signaling and behavior

    PubMed Central

    Siuda, Edward R.; Copits, Bryan A.; Schmidt, Martin J.; Baird, Madison A.; Al-Hasani, Ream; Planer, William J.; Funderburk, Samuel C.; McCall, Jordan G.; Gereau, Robert W.; Bruchas, Michael R.

    2015-01-01

    Optogenetics is now a widely accepted tool for spatiotemporal manipulation of neuronal activity. However, a majority of optogenetic approaches use binary on/off control schemes. Here we extend the optogenetic toolset by developing a neuromodulatory approach using a rationale-based design to generate a Gi-coupled, optically-sensitive, mu-opioid-like receptor, which we term opto-MOR. We demonstrate that opto-MOR engages canonical mu-opioid signaling through inhibition of adenylyl cyclase, activation of MAPK and G protein-gated inward rectifying potassium (GIRK) channels, and internalizes with similar kinetics as the mu-opioid receptor. To assess in vivo utility we expressed a Cre-dependent viral opto-MOR in RMTg/VTA GABAergic neurons, which led to a real-time place preference. In contrast, expression of opto-MOR in GABAergic neurons of the ventral pallidum hedonic cold spot led to real-time place aversion. This tool has generalizable application for spatiotemporal control of opioid signaling and, furthermore, can be used broadly for mimicking endogenous neuronal inhibition pathways. PMID:25937173

  17. T2N as a new tool for robust electrophysiological modeling demonstrated for mature and adult-born dentate granule cells

    PubMed Central

    Mongiat, Lucas Alberto; Schwarzacher, Stephan Wolfgang

    2017-01-01

    Compartmental models are the theoretical tool of choice for understanding single neuron computations. However, many models are incomplete, built ad hoc and require tuning for each novel condition rendering them of limited usability. Here, we present T2N, a powerful interface to control NEURON with Matlab and TREES toolbox, which supports generating models stable over a broad range of reconstructed and synthetic morphologies. We illustrate this for a novel, highly detailed active model of dentate granule cells (GCs) replicating a wide palette of experiments from various labs. By implementing known differences in ion channel composition and morphology, our model reproduces data from mouse or rat, mature or adult-born GCs as well as pharmacological interventions and epileptic conditions. This work sets a new benchmark for detailed compartmental modeling. T2N is suitable for creating robust models useful for large-scale networks that could lead to novel predictions. We discuss possible T2N application in degeneracy studies. PMID:29165247

  18. Effects of Lambertian sources design on uniformity and measurements

    NASA Astrophysics Data System (ADS)

    Cariou, Nadine; Durell, Chris; McKee, Greg; Wilks, Dylan; Glastre, Wilfried

    2014-10-01

    Integrating sphere (IS) based uniform sources are a primary tool for ground based calibration, characterization and testing of flight radiometric equipment. The idea of a Lambertian field of energy is a very useful tool in radiometric testing, but this concept is increasingly challenged by newly lowered uncertainty goals. At an uncertainty goal of 2%, uniformity must be assessed carefully in addition to calibration uncertainties, as even sources with 0.5% uniformity now constitute a substantial proportion of uncertainty budgets. The paper explores integrating sphere design options for achieving 99.5% and better uniformity of exit port radiance and spectral irradiance created by an integrating sphere. Uniformity across the broad spectrum and within individual spectral bands is explored. We discuss mapping techniques and results as a function of observed uniformity, as well as laboratory testing results customized to match the customer's instrumentation field of view. We also discuss recommendations based on the basic commercial instrumentation we have used to validate, inspect, and improve the correlation of uniformity measurements with the intended application.
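    Uniformity figures like the 99.5% above depend on the convention used. Assuming the common (max − min)/(max + min) contrast definition (some labs report a min/max ratio instead; this is a hedged illustration, not the paper's stated formula), a mapping scan reduces to a few lines:

    ```python
    def port_uniformity(radiance_map):
        """Spatial uniformity of an exit-port radiance map, using the
        (max - min)/(max + min) contrast definition (one common convention;
        other labs report min/max instead)."""
        lo, hi = min(radiance_map), max(radiance_map)
        return 1.0 - (hi - lo) / (hi + lo)

    # A mapped port varying between 99.0 and 100.0 (arbitrary radiance units):
    u = port_uniformity([99.0, 99.4, 99.8, 100.0])
    # u is about 0.995, i.e. roughly 99.5% uniformity
    ```

    At the 2% uncertainty goals discussed above, the residual non-uniformity (1 − u) is large enough that it must be carried as its own term in the uncertainty budget.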

  19. A novel methodology for in-process monitoring of flow forming

    NASA Astrophysics Data System (ADS)

    Appleby, Andrew; Conway, Alastair; Ion, William

    2017-10-01

    Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
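    The record above does not publish the detection algorithm itself; as a hedged sketch of the generic idea behind acoustic process monitoring (tracking band-limited energy against a healthy baseline; all function names, frequencies and thresholds here are illustrative assumptions, not the authors' method), a microphone trace can be screened as follows:

    ```python
    import numpy as np

    def band_rms(signal, fs, f_lo, f_hi):
        """RMS amplitude of `signal` within the band [f_lo, f_hi] Hz,
        computed from the one-sided FFT (Parseval scaling)."""
        spec = np.fft.rfft(signal)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        mask = (freqs >= f_lo) & (freqs <= f_hi)
        return np.sqrt(np.sum(np.abs(spec[mask]) ** 2) * 2 / len(signal) ** 2)

    def failure_flag(band_level, baseline, factor=3.0):
        """Flag a potential crack when the monitored band level exceeds
        `factor` times the healthy-process baseline."""
        return band_level > factor * baseline

    fs = 48_000
    t = np.arange(fs) / fs
    healthy = np.sin(2 * np.pi * 500 * t)                    # forming tone at 500 Hz
    cracked = healthy + 0.5 * np.sin(2 * np.pi * 6000 * t)   # extra high-band burst

    base = band_rms(healthy, fs, 5000, 8000)
    level = band_rms(cracked, fs, 5000, 8000)
    ```

    In an in-process deployment this comparison would run on short sliding windows, so an exceedance could trigger the automatic cut-out mentioned above before tool damage occurs.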

  20. Upstream oversight assessment for agrifood nanotechnology: a case studies approach.

    PubMed

    Kuzma, Jennifer; Romanchek, James; Kokotovich, Adam

    2008-08-01

    Although nanotechnology is broadly receiving attention in public and academic circles, oversight issues associated with applications for agriculture and food remain largely unexplored. Agrifood nanotechnology is at a critical stage in which informed analysis can help shape funding priorities, risk assessment, and oversight activities. This analysis is designed to help society and policymakers anticipate and prepare for challenges posed by complicated, convergent applications of agrifood nanotechnology. The goal is to identify data, risk assessment, regulatory policy, and engagement needs for overseeing these products so they can be addressed prior to market entry. Our approach, termed upstream oversight assessment (UOA), has potential as a key element of anticipatory governance. It relies on distinct case studies of proposed applications of agrifood nanotechnology to highlight areas that need study and attention. As a tool for preparation, UOA anticipates the types and features of emerging applications; their endpoints of use in society; the extent to which users, workers, ecosystems, or consumers will be exposed; the nature of the material and its safety; whether and where the technologies might fit into current regulatory system(s); the strengths and weaknesses of the system(s) in light of these novel applications; and the possible social concerns related to oversight for them.

  1. Applications of species accumulation curves in large-scale biological data analysis.

    PubMed

    Deng, Chao; Daley, Timothy; Smith, Andrew D

    2015-09-01

    The species accumulation curve, or collector's curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45-63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k-mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible.
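    The classical Good-Toulmin estimator underlying this method can be sketched directly from the frequency-count histogram. Note that the published approach replaces this raw alternating series with rational-function approximations so that it remains stable when extrapolating beyond t = 1; this minimal sketch (illustrative names, not the authors' R package) keeps only the classical series, which is reliable for t ≤ 1:

    ```python
    def good_toulmin(freq_counts, t):
        """Classical Good-Toulmin estimate of the expected number of NEW
        species found when sampling effort grows by a fraction t (0 < t <= 1).

        freq_counts maps j -> n_j, the number of species observed exactly
        j times in the initial sample.
        """
        return sum((-1) ** (j + 1) * (t ** j) * n_j
                   for j, n_j in sorted(freq_counts.items()))

    # Toy histogram: 50 singletons, 20 doubletons, 5 species seen 3 times.
    counts = {1: 50, 2: 20, 3: 5}
    new_at_half = good_toulmin(counts, 0.5)   # 25 - 5 + 0.625 = 20.625
    new_at_one = good_toulmin(counts, 1.0)    # 50 - 20 + 5 = 35
    ```

    For a sequencing library, j counts duplicate reads of the same molecule, so the same formula predicts how many new distinct molecules further sequencing would yield.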

  2. Applications of species accumulation curves in large-scale biological data analysis

    PubMed Central

    Deng, Chao; Daley, Timothy; Smith, Andrew D

    2016-01-01

    The species accumulation curve, or collector’s curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45–63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k-mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible. PMID:27252899

  3. Isotopic niches support the resource breadth hypothesis.

    PubMed

    Rader, Jonathan A; Newsome, Seth D; Sabat, Pablo; Chesser, R Terry; Dillon, Michael E; Martínez Del Rio, Carlos

    2017-03-01

    Because a broad spectrum of resource use allows species to persist in a wide range of habitat types, and thus permits them to occupy large geographical areas, and because broadly distributed species have access to more diverse resource bases, the resource breadth hypothesis posits that the diversity of resources used by organisms should be positively related with the extent of their geographic ranges. We investigated isotopic niche width in a small radiation of South American birds in the genus Cinclodes. We analysed feathers of 12 species of Cinclodes to test the isotopic version of the resource breadth hypothesis and to examine the correlation between isotopic niche breadth and morphology. We found a positive correlation between the widths of hydrogen and oxygen isotopic niches (which estimate breadth of elevational range) and widths of the carbon and nitrogen isotopic niches (which estimates the diversity of resources consumed, and hence of habitats used). We also found a positive correlation between broad isotopic niches and wing morphology. Our study not only supports the resource breadth hypothesis but it also highlights the usefulness of stable isotope analyses as tools in the exploration of ecological niches. It is an example of a macroecological application of stable isotopes. It also illustrates the importance of scientific collections in ecological studies. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.

  4. The Probiotic Compound VSL#3 Modulates Mucosal, Peripheral, and Systemic Immunity Following Murine Broad-Spectrum Antibiotic Treatment

    PubMed Central

    Ekmekciu, Ira; von Klitzing, Eliane; Fiebiger, Ulrike; Neumann, Christian; Bacher, Petra; Scheffold, Alexander; Bereswill, Stefan; Heimesaat, Markus M.

    2017-01-01

    There is compelling evidence linking the commensal intestinal microbiota with host health and, in turn, antibiotic induced perturbations of microbiota composition with distinct pathologies. Despite the attractiveness of probiotic therapy as a tool to beneficially alter the intestinal microbiota, its immunological effects are still incompletely understood. The aim of the present study was to assess the efficacy of the probiotic formulation VSL#3 consisting of eight distinct bacterial species (including Streptococcus thermophilus, Bifidobacterium breve, B. longum, B. infantis, Lactobacillus acidophilus, L. plantarum, L. paracasei, and L. delbrueckii subsp. bulgaricus) in reversing immunological effects of microbiota depletion as compared to reassociation with a complex murine microbiota. To address this, conventional mice were subjected to broad-spectrum antibiotic therapy for 8 weeks and perorally reassociated with either VSL#3 bacteria or a complex murine microbiota. VSL#3 recolonization resulted in restored CD4+ and CD8+ cell numbers in the small and large intestinal lamina propria as well as in B220+ cell numbers in the former, whereas probiotic intervention was not sufficient to reverse the antibiotic induced changes of respective cell populations in the spleen. However, VSL#3 application was as efficient as complex microbiota reassociation to attenuate the frequencies of regulatory T cells, activated dendritic cells and memory/effector T cells in the small intestine, colon, mesenteric lymph nodes, and spleen. Whereas broad-spectrum antibiotic treatment resulted in decreased production of cytokines such as IFN-γ, IL-17, IL-22, and IL-10 by CD4+ cells in respective immunological compartments, VSL#3 recolonization was sufficient to completely recover the expression of the anti-inflammatory cytokine IL-10 without affecting pro-inflammatory mediators. In summary, the probiotic compound VSL#3 has an extensive impact on mucosal, peripheral, and systemic innate as well as adaptive immunity, exerting beneficial anti-inflammatory effects in intestinal as well as systemic compartments. Hence, VSL#3 might be considered a therapeutic immunomodulatory tool following antibiotic therapy. PMID:28529928

  5. Modelling and interpreting spectral energy distributions of galaxies with BEAGLE

    NASA Astrophysics Data System (ADS)

    Chevallard, Jacopo; Charlot, Stéphane

    2016-10-01

    We present a new-generation tool to model and interpret spectral energy distributions (SEDs) of galaxies, which incorporates in a consistent way the production of radiation and its transfer through the interstellar and intergalactic media. This flexible tool, named BEAGLE (for BayEsian Analysis of GaLaxy sEds), allows one to build mock galaxy catalogues as well as to interpret any combination of photometric and spectroscopic galaxy observations in terms of physical parameters. The current version of the tool includes versatile modelling of the emission from stars and photoionized gas, attenuation by dust and accounting for different instrumental effects, such as spectroscopic flux calibration and line spread function. We show a first application of the BEAGLE tool to the interpretation of broad-band SEDs of a published sample of ~10^4 galaxies at redshifts 0.1 ≲ z ≲ 8. We find that the constraints derived on photometric redshifts using this multipurpose tool are comparable to those obtained using public, dedicated photometric-redshift codes and quantify this result in a rigorous statistical way. We also show how the post-processing of BEAGLE output data with the PYTHON extension PYP-BEAGLE allows the characterization of systematic deviations between models and observations, in particular through posterior predictive checks. The modular design of the BEAGLE tool allows easy extensions to incorporate, for example, the absorption by neutral galactic and circumgalactic gas, and the emission from an active galactic nucleus, dust and shock-ionized gas. Information about public releases of the BEAGLE tool will be maintained on http://www.jacopochevallard.org/beagle.

  6. Thinking Broadly: Financing Strategies for Youth Programs

    ERIC Educational Resources Information Center

    Deich, Sharon G.; Hayes, Cheryl D.

    2007-01-01

    This publication is part of a series of tools and resources on financing and sustaining youth programming. These tools and resources are intended to help policymakers, program developers, and community leaders develop innovative strategies for implementing, financing, and sustaining effective programs and policies. This strategy brief presents a…

  7. Information needs of Botswana health care workers and perceptions of wikipedia.

    PubMed

    Park, Elizabeth; Masupe, Tiny; Joseph, Joseph; Ho-Foster, Ari; Chavez, Afton; Jammalamadugu, Swetha; Marek, Andrew; Arumala, Ruth; Ketshogileng, Dineo; Littman-Quinn, Ryan; Kovarik, Carrie

    2016-11-01

    Since the UN Human Rights Council's recognition on the subject in 2011, the right to access the Internet and information is now considered one of the most basic human rights of global citizens [1,2]. Despite this, an information gap between developed and resource-limited countries remains, and there is scant research on actual information needs of workers themselves. The Republic of Botswana represents a fertile ground to address existing gaps in research, policy, and practice, due to its demonstrated gap in access to information and specialists among rural health care workers (HCWs), burgeoning mHealth capacity, and a timely offer from Orange Telecommunications to access Wikipedia for free on mobile platforms for Botswana subscribers. In this study, we sought to identify clinical information needs of HCWs of Botswana and their perception of Wikipedia as a clinical tool. Twenty-eight facilitated focus groups, consisting of 113 HCWs of various cadres based at district hospitals, clinics, and health posts around Botswana, were employed. Transcription and thematic analysis were performed for those groups. Access to the Internet is limited at most facilities. Most HCWs placed high importance upon using Botswana Ministry of Health (MoH) resources for obtaining credible clinical information. However, the clinical applicability of these materials was limited due to discrepancies amongst sources, potentially outdated information, and poor optimization for time-sensitive circumstances. As a result, HCWs faced challenges, such as loss of patient trust and compromises in patient care. Potential solutions posed by HCWs to address these issues included: multifaceted improvements in Internet infrastructure, access to up-to-date information, transfer of knowledge from MoH to HCW, and improving content and applicability of currently available information. 
Topics of clinical information needs were broad and encompassed: HIV, TB (Tuberculosis), OB/GYN (Obstetrics and Gynecology), and Pediatrics. HCW attitudes towards Wikipedia were variable; some trusted Wikipedia as a reliable point of care information resource whereas others thought that its use should be restricted and monitored by the MoH. There is a demonstrated need for accessible, reliable, and up-to-date information to aid clinical practice in Botswana. Attitudes towards Wikipedia as an open information resource tool are at best, split. Therefore, future studies are necessary to determine the accuracy, currency, and relevancy of Wikipedia articles on the health topics identified by health care workers as areas of information need. More broadly speaking, future efforts should be dedicated to configure a quality-controlled, readily accessible mobile platform based clinical information application tool fitting for Botswana. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. An online paradigm for exploring the self-reference effect

    PubMed Central

    Bentley, Sarah V.; Greenaway, Katharine H.; Haslam, S. Alexander

    2017-01-01

    People reliably encode information more effectively when it is related in some way to the self—a phenomenon known as the self-reference effect. This effect has been recognized in psychological research for almost 40 years, and its scope as a tool for investigating the self-concept is still expanding. The self-reference effect has been used within a broad range of psychological research, from cultural to neuroscientific, cognitive to clinical. Traditionally, the self-reference effect has been investigated in a laboratory context, which limits its applicability in non-laboratory samples. This paper introduces an online version of the self-referential encoding paradigm that yields reliable effects in an easy-to-administer procedure. Across four studies (total N = 658), this new online tool reliably replicated the traditional self-reference effect: in all studies self-referentially encoded words were recalled significantly more than semantically encoded words (d = 0.63). Moreover, the effect sizes obtained with this online tool are similar to those obtained in laboratory samples, and are robust to experimental variations in encoding time (Studies 1 and 2) and recall procedure (Studies 3 and 4), and persist independent of primacy and recency effects (all studies). PMID:28472160

  9. Advanced Welding Tool

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Accutron Tool & Instrument Co.'s welder was originally developed as a tool specifically for joining parts made of plastic or composite materials in any atmosphere, including the airless environment of space. Developers chose induction (magnetic) heating to avoid causing deformation, and the technique can be used with almost any type of thermoplastic material. The induction coil transfers magnetic flux through the plastic to a metal screen that is sandwiched between the sheets of plastic to be joined. When the welder is energized, alternating current produces inductive heating in the screen, causing the adjacent plastic surfaces to melt and flow into the mesh, creating a bond over the total surface area. Dave Brown, owner of Great Falls Canoe and Kayak Repair, Vienna, VA, uses a special repair technique based on the Induction Toroid Welder to fix canoes. Whitewater canoeing poses the problem of frequent gashes that are difficult to repair, mainly because many canoes are made of plastics. The commercial Induction model is a self-contained, portable welding gun with a switch on the handle to regulate the temperature of the plastic-melting screen. The welder has a broad range of applications in the automobile, appliance, aerospace and construction industries.

  10. Use of job-exposure matrices to estimate occupational exposure to pesticides: A review.

    PubMed

    Carles, Camille; Bouvier, Ghislaine; Lebailly, Pierre; Baldi, Isabelle

    2017-03-01

    The health effects of pesticides have been extensively studied in epidemiology, mainly in agricultural populations. However, pesticide exposure assessment remains a key methodological issue for epidemiological studies. Besides self-reported information, expert assessment or metrology, job-exposure matrices still appear to be an interesting tool. We reviewed all existing matrices assessing occupational exposure to pesticides in epidemiological studies and described the exposure parameters they included. We identified two types of matrices, (i) generic ones that are generally used in case-control studies and document broad categories of pesticides in a large range of jobs, and (ii) specific matrices, developed for use in agricultural cohorts, that generally provide exposure metrics at the active ingredient level. The various applications of these matrices in epidemiological studies have proven that they are valuable tools to assess pesticide exposure. Specific matrices are particularly promising for use in agricultural cohorts. However, results obtained with matrices have rarely been compared with those obtained with other tools. In addition, the external validity of the given estimates has not been adequately discussed. Yet, matrices would help in reducing misclassification and in quantifying cumulated exposures, to improve knowledge about the chronic health effects of pesticides.

  11. GOTree Machine (GOTM): a web-based platform for interpreting sets of interesting genes using Gene Ontology hierarchies

    PubMed Central

    Zhang, Bing; Schmoyer, Denise; Kirov, Stefan; Snoddy, Jay

    2004-01-01

    Background Microarray and other high-throughput technologies are producing large sets of interesting genes that are difficult to analyze directly. Bioinformatics tools are needed to interpret the functional information in the gene sets. Results We have created a web-based tool for data analysis and data visualization for sets of genes called GOTree Machine (GOTM). This tool was originally intended to analyze sets of co-regulated genes identified from microarray analysis but is adaptable for use with other gene sets from other high-throughput analyses. GOTree Machine generates a GOTree, a tree-like structure to navigate the Gene Ontology Directed Acyclic Graph for input gene sets. This system provides user friendly data navigation and visualization. Statistical analysis helps users to identify the most important Gene Ontology categories for the input gene sets and suggests biological areas that warrant further study. GOTree Machine is available online at . Conclusion GOTree Machine has broad application in functional genomics, proteomics and other high-throughput methods that generate large sets of interesting genes; its primary purpose is to help users sort for interesting patterns in gene sets. PMID:14975175
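    The statistical analysis behind tools such as GOTM is typically a hypergeometric over-representation test: given a set of interesting genes, how surprising is the number that falls into a given Gene Ontology category? The abstract does not specify GOTM's exact algorithm, so the sketch below is an illustrative, stdlib-only version of that standard test; the function name and the toy numbers are invented for the example.

    ```python
    from math import comb

    def go_enrichment_pvalue(n_total, n_category, n_selected, n_overlap):
        """Hypergeometric upper-tail P-value: probability of observing at
        least n_overlap genes from a GO category of size n_category when
        n_selected genes are drawn at random from n_total genes."""
        p = 0.0
        upper = min(n_category, n_selected)
        for k in range(n_overlap, upper + 1):
            p += comb(n_category, k) * comb(n_total - n_category, n_selected - k)
        return p / comb(n_total, n_selected)

    # Toy numbers: 20,000 genes, 200 in the category,
    # 100 interesting genes, 12 of them in the category.
    p = go_enrichment_pvalue(20000, 200, 100, 12)
    print(f"P(X >= 12) = {p:.3e}")
    ```

    Under the null, the expected overlap here is only 1 gene (100 × 200/20,000), so 12 hits yields a very small P-value; in practice such P-values are corrected for testing many GO categories at once.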

  12. BioFuelDB: a database and prediction server of enzymes involved in biofuels production.

    PubMed

    Chaudhary, Nikhil; Gupta, Ankit; Gupta, Sudheer; Sharma, Vineet K

    2017-01-01

    In light of the rapid decrease in fossil fuel reserves and an increasing demand for energy, novel methods are required to explore alternative biofuel production processes to alleviate these pressures. A wide variety of molecules which can either be used as biofuels or as biofuel precursors are produced using microbial enzymes. However, the common challenges in the industrial implementation of enzyme catalysis for biofuel production are the unavailability of a comprehensive biofuel enzyme resource, low efficiency of known enzymes, and limited availability of enzymes which can function under extreme conditions in the industrial processes. We have developed a comprehensive database of known enzymes with proven or potential applications in biofuel production through text mining of PubMed abstracts and other publicly available information. A total of 131 enzymes with a role in biofuel production were identified and classified into six enzyme classes and four broad application categories, namely 'Alcohol production', 'Biodiesel production', 'Fuel Cell' and 'Alternate biofuels'. A prediction tool 'Benz' was developed to identify and classify novel homologues of the known biofuel enzyme sequences from sequenced genomes and metagenomes. 'Benz' employs a hybrid approach incorporating the HMMER 3.0 and RAPSearch2 programs to provide high accuracy and high speed for prediction. Using the Benz tool, 153,754 novel homologues of biofuel enzymes were identified from 23 diverse metagenomic sources. The comprehensive data of curated biofuel enzymes, their novel homologues identified from diverse metagenomes, and the hybrid prediction tool Benz are presented as a web server which can be used for the prediction of biofuel enzymes from genomic and metagenomic datasets. The database and the Benz tool are publicly available at http://metabiosys.iiserb.ac.in/biofueldb & http://metagenomics.iiserb.ac.in/biofueldb.

  13. groHMM: a computational tool for identifying unannotated and cell type-specific transcription units from global run-on sequencing data.

    PubMed

    Chae, Minho; Danko, Charles G; Kraus, W Lee

    2015-07-16

    Global run-on coupled with deep sequencing (GRO-seq) provides extensive information on the location and function of coding and non-coding transcripts, including primary microRNAs (miRNAs), long non-coding RNAs (lncRNAs), and enhancer RNAs (eRNAs), as well as yet undiscovered classes of transcripts. However, few computational tools tailored toward this new type of sequencing data are available, limiting the applicability of GRO-seq data for identifying novel transcription units. Here, we present groHMM, a computational tool in R, which defines the boundaries of transcription units de novo using a two-state hidden Markov model (HMM). A systematic comparison of the performance between groHMM and two existing peak-calling methods tuned to identify broad regions (SICER and HOMER) favorably supports our approach on existing GRO-seq data from MCF-7 breast cancer cells. To demonstrate the broader utility of our approach, we have used groHMM to annotate a diverse array of transcription units (i.e., primary transcripts) from four GRO-seq data sets derived from cells representing a variety of different human tissue types, including non-transformed cells (cardiomyocytes and lung fibroblasts) and transformed cells (LNCaP and MCF-7 cancer cells), as well as non-mammalian cells (from flies and worms). As an example of the utility of groHMM and its application to questions about the transcriptome, we show how groHMM can be used to analyze cell type-specific enhancers as defined by newly annotated enhancer transcripts. Our results show that groHMM can reveal new insights into cell type-specific transcription by identifying novel transcription units, and serve as a complete and useful tool for evaluating functional genomic elements in cells.
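    The core idea of a two-state HMM for transcription-unit calling can be sketched compactly: bin the genome, model read counts in each bin as coming from either a "background" or a "transcribed" state, and take the maximum-likelihood state path with the Viterbi algorithm. This is not groHMM's actual R implementation; it is a minimal Python illustration with invented Poisson emission rates (`lam_bg`, `lam_tx`) and transition probability (`p_stay`), which groHMM instead learns from the data.

    ```python
    import math

    def viterbi_two_state(counts, lam_bg=0.5, lam_tx=5.0, p_stay=0.95):
        """Label each bin of read counts as background (0) or transcribed (1)
        using a two-state HMM with Poisson emissions (illustrative parameters)."""
        def log_pois(k, lam):
            return k * math.log(lam) - lam - math.lgamma(k + 1)

        log_stay, log_switch = math.log(p_stay), math.log(1 - p_stay)
        # dp[s] = best log-probability of any path ending in state s
        dp = [log_pois(counts[0], lam_bg), log_pois(counts[0], lam_tx)]
        back = []  # back[t][s] = best predecessor of state s at step t
        for k in counts[1:]:
            em = (log_pois(k, lam_bg), log_pois(k, lam_tx))
            nxt, ptr = [], []
            for s in (0, 1):
                stay, switch = dp[s] + log_stay, dp[1 - s] + log_switch
                ptr.append(s if stay >= switch else 1 - s)
                nxt.append(max(stay, switch) + em[s])
            back.append(ptr)
            dp = nxt
        # Trace back the maximum-likelihood state path
        state = 0 if dp[0] >= dp[1] else 1
        path = [state]
        for ptr in reversed(back):
            state = ptr[state]
            path.append(state)
        return path[::-1]

    bins = [0, 1, 0, 6, 7, 5, 8, 0, 1, 0]
    print(viterbi_two_state(bins))  # the run of high-count bins is flagged as 1
    ```

    The transition penalty (here ln 0.05 for switching states) is what keeps single noisy bins from fragmenting a transcription unit, which is why this formulation outperforms simple per-bin thresholding on broad, variable-coverage regions.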

  14. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    PubMed

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.

  15. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. 
Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  16. Novel small molecule modulators of plant growth and development identified by high-content screening with plant pollen.

    PubMed

    Chuprov-Netochin, Roman; Neskorodov, Yaroslav; Marusich, Elena; Mishutkina, Yana; Volynchuk, Polina; Leonov, Sergey; Skryabin, Konstantin; Ivashenko, Andrey; Palme, Klaus; Touraev, Alisher

    2016-09-06

    Small synthetic molecules provide valuable tools to agricultural biotechnology to circumvent the need for genetic engineering and provide unique benefits to modulate plant growth and development. We developed a method to explore molecular mechanisms of plant growth by high-throughput phenotypic screening of haploid populations of pollen cells. These cells rapidly germinate to develop pollen tubes. Compounds acting as growth inhibitors or stimulators of pollen tube growth are identified in a screen lasting no longer than 8 h, highlighting the potential broad applicability of this assay to prioritize chemicals for future mechanism-focused investigations in plants. We identified 65 chemical compounds that influenced pollen development. We demonstrated the usefulness of the identified compounds as promoters or inhibitors of tobacco and Arabidopsis thaliana seed growth. When 7-day-old seedlings were grown in the presence of these chemicals, twenty-two of the compounds caused a reduction in Arabidopsis root length in the range from 4.76 to 49.20% compared to controls grown in the absence of the chemicals. Two of the chemicals, sharing structural homology with thiazolidines, stimulated root growth and increased root length by 129.23 and 119.09%, respectively. The pollen tube growth-stimulating compound (S-02) belongs to benzazepin-type chemicals and increased Arabidopsis root length by 126.24%. In this study we demonstrate the usefulness of a plant pollen tube-based assay for screening small chemical compound libraries for new biologically active compounds. The pollen tubes represent an ultra-rapid screening tool with which even large compound libraries can be analyzed in very short time intervals. The broadly applicable high-throughput protocol is suitable for automated phenotypic screening of germinating pollen and, in combination with seed germination assays, enables identification of plant growth inhibitors and stimulators.

  17. Building Capacity for a Long-Term, in-Situ, National-Scale Phenology Monitoring Network: Successes, Challenges and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Browning, D. M.

    2014-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) is a national-scale science and monitoring initiative focused on phenology - the study of seasonal life-cycle events such as leafing, flowering, reproduction, and migration - as a tool to understand the response of biodiversity to environmental variation and change. USA-NPN provides a hierarchical, national monitoring framework that enables other organizations to leverage the capacity of the Network for their own applications - minimizing investment and duplication of effort - while promoting interoperability. Network participants can leverage: (1) Standardized monitoring protocols that have been broadly vetted, tested and published; (2) A centralized National Phenology Database (NPDb) for maintaining, archiving and replicating data, with standard metadata, terms-of-use, web-services, and documentation of QA/QC, plus tools for discovery, visualization and download of raw data and derived data products; and/or (3) A national in-situ, multi-taxa phenological monitoring system, Nature's Notebook, which enables participants to observe and record phenology of plants and animals - based on the protocols and information management system (IMS) described above - via either web or mobile applications. The protocols, NPDb and IMS, and Nature's Notebook represent a hierarchy of opportunities for involvement by a broad range of interested stakeholders, from individuals to agencies. For example, some organizations have adopted (e.g., the National Ecological Observatory Network or NEON) -- or are considering adopting (e.g., the Long-Term Agroecosystems Network or LTAR) -- the USA-NPN standardized protocols, but will develop their own database and IMS with web services to promote sharing of data with the NPDb. 
Other organizations (e.g., the Inventory and Monitoring Programs of the National Wildlife Refuge System and the National Park Service) have elected to use Nature's Notebook to support their phenological monitoring programs. We highlight the challenges and benefits of integrating phenology monitoring within existing and emerging national monitoring networks, and showcase opportunities that exist when standardized protocols are adopted and implemented to promote data interoperability and sharing.

  18. Linear and Nonlinear Molecular Spectroscopy with Laser Frequency Combs

    NASA Astrophysics Data System (ADS)

    Picque, Nathalie

    2013-06-01

    The regular pulse train of a mode-locked femtosecond laser can give rise to a comb spectrum of millions of laser modes with a spacing precisely equal to the pulse repetition frequency. Laser frequency combs were conceived a decade ago as tools for the precision spectroscopy of atomic hydrogen. They are now becoming enabling tools for an increasing number of applications, including molecular spectroscopy. Recent experiments of multi-heterodyne frequency comb Fourier transform spectroscopy (also called dual-comb spectroscopy) have demonstrated that the precisely spaced spectral lines of a laser frequency comb can be harnessed for new techniques of linear absorption spectroscopy. The first proof-of-principle experiments have demonstrated a very exciting potential of dual-comb spectroscopy without moving parts for ultra-rapid and ultra-sensitive recording of complex broad spectral bandwidth molecular spectra. Compared to conventional Michelson-based Fourier transform spectroscopy, recording times could be shortened from seconds to microseconds, with intriguing prospects for spectroscopy of short lived transient species. The resolution improves proportionally to the measurement time. Therefore longer recordings allow high resolution spectroscopy of molecules with extreme precision, since the absolute frequency of each laser comb line can be known with the accuracy of an atomic clock. Moreover, since laser frequency combs involve intense ultrashort laser pulses, nonlinear interactions can be harnessed. Broad spectral bandwidth ultra-rapid nonlinear molecular spectroscopy and imaging with two laser frequency combs is demonstrated with coherent Raman effects and two-photon excitation. Real-time multiplex accessing of hyperspectral images may dramatically expand the range of applications of nonlinear microscopy. B. Bernhardt et al., Nature Photonics 4, 55-57 (2010); A. Schliesser et al. Nature Photonics 6, 440-449 (2012); T. Ideguchi et al. arXiv:1201.4177 (2012) T. 
Ideguchi et al., Optics Letters 37, 4498-4500 (2012); T. Ideguchi et al., arXiv:1302.2414 (2013).

  19. Initial Assessment of X-Ray Computer Tomography image analysis for material defect microstructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, Joshua James; Windes, William Enoch

    2016-06-01

    The original development work leading to this report was focused on the non-destructive three-dimensional (3-D) characterization of nuclear graphite as a means to better understand the nature of the inherent pore structure. The pore structure of graphite and its evolution under various environmental factors such as irradiation, mechanical stress, and oxidation plays an important role in its observed properties and characteristics. If we are to transition from an empirical understanding of graphite behavior to a truly predictive mechanistic understanding, the pore structure must be well characterized and understood. As the pore structure within nuclear graphite is highly interconnected and truly 3-D in nature, 3-D characterization techniques are critical. While 3-D characterization has been an excellent tool for graphite pore characterization, it is applicable to a broad number of materials systems over many length scales. Given the wide range of applications and the highly quantitative nature of the tool, it is quite surprising how few materials researchers appreciate how valuable a tool 3-D image processing and analysis can be. Ultimately, this report is intended to encourage broader use of 3-D image processing and analysis in materials science and engineering applications, more specifically nuclear-related materials applications, by providing interested readers with enough familiarity to explore its vast potential in identifying microstructure changes. To encourage this broader use, the report is divided into two main sections. Section 2 provides an overview of some of the key principles and concepts needed to extract a wide variety of quantitative metrics from a 3-D representation of a material microstructure. The discussion includes a brief overview of segmentation methods, connected components, morphological operations, distance transforms, and skeletonization. 
Section 3 focuses on the application of concepts from Section 2 to relevant materials at Idaho National Laboratory. In this section, image analysis examples featuring nuclear graphite are discussed in detail. Additionally, example analyses from Transient Reactor Test Facility low-enriched uranium conversion, Advanced Gas Reactor-like compacts, and tristructural isotropic (TRISO) particles are shown to give a broader perspective of the applicability to relevant materials of interest.
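    Of the analysis steps the report outlines, connected-component labeling is the one that turns a segmented (binary) volume into quantitative pore metrics such as pore count and porosity. The report's own workflow and software are not reproduced here; the following is a minimal stdlib-only Python sketch of 6-connected labeling on a small 3-D boolean array, with the function name and toy volume invented for illustration (production analyses would use optimized n-D array libraries).

    ```python
    from collections import deque

    def label_pores(volume):
        """Label 6-connected pore voxels (True) in a 3-D boolean array given
        as nested lists; return (number_of_pores, porosity_fraction)."""
        nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
        labels = [[[0] * nx for _ in range(ny)] for _ in range(nz)]
        n_pores, pore_voxels = 0, 0
        for z in range(nz):
            for y in range(ny):
                for x in range(nx):
                    if volume[z][y][x] and labels[z][y][x] == 0:
                        n_pores += 1  # new pore found; flood-fill it
                        queue = deque([(z, y, x)])
                        labels[z][y][x] = n_pores
                        while queue:
                            cz, cy, cx = queue.popleft()
                            pore_voxels += 1
                            for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                                az, ay, ax = cz + dz, cy + dy, cx + dx
                                if (0 <= az < nz and 0 <= ay < ny and 0 <= ax < nx
                                        and volume[az][ay][ax]
                                        and labels[az][ay][ax] == 0):
                                    labels[az][ay][ax] = n_pores
                                    queue.append((az, ay, ax))
        return n_pores, pore_voxels / (nz * ny * nx)

    # Two disconnected pores in a 3x3x3 volume (3 pore voxels out of 27)
    vol = [[[False] * 3 for _ in range(3)] for _ in range(3)]
    vol[0][0][0] = vol[0][0][1] = True  # pore 1: two adjacent voxels
    vol[2][2][2] = True                 # pore 2: one isolated voxel
    print(label_pores(vol))
    ```

    The same labeled volume is the starting point for the other metrics the report mentions: per-pore volumes follow from counting voxels per label, and distance transforms or skeletons are computed on the labeled pore phase.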

  20. Stand-Sit Microchip for High-Throughput, Multiplexed Analysis of Single Cancer Cells.

    PubMed

    Ramirez, Lisa; Herschkowitz, Jason I; Wang, Jun

    2016-09-01

    Cellular heterogeneity in function and response to therapeutics has been a major challenge in cancer treatment. The complex nature of tumor systems calls for the development of advanced multiplexed single-cell tools that can address the heterogeneity issue. However, to date such tools are only available in a laboratory setting and do not have the portability to meet the needs of point-of-care cancer diagnostics. Towards that application, we have developed a portable single-cell system that consists of a microchip and an adjustable clamp, so on-chip operation requires only pipetting and adjustment of the clamping force. Up to 10 proteins can be quantitated from each cell, with hundreds of single-cell assays performed in parallel from one chip operation. We validated the technology and analyzed the oncogenic signatures of cancer stem cells by quantitating both aldehyde dehydrogenase (ALDH) activities and 5 signaling proteins in single MDA-MB-231 breast cancer cells. The technology has also been used to investigate the PI3K pathway activities of brain cancer cells expressing mutant epidermal growth factor receptor (EGFR) after drug intervention targeting EGFR signaling. Our portable single-cell system will potentially have broad application in preclinical and clinical settings for cancer diagnosis in the future.

  1. Filtration Characterization Method as Tool to Assess Membrane Bioreactor Sludge Filterability—The Delft Experience

    PubMed Central

    Lousada-Ferreira, Maria; Krzeminski, Pawel; Geilvoet, Stefan; Moreau, Adrien; Gil, Jose A.; Evenblij, Herman; van Lier, Jules B.; van der Graaf, Jaap H. J. M.

    2014-01-01

Prevention and removal of fouling is often the most energy-intensive process in Membrane Bioreactors (MBRs), responsible for 40% to 50% of the total specific energy consumed in submerged MBRs. In the past decade, methods were developed to quantify and qualify fouling, aiming to support optimization of MBR operation. Therefore, there is a need to evaluate the lessons learned and decide how to proceed. In this article, five different methods for measuring MBR activated sludge filterability and critical flux are described, discussed, and evaluated. Both parameters characterize the fouling potential in full-scale MBRs. The article focuses on the Delft Filtration Characterization method (DFCm) as a convenient tool to characterize sludge properties, namely on data processing, accuracy, reproducibility, reliability, and applicability, defining the boundaries of the DFCm. Significant progress was made concerning fouling measurements, in particular by using straightforward approaches focused on the applicability of the obtained results. Nevertheless, a fouling measurement method has yet to be defined that is unequivocal in its definition of fouling parameters; practical and simple in set-up and operation; and broad and useful in the results it provides. A step forward would be the standardization of such a method to assess sludge filtration quality. PMID:24957174

  2. Computer-Assisted Community Planning and Decision Making.

    ERIC Educational Resources Information Center

    College of the Atlantic, Bar Harbor, ME.

    The College of the Atlantic (COA) developed a broad-based, interdisciplinary curriculum in ecological policy and community planning and decision-making that incorporates two primary computer-based tools: ARC/INFO Geographic Information System (GIS) and STELLA, a systems-dynamics modeling tool. Students learn how to use and apply these tools…

  3. Longitudinal assessment of the impact of the use of the UK clinical aptitude test for medical student selection.

    PubMed

    Mathers, Jonathan; Sitch, Alice; Parry, Jayne

    2016-10-01

Medical schools are increasingly using novel tools to select applicants. The UK Clinical Aptitude Test (UKCAT) is one such tool; it measures mental abilities, attitudes and professional behaviour conducive to being a doctor, using constructs likely to be less affected by socio-demographic factors than traditional measures of potential. Universities are free to use the UKCAT as they see fit, but three broad modalities have been observed: 'borderline', 'factor' and 'threshold'. This paper provides the first longitudinal analyses assessing the impact of these different uses of the UKCAT on making offers to applicants with different socio-demographic characteristics. Multilevel regression was used to model the outcome of applications to UK medical schools during the period 2004-2011 (data obtained from UCAS), adjusted for sex, ethnicity, schooling, parental occupation, educational attainment, year of application and UKCAT use (borderline, factor and threshold). The three ways of using the UKCAT did not differ in their impact on making the selection process more equitable, other than a marked reversal of the female advantage when the test was applied in a 'threshold' manner. Our attempt to model the longitudinal impact of the use of the UKCAT in its threshold format again found the reversal of female advantage, but did not demonstrate similar statistically significant reductions of the advantages associated with White ethnicity, higher social class and selective schooling. Our findings demonstrate attenuation of the advantage of being female but no changes in admission rates based on White ethnicity, higher social class and selective schooling. In view of this, the utility of the UKCAT as a means to widen access to medical schools among non-White and less advantaged applicants remains unproven. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  4. Applications of complex systems theory in nursing education, research, and practice.

    PubMed

    Clancy, Thomas R; Effken, Judith A; Pesut, Daniel

    2008-01-01

The clinical and administrative processes in today's healthcare environment are becoming increasingly complex. Multiple providers, new technology, competition, and the growing ubiquity of information all contribute to the notion of health care as a complex system. A complex system (CS) is characterized by a highly connected network of entities (e.g., physical objects, people or groups of people) from which higher order behavior emerges. Research in the transdisciplinary field of CS has focused on the use of computational modeling and simulation as a methodology for analyzing CS behavior. The creation of virtual worlds through computer simulation allows researchers to analyze multiple variables simultaneously and begin to understand behaviors that are common regardless of the discipline. The application of CS principles, mediated through computer simulation, informs nursing practice of the benefits and drawbacks of new procedures, protocols and practices before having to actually implement them. The inclusion of new computational tools and their applications in nursing education is also gaining attention. For example, education in CSs and applied computational applications has been endorsed by The Institute of Medicine, the American Organization of Nurse Executives and the American Association of Colleges of Nursing as essential training of nurse leaders. The purpose of this article is to review current research literature regarding CS science within the context of expert practice and implications for the education of nurse leadership roles. The article focuses on three broad areas: CS defined; literature review and exemplars from CS research; and applications of CS theory in nursing leadership education. The article also highlights the key role nursing informaticists play in integrating emerging computational tools in the analysis of complex nursing systems.

  5. A methodology and decision support tool for informing state-level bioenergy policymaking: New Jersey biofuels as a case study

    NASA Astrophysics Data System (ADS)

    Brennan-Tonetta, Margaret

    This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool is applicable to other states and regions (with location specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. 
Instead, it operates one level below that, comparing kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that would enable analysis of broader energy policy issues, such as those mentioned above, is recommended for future research efforts.

  6. Broad spectrum pesticide application alters natural enemy communities and may facilitate secondary pest outbreaks

    PubMed Central

    Macfadyen, Sarina; Nash, Michael A.

    2017-01-01

    Background Pesticide application is the dominant control method for arthropod pests in broad-acre arable systems. In Australia, organophosphate pesticides are often applied either prophylactically, or reactively, including at higher concentrations, to control crop establishment pests such as false wireworms and earth mite species. Organophosphates are reported to be disruptive to beneficial species, such as natural enemies, but this has not been widely assessed in Australian systems. Neither has the risk that secondary outbreaks may occur if the natural enemy community composition or function is altered. Methods We examine the abundance of ground-dwelling invertebrate communities in an arable field over successive seasons under rotation; barley, two years of wheat, then canola. Two organophosphates (chlorpyrifos and methidathion) were initially applied at recommended rates. After no discernible impact on target pest species, the rate for chlorpyrifos was doubled to elicit a definitive response to a level used at establishment when seedling damage is observed. Invertebrates were sampled using pitfalls and refuge traps throughout the experiments. We applied measures of community diversity, principal response curves and multiple generalised linear modelling techniques to understand the changes in pest and natural enemy communities. Results There was large variability due to seasonality and crop type. Nevertheless, both pest (e.g., mites and aphids) and natural enemy (e.g., predatory beetles) invertebrate communities were significantly affected by application of organophosphates. When the rate of chlorpyrifos was increased there was a reduction in the number of beetles that predate on slug populations. Slugs displayed opposite trends to many of the other target pests, and actually increased in numbers under the higher rates of chlorpyrifos in comparison to the other treatments. 
Slug numbers in the final rotation of canola resulted in significant yield loss regardless of pesticide application. Discussion Organophosphates are a cost-effective tool to control emergent pests in broad-acre arable systems in Australia. We found risks associated with prophylactic application in fields rotated between different crop types, and significant changes to the pest and natural enemy communities. Disrupting key predators reduced effective suppression of other pests, such as slugs, and may lead to secondary outbreaks when rotating with susceptible crops such as canola. Such non-target impacts are rarely documented when studies focus on single species rather than community assessments. This study represents a single demonstration of how pesticide application can lead to secondary outbreaks and reinforces the need for studies that include a longer temporal component to understand this process further. PMID:29302395

  7. State of art of nanotechnology applications in the meat chain: A qualitative synthesis.

    PubMed

    Belluco, Simone; Gallocchio, Federica; Losasso, Carmen; Ricci, Antonia

    2018-05-03

Nanotechnology is a promising area in industry with a broad range of applications, including in the agri-food sector. Several studies have investigated the use of nanomaterials across the whole food chain, highlighting potential benefits but also grounds for concern. Within the agri-food sector, animal production offers potential for nanomaterial applications but also raises safety concerns, owing to the possibility of nanomaterial accumulation along the farm-to-fork path. Scope and Approach: The aim of this work was to define the state of the art of nanomaterial applications in the animal production sector by assessing data from recently published studies. To do this, a qualitative synthesis approach was applied to build a fit-for-purpose framework and to summarise relevant themes in the context of effectiveness, feasibility and health concerns. Key findings and conclusions: Nanomaterials have potential for use in a wide range of applications, from feed production and farming to food packaging, including several detection tools designed for the benefit of consumer protection. The current high degree of variability in the nanomaterials tested and in study designs impairs external validation of research results. Further research is required to clearly define which safe nanomaterial applications have the potential to reach the market.

  8. Optogenetics and the future of neuroscience.

    PubMed

    Boyden, Edward S

    2015-09-01

    Over the last 10 years, optogenetics has become widespread in neuroscience for the study of how specific cell types contribute to brain functions and brain disorder states. The full impact of optogenetics will emerge only when other toolsets mature, including neural connectivity and cell phenotyping tools and neural recording and imaging tools. The latter tools are rapidly improving, in part because optogenetics has helped galvanize broad interest in neurotechnology development.

  9. Communication and collaboration technologies.

    PubMed

    Cheeseman, Susan E

    2012-01-01

This is the third in a series of columns exploring health information technology (HIT) in the neonatal intensive care unit (NICU). The first column provided background information on the implementation of information technology throughout the health care delivery system, as well as the requisite informatics competencies needed for nurses to fully engage in the digital era of health care. The second column focused on information and resources to master basic computer competencies described by the TIGER initiative (Technology Informatics Guiding Education Reform) as learning about computers, computer networks, and the transfer of data.1 This column will provide additional information related to basic computer competencies, focusing on communication and collaboration technologies. Computers and the Internet have transformed the way we communicate and collaborate. Electronic communication is the ability to exchange information through the use of computer equipment and software.2 Broadly defined, any technology that facilitates linking one or more individuals together is a collaborative tool. Collaboration using technology encompasses an extensive range of applications that enable groups of individuals to work together including e-mail, instant messaging (IM), and several web applications collectively referred to as Web 2.0 technologies. The term Web 2.0 refers to web applications where users interact and collaborate with each other in a collective exchange of ideas generating content in a virtual community. Examples of Web 2.0 technologies include social networking sites, blogs, wikis, video sharing sites, and mashups. Many organizations are developing collaborative strategies and tools for employees to connect and interact using web-based social media technologies.3

  10. Subradiant spontaneous undulator emission through collective suppression of shot noise

    DOE PAGES

    Ratner, D.; Hemsing, E.; Gover, A.; ...

    2015-05-01

The phenomenon of Dicke’s subradiance, in which the collective properties of a system suppress radiation, has received broad interest in atomic physics. Recent theoretical papers in the field of relativistic electron beams have proposed schemes to achieve subradiance through suppression of shot noise current fluctuations. The resulting “quiet” beam, when oscillating in an undulator, generates less spontaneous radiation than even a shot noise beam emits. Quiet beams could have diverse accelerator applications, including lowering power requirements for seeded free-electron lasers and improving the efficiency of hadron cooling. In this paper we present experimental observation of a strong reduction in undulator radiation, demonstrating the feasibility of noise suppression as a practical tool in accelerator physics.

  11. Caititu: a tool to graphically represent peptide sequence coverage and domain distribution.

    PubMed

    Carvalho, Paulo C; Junqueira, Magno; Valente, Richard H; Domont, Gilberto B

    2008-10-07

    Here we present Caititu, an easy-to-use proteomics software to graphically represent peptide sequence coverage and domain distribution for different correlated samples (e.g. originated from 2D gel spots) relatively to the full-sequence of the known protein they are related to. Although Caititu has a broad applicability, we exemplify its usefulness in Toxinology using snake venom as a model. For example, proteolytic processing may lead to inactivation or loss of domains. Therefore, our proposed graphic representation for peptides identified by two dimensional electrophoresis followed by mass spectrometric identification of excised spots can aid in inferring what kind of processing happened to the toxins, if any. Caititu is freely available to download at: http://pcarvalho.com/things/caititu.
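The bookkeeping behind a sequence-coverage plot of this kind is straightforward to sketch. The snippet below is a minimal illustration of the idea, not Caititu's actual code, and the helper names are hypothetical: each identified peptide is mapped back onto the full protein sequence and the residues it spans are marked.

```python
def coverage_map(protein, peptides):
    """Mark which residues of `protein` are matched by at least one peptide."""
    covered = [False] * len(protein)
    for pep in peptides:
        start = protein.find(pep)
        while start != -1:                      # mark every occurrence
            for i in range(start, start + len(pep)):
                covered[i] = True
            start = protein.find(pep, start + 1)
    return covered

def coverage_fraction(protein, peptides):
    """Fraction of residues covered -- the quantity a coverage plot visualizes."""
    covered = coverage_map(protein, peptides)
    return sum(covered) / len(protein)
```

Gaps in the resulting boolean map (runs of uncovered residues) are what suggest, for example, proteolytic loss of a domain.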

  12. Smooth 2D manifold extraction from 3D image stack

    PubMed Central

    Shihavuddin, Asm; Basu, Sreetama; Rexhepaj, Elton; Delestro, Felipe; Menezes, Nikita; Sigoillot, Séverine M; Del Nery, Elaine; Selimi, Fekrije; Spassky, Nathalie; Genovesio, Auguste

    2017-01-01

Three-dimensional fluorescence microscopy followed by image processing is routinely used to study biological objects at various scales such as cells and tissue. However, maximum intensity projection, the most broadly used rendering tool, extracts a discontinuous layer of voxels, creating significant artifacts and possibly misleading interpretation. Here we propose smooth manifold extraction, an algorithm that produces a continuous focused 2D extraction from a 3D volume, hence preserving local spatial relationships. We demonstrate the usefulness of our approach by applying it to various biological applications using confocal and wide-field microscopy 3D image stacks. We provide a parameter-free ImageJ/Fiji plugin that allows 2D visualization and interpretation of 3D image stacks with maximum accuracy. PMID:28561033
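To make the contrast concrete, here is a minimal numpy sketch, an illustration rather than the published algorithm: maximum intensity projection keeps the brightest voxel per pixel, implicitly selecting a discontinuous height map, whereas a smooth extraction regularizes that height map before sampling. The `smooth_extraction` helper is a toy stand-in (simple neighbour averaging) for the energy-minimization step of the actual method.

```python
import numpy as np

def max_intensity_projection(stack):
    """Classic MIP: keep the brightest voxel along z for every (y, x) pixel."""
    return stack.max(axis=0)

def smooth_extraction(stack, iterations=10):
    """Toy sketch of the smooth-manifold idea (not the published algorithm):
    regularize the argmax height map by repeated 4-neighbour averaging,
    then sample the stack along the resulting continuous surface."""
    z = stack.argmax(axis=0).astype(float)
    for _ in range(iterations):
        z = 0.25 * (np.roll(z, 1, axis=0) + np.roll(z, -1, axis=0) +
                    np.roll(z, 1, axis=1) + np.roll(z, -1, axis=1))
    zi = np.clip(np.rint(z).astype(int), 0, stack.shape[0] - 1)
    yy, xx = np.indices(zi.shape)           # sample one voxel per pixel
    return stack[zi, yy, xx]
```

Because the smoothed surface varies continuously, neighbouring output pixels come from nearby z-slices, preserving the local spatial relationships that a raw per-pixel argmax destroys.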

  13. Subradiant spontaneous undulator emission through collective suppression of shot noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ratner, D.; Hemsing, E.; Gover, A.

The phenomenon of Dicke’s subradiance, in which the collective properties of a system suppress radiation, has received broad interest in atomic physics. Recent theoretical papers in the field of relativistic electron beams have proposed schemes to achieve subradiance through suppression of shot noise current fluctuations. The resulting “quiet” beam, when oscillating in an undulator, generates less spontaneous radiation than even a shot noise beam emits. Quiet beams could have diverse accelerator applications, including lowering power requirements for seeded free-electron lasers and improving the efficiency of hadron cooling. In this paper we present experimental observation of a strong reduction in undulator radiation, demonstrating the feasibility of noise suppression as a practical tool in accelerator physics.

  14. Engineering approaches to illuminating brain structure and dynamics.

    PubMed

    Deisseroth, Karl; Schnitzer, Mark J

    2013-10-30

    Historical milestones in neuroscience have come in diverse forms, ranging from the resolution of specific biological mysteries via creative experimentation to broad technological advances allowing neuroscientists to ask new kinds of questions. The continuous development of tools is driven with a special necessity by the complexity, fragility, and inaccessibility of intact nervous systems, such that inventive technique development and application drawing upon engineering and the applied sciences has long been essential to neuroscience. Here we highlight recent technological directions in neuroscience spurred by progress in optical, electrical, mechanical, chemical, and biological engineering. These research areas are poised for rapid growth and will likely be central to the practice of neuroscience well into the future. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Exploration for fossil and nuclear fuels from orbital altitudes

    NASA Technical Reports Server (NTRS)

    Short, N. M.; Tiedemann, H. A.

    1975-01-01

    Studies of LANDSAT and Skylab-EREP data have defined both the advantages and limitations of space platforms as a new 'tool' in mineral exploration. One LANDSAT investigation in the Anadarko Basin of Oklahoma has demonstrated a correlation between several types of anomalies recognized in the imagery and the locations of known oil and gas fields. In addition to supporting several LANDSAT follow-on investigations in petroleum exploration, NASA has approved a broad in-house study at Goddard Space Flight Center designed to verify the general applicability of the initial Anadarko Basin results. Using both conventional photogeologic methods and special computer processing, imagery taken over oil-producing areas is being subjected to detailed analysis in search of definitive recognition criteria.

  16. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators: experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  17. In Situ Environmental TEM in Imaging Gas and Liquid Phase Chemical Reactions for Materials Research.

    PubMed

    Wu, Jianbo; Shan, Hao; Chen, Wenlong; Gu, Xin; Tao, Peng; Song, Chengyi; Shang, Wen; Deng, Tao

    2016-11-01

Gas and liquid phase chemical reactions cover a broad range of research areas in materials science and engineering, including the synthesis of nanomaterials and application of nanomaterials, for example, in the areas of sensing, energy storage and conversion, catalysis, and bio-related applications. Environmental transmission electron microscopy (ETEM) provides a unique opportunity for monitoring gas and liquid phase reactions because it enables the observation of those reactions at ultra-high spatial resolution, which is not achievable with other techniques. Here, the fundamental science and technology developments of gas and liquid phase TEM that facilitate the mechanistic study of the gas and liquid phase chemical reactions are discussed. Combined with other characterization tools integrated in TEM, unprecedented material behaviors and reaction mechanisms are observed through the use of the in situ gas and liquid phase TEM. These observations and also the recent applications in this emerging area are described. The current challenges in the imaging process are also discussed, including the imaging speed, imaging resolution, and data management. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Two-Relaxation-Time Lattice Boltzmann Method and its Application to Advective-Diffusive-Reactive Transport

    DOE PAGES

    Yan, Zhifeng; Yang, Xiaofan; Li, Siliang; ...

    2017-09-05

The lattice Boltzmann method (LBM) based on single-relaxation-time (SRT) or multiple-relaxation-time (MRT) collision operators is widely used in simulating flow and transport phenomena. The LBM based on two-relaxation-time (TRT) collision operators possesses strengths from the SRT and MRT LBMs, such as its simple implementation and good numerical stability, although tedious mathematical derivations and presentations of the TRT LBM hinder its application to a broad range of flow and transport phenomena. This paper describes the TRT LBM clearly and provides a pseudocode for easy implementation. Various transport phenomena were simulated using the TRT LBM to illustrate its applications in subsurface environments. These phenomena include advection-diffusion in uniform flow, Taylor dispersion in a pipe, solute transport in a packed column, reactive transport in uniform flow, and bacterial chemotaxis in porous media. Finally, the TRT LBM demonstrated good numerical performance in terms of accuracy and stability in predicting these transport phenomena. Therefore, the TRT LBM is a powerful tool to simulate various geophysical and biogeochemical processes in subsurface environments.

  19. Two-relaxation-time lattice Boltzmann method and its application to advective-diffusive-reactive transport

    NASA Astrophysics Data System (ADS)

    Yan, Zhifeng; Yang, Xiaofan; Li, Siliang; Hilpert, Markus

    2017-11-01

    The lattice Boltzmann method (LBM) based on single-relaxation-time (SRT) or multiple-relaxation-time (MRT) collision operators is widely used in simulating flow and transport phenomena. The LBM based on two-relaxation-time (TRT) collision operators possesses strengths from the SRT and MRT LBMs, such as its simple implementation and good numerical stability, although tedious mathematical derivations and presentations of the TRT LBM hinder its application to a broad range of flow and transport phenomena. This paper describes the TRT LBM clearly and provides a pseudocode for easy implementation. Various transport phenomena were simulated using the TRT LBM to illustrate its applications in subsurface environments. These phenomena include advection-diffusion in uniform flow, Taylor dispersion in a pipe, solute transport in a packed column, reactive transport in uniform flow, and bacterial chemotaxis in porous media. The TRT LBM demonstrated good numerical performance in terms of accuracy and stability in predicting these transport phenomena. Therefore, the TRT LBM is a powerful tool to simulate various geophysical and biogeochemical processes in subsurface environments.
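The TRT collision described in this abstract can be sketched very compactly. The following is a toy D1Q3 advection-diffusion step of our own, not the paper's pseudocode; the lattice, relaxation rates, and the "magic parameter" choice are assumptions made for illustration. Populations are split into symmetric (+) and antisymmetric (-) halves, each relaxed at its own rate.

```python
import numpy as np

# D1Q3 lattice for the 1D advection-diffusion equation:
# velocities, weights, lattice sound speed, and conjugate (opposite) links.
c = np.array([0, 1, -1])
w = np.array([2/3, 1/6, 1/6])
cs2 = 1.0 / 3.0
conj = np.array([0, 2, 1])  # index of the opposite-velocity population

def equilibrium(C, u):
    # linear ADE equilibrium: f_q^eq = w_q * C * (1 + c_q * u / cs^2)
    return w[:, None] * C[None, :] * (1.0 + c[:, None] * u / cs2)

def trt_step(f, u, omega_plus, omega_minus):
    """One TRT collision + periodic streaming step.

    With the usual ADE convention, the antisymmetric rate sets the
    diffusivity: D = cs2 * (1/omega_minus - 1/2).
    """
    C = f.sum(axis=0)
    feq = equilibrium(C, u)
    f_p, f_m = 0.5 * (f + f[conj]), 0.5 * (f - f[conj])
    fe_p, fe_m = 0.5 * (feq + feq[conj]), 0.5 * (feq - feq[conj])
    f_post = f - omega_plus * (f_p - fe_p) - omega_minus * (f_m - fe_m)
    # streaming: shift each population one node along its velocity (periodic)
    return np.stack([np.roll(f_post[q], c[q]) for q in range(3)])
```

In practice the two rates are tied together through the "magic" parameter Λ = (1/ω⁺ − 1/2)(1/ω⁻ − 1/2); Λ = 1/4 is a common choice for stability and accuracy, and setting ω⁺ = ω⁻ recovers the SRT (BGK) scheme.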

  20. Improvements to the APBS biomolecular solvation software suite: Improvements to the APBS Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith

The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.

  1. Improvements to the APBS biomolecular solvation software suite.

    PubMed

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages that have provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.
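For context, the continuum-electrostatics model that these solvers target can be stated in its standard dimensionless form. The equation below is a sketch following the general Poisson-Boltzmann literature (not text from the APBS manual), written for the dimensionless potential u = eφ/k_BT:

```latex
% Nonlinear Poisson--Boltzmann equation for the dimensionless potential u:
-\nabla \cdot \bigl[\epsilon(\mathbf{x})\, \nabla u(\mathbf{x})\bigr]
  + \bar{\kappa}^{2}(\mathbf{x}) \sinh u(\mathbf{x})
  = \frac{4\pi e^{2}}{k_{B}T} \sum_{i=1}^{N} z_{i}\, \delta(\mathbf{x}-\mathbf{x}_{i})
% Linearized form: replace \sinh u by u when |u| is small.
```

Here ε(x) is the position-dependent dielectric coefficient, κ̄²(x) describes ionic screening by the mobile solvent ions, and the sum runs over the fixed partial charges z_i of the biomolecule.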

  2. Particle and nuclear physics instrumentation and its broad connections

    DOE PAGES

    Demarteau, Marcel; Lipton, Ron; Nicholson, Howard; ...

    2016-12-20

    Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. Finally, this symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.

  3. Two-Relaxation-Time Lattice Boltzmann Method and its Application to Advective-Diffusive-Reactive Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Zhifeng; Yang, Xiaofan; Li, Siliang

    The lattice Boltzmann method (LBM) based on single-relaxation-time (SRT) or multiple-relaxation-time (MRT) collision operators is widely used in simulating flow and transport phenomena. The LBM based on two-relaxation-time (TRT) collision operators combines strengths of the SRT and MRT LBMs, such as simple implementation and good numerical stability, but tedious mathematical derivations and presentations of the TRT LBM have hindered its application to a broad range of flow and transport phenomena. This paper describes the TRT LBM clearly and provides a pseudocode for easy implementation. Various transport phenomena were simulated using the TRT LBM to illustrate its applications in subsurface environments. These phenomena include advection-diffusion in uniform flow, Taylor dispersion in a pipe, solute transport in a packed column, reactive transport in uniform flow, and bacterial chemotaxis in porous media. The TRT LBM demonstrated good numerical performance in terms of accuracy and stability in predicting these transport phenomena, and is therefore a powerful tool for simulating various geophysical and biogeochemical processes in subsurface environments.
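
    The TRT collision-and-stream cycle can be sketched for the simplest case, pure diffusion on a one-dimensional D1Q3 lattice. This is a minimal illustration, not the paper's pseudocode; the relation D = cs^2*(1/omega_minus - 1/2) for the antisymmetric rate and the "magic" combination Lambda are taken from the standard TRT literature:

```python
import numpy as np

def trt_diffusion(C0, D=0.05, steps=200, magic=0.25):
    """1-D TRT lattice Boltzmann sketch for pure diffusion (D1Q3 lattice).

    C0    : initial concentration profile (1-D array, periodic domain)
    D     : target diffusion coefficient in lattice units
    magic : the TRT 'magic' parameter Lambda fixing the second rate
    """
    n = len(C0)
    w = np.array([2/3, 1/6, 1/6])        # D1Q3 weights, c_s^2 = 1/3
    cs2 = 1/3
    # For the advection-diffusion TRT LBM the antisymmetric rate sets D:
    tau_minus = D / cs2 + 0.5
    lam_minus = tau_minus - 0.5
    lam_plus = magic / lam_minus         # Lambda = lam_plus * lam_minus
    om_minus, om_plus = 1/tau_minus, 1/(lam_plus + 0.5)
    # f[0] = rest population, f[1] = moving right, f[2] = moving left
    f = w[:, None] * C0[None, :]         # start at equilibrium (u = 0)
    for _ in range(steps):
        C = f.sum(axis=0)
        feq = w[:, None] * C[None, :]    # zero-velocity equilibrium
        # symmetric / antisymmetric decomposition of the moving pair
        fp = 0.5 * (f[1] + f[2]); fm = 0.5 * (f[1] - f[2])
        feqp = 0.5 * (feq[1] + feq[2]); feqm = 0.5 * (feq[1] - feq[2])
        f[0] += -om_plus * (f[0] - feq[0])
        f[1] += -om_plus * (fp - feqp) - om_minus * (fm - feqm)
        f[2] += -om_plus * (fp - feqp) + om_minus * (fm - feqm)
        # streaming with periodic boundaries
        f[1] = np.roll(f[1], 1)
        f[2] = np.roll(f[2], -1)
    return f.sum(axis=0)
```

    The collision conserves total concentration exactly, and the variance of an initial pulse should grow at roughly 2*D per step, which gives a quick self-check of the implementation.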

  4. Particle and nuclear physics instrumentation and its broad connections

    NASA Astrophysics Data System (ADS)

    Demarteau, M.; Lipton, R.; Nicholson, H.; Shipsey, I.

    2016-10-01

    Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. This symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.

  5. Particle and nuclear physics instrumentation and its broad connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demarteau, Marcel; Lipton, Ron; Nicholson, Howard

    Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. Finally, this symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.

  6. Libraries of Synthetic TALE-Activated Promoters: Methods and Applications.

    PubMed

    Schreiber, T; Tissier, A

    2016-01-01

    The discovery of proteins with programmable DNA-binding specificities triggered a whole array of applications in synthetic biology, including genome editing, regulation of transcription, and epigenetic modifications. Among those, transcription activator-like effectors (TALEs), due to their natural function as transcription regulators, are especially well suited for the development of orthogonal systems for the control of gene expression. We describe here the construction and testing of libraries of synthetic TALE-activated promoters, which are under the control of a single TALE with a given DNA-binding specificity. These libraries consist of a fixed DNA-binding element for the TALE, a TATA box, and variable sequences of 19 bases upstream and 43 bases downstream of the DNA-binding element. These libraries were cloned using a Golden Gate cloning strategy, making them usable as standard parts in a modular cloning system. The broad range of promoter activities detected and the versatility of these promoter libraries make them valuable tools for applications in the fine-tuning of expression in metabolic engineering projects or in the design and implementation of regulatory circuits. © 2016 Elsevier Inc. All rights reserved.
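
    The library layout described above (19 variable bases upstream, a fixed TALE binding element, and 43 variable bases downstream containing a TATA box) can be mocked up in a few lines. The binding-site sequence and the TATA placement below are hypothetical placeholders, not the sequences used in the paper:

```python
import random

TALE_SITE = "TGTCCCTTTATCTCTCT"   # hypothetical TALE DNA-binding element
TATA_BOX = "TATATAA"              # consensus-style TATA box (illustrative)

def random_seq(n, rng):
    """Random DNA sequence of length n."""
    return "".join(rng.choice("ACGT") for _ in range(n))

def make_library(size, seed=0):
    """Sketch of a synthetic TALE-activated promoter library:
    19 random bases upstream and 43 random bases downstream of a
    fixed binding element, with the TATA box placed at the start
    of the downstream region (placement here is illustrative)."""
    rng = random.Random(seed)
    lib = []
    for _ in range(size):
        up = random_seq(19, rng)
        down = random_seq(43 - len(TATA_BOX), rng)
        lib.append(up + TALE_SITE + TATA_BOX + down)
    return lib
```

    Every member shares the fixed binding element, so a single TALE activates the whole library while the flanking randomization spreads promoter strengths.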

  7. Noninvasive Biomonitoring Approaches to Determine Dosimetry and Risk Following Acute Chemical Exposure: Analysis of Lead or Organophosphate Insecticide in Saliva

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.

    2004-04-01

    There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications for human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling, it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
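
    Quantitation with a device like this rests on a linear calibration over the reported 1-2000 ppb range: fit peak current against known standards, then invert the fit for an unknown sample. A generic sketch (the slope and intercept values in the usage example are hypothetical, not instrument data):

```python
import numpy as np

def fit_calibration(conc_ppb, peak_current):
    """Least-squares calibration line for a voltammetric sensor,
    assuming the linear response reported over ~1-2000 ppb.
    Returns (slope, intercept) of peak_current vs. concentration."""
    slope, intercept = np.polyfit(conc_ppb, peak_current, 1)
    return slope, intercept

def quantify(signal, slope, intercept):
    """Invert the calibration to estimate concentration (ppb)
    from a measured peak current."""
    return (signal - intercept) / slope
```

    In practice the standards would bracket the expected saliva Pb levels, and the fit residuals give a quick check that the response is still in its linear regime.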

  8. Tool time: melding watershed and site goals on private lands

    Treesearch

    Gary Bentrup; Michele Schoeneberger; Mike Dosskey; Gary Wells; Todd Kellerman

    2005-01-01

    Creating effective agroforestry systems with broad public support requires simultaneously addressing landowner and societal goals while paying respect to ecological processes that cross spatial and political boundaries. To meet this challenge, a variety of planning and design tools are needed that are straight-forward and flexible enough to accommodate the range of...

  9. Overview and insights regarding the JEQ soil and water assessment tool (SWAT) special issue

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) model has emerged as one of the most widely used water quality watershed- and river basin-scale models worldwide, and has been extensively applied for a broad range of hydrologic and/or environmental problems. Factors driving the international use of SWAT i...

  10. COASTAL INVERTEBRATES AND FISHES: HOW WILL THEY BE AFFECTED BY CHANGING ENVIRONMENTAL CONDITIONS- INCORPORATING CLIMATE SCENARIOS INTO THE COASTAL BIODIVERSITY RISK ANALYSIS TOOL (CBRAT)

    EPA Science Inventory

    The Coastal Biodiversity Risk Analysis Tool (CBRAT) is a public website that functions as an ecoinformatics platform to synthesize biogeographical distributions, abundances, life history attributes, and environmental tolerances for near-coastal invertebrates and fishes on a broad...

  11. A tutorial on aphasia test development in any language: Key substantive and psychometric considerations

    PubMed Central

    Ivanova, Maria V.; Hallowell, Brooke

    2013-01-01

    Background There are a limited number of aphasia language tests in the majority of the world's commonly spoken languages. Furthermore, few aphasia tests in languages other than English have been standardized and normed, and few have supportive psychometric data pertaining to reliability and validity. The lack of standardized assessment tools across many of the world's languages poses serious challenges to clinical practice and research in aphasia. Aims The current review addresses this lack of assessment tools by providing conceptual and statistical guidance for the development of aphasia assessment tools and establishment of their psychometric properties. Main Contribution A list of aphasia tests in the 20 most widely spoken languages is included. The pitfalls of translating an existing test into a new language versus creating a new test are outlined. Factors to consider in determining test content are discussed. Further, a description of test items corresponding to different language functions is provided, with special emphasis on implementing important controls in test design. Next, a broad review of principal psychometric properties relevant to aphasia tests is presented, with specific statistical guidance for establishing psychometric properties of standardized assessment tools. Conclusions This article may be used to help guide future work on developing, standardizing and validating aphasia language tests. The considerations discussed are also applicable to the development of standardized tests of other cognitive functions. PMID:23976813
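
    Among the psychometric properties such a tutorial covers, internal-consistency reliability is the one most often reported for standardized tests; a minimal computation of Cronbach's alpha (a standard formula, shown here as a generic sketch rather than the article's own procedure):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for internal-consistency reliability.

    scores : 2-D array-like, rows = examinees, columns = test items.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```

    Perfectly parallel items yield alpha = 1, while weakly related items pull the value toward (or below) zero, which is why item selection and controls in test design matter.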

  12. Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fang; Liu, Tao; Qian, Weijun

    2011-07-22

    Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics has become increasingly applied for a broad range of biological applications due to growing capabilities for broad proteome coverage and good accuracy in quantification. Herein, we review the current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.
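
    At the label-free end of the methodological spectrum reviewed here, a common workflow step is to equalize per-run median intensity before computing per-peptide fold changes; a generic sketch of that idea (not a method endorsed by the review):

```python
import math

def median_normalize(runs):
    """Scale each LC-MS run so all runs share the same median
    intensity (a common label-free normalization step).
    runs : list of lists of peptide intensities, one list per run."""
    meds = [sorted(r)[len(r) // 2] for r in runs]
    target = sorted(meds)[len(meds) // 2]
    return [[x * target / m for x in r] for r, m in zip(runs, meds)]

def log2_ratios(run_a, run_b):
    """Per-peptide log2 fold change between two normalized runs,
    assuming intensities are matched peptide-by-peptide."""
    return [math.log2(a / b) for a, b in zip(run_a, run_b)]
```

    After normalization, a run that was simply loaded at twice the amount shows zero fold change everywhere, which is the point of the step: ratios then reflect biology rather than sample loading.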

  13. Ralstonia eutropha H16 in progress: Applications beside PHAs and establishment as production platform by advanced genetic tools.

    PubMed

    Raberg, Matthias; Volodina, Elena; Lin, Kaichien; Steinbüchel, Alexander

    2018-06-01

    Ralstonia eutropha strain H16 is a Gram-negative non-pathogenic betaproteobacterium ubiquitously found in soils and has been the subject of intensive research for more than 50 years. Due to its remarkable metabolic versatility, it utilizes a broad range of renewable heterotrophic resources. The substrate utilization range can be further extended by metabolic engineering, as genetic tools are available. It has become the best studied "Knallgas" bacterium capable of chemolithoautotrophic growth with hydrogen as the electron donor and carbon dioxide as the carbon source. It also serves as a model organism to study the metabolism of poly(β-hydroxybutyrate), a polyester which is accumulated within the cells for storage of both carbon and energy. Thermoplastic and biodegradable properties of this polyhydroxyalkanoate (PHA) have attracted much biotechnical interest as a replacement for fossil resource-based plastics. The first applications of R. eutropha aimed at chemolithoautotrophic production of single cell protein (SCP) for food and feed and the synthesis of various PHAs. The complete annotated genome is available, allowing systematic biology approaches together with data provided by available omics studies. Besides PHAs, novel biopolymers of 2-hydroxyalkanoates, polythioesters, and cyanophycin, as well as chemicals such as alcohols, alkanes, alkenes, and further value-added chemicals, have recently extended the range of products synthesized by R. eutropha considerably. High cell density cultivations can be performed without much effort, and the available repertoire of genetic tools is rapidly growing. Altogether, this qualifies R. eutropha strain H16 to become a production platform strain for a large spectrum of products.

  14. Using Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) in a range of geoscience applications

    NASA Astrophysics Data System (ADS)

    Daniels, M. D.; Kerkez, B.; Chandrasekar, V.; Graves, S. J.; Stamps, D. S.; Dye, M. J.; Keiser, K.; Martin, C. L.; Gooch, S. R.

    2016-12-01

    Cloud-Hosted Real-time Data Services for the Geosciences, or CHORDS, addresses the ever-increasing importance of real-time scientific data, particularly in mission critical scenarios, where informed decisions must be made rapidly. Part of the broader EarthCube initiative, CHORDS seeks to investigate the role of real-time data in the geosciences. Many of the phenomena occurring within the geosciences, ranging from hurricanes and severe weather to earthquakes, volcanoes, and floods, can benefit from better handling of real-time data. The National Science Foundation funds many small teams of researchers residing at universities whose currently inaccessible measurements could contribute to a better understanding of these phenomena in order to ultimately improve forecasts and predictions. This lack of easy accessibility prohibits advanced algorithm and workflow development that could be initiated or enhanced by these data streams. Often the development of tools for the broad dissemination of their valuable real-time data is a large IT overhead from a pure scientific perspective, and could benefit from an easy to use, scalable, cloud-based solution to facilitate access. CHORDS proposes to make a very diverse suite of real-time data available to the broader geosciences community in order to allow innovative new science in these areas to thrive. We highlight the recently developed CHORDS portal tools and processing systems aimed at addressing some of the gaps in handling real-time data, particularly in the provisioning of data from the "long-tail" scientific community through a simple interface deployed in the cloud. Examples shown include hydrology, atmosphere and solid earth sensors. Broad use of the CHORDS framework will expand the role of real-time data within the geosciences, and enhance the potential of streaming data sources to enable adaptive experimentation and real-time hypothesis testing.
CHORDS enables real-time data to be discovered and accessed using existing standards for straightforward integration into analysis, visualization and modeling tools.

  15. Viral-genetic tracing of the input-output organization of a central noradrenaline circuit.

    PubMed

    Schwarz, Lindsay A; Miyamichi, Kazunari; Gao, Xiaojing J; Beier, Kevin T; Weissbourd, Brandon; DeLoach, Katherine E; Ren, Jing; Ibanes, Sandy; Malenka, Robert C; Kremer, Eric J; Luo, Liqun

    2015-08-06

    Deciphering how neural circuits are anatomically organized with regard to input and output is instrumental in understanding how the brain processes information. For example, locus coeruleus noradrenaline (also known as norepinephrine) (LC-NE) neurons receive input from and send output to broad regions of the brain and spinal cord, and regulate diverse functions including arousal, attention, mood and sensory gating. However, it is unclear how LC-NE neurons divide up their brain-wide projection patterns and whether different LC-NE neurons receive differential input. Here we developed a set of viral-genetic tools to quantitatively analyse the input-output relationship of neural circuits, and applied these tools to dissect the LC-NE circuit in mice. Rabies-virus-based input mapping indicated that LC-NE neurons receive convergent synaptic input from many regions previously identified as sending axons to the locus coeruleus, as well as from newly identified presynaptic partners, including cerebellar Purkinje cells. The 'tracing the relationship between input and output' method (or TRIO method) enables trans-synaptic input tracing from specific subsets of neurons based on their projection and cell type. We found that LC-NE neurons projecting to diverse output regions receive mostly similar input. Projection-based viral labelling revealed that LC-NE neurons projecting to one output region also project to all brain regions we examined. Thus, the LC-NE circuit overall integrates information from, and broadcasts to, many brain regions, consistent with its primary role in regulating brain states. At the same time, we uncovered several levels of specificity in certain LC-NE sub-circuits. These tools for mapping output architecture and input-output relationship are applicable to other neuronal circuits and organisms. More broadly, our viral-genetic approaches provide an efficient intersectional means to target neuronal populations based on cell type and projection pattern.

  16. OceanVideoLab: A Tool for Exploring Underwater Video

    NASA Astrophysics Data System (ADS)

    Ferrini, V. L.; Morton, J. J.; Wiener, C.

    2016-02-01

    Video imagery acquired with underwater vehicles is an essential tool for characterizing seafloor ecosystems and seafloor geology. It is a fundamental component of ocean exploration that facilitates real-time operations, augments multidisciplinary scientific research, and holds tremendous potential for public outreach and engagement. Acquiring, documenting, managing, preserving and providing access to large volumes of video acquired with underwater vehicles presents a variety of data stewardship challenges to the oceanographic community. As a result, only a fraction of underwater video content collected with research submersibles is documented, discoverable and/or viewable online. With more than 1 billion users, YouTube offers infrastructure that can be leveraged to help address some of the challenges associated with sharing underwater video with a broad global audience. Anyone can post content to YouTube, and some oceanographic organizations, such as the Schmidt Ocean Institute, have begun live-streaming video directly from underwater vehicles. OceanVideoLab (oceanvideolab.org) was developed to help improve access to underwater video through simple annotation, browse functionality, and integration with related environmental data. Any underwater video that is publicly accessible on YouTube can be registered with OceanVideoLab by simply providing a URL. It is strongly recommended that a navigational file also be supplied to enable geo-referencing of observations. Once a video is registered, it can be viewed and annotated using a simple user interface that integrates observations with vehicle navigation data if provided. This interface includes an interactive map and a list of previous annotations that allows users to jump to times of specific observations in the video. 
Future enhancements to OceanVideoLab will include the deployment of a search interface, the development of an application program interface (API) that will drive the search and enable querying of content by other systems/tools, the integration of related environmental data from complementary data systems (e.g. temperature, bathymetry), and the expansion of infrastructure to enable broad crowdsourcing of annotations.

  17. A broad-host range dual-fluorescence reporter system for gene expression analysis in Gram-negative bacteria.

    PubMed

    Hennessy, Rosanna C; Christiansen, Line; Olsson, Stefan; Stougaard, Peter

    2018-01-01

    Fluorescence-based reporter systems are valuable tools for studying gene expression dynamics in living cells. Here we describe a dual-fluorescence reporter system carrying the red fluorescent marker mCherry and the blue fluorescent protein EBFP2 enabling the simultaneous analysis of two promoters in broad-host range autofluorescent Gram-negative bacteria. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. A review of wearable technology in medicine.

    PubMed

    Iqbal, Mohammed H; Aydin, Abdullatif; Brunckhorst, Oliver; Dasgupta, Prokar; Ahmed, Kamran

    2016-10-01

    With rapid advances in technology, wearable devices have evolved and been adopted for various uses, ranging from simple devices used in aiding fitness to more complex devices used in assisting surgery. Wearable technology is broadly divided into head-mounted displays and body sensors. A broad search of the current literature revealed a total of 13 different body sensors and 11 head-mounted display devices. The latter have been reported for use in surgery (n = 7), imaging (n = 3), simulation and education (n = 2) and as navigation tools (n = 1). Body sensors have been used as vital-signs monitors (n = 9) and in devices for posture and fitness (n = 4). Body sensors were found to have excellent functionality in aiding patient posture and rehabilitation, while head-mounted displays can provide information to surgeons while maintaining sterility during operative procedures. There is a potential role for head-mounted wearable technology and body sensors in medicine and patient care. However, there is little scientific evidence available proving that the application of such technologies improves patient satisfaction or care. Further studies need to be conducted prior to a clear conclusion. © The Royal Society of Medicine.

  19. Tools and Services for Working with Multiple Land Remote Sensing Data Products

    NASA Astrophysics Data System (ADS)

    Krehbiel, C.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.; Maiersperger, T.

    2016-12-01

    The availability of increasingly large and diverse satellite remote sensing datasets provides both an opportunity and a challenge across broad Earth science research communities. On one hand, the extensive assortment of available data offer unprecedented opportunities to improve our understanding of Earth science and enable data use across a multitude of science disciplines. On the other hand, increasingly complex formats, data structures, and metadata can be an obstacle to data use for the broad user community that is interested in incorporating remote sensing Earth science data into their research. NASA's Land Processes Distributed Active Archive Center (LP DAAC) provides easy to use Python notebook tutorials for services such as accessing land remote sensing data from the LP DAAC Data Pool and interpreting data quality information from MODIS. We use examples to demonstrate the capabilities of the Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), such as spatially and spectrally subsetting data, decoding valuable quality information, and exploring initial analysis results within the user interface. We also show data recipes for R and Python scripts that help users process ASTER L1T and ASTER Global Emissivity Datasets.
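
    Quality information of the kind the LP DAAC tutorials decode is typically packed as bit fields inside a single integer per pixel, so "interpreting quality" reduces to a shift-and-mask. The field positions below are illustrative placeholders; the real layouts come from each product's user guide:

```python
def decode_bits(qa_value, start, length):
    """Extract a bit field from a packed integer quality value.

    qa_value : the packed per-pixel QA integer
    start    : index of the field's lowest bit (0 = least significant)
    length   : number of bits in the field
    """
    return (qa_value >> start) & ((1 << length) - 1)

# Hypothetical layout: bits 0-1 = overall quality, bits 2-4 = cloud state.
def summarize_qa(qa_value):
    """Decode two illustrative fields from one QA integer."""
    return {
        "overall_quality": decode_bits(qa_value, 0, 2),
        "cloud_state": decode_bits(qa_value, 2, 3),
    }
```

    Applied per pixel (e.g. vectorized over a NumPy array of QA values), this is enough to build masks that keep only high-quality retrievals before analysis.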

  20. 2-Aryl-5-carboxytetrazole as a New Photoaffinity Label for Drug Target Identification.

    PubMed

    Herner, András; Marjanovic, Jasmina; Lewandowski, Tracey M; Marin, Violeta; Patterson, Melanie; Miesbauer, Laura; Ready, Damien; Williams, Jon; Vasudevan, Anil; Lin, Qing

    2016-11-09

    Photoaffinity labels are powerful tools for dissecting ligand-protein interactions, and they have a broad utility in medicinal chemistry and drug discovery. Traditional photoaffinity labels work through nonspecific C-H/X-H bond insertion reactions with the protein of interest by the highly reactive photogenerated intermediate. Herein, we report a new photoaffinity label, 2-aryl-5-carboxytetrazole (ACT), that interacts with the target protein via a unique mechanism in which the photogenerated carboxynitrile imine reacts with a proximal nucleophile near the target active site. In two distinct case studies, we demonstrate that the attachment of ACT to a ligand does not significantly alter the binding affinity and specificity of the parent drug. Compared with diazirine and benzophenone, two commonly used photoaffinity labels, in two case studies ACT showed higher photo-cross-linking yields toward their protein targets in vitro based on mass spectrometry analysis. In the in situ target identification studies, ACT successfully captured the desired targets with an efficiency comparable to the diazirine. We expect that further development of this class of photoaffinity labels will lead to a broad range of applications across target identification, and validation and elucidation of the binding site in drug discovery.

  1. 2-Aryl-5-carboxytetrazole as a New Photoaffinity Label for Drug Target Identification

    PubMed Central

    2016-01-01

    Photoaffinity labels are powerful tools for dissecting ligand–protein interactions, and they have a broad utility in medicinal chemistry and drug discovery. Traditional photoaffinity labels work through nonspecific C–H/X–H bond insertion reactions with the protein of interest by the highly reactive photogenerated intermediate. Herein, we report a new photoaffinity label, 2-aryl-5-carboxytetrazole (ACT), that interacts with the target protein via a unique mechanism in which the photogenerated carboxynitrile imine reacts with a proximal nucleophile near the target active site. In two distinct case studies, we demonstrate that the attachment of ACT to a ligand does not significantly alter the binding affinity and specificity of the parent drug. Compared with diazirine and benzophenone, two commonly used photoaffinity labels, in two case studies ACT showed higher photo-cross-linking yields toward their protein targets in vitro based on mass spectrometry analysis. In the in situ target identification studies, ACT successfully captured the desired targets with an efficiency comparable to the diazirine. We expect that further development of this class of photoaffinity labels will lead to a broad range of applications across target identification, and validation and elucidation of the binding site in drug discovery. PMID:27740749

  2. Possible Existence of Two Amorphous Phases of D-Mannitol Related by a First-Order Transition

    NASA Astrophysics Data System (ADS)

    Zhu, Men; Wang, Jun-Qiang; Perepezko, John; Yu, Lian

    We report that the common polyalcohol D-mannitol may have two amorphous phases related by a first-order transition. Slightly above Tg (284 K), the supercooled liquid (SCL) of D-mannitol transforms to a low-energy, apparently amorphous phase (Phase X). The enthalpy of Phase X is roughly halfway between those of the known amorphous and crystalline phases. The amorphous nature of Phase X is suggested by its lack of birefringence, its transparency, and its broad X-ray diffraction, Raman, and NIR spectra. Phase X has greater molecular spacing, higher molecular order, fewer intra- and more inter-molecular hydrogen bonds than the normal liquid. On fast heating, Phase X transforms back to SCL near 330 K. Upon temperature cycling, it shows a glass-transition-like change of heat capacity. The presence of D-sorbitol enables a first-order liquid-liquid transition (LLT) from SCL to Phase X. This is the first report of polyamorphism at 1 atm for a pharmaceutically relevant substance. As amorphous solids are explored for many applications, polyamorphism could offer a tool to engineer the properties of materials. (Ref: M. Zhu et al., J. Chem. Phys. 2015, 142, 244504)

  3. The app will see you now: mobile health, diagnosis, and the practice of medicine in Quebec and Ontario

    PubMed Central

    Lang, Michael; Zawati, Ma’n H

    2018-01-01

    Abstract Mobile health applications are increasingly being used as tools of medicine. Outside of the clinic, some of these applications may contribute to diagnoses made absent a physician's care. We argue that this contravenes reservations of diagnosis to healthcare professionals in the law of two Canadian provinces: Quebec and Ontario. On the one hand, the law conceives of diagnosis in relatively broad terms. Drawing an association between symptoms and illness, for example, has been recognized in case law as sufficient. On the other hand, provincial law reserves diagnosis to physicians and other healthcare professionals. We argue that a number of health applications are capable of drawing associations between symptoms and disease and, in doing so, of delivering diagnoses in contravention of the law of Quebec and Ontario. This places mobile health applications in a poorly understood legal space. While prosecution is unlikely, the increasing ubiquity and technological sophistication of health applications promises to make such diagnosis widespread. We suggest that the legal status of such mobile health apps should be given serious attention. While our analysis focuses on the state of the law in Canada's largest provinces, we suggest that our argument will have implications in other jurisdictions. PMID:29707219

  4. Hypersonic Airbreathing Propulsion: An Aerodynamics, Aerothermodynamics, and Acoustics Competency White Paper

    NASA Technical Reports Server (NTRS)

    Drummond, J. Philip; Cockrell, Charles E., Jr.; Pellett, Gerald L.; Diskin, Glenn S.; Auslender, Aaron H.; Exton, Reginald J.; Guy, R. Wayne; Hoppe, John C.; Puster, Richard L.; Rogers, R. Clayton

    2002-01-01

    This White Paper examines the current state of Hypersonic Airbreathing Propulsion at the NASA Langley Research Center and the factors influencing this area of work and its personnel. Using this knowledge, the paper explores beyond the present day and suggests future directions and strategies for the field. Broad views are first taken regarding potential missions and applications of hypersonic propulsion. Then, candidate propulsion systems that may be applicable to these missions are suggested and discussed. Design tools and experimental techniques for developing these propulsion systems are then described, and approaches for applying them in the design process are considered. In each case, current strategies are reviewed and future approaches that may improve the techniques are considered. Finally, the paper concentrates on the needs to be addressed in each of these areas to take advantage of the opportunities that lie ahead for both the NASA Langley Research Center and the Aerodynamics, Aerothermodynamics, and Acoustics Competency. Recommendations are then provided so that the goals set forth in the paper may be achieved.

  5. gSRT-Soft: a generic software application and some methodological guidelines to investigate implicit learning through visual-motor sequential tasks.

    PubMed

    Chambaron, Stéphanie; Ginhac, Dominique; Perruchet, Pierre

    2008-05-01

    Serial reaction time tasks and, more generally, the visual-motor sequential paradigms are increasingly popular tools in a variety of research domains, from studies on implicit learning in laboratory contexts to the assessment of residual learning capabilities of patients in clinical settings. A consequence of this success, however, is the increased variability in paradigms and the difficulty inherent in respecting the methodological principles that two decades of experimental investigations have made more and more stringent. The purpose of the present article is to address those problems. We present a user-friendly application that simplifies running classical experiments, but is flexible enough to permit a broad range of nonstandard manipulations for more specific objectives. Basic methodological guidelines are also provided, as are suggestions for using the software to explore unconventional directions of research. The most recent version of gSRT-Soft may be obtained for free by contacting the authors.

  6. Application of PBPK modelling in drug discovery and development at Pfizer.

    PubMed

    Jones, Hannah M; Dickins, Maurice; Youdim, Kuresh; Gosset, James R; Attkins, Neil J; Hay, Tanya L; Gurrell, Ian K; Logan, Y Raj; Bungay, Peter J; Jones, Barry C; Gardner, Iain B

    2012-01-01

    Early prediction of human pharmacokinetics (PK) and drug-drug interactions (DDI) in drug discovery and development allows for more informed decision making. Physiologically based pharmacokinetic (PBPK) modelling can be used to answer a number of questions throughout the process of drug discovery and development and is thus becoming a very popular tool. PBPK models provide the opportunity to integrate key input parameters from different sources to not only estimate PK parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. Using examples from the literature and our own company, we have shown how PBPK techniques can be utilized through the stages of drug discovery and development to increase efficiency, reduce the need for animal studies, replace clinical trials and to increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising; however, some limitations need to be addressed to realize its application and utility more broadly.
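    To make the compartmental idea behind such PK predictions concrete, the following minimal sketch simulates a one-compartment model with first-order oral absorption (the Bateman equation). This is an illustrative toy, not Pfizer's PBPK implementation; the function name and all parameter values (dose, bioavailability, ka, ke, volume) are hypothetical.

```python
import math

def conc_oral_1cmt(t, dose=100.0, f_abs=0.8, ka=1.0, ke=0.2, v_d=50.0):
    """Plasma concentration (mg/L) at time t (h) for a one-compartment
    model with first-order absorption (Bateman equation).
    dose in mg, ka/ke in 1/h, v_d in L; all values are hypothetical."""
    return (f_abs * dose * ka) / (v_d * (ka - ke)) * (
        math.exp(-ke * t) - math.exp(-ka * t))

# Sample the profile over 24 h and locate Cmax/Tmax numerically; the
# analytic Tmax is ln(ka/ke)/(ka - ke), about 2.01 h for these values.
times = [i * 0.1 for i in range(241)]
profile = [conc_oral_1cmt(t) for t in times]
cmax = max(profile)
tmax = times[profile.index(cmax)]
print(f"Cmax ~ {cmax:.2f} mg/L at Tmax ~ {tmax:.1f} h")
```

    A full PBPK model replaces this single lumped compartment with physiologically parameterized organ compartments, but the concentration-time profile it produces is interpreted the same way.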

  7. Functional Interfaces Constructed by Controlled/Living Radical Polymerization for Analytical Chemistry.

    PubMed

    Wang, Huai-Song; Song, Min; Hang, Tai-Jun

    2016-02-10

    The high-value applications of functional polymers in analytical science generally require well-defined interfaces, including precisely synthesized molecular architectures and compositions. Controlled/living radical polymerization (CRP) has been developed as a versatile and powerful tool for the preparation of polymers with narrow molecular weight distributions and predetermined molecular weights. Among CRP systems, atom transfer radical polymerization (ATRP) and reversible addition-fragmentation chain transfer (RAFT) polymerization are widely used to develop new materials for analytical science, such as surface-modified core-shell particles, monoliths, MIP micro- or nanospheres, fluorescent nanoparticles, and multifunctional materials. In this review, we summarize the emerging functional interfaces constructed by RAFT and ATRP for applications in analytical science. Various polymers with precisely controlled architectures including homopolymers, block copolymers, molecularly imprinted copolymers, and grafted copolymers were synthesized by CRP methods for molecular separation, retention, or sensing. We expect that the CRP methods will become the most popular technique for preparing functional polymers that can be broadly applied in analytical chemistry.

  8. Clinician accessible tools for GUI computational models of transcranial electrical stimulation: BONSAI and SPHERES.

    PubMed

    Truong, Dennis Q; Hüber, Mathias; Xie, Xihe; Datta, Abhishek; Rahman, Asif; Parra, Lucas C; Dmochowski, Jacek P; Bikson, Marom

    2014-01-01

    Computational models of brain current flow during transcranial electrical stimulation (tES), including transcranial direct current stimulation (tDCS) and transcranial alternating current stimulation (tACS), are increasingly used to understand and optimize clinical trials. We propose that broad dissemination requires a simple graphical user interface (GUI) software that allows users to explore and design montages in real time, based on their own clinical/experimental experience and objectives. We introduce two complementary open-source platforms for this purpose: BONSAI and SPHERES. BONSAI is a web (cloud) based application (available at neuralengr.com/bonsai) that can be accessed through any flash-supported browser interface. SPHERES (available at neuralengr.com/spheres) is a stand-alone GUI application that allows consideration of arbitrary montages on a concentric sphere model by leveraging an analytical solution. These open-source tES modeling platforms are designed to be upgraded and enhanced. Trade-offs between open-access approaches that balance ease of access, speed, and flexibility are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Multi-color electron microscopy by element-guided identification of cells, organelles and molecules.

    PubMed

    Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I; de Boer, Pascal; Hagen, Kees C W; Hoogenboom, Jacob P; Giepmans, Ben N G

    2017-04-07

    Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty in interpreting grey-scale images and the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased, biomedically relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers and thus brings nanometer-scale 'color-EM' as a promising tool to unravel molecular (de)regulation in biomedicine.

  10. Multi-color electron microscopy by element-guided identification of cells, organelles and molecules

    PubMed Central

    Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I.; de Boer, Pascal; Hagen, Kees (C.) W.; Hoogenboom, Jacob P.; Giepmans, Ben N. G.

    2017-01-01

    Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty in interpreting grey-scale images and the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased, biomedically relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers and thus brings nanometer-scale ‘color-EM’ as a promising tool to unravel molecular (de)regulation in biomedicine. PMID:28387351

  11. Multi-template polymerase chain reaction.

    PubMed

    Kalle, Elena; Kubista, Mikael; Rensing, Christopher

    2014-12-01

    PCR is a formidable and potent technology that serves as an indispensable tool in a wide range of biological disciplines. However, due to the ease of use and the frequent lack of rigorous standards, many PCR applications can lead to highly variable, inaccurate, and ultimately meaningless results. Thus, rigorous method validation must precede its broad adoption in any new application. Multi-template samples possess particular features which make their PCR analysis prone to artifacts and biases: multiple homologous templates present in copy numbers that vary over several orders of magnitude. Such conditions are a breeding ground for chimeras and heteroduplexes. Differences in template amplification efficiencies and template competition for reaction compounds undermine correct preservation of the original template ratio. In addition, the presence of inhibitors aggravates all of the above-mentioned problems. Inhibitors might also have ambivalent effects on the different templates within the same sample. Yet, no standard approaches exist for monitoring inhibitory effects in multi-template PCR, which is crucial for establishing compatibility between samples.
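    The efficiency-bias problem described above compounds exponentially with cycle number. As a minimal sketch, assuming ideal exponential amplification and illustrative (not measured) per-cycle efficiencies, the following shows how a modest efficiency difference distorts an initially 1:1 ratio of two homologous templates:

```python
def amplify(n0, efficiency, cycles):
    """Ideal exponential PCR: copy number after `cycles` rounds, with
    per-cycle efficiency E in [0, 1] (E = 1 means perfect doubling)."""
    return n0 * (1.0 + efficiency) ** cycles

# Two homologous templates start at a 1:1 ratio but amplify with
# slightly different efficiencies (illustrative values only).
a = amplify(1000, 0.95, 30)
b = amplify(1000, 0.85, 30)
print(f"apparent ratio A:B after 30 cycles ~ {a / b:.1f}")
```

    A 10-percentage-point efficiency gap yields a roughly fivefold distortion after 30 cycles, which is why validation of template-ratio preservation matters before any quantitative multi-template application.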

  12. Direct Electrospray Printing of Gradient Refractive Index Chalcogenide Glass Films.

    PubMed

    Novak, Spencer; Lin, Pao Tai; Li, Cheng; Lumdee, Chatdanai; Hu, Juejun; Agarwal, Anuradha; Kik, Pieter G; Deng, Weiwei; Richardson, Kathleen

    2017-08-16

    A spatially varying effective refractive index gradient using chalcogenide glass layers is printed on a silicon wafer using an optimized electrospray (ES) deposition process. Using solution-derived glass precursors, IR-transparent Ge23Sb7S70 and As40S60 glass films of programmed thickness are fabricated to yield a bilayer structure, resulting in an effective gradient refractive index (GRIN) film. Optical and compositional analysis tools confirm the optical and physical nature of the gradient in the resulting high-optical-quality films, demonstrating the power of direct printing of multimaterial structures compatible with planar photonic fabrication protocols. The potential application of such tailorable materials and structures as they relate to the enhancement of sensitivity in chalcogenide glass based planar chemical sensor device design is presented. This method, applicable to a broad cross section of glass compositions, shows promise in directly depositing GRIN films with tunable refractive index profiles for bulk and planar optical components and devices.

  13. Modeling and Characterization of Damage Processes in Metallic Materials

    NASA Technical Reports Server (NTRS)

    Glaessgen, E. H.; Saether, E.; Smith, S. W.; Hochhalter, J. D.; Yamakov, V. I.; Gupta, V.

    2011-01-01

    This paper describes a broad effort that is aimed at understanding the fundamental mechanisms of crack growth and using that understanding as a basis for designing materials and enabling predictions of fracture in materials and structures that have small characteristic dimensions. This area of research, herein referred to as Damage Science, emphasizes the length scale regimes of the nanoscale and the microscale for which analysis and characterization tools are being developed to predict the formation, propagation, and interaction of fundamental damage mechanisms. Examination of nanoscale processes requires atomistic and discrete dislocation plasticity simulations, while microscale processes can be examined using strain gradient plasticity, crystal plasticity and microstructure modeling methods. Concurrent and sequential multiscale modeling methods are being developed to analytically bridge between these length scales. Experimental methods for characterization and quantification of near-crack tip damage are also being developed. This paper focuses on several new methodologies in these areas and their application to understanding damage processes in polycrystalline metals. On-going and potential applications are also discussed.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habercorn, Lasse; Merkl, Jan-Philip; Kloust, Hauke Christian

    With the polymer encapsulation of quantum dots via seeded emulsion polymerization, we present a powerful tool for the preparation of fluorescent nanoparticles with extraordinary stability in aqueous solution. The method of seeded emulsion polymerization allows a straightforward and simple in situ functionalization of the polymer shell while preserving the optical properties of the quantum dots. These properties are essential for the application of semiconductor nanoparticles as markers in biomedical applications. Polymer-encapsulated quantum dots have shown only a marginal loss of quantum yield when exposed to copper(II) ions, whereas unencapsulated quantum dots are totally quenched in the presence of copper(II) ions. Furthermore, a broad range of in situ functionalized polymer-coated quantum dots were obtained by addition of functional monomers or surfactants like fluorescent dye molecules, antibodies, or specific DNA aptamers. The emulsion polymerization can also be used to prepare multifunctional hybrid systems, combining different nanoparticles within one construct without any adverse effect on the properties of the starting materials [1,2].

  15. PEG Molecular Net-Cloth Grafted on Polymeric Substrates and Its Bio-Merits

    NASA Astrophysics Data System (ADS)

    Zhao, Changwen; Lin, Zhifeng; Yin, Huabing; Ma, Yuhong; Xu, Fujian; Yang, Wantai

    2014-05-01

    Polymer brushes and hydrogels are sensitive to the environment, which can cause uncontrolled variations on their performance. Herein, for the first time, we report a non-swelling "PEG molecular net-cloth" on a solid surface, fabricated using a novel "visible light induced surface controlled graft cross-linking polymerization" (VSCGCP) technique. Via this method, we show that 1) the 3D-network structure of the net-cloth can be precisely modulated and its thickness controlled; 2) the PEG net-cloth has excellent resistance to non-specific protein adsorption and cell adhesion; 3) the mild polymerization conditions (i.e. visible light and room temperature) provided an ideal tool for in situ encapsulation of delicate biomolecules such as enzymes; 4) the successive grafting of reactive three-dimensional patterns on the PEG net-cloth enables the creation of protein microarrays with high signal-to-noise ratio. Importantly, this strategy is applicable to any C-H containing surface, and can be easily tailored for a broad range of applications.

  16. Current Developments in Machine Learning Techniques in Biological Data Mining.

    PubMed

    Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail

    2017-01-01

    This supplement is intended to focus on the use of machine learning techniques to generate meaningful information from biological data. This supplement under Bioinformatics and Biology Insights aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts in this field. Advances in the field of biology have generated massive opportunities to allow the implementation of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved as an indispensable tool applied to a wide spectrum of bioinformatics applications. Thus, it is broadly used to investigate the underlying mechanisms leading to a specific disease, as well as the biomarker discovery process. With a growth in this specific area of science comes the need to access up-to-date, high-quality scholarly articles that will leverage the knowledge of scientists and researchers in the various applications of machine learning techniques in mining biological data.

  17. NASA Webworldwind: Multidimensional Virtual Globe for Geo Big Data Visualization

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Hogan, P.; Prestifilippo, G.; Zamboni, G.

    2016-06-01

    In this paper, we presented a web application created using the NASA WebWorldWind framework. The application is capable of visualizing n-dimensional data using a Voxel model. In this case study, we handled social media data and Call Detailed Records (CDR) of telecommunication networks. These were retrieved from the "BigData Challenge 2015" of Telecom Italia. We focused on the visualization process for a suitable way to show this geo-data in a 3D environment, incorporating more than three dimensions. This engenders an interactive way to browse the data in their real context and understand them quickly. Users will be able to handle several varieties of data, import their dataset using a particular data structure, and then mash them up in the WebWorldWind virtual globe. Thanks to the intuitive user interface of this web app, a broad range of the public can use the tool for diverse purposes without much experience in the field.

  18. Discrete quasi-linear viscoelastic damping analysis of connective tissues, and the biomechanics of stretching.

    PubMed

    Babaei, Behzad; Velasquez-Mao, Aaron J; Thomopoulos, Stavros; Elson, Elliot L; Abramowitch, Steven D; Genin, Guy M

    2017-05-01

    The time- and frequency-dependent properties of connective tissue define their physiological function, but are notoriously difficult to characterize. Well-established tools such as linear viscoelasticity and the Fung quasi-linear viscoelastic (QLV) model impose forms on responses that can mask true tissue behavior. Here, we applied a more general discrete quasi-linear viscoelastic (DQLV) model to identify the static and dynamic time- and frequency-dependent behavior of rabbit medial collateral ligaments. Unlike the Fung QLV approach, the DQLV approach revealed that energy dissipation is elevated at a loading period of ∼10 s. The fitting algorithm was applied to the entire loading history on each specimen, enabling accurate estimation of the material's viscoelastic relaxation spectrum from data gathered from transient rather than only steady states. The application of the DQLV method to cyclic loading regimens has broad applicability for the characterization of biological tissues, and the results suggest a mechanistic basis for the stretching regimens most favored by athletic trainers. Copyright © 2017 Elsevier Ltd. All rights reserved.
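    The frequency-localized dissipation that a discrete relaxation spectrum can resolve is easy to illustrate. The sketch below uses a hypothetical set of (g_i, tau_i) pairs, not the rabbit MCL spectrum from the paper; each Maxwell-type element dissipates maximally when the loading angular frequency satisfies omega * tau = 1, so a spectrum dominated by tau ~ 1.6 s peaks near a 10 s loading period:

```python
import math

# Hypothetical discrete relaxation spectrum: (weight g_i, time constant
# tau_i in seconds). Illustrative values, not fitted to any tissue data.
spectrum = [(0.3, 0.16), (0.5, 1.6), (0.2, 16.0)]

def loss_factor(omega):
    """Dissipative (loss) response of a set of Maxwell-type elements:
    each element contributes g * (w*tau) / (1 + (w*tau)^2), which
    peaks where w * tau = 1."""
    return sum(g * (omega * tau) / (1.0 + (omega * tau) ** 2)
               for g, tau in spectrum)

# Scan loading periods T (omega = 2*pi/T) and locate peak dissipation.
periods = [10 ** (k / 20.0) for k in range(-40, 61)]  # 0.01 s .. 1000 s
peak = max(periods, key=lambda T: loss_factor(2.0 * math.pi / T))
print(f"dissipation peaks near a loading period of {peak:.1f} s")
```

    Fitting the weights g_i from the full loading history, as the DQLV approach does, is what lets the data rather than an assumed functional form decide where such dissipation peaks lie.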

  19. Discrete quasi-linear viscoelastic damping analysis of connective tissues, and the biomechanics of stretching

    PubMed Central

    Babaei, Behzad; Velasquez-Mao, Aaron J.; Thomopoulos, Stavros; Elson, Elliot L.; Abramowitch, Steven D.; Genin, Guy M.

    2017-01-01

    The time- and frequency-dependent properties of connective tissue define their physiological function, but are notoriously difficult to characterize. Well-established tools such as linear viscoelasticity and the Fung quasi-linear viscoelastic (QLV) model impose forms on responses that can mask true tissue behavior. Here, we applied a more general discrete quasi-linear viscoelastic (DQLV) model to identify the static and dynamic time- and frequency-dependent behavior of rabbit medial collateral ligaments. Unlike the Fung QLV approach, the DQLV approach revealed that energy dissipation is elevated at a loading period of ~10 seconds. The fitting algorithm was applied to the entire loading history on each specimen, enabling accurate estimation of the material's viscoelastic relaxation spectrum from data gathered from transient rather than only steady states. The application of the DQLV method to cyclic loading regimens has broad applicability for the characterization of biological tissues, and the results suggest a mechanistic basis for the stretching regimens most favored by athletic trainers. PMID:28088071

  20. Nano-QSPR Modelling of Carbon-Based Nanomaterials Properties.

    PubMed

    Salahinejad, Maryam

    2015-01-01

    Evaluation of chemical and physical properties of nanomaterials is of critical importance in a broad variety of nanotechnology researches. There is an increasing interest in computational methods capable of predicting properties of new and modified nanomaterials in the absence of time-consuming and costly experimental studies. Quantitative Structure-Property Relationship (QSPR) approaches are progressive tools in modelling and prediction of many physicochemical properties of nanomaterials, which are also known as nano-QSPR. This review provides insight into the concepts, challenges and applications of QSPR modelling of carbon-based nanomaterials. First, we try to provide a general overview of QSPR implications, by focusing on the difficulties and limitations of each step of the QSPR modelling of nanomaterials. We then present the most significant achievements of QSPR methods in modelling the properties of carbon-based nanomaterials and their recent applications in generating predictive models. This review specifically addresses the QSPR modelling of physicochemical properties of carbon-based nanomaterials including fullerenes, single-walled carbon nanotube (SWNT), multi-walled carbon nanotube (MWNT) and graphene.
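    At its simplest, a QSPR model is a regression from computed structural descriptors to a measured property. The sketch below fits an ordinary-least-squares linear model via the normal equations; the descriptor names (surface area, atom count) and all data values are fabricated purely for illustration, not drawn from any real nanomaterial dataset:

```python
def fit_ols(X, y):
    """Solve the normal equations (X^T X) b = X^T y by Gaussian
    elimination; adequate for the tiny, well-conditioned system here."""
    n = len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(n)]
    for i in range(n):  # forward elimination (no pivoting needed: A is SPD)
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * c for a, c in zip(A[j], A[i])]
            b[j] -= f * b[i]
    coef = [0.0] * n
    for i in reversed(range(n)):  # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, n))) / A[i][i]
    return coef

# Columns: intercept, surface area, atom count (hypothetical descriptors).
X = [[1, 150, 60], [1, 300, 120], [1, 450, 180], [1, 250, 70]]
y = [1.2, 2.4, 3.6, 1.9]  # fabricated property values
coef = fit_ols(X, y)
pred = sum(c * x for c, x in zip(coef, [1, 350, 140]))
print(f"predicted property for new material: {pred:.2f}")
```

    Real nano-QSPR work differs mainly in scale and validation: hundreds of descriptors, descriptor selection, and external test sets to guard against the overfitting risks the review discusses.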

  1. A study of Minnesota land and water resources using remote sensing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A pilot study of 60 lakes in Minnesota shows that LANDSAT data correlate very well with the Carlson trophic state index which is derived from measurements in the field. Nimbus satellite data reveal improvement in water quality in Lake Superior since the dumping of taconite tailings stopped in 1980. A feasibility study of using color infrared photography as a near real time tool for soil and crop management in corn and soybean areas of the state generated strong interest from farmers and agribusiness firms. The state geological survey had success in the use and applications of LANDSAT images. Subtleties of changes in vegetation, soil, and topography are such that ground water presence and depth to water table are nearly always impossible to quantify except for broad scale applications. Bedrock and structural differences as shown in lineaments offer great potential for resolution of some kinds of geologic studies. A synergistic concept is to be used to search for mineral resources in the northeastern part of the state.

  2. Versatile silicon-waveguide supercontinuum for coherent mid-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Nader, Nima; Maser, Daniel L.; Cruz, Flavio C.; Kowligy, Abijith; Timmers, Henry; Chiles, Jeff; Fredrick, Connor; Westly, Daron A.; Nam, Sae Woo; Mirin, Richard P.; Shainline, Jeffrey M.; Diddams, Scott

    2018-03-01

    Laser frequency combs, with their unique combination of precisely defined spectral lines and broad bandwidth, are a powerful tool for basic and applied spectroscopy. Here, we report offset-free, mid-infrared frequency combs and dual-comb spectroscopy through supercontinuum generation in silicon-on-sapphire waveguides. We leverage robust fabrication and geometrical dispersion engineering of nanophotonic waveguides for multi-band, coherent frequency combs spanning 70 THz in the mid-infrared (2.5 μm-6.2 μm). Precise waveguide fabrication provides significant spectral broadening with engineered spectra targeted at specific mid-infrared bands. We characterize the relative-intensity-noise of different bands and show that the measured levels do not pose any limitation for spectroscopy applications. Additionally, we use the fabricated photonic devices to demonstrate dual-comb spectroscopy of a carbonyl sulfide gas sample at 5 μm. This work forms the technological basis for applications such as point sensors for fundamental spectroscopy, atmospheric chemistry, trace and hazardous gas detection, and biological microscopy.

  3. Mediation of Artefacts, Tools and Technical Objects: An International and French Perspective

    ERIC Educational Resources Information Center

    Impedovo, Maria Antonietta; Andreucci, Colette; Ginestié, Jacques

    2017-01-01

    In this article we present a review of literature on the concept of Artefact, Tool and Technical Object in the light of sociocultural approach. Particular attention is given to present and discuss the French research tradition on the Technical Object and Technological education. The aim is to give a broad perspective to explore the mediation…

  4. Journey to Medieval China: Using Technology-Enhanced Instruction to Develop Content Knowledge and Digital Literacy Skills

    ERIC Educational Resources Information Center

    Shand, Kristen; Winstead, Lisa; Kottler, Ellen

    2012-01-01

    Recent innovations in Web-based technology tools have made planning instruction with technology in mind far more doable than in years past. To aid teachers in planning with technology, tools are organized into five broad categories: communication, collaboration, presentation, organization and critical-thinking. The purpose and potential of each…

  5. Development of a Tool to Evaluate Lecturers' Verbal Repertoire in Action

    ERIC Educational Resources Information Center

    van der Rijst, R. M.; Visser-Wijnveen, G. J.; Verloop, N.; van Driel, J. H.

    2014-01-01

    A broad communicative repertoire can help university lecturers to motivate and engage diverse student populations. The aim of this study is to develop and explore the usefulness and validity of a tool to identify patterns in lecturers' verbal repertoire. Speech act theory is presented as a framework to study lecturers' verbal…

  6. CRADA Final Report: Weld Predictor App

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Billings, Jay Jay

    Welding is an important manufacturing process used in a broad range of industries and market sectors, including automotive, aerospace, heavy manufacturing, medical, and defense. During welded fabrication, high localized heat input and subsequent rapid cooling result in the creation of residual stresses and distortion. These residual stresses can significantly affect the fatigue resistance, cracking behavior, and load-carrying capacity of welded structures during service. Further, additional fitting and tacking time is often required to fit distorted subassemblies together, resulting in non-value-added cost. Using trial-and-error methods to determine which welding parameters, welding sequences, and fixture designs will most effectively reduce distortion is a time-consuming and expensive process. For complex structures with many welds, this approach can take several months. For this reason, efficient and accurate methods of mitigating distortion are in demand across all industries where welding is used. Analytical and computational methods and commercial software tools have been developed to predict welding-induced residual stresses and distortion. Welding process parameters, fixtures, and tooling can be optimized to reduce the HAZ softening and minimize weld residual stress and distortion, improving performance and reducing design, fabrication and testing costs. However, weld modeling technology tools are currently accessible only to engineers and designers with a background in finite element analysis (FEA) who work with large manufacturers, research institutes, and universities with access to high-performance computing (HPC) resources. Small and medium enterprises (SMEs) in the US do not typically have the human and computational resources needed to adopt and utilize weld modeling technology.
To allow engineers with no background in FEA, and SMEs generally, to gain access to this important design tool, EWI and the Ohio Supercomputer Center (OSC) developed the online weld application software tool “WeldPredictor” ( https://eweldpredictor.ewi.org ). About 1400 users have tested this application. This project marked the beginning of development on the next version of WeldPredictor, which adds many features missing from the original: it supports 3D models, allows more material hardening laws, models material phase transformation, and uses open-source finite element solvers to solve problems quickly (as opposed to expensive commercial tools).

  7. A Novel Chemical Inhibitor of ABA Signaling Targets All ABA Receptors.

    PubMed

    Ye, Yajin; Zhou, Lijuan; Liu, Xue; Liu, Hao; Li, Deqiang; Cao, Minjie; Chen, Haifeng; Xu, Lin; Zhu, Jian-Kang; Zhao, Yang

    2017-04-01

    Abscisic acid (ABA), the most important stress-induced phytohormone, regulates seed dormancy, germination, plant senescence, and the abiotic stress response. ABA signaling is repressed by group A type 2C protein phosphatases (PP2Cs); ABA binds to receptors of the PYRABACTIN RESISTANCE1 (PYR1), PYR1-LIKE (PYL), and REGULATORY COMPONENTS OF ABA RECEPTORS (RCAR) family, which, in turn, inhibit PP2Cs and activate downstream ABA signaling. Agonists/antagonists of ABA receptors have the potential to reveal the ABA signaling machinery and to become lead compounds for agrochemicals; however, until now, no broad-spectrum antagonists of ABA receptors blocking all PYR/PYL-PP2C interactions have been identified. Here, using chemical genetics screenings, we identified ABA ANTAGONIST1 (AA1), the first broad-spectrum antagonist of ABA receptors in Arabidopsis (Arabidopsis thaliana). Physiological analyses revealed that AA1 is sufficiently active to block ABA signaling. AA1 interfered with all the PYR/PYL-HAB1 interactions, and the diminished PYR/PYL-HAB1 interactions, in turn, restored the activity of HAB1. AA1 binds to all 13 PYR/PYL family members. Molecular dockings, the non-AA1-bound PYL2 variant, and competitive binding assays demonstrated that AA1 enters into the ligand-binding pocket of PYL2. Using AA1, we tested the genetic relationships of ABA receptors with other core components of ABA signaling, demonstrating that AA1 is a powerful tool with which to sidestep the genetic redundancy of PYR/PYLs. In addition, the application of AA1 delays leaf senescence. Thus, our study developed an efficient broad-spectrum antagonist of ABA receptors and demonstrated that plant senescence can be chemically controlled through AA1, with a simple and easy-to-synthesize structure, allowing its availability and utility as a chemical probe synthesized in large quantities, indicating its potential application in agriculture. © 2017 American Society of Plant Biologists. All Rights Reserved.

  8. A Novel Chemical Inhibitor of ABA Signaling Targets All ABA Receptors1

    PubMed Central

    Ye, Yajin; Liu, Xue; Liu, Hao; Li, Deqiang; Cao, Minjie; Chen, Haifeng; Zhu, Jian-kang

    2017-01-01

    Abscisic acid (ABA), the most important stress-induced phytohormone, regulates seed dormancy, germination, plant senescence, and the abiotic stress response. ABA signaling is repressed by group A type 2C protein phosphatases (PP2Cs); upon ABA binding, receptors of the PYRABACTIN RESISTANCE1 (PYR1), PYR1-LIKE (PYL), and REGULATORY COMPONENTS OF ABA RECEPTORS (RCAR) family inhibit PP2Cs and activate downstream ABA signaling. Agonists and antagonists of ABA receptors have the potential to reveal the ABA signaling machinery and to become lead compounds for agrochemicals; however, until now, no broad-spectrum antagonist of ABA receptors blocking all PYR/PYL-PP2C interactions had been identified. Here, using chemical genetic screens, we identified ABA ANTAGONIST1 (AA1), the first broad-spectrum antagonist of ABA receptors in Arabidopsis (Arabidopsis thaliana). Physiological analyses revealed that AA1 is sufficiently active to block ABA signaling. AA1 interfered with all the PYR/PYL-HAB1 interactions, and the diminished interactions, in turn, restored the activity of HAB1. AA1 binds to all 13 PYR/PYL family members. Molecular docking, a non-AA1-bound PYL2 variant, and competitive binding assays demonstrated that AA1 enters the ligand-binding pocket of PYL2. Using AA1, we tested the genetic relationships of ABA receptors with other core components of ABA signaling, demonstrating that AA1 is a powerful tool with which to sidestep the genetic redundancy of the PYR/PYLs. In addition, application of AA1 delays leaf senescence. Thus, our study delivers an efficient broad-spectrum antagonist of ABA receptors and demonstrates that plant senescence can be chemically controlled through AA1; its simple, easy-to-synthesize structure means it can be produced in large quantities as a chemical probe, indicating its potential application in agriculture. PMID:28193765

  9. The taxonomic name resolution service: an online tool for automated standardization of plant names

    PubMed Central

    2013-01-01

    Background The digitization of biodiversity data is leading to the widespread application of taxon names that are superfluous, ambiguous or incorrect, resulting in mismatched records and inflated species numbers. The ultimate consequences of misspelled names and bad taxonomy are erroneous scientific conclusions and faulty policy decisions. The lack of tools for correcting this ‘names problem’ has become a fundamental obstacle to integrating disparate data sources and advancing the progress of biodiversity science. Results The TNRS, or Taxonomic Name Resolution Service, is an online application for automated and user-supervised standardization of plant scientific names. The TNRS builds upon and extends existing open-source applications for name parsing and fuzzy matching. Names are standardized against multiple reference taxonomies, including the Missouri Botanical Garden's Tropicos database. Capable of processing thousands of names in a single operation, the TNRS parses and corrects misspelled names and authorities, standardizes variant spellings, and converts nomenclatural synonyms to accepted names. Family names can be included to increase match accuracy and resolve many types of homonyms. Partial matching of higher taxa combined with extraction of annotations, accession numbers and morphospecies allows the TNRS to standardize taxonomy across a broad range of active and legacy datasets. Conclusions We show how the TNRS can resolve many forms of taxonomic semantic heterogeneity, correct spelling errors and eliminate spurious names. As a result, the TNRS can aid the integration of disparate biological datasets. Although the TNRS was developed to aid in standardizing plant names, its underlying algorithms and design can be extended to all organisms and nomenclatural codes. The TNRS is accessible via a web interface at http://tnrs.iplantcollaborative.org/ and as a RESTful web service and application programming interface. Source code is available at https://github.com/iPlantCollaborativeOpenSource/TNRS/. PMID:23324024
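    The fuzzy-matching step described in the TNRS abstract can be illustrated with Python's standard-library difflib; the reference list, submitted name, and cutoff below are hypothetical, not TNRS internals, which use their own parsing and matching algorithms.

```python
import difflib

# Hypothetical reference taxonomy of accepted names.
reference_names = [
    "Quercus robur",
    "Quercus rubra",
    "Fagus sylvatica",
    "Pinus sylvestris",
]

def resolve_name(raw_name, names, cutoff=0.8):
    """Return the closest accepted name, or None if no candidate clears the cutoff."""
    matches = difflib.get_close_matches(raw_name, names, n=1, cutoff=cutoff)
    return matches[0] if matches else None

# A misspelled submission is corrected to the nearest accepted name.
print(resolve_name("Quercus robor", reference_names))  # Quercus robur
```

    A real resolver additionally parses authorities, checks synonymy, and weights genus vs. epithet matches, but the core idea is the same: score each candidate and accept only matches above a similarity threshold.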

  10. The Field-tested Learning Assessment Guide (FLAG): A Community Repository of Proven Alternative Assessment Instruments for STEM Education

    NASA Astrophysics Data System (ADS)

    Zeilik, M.; Garvin-Doxas, K.

    2003-12-01

    FLAG, the Field-tested Learning Assessment Guide (http://www.flaguide.org/), is an NSF-funded website that offers broadly applicable, self-contained modular classroom assessment techniques (CATs) and discipline-specific tools for STEM instructors creating new approaches to evaluate student learning, attitudes, and performance. In particular, the FLAG contains proven techniques for alternative assessments---those needed for reformed, innovative STEM courses. Each tool has been developed, tested, and refined in real classrooms at colleges and universities. The FLAG also contains an assessment primer, a section to help you select the most appropriate assessment technique(s) for your course goals, and other resources. In addition to references on instrument development and field-tested instruments on attitudes towards science, the FLAG includes discipline-specific tools in Physics, Astronomy, Biology, and Mathematics. Building of the Geoscience collection is currently under way with the development of an instrument for detecting misconceptions of incoming freshmen about Space Science, developed with the help of the Committee on Space Science and Astronomy of the American Association of Physics Teachers. Additional field-tested resources from the Geosciences are solicited from the community. Contributions should be sent to Michael Zeilik, zeilik@la.unm.edu. This work has been supported in part by NSF grant DUE 99-81155.

  11. Two's company, three (or more) is a simplex: Algebraic-topological tools for understanding higher-order structure in neural data.

    PubMed

    Giusti, Chad; Ghrist, Robert; Bassett, Danielle S

    2016-08-01

    The language of graph theory, or network science, has proven to be an exceptional tool for addressing myriad problems in neuroscience. Yet the use of networks is predicated on a critical simplifying assumption: that the quintessential unit of interest in a brain is a dyad, two nodes (neurons or brain regions) connected by an edge. While rarely mentioned, this fundamental assumption inherently limits the types of neural structure and function that graphs can be used to model. Here, we describe a generalization of graphs that overcomes these limitations, thereby offering a broad range of new possibilities for modeling and measuring neural phenomena. Specifically, we explore the use of simplicial complexes: structures developed in the field of mathematics known as algebraic topology, of increasing applicability to real data thanks to a rapidly growing computational toolset. We review the underlying mathematical formalism as well as the budding literature applying simplicial complexes to neural data, from electrophysiological recordings in animal models to hemodynamic fluctuations in humans. Based on the exceptional flexibility of the tools and recent ground-breaking insights into neural function, we posit that this framework has the potential to eclipse graph theory in unraveling the fundamental mysteries of cognition.
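    One common way to promote a graph to a simplicial complex, often used in this literature, is the clique complex: every set of pairwise-connected nodes becomes a simplex. A minimal sketch (an illustration, not the authors' code; the toy graph is invented):

```python
from itertools import combinations

def clique_complex(nodes, edges, max_dim=3):
    """Enumerate simplices of the clique complex of a graph: every subset of
    nodes that is pairwise connected forms a simplex (up to max_dim)."""
    edge_set = {frozenset(e) for e in edges}
    simplices = []
    for k in range(1, max_dim + 2):  # simplices with k vertices
        for subset in combinations(nodes, k):
            if all(frozenset(pair) in edge_set
                   for pair in combinations(subset, 2)):
                simplices.append(subset)
    return simplices

# A triangle plus a pendant node: the 3-clique {0,1,2} yields a 2-simplex,
# capturing a "higher-order" interaction a plain edge list cannot express.
nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
print(clique_complex(nodes, edges))
```

    Topological summaries (Betti numbers, persistent homology) are then computed over these simplices rather than over individual edges.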

  12. PATRIC: the Comprehensive Bacterial Bioinformatics Resource with a Focus on Human Pathogenic Species

    PubMed Central

    Gillespie, Joseph J.; Wattam, Alice R.; Cammer, Stephen A.; Gabbard, Joseph L.; Shukla, Maulik P.; Dalay, Oral; Driscoll, Timothy; Hix, Deborah; Mane, Shrinivasrao P.; Mao, Chunhong; Nordberg, Eric K.; Scott, Mark; Schulman, Julie R.; Snyder, Eric E.; Sullivan, Daniel E.; Wang, Chunxia; Warren, Andrew; Williams, Kelly P.; Xue, Tian; Seung Yoo, Hyun; Zhang, Chengdong; Zhang, Yan; Will, Rebecca; Kenyon, Ronald W.; Sobral, Bruno W.

    2011-01-01

    Funded by the National Institute of Allergy and Infectious Diseases, the Pathosystems Resource Integration Center (PATRIC) is a genomics-centric relational database and bioinformatics resource designed to assist scientists in infectious-disease research. Specifically, PATRIC provides scientists with (i) a comprehensive bacterial genomics database, (ii) a plethora of associated data relevant to genomic analysis, and (iii) an extensive suite of computational tools and platforms for bioinformatics analysis. While the primary aim of PATRIC is to advance the knowledge underlying the biology of human pathogens, all publicly available genome-scale data for bacteria are compiled and continually updated, thereby enabling comparative analyses to reveal the basis for differences between infectious, free-living, and commensal species. Herein we summarize the major features available at PATRIC, dividing the resources into two major categories: (i) organisms, genomes, and comparative genomics and (ii) recurrent integration of community-derived associated data. Additionally, we present two experimental designs typical of bacterial genomics research and report on the execution of both projects using only PATRIC data and tools. These applications encompass a broad range of the data and analysis tools available, illustrating practical uses of PATRIC for the biologist. Finally, a summary of PATRIC's outreach activities, collaborative endeavors, and future research directions is provided. PMID:21896772

  13. Regulatory assessment of chemical mixtures: Requirements, current approaches and future perspectives.

    PubMed

    Kienzler, Aude; Bopp, Stephanie K; van der Linden, Sander; Berggren, Elisabet; Worth, Andrew

    2016-10-01

    This paper reviews regulatory requirements and recent case studies to illustrate how the risk assessment (RA) of chemical mixtures is conducted, considering effects both on human health and on the environment. A broad range of chemicals, regulations, and RA methodologies are covered in order to identify mixtures of concern, gaps in the regulatory framework, data needs, and further work to be carried out. The current and potential future use of novel tools (Adverse Outcome Pathways, in silico tools, toxicokinetic modelling, etc.) in the RA of combined effects was also reviewed. The assumptions made in the RA, predictive model specifications, and the choice of toxic reference values can greatly influence the assessment outcome and should therefore be specifically justified. Novel tools could support mixture RA mainly by providing a better understanding of the underlying mechanisms of combined effects. Nevertheless, their use is currently limited because of a lack of guidance, data, and expertise; more guidance is needed to facilitate their application. As far as the authors are aware, no prospective RA concerning chemicals related to various regulatory sectors has been performed to date, even though numerous chemicals are registered under several regulatory frameworks. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
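    A standard component-based method in mixture RA of the kind this review surveys is dose (concentration) addition, summarized by a hazard index: the sum over components of exposure divided by a reference value, with HI >= 1 flagging the mixture for further assessment. A minimal sketch with invented numbers (not data from the paper):

```python
def hazard_index(exposures, reference_values):
    """Hazard index under dose addition: the sum of per-chemical hazard
    quotients (exposure / reference value)."""
    return sum(e / rv for e, rv in zip(exposures, reference_values))

# Hypothetical mixture: each chemical is individually below its reference
# value, yet the combined hazard index exceeds 1.
exposures = [0.4, 0.3, 0.5]     # e.g. mg/kg bw/day
references = [1.0, 0.5, 1.0]    # e.g. reference doses, same units
hi = hazard_index(exposures, references)
print(round(hi, 2))  # 1.5
```

    This illustrates the review's central concern: assessing chemicals one at a time can miss a combined risk that a simple additive model already reveals.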

  14. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    NASA Technical Reports Server (NTRS)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform two-body and N-body orbit propagation; find inter-satellite and satellite-to-ground-station LOS access (accounting for blocking by intermediate oblate spheroid bodies, geometric restrictions of the antenna field of view (FOV), and relativistic corrections); and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities, including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
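    The two-body propagation at the core of such tools can be sketched with a classical Runge-Kutta (RK4) integrator. This is a didactic illustration, not SCENIC or ODTBX code; the orbit values are hypothetical, and a circular orbit is used so conservation of radius gives an easy sanity check.

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth's gravitational parameter

def two_body_deriv(state):
    """State derivative for point-mass two-body motion: a = -mu * r / |r|^3."""
    x, y, z, vx, vy, vz = state
    r = math.sqrt(x * x + y * y + z * z)
    k = -MU_EARTH / r**3
    return [vx, vy, vz, k * x, k * y, k * z]

def rk4_step(state, dt):
    """One classical fourth-order Runge-Kutta step."""
    def add(s, d, h):
        return [si + h * di for si, di in zip(s, d)]
    k1 = two_body_deriv(state)
    k2 = two_body_deriv(add(state, k1, dt / 2))
    k3 = two_body_deriv(add(state, k2, dt / 2))
    k4 = two_body_deriv(add(state, k3, dt))
    return [s + dt / 6 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

# Circular low Earth orbit: radius 7000 km, speed sqrt(mu/r).
r0 = 7000.0
v0 = math.sqrt(MU_EARTH / r0)
state = [r0, 0.0, 0.0, 0.0, v0, 0.0]
for _ in range(600):  # propagate 600 s in 1 s steps
    state = rk4_step(state, 1.0)
radius = math.sqrt(sum(c * c for c in state[:3]))
print(round(radius, 1))  # stays near 7000 km for a circular orbit
```

    Production propagators add perturbations (oblateness, drag, third bodies) and adaptive step control, but the integration pattern is the same.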

  15. The role of simulation in mixed-methods research: a framework & application to patient safety.

    PubMed

    Guise, Jeanne-Marie; Hansen, Matthew; Lambert, William; O'Brien, Kerth

    2017-05-04

    Research in patient safety is an important area of health services research and a national priority. It is challenging to investigate rare occurrences, explore potential causes, and account for the complex, dynamic context of healthcare, yet all are required in patient safety research. Simulation technologies have become widely accepted as education and clinical tools but have yet to become a standard tool for research. We developed a framework for research that integrates accepted patient safety models with mixed-methods research approaches and describe the performance of the framework in a working example of a large National Institutes of Health (NIH)-funded R01 investigation. This worked example of the framework in action identifies the strengths and limitations of the qualitative and quantitative research approaches commonly used in health services research. Each approach builds essential layers of knowledge. We describe how the use of simulation ties these layers of knowledge together and adds new and unique dimensions of knowledge. A mixed-methods research approach that includes simulation provides a broad, multi-dimensional approach to health services and patient safety research.

  16. jORCA: easily integrating bioinformatics Web Services.

    PubMed

    Martín-Requena, Victoria; Ríos, Javier; García, Maximiliano; Ramírez, Sergio; Trelles, Oswaldo

    2010-02-15

    Web services technology is becoming the option of choice to deploy bioinformatics tools that are universally available. One of the major strengths of this approach is that it supports machine-to-machine interoperability over a network. However, a weakness is that various Web Services differ in their definition and invocation protocols, as well as their communication and data formats, and this presents a barrier to service interoperability. jORCA is a desktop client aimed at facilitating seamless integration of Web Services. It does so by making a uniform representation of the different web resources, supporting scalable service discovery, and automatically composing workflows. Usability is at the top of the jORCA agenda; thus, it is a highly customizable and extensible application that accommodates a broad range of user skills, featuring double-click invocation of services in conjunction with advanced execution control, on-the-fly data standardization, extensibility of viewer plug-ins, drag-and-drop editing capabilities, plus file-based browsing and organization of favourite tools. The integration of bioinformatics Web Services is thus made easier, supporting a wider range of users.

  17. Safety: the heart of the matter.

    PubMed

    Hayat, Sajad A; Senior, Roxy

    2005-08-01

    An integral part of the evaluation of cardiac disease in modern-day medicine is echocardiography. It has made great strides since the initial collaboration of Dr. Helmut Hertz and Dr. Inge Edler. In its modern form, echocardiography maintains its legacy as a bedside utility while adopting many of the technologic advances ushered in by the digital era. As a result, it boasts a broad and growing spectrum of applications, including routine use in primary cardiac diagnosis and screening, therapeutic assessment, and guidance of interventional and surgical procedures. With the advent of ultrasound contrast agents, it is now arguably the most complete 'one-stop' investigational tool to assess cardiac structure, function, and perfusion. However, has it maintained its safety profile? The familiar and oft-quoted dictum in medicine of "first do no harm" is of great importance for any diagnostic tool, and patient safety should remain a primary consideration for any new investigational technique. In this issue, Cosyns et al. examine whether some of the theoretical and in vitro experimental concerns surrounding myocardial injury during and following contrast echocardiography result in any detectable change in cardiac function.

  18. Technical Review: Microscopy and Image Processing Tools to Analyze Plant Chromatin: Practical Considerations.

    PubMed

    Baroux, Célia; Schubert, Veit

    2018-01-01

    In situ nucleus and chromatin analyses rely on microscopy imaging that benefits from versatile, efficient fluorescent probes and proteins for static or live imaging. Yet the broad choice of imaging instruments offered to the user can be disorienting. Which imaging instrument should be used for which purpose? What are the main caveats, and what considerations best exploit each instrument's ability to obtain informative and high-quality images? How can quantitative information on chromatin or nuclear organization be inferred from microscopy images? In this review, we present an overview of common fluorescence-based microscopy systems and discuss recently developed super-resolution microscopy systems, which are able to bridge the resolution gap between common fluorescence microscopy and electron microscopy. We briefly present their basic principles, discuss their possible applications in the field, and provide experience-based recommendations to guide the user toward best-possible imaging. In addition to raw data acquisition methods, we discuss commercial and noncommercial processing tools required for optimal image presentation and signal evaluation in two and three dimensions.

  19. A Rat Body Phantom for Radiation Analysis

    NASA Technical Reports Server (NTRS)

    Qualls, Garry D.; Clowdsley, Martha S.; Slaba, Tony C.; Walker, Steven A.

    2010-01-01

    To reduce the uncertainties associated with estimating the biological effects of ionizing radiation in tissue, researchers rely on laboratory experiments in which mono-energetic, single-species beams are applied to cell cultures, insects, and small animals. To estimate the radiation effects on astronauts in deep space or low Earth orbit, who are exposed to mixed-field, broad-spectrum radiation, these experimental results are extrapolated and combined with other data to produce radiation quality factors, radiation weighting factors, and other risk-related quantities for humans. One way to reduce the uncertainty associated with such extrapolations is to utilize analysis tools that are applicable to both laboratory and space environments. The use of physical and computational body phantoms to predict radiation exposure and its effects is well established, and a wide range of human and non-human phantoms are in use today. In this paper, a computational rat phantom is presented, as well as a description of the process through which that phantom has been coupled to existing radiation analysis tools. Sample results are presented for two space radiation environments.

  20. Optimizing real time fMRI neurofeedback for therapeutic discovery and development

    PubMed Central

    Stoeckel, L.E.; Garrison, K.A.; Ghosh, S.; Wighton, P.; Hanlon, C.A.; Gilman, J.M.; Greer, S.; Turk-Browne, N.B.; deBettencourt, M.T.; Scheinost, D.; Craddock, C.; Thompson, T.; Calderon, V.; Bauer, C.C.; George, M.; Breiter, H.C.; Whitfield-Gabrieli, S.; Gabrieli, J.D.; LaConte, S.M.; Hirshberg, L.; Brewer, J.A.; Hampson, M.; Van Der Kouwe, A.; Mackey, S.; Evins, A.E.

    2014-01-01

    While reducing the burden of brain disorders remains a top priority of organizations like the World Health Organization and National Institutes of Health, the development of novel, safe and effective treatments for brain disorders has been slow. In this paper, we describe the state of the science for an emerging technology, real time functional magnetic resonance imaging (rtfMRI) neurofeedback, in clinical neurotherapeutics. We review the scientific potential of rtfMRI and outline research strategies to optimize the development and application of rtfMRI neurofeedback as a next generation therapeutic tool. We propose that rtfMRI can be used to address a broad range of clinical problems by improving our understanding of brain–behavior relationships in order to develop more specific and effective interventions for individuals with brain disorders. We focus on the use of rtfMRI neurofeedback as a clinical neurotherapeutic tool to drive plasticity in brain function, cognition, and behavior. Our overall goal is for rtfMRI to advance personalized assessment and intervention approaches to enhance resilience and reduce morbidity by correcting maladaptive patterns of brain function in those with brain disorders. PMID:25161891

  1. Peptide Array X-Linking (PAX): A New Peptide-Protein Identification Approach

    PubMed Central

    Okada, Hirokazu; Uezu, Akiyoshi; Soderblom, Erik J.; Moseley, M. Arthur; Gertler, Frank B.; Soderling, Scott H.

    2012-01-01

    Many protein interaction domains bind short peptides based on canonical sequence consensus motifs. Here we report the development of a peptide array-based proteomics tool to identify proteins directly interacting with ligand peptides from cell lysates. Array-formatted bait peptides containing an amino acid-derived cross-linker are photo-induced to crosslink with interacting proteins from lysates of interest. Indirect associations are removed by high-stringency washes under denaturing conditions. Covalently trapped proteins are subsequently identified by LC-MS/MS and screened by cluster analysis and domain scanning. We apply this methodology to peptides with different proline-containing consensus sequences and show successful identification, from brain lysates, of known and novel proteins containing polyproline motif-binding domains such as the EH, EVH1, SH3, and WW domains. These results suggest that the capacity of arrayed peptide ligands to capture and subsequently identify proteins by mass spectrometry is relatively broad and robust. Additionally, the approach is rapid and applicable to cell or tissue fractions from any source, making it a flexible tool for initial protein-protein interaction discovery. PMID:22606326

  2. Development of a reverse genetics system to generate a recombinant Ebola virus Makona expressing a green fluorescent protein

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albariño, César G., E-mail: calbarino@cdc.gov; Wiggleton Guerrero, Lisa; Lo, Michael K.

    Previous studies have demonstrated the potential application of reverse genetics technology in studying a broad range of aspects of viral biology, including gene regulation, protein function, cell entry, and pathogenesis. Here, we describe a highly efficient reverse genetics system used to generate recombinant Ebola virus (EBOV) based on a recent isolate from a human patient infected during the 2014–2015 outbreak in Western Africa. We also rescued a recombinant EBOV expressing a fluorescent reporter protein from a cleaved VP40 protein fusion. Using this virus and an inexpensive method to quantitate the expression of the foreign gene, we demonstrate its potential usefulness as a tool for screening antiviral compounds and measuring neutralizing antibodies. Highlights: • Recombinant Ebola virus (EBOV) derived from the Makona variant was rescued. • A new protocol for viral rescue allows 100% efficiency. • The modified EBOV expresses a green fluorescent protein from a VP40-fused protein. • The modified EBOV was tested as a tool to screen antiviral compounds and measure neutralizing antibodies.

  3. Spatiotemporal control of opioid signaling and behavior.

    PubMed

    Siuda, Edward R; Copits, Bryan A; Schmidt, Martin J; Baird, Madison A; Al-Hasani, Ream; Planer, William J; Funderburk, Samuel C; McCall, Jordan G; Gereau, Robert W; Bruchas, Michael R

    2015-05-20

    Optogenetics is now a widely accepted tool for spatiotemporal manipulation of neuronal activity. However, a majority of optogenetic approaches use binary on/off control schemes. Here, we extend the optogenetic toolset by developing a neuromodulatory approach using a rationale-based design to generate a Gi-coupled, optically sensitive, mu-opioid-like receptor, which we term opto-MOR. We demonstrate that opto-MOR engages canonical mu-opioid signaling through inhibition of adenylyl cyclase, activation of MAPK and G protein-gated inward rectifying potassium (GIRK) channels and internalizes with kinetics similar to that of the mu-opioid receptor. To assess in vivo utility, we expressed a Cre-dependent viral opto-MOR in RMTg/VTA GABAergic neurons, which led to a real-time place preference. In contrast, expression of opto-MOR in GABAergic neurons of the ventral pallidum hedonic cold spot led to real-time place aversion. This tool has generalizable application for spatiotemporal control of opioid signaling and, furthermore, can be used broadly for mimicking endogenous neuronal inhibition pathways. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Can transcranial electrical stimulation improve learning difficulties in atypical brain development? A future possibility for cognitive training☆

    PubMed Central

    Krause, Beatrix; Cohen Kadosh, Roi

    2013-01-01

    Learning difficulties in atypical brain development represent serious obstacles to an individual's future achievements and can have broad societal consequences. Cognitive training can improve learning impairments only to a certain degree. Recent evidence from normal and clinical adult populations suggests that transcranial electrical stimulation (TES), a portable, painless, inexpensive, and relatively safe neuroenhancement tool, applied in conjunction with cognitive training can enhance cognitive intervention outcomes. This includes, for instance, numerical processing, language skills and response inhibition deficits commonly associated with profound learning difficulties and attention-deficit hyperactivity disorder (ADHD). The current review introduces the functional principles, current applications and promising results, and potential pitfalls of TES. Unfortunately, research in child populations is limited at present. We suggest that TES has considerable promise as a tool for increasing neuroplasticity in atypically developing children and may be an effective adjunct to cognitive training in clinical settings if it proves safe. The efficacy and both short- and long-term effects of TES on the developing brain need to be critically assessed before it can be recommended for clinical settings. PMID:23770059

  5. GeneTools--application for functional annotation and statistical hypothesis testing.

    PubMed

    Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid

    2006-10-24

    Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster, testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get the most recently available data. Data submitted by the user are stored in the database, where they can easily be updated, shared between users, and exported in various formats. GeneTools provides three different tools: i) NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch-search mode. ii) GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest. These user-defined GO annotations can be used in further analysis or exported for public distribution. iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool to do so, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation and intersecting target-target situation). An important additional function is an evidence-code filter that allows users to select the GO annotations for the analysis.
GeneTools is the first "all in one" annotation tool, providing users with rapid extraction of highly relevant gene annotation data for, e.g., thousands of genes or clones at once. It allows a user to define and archive new GO annotations, and it supports hypothesis testing related to GO category representations. GeneTools is freely available at www.genetools.no
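    The category-representation test behind tools like eGOn is commonly a hypergeometric (Fisher-type) enrichment test: given N annotated genes of which K carry a GO term, how likely is it that a list of n genes contains k or more of them by chance? A standard-library sketch (the counts are invented, and this is a generic illustration rather than eGOn's exact statistic):

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) when drawing n genes without replacement from N genes,
    of which K are annotated with the GO term of interest."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical counts: 1000 genes on the array, 50 carrying the term,
# and a 20-gene cluster containing 5 of them (expected by chance: 1).
p = hypergeom_enrichment_p(N=1000, K=50, n=20, k=5)
print(f"{p:.2e}")
```

    In practice such p-values are computed per GO category and corrected for multiple testing before any category is reported as over-represented.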

  6. KINEROS2-AGWA: Model Use, Calibration, and Validation

    NASA Technical Reports Server (NTRS)

    Goodrich, D C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R..

    2013-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.
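    The kinematic overland-flow formulation described above can be illustrated with a minimal explicit upwind finite-difference scheme for dh/dt + d(alpha*h^m)/dx = r, where h is flow depth and r is rainfall excess. This is a didactic sketch, not KINEROS code; all parameter values are invented and uncalibrated.

```python
def kinematic_wave(h, dx, dt, alpha, m, rain_excess, steps):
    """Advance flow depth h (m) on a 1-D plane with an explicit upwind
    scheme; the upstream boundary contributes zero inflow."""
    for _ in range(steps):
        q = [alpha * hi**m for hi in h]          # discharge per unit width
        new_h = [0.0] * len(h)
        for i in range(len(h)):
            q_up = q[i - 1] if i > 0 else 0.0    # upwind (upstream) flux
            new_h[i] = max(h[i] - dt / dx * (q[i] - q_up)
                           + dt * rain_excess, 0.0)
        h = new_h
    return h

# 10-node hillslope, 1 m spacing, constant rainfall excess of 10 mm/h,
# Manning-type exponent m = 5/3; depths build up toward the downslope end.
h0 = [0.0] * 10
h = kinematic_wave(h0, dx=1.0, dt=0.5, alpha=1.0, m=5 / 3,
                   rain_excess=10e-3 / 3600, steps=1200)
print([round(v * 1000, 2) for v in h])  # depths in mm
```

    KINEROS additionally couples this routing to a physically based infiltration model (so r varies in time and space) and cascades plane elements into channel elements, but the numerical core is of this kinematic-wave type.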

  7. Bioinformatics education dissemination with an evolutionary problem solving perspective.

    PubMed

    Jungck, John R; Donovan, Samuel S; Weisstein, Anton E; Khiripet, Noppadon; Everse, Stephen J

    2010-11-01

    Bioinformatics is central to biology education in the 21st century. With the generation of terabytes of data per day, the application of computer-based tools to stored and distributed data is fundamentally changing research and its application to problems in medicine, agriculture, conservation and forensics. In light of this 'information revolution,' undergraduate biology curricula must be redesigned to prepare the next generation of informed citizens as well as those who will pursue careers in the life sciences. The BEDROCK initiative (Bioinformatics Education Dissemination: Reaching Out, Connecting and Knitting together) has fostered an international community of bioinformatics educators. The initiative's goals are to: (i) Identify and support faculty who can take leadership roles in bioinformatics education; (ii) Highlight and distribute innovative approaches to incorporating evolutionary bioinformatics data and techniques throughout undergraduate education; (iii) Establish mechanisms for the broad dissemination of bioinformatics resource materials and teaching models; (iv) Emphasize phylogenetic thinking and problem solving; and (v) Develop and publish new software tools to help students develop and test evolutionary hypotheses. Since 2002, BEDROCK has offered more than 50 faculty workshops around the world, published many resources and supported an environment for developing and sharing bioinformatics education approaches. The BEDROCK initiative builds on the established pedagogical philosophy and academic community of the BioQUEST Curriculum Consortium to assemble the diverse intellectual and human resources required to sustain an international reform effort in undergraduate bioinformatics education.

  8. KINEROS2/AGWA: Model use, calibration and validation

    USGS Publications Warehouse

    Goodrich, D.C.; Burns, I.S.; Unkrich, C.L.; Semmens, Darius J.; Guertin, D.P.; Hernandez, M.; Yatheendradas, S.; Kennedy, Jeffrey R.; Levick, Lainie R.

    2012-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

  9. Minimally invasive photopolymerization in intervertebral disc tissue cavities

    NASA Astrophysics Data System (ADS)

    Schmocker, Andreas M.; Khoushabi, Azadeh; Gantenbein-Ritter, Benjamin; Chan, Samantha; Bonél, Harald Marcel; Bourban, Pierre-Etienne; Mânson, Jan Anders; Schizas, Constantin; Pioletti, Dominique; Moser, Christophe

    2014-03-01

    Photopolymerized hydrogels are commonly used for a broad range of biomedical applications. As long as the polymer volume is accessible, gels can easily be hardened using light illumination. In clinics, however, especially in minimally invasive surgery, it becomes highly challenging to control photopolymerization. The ratios between polymerization volume and radiating surface area are several orders of magnitude higher than in ex-vivo settings, and tissue scattering occurs and influences the reaction. We developed a Monte Carlo model for photopolymerization that takes into account the solid/liquid phase changes, moving solid/liquid boundaries and refraction at these boundaries, as well as tissue scattering in arbitrarily designable tissue cavities. The model provides a tool to tailor both the light probe and the scattering/absorption properties of the photopolymer for applications such as medical implants or tissue replacements. Based on the simulations, we have previously shown that adding scattering additives to the liquid monomer considerably increases the photopolymerized volume. In this study, we used bovine intervertebral disc cavities, created by enzyme digestion, as a model for spinal degeneration to study photopolymerization in vitro. Using a custom-designed probe, hydrogels were injected and photopolymerized. Magnetic resonance imaging (MRI) and visual inspection were employed to assess the photopolymerization outcomes. The results provide insights for the development of novel endoscopic light-scattering polymerization probes, paving the way for a new generation of implantable hydrogels.
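A toy version of the kind of Monte Carlo photon transport the abstract refers to can illustrate why dose decays with depth in a scattering medium. This sketch is not the authors' model (it omits phase changes, moving boundaries, and refraction): photons take exponentially distributed steps, scatter isotropically, and deposit energy on a 1-D depth grid as a proxy for polymerization dose; all coefficients are illustrative.

```python
import math
import random

def depth_dose(n_photons=20000, mu_s=5.0, mu_a=1.0, depth=2.0, nbins=20, seed=1):
    """Tally absorbed-photon counts per depth bin for a slab of
    thickness `depth`, scattering coefficient mu_s and absorption
    coefficient mu_a (per unit length), illuminated at z = 0."""
    random.seed(seed)
    mu_t = mu_s + mu_a
    dose = [0.0] * nbins
    for _ in range(n_photons):
        z, cos_t = 0.0, 1.0                        # photon enters heading +z
        while True:
            step = -math.log(1.0 - random.random()) / mu_t  # free path
            z += cos_t * step
            if z < 0.0 or z > depth:               # escaped the slab
                break
            if random.random() < mu_a / mu_t:      # absorbed: deposit energy
                dose[min(int(z / depth * nbins), nbins - 1)] += 1.0
                break
            cos_t = 2.0 * random.random() - 1.0    # isotropic rescatter
    return dose

dose_profile = depth_dose()
```

Raising mu_s in this toy spreads deposition laterally and near the surface, which is the qualitative effect the scattering additives exploit.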

  10. Numerical study on the electromechanical behavior of dielectric elastomer with the influence of surrounding medium

    NASA Astrophysics Data System (ADS)

    Jia; Lu

    2016-01-01

    The considerable electrically induced shape change, together with light weight, high efficiency, and low cost, makes dielectric elastomer a promising soft active material for actuators in broad applications. Although a number of prototype devices have been demonstrated in the past few years, further development of this technology requires adequate analytical and numerical tools; in particular, previous theoretical studies have generally neglected the influence of the surrounding medium. Because of the large deformation and nonlinear equations of state involved, the finite element method (FEM) is the natural choice; however, the few available formulations rely on homemade codes, which are inconvenient to implement. The aim of this work is to present a numerical approach using the commercial FEM package COMSOL to investigate the nonlinear response of dielectric elastomer under electric stimulation. The influence of the surrounding free space on the electric field is analyzed, and the corresponding electric force is taken into account through an electric surface traction on the surrounding edge. By employing the Maxwell stress tensor as the actuation pressure, the mechanical and electric governing equations for dielectric elastomer are coupled and then solved simultaneously with the Gent strain-energy model to obtain the electrically induced large deformation as well as the electromechanical instability. The finite element implementation presented here may provide a powerful computational tool to help design and optimize engineering applications of dielectric elastomer.
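The electromechanical instability the abstract mentions can be illustrated without FEM for the textbook case of an ideal dielectric membrane in equal-biaxial stretch: equating the Gent-model stress to the Maxwell stress gives a normalized voltage-stretch curve whose first peak marks pull-in. This numerical sketch is that homogeneous special case, not the paper's COMSOL model, and the J_lim value is illustrative.

```python
import math

def norm_voltage(lam, J_lim=120.0):
    """Normalized voltage v = (V/H)*sqrt(eps/mu) needed to hold an
    ideal dielectric membrane at equal-biaxial stretch `lam`, using the
    Gent strain-energy model with stretch limit J_lim."""
    I1 = 2.0 * lam**2 + lam**-4
    stress = (lam**2 - lam**-4) / (1.0 - (I1 - 3.0) / J_lim)  # Gent stress / mu
    return math.sqrt(stress) / lam**2   # Maxwell stress balance, E = V*lam^2/H

def critical_stretch(J_lim=120.0):
    """Scan the voltage-stretch curve and return its first (pull-in) peak."""
    lams = [1.0 + 0.001 * i for i in range(1, 2000)]
    vs = [norm_voltage(l, J_lim) for l in lams]
    i = vs.index(max(vs[:1000]))        # peak before the stiffening branch
    return lams[i], vs[i]

# pull-in occurs near lam ~ 1.26 (the classic neo-Hookean result is 2**(1/3))
lam_c, v_c = critical_stretch()
```

Past this peak no equilibrium exists at higher voltage, so the membrane thins catastrophically; the Gent stretch limit is what lets models also capture the stiffening branch beyond it.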

  11. RFID in the healthcare supply chain: usage and application.

    PubMed

    Kumar, Sameer; Swanson, Eric; Tran, Thuy

    2009-01-01

    The purposes of this study are, first, to determine the most efficient and cost-effective portions of the healthcare supply chain in which radio frequency identification (RFID) devices can be implemented; second, to provide specific examples of RFID implementation and show how these business applications will add to the effectiveness of the healthcare supply chain; and third, to describe the current state of RFID technology and to give practical information for managers in the healthcare sector to make sound decisions about the possible implementation of RFID technology within their organizations. Healthcare industry literature was reviewed, and examples of specific instances of RFID implementation were examined using an integrated simulation model developed with Excel, @Risk and Visio software tools. Analysis showed that the cost of implementing current RFID technology is too expensive for broad and sweeping implementation within the healthcare sector at this time. However, several example applications have been identified in which this technology can be effectively leveraged in a cost-effective way. This study shows that RFID technology has come a long way in the recent past and has potential to improve healthcare sector productivity and efficiency. Implementation by large companies such as Wal-Mart has helped make the technology much more economical in its per-unit cost as well as its supporting equipment and training costs. The originality of this study lies in the fact that few practical and pragmatic approaches have been taken within the academic literature toward the implementation of RFID in the healthcare supply chain. Much of the research has focused on specific companies or portions of the supply chain rather than the entire supply chain, and many papers have discussed a future supply chain that is heavily dependent on advances in RFID technology. A few viable applications of RFID technology in the healthcare supply chain are presented, along with how the current state of the technology limits its broad use and implementation in the healthcare industry.
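A Monte Carlo cost screen in the spirit of the authors' Excel/@Risk simulation can be written in a few lines. Everything here is hypothetical (the dollar figures, the uniform distributions, and the function name are illustrative, not the study's model); the point is the mechanics of sampling uncertain costs and savings to estimate whether tagging pays off.

```python
import random

def rfid_npv_samples(n=10000, items_per_year=200000, years=5,
                     fixed_cost=500000.0, seed=7):
    """Sample the (undiscounted) net value of an RFID rollout under
    uncertain per-tag cost and per-item labor savings."""
    random.seed(seed)
    samples = []
    for _ in range(n):
        tag_cost = random.uniform(0.05, 0.30)   # $ per tag (hypothetical)
        saving = random.uniform(0.10, 0.40)     # $ labor saved per item
        annual = items_per_year * (saving - tag_cost)
        samples.append(annual * years - fixed_cost)
    return samples

npvs = rfid_npv_samples()
p_profit = sum(v > 0 for v in npvs) / len(npvs)
# with these hypothetical ranges the fixed cost is never recovered,
# mirroring the abstract's "too expensive for broad implementation"
```

Shrinking the fixed cost or the per-tag price in this toy is how one would locate the break-even point that falling tag prices eventually cross.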

  12. Force-controlled manipulation of single cells: from AFM to FluidFM.

    PubMed

    Guillaume-Gentil, Orane; Potthoff, Eva; Ossola, Dario; Franz, Clemens M; Zambelli, Tomaso; Vorholt, Julia A

    2014-07-01

    The ability to perturb individual cells and to obtain information at the single-cell level is of central importance for addressing numerous biological questions. Atomic force microscopy (AFM) offers great potential for this prospering field. Traditionally used as an imaging tool, more recent developments have extended the variety of cell-manipulation protocols. Fluidic force microscopy (FluidFM) combines AFM with microfluidics via microchanneled cantilevers with nano-sized apertures. The crucial element of the technology is the connection of the hollow cantilevers to a pressure controller, allowing their operation in liquid as force-controlled nanopipettes under optical control. Proof-of-concept studies demonstrated a broad spectrum of single-cell applications including isolation, deposition, adhesion and injection in a range of biological systems. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. [Model-based biofuels system analysis: a review].

    PubMed

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  14. Evolution of Cardiac Biomodels from Computational to Therapeutics.

    PubMed

    Rathinam, Alwin Kumar; Mokhtar, Raja Amin Raja

    2016-08-23

    Biomodeling the human anatomy in exact structure and size is an exciting field of medical science. Utilizing data from various medical imaging modalities, the data of an anatomical structure can be extracted and converted into a three-dimensional virtual biomodel; thereafter, a physical biomodel can be generated using rapid prototyping machines. Here, we review the utilization of this technology and provide some guidelines for developing biomodels of cardiac structures. Cardiac biomodels provide insights for cardiothoracic surgeons, cardiologists, and patients alike. Additionally, the technology may have future utility in tissue engineering, robotic surgery, or routine hospital use as a diagnostic and therapeutic tool for cardiovascular disease (CVD). Given the broad areas of application of cardiac biomodels, attention should be given to further research and development of their potential.

  15. Lactoferrin-derived Peptides Active towards Influenza: Identification of Three Potent Tetrapeptide Inhibitors.

    PubMed

    Scala, Maria Carmina; Sala, Marina; Pietrantoni, Agostina; Spensiero, Antonia; Di Micco, Simone; Agamennone, Mariangela; Bertamino, Alessia; Novellino, Ettore; Bifulco, Giuseppe; Gomez-Monterrey, Isabel M; Superti, Fabiana; Campiglia, Pietro

    2017-09-06

    Bovine lactoferrin is a biglobular, multifunctional, iron-binding glycoprotein that plays an important role in innate immunity against infections. We have previously demonstrated that selected peptides from the bovine lactoferrin C-lobe are able to prevent both Influenza virus hemagglutination and cell infection. To investigate further the ability of lactoferrin-derived peptides to inhibit Influenza virus infection, in this study we identified new bovine lactoferrin C-lobe-derived sequences, and the corresponding synthetic peptides were synthesized and assayed to check their ability to prevent viral hemagglutination and infection. We identified three tetrapeptides endowed with broad anti-Influenza activity and able to inhibit viral infection at femto- to picomolar concentrations. Our data indicate that these peptides may constitute a non-toxic tool for potential applications as anti-Influenza therapeutics.

  16. Pipe dream? Envisioning a grassroots Python ecosystem of open, common software tools and data access in support of river and coastal biogeochemical research (Invited)

    NASA Astrophysics Data System (ADS)

    Mayorga, E.

    2013-12-01

    Practical, problem oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen a large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (River Tools, Taudem, etc), and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. 
Collaborative development of such tools can provide the additional advantage of enhancing cohesion and communication across specific research areas, and reducing research obstacles in a range of disciplines.

  17. InP (Indium Phosphide): Into the future

    NASA Technical Reports Server (NTRS)

    Brandhorst, Henry W., Jr.

    1989-01-01

    A major industry is beginning to form around indium phosphide and its potential applications. Key to these applications are high speed and radiation tolerance; however, the high cost of indium phosphide may inhibit progress. The broad applicability of indium phosphide to many devices is discussed, with an emphasis on photovoltaics. Major attention is devoted to radiation tolerance and to means of reducing the cost of devices. Some of the approaches applicable to solar cells may also be relevant to other devices. The intent is to display the impact of visionary leadership in the field and to convey the directions and broad applicability of indium phosphide.

  18. Annual Research Briefs, 2004: Center for Turbulence Research

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Mansour, Nagi N.

    2004-01-01

    This report contains the 2004 annual progress reports of the Research Fellows and students of the Center for Turbulence Research in its eighteenth year of operation. Since its inception in 1987, the objective of the CTR has been to advance the physical understanding of turbulent flows and the development of physics-based predictive tools for engineering analysis and turbulence control. Turbulence is ubiquitous in nature and in engineering devices. The studies at CTR have been motivated by applications where turbulence effects are significant; these include a broad range of technical areas such as planetary boundary layers, formation of planets, solar convection, magnetohydrodynamics, environmental and ecological systems, aerodynamic noise, propulsion systems and high-speed transportation. Numerical simulation has been the predominant research tool at CTR, which has required a critical mass of researchers in numerical analysis and computer science in addition to core disciplines such as applied mathematics, chemical kinetics and fluid mechanics. Maintaining and promoting this interdisciplinary culture has been a hallmark of CTR and has been responsible for the realization of the results of its basic research in applications. The first group of reports in this volume is directed toward the development, analysis and application of novel numerical methods for flow simulations. Development of methods for large eddy simulation of complex flows has been a central theme in this group. The second group is concerned with turbulent combustion, scalar transport and multi-phase flows. The final group is devoted to geophysical turbulence, where the problem of solar convection has recently been a focus of considerable attention at CTR.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shenoy, G. K.; Rohlsberger, R.; X-Ray Science Division

    From the beginning of its discovery the Moessbauer effect has continued to be one of the most powerful tools with broad applications in diverse areas of science and technology. With the advent of synchrotron radiation sources such as the Advanced Photon Source (APS), the European Synchrotron Radiation Facility (ESRF) and the Super Photon Ring-8 (SPring-8), the tool has enlarged its scope and delivered new capabilities. The popular techniques most generally used in the fields of materials physics, chemical physics, geoscience, and biology are hyperfine spectroscopy via elastic nuclear forward scattering (NFS), vibrational spectroscopy via nuclear inelastic scattering (NRIXS), and, to a lesser extent, diffusional dynamics from quasielastic nuclear forward scattering (QNFS). As we look ahead, new storage rings with enhanced brilliance, such as PETRA-III under construction at DESY, Hamburg, and PEP-III in its early design stage at SLAC, Stanford, will provide new and unique science opportunities. In the next two decades, x-ray free-electron lasers (XFELs), based both on self-amplified spontaneous emission (SASE-XFELs) and on a seed (SXFELs), with unique time structure, coherence, and a five to six orders of magnitude higher average brilliance, will truly revolutionize nuclear resonance applications. This overview is intended to briefly address the unique radiation characteristics of new sources on the horizon and to provide a glimpse of scientific prospects and dreams in the nuclear resonance field from the new radiation sources. We anticipate expanded nuclear resonance research activity, with applications such as spin and phonon mapping of single nanostructures and their assemblies, interfaces, and surfaces; spin dynamics; nonequilibrium dynamics; photochemical reactions; excited-state spectroscopy; and nonlinear phenomena.

  20. Computational Electronics and Electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeFord, J.F.

    The Computational Electronics and Electromagnetics thrust area is a focal point for computer modeling activities in electronics and electromagnetics in the Electronics Engineering Department of Lawrence Livermore National Laboratory (LLNL). Traditionally, they have focused their efforts in technical areas of importance to existing and developing LLNL programs, and this continues to form the basis for much of their research. A relatively new and increasingly important emphasis for the thrust area is the formation of partnerships with industry and the application of their simulation technology and expertise to the solution of problems faced by industry. The activities of the thrust area fall into three broad categories: (1) the development of theoretical and computational models of electronic and electromagnetic phenomena, (2) the development of useful and robust software tools based on these models, and (3) the application of these tools to programmatic and industrial problems. In FY-92, they worked on projects in all of the areas outlined above. The object of their work on numerical electromagnetic algorithms continues to be the improvement of time-domain algorithms for electromagnetic simulation on unstructured conforming grids. The thrust area is also investigating various technologies for conforming-grid mesh generation to simplify the application of their advanced field solvers to design problems involving complicated geometries. They are developing a major code suite based on the three-dimensional (3-D), conforming-grid, time-domain code DSI3D. They continue to maintain and distribute the 3-D, finite-difference time-domain (FDTD) code TSAR, which is installed at several dozen university, government, and industry sites.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Passel, Steven, E-mail: Steven.vanpassel@uhasselt.be; University of Antwerp, Department Bioscience Engineering, Groenenborgerlaan 171, 2020 Antwerp; Meul, Marijke

    Sustainability assessment is needed to build sustainable farming systems. A broad range of sustainability concepts, methodologies and applications already exists. They differ in level, focus, orientation, measurement, scale, presentation and intended end-users. In this paper we illustrate that a smart combination of existing methods with different levels of application can make sustainability assessment more profound, and that it can broaden the insights of different end-user groups. An overview of sustainability assessment tools on different levels and for different end-users shows the complementarities and the opportunities of using different methods. In a case study, a combination of the sustainable value approach (SVA) and MOTIFS is used to perform a sustainability evaluation of farming systems in Flanders. SVA is used to evaluate sustainability at sector level, and is especially useful to support policy makers, while MOTIFS is used to support and guide farmers towards sustainability at farm level. The combined use of the two methods with complementary goals can widen the insights of both farmers and policy makers, without losing the particularities of the different approaches. To stimulate and support further research and applications, we propose guidelines for multilevel and multi-user sustainability assessments. Highlights: ► We give an overview of sustainability assessment tools for agricultural systems. ► SVA and MOTIFS are used to evaluate the sustainability of dairy farming in Flanders. ► Combination of methods with different levels broadens the insights of different end-user groups. ► We propose guidelines for multilevel and multi-user sustainability assessments.

  2. Sex differences and within-family associations in the broad autism phenotype.

    PubMed

    Klusek, Jessica; Losh, Molly; Martin, Gary E

    2014-02-01

    While there is a strong sex bias in the presentation of autism, it is unknown whether this bias is also present in subclinical manifestations of autism among relatives, known as the broad autism phenotype. This study examined this question and investigated patterns of co-occurrence of broad autism phenotype traits within families of individuals with autism. Pragmatic language and personality features of the broad autism phenotype were studied in 42 fathers and 50 mothers of individuals with autism, using direct assessment tools employed in prior family studies of the broad autism phenotype. Higher rates of aloof personality style were detected among fathers, while no sex differences were detected for other broad autism phenotype traits. Within individuals, pragmatic language features were associated with the social personality styles of the broad autism phenotype in mothers but not in fathers. A number of broad autism phenotype features were correlated within spousal pairs. Finally, associations were detected between paternal broad autism phenotype characteristics and the severity of children's autism symptoms in all three domains (social, communication, and repetitive behaviors); mother-child correlations were detected for aspects of communication only. Together, the findings suggest that most features of the broad autism phenotype express comparably in males and females, and they raise specific questions about how such features might inform studies of the genetic basis of autism.

  3. Coordinated Fault-Tolerance for High-Performance Computing Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panda, Dhabaleswar Kumar; Beckman, Pete

    2011-07-28

    With the Coordinated Infrastructure for Fault Tolerance Systems (CIFTS, as the original project came to be called) project, our aim has been to understand and tackle the following broad research questions, the answers to which will help the HEC community analyze and shape the direction of research in the field of fault tolerance and resiliency on future high-end leadership systems. Will availability of global fault information, obtained by fault information exchange between the different HEC software on a system, allow individual system software to better detect, diagnose, and adaptively respond to faults? If fault-awareness is raised throughout the system through fault information exchange, is it possible to get all system software working together to provide more comprehensive end-to-end fault management on the system? What are the missing fault-tolerance features that widely used HEC system software lacks today that would inhibit such software from taking advantage of systemwide global fault information? What are the practical limitations of a systemwide approach for end-to-end fault management based on fault awareness and coordination? What mechanisms, tools, and technologies are needed to bring about fault awareness and coordination of responses on a leadership-class system? What standards, outreach, and community interaction are needed for adoption of the concept of fault awareness and coordination for fault management on future systems? Keeping our overall objectives in mind, the CIFTS team has taken a parallel fourfold approach. Our central goal was to design and implement a light-weight, scalable infrastructure with a simple, standardized interface to allow communication of fault-related information through the system and facilitate coordinated responses.
This work led to the development of the Fault Tolerance Backplane (FTB) publish-subscribe API specification, together with a reference implementation and several experimental implementations on top of existing publish-subscribe tools. We enhanced the intrinsic fault tolerance capabilities of representative implementations of a variety of key HPC software subsystems and integrated them with the FTB. Targeted software subsystems included MPI communication libraries, checkpoint/restart libraries, resource managers and job schedulers, and system monitoring tools. Leveraging the aforementioned infrastructure, as well as developing and utilizing additional tools, we have examined issues associated with expanded, end-to-end fault response from both system and application viewpoints. From the standpoint of system operations, we have investigated log and root-cause analysis, anomaly detection and fault prediction, and generalized notification mechanisms. Our applications work has included libraries for fault-tolerant linear algebra, application frameworks for coupled multiphysics applications, and external frameworks to support monitoring and response for general applications. Our final goal was to engage the high-end computing community to increase awareness of tools and issues around coordinated end-to-end fault management.
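The core pattern behind a fault-tolerance backplane, components publishing fault events into named event spaces that other components subscribe to, can be sketched in a few lines. The class, method, and event names below are illustrative, not the actual FTB API specification:

```python
from collections import defaultdict

class Backplane:
    """Minimal in-process publish-subscribe sketch of the coordination
    pattern the FTB provides (illustrative names, not the FTB API)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, event_space, callback):
        """Register a callback for fault events in one event namespace."""
        self._subs[event_space].append(callback)

    def publish(self, event_space, event):
        """Deliver a fault event to every subscriber of its namespace,
        e.g. a job scheduler watching faults raised by an MPI library."""
        for cb in list(self._subs.get(event_space, [])):
            cb(event)

# a scheduler reacts to a process failure announced by the MPI layer
bp = Backplane()
seen = []
bp.subscribe("mpi.proc_down", seen.append)
bp.publish("mpi.proc_down", {"node": "n042", "severity": "fatal"})
```

Decoupling publishers from subscribers this way is what lets each subsystem stay fault-aware without hard-coding knowledge of every other subsystem, which is the coordination property the project set out to provide.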

  4. Development of a Self-Assessment Tool to Facilitate Decision-Making in Choosing a Long Term Care Administration Major

    ERIC Educational Resources Information Center

    Johs-Artisensi, Jennifer L.; Olson, Douglas M.; Nahm, Abraham Y.

    2016-01-01

    Long term care administrators need a broad base of knowledge, skills, and interests to provide leadership and be successful in managing a fiscally responsible, quality long term care organization. Researchers developed a tool to help students assess whether a long term care administration major is a compatible fit. With input from professionals in…

  5. An aquatic multiscale assessment and planning framework approach—forest plan revision case study

    Treesearch

    Kerry Overton; Ann D. Carlson; Cynthia Tait

    2010-01-01

    The Aquatic Multiscale Assessment and Planning Framework is a Web-based decision-support tool developed to assist aquatic practitioners in managing fisheries and watershed information. This tool, or framework, was designed to assist resource assessments and planning efforts from the broad scale to the fine scale, to document procedures, and to link directly to relevant...

  6. Analyzing Collaborative Learning Processes Automatically: Exploiting the Advances of Computational Linguistics in Computer-Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Rose, Carolyn; Wang, Yi-Chia; Cui, Yue; Arguello, Jaime; Stegmann, Karsten; Weinberger, Armin; Fischer, Frank

    2008-01-01

    In this article we describe the emerging area of text classification research focused on the problem of collaborative learning process analysis both from a broad perspective and more specifically in terms of a publicly available tool set called TagHelper tools. Analyzing the variety of pedagogically valuable facets of learners' interactions is a…

  7. Working More Productively: Tools for Administrative Data

    PubMed Central

    Roos, Leslie L; Soodeen, Ruth-Ann; Bond, Ruth; Burchill, Charles

    2003-01-01

    Objective This paper describes a web-based resource () that contains a series of tools for working with administrative data. This work in knowledge management represents an effort to document, find, and transfer concepts and techniques, both within the local research group and to a more broadly defined user community. Concepts and associated computer programs are made as “modular” as possible to facilitate easy transfer from one project to another. Study Setting/Data Sources Tools to work with a registry, longitudinal administrative data, and special files (survey and clinical) from the Province of Manitoba, Canada in the 1990–2003 period. Data Collection Literature review and analyses of web site utilization were used to generate the findings. Principal Findings The Internet-based Concept Dictionary and SAS macros developed in Manitoba are being used in a growing number of research centers. Nearly 32,000 hits from more than 10,200 hosts in a recent month demonstrate broad interest in the Concept Dictionary. Conclusions The tools, taken together, make up a knowledge repository and research production system that aid local work and have great potential internationally. Modular software provides considerable efficiency. The merging of documentation and researcher-to-researcher dissemination keeps costs manageable. PMID:14596394

  8. Experience in the use of social media in medical and health education. Contribution of the IMIA Social Media Working Group.

    PubMed

    Paton, C; Bamidis, P D; Eysenbach, G; Hansen, M; Cabrer, M

    2011-01-01

    Social media are online tools that allow collaboration and community building. Succinctly, they can be described as applications where "users add value". This paper aims to show how five educators have used social media tools in medical and health education to attempt to add value to the education they provide. We conducted a review of the literature about the use of social media tools in medical and health education. Each of the authors reported on their use of social media in their educational projects and collaborated on a discussion of the advantages and disadvantages of this approach to delivering educational projects. We found little empirical evidence to support the use of social media tools in medical and health education. Social media are, however, a rapidly evolving range of tools, websites and online experiences and it is likely that the topic is too broad to draw definitive conclusions from any particular study. As practitioners in the use of social media, we have recognised how difficult it is to create evidence of effectiveness and have therefore presented only our anecdotal opinions based on our personal experiences of using social media in our educational projects. The authors feel confident in recommending that other educators use social media in their educational projects. Social media appear to have unique advantages over non-social educational tools. The learning experience appears to be enhanced by the ability of students to virtually build connections, make friends and find mentors. Creating a scientific analysis of why these connections enhance learning is difficult, but anecdotal and preliminary survey evidence appears to be positive and our experience reflects the hypothesis that learning is, at heart, a social activity.

  9. 10 CFR 33.15 - Requirements for the issuance of a Type C specific license of broad scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF BROAD SCOPE FOR BYPRODUCT MATERIAL Specific Licenses of Broad Scope § 33.15 Requirements for the... this chapter; and (b) The applicant submits a statement that byproduct material will be used only by... bachelor level, or equivalent training and experience, in the physical or biological sciences or in...

  10. A GeoServices Infrastructure for Near-Real-Time Access to Suomi NPP Satellite Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.; Hao, W.; Chettri, S.

    2012-12-01

The new Suomi National Polar-orbiting Partnership (NPP) satellite extends NASA's moderate-resolution, multispectral observations with a suite of powerful imagers and sounders to support a broad array of research and applications. However, NPP data products consist of a complex set of data and metadata files in highly specialized formats, which NPP's operational ground segment delivers to users only with several hours' delay. This severely limits their use in critical applications such as weather forecasting, emergency / disaster response, search and rescue, and other activities that require near-real-time access to satellite observations. Alternative approaches, based on distributed Direct Broadcast facilities, can reduce the delay in NPP data delivery from hours to minutes, and can make products more directly usable by practitioners in the field. To assess and fulfill this potential, we are developing a suite of software that couples Direct Broadcast data feeds with a streamlined, scalable processing chain and geospatial Web services, so as to permit many more time-sensitive applications to use NPP data. The resulting geoservices infrastructure links a variety of end-user tools and applications to NPP data from different sources, and to other rapidly-changing geospatial data. By using well-known, standard software interfaces (such as OGC Web Services or OPeNDAP), this infrastructure serves a variety of end-user analysis and visualization tools, giving them access to datasets of arbitrary size and resolution and allowing them to request and receive tailored products on demand. The standards-based approach may also streamline data sharing among independent satellite receiving facilities, thus helping them to interoperate in providing frequent, composite views of continent-scale or global regions.
To enable others to build similar or derived systems, the service components we are developing (based in part on the Community Satellite Processing Package (CSPP) from the University of Wisconsin and the International Polar-Orbiter Processing Package (IPOPP) from NASA) are being released as open source software. Furthermore, they are configured to operate in a cloud computing environment, so as to allow even small organizations to process and serve NPP data without large hardware investments; and to maintain near-real-time performance cost-effectively by growing and shrinking their use of computing resources to meet large, rapid fluctuations in end-user demand, data availability, and processing needs. (This is especially important for polar-orbiting satellites like NPP, which pass within range of a receiver only a few times each day.) We will discuss the design of the infrastructure, highlight its capabilities, and sketch its potential to facilitate broad access to satellite data processing and visualization, and to enhance near-real-time applications via distributed NPP data streams.
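As an illustration of the standards-based access described above, the following sketch builds an OGC WMS 1.3.0 GetMap request URL. The endpoint and layer name are hypothetical, but the query parameters follow the WMS 1.3.0 specification:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL for a single layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",                             # server's default style
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),   # extent in the CRS axis order
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name for a composited satellite product.
url = wms_getmap_url("https://example.org/wms", "viirs_true_color",
                     (25.0, -125.0, 50.0, -65.0))
```

Because any WMS-capable client can issue this same request, visualization tools gain access to the imagery without knowing anything about the processing chain behind the endpoint.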

  11. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.

  12. Ambient Ionization Mass Spectrometry for Cancer Diagnosis and Surgical Margin Evaluation

    PubMed Central

    Ifa, Demian R.; Eberlin, Livia S.

    2017-01-01

Background There is a clinical need for new technologies that would enable rapid disease diagnosis based on diagnostic molecular signatures. Ambient ionization mass spectrometry has revolutionized the means by which molecular information can be obtained from tissue samples in real time and with minimal sample pretreatment. New developments in ambient ionization techniques applied to clinical research suggest that ambient ionization mass spectrometry will soon become a routine medical tool for tissue diagnosis. Content This review summarizes the main developments in ambient ionization techniques applied to tissue analysis, with a focus on desorption electrospray ionization mass spectrometry, probe electrospray ionization, touch spray, and rapid evaporative ionization mass spectrometry. We describe their applications to human cancer research and surgical margin evaluation, highlighting integrated approaches tested for ex vivo and in vivo human cancer tissue analysis. We also discuss the challenges for clinical implementation of these tools and offer perspectives on the future of the field. Summary A variety of studies have showcased the value of ambient ionization mass spectrometry for rapid and accurate cancer diagnosis. Small molecules have been identified as potential diagnostic biomarkers, including metabolites, fatty acids, and glycerophospholipids. Statistical analysis allows tissue discrimination, with high accuracy rates (>95%) being common. This young field has challenges to overcome before it is ready to be broadly accepted as a medical tool for cancer diagnosis. Growing research in new, integrated ambient ionization mass spectrometry technologies and the ongoing improvements in the existing tools make this field very promising for future translation into the clinic. PMID:26555455

  13. Cryogenic Boil-Off Reduction System

    NASA Astrophysics Data System (ADS)

    Plachta, David W.; Guzik, Monica C.

    2014-03-01

A computational model of the cryogenic boil-off reduction system being developed by NASA as part of the Cryogenic Propellant Storage and Transfer technology maturation project has been applied to a range of propellant storage tank sizes for high-performing in-space cryogenic propulsion applications. This effort focuses on the scaling of multi-layer insulation (MLI), cryocoolers, broad area cooling shields, radiators, solar arrays, and tanks for liquid hydrogen propellant storage tanks ranging from 2 to 10 m in diameter. Component scaling equations were incorporated into the Cryogenic Analysis Tool, a spreadsheet-based tool used to perform system-level parametric studies. The primary addition in this updated tool is the integration of a scaling method for reverse turbo-Brayton cycle cryocoolers, as well as the development and inclusion of Self-Supporting Multi-Layer Insulation. Mass, power, and sizing relationships are traded parametrically to establish the loiter period beyond which applying this boil-off reduction system reduces mass. The projected benefit compares passive thermal control to active thermal control, where active thermal control is evaluated for reduced boil-off with a 90 K shield, zero boil-off with a single heat interception stage at the tank wall, and zero boil-off with a second interception stage at a 90 K shield. Parametric studies show a benefit over passive storage at loiter durations under one month, in addition to showing a benefit for two-stage zero boil-off in terms of reducing power and mass as compared to single-stage zero boil-off. Furthermore, active cooling reduces the effect of varied multi-layer insulation performance, which, historically, has been shown to be significant.
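The basic relation underlying any boil-off trade is that the tank's heat leak evaporates propellant at a rate set by the latent heat of vaporization. A simplified sketch follows; the 446 kJ/kg latent heat is an approximate handbook value for liquid hydrogen at its normal boiling point, and the 10 W heat leak is an assumed figure, not taken from the paper:

```python
def boil_off_rate_kg_per_s(heat_leak_w, latent_heat_j_per_kg):
    """Steady-state boil-off: evaporated mass flow = heat leak / latent heat."""
    return heat_leak_w / latent_heat_j_per_kg

# Liquid hydrogen latent heat ~446 kJ/kg (approximate handbook value);
# a 10 W net heat leak through the MLI is an assumed example figure.
mdot = boil_off_rate_kg_per_s(10.0, 446e3)   # kg/s
daily_loss_kg = mdot * 86400.0               # kg of propellant lost per day
```

Even a modest heat leak loses on the order of kilograms per day, which is why active cooling becomes mass-competitive with passive storage once loiter durations grow.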

  14. Efficient randomization of biological networks while preserving functional characterization of individual nodes.

    PubMed

    Iorio, Francesco; Bernardo-Faura, Marti; Gobbi, Andrea; Cokelaer, Thomas; Jurman, Giuseppe; Saez-Rodriguez, Julio

    2016-12-20

Networks are popular and powerful tools to describe and model biological processes. Many computational methods have been developed to infer biological networks from literature, high-throughput experiments, and combinations of both. Additionally, a wide range of tools has been developed to map experimental data onto reference biological networks, in order to extract meaningful modules. Many of these methods assess results' significance against null distributions of randomized networks. However, these standard unconstrained randomizations do not preserve the functional characterization of the nodes in the reference networks (i.e. their degrees and connection signs), hence introducing potential biases into the assessment. Building on our previous work on rewiring bipartite networks, we propose a method for rewiring any type of unweighted network. In particular, we formally demonstrate that the problem of rewiring a signed and directed network while preserving its functional connectivity (F-rewiring) reduces to the problem of rewiring two induced bipartite networks. Additionally, we reformulate the lower bound on the number of iterations of the switching algorithm to make it suitable for the F-rewiring of networks of any size. Finally, we present BiRewire3, an open-source Bioconductor package enabling the F-rewiring of any type of unweighted network. We illustrate its application to a case study on the identification of modules from gene expression data mapped on protein interaction networks, and a second one focused on building logic models from more complex signed-directed reference signaling networks and phosphoproteomic data. BiRewire3 is freely available at https://www.bioconductor.org/packages/BiRewire/, and it should have broad application as it allows an efficient and analytically derived statistical assessment of results from any network biology tool.
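The double-edge "switch" that underlies this family of rewiring schemes can be sketched in Python as follows. This is an illustrative re-implementation of the generic switching algorithm on a directed network, not BiRewire3's code (BiRewire3 is an R/Bioconductor package); the example graph is invented:

```python
import random

def switching_rewire(edges, n_swaps, seed=None):
    """Degree-preserving randomization of a directed network.

    Repeatedly applies the double-edge swap of the switching algorithm:
    pick edges (a, b) and (c, d) and replace them with (a, d) and (c, b),
    so every node keeps its exact in- and out-degree.
    """
    rng = random.Random(seed)
    edge_set = set(edges)
    done, attempts = 0, 0
    while done < n_swaps and attempts < 100 * n_swaps:
        attempts += 1
        (a, b), (c, d) = rng.sample(sorted(edge_set), 2)
        # Reject swaps that would create self-loops or duplicate edges.
        if a == d or c == b or (a, d) in edge_set or (c, b) in edge_set:
            continue
        edge_set -= {(a, b), (c, d)}
        edge_set |= {(a, d), (c, b)}
        done += 1
    return list(edge_set)

edges = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3), (2, 4)]
rewired = switching_rewire(edges, 5, seed=42)
```

Because every swap conserves each node's in- and out-degree, null distributions built from such rewired networks control for degree, which is exactly the bias the unconstrained randomizations criticized above fail to remove.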

  15. Sustainable Diagnostic Tools for Site Characterization and Remediation

    NASA Astrophysics Data System (ADS)

    Driver, E. M.; Roll, I. B.; Supowit, S. D.; Halden, R. U.

    2016-12-01

Three submersible diagnostic tools were developed to enable more precise and cost-effective means of sampling environmental waters and assessing remedial strategies. The In Situ Sampler (IS2) and In Situ Sampler for Biphasic Water Monitoring (IS2B), designed for sampling groundwater or simultaneous pore- and surface water, use affordable off-the-shelf solid phase extraction technology, applicable to a broad range of organic and inorganic contaminants. The flow-through design reduces hazardous waste generation, transportation costs, and carbon footprint by 90-98% compared to traditional methods. The IS2 is ideal for dynamic groundwater systems where discrete sampling may fail to capture temporal variations, leading to inaccurate assessment of exposure and risk. A 28-day sampling event in a Cr(VI)-impacted aquifer captured previously undetected tidally-induced fluctuations, while improving the reporting limit 8-fold. The IS2B elucidates contaminant partitioning and bioavailability, and was validated in a wetland-shallow aquifer system with the pesticide fipronil. Concentrations of total fipronil-related compounds were statistically indistinguishable from those determined by conventional techniques (p > 0.2), ranging from 9.9 ± 4.6 to 18.1 ± 4.6 ng/L in surface water and 9.1 ± 3.0 to 12.6 ± 2.1 ng/L in porewater. For groundwater remedial testing, the In Situ Microcosm Array (ISMA) was developed to integrate laboratory column treatability studies with pilot-scale field-testing, thus minimizing costs associated with sequential lab and field analyses. In situ operation maintains (geo)chemical and microbial groundwater parameters often destroyed by extraction and laboratory storage. Onboard effluent capture permits the deployment well to return to monitoring status immediately after instrument removal. All tools employ reusable internal components and may be operated by solar power. Case study results highlight the capabilities and application range of each technology.

  16. Walking the Walk/Talking the Talk: Mission Planning with Speech-Interactive Agents

    NASA Technical Reports Server (NTRS)

    Bell, Benjamin; Short, Philip; Webb, Stewart

    2010-01-01

The application of simulation technology to mission planning and rehearsal has enabled realistic overhead 2-D and immersive 3-D "fly-through" capabilities that can help better prepare tactical teams for conducting missions in unfamiliar locales. For aircrews, detailed terrain data can offer a preview of the relevant landmarks and hazards, and threat models can provide a comprehensive glimpse of potential hot zones and safety corridors. A further extension of the utility of such planning and rehearsal techniques would allow users to perform the radio communications planned for a mission; that is, the air-ground coordination that is critical to the success of missions such as close air support (CAS). Such practice opportunities, while valuable, are limited by the inescapable scarcity of complete mission teams able to gather in space and time during planning and rehearsal cycles. Moreover, using simulated comms with synthetic entities, despite the substantial training and cost benefits, remains an elusive objective. In this paper we report on a solution to this gap that incorporates "synthetic teammates": intelligent software agents that can role-play entities in a mission scenario and communicate in spoken language with users. We employ a fielded mission planning and rehearsal tool so that our focus remains on the experimental objectives of the research rather than on developing a testbed from scratch. Use of this planning tool also helps to validate the approach in an operational system. The result is a demonstration of a mission rehearsal tool that allows aircrew users to not only fly the mission but also practice the verbal communications with air control agencies and tactical controllers on the ground. This work is presented in a CAS mission planning example but has broad applicability across weapons systems, missions, and tactical force compositions.

  17. CRISPR-Cas9D10A Nickase-Assisted Genome Editing in Lactobacillus casei

    PubMed Central

    Song, Xin; Huang, He; Xiong, Zhiqiang

    2017-01-01

Lactobacillus casei has drawn increasing attention as a health-promoting probiotic, but effective genetic manipulation tools are often not available; e.g., single-gene knockout in L. casei still depends on the classic homologous recombination-dependent double-crossover strategy, which is quite labor-intensive and time-consuming. In the present study, a rapid and precise genome editing plasmid, pLCNICK, was established for L. casei genome engineering based on CRISPR-Cas9D10A. In addition to the P23-Cas9D10A and Pldh-sgRNA (single guide RNA) expression cassettes, pLCNICK includes the homologous arms of the target gene as repair templates. The ability and efficiency of chromosomal engineering using pLCNICK were evaluated by in-frame deletions of four independent genes and chromosomal insertion of an enhanced green fluorescent protein (eGFP) expression cassette at the LC2W_1628 locus. The efficiencies associated with in-frame deletions and chromosomal insertion range from 25 to 62%. pLCNICK has proven to be an effective, rapid, and precise tool for genome editing in L. casei, and its potential application in other lactic acid bacteria (LAB) is also discussed in this study. IMPORTANCE The lack of efficient genetic tools has limited the investigation and biotechnological application of many LAB. CRISPR-Cas9D10A nickase-based genome editing in Lactobacillus casei, an important food industrial microorganism, was demonstrated in this study. This genetic tool allows efficient single-gene deletion and insertion to be accomplished by one-step transformation, and the cycle time is reduced to 9 days. It facilitates rapid and precise chromosomal manipulation in L. casei and overcomes some limitations of previous methods. This editing system can serve as a basic technological platform and offers the possibility of starting a comprehensive investigation of L. casei.
As a broad-host-range plasmid, pLCNICK has the potential to be adapted to other Lactobacillus species for genome editing. PMID:28864652

  18. SNP discovery in common bean by restriction-associated DNA (RAD) sequencing for genetic diversity and population structure analysis.

    PubMed

    Valdisser, Paula Arielle M R; Pappas, Georgios J; de Menezes, Ivandilson P P; Müller, Bárbara S F; Pereira, Wendell J; Narciso, Marcelo G; Brondani, Claudio; Souza, Thiago L P O; Borba, Tereza C O; Vianello, Rosana P

    2016-06-01

Researchers have made great advances in the development and application of genomic approaches for common bean, creating opportunities to drive more practical and applicable strategies for sustainable management of this genetic resource in plant breeding. This work provides useful polymorphic single-nucleotide polymorphisms (SNPs) for high-throughput common bean genotyping developed by RAD (restriction site-associated DNA) sequencing. The RAD tags were generated from DNA pooled from 12 common bean genotypes, including breeding lines of different gene pools and market classes. The aligned sequences identified 23,748 putative RAD-SNPs, of which 3357 were adequate for genotyping; 1032 RAD-SNPs with the highest ADT (assay design tool) score are presented in this article. The RAD-SNPs were structurally annotated in different coding (47.00%) and non-coding (53.00%) sequence components of genes. A subset of 384 RAD-SNPs with broad genome distribution was used to genotype a diverse panel of 95 common bean germplasms and revealed a successful amplification rate of 96.6%, showing 73% polymorphic SNPs within the Andean group and 83% in the Mesoamerican group. A slightly increased He (0.161, n = 21) value was estimated for the Andean gene pool, compared to the Mesoamerican group (0.156, n = 74). For the linkage disequilibrium (LD) analysis, from a group of 580 SNPs (289 RAD-SNPs and 291 BARC-SNPs) genotyped for the same set of genotypes, 70.2% were in LD, decreasing to 0.10% in the Andean group and 0.77% in the Mesoamerican group. Haplotype patterns spanning 310 Mb of the genome (60%) were characterized in samples from different origins. However, the haplotype frameworks were under-represented for the Andean (7.85%) and Mesoamerican (5.55%) gene pools separately. In conclusion, RAD sequencing allowed the discovery of hundreds of useful SNPs for broad genetic analysis of common bean germplasm. Going forward, this approach provides an excellent panel of molecular tools for whole-genome analysis, enabling integration with, and better exploration of, common bean breeding practices.

  19. Excitation-scanning hyperspectral imaging microscope

    PubMed Central

    Favreau, Peter F.; Hernandez, Clarissa; Heaster, Tiffany; Alvarez, Diego F.; Rich, Thomas C.; Prabhat, Prashant; Leavesley, Silas J.

    2014-01-01

Hyperspectral imaging is a versatile tool that has recently been applied to a variety of biomedical applications, notably live-cell and whole-tissue signaling. Traditional hyperspectral imaging approaches filter the fluorescence emission over a broad wavelength range while exciting at a single band. However, these emission-scanning approaches have shown reduced sensitivity due to light attenuation from spectral filtering. Consequently, emission scanning has limited applicability for time-sensitive studies and photosensitive applications. In this work, we have developed an excitation-scanning hyperspectral imaging microscope that overcomes these limitations by providing high transmission with short acquisition times. This is achieved by filtering the fluorescence excitation rather than the emission. We tested the efficacy of the excitation-scanning microscope in a side-by-side comparison with emission scanning for detection of green fluorescent protein (GFP)-expressing endothelial cells in highly autofluorescent lung tissue. Excitation scanning provided higher signal-to-noise characteristics, as well as shorter acquisition times (300 ms/wavelength band with excitation scanning versus 3 s/wavelength band with emission scanning). Excitation scanning also provided higher delineation of nuclear and cell borders, and increased identification of GFP regions in highly autofluorescent tissue. These results demonstrate excitation scanning has utility in a wide range of time-dependent and photosensitive applications. PMID:24727909

  1. Application of Raman spectroscopy technology to studying Sudan I

    NASA Astrophysics Data System (ADS)

    Li, Gang; Zhang, Guoping; Chen, Chen

    2006-06-01

Being an industrial dye, Sudan I may have a toxic effect on the body after oral intake, and has recently been shown to cause cancer in rats, mice, and rabbits. Because China and some other countries have detected Sudan I in samples of hot chilli powder and chilli products, it is necessary to study the characteristics of this dye. As a kind of molecular scattering spectroscopy, Raman spectroscopy is characterized by the frequency shift caused by interactions of molecules and photons. The frequency shift reflects the gap between two vibrational or rotational energy states, and thus carries information about the molecule. Because Raman spectroscopy can provide quick, easy, reproducible, and non-destructive analysis, both qualitative and quantitative, with no sample preparation required, it has been a particularly promising technique for analyzing the characteristics and structures of molecules, especially organic ones. It now has broad application in biological, chemical, environmental, and industrial settings. This paper first introduces Sudan I and Raman spectroscopy technology, and then describes the technology's application to Sudan I. Next, the fingerprint spectra of Sudan I are assigned and analyzed in detail. Finally, we conclude that Raman spectroscopy is a powerful tool for identifying Sudan I.
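The frequency shift described above is conventionally reported in wavenumbers (cm⁻¹). A minimal sketch of the conversion follows; the example wavelengths are assumed for illustration, not taken from the paper:

```python
def raman_shift_cm1(excitation_nm, scattered_nm):
    """Raman shift in wavenumbers (cm^-1).

    The shift is the energy difference between the incident and scattered
    photons: delta_nu = 1/lambda_exc - 1/lambda_scat, with wavelengths
    converted from nm to cm (1 nm = 1e-7 cm, so 1/nm = 1e7 cm^-1).
    """
    return 1e7 / excitation_nm - 1e7 / scattered_nm

# Assumed example: 532 nm excitation with a Stokes line scattered near 576.7 nm,
# corresponding to a shift of roughly 1457 cm^-1.
shift = raman_shift_cm1(532.0, 576.7)
```

Because the shift depends only on the molecular energy levels and not on the excitation wavelength, these wavenumber positions form the "fingerprint" used to identify a compound.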

  2. Functional Connectivity Mapping in the Animal Model: Principles and Applications of Resting-State fMRI

    PubMed Central

    Gorges, Martin; Roselli, Francesco; Müller, Hans-Peter; Ludolph, Albert C.; Rasche, Volker; Kassubek, Jan

    2017-01-01

“Resting-state” fMRI has substantially contributed to the understanding of human and non-human functional brain organization through the analysis of correlated patterns in spontaneous activity within dedicated brain systems. Spontaneous neural activity is indirectly measured from the blood oxygenation level-dependent signal as acquired by echo planar imaging while subjects quietly “rest” in the scanner. Animal models, including disease or knockout models, allow a broad spectrum of experimental manipulations not applicable in humans. The non-invasive fMRI approach provides a promising tool for cross-species comparative investigations. This review focuses on the principles of “resting-state” functional connectivity analysis and its applications to living animals. The translational aspect from in vivo animal models toward clinical applications in humans is emphasized. We introduce the fMRI-based investigation of the non-human brain’s hemodynamics, the methodological issues in the data postprocessing, and the functional data interpretation from different abstraction levels. We also present the longer-term goal of integrating fMRI connectivity data with structural connectomes obtained from tracing and optical imaging approaches, which will allow fMRI data to be interrogated in terms of directional information flow and may help identify the structural underpinnings of observed functional connectivity patterns. PMID:28539914
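Seed-based correlation of BOLD time courses is one common form of the connectivity analysis described here; it can be sketched on synthetic data as follows. This is illustrative only: real pipelines add preprocessing such as motion correction, nuisance regression, and temporal filtering, and the signal model below is invented:

```python
import numpy as np

def seed_connectivity(bold, seed_index):
    """Pearson correlation of a seed region's BOLD time course with all regions.

    bold: array of shape (timepoints, regions); returns one r value per region.
    """
    z = (bold - bold.mean(axis=0)) / bold.std(axis=0)  # standardize each region
    seed = z[:, seed_index]
    return seed @ z / bold.shape[0]                    # mean of z-products = r

# Synthetic data: regions 0 and 1 share a slow "network" signal; region 2 is noise.
rng = np.random.default_rng(0)
t = np.arange(200)
shared = 5.0 * np.sin(0.1 * t)
bold = rng.normal(size=(200, 3))
bold[:, 0] += shared
bold[:, 1] += shared
r = seed_connectivity(bold, 0)
```

Regions that co-fluctuate with the seed show high correlation even though no task drives the signal, which is precisely how resting-state analysis recovers functionally connected systems.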

  3. Dynamic Clamp in Cardiac and Neuronal Systems Using RTXI

    PubMed Central

    Ortega, Francis A.; Butera, Robert J.; Christini, David J.; White, John A.; Dorval, Alan D.

    2016-01-01

The injection of computer-simulated conductances through the dynamic clamp technique has allowed researchers to probe the intercellular and intracellular dynamics of cardiac and neuronal systems with great precision. By coupling computational models to biological systems, dynamic clamp has become a proven tool in electrophysiology with many applications, such as generating hybrid networks in neurons or simulating channelopathies in cardiomyocytes. While its applications are broad, the approach is straightforward: synthesizing traditional patch clamp, computational modeling, and closed-loop feedback control to simulate a cellular conductance. Here, we present two example applications: artificial blocking of the inward rectifier potassium current in a cardiomyocyte and coupling of a biological neuron to a virtual neuron through a virtual synapse. The design and implementation of the necessary software to administer these dynamic clamp experiments can be difficult. In this chapter, we provide an overview of designing and implementing a dynamic clamp experiment using the Real-Time eXperiment Interface (RTXI), an open-source software system tailored for real-time biological experiments. We present two ways to achieve this using RTXI’s modular format: through the creation of a custom user-made module and through existing modules found in RTXI’s online library. PMID:25023319
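The closed-loop feedback law of dynamic clamp, injecting I = g·(Vm − Erev) at each timestep, can be sketched with a toy membrane model. Real RTXI modules are written in C++ and run in hard real time against a living cell; here the "cell" is a simulated leaky membrane, and all parameter values are assumed for illustration:

```python
def dynamic_clamp_current(v_m, g, e_rev):
    """Current for a simulated conductance: I = g * (Vm - Erev), outward positive."""
    return g * (v_m - e_rev)

# Toy closed loop: a passive leaky membrane plus a virtual K+ conductance.
dt, c_m = 0.1, 1.0              # ms, uF/cm^2 (assumed values)
g_leak, e_leak = 0.1, -60.0     # mS/cm^2, mV
g_virt, e_k = 0.5, -90.0        # virtual conductance pulling toward E_K
v = -60.0                       # start at the passive resting potential
for _ in range(2000):           # 200 ms of simulated closed-loop operation
    i_virt = dynamic_clamp_current(v, g_virt, e_k)      # "read Vm, compute I"
    v += dt * (-g_leak * (v - e_leak) - i_virt) / c_m   # "inject I" into the cell
# The membrane settles at the conductance-weighted mean of E_leak and E_K (-85 mV).
```

The same loop structure, with the simulated membrane replaced by an amplifier reading a real cell, is what a dynamic clamp module must execute within each real-time cycle.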

  4. Nanolattices: An Emerging Class of Mechanical Metamaterials.

    PubMed

    Bauer, Jens; Meza, Lucas R; Schaedler, Tobias A; Schwaiger, Ruth; Zheng, Xiaoyu; Valdevit, Lorenzo

    2017-10-01

    In 1903, Alexander Graham Bell developed a design principle for generating lightweight, mechanically robust lattice structures based on triangular cells; this principle has since found broad application in lightweight design. Over one hundred years later, the same principle is being used in the fabrication of nanolattice materials, namely lattice structures composed of nanoscale constituents. Taking advantage of the size-dependent properties typical of nanoparticles, nanowires, and thin films, nanolattices redefine the limits of the accessible material-property space throughout different disciplines. Herein, the exceptional mechanical performance of nanolattices, including their ultrahigh strength, damage tolerance, and stiffness, is reviewed, and their potential for multifunctional applications beyond mechanics is examined. The efficient integration of architecture and size-affected properties is key to the further development of nanolattices. The introduction of hierarchical architecture is an effective tool for enhancing mechanical properties, and the eventual goal of nanolattice design may be to replicate the intricate hierarchies and functionalities observed in biological materials. Additive manufacturing and self-assembly techniques enable lattice design at the nanoscale; scaling up nanolattice fabrication is currently the major challenge to their widespread use in technological applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. An Overview of Literature Topics Related to Current Concepts, Methods, Tools, and Applications for Cumulative Risk Assessment (2007-2016).

    PubMed

    Fox, Mary A; Brewer, L Elizabeth; Martin, Lawrence

    2017-04-07

    Cumulative risk assessments (CRAs) address combined risks from exposures to multiple chemical and nonchemical stressors and may focus on vulnerable communities or populations. Significant contributions have been made to the development of concepts, methods, and applications for CRA over the past decade. Work in both human health and ecological cumulative risk has advanced in two different contexts. The first context is the effects of chemical mixtures that share common modes of action, or that cause common adverse outcomes. In this context, two primary models are used to predict mixture effects: dose addition and response addition. The second context is evaluating the combined effects of chemical and nonchemical (e.g., radiation, biological, nutritional, economic, psychological, habitat alteration, land-use change, global climate change, and natural disasters) stressors. CRA can be adapted to address risk in many contexts, and this adaptability is reflected in the range of disciplinary perspectives in the published literature. This article presents the results of a literature search and discusses a range of selected work with the intention of giving a broad overview of relevant topics and providing a starting point for researchers interested in CRA applications.
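
    The two mixture models named above can be contrasted numerically. The sketch below uses a hypothetical shared Hill dose-response curve and assumed potencies and doses: dose addition rescales each chemical's dose by relative potency and applies the single shared curve, while response addition combines the individual effects under an independence assumption:

```python
def hill(dose, ec50, n=1.0):
    """Hypothetical shared dose-response curve (fractional effect, 0-1)."""
    return dose**n / (ec50**n + dose**n)

# Two chemicals assumed to share a mode of action; B is half as potent as A.
ec50_a, ec50_b = 1.0, 2.0
dose_a, dose_b = 0.5, 1.0

# Dose addition: express B as an A-equivalent dose, sum, apply one curve.
equiv_dose = dose_a + dose_b * (ec50_a / ec50_b)
e_dose_add = hill(equiv_dose, ec50_a)

# Response addition (independent action): combine individual effects.
e_a = hill(dose_a, ec50_a)
e_b = hill(dose_b, ec50_b)
e_resp_add = 1 - (1 - e_a) * (1 - e_b)

print(f"dose addition: {e_dose_add:.3f}, response addition: {e_resp_add:.3f}")
```

    The two models generally disagree except in special cases, which is why choosing between them hinges on whether the stressors share a mode of action.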

  6. Research on Mechanisms and Controlling Methods of Macro Defects in TC4 Alloy Fabricated by Wire Additive Manufacturing.

    PubMed

    Ji, Lei; Lu, Jiping; Tang, Shuiyuan; Wu, Qianru; Wang, Jiachen; Ma, Shuyuan; Fan, Hongli; Liu, Changmeng

    2018-06-28

    Wire feeding additive manufacturing (WFAM) has broad application prospects because of its advantages of low cost and high efficiency. However, in the lateral wire feeding mode, used in wire and laser additive manufacturing, gas tungsten arc additive manufacturing, etc., macro defects are easily generated on the surface of the components because of the anisotropy of the melted wire, which limits the promotion and application of WFAM. In this work, gas tungsten arc additive manufacturing with lateral wire feeding is used to investigate the mechanisms of macro defects. The results illustrate that the defect forms mainly include side spatters, collapse, poor flatness, and unmelted wire. It was found that the heat input, layer thickness, tool path, and wire curvature can all affect the macro defects. Side spatters are the most serious defects, arising mainly because the droplets cannot be transferred to the center of the molten pool in the lateral wire feeding mode. This research indicates that the macro defects can be controlled by optimizing the process parameters. Finally, block parts without macro defects were fabricated, which is meaningful for the further application of WFAM.

  7. Human Induced Pluripotent Stem Cell NEUROG2 Dual Knockin Reporter Lines Generated by the CRISPR/Cas9 System.

    PubMed

    Li, Shenglan; Xue, Haipeng; Wu, Jianbo; Rao, Mahendra S; Kim, Dong H; Deng, Wenbin; Liu, Ying

    2015-12-15

    Human induced pluripotent stem cell (hiPSC) technologies are powerful tools for modeling development and disease, drug screening, and regenerative medicine. Faithful gene targeting in hiPSCs greatly facilitates these applications. We have developed a fast and precise clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR associated protein 9 (Cas9) technology-based method and obtained fluorescent protein and antibiotic resistance dual knockin reporters in hiPSC lines for neurogenin2 (NEUROG2), an important proneural transcription factor. Gene targeting efficiency was greatly improved with CRISPR/Cas9-mediated homology directed recombination (∼33% correctly targeted clones) compared to a conventional targeting protocol (∼3%) at the same locus. No off-target events were detected. In addition, taking advantage of the versatile applications of the CRISPR/Cas9 system, we designed transactivation components to transiently induce NEUROG2 expression, which helps identify transcription factor binding sites and trans-regulation regions of human NEUROG2. The strategy of using CRISPR/Cas9 genome editing coupled with fluorescence-activated cell sorting of neural progenitor cells in a knockin lineage hiPSC reporter platform might be broadly applicable in other stem cell derivatives and subpopulations.
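
    The reported efficiency gain (∼33% vs. ∼3% correctly targeted clones) can be given a rough statistical footing with binomial confidence intervals. The clone counts below are hypothetical, chosen only to match the reported percentages; the Wilson score interval is a standard interval for proportions:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Hypothetical clone counts consistent with the reported ~33% vs ~3%:
crispr = wilson_interval(33, 100)
conventional = wilson_interval(3, 100)
print(f"CRISPR/Cas9:  {crispr[0]:.2f}-{crispr[1]:.2f}")
print(f"conventional: {conventional[0]:.2f}-{conventional[1]:.2f}")
# Non-overlapping intervals suggest the gain is not a screening artifact.
```

    With even modest screening numbers, intervals around 33% and 3% do not overlap, consistent with a genuine improvement in targeting efficiency.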

  8. An Overview of Literature Topics Related to Current Concepts, Methods, Tools, and Applications for Cumulative Risk Assessment (2007–2016)

    PubMed Central

    Fox, Mary A.; Brewer, L. Elizabeth; Martin, Lawrence

    2017-01-01

    Cumulative risk assessments (CRAs) address combined risks from exposures to multiple chemical and nonchemical stressors and may focus on vulnerable communities or populations. Significant contributions have been made to the development of concepts, methods, and applications for CRA over the past decade. Work in both human health and ecological cumulative risk has advanced in two different contexts. The first context is the effects of chemical mixtures that share common modes of action, or that cause common adverse outcomes. In this context, two primary models are used to predict mixture effects: dose addition and response addition. The second context is evaluating the combined effects of chemical and nonchemical (e.g., radiation, biological, nutritional, economic, psychological, habitat alteration, land-use change, global climate change, and natural disasters) stressors. CRA can be adapted to address risk in many contexts, and this adaptability is reflected in the range of disciplinary perspectives in the published literature. This article presents the results of a literature search and discusses a range of selected work with the intention of giving a broad overview of relevant topics and providing a starting point for researchers interested in CRA applications. PMID:28387705

  9. Programmatic Perspectives on Using `Rapid Prototyping Capability' for Water Management Applications Using NASA Products

    NASA Astrophysics Data System (ADS)

    Toll, D.; Friedl, L.; Entin, J.; Engman, E.

    2006-12-01

    The NASA Water Management Program addresses concerns and decision making related to water availability, water forecasting, and water quality. The goal of the Water Management Program Element is to encourage water management organizations to use NASA Earth science data, models, products, technology, and other capabilities in their decision support tools (DSTs) for problem solving. The goal of the NASA Rapid Prototyping Capability (RPC) is to speed the evaluation of these NASA products and technologies to improve current and future DSTs by reducing the time to access, configure, and assess the effectiveness of NASA products and technologies. The NASA Water Management Program Element partners with Federal agencies, academia, and private firms, and may include international organizations. Currently, the NASA Water Management Program oversees eight application projects. However, water management is a very broad descriptor for a much larger number of activities that are carried out to ensure a safe and plentiful water supply for humans, industry, and agriculture, promote environmental stewardship, and mitigate disasters such as floods and droughts. The goal of this presentation is to summarize how the RPC may further enhance the effectiveness of using NASA products for water management applications.

  10. Engineering Thermostable Microbial Xylanases Toward its Industrial Applications.

    PubMed

    Kumar, Vishal; Dangi, Arun Kumar; Shukla, Pratyoosh

    2018-03-01

    Xylanases are important hydrolytic enzymes that hydrolyze the β-1,4 xylosidic linkages in the backbone of the xylan polymeric chain, which consists of xylose subunits. Xylan is mainly found in plant cell walls, and xylanases are produced by several kinds of microorganisms such as fungi, bacteria, yeast, and some protozoans. Fungi are considered more potent xylanase producers than yeast and bacteria. Thermostable xylanases have a broad range of industrial applications and have been used in a number of industries, such as the paper and pulp, biofuel, food and feed, and textile industries. The present review explores xylanase-substrate interactions using gene-editing tools toward understanding and improving the industrial stability of xylanases. Various protein-engineering and metabolic-engineering methods to improve the operational stability of xylanase are also explored. Thermostable xylanases have also been used to improve the nutritional value of animal feed. Furthermore, they have been used directly in bakeries and breweries, and find major use in the paper and pulp industry as biobleaching agents. This review envisages some of these applications of thermostable xylanases and their bioengineering.

  11. Human Induced Pluripotent Stem Cell NEUROG2 Dual Knockin Reporter Lines Generated by the CRISPR/Cas9 System

    PubMed Central

    Li, Shenglan; Xue, Haipeng; Wu, Jianbo; Rao, Mahendra S.; Kim, Dong H.; Deng, Wenbin

    2015-01-01

    Human induced pluripotent stem cell (hiPSC) technologies are powerful tools for modeling development and disease, drug screening, and regenerative medicine. Faithful gene targeting in hiPSCs greatly facilitates these applications. We have developed a fast and precise clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR associated protein 9 (Cas9) technology-based method and obtained fluorescent protein and antibiotic resistance dual knockin reporters in hiPSC lines for neurogenin2 (NEUROG2), an important proneural transcription factor. Gene targeting efficiency was greatly improved with CRISPR/Cas9-mediated homology directed recombination (∼33% correctly targeted clones) compared to a conventional targeting protocol (∼3%) at the same locus. No off-target events were detected. In addition, taking advantage of the versatile applications of the CRISPR/Cas9 system, we designed transactivation components to transiently induce NEUROG2 expression, which helps identify transcription factor binding sites and trans-regulation regions of human NEUROG2. The strategy of using CRISPR/Cas9 genome editing coupled with fluorescence-activated cell sorting of neural progenitor cells in a knockin lineage hiPSC reporter platform might be broadly applicable in other stem cell derivatives and subpopulations. PMID:26414932

  12. Opportunities and Strategies to Incorporate Ecosystem Services Knowledge and Decision Support Tools into Planning and Decision Making in Hawai`i

    NASA Astrophysics Data System (ADS)

    Bremer, Leah L.; Delevaux, Jade M. S.; Leary, James J. K.; J. Cox, Linda; Oleson, Kirsten L. L.

    2015-04-01

    Incorporating ecosystem services into management decisions is a promising means to link conservation and human well-being. Nonetheless, planning and management in Hawai`i, a state with highly valued natural capital, has yet to broadly utilize an ecosystem service approach. We conducted a stakeholder assessment, based on semi-structured interviews, with terrestrial (n = 26) and marine (n = 27) natural resource managers across the State of Hawai`i to understand the current use of ecosystem services (ES) knowledge and decision support tools and whether, how, and under what contexts, further development would potentially be useful. We found that ES knowledge and tools customized to Hawai`i could be useful for communication and outreach, justifying management decisions, and spatial planning. Greater incorporation of this approach is clearly desired and has a strong potential to contribute to more sustainable decision making and planning in Hawai`i and other oceanic island systems. However, the unique biophysical, socio-economic, and cultural context of Hawai`i, and other island systems, will require substantial adaptation of existing ES tools. Based on our findings, we identified four key opportunities for the use of ES knowledge and tools in Hawai`i: (1) linking native forest protection to watershed health; (2) supporting sustainable agriculture; (3) facilitating ridge-to-reef management; and (4) supporting statewide terrestrial and marine spatial planning. Given the interest expressed by natural resource managers, we envision broad adoption of ES knowledge and decision support tools if knowledge and tools are tailored to the Hawaiian context and coupled with adequate outreach and training.

  13. Opportunities and strategies to incorporate ecosystem services knowledge and decision support tools into planning and decision making in Hawai'i.

    PubMed

    Bremer, Leah L; Delevaux, Jade M S; Leary, James J K; J Cox, Linda; Oleson, Kirsten L L

    2015-04-01

    Incorporating ecosystem services into management decisions is a promising means to link conservation and human well-being. Nonetheless, planning and management in Hawai'i, a state with highly valued natural capital, has yet to broadly utilize an ecosystem service approach. We conducted a stakeholder assessment, based on semi-structured interviews, with terrestrial (n = 26) and marine (n = 27) natural resource managers across the State of Hawai'i to understand the current use of ecosystem services (ES) knowledge and decision support tools and whether, how, and under what contexts, further development would potentially be useful. We found that ES knowledge and tools customized to Hawai'i could be useful for communication and outreach, justifying management decisions, and spatial planning. Greater incorporation of this approach is clearly desired and has a strong potential to contribute to more sustainable decision making and planning in Hawai'i and other oceanic island systems. However, the unique biophysical, socio-economic, and cultural context of Hawai'i, and other island systems, will require substantial adaptation of existing ES tools. Based on our findings, we identified four key opportunities for the use of ES knowledge and tools in Hawai'i: (1) linking native forest protection to watershed health; (2) supporting sustainable agriculture; (3) facilitating ridge-to-reef management; and (4) supporting statewide terrestrial and marine spatial planning. Given the interest expressed by natural resource managers, we envision broad adoption of ES knowledge and decision support tools if knowledge and tools are tailored to the Hawaiian context and coupled with adequate outreach and training.

  14. An eDNA assay for river otter detection: A tool for surveying a semi-aquatic mammal

    Treesearch

    Ticha M. Padgett-Stewart; Taylor M. Wilcox; Kellie J. Carim; Kevin S. McKelvey; Michael K. Young; Michael K. Schwartz

    2016-01-01

    Environmental DNA (eDNA) is an effective tool for the detection of elusive or low-density aquatic organisms. However, it has infrequently been applied to mammalian species. North American river otters (Lontra canadensis) are both broad ranging and semi-aquatic, making them an ideal candidate for examining the uses of eDNA for detection of mammals. We developed...

  15. Status and Mission Applicability of NASA's In-Space Propulsion Technology Project

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Dankanich, John; Pencil, Eric; Liou, Larry

    2009-01-01

    The In-Space Propulsion Technology (ISPT) project develops propulsion technologies that will enable or enhance NASA robotic science missions. Since 2001, the ISPT project developed and delivered products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. These in-space propulsion technologies are applicable, and potentially enabling for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations. This paper provides status of the technology development, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of advanced chemical thrusters, electric propulsion, aerocapture, and systems analysis tools. The current chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system, and the High Voltage Hall Accelerator (HiVHAC) thruster, which is a mid-term product specifically designed for a low-cost electric propulsion option. Aerocapture investments developed a family of thermal protections system materials and structures; guidance, navigation, and control models of blunt-body rigid aeroshells; atmospheric models for Earth, Titan, Mars and Venus; and models for aerothermal effects. In 2009 ISPT started the development of propulsion technologies that would enable future sample return missions. The paper describes the ISPT project's future focus on propulsion for sample return missions. 
The future technology development areas for ISPT are: Planetary Ascent Vehicles (PAV), with a Mars Ascent Vehicle (MAV) being the initial development focus; multi-mission technologies for Earth Entry Vehicles (MMEEV) needed for sample return missions from many different destinations; propulsion for Earth Return Vehicles (ERV), transfer stages to the destination, and Electric Propulsion for sample return and low cost missions; and Systems/Mission Analysis focused on sample return propulsion. The ISPT project is funded by NASA's Science Mission Directorate (SMD).

  16. Unidata Cyberinfrastructure in the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Young, J. W.

    2016-12-01

    Data services, software, and user support are critical components of geosciences cyber-infrastructure that help researchers advance science. Having matured and advanced significantly, cloud computing has recently emerged as a new paradigm for developing and delivering a broad array of services over the Internet. Cloud computing is now mature enough for use in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Given the enormous potential of cloud-based services, Unidata has been moving to augment its software, services, and data delivery mechanisms to align with the cloud-computing paradigm. To realize the above vision, Unidata has worked toward: * Providing access to many types of data from a cloud (e.g., via the THREDDS Data Server, RAMADDA and EDEX servers); * Deploying data-proximate tools to easily process, analyze, and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time; * Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in their own private or public cloud settings. Specifically, Unidata has adopted Docker for "containerized applications", making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges. 
Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools; * Leveraging Jupyter as a central platform and hub with its powerful set of interlinking tools to interactively connect data servers, Python scientific libraries, scripts, and workflows; * Exploring end-to-end modeling and prediction capabilities in the cloud; * Partnering with NOAA and public cloud vendors (e.g., Amazon and OCC) on the NOAA Big Data Project to harness their capabilities and resources for the benefit of the academic community.

  17. Sea-level rise modeling handbook: Resource guide for coastal land managers, engineers, and scientists

    USGS Publications Warehouse

    Doyle, Thomas W.; Chivoiu, Bogdan; Enwright, Nicholas M.

    2015-08-24

    Global sea level is rising and may accelerate with continued fossil fuel consumption from industrial and population growth. In 2012, the U.S. Geological Survey conducted more than 30 training and feedback sessions with Federal, State, and nongovernmental organization (NGO) coastal managers and planners across the northern Gulf of Mexico coast to evaluate user needs, potential benefits, current scientific understanding, and utilization of resource aids and modeling tools focused on sea-level rise. In response to the findings from the sessions, this sea-level rise modeling handbook has been designed as a guide to the science and simulation models for understanding the dynamics and impacts of sea-level rise on coastal ecosystems. The review herein of decision-support tools and predictive models was compiled from the training sessions, from online research, and from publications. The purpose of this guide is to describe and categorize the suite of data, methods, and models and their design, structure, and application for hindcasting and forecasting the potential impacts of sea-level rise in coastal ecosystems. The data and models cover a broad spectrum of disciplines involving different designs and scales of spatial and temporal complexity for predicting environmental change and ecosystem response. These data and models have not heretofore been synthesized, nor have appraisals been made of their utility or limitations. Some models are demonstration tools for non-experts, whereas others require more expert capacity to apply for any given park, refuge, or regional application. A simplified tabular context has been developed to list and contrast a host of decision-support tools and models from the ecological, geological, and hydrological perspectives. 
Criteria were established to distinguish the source, scale, and quality of information input and geographic datasets; physical and biological constraints and relations; datum characteristics of water and land components; utility options for setting sea-level rise and climate change scenarios; and ease or difficulty of storing, displaying, or interpreting model output. Coastal land managers, engineers, and scientists can benefit from this synthesis of tools and models that have been developed for projecting causes and consequences of sea-level change on the landscape and seascape.

  18. A novel Bayesian change-point algorithm for genome-wide analysis of diverse ChIPseq data types.

    PubMed

    Xing, Haipeng; Liao, Willey; Mo, Yifan; Zhang, Michael Q

    2012-12-10

    ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape, depending on the target protein(1). For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment(2). Reliably identifying these regions was the focus of our work. Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics(3-5) to more rigorous statistical models, e.g. Hidden Markov Models (HMMs)(6-8). We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and simple finite-state classifications that are often utilized. Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse, followed by application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model that can capably address the entire spectrum of data types. To meet these objectives, we first constructed a statistical framework that naturally modeled ChIPseq data structures using a cutting-edge advance in HMMs(9), which utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which further define segments of enrichment. 
Our analysis revealed that our Bayesian Change Point (BCP) algorithm has reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters, illustrating both its versatility and ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a useful tool for ChIPseq data analysis that can aid in collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor(10,11) and epigenetic(12) data to illustrate its usefulness.
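
    The change-point idea underlying BCP can be illustrated with a deliberately simplified stand-in: choose the single split of a read-density profile that minimizes total within-segment squared error. This toy is not the BCP algorithm itself (no Bayesian prior, a single change point, no HMM), but it makes concrete what a change point in read density means:

```python
# Toy single change-point detection on a read-density profile:
# pick the split index that minimizes within-segment squared error.

def best_change_point(density):
    def sse(seg):
        mu = sum(seg) / len(seg)
        return sum((x - mu) ** 2 for x in seg)
    best_k, best_cost = None, float("inf")
    for k in range(1, len(density)):
        cost = sse(density[:k]) + sse(density[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Flat background followed by a broad enriched island.
profile = [2, 3, 2, 2, 3, 2, 9, 10, 11, 9, 10, 11]
print(best_change_point(profile))  # boundary of the enriched region
```

    A full method applies this kind of segmentation genome-wide, with many change points and a probabilistic model in place of raw squared error, which is where the Bayesian machinery described in the record comes in.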

  19. 29 CFR 1630.1 - Purpose, applicability, and construction.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... programs. (4) Broad coverage. The primary purpose of the ADAAA is to make it easier for people with... broad scope of protection under the ADA, the definition of “disability” in this part shall be construed...

  20. Strategies to induce broadly protective antibody responses to viral glycoproteins.

    PubMed

    Krammer, F

    2017-05-01

    Currently, several universal/broadly protective influenza virus vaccine candidates are under development. Many of these vaccines are based on strategies to induce protective antibody responses against the surface glycoproteins of antigenically and genetically diverse influenza viruses. These strategies might also be applicable to surface glycoproteins of a broad range of other important viral pathogens. Areas covered: Common strategies include sequential vaccination with divergent antigens, multivalent approaches, vaccination with glycan-modified antigens, vaccination with minimal antigens and vaccination with antigens that have centralized/optimized sequences. Here we review these strategies and the underlying concepts. Furthermore, challenges, feasibility and applicability to other viral pathogens are discussed. Expert commentary: Several broadly protective/universal influenza virus vaccine strategies will be tested in humans in the coming years. If successful in terms of safety and immunological readouts, they will move forward into efficacy trials. In the meantime, successful vaccine strategies might also be applied to other antigenically diverse viruses of concern.
