Sample records for powerful tool enabling

  1. Powered mobility intervention: understanding the position of tool use learning as part of implementing the ALP tool.

    PubMed

    Nilsson, Lisbeth; Durkin, Josephine

    2017-10-01

    To explore the knowledge necessary for adoption and implementation of the Assessment of Learning Powered mobility use (ALP) tool in different practice settings, for both adults and children. To consult with a diverse population of professionals working with adults and children, in different countries and various settings, who were learning about or using the ALP tool, as part of exploring and implementing research findings. Classical grounded theory, with a rigorous comparative analysis of data from informants together with reflections on our own rich experiences of powered mobility practice and comparisons with the literature. A core category, learning tool use, and a new theory of cognizing tool use, with its interdependent properties (motivation, confidence, permissiveness, attentiveness and co-construction), emerged to explain in greater depth what enables the application of the ALP tool. The scientific knowledge base on tool use learning, together with the new theory, conveys the information practitioners need in order to understand how to apply the learning approach of the ALP tool, so that tool use learning through powered mobility practice can serve as a therapeutic intervention in its own right. This opens up the possibility for more children and adults to have access to learning through powered mobility practice. Implications for rehabilitation: Tool use learning through powered mobility practice is a therapeutic intervention in its own right. Powered mobility practice can be used as a rehabilitation tool with individuals who may not need to become powered wheelchair users. Motivation, confidence, permissiveness, attentiveness and co-construction are key properties for enabling the application of the learning approach of the ALP tool. Labelling and the use of language, together with honing observational skills through viewing video footage, are key to developing successful learning partnerships.

  2. Power-Tool Adapter For T-Handle Screws

    NASA Technical Reports Server (NTRS)

    Deloach, Stephen R.

    1992-01-01

    Proposed adapter enables use of pneumatic drill, electric drill, electric screwdriver, or similar power tool to tighten or loosen T-handled screws. Notched tube with perpendicular rod welded to it inserted in chuck of tool. Notched end of tube slipped over screw handle.

  3. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  4. Community Crowd-Funded Solar Finance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jagerson, Gordon "Ty"

    The award supported the demonstration and development of the Village Power Platform, which enables community organizations to more readily develop, finance and operate solar installations for local community organizations. The platform enables partial or complete local ownership of the solar installation. The award specifically supported key features including financial modeling tools, community communications tools, crowdfunding mechanisms, a mobile app, and other critical features.

  5. Information Power Grid (IPG) Tutorial 2003

    NASA Technical Reports Server (NTRS)

    Meyers, George

    2003-01-01

    For NASA and the general community today, Grid middleware: a) provides tools to access and use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); c) is an enabler of large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. The IPG provides tools that enable the building of frameworks for applications, and provides value-added services to the NASA user base for utilizing resources on the grid in new and more efficient ways.

  6. iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations

    NASA Astrophysics Data System (ADS)

    Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.

    The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to develop models using a formal mathematical description that uniquely specifies the physical behavior of a component or the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is being developed with open specifications, any tool that implements these requirements can be utilized. This gives users the freedom of choosing an Integrated Development Environment (IDE) of their choice. Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use to improve the readability of a library by structuring it with object-oriented hierarchy. The developed library is released under an open source license to enable a wider distribution and let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.
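
    The phasor time-domain simulation the library supports can be illustrated at its simplest with the classical swing equation for a single machine against an infinite bus. The sketch below is a minimal Python analogue, not iPSL code, and every parameter value is an assumption chosen for illustration:

```python
import math

# Classical single-machine-infinite-bus swing equation (illustrative parameters):
#   M * d2(delta)/dt2 = Pm - Pmax * sin(delta) - D * d(delta)/dt
M, D = 0.1, 0.05      # inertia and damping constants (per-unit, assumed)
Pm, Pmax = 0.8, 1.5   # mechanical input and peak electrical power (pu, assumed)

def simulate(delta0, steps=20000, dt=1e-3):
    """Forward-Euler integration of rotor angle delta and speed deviation w."""
    delta, w = delta0, 0.0
    for _ in range(steps):
        dw = (Pm - Pmax * math.sin(delta) - D * w) / M
        delta += w * dt
        w += dw * dt
    return delta

# With damping, the trajectory settles near the equilibrium angle asin(Pm/Pmax).
eq = math.asin(Pm / Pmax)
final = simulate(delta0=0.0)
print(round(final, 3), round(eq, 3))
```

    A Modelica tool would derive and integrate the same dynamics from a declarative model description; the point here is only the shape of the computation.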

  7. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool automates the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; databases of projects (model validation studies), historic events, and power plants; advanced visualization capabilities; and automatic report generation.

  8. Multi-focus and multi-level techniques for visualization and analysis of networks with thematic data

    NASA Astrophysics Data System (ADS)

    Cossalter, Michele; Mengshoel, Ole J.; Selker, Ted

    2013-01-01

    Information-rich data sets bring several challenges in the areas of visualization and analysis, even when associated with node-link network visualizations. This paper presents an integration of multi-focus and multi-level techniques that enable interactive, multi-step comparisons in node-link networks. We describe NetEx, a visualization tool that enables users to simultaneously explore different parts of a network and its thematic data, such as time series or conditional probability tables. NetEx, implemented as a Cytoscape plug-in, has been applied to the analysis of electrical power networks, Bayesian networks, and the Enron e-mail repository. In this paper we briefly discuss visualization and analysis of the Enron social network, but focus on data from an electrical power network. Specifically, we demonstrate how NetEx supports the analytical task of electrical power system fault diagnosis. Results from a user study with 25 subjects suggest that NetEx enables more accurate isolation of complex faults compared to an especially designed software tool.

  9. Enzyme catalysis: Evolution made easy

    NASA Astrophysics Data System (ADS)

    Wee, Eugene J. H.; Trau, Matt

    2014-09-01

    Directed evolution is a powerful tool for the development of improved enzyme catalysts. Now, a method that enables an enzyme, its encoding DNA and a fluorescent reaction product to be encapsulated in a gel bead enables the application of directed evolution in an ultra-high-throughput format.

  10. Towards a C2 Poly-Visualization Tool: Leveraging the Power of Social-Network Analysis and GIS

    DTIC Science & Technology

    2011-06-01

    from Magsino.14 AutoMap, a product of CASOS at Carnegie Mellon University, is a text-mining tool that enables the extraction of network data from...enables community leaders to prepare for biological attacks using computational models. BioWar is a CASOS package that combines many factors into a...models, demographically accurate agent models, wind dispersion models, and an error-diagnostic model. Construct, also developed by CASOS, is a

  11. Space Power: A Theory for Sustaining US Security Through the Information Age

    DTIC Science & Technology

    2011-05-19

    theory in the following manner. First, the great power and major power concepts are presented as tools to analyze the impact of activities in the space...exploration, space enablers, and the space protection concept to fill the gap of current space power theory. Understanding historical power theory...and concept development. Several influential space theorists have provided creative ideas, thoughtful narrative, and generated useful discussion

  12. Design Tools for Reconfigurable Hardware in Orbit (RHinO)

    NASA Technical Reports Server (NTRS)

    French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian

    2004-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions into the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of Reconfigurable Hardware in Orbit, via an integrated design tool-suite aiming to reduce risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.

  13. The Python Spectral Analysis Tool (PySAT) for Powerful, Flexible, and Easy Preprocessing and Machine Learning with Point Spectral Data

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T.; Morris, R. V.; Laura, J.

    2018-04-01

    The PySAT point spectra tool provides a flexible graphical interface, enabling scientists to apply a wide variety of preprocessing and machine learning methods to point spectral data, with an emphasis on multivariate regression.
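
    PySAT itself is driven through its graphical interface, but the kind of workflow the abstract describes (preprocess point spectra, then fit a multivariate regression) can be sketched directly. Everything below, including the synthetic spectra, the baseline-removal step, and the least-squares fit, is an illustrative assumption rather than PySAT's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic point spectra: 40 samples x 50 channels, with a sloping baseline
# and two peaks whose heights encode a hypothetical analyte abundance.
channels = np.linspace(0, 1, 50)
abundance = rng.uniform(0.1, 1.0, size=40)
peaks = np.exp(-((channels - 0.3) ** 2) / 0.002) + 0.5 * np.exp(-((channels - 0.7) ** 2) / 0.002)
spectra = abundance[:, None] * peaks[None, :] + 2.0 + 3.0 * channels[None, :]
spectra += rng.normal(scale=0.01, size=spectra.shape)

# Preprocessing: subtract a fitted linear baseline from each spectrum.
def remove_baseline(s):
    coeffs = np.polyfit(channels, s, 1)  # crude: fit a line to the whole spectrum
    return s - np.polyval(coeffs, channels)

X = np.array([remove_baseline(s) for s in spectra])

# Multivariate regression: least-squares mapping from channels to abundance
# (evaluated in-sample here, purely to show the pipeline's shape).
w, *_ = np.linalg.lstsq(X, abundance, rcond=None)
pred = X @ w
rmse = float(np.sqrt(np.mean((pred - abundance) ** 2)))
print(f"RMSE: {rmse:.4f}")
```

    A real analysis would of course hold out validation spectra and compare several preprocessing and regression methods, which is exactly the experimentation a tool like this is meant to make easy.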

  14. Modulated Chlorophyll "a" Fluorescence: A Tool for Teaching Photosynthesis

    ERIC Educational Resources Information Center

    Marques da Silva, Jorge; Bernardes da Silva, Anabela; Padua, Mario

    2007-01-01

    "In vivo" chlorophyll "a" fluorescence is a key technique in photosynthesis research. The recent release of a low cost, commercial, modulated fluorometer enables this powerful technology to be used in education. Modulated chlorophyll a fluorescence measurement "in vivo" is here proposed as a tool to demonstrate basic…

  15. Venus Mobile Explorer with RPS for Active Cooling: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Leifer, Stephanie D.; Green, Jacklyn R.; Balint, Tibor S.; Manvi, Ram

    2009-01-01

    We present our findings from a study to evaluate the feasibility of a radioisotope power system (RPS) combined with active cooling to enable a long-duration Venus surface mission. On-board power with active cooling technology featured prominently in both the National Research Council's Decadal Survey and in the 2006 NASA Solar System Exploration Roadmap as mission-enabling for the exploration of Venus. Power and cooling system options were reviewed and the most promising concepts modeled to develop an assessment tool for Venus mission planners considering a variety of future potential missions to Venus, including a Venus Mobile Explorer (either a balloon or rover concept), a long-lived Venus static lander, or a Venus Geophysical Network. The concepts modeled were based on the integration of General Purpose Heat Source (GPHS) modules with different types of Stirling cycle heat engines for power and cooling. Unlike prior investigations which reported on single point design concepts, this assessment tool allows the user to generate either a point design or parametric curves of approximate power and cooling system mass, power level, and number of GPHS modules needed for a "black box" payload housed in a spherical pressure vessel.

  16. The Power and Benefits of Concept Mapping: Measuring Use, Usefulness, Ease of Use, and Satisfaction. Research Report

    ERIC Educational Resources Information Center

    Freeman, Lee A.; Jessup, Leonard M.

    2004-01-01

    The power and benefits of concept mapping rest in four arenas: enabling shared understanding, the inclusion of affect, the balance of power, and client involvement. Concept mapping theory and research indicate concept maps (1) are appropriate tools to assist with communication, (2) are easy to use, and (3) are seen as beneficial by their users. An…

  17. Computer-Aided Engineering Tools | Water Power | NREL

    Science.gov Websites

    Computer-aided engineering tools for water energy converters will provide a full range of simulation capabilities for single devices and arrays. Simulation of water power technologies on high-performance computers enables the study of complex systems and experimentation. Such simulation is critical to accelerate progress in energy programs within the U.S. Department of Energy.

  18. CLAST: CUDA implemented large-scale alignment search tool.

    PubMed

    Yano, Masahiro; Mori, Hiroshi; Akiyama, Yutaka; Yamada, Takuji; Kurokawa, Ken

    2014-12-11

    Metagenomics is a powerful methodology to study microbial communities, but it is highly dependent on nucleotide sequence similarity searching against sequence databases. Metagenomic analyses with next-generation sequencing technologies produce enormous numbers of reads from microbial communities, and many reads are derived from microbes whose genomes have not yet been sequenced, limiting the usefulness of existing sequence similarity search tools. Therefore, there is a clear need for a sequence similarity search tool that can rapidly detect weak similarity in large datasets. We developed a tool, which we named CLAST (CUDA implemented large-scale alignment search tool), that enables analyses of millions of reads and thousands of reference genome sequences, and runs on NVIDIA Fermi architecture graphics processing units. CLAST has four main advantages over existing alignment tools. First, CLAST was capable of identifying sequence similarities ~80.8 times faster than BLAST and 9.6 times faster than BLAT. Second, CLAST executes global alignment as the default (local alignment is also an option), enabling CLAST to assign reads to taxonomic and functional groups based on evolutionarily distant nucleotide sequences with high accuracy. Third, CLAST does not need a preprocessed sequence database like Burrows-Wheeler Transform-based tools, and this enables CLAST to incorporate large, frequently updated sequence databases. Fourth, CLAST requires <2 GB of main memory, making it possible to run CLAST on a standard desktop computer or server node. CLAST achieved very high speed (similar to the Burrows-Wheeler Transform-based Bowtie 2 for long reads) and sensitivity (equal to BLAST, BLAT, and FR-HIT) without the need for extensive database preprocessing or a specialized computing platform. 
Our results demonstrate that CLAST has the potential to be one of the most powerful and realistic approaches to analyze the massive amount of sequence data from next-generation sequencing technologies.
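
    The global alignment CLAST executes by default is the Needleman-Wunsch strategy. A minimal pure-Python version (scoring values here are illustrative, not CLAST's) shows the core dynamic-programming recurrence that GPU implementations parallelize:

```python
# Needleman-Wunsch global alignment in its simplest form; every cell depends
# only on its three upper-left neighbours, which is what makes the recurrence
# amenable to wavefront parallelization on GPUs.
def global_align(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    # score[i][j] = best score aligning the prefixes a[:i] and b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[n][m]

print(global_align("GATTACA", "GATCA"))
```

    Unlike local alignment, the score in the bottom-right cell is forced to account for every base of both sequences, which is why global alignment assigns reads to evolutionarily distant references more conservatively.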

  19. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  20. Making Students' Mathematical Explanations Accessible to Teachers through the Use of Digital Recorders and iPads

    ERIC Educational Resources Information Center

    Soto, Melissa Marie; Ambrose, Rebecca

    2016-01-01

    Analyzing students' mathematical explanations can be a powerful tool to enhance teachers' practice, but collecting these explanations can be cumbersome. Here, we describe our quest to find effective tools to make explanations accessible to elementary (K-6th) teachers. First, we describe how digital audio recordings enabled teachers to focus on the…

  1. Orbit Design Based on the Global Maps of Telecom Metrics

    NASA Technical Reports Server (NTRS)

    Lee, Charles H.; Cheung, Kar-Ming; Edwards, Chad; Noreen, Gary K.; Vaisnys, Arvydas

    2004-01-01

    In this paper we describe an orbit design aid tool, called the Telecom Orbit Analysis and Simulation Tool (TOAST). Although it can be used for studying and selecting orbits for any planet, we concentrate solely on its use for Mars. By specifying the six orbital elements for an orbit, a time frame of interest, a horizon mask angle, and telecom parameters such as transmitting power, frequency, antenna gains, antenna losses, link margin, and received threshold powers for the data rates, this tool enables the user to view two- and three-dimensional animations of the orbit and of different telecom metrics at any point on Mars, namely on the global planetary map.
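
    The telecom metric at the heart of such a tool is link margin, which follows from standard link-budget arithmetic: received power is transmit power plus antenna gains minus losses and free-space path loss. The sketch below is a generic free-space example with invented parameter values, not TOAST's actual model (which would also account for pointing, atmospheric and other losses):

```python
import math

# Link-budget arithmetic of the kind evaluated at each point on a planetary map.
# All numeric values below are illustrative assumptions.
def link_margin_db(pt_dbw, gt_dbi, gr_dbi, losses_db, freq_hz, dist_m, threshold_dbw):
    wavelength = 299_792_458.0 / freq_hz
    fspl_db = 20 * math.log10(4 * math.pi * dist_m / wavelength)  # free-space path loss
    pr_dbw = pt_dbw + gt_dbi + gr_dbi - losses_db - fspl_db       # received power
    return pr_dbw - threshold_dbw                                 # margin above threshold

# Example: a 10 W (10 dBW) UHF transmitter at 400 km slant range.
margin = link_margin_db(pt_dbw=10, gt_dbi=3, gr_dbi=5, losses_db=2,
                        freq_hz=401e6, dist_m=400e3, threshold_dbw=-130)
print(f"{margin:.1f} dB")
```

    Sweeping `dist_m` and the geometry over an orbit is what turns this single formula into the global coverage maps the tool animates.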

  2. Department of Defense Space Science and Technology Strategy 2015

    DTIC Science & Technology

    2015-01-01

    solar cells at 34% efficiency enabling higher power spacecraft capability. These solar cells developed by the Air Force Research Laboratory (AFRL...Reduce size, weight, power, cost, and improve thermal management for SATCOM terminals  Support intelligence surveillance and reconnaissance (ISR...Improve understanding and awareness of the Earth-to-Sun environment  Improve space environment forecast capabilities and tools to predict operational

  3. Microbial ecology to manage processes in environmental biotechnology.

    PubMed

    Rittmann, Bruce E

    2006-06-01

    Microbial ecology and environmental biotechnology are inherently tied to each other. The concepts and tools of microbial ecology are the basis for managing processes in environmental biotechnology; and these processes provide interesting ecosystems to advance the concepts and tools of microbial ecology. Revolutionary advancements in molecular tools to understand the structure and function of microbial communities are bolstering the power of microbial ecology. A push from advances in modern materials along with a pull from a societal need to become more sustainable is enabling environmental biotechnology to create novel processes. How do these two fields work together? Five principles illuminate the way: (i) aim for big benefits; (ii) develop and apply more powerful tools to understand microbial communities; (iii) follow the electrons; (iv) retain slow-growing biomass; and (v) integrate, integrate, integrate.

  4. Optical power transfer and communication methods for wireless implantable sensing platforms.

    PubMed

    Mujeeb-U-Rahman, Muhammad; Adalian, Dvin; Chang, Chieh-Feng; Scherer, Axel

    2015-09-01

    Ultrasmall scale implants have recently attracted focus as valuable tools for monitoring both acute and chronic diseases. Semiconductor optical technologies are the key to miniaturizing these devices to the long-sought sub-mm scale, which will enable long-term use of these devices for medical applications. This can also enable the use of multiple implantable devices concurrently to form a true body area network of sensors. We demonstrate optical power transfer techniques and methods to effectively harness this power for implantable devices. Furthermore, we also present methods for optical data transfer from such implants. Simultaneous use of these technologies can result in miniaturized sensing platforms that can allow for large-scale use of such systems in real world applications.

  5. Optical power transfer and communication methods for wireless implantable sensing platforms

    NASA Astrophysics Data System (ADS)

    Mujeeb-U-Rahman, Muhammad; Adalian, Dvin; Chang, Chieh-Feng; Scherer, Axel

    2015-09-01

    Ultrasmall scale implants have recently attracted focus as valuable tools for monitoring both acute and chronic diseases. Semiconductor optical technologies are the key to miniaturizing these devices to the long-sought sub-mm scale, which will enable long-term use of these devices for medical applications. This can also enable the use of multiple implantable devices concurrently to form a true body area network of sensors. We demonstrate optical power transfer techniques and methods to effectively harness this power for implantable devices. Furthermore, we also present methods for optical data transfer from such implants. Simultaneous use of these technologies can result in miniaturized sensing platforms that can allow for large-scale use of such systems in real world applications.

  6. Space-to-Space Power Beaming Enabling High Performance Rapid Geocentric Orbit Transfer

    NASA Technical Reports Server (NTRS)

    Dankanich, John W.; Vassallo, Corinne; Tadge, Megan

    2015-01-01

    The use of electric propulsion is more prevalent than ever, with industry pursuing all-electric orbit transfers. Electric propulsion provides high mass utilization through efficient propellant use. However, the transfer times become detrimental as the delta-V transitions from near-impulsive to low-thrust. Increasing power and therefore thrust has diminishing returns, as the increasing mass of the power system limits the potential acceleration of the spacecraft. By using space-to-space power beaming, the power system can be decoupled from the spacecraft, allowing significantly higher spacecraft alpha (W/kg) and therefore significantly higher accelerations while maintaining high performance. This project assesses the efficacy of space-to-space power beaming to enable rapid orbit transfer while maintaining high mass utilization. Concept assessment requires integrated techniques for low-thrust orbit transfer steering laws, efficient large-scale rectenna systems, and satellite constellation configuration optimization. This project includes the development of an integrated tool with implementation of IPOPT, Q-Law, and power-beaming models. The results highlight the viability of the concept, limits and paths to infusion, and comparison to state-of-the-art capabilities. The results indicate the viability of power beaming, which may be the only approach capable of achieving the desired transit times with high specific impulse.
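
    The diminishing returns described above can be made concrete: with onboard power, thrust grows with power but so does power-system mass, so acceleration saturates, whereas beamed power decouples the two. The following sketch uses assumed values throughout (thruster efficiency, specific impulse, power-system specific power), not the study's actual numbers:

```python
# Initial acceleration of an electric-propulsion spacecraft vs. power level,
# with and without carrying the power system onboard. Illustrative parameters.
G0 = 9.80665           # m/s^2, standard gravity
ISP = 2000.0           # s, specific impulse (assumed)
EFF = 0.6              # thruster efficiency (assumed)
SPECIFIC_POWER = 100.0 # W/kg of the onboard power system (assumed)
M_BUS = 1000.0         # kg, spacecraft mass excluding the power system (assumed)

def accel(power_w, onboard=True):
    """Acceleration (m/s^2): thrust T = 2*eff*P/ve divided by total mass."""
    ve = G0 * ISP
    thrust = 2 * EFF * power_w / ve
    mass = M_BUS + (power_w / SPECIFIC_POWER if onboard else 0.0)
    return thrust / mass

# Onboard power saturates: as P grows, acceleration approaches the ceiling
# 2*eff*sigma/ve set by specific power. Beamed power keeps scaling with P.
for p in (10e3, 100e3, 1000e3):
    print(f"{p/1e3:6.0f} kW  onboard {accel(p):.2e}  beamed {accel(p, onboard=False):.2e}")
```

    At high power the beamed case is an order of magnitude faster-accelerating here, which is the core argument for decoupling the power system from the spacecraft.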

  7. Direct writing of metal nanostructures: lithographic tools for nanoplasmonics research.

    PubMed

    Leggett, Graham J

    2011-03-22

    Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle particle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle-by-particle. Optical methods continue to provide a powerful support for research into metamaterials.

  8. Maximization of the annual energy production of wind power plants by optimization of layout and yaw-based wake control: Maximization of wind plant AEP by optimization of layout and wake control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gebraad, Pieter; Thomas, Jared J.; Ning, Andrew

    This paper presents a wind plant modeling and optimization tool that enables the maximization of wind plant annual energy production (AEP) using yaw-based wake steering control and layout changes. The tool is an extension of a wake engineering model describing the steady-state effects of yaw on wake velocity profiles and power productions of wind turbines in a wind plant. To make predictions of a wind plant's AEP, necessary extensions of the original wake model include coupling it with a detailed rotor model and a control policy for turbine blade pitch and rotor speed. This enables the prediction of power production with wake effects throughout a range of wind speeds. We use the tool to perform an example optimization study on a wind plant based on the Princess Amalia Wind Park. In this case study, combined optimization of layout and wake steering control increases AEP by 5%. The power gains from wake steering control are highest for region 1.5 inflow wind speeds, and they continue to be present to some extent for the above-rated inflow wind speeds. The results show that layout optimization and wake steering are complementary because significant AEP improvements can be achieved with wake steering in a wind plant layout that is already optimized to reduce wake losses.
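
    Engineering wake models of the kind extended in this work build on simple velocity-deficit formulas such as the Jensen (park) model. A minimal sketch with assumed constants (not the paper's model) shows why wake losses, and hence wake steering, matter: turbine power scales with the cube of wind speed, so a modest speed deficit is a large power deficit:

```python
# Jensen (park) top-hat wake model for a turbine directly downstream of another.
# All constants are illustrative assumptions.
CT = 0.8     # thrust coefficient (assumed)
K = 0.05     # wake decay constant (assumed)
D = 120.0    # rotor diameter, m (assumed)

def waked_speed(u_inf, x_downstream):
    """Hub-height wind speed at a rotor x metres directly downstream."""
    a = (1 - (1 - CT) ** 0.5) / 2                       # axial induction factor
    deficit = 2 * a / (1 + 2 * K * x_downstream / D) ** 2
    return u_inf * (1 - deficit)

u0 = 8.0                      # free-stream wind speed, m/s
u1 = waked_speed(u0, 7 * D)   # turbine seven diameters downstream
power_ratio = (u1 / u0) ** 3  # power deficit follows the cube of the speed
print(f"waked speed {u1:.2f} m/s, power ratio {power_ratio:.2f}")
```

    Yaw-based wake steering deflects this deficit region away from the downstream rotor, recovering part of the cubed loss, and layout optimization spaces turbines so fewer rotors sit in a wake at all.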

  9. System technology for laser-assisted milling with tool integrated optics

    NASA Astrophysics Data System (ADS)

    Hermani, Jan-Patrick; Emonts, Michael; Brecher, Christian

    2013-02-01

    High-strength metal alloys and ceramics offer a huge potential for increased efficiency (e.g., in engine components for aerospace or components for gas turbines). However, mass application is still hampered by cost- and time-consuming end-machining due to long processing times and high tool wear. Laser-induced heating shortly before machining can reduce the material strength and improve machinability significantly. The Fraunhofer IPT has developed and successfully realized a new approach for laser-assisted milling with spindle- and tool-integrated, co-rotating optics. The novel optical system inside the tool consists of one deflection prism to position the laser spot in front of the cutting insert and one focusing lens. Using a fiber laser with high beam quality, the laser spot diameter can be precisely adjusted to the chip size. A highly dynamic adaptation of the laser power signal according to the engagement condition of the cutting tool was realized in order not to irradiate already machined workpiece material. During the tool engagement the laser power is controlled in proportion to the current material removal rate, which has to be calculated continuously. The needed geometric values are generated by a CAD/CAM program and converted into a laser power signal by a real-time controller. The developed milling tool with integrated optics and the algorithm for laser power control enable multi-axis laser-assisted machining of complex parts.

  10. Validation of the AVM Blast Computational Modeling and Simulation Tool Set

    DTIC Science & Technology

    2015-08-04

    by-construction" methodology is powerful and would not be possible without high-level design languages to support validation and verification. [1,4...to enable the making of informed design decisions. Enable rapid exploration of the design trade-space for high-fidelity requirements tradeoffs...live-fire tests, the jump height of the target structure is recorded by using either high-speed cameras or a string pot. A simple projectile motion

  11. Flash chemistry: flow chemistry that cannot be done in batch.

    PubMed

    Yoshida, Jun-ichi; Takahashi, Yusuke; Nagaki, Aiichiro

    2013-11-04

    Flash chemistry based on high-resolution reaction time control using flow microreactors enables chemical reactions that cannot be done in batch and serves as a powerful tool for laboratory synthesis of organic compounds and for production in chemical and pharmaceutical industries.

  12. A Tropical Marine Microbial Natural Products Geobibliography as an Example of Desktop Exploration of Current Research Using Web Visualisation Tools

    PubMed Central

    Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A

    2008-01-01

    Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also undergoing great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the hands of every scientist and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article. PMID:19172194
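    Overlays of the kind described here are built from KML placemarks, which Google Earth™ loads directly; a minimal sketch that generates one placemark for a citation record (the coordinates and citation text below are hypothetical):

```python
from xml.sax.saxutils import escape

def citation_placemark(name, lon, lat, description):
    """Return a minimal KML Placemark tying one citation to a site."""
    return (
        "<Placemark>"
        f"<name>{escape(name)}</name>"
        f"<description>{escape(description)}</description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

kml = citation_placemark("Example citation", 146.8, -19.2,
                         "Natural product isolated from a marine sponge")
print(kml)
```

    Wrapping a list of such placemarks in a `<kml><Document>…</Document></kml>` envelope yields a file that the visualisation tools named above can open.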

  13. Deciphering the glycosaminoglycan code with the help of microarrays.

    PubMed

    de Paz, Jose L; Seeberger, Peter H

    2008-07-01

    Carbohydrate microarrays have become a powerful tool to elucidate the biological role of complex sugars. Microarrays are particularly useful for the study of glycosaminoglycans (GAGs), a key class of carbohydrates. The high-throughput chip format enables rapid screening of large numbers of potential GAG sequences produced via a complex biosynthesis while consuming very little sample. Here, we briefly highlight the most recent advances involving GAG microarrays built with synthetic or naturally derived oligosaccharides. These chips are powerful tools for characterizing GAG-protein interactions and determining structure-activity relationships for specific sequences. Thereby, they contribute to decoding the information contained in specific GAG sequences.

  14. Chance-Constrained AC Optimal Power Flow: Reformulations and Efficient Algorithms

    DOE PAGES

    Roald, Line Alnaes; Andersson, Goran

    2017-08-29

    Higher levels of renewable electricity generation increase uncertainty in power system operation. To ensure secure system operation, new tools that account for this uncertainty are required. In this paper, we adopt a chance-constrained AC optimal power flow formulation, which guarantees that generation, power flows and voltages remain within their bounds with a pre-defined probability. We then discuss different chance-constraint reformulations and solution approaches for the problem. We first discuss an analytical reformulation based on partial linearization, which enables us to obtain a tractable representation of the optimization problem. We then provide an efficient algorithm based on an iterative solution scheme which alternates between solving a deterministic AC OPF problem and assessing the impact of uncertainty. This more flexible computational framework enables not only scalable implementations, but also alternative chance-constraint reformulations. In particular, we suggest two sample-based reformulations that do not require any approximation or relaxation of the AC power flow equations.
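    The analytical reformulation idea can be illustrated with the standard uncertainty-margin trick: for a linearized flow with normally distributed forecast error, a chance constraint reduces to a deterministic limit tightened by a z-score times the error's standard deviation. This is a generic sketch of that trick, not the paper's exact formulation:

```python
from math import sqrt, erf

def inv_norm_cdf(p, lo=-10.0, hi=10.0):
    """Inverse standard-normal CDF by bisection (no external libraries)."""
    cdf = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
    for _ in range(80):
        mid = (lo + hi) / 2
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def tightened_limit(limit, sigma, epsilon=0.05):
    """P(flow <= limit) >= 1 - epsilon for a linearized, normally
    distributed flow becomes the deterministic constraint
    flow_forecast <= limit - z_(1-epsilon) * sigma."""
    return limit - inv_norm_cdf(1 - epsilon) * sigma

# A 100 MW line limit, 10 MW forecast-error std dev, 5% violation probability:
margin_limit = tightened_limit(100.0, 10.0, 0.05)
print(round(margin_limit, 1))  # -> 83.6
```

    The deterministic OPF in the paper's iterative scheme would then enforce the tightened limit in place of the physical one.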

  15. Contributions of Traditional Web 1.0 Tools e.g. Email and Web 2.0 Tools e.g. Weblog towards Knowledge Management

    ERIC Educational Resources Information Center

    Dehinbo, Johnson

    2010-01-01

    The use of email utilizes the power of Web 1.0 to enable users to access their email from any computer or mobile device connected to the Internet, making email valuable in acquiring and transferring knowledge. But the advent of Web 2.0 and social networking seems to indicate certain limitations of email. The use of social networking seems…

  16. Western blotting using chemiluminescent substrates.

    PubMed

    Alegria-Schaffer, Alice

    2014-01-01

    Western blotting is a powerful and commonly used tool to identify and quantify a specific protein in a complex mixture (Towbin et al., 1979). The technique enables indirect detection of protein samples immobilized on a nitrocellulose or polyvinylidene fluoride (PVDF) membrane. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Portraits of Learning

    ERIC Educational Resources Information Center

    Hogan, Kevin

    2008-01-01

    This article presents this year's winners of Tech & Learning's student photography contest. Selected from an overwhelming six thousand entries, these images are proof that "the kids today" are not only extremely talented, but also powerfully enabled by digital tools that can help them express and communicate those talents. The theme of the…

  18. Item Response Theory as an Efficient Tool to Describe a Heterogeneous Clinical Rating Scale in De Novo Idiopathic Parkinson's Disease Patients.

    PubMed

    Buatois, Simon; Retout, Sylvie; Frey, Nicolas; Ueckert, Sebastian

    2017-10-01

    This manuscript aims to precisely describe the natural disease progression of Parkinson's disease (PD) patients and to evaluate approaches for increasing the power to detect drug effects. An item response theory (IRT) longitudinal model was built to describe the natural disease progression of 423 de novo PD patients followed for 48 months, while taking into account the heterogeneous nature of the MDS-UPDRS. Clinical trial simulations were then used to compare drug effect detection power from IRT- and sum-of-item-scores-based analyses under different analysis endpoints and drug effects. The IRT longitudinal model accurately describes the evolution of patients with and without PD medications while estimating different progression rates for the subscales. When comparing analysis methods, the IRT-based one consistently provided the highest power. IRT is a powerful tool that captures the heterogeneous nature of the MDS-UPDRS.
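    The building block of any IRT analysis is the item response function; here is a minimal two-parameter logistic (2PL) sketch for dichotomous items, with hypothetical item parameters (the paper's longitudinal MDS-UPDRS model is considerably richer):

```python
import math

def p_endorse(theta, a, b):
    """2PL item response: probability of endorsing an item given latent
    severity theta, item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def expected_score(theta, items):
    """Expected sum score over (a, b) items -- the link between the latent
    trait and the observable total used in sum-of-item-scores analyses."""
    return sum(p_endorse(theta, a, b) for a, b in items)

items = [(1.2, -1.0), (0.8, 0.0), (1.5, 1.0)]  # hypothetical parameters
print(round(expected_score(0.0, items), 3))  # -> 1.451
```

    Because each item carries its own discrimination and difficulty, the model weights informative items more heavily than a plain sum score does, which is one source of the power gain reported above.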

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.

  20. ColorMoves: Optimizing Color's Potential for Exploration and Communication of Data

    NASA Astrophysics Data System (ADS)

    Samsel, F.

    2017-12-01

    Color is the most powerful perceptual channel available for exposing and communicating data. Most visualizations are rendered in one of a handful of common colormaps - the rainbow, cool-warm, heat map and viridis. These maps meet the basic criteria for encoding data - perceptual uniformity and reasonable discriminatory power. However, as the size and complexity of data grow, our need to optimize the potential of color grows with them. The ability to expose greater detail and differentiate between multiple variables becomes ever more important. To meet this need we have created ColorMoves, an interactive colormap construction tool that enables scientists to quickly and easily concentrate contrast in the data ranges of interest. Perceptual research tells us that luminance is the strongest contrast and thus provides the highest degree of perceptual discrimination. However, the most commonly used colormaps contain a limited range of luminance contrast. ColorMoves enables interactive construction of colormaps, letting one distribute the luminance where it is most needed. The interactive interface enables optimal placement of the color scales, and the ability to watch the changes on one's data in real time makes precision adjustment quick and easy. By enabling more precise placement and multiple ranges of luminance, one can construct colormaps with greater discriminatory power, and by selecting from the wide range of color scale hues, scientists can create colormaps intuitive to their subject. ColorMoves is comprised of four main components: a set of 40 color scales; a histogram of the data distribution; a viewing area showing the colormap on your data; and the controls section. The 40 color scales span the spectrum of hues, saturation levels and value distributions. The histogram of the data distribution enables placement of the color scales in precise locations. The viewing area shows the impact of changes on the data in real time.
The controls section enables export of the constructed colormaps for use in tools such as ParaView and Matplotlib. For a clearer understanding of ColorMoves' capabilities we recommend trying it out at SciVisColor.org.
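    Since luminance contrast is the quantity ColorMoves distributes, a short sketch of how it can be measured for a candidate color scale (pure Python, using the standard WCAG/Rec. 709 sRGB relative-luminance formula; the example color scales are hypothetical):

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color (components in 0..1),
    per the WCAG / Rec. 709 formula."""
    def linearize(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def luminance_range(scale):
    """Luminance contrast available across a list of control colors."""
    lums = [relative_luminance(c) for c in scale]
    return max(lums) - min(lums)

# A dark-blue-to-white scale versus a hue-only red/green scale:
blue_white = [(0.0, 0.0, 0.3), (0.5, 0.5, 0.8), (1.0, 1.0, 1.0)]
red_green = [(0.8, 0.2, 0.2), (0.2, 0.8, 0.2)]
print(luminance_range(blue_white) > luminance_range(red_green))  # -> True
```

    The comparison shows why a value-varying scale discriminates better than a hue-only one: it simply spans a much larger luminance range.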

  1. Power System Simulations For The Globalstar2 Mission Using The PowerCap Software

    NASA Astrophysics Data System (ADS)

    Defoug, S.; Pin, R.

    2011-10-01

    The Globalstar system aims to enable customers to communicate all around the world thanks to its constellation of 48 LEO satellites. Thales Alenia Space is in charge of the design and manufacturing of the second generation of the Globalstar satellites. For such a long-duration mission (15 years), and with a payload power consumption that varies continuously, the optimization of the solar arrays and battery has to be consolidated by an accurate power simulation tool. After a general overview of the Globalstar power system and of the PowerCap software, this paper presents the dedicated version elaborated for the Globalstar2 mission, the simulation results and their correlation with the tests.

  2. Performance and Behavioral Outcomes in Technology-Supported Learning: The Role of Interactive Multimedia

    ERIC Educational Resources Information Center

    Passerini, Katia

    2007-01-01

    Understanding the impact of different technological media on the achievement of instructional goals enables a subject matter to be delivered more effectively. Among the various instructional technologies that advance learning, educators and practitioners recurrently identify interactive multimedia as a very powerful tool for instruction and…

  3. Integrating Digital Video Technology in the Classroom

    ERIC Educational Resources Information Center

    Lim, Jon; Pellett, Heidi Henschel; Pellett, Tracy

    2009-01-01

    Digital video technology can be a powerful tool for teaching and learning. It enables students to develop a variety of skills including research, communication, decision-making, problem-solving, and other higher-order critical-thinking skills. In addition, digital video technology has the potential to enrich university classroom curricula, enhance…

  4. Initiating a Programmatic Assessment Report

    ERIC Educational Resources Information Center

    Berkaliev, Zaur; Devi, Shavila; Fasshauer, Gregory E.; Hickernell, Fred J.; Kartal, Ozgul; Li, Xiaofan; McCray, Patrick; Whitney, Stephanie; Zawojewski, Judith S.

    2014-01-01

    In the context of a department of applied mathematics, a program assessment was conducted to assess the departmental goal of enabling undergraduate students to recognize, appreciate, and apply the power of computational tools in solving mathematical problems that cannot be solved by hand, or would require extensive and tedious hand computation. A…

  5. Identifying Secondary-School Students' Difficulties When Reading Visual Representations Displayed in Physics Simulations

    ERIC Educational Resources Information Center

    López, Víctor; Pintó, Roser

    2017-01-01

    Computer simulations are often considered effective educational tools, since their visual and communicative power enable students to better understand physical systems and phenomena. However, previous studies have found that when students read visual representations some reading difficulties can arise, especially when these are complex or dynamic…

  6. Assessing the Accessibility of Online Learning

    ERIC Educational Resources Information Center

    Badge, Joanne L.; Dawson, Emma; Cann, Alan J.; Scott, Jon

    2008-01-01

    A wide range of tools is now available to enable teaching practitioners to create web-based educational materials from PowerPoint presentations, adding a variety of different digital media, such as audio and animation. The pilot study described in this paper compared three different systems for producing multimedia presentations from existing…

  7. Processing MPI Datatypes Outside MPI

    NASA Astrophysics Data System (ADS)

    Ross, Robert; Latham, Robert; Gropp, William; Lusk, Ewing; Thakur, Rajeev

    The MPI datatype functionality provides a powerful tool for describing structured memory and file regions in parallel applications, enabling noncontiguous data to be operated on by MPI communication and I/O routines. However, no facilities are provided by the MPI standard to allow users to efficiently manipulate MPI datatypes in their own codes.

  8. Status of Low Thrust Work at JSC

    NASA Technical Reports Server (NTRS)

    Condon, Gerald L.

    2004-01-01

    High performance low thrust (solar electric, nuclear electric, variable specific impulse magnetoplasma rocket) propulsion offers a significant benefit to NASA missions beyond low Earth orbit. As NASA (e.g., the Prometheus Project) endeavors to develop these propulsion systems and associated power supplies, it becomes necessary to develop a refined trajectory design capability that will allow engineers to develop future robotic and human mission designs that take advantage of this new technology. This ongoing work addresses development of a trajectory design and optimization tool for assessing low thrust (and other types of) trajectories. This work aims to advance the state of the art, enable future NASA missions and science drivers, and enhance education. This presentation provides a summary of the low thrust-related JSC activities under the ISP program and, specifically, a look at a new release of a multi-gravity, multi-spacecraft trajectory optimization tool (Copernicus) along with analysis performed using this tool over the past year.

  9. Cooperative problem solving with personal mobile information tools in hospitals.

    PubMed

    Buchauer, A; Werner, R; Haux, R

    1998-01-01

    Health-care professionals have a broad range of needs for information and cooperation while working at different points of care (e.g., outpatient departments, wards, and functional units such as operating theaters). Patient-related data and medical knowledge have to be widely available to support high-quality patient care. Furthermore, due to the increased specialization of health-care professionals, efficient collaboration is required. Personal mobile information tools have considerable potential to realize almost ubiquitous information and collaboration support. They make it possible to unite the functionality of conventional tools such as paper forms, dictating machines, and pagers in one tool. Moreover, they can extend the support already provided by clinical workstations. An approach is described for the integration of mobile information tools with heterogeneous hospital information systems. This approach includes the identification of functions which should be provided on mobile tools. Major functions are the presentation of medical records and reports, electronic mailing to support interpersonal communication, and the provision of editors for structured clinical documentation. To realize those functions on mobile tools, we propose a document-based client-server architecture that enables mobile information tools to interoperate with existing computer-based application systems. Open application systems and powerful, partially wireless, hospital-wide networks are the prerequisites for the introduction of mobile information tools.

  10. Text mining resources for the life sciences.

    PubMed

    Przybyła, Piotr; Shardlow, Matthew; Aubin, Sophie; Bossy, Robert; Eckart de Castilho, Richard; Piperidis, Stelios; McNaught, John; Ananiadou, Sophia

    2016-01-01

    Text mining is a powerful technology for quickly distilling key information from vast quantities of biomedical literature. However, to harness this power the researcher must be well versed in the availability, suitability, adaptability, interoperability and comparative accuracy of current text mining resources. In this survey, we give an overview of the text mining resources that exist in the life sciences to help researchers, especially those employed in biocuration, to engage with text mining in their own work. We categorize the various resources under three sections: Content Discovery looks at where and how to find biomedical publications for text mining; Knowledge Encoding describes the formats used to represent the different levels of information associated with content that enable text mining, including those formats used to carry such information between processes; Tools and Services gives an overview of workflow management systems that can be used to rapidly configure and compare domain- and task-specific processes, via access to a wide range of pre-built tools. We also provide links to relevant repositories in each section to enable the reader to find resources relevant to their own area of interest. Throughout this work we give a special focus to resources that are interoperable-those that have the crucial ability to share information, enabling smooth integration and reusability. © The Author(s) 2016. Published by Oxford University Press.

  11. Text mining resources for the life sciences

    PubMed Central

    Shardlow, Matthew; Aubin, Sophie; Bossy, Robert; Eckart de Castilho, Richard; Piperidis, Stelios; McNaught, John; Ananiadou, Sophia

    2016-01-01

    Text mining is a powerful technology for quickly distilling key information from vast quantities of biomedical literature. However, to harness this power the researcher must be well versed in the availability, suitability, adaptability, interoperability and comparative accuracy of current text mining resources. In this survey, we give an overview of the text mining resources that exist in the life sciences to help researchers, especially those employed in biocuration, to engage with text mining in their own work. We categorize the various resources under three sections: Content Discovery looks at where and how to find biomedical publications for text mining; Knowledge Encoding describes the formats used to represent the different levels of information associated with content that enable text mining, including those formats used to carry such information between processes; Tools and Services gives an overview of workflow management systems that can be used to rapidly configure and compare domain- and task-specific processes, via access to a wide range of pre-built tools. We also provide links to relevant repositories in each section to enable the reader to find resources relevant to their own area of interest. Throughout this work we give a special focus to resources that are interoperable—those that have the crucial ability to share information, enabling smooth integration and reusability. PMID:27888231

  12. An Integrated Software Package to Enable Predictive Simulation Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang

    The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that integrates HPC applications and a web-based visualization tool based on a middleware framework. This framework can support the data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.

  13. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results.
This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.

  14. Spinoff 2012

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Topics covered include: Water Treatment Technologies Inspire Healthy Beverages; Dietary Formulas Fortify Antioxidant Supplements; Rovers Pave the Way for Hospital Robots; Dry Electrodes Facilitate Remote Health Monitoring; Telescope Innovations Improve Speed, Accuracy of Eye Surgery; Superconductors Enable Lower Cost MRI Systems; Anti-Icing Formulas Prevent Train Delays; Shuttle Repair Tools Automate Vehicle Maintenance; Pressure-Sensitive Paints Advance Rotorcraft Design Testing; Speech Recognition Interfaces Improve Flight Safety; Polymers Advance Heat Management Materials for Vehicles; Wireless Sensors Pinpoint Rotorcraft Troubles; Ultrasonic Detectors Safely Identify Dangerous, Costly Leaks; Detectors Ensure Function, Safety of Aircraft Wiring; Emergency Systems Save Tens of Thousands of Lives; Oxygen Assessments Ensure Safer Medical Devices; Collaborative Platforms Aid Emergency Decision Making; Space-Inspired Trailers Encourage Exploration on Earth; Ultra-Thin Coatings Beautify Art; Spacesuit Materials Add Comfort to Undergarments; Gigapixel Images Connect Sports Teams with Fans; Satellite Maps Deliver More Realistic Gaming; Elemental Scanning Devices Authenticate Works of Art; Microradiometers Reveal Ocean Health, Climate Change; Sensors Enable Plants to Text Message Farmers; Efficient Cells Cut the Cost of Solar Power; Shuttle Topography Data Inform Solar Power Analysis; Photocatalytic Solutions Create Self-Cleaning Surfaces; Concentrators Enhance Solar Power Systems; Innovative Coatings Potentially Lower Facility Maintenance Costs; Simulation Packages Expand Aircraft Design Options; Web Solutions Inspire Cloud Computing Software; Behavior Prediction Tools Strengthen Nanoelectronics; Power Converters Secure Electronics in Harsh Environments; Diagnostics Tools Identify Faults Prior to Failure; Archiving Innovations Preserve Essential Historical Records; Meter Designs Reduce Operation Costs for Industry; Commercial Platforms Allow Affordable Space 
Research; Fiber Optics Deliver Real-Time Structural Monitoring; Camera Systems Rapidly Scan Large Structures; Terahertz Lasers Reveal Information for 3D Images; Thin Films Protect Electronics from Heat and Radiation; Interferometers Sharpen Measurements for Better Telescopes; and Vision Systems Illuminate Industrial Processes.

  15. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    PubMed Central

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050
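    The "singular objective" first-generation tools mentioned above can be illustrated with the simplest codon-usage optimizer: back-translate a protein by always emitting the host's most frequent codon for each residue (a toy sketch; the truncated usage table below is hypothetical, whereas real tools use host-specific tables, e.g. from highly expressed genes):

```python
# Hypothetical codon-frequency table for a few residues (values sum to 1).
CODON_USAGE = {
    "M": {"ATG": 1.00},
    "K": {"AAA": 0.74, "AAG": 0.26},
    "F": {"TTT": 0.58, "TTC": 0.42},
    "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
}

def codon_optimize(protein):
    """Back-translate a protein using the most frequent codon per residue --
    the singular-objective strategy of first-generation design tools."""
    return "".join(max(CODON_USAGE[aa], key=CODON_USAGE[aa].get)
                   for aa in protein)

print(codon_optimize("MKF*"))  # -> ATGAAATTTTAA
```

    Multi-objective tools differ precisely in refusing this greedy choice: they search the codon space to balance usage against restriction sites, secondary structure, and other constraints.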

  16. A fuzzy model for achieving lean attributes for competitive advantages development using AHP-QFD-PROMETHEE

    NASA Astrophysics Data System (ADS)

    Roghanian, E.; Alipour, Mohammad

    2014-06-01

    Lean production has become an integral part of the manufacturing landscape, as its link with superior performance and its ability to provide competitive advantage are well accepted among academics and practitioners. Lean production helps producers overcome the challenges organizations face through powerful tools and enablers. However, most companies face restricted resources (financial, human, time, etc.) in using these enablers and are not capable of implementing all of these techniques. Therefore, identifying and selecting the most appropriate and efficient tools can be a significant challenge for many companies. Hence, this study seeks to combine competitive advantages, lean attributes, and lean enablers to determine the most appropriate enablers for the improvement of lean attributes. Quality function deployment in a fuzzy environment and the house of quality matrix are employed. Throughout the methodology, fuzzy logic is the basis for translating the linguistic judgments required for the relationship and correlation matrices into numerical values. Moreover, for the final ranking of lean enablers, a multi-criteria decision-making method (PROMETHEE) is adopted. Finally, a case study in the automotive industry is presented to illustrate the implementation of the proposed methodology.
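    The final PROMETHEE ranking step can be sketched with the "usual" preference function and net outranking flows (a minimal PROMETHEE II sketch with hypothetical scores and weights; in the paper the weights come out of the fuzzy AHP-QFD stages):

```python
def promethee_net_flows(scores, weights):
    """PROMETHEE II net flows with the 'usual' preference function
    (preference = 1 whenever one alternative strictly beats another).
    scores[i][k]: performance of alternative i on criterion k (maximized)."""
    n = len(scores)

    def pref(i, j):
        # Weighted preference of alternative i over alternative j.
        return sum(w for w, si, sj in zip(weights, scores[i], scores[j])
                   if si > sj)

    flows = []
    for i in range(n):
        plus = sum(pref(i, j) for j in range(n) if j != i) / (n - 1)
        minus = sum(pref(j, i) for j in range(n) if j != i) / (n - 1)
        flows.append(plus - minus)  # net flow: how strongly i outranks
    return flows

# Three hypothetical lean enablers scored on two equally weighted criteria:
flows = promethee_net_flows([[3, 2], [2, 3], [1, 1]], [0.5, 0.5])
print(flows)
```

    The enabler with the highest net flow is ranked first; the dominated third alternative receives a strongly negative flow.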

  17. KETCindy--Collaboration of Cinderella and KETpic Reports on CADGME 2014 Conference Working Group

    ERIC Educational Resources Information Center

    Kaneko, Masataka; Yamashita, Satoshi; Kitahara, Kiyoshi; Maeda, Yoshifumi; Nakamura, Yasuyuki; Kortenkamp, Ulrich; Takato, Setsuo

    2015-01-01

    Dynamic Geometry Software (DGS) is a powerful tool which enables students to move geometric objects interactively. Through experimental simulations with DGS, mathematical facts and background mechanisms are accessible to students. However, especially when those facts and mechanisms are complicated, it is not so easy for some students to record and…

  18. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    ERIC Educational Resources Information Center

    Walters, Charles David

    2017-01-01

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008)…

  19. Unleashing the Power of the 21st Century Community College: Maximizing Labor Market Responsiveness

    ERIC Educational Resources Information Center

    MacAllum, Keith; Yoder, Karla; Poliakoff, Anne Rogers

    2004-01-01

    To help all community colleges unleash their potential for workforce and economic development, the U.S. Department of Education, Office of Vocational and Adult Education sponsored the Community College Labor Market Responsiveness (CCLMR) Initiative. This project sought to develop and disseminate information and tools enabling colleges to keep pace…

  20. 29 CFR 1917.43 - Powered industrial trucks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engagement hidden from the operator, a means shall be provided to enable the operator to determine that the... employees. Employees may be elevated by fork lift trucks only when a platform is secured to the lifting...) if tools or other objects could fall on employees below. (iii) An employee shall be at the truck's...

  1. 29 CFR 1917.43 - Powered industrial trucks.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engagement hidden from the operator, a means shall be provided to enable the operator to determine that the... employees. Employees may be elevated by fork lift trucks only when a platform is secured to the lifting...) if tools or other objects could fall on employees below. (iii) An employee shall be at the truck's...

  2. Scripting for Collaborative Search Computer-Supported Classroom Activities

    ERIC Educational Resources Information Center

    Verdugo, Renato; Barros, Leonardo; Albornoz, Daniela; Nussbaum, Miguel; McFarlane, Angela

    2014-01-01

    Searching online is one of the most powerful resources today's students have for accessing information. Searching in groups is a daily practice across multiple contexts; however, the tools we use for searching online do not enable collaborative practices, and traditional search models consider a single user navigating online alone. This paper…

  3. GIS in Evaluation: Utilizing the Power of Geographic Information Systems to Represent Evaluation Data

    ERIC Educational Resources Information Center

    Azzam, Tarek; Robinson, David

    2013-01-01

    This article provides an introduction to geographic information systems (GIS) and how the technology can be used to enhance evaluation practice. As a tool, GIS enables evaluators to incorporate contextual features (such as accessibility of program sites or community health needs) into evaluation designs and highlights the interactions between…

  4. Ideas for a Teaching Sequence for the Concept of Energy

    ERIC Educational Resources Information Center

    Duit, Reinders; Neumann, Knut

    2014-01-01

    The energy concept is one of the most important ideas for students to understand. Looking at phenomena through the lens of energy provides powerful tools to model, analyse and predict phenomena in the scientific disciplines. The cross-disciplinary nature of the energy concept enables students to look at phenomena from different angles, helping…

  5. IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.

    This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D[TM]). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price-responsive load scenarios.
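    The transmission-distribution coupling IGMS performs can be pictured as a simple exchange loop between a bulk-system model and many feeder models. The sketch below is purely illustrative: the feeder load function and bulk voltage update are toy stand-ins invented here, not FESTIV, MATPOWER, or GridLAB-D.

```python
# Minimal sketch of a transmission-distribution co-simulation exchange
# (illustrative only, not IGMS): the bulk model sends a substation voltage,
# each feeder model returns its load, and the bulk model updates the voltage.

def feeder_load(voltage):
    """Stand-in for a distribution-system model: toy voltage-dependent load (MW)."""
    return 5.0 * voltage ** 2

voltage = 1.0                        # per-unit substation voltage
for step in range(3):                # three co-simulation exchanges
    loads = [feeder_load(voltage) for _ in range(4)]   # four feeder instances
    total = sum(loads)
    voltage = 1.0 - 0.001 * total    # stand-in for a bulk power-flow update
    print(f"step {step}: total load {total:.2f} MW, voltage {voltage:.4f} pu")
```

    In the real system each feeder would be a separate process coupled over MPI; the fixed-point character of the exchange (the loop converges toward a consistent voltage/load pair) is the same.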

  6. Natural Products Synthesis: Enabling Tools to Penetrate Nature’s Secrets of Biogenesis and Biomechanism†

    PubMed Central

    Williams, Robert M.

    2011-01-01

    Selected examples from our laboratory of how synthetic technology platforms developed for the total synthesis of several disparate families of natural products were harnessed to penetrate biomechanistic and/or biosynthetic queries are discussed. Unexpected discoveries of biomechanistic reactivity and/or penetrating the biogenesis of naturally occurring substances were made possible through access to substances available only through chemical synthesis. Hypothesis-driven total synthesis programs are emerging as very useful conceptual templates for penetrating and exploiting the inherent reactivity of biologically active natural substances. In many instances, new enabling synthetic technologies had to be developed. The examples demonstrate the often untapped richness of complex molecule synthesis to provide powerful tools to understand, manipulate and exploit Nature’s vast and creative palette of secondary metabolites. PMID:21438619

  7. HMDs as enablers of situation awareness: the OODA loop and sense-making

    NASA Astrophysics Data System (ADS)

    Melzer, James E.

    2012-06-01

    Helmet-Mounted Displays have been shown to be powerful tools that can unlock the pilot from the interior of the cockpit or the forward line of sight of the Head-Up Display. Imagery that is presented in one of three reference frames can enable the pilots to do their job more effectively while simultaneously decreasing workload. This paper will review key attributes of Situation Awareness, the Observe/Orient/Decide/Act (OODA) Loop and Sensemaking and how HMDs can aid the pilot in achieving these ideal cognitive states.

  8. The role of CORBA in enabling telemedicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.W.

    1997-07-01

    One of the most powerful tools available for telemedicine is a multimedia medical record accessible over a wide area and simultaneously editable by multiple physicians. The ability to do this through an intuitive interface linking multiple distributed data repositories while maintaining full data integrity is a fundamental enabling technology in healthcare. The author discusses the role of distributed object technology using CORBA in providing this capability, including an example of such a system (TeleMed) which can be accessed through the World Wide Web. Issues of security, scalability, data integrity, and usability are emphasized.

  9. High-energy x-ray scattering studies of battery materials

    DOE PAGES

    Glazer, Matthew P. B.; Okasinski, John S.; Almer, Jonathan D.; ...

    2016-06-08

    High-energy x-ray (HEX) scattering is a sensitive and powerful tool to nondestructively probe the atomic and mesoscale structures of battery materials under synthesis and operational conditions. The penetration power of HEXs enables the use of large, practical samples and realistic environments, allowing researchers to explore the inner workings of batteries in both laboratory and commercial formats. This article highlights the capability and versatility of HEX techniques, particularly from synchrotron sources, to elucidate materials synthesis processes and thermal instability mechanisms in situ, to understand (dis)charging mechanisms in operando under a variety of cycling conditions, and to spatially resolve electrode/electrolyte responses to highlight connections between inhomogeneity and performance. Such studies have increased our understanding of the fundamental mechanisms underlying battery performance. Here, by deepening our understanding of the linkages between microstructure and overall performance, HEXs represent a powerful tool for validating existing batteries and shortening battery-development timelines.

  10. High-energy x-ray scattering studies of battery materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glazer, Matthew P. B.; Okasinski, John S.; Almer, Jonathan D.

    High-energy x-ray (HEX) scattering is a sensitive and powerful tool to nondestructively probe the atomic and mesoscale structures of battery materials under synthesis and operational conditions. The penetration power of HEXs enables the use of large, practical samples and realistic environments, allowing researchers to explore the inner workings of batteries in both laboratory and commercial formats. This article highlights the capability and versatility of HEX techniques, particularly from synchrotron sources, to elucidate materials synthesis processes and thermal instability mechanisms in situ, to understand (dis)charging mechanisms in operando under a variety of cycling conditions, and to spatially resolve electrode/electrolyte responses to highlight connections between inhomogeneity and performance. Such studies have increased our understanding of the fundamental mechanisms underlying battery performance. Here, by deepening our understanding of the linkages between microstructure and overall performance, HEXs represent a powerful tool for validating existing batteries and shortening battery-development timelines.

  11. Indentation-Enabled In Situ Mechanical Characterization of Micro/Nanopillars in Electron Microscopes

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Fu, Xidan; Guo, Xiaolei; Liu, Zhiying; Shi, Yan; Zhang, Di

    2018-04-01

    Indentation-enabled micro/nanomechanical characterization of small-scale specimens provides powerful new tools for probing materials properties that were once unattainable by conventional experimental methods. Recent advancement in instrumentation further allows mechanical testing to be carried out in situ in electron microscopes, with high spatial and temporal resolution. This review discusses the recent development of nanoindentation-enabled in situ mechanical testing in electron microscopes, with an emphasis on the study of micro/nanopillars. Focus is given to novel applications beyond simple compressive and tensile testing that have been developed in the past few years, and limitations and possible future research directions in this field are proposed and discussed.

  12. Concepts for laser beam parameter monitoring during industrial mass production

    NASA Astrophysics Data System (ADS)

    Harrop, Nicholas J.; Maerten, Otto; Wolf, Stefan; Kramer, Reinhard

    2017-02-01

    In today's industrial mass production, lasers have become an established tool for a variety of processes. As with any other tool, mechanical or otherwise, the laser and its ancillary components are prone to wear and ageing. Monitoring of these ageing processes at full operating power of an industrial laser is challenging for a range of reasons; the damage threshold of the measurement device itself and cycle time constraints in industrial processing are just two of these challenges. Power measurement, focus spot size or full beam caustic measurements are being implemented in industrial laser systems. The scope of the measurement and the amount of data collected is limited by the above-mentioned cycle time, which in some cases can only be a few seconds. For successful integration of these measurement systems into automated production lines, the devices must be equipped with standardized communication interfaces, enabling a feedback loop from the measurement device to the laser processing systems. If necessary, these measurements can be performed before each cycle. Power is determined with either static or dynamic calorimetry, while camera and scanning systems are used for beam profile analysis. Power levels can be measured from 25 W up to 20 kW, with focus spot sizes between 10 μm and several millimeters. We will show, backed by relevant statistical data, that defects or contamination of the laser beam path can be detected with the applied measurement systems, enabling a quality control chain to prevent process defects.

  13. Making and Missing Connections: Exploring Twitter Chats as a Learning Tool in a Preservice Teacher Education Course

    ERIC Educational Resources Information Center

    Hsieh, Betina

    2017-01-01

    Research on social media use in education indicates that network-based connections can enable powerful teacher learning opportunities. Using a connectivist theoretical framework (Siemens, 2005), this study focuses on secondary teacher candidates (TCs) who completed, archived, and reflected upon 1-hour Twitter chats (N = 39) to explore the promise…

  14. No Kidding - Exploring the Effects of Stories through the Window of Schema Theory

    ERIC Educational Resources Information Center

    Lee, Chwee Beng; Tsai, I-Chun

    2004-01-01

    People encounter stories every day and everywhere. Stories are the oldest and most natural form of sense making (Jonassen, Strobel, & Gottdenker, 2004). They are a powerful tool that enables one to gain knowledge, understand phenomena, remember the unusual, and interact with the people around them. This paper proposes that stories can create an…

  15. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples.

    PubMed

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-07-05

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell.

  16. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  17. Parallel dispatch: a new paradigm of electrical power system dispatch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jun Jason; Wang, Fei-Yue; Wang, Qiang

    Modern power systems are evolving into sociotechnical systems with massive complexity, whose real-time operation and dispatch go beyond human capability. Thus, the need for developing and applying new intelligent power system dispatch tools is of great practical significance. In this paper, we introduce the overall business model of power system dispatch, the top-level design approach of an intelligent dispatch system, and the parallel intelligent technology with its dispatch applications. We expect that a new dispatch paradigm, namely parallel dispatch, can be established by incorporating various intelligent technologies, especially the parallel intelligent technology, to enable secure operation of complex power grids, extend system operators' capabilities, suggest optimal dispatch strategies, and provide decision-making recommendations according to power system operational goals.

  18. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etmektzoglou, A; Mishra, P; Svatos, M

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac.
It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine-readable scripts without programming knowledge. As an open source initiative, it also enables researcher collaboration on future developments. I am a full-time employee at Varian Medical Systems, Palo Alto, California.
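    A parameterized family of trajectories of the kind the abstract describes can be sketched in a few lines. The function below is a hypothetical illustration only: the angle names and ellipse parameterization are assumptions made here, not the tool's actual spreadsheet layout or Developer Mode XML schema.

```python
import math

def ellipse_trajectory(a_deg, b_deg, n_points=36):
    """Hypothetical parameterized path: two linac axes swept along an ellipse.

    a_deg, b_deg: semi-axes of the ellipse in degrees (the two parameters
    that define this whole family of trajectories).
    """
    points = []
    for i in range(n_points):
        t = 2 * math.pi * i / n_points
        axis_1 = a_deg * math.cos(t)   # e.g. a gantry-like angle
        axis_2 = b_deg * math.sin(t)   # e.g. a couch-like angle
        points.append((round(axis_1, 2), round(axis_2, 2)))
    return points

path = ellipse_trajectory(30, 10)
print(path[0])   # start of the ellipse
print(path[9])   # a quarter of the way around
```

    Changing only `a_deg`, `b_deg`, and `n_points` yields an entire family of trajectories, which is the point the abstract makes about spreadsheet parameterization.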

  19. Synthesizing parallel imaging applications using the CAP (computer-aided parallelization) tool

    NASA Astrophysics Data System (ADS)

    Gennart, Benoit A.; Mazzariol, Marc; Messerli, Vincent; Hersch, Roger D.

    1997-12-01

    Imaging applications such as filtering, image transforms and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from the use of parallel processing. However, dedicated parallel computers are expensive and their processing power per node lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task: writing and debugging the application is difficult (deadlocks), programs may not be portable from one parallel architecture to another, and performance often falls short of expectations. In order to facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool, which enables application programmers to specify at a high level of abstraction the flow of data between pipelined-parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP enables efficiently combining parallel storage access routines with sequential image processing operations. This paper shows how processing- and I/O-intensive imaging applications must be implemented to take advantage of parallelism and pipelining between data access and processing. This paper's contribution is (1) to show how such implementations can be compactly specified in CAP, and (2) to demonstrate that CAP-specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP-specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.
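    The core idea, overlapping storage access with processing in a pipeline, can be sketched with standard-library threads. This is a toy stand-in for the concept, not the CAP specification language; the tile functions are invented placeholders.

```python
from concurrent.futures import ThreadPoolExecutor

def read_tile(i):
    """Stand-in for a parallel storage access routine: fetch one image tile."""
    return list(range(i * 4, i * 4 + 4))

def process(tile):
    """Stand-in for a sequential image processing operation on one tile."""
    return sum(tile)

# Reads for later tiles proceed while earlier tiles are being processed,
# which is the pipelining between data access and processing.
with ThreadPoolExecutor(max_workers=2) as pool:
    tiles = pool.map(read_tile, range(8))
    results = list(pool.map(process, tiles))

print(results)
```

    In CAP this flow would be declared at a high level of abstraction rather than coded with explicit executors, but the data-flow structure is the same.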

  20. Social Networking Adapted for Distributed Scientific Collaboration

    NASA Technical Reports Server (NTRS)

    Karimabadi, Homa

    2012-01-01

    Sci-Share is a social networking site with novel, specially designed feature sets to enable simultaneous remote collaboration and sharing of large data sets among scientists. The site will include not only the standard features found on popular consumer-oriented social networking sites such as Facebook and Myspace, but also a number of powerful tools that extend its functionality to a science collaboration site. A Virtual Observatory is a promising technology for making data accessible from various missions and instruments through a Web browser. Sci-Share augments services provided by Virtual Observatories by enabling distributed collaboration and sharing of downloaded and/or processed data among scientists. This will, in turn, increase science returns from NASA missions. Sci-Share also enables better utilization of NASA's high-performance computing resources by providing an easy and central mechanism to access and share large files in users' space or those saved on mass storage. The most common means of remote scientific collaboration today remains the trio of e-mail for electronic communication, FTP for file sharing, and personalized Web sites for dissemination of papers and research results. Each of these tools has well-known limitations. Sci-Share transforms the social networking paradigm into a scientific collaboration environment by offering powerful tools for cooperative discourse and digital content sharing. Sci-Share differentiates itself by serving as an online repository for users' digital content with the following unique features: a) Sharing of any file type, any size, from anywhere; b) Creation of projects and groups for controlled sharing; c) Module for sharing files on HPC (High Performance Computing) sites; d) Universal accessibility of staged files as embedded links on other sites (e.g. Facebook) and tools (e.g. e-mail); e) Drag-and-drop transfer of large files, replacing awkward e-mail attachments (and file size limitations); f) Enterprise-level data and messaging encryption; and g) Easy-to-use intuitive workflow.

  1. Statistical power analysis of cardiovascular safety pharmacology studies in conscious rats.

    PubMed

    Bhatt, Siddhartha; Li, Dingzhou; Flynn, Declan; Wisialowski, Todd; Hemkens, Michelle; Steidl-Nichols, Jill

    2016-01-01

    Cardiovascular (CV) toxicity and related attrition are a major challenge for novel therapeutic entities, and identifying CV liability early is critical for effective derisking. CV safety pharmacology studies in rats are a valuable tool for early investigation of CV risk. Thorough understanding of data analysis techniques and statistical power of these studies is currently lacking and is imperative for enabling sound decision-making. Data from 24 crossover and 12 parallel design CV telemetry rat studies were used for statistical power calculations. Average values of telemetry parameters (heart rate, blood pressure, body temperature, and activity) were logged every 60 s (from 1 h predose to 24 h post-dose) and reduced to 15-min mean values. These data were subsequently binned into super intervals for statistical analysis. A repeated measures analysis of variance was used for statistical analysis of crossover studies and a repeated measures analysis of covariance was used for parallel studies. Statistical power analysis was performed to generate power curves and establish relationships between detectable CV (blood pressure and heart rate) changes and statistical power. Additionally, data from a crossover CV study with phentolamine at 4, 20 and 100 mg/kg are reported as a representative example of data analysis methods. Phentolamine produced a CV profile characteristic of alpha-adrenergic receptor antagonism, evidenced by a dose-dependent decrease in blood pressure and reflex tachycardia. Detectable blood pressure changes at 80% statistical power for crossover studies (n=8) were 4-5 mmHg. For parallel studies (n=8), detectable changes at 80% power were 6-7 mmHg. Detectable heart rate changes for both study designs were 20-22 bpm. Based on our results, the conscious rat CV model is a sensitive tool to detect and mitigate CV risk in early safety studies. Furthermore, these results will enable informed selection of appropriate models and study design for early-stage CV studies.
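    The detectable-change figures quoted above can be approximated with a normal-approximation power calculation. The sketch below is illustrative, not the paper's analysis: the 4 mmHg standard deviation is a hypothetical value chosen here, which happens to land in the same ballpark as the reported 4-5 mmHg (crossover) and 6-7 mmHg (parallel) figures.

```python
from statistics import NormalDist

def detectable_diff(sd, n, paired=True, alpha=0.05, power=0.80):
    """Smallest mean change detectable at the given power (normal approximation).

    sd: SD of within-animal differences (paired) or per-group SD (parallel).
    Crossover designs gain power because each animal is its own control,
    shrinking the standard error relative to a two-group comparison.
    """
    z = NormalDist().inv_cdf
    se = sd / n ** 0.5 if paired else sd * (2 / n) ** 0.5
    return (z(1 - alpha / 2) + z(power)) * se

print(f"crossover (n=8): {detectable_diff(4.0, 8, paired=True):.1f} mmHg")
print(f"parallel  (n=8): {detectable_diff(4.0, 8, paired=False):.1f} mmHg")
```

    The crossover design detects a smaller change for the same n, which is the pattern reported in the study.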

  2. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    PubMed Central

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool. PMID:24147108
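    The ABM-metabolic coupling can be illustrated with a toy loop in which an agent's growth rate is supplied by a stand-in "metabolic model". The functions, rate constants, and threshold below are invented for illustration and are not MatNet code or an actual flux-balance solution.

```python
def growth_rate(oxygen, nitrate):
    """Stand-in for a constraint-based metabolic model: aerobic growth when
    oxygen is available, otherwise anaerobic respiration on nitrate."""
    return 0.5 * oxygen if oxygen > 0.1 else 0.3 * nitrate

def simulate(nitrate, steps=10):
    """Toy agent update loop: each step, the agent queries the 'metabolic
    model' for its growth rate, then consumes local oxygen."""
    oxygen, biomass = 1.0, 1.0
    for _ in range(steps):
        biomass *= 1 + growth_rate(oxygen, nitrate)
        oxygen *= 0.5   # consumption drives the biofilm oxygen-limited
    return biomass

print(simulate(nitrate=0.0))   # oxygen-limited growth plateaus
print(simulate(nitrate=1.0))   # added nitrate sustains anaerobic growth
```

    The qualitative behavior, growth stalling once oxygen is depleted unless nitrate is present, mirrors the hybrid model's prediction described above.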

  3. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    PubMed

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  4. The Internet and managed care: a new wave of innovation.

    PubMed

    Goldsmith, J

    2000-01-01

    Managed care firms have been under siege in the political system and the marketplace for the past few years. The rise of the Internet has brought into being powerful new electronic tools for automating administrative and financial processes in health insurance. These tools may enable new firms or employers to create custom-designed networks connecting their workers and providers, bypassing health plans altogether. Alternatively, health plans may use these tools to create a new consumer-focused business model. While some disintermediation of managed care plans may occur, the barriers to adoption of Internet tools by established plans are quite low. Network computing may provide important leverage for health plans not only to retain their franchises but also to improve their profitability and customer service.

  5. A community practitioner abroad: listening to women in Dailekh, Nepal.

    PubMed

    Nixon, Catherine

    2015-07-01

    Nepal is one of the poorest countries in the world, and has a strongly patriarchal culture. This study reports on methods used to explore women's opportunities in decision-making roles in Dailekh, Nepal. Action-based research was used to support women to identify barriers and to enable them to find solutions which could increase meaningful, practical and genuine representation. Participants were women in nominal positions of leadership in the community and subsequently also men in leadership roles. Focus groups and interviews enabled data to be collected and analysed using participatory and 'rich picture' tools. A five-stage framework approach was used to analyse data. A major theme of 'power' emerged, comprising the supporting themes 'place in society', 'formal power', 'informal power' and 'voice'. These outcomes formed the basis for identifying viable action plans generated by the participants of both genders to promote meaningful involvement of women in community decision making. Women were clear that involving men and women in the actions was key to increasing success.

  6. Use of Transition Modeling to Enable the Computation of Losses for Variable-Speed Power Turbine

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2012-01-01

    To investigate the penalties associated with using a variable speed power turbine (VSPT) in a rotorcraft capable of vertical takeoff and landing, various analysis tools are required. Such analysis tools must be able to model the flow accurately within the operating envelope of VSPT. For power turbines low Reynolds numbers and a wide range of the incidence angles, positive and negative, due to the variation in the shaft speed at relatively fixed corrected flows, characterize this envelope. The flow in the turbine passage is expected to be transitional and separated at high incidence. The turbulence model of Walters and Leylek was implemented in the NASA Glenn-HT code to enable a more accurate analysis of such flows. Two-dimensional heat transfer predictions of flat plate flow and two-dimensional and three-dimensional heat transfer predictions on a turbine blade were performed and reported herein. Heat transfer computations were performed because it is a good marker for transition. The final goal is to be able to compute the aerodynamic losses. Armed with the new transition model, total pressure losses for three-dimensional flow of an Energy Efficient Engine (E3) tip section cascade for a range of incidence angles were computed in anticipation of the experimental data. The results obtained form a loss bucket for the chosen blade.

  7. Co-fuse: a new class discovery analysis tool to identify and prioritize recurrent fusion genes from RNA-sequencing data.

    PubMed

    Paisitkriangkrai, Sakrapee; Quek, Kelly; Nievergall, Eva; Jabbour, Anissa; Zannettino, Andrew; Kok, Chung Hoow

    2018-06-07

    Recurrent oncogenic fusion genes play a critical role in the development of various cancers and diseases and provide, in some cases, excellent therapeutic targets. To date, analysis tools that can identify and compare recurrent fusion genes across multiple samples have not been available to researchers. To address this deficiency, we developed Co-occurrence Fusion (Co-fuse), a new and easy to use software tool that enables biologists to merge RNA-seq information, allowing them to identify recurrent fusion genes, without the need for exhaustive data processing. Notably, Co-fuse is based on pattern mining and statistical analysis which enables the identification of hidden patterns of recurrent fusion genes. In this report, we show that Co-fuse can be used to identify 2 distinct groups within a set of 49 leukemic cell lines based on their recurrent fusion genes: a multiple myeloma (MM) samples-enriched cluster and an acute myeloid leukemia (AML) samples-enriched cluster. Our experimental results further demonstrate that Co-fuse can identify known driver fusion genes (e.g., IGH-MYC, IGH-WHSC1) in MM, when compared to AML samples, indicating the potential of Co-fuse to aid the discovery of yet unknown driver fusion genes through cohort comparisons. Additionally, using a 272 primary glioma sample RNA-seq dataset, Co-fuse was able to validate recurrent fusion genes, further demonstrating the power of this analysis tool to identify recurrent fusion genes. Taken together, Co-fuse is a powerful new analysis tool that can be readily applied to large RNA-seq datasets, and may lead to the discovery of new disease subgroups and potentially new driver genes, for which, targeted therapies could be developed. The Co-fuse R source code is publicly available at https://github.com/sakrapee/co-fuse .
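    At its simplest, identifying recurrent fusion genes across samples reduces to counting co-occurrences of the same gene pair in multiple samples. The sketch below is not the Co-fuse R implementation; the sample names and fusion calls are hypothetical (only the IGH-MYC and IGH-WHSC1 pairs come from the abstract).

```python
from collections import Counter

# Hypothetical per-sample fusion calls (gene pairs from an RNA-seq fusion caller)
samples = {
    "MM_1":  {("IGH", "MYC"), ("IGH", "WHSC1")},
    "MM_2":  {("IGH", "MYC")},
    "AML_1": {("BCR", "ABL1")},
    "AML_2": {("BCR", "ABL1"), ("IGH", "MYC")},
}

# A fusion is "recurrent" if it appears in more than one sample
counts = Counter(fusion for calls in samples.values() for fusion in calls)
recurrent = {fusion: n for fusion, n in counts.items() if n > 1}

for fusion, n in sorted(recurrent.items()):
    print(fusion, n)
```

    Co-fuse layers pattern mining and statistical testing on top of this kind of tally to compare cohorts (e.g., MM- vs. AML-enriched clusters), but the recurrence count is the starting point.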

  8. Methodology for Evaluation of Technology Impacts in Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of advanced technologies to support ongoing advocacy efforts. The Power and Propulsion Office is committed to understanding how advances in space technologies could benefit future NASA missions, and it supports many diverse projects and missions throughout NASA as well as industry and academia. Our area of concentration is space technology investment strategies, and our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework being developed at this stage will be used to set up a computer simulation of a space electric power system (EPS); the expected outcome is a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to select a Monte Carlo-based simulation program for statistical modeling of the EPS model: I learned and evaluated Palisade's @Risk and Risk Optimizer software and utilized its capabilities for the EPS model, and I also evaluated similar software packages available from other suppliers (JMP, SPSS, Crystal Ball, VenSim, Analytica). The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate preliminary cost factors into the model. A final task is developing a cross-platform solution of this framework.
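    The Monte Carlo framework described above can be sketched roughly as follows. The distributions, parameter ranges, and the EPS-mass figure of merit are invented for illustration and are not GRC's actual model; they only show how weighted, uncertain technology characteristics propagate to a probabilistic system-level assessment.

    ```python
    import random
    import statistics

    def simulate_eps_mass(n_trials=10_000, seed=42):
        """Monte Carlo sketch: sample uncertain technology parameters from
        assumed probability distributions and propagate them to a
        system-level figure of merit (EPS mass). All numbers are illustrative."""
        rng = random.Random(seed)
        masses = []
        for _ in range(n_trials):
            array_w_per_kg = rng.triangular(60, 120, 90)   # solar array specific power
            battery_wh_per_kg = rng.uniform(100, 200)      # battery specific energy
            load_w, eclipse_h = 1500, 0.6                  # fixed mission requirements
            mass = load_w / array_w_per_kg + load_w * eclipse_h / battery_wh_per_kg
            masses.append(mass)
        # Mean plus an upper percentile captures the probabilistic assessment.
        return statistics.mean(masses), statistics.quantiles(masses, n=20)[18]  # mean, ~95th pct

    mean_kg, p95_kg = simulate_eps_mass()
    print(f"mean EPS mass ~ {mean_kg:.1f} kg, 95th percentile ~ {p95_kg:.1f} kg")
    ```

    Hard constraints would be applied by rejecting sampled designs that violate them; soft constraints, by penalizing the figure of merit.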

  9. The importance of sound methodology in environmental DNA sampling

    Treesearch

    T. M. Wilcox; K. J. Carim; M. K. Young; K. S. McKelvey; T. W. Franklin; M. K. Schwartz

    2018-01-01

    Environmental DNA (eDNA) sampling - which enables inferences of species’ presence from genetic material in the environment - is a powerful tool for sampling rare fishes. Numerous studies have demonstrated that eDNA sampling generally provides greater probabilities of detection than traditional techniques (e.g., Thomsen et al. 2012; McKelvey et al. 2016; Valentini et al...
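    The detection advantage of replicated sampling follows from a standard occupancy-style probability calculation; a minimal sketch, assuming independent replicates with a fixed per-sample detection probability:

    ```python
    def cumulative_detection(p_per_sample, n_replicates):
        """Probability of at least one detection across independent replicates:
        P = 1 - (1 - p)^n. Illustrates why replicated eDNA sampling boosts
        detection probability for rare taxa."""
        return 1 - (1 - p_per_sample) ** n_replicates

    # With a 40% chance per water sample, three replicates detect ~78% of the time.
    print(round(cumulative_detection(0.4, 3), 3))  # 0.784
    ```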

  10. Nano/microfluidics for diagnosis of infectious diseases in developing countries

    PubMed Central

    Lee, Won Gu; Kim, Yun-Gon; Chung, Bong Geun; Demirci, Utkan; Khademhosseini, Ali

    2010-01-01

    Nano/microfluidic technologies are emerging as powerful enabling tools for diagnosis and monitoring of infectious diseases in both developed and developing countries. Miniaturized nano/microfluidic platforms that precisely manipulate small fluid volumes can be used to enable medical diagnosis in a more rapid and accurate manner. In particular, these nano/microfluidic diagnostic technologies are potentially applicable to global health applications, because they are disposable, inexpensive, portable, and easy-to-use for detection of infectious diseases. In this paper, we review recent developments in nano/microfluidic technologies for clinical point-of-care applications at resource-limited settings in developing countries. PMID:19954755

  11. Research on Resilience of Power Systems Under Natural Disasters—A Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yezhou; Chen, Chen; Wang, Jianhui

    2016-03-01

    Natural disasters can cause large blackouts. Research into natural disaster impacts on electric power systems is emerging to understand the causes of the blackouts, explore ways to prepare and harden the grid, and increase the resilience of the power grid under such events. At the same time, new technologies such as smart grid, microgrid, and wide-area monitoring applications could increase situational awareness as well as enable faster restoration of the system. This paper aims to consolidate and review the progress of the research field towards methods and tools for forecasting natural disaster related power system disturbances, hardening and pre-storm operations, and restoration models. Challenges and future research opportunities are also presented in the paper.

  12. Smart signal processing for an evolving electric grid

    NASA Astrophysics Data System (ADS)

    Silva, Leandro Rodrigues Manso; Duque, Calos Augusto; Ribeiro, Paulo F.

    2015-12-01

    Electric grids are interconnected complex systems consisting of generation, transmission, distribution, and active loads, recently called prosumers as they both produce and consume electric energy. Additionally, these systems encompass a vast array of equipment, such as machines, power transformers, capacitor banks, power electronic devices, and motors, that is continuously evolving in its demand characteristics. Given these conditions, signal processing is becoming an essential assessment tool enabling the engineer and researcher to understand, plan, design, and operate the complex and smart electric grid of the future. This paper focuses on recent developments in signal processing applied to power system analysis in terms of characterization and diagnostics. The following techniques are reviewed and their characteristics and applications discussed: active power system monitoring, sparse representation of power system signals, real-time resampling, and time-frequency (i.e., wavelet) analysis applied to power fluctuations.

  13. Sub-Second Parallel State Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.

    This report describes the performance of the Pacific Northwest National Laboratory (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of fast computational speed for power system applications. The test data, provided by BPA, are two days' worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data were extracted from the commercial tool and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today's commercial tools. This improved computational performance can help increase the reliability value of state estimation in several ways: (1) the shorter the time required to execute state estimation, the more time remains for operators to take appropriate actions and/or to apply automatic or manual corrective control actions, which increases the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance, so the robustness of SE can be enhanced by repeating its execution with adaptive adjustments, including removing bad data and/or adjusting different initial conditions, to compute a better estimate within the same time as a traditional state estimator's single estimate. Sub-second SE offers other benefits as well: PSE results can potentially be used in local and/or wide-area automatic corrective control actions that currently depend on raw measurements, minimizing the impact of bad measurements and providing opportunities to enhance power grid reliability and efficiency.
    PSE can also enable other advanced tools that rely on SE outputs and could be used to further improve operators' actions and automated controls to mitigate the effects of severe events on the grid. The power grid continues to grow, and the number of measurements is increasing at an accelerated rate due to the variety of smart grid devices being introduced. A parallel state estimation implementation will perform better than traditional, sequential state estimation by utilizing the power of high-performance computing (HPC). This increased performance positions parallel state estimators as valuable tools for operating an increasingly complex power grid.
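    The problem being parallelized is, at its core, a weighted least-squares fit of bus states to redundant, noisy measurements. A minimal pure-Python sketch of that fit, for a toy two-state linear (DC) measurement model; this is a generic textbook formulation, not PNNL's solver, and production estimators use sparse matrix methods:

    ```python
    def wls_estimate(H, z, w):
        """Weighted least-squares state estimation for a linear model
        z = H x + e: solve the normal equations (H^T W H) x = H^T W z.
        Two-state version solved by hand via Cramer's rule."""
        # Gain matrix G = H^T W H and right-hand side b = H^T W z
        G = [[sum(w[k] * H[k][i] * H[k][j] for k in range(len(z)))
              for j in range(2)] for i in range(2)]
        b = [sum(w[k] * H[k][i] * z[k] for k in range(len(z))) for i in range(2)]
        det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
        x0 = (b[0] * G[1][1] - b[1] * G[0][1]) / det
        x1 = (G[0][0] * b[1] - G[1][0] * b[0]) / det
        return x0, x1

    # Redundant, noisy measurements of two bus angles (radians); weights = 1/variance.
    H = [[1, 0], [0, 1], [1, -1], [1, 0]]
    z = [0.10, 0.05, 0.052, 0.098]
    w = [100, 100, 50, 100]
    print(wls_estimate(H, z, w))  # close to the true state (0.10, 0.05)
    ```

    The repeated-execution robustness strategy described above amounts to re-running this fit after dropping measurements with large residuals or changing the starting point of the (in general nonlinear) iteration.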

  14. Xray: N-dimensional, labeled arrays for analyzing physical datasets in Python

    NASA Astrophysics Data System (ADS)

    Hoyer, S.

    2015-12-01

    Efficient analysis of geophysical datasets requires tools that both preserve and utilize metadata, and that transparently scale to process large datasets. Xray is such a tool, in the form of an open-source Python library for analyzing the labeled, multi-dimensional array (tensor) datasets that are ubiquitous in the Earth sciences. Xray's approach pairs Python data structures based on the data model of the netCDF file format with the proven design and user interface of pandas, the popular Python data analysis library for labeled tabular data. On top of the NumPy array, xray adds labeled dimensions (e.g., "time") and coordinate values (e.g., "2015-04-10"), which it uses to enable a host of operations powered by these labels: selection, aggregation, alignment, broadcasting, split-apply-combine, interoperability with pandas, and serialization to netCDF/HDF5. Many of these operations are enabled by xray's tight integration with pandas. Finally, to allow for easy parallelism and to enable its labeled data operations to scale to datasets that do not fit into memory, xray integrates with the parallel processing library dask.
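    The label-based selection idea can be illustrated without the library itself; a minimal plain-Python sketch of selecting by coordinate label rather than integer position (the dimension names and data values below are invented):

    ```python
    # A labeled 2-D array: dimension names, coordinate labels, and values.
    data = {
        "dims": ("time", "station"),
        "coords": {"time": ["2015-04-09", "2015-04-10"], "station": ["A", "B"]},
        "values": [[281.0, 279.5], [283.2, 280.1]],  # e.g. temperature in kelvin
    }

    def sel(arr, time):
        """Select a slice by coordinate label, in the spirit of xray's
        .sel(time=...), instead of remembering that "2015-04-10" is row 1."""
        i = arr["coords"]["time"].index(time)
        return dict(zip(arr["coords"]["station"], arr["values"][i]))

    print(sel(data, "2015-04-10"))  # {'A': 283.2, 'B': 280.1}
    ```

    Aggregations like "mean over time" work the same way: the dimension is named, so the code never hard-codes an axis number.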

  15. Real-time development of data acquisition and analysis software for hands-on physiology education in neuroscience: G-PRIME.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The MathWorks, Natick, MA), a team consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer worked with a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment "g-PRIME." The software was developed from week to week in response to curriculum demands and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.

  16. Ultrafast disk technology enables next generation micromachining laser sources

    NASA Astrophysics Data System (ADS)

    Heckl, Oliver H.; Weiler, Sascha; Luzius, Severin; Zawischa, Ivo; Sutter, Dirk

    2013-02-01

    Ultrashort pulsed lasers based on thin-disk technology have entered the 100 W regime and deliver several tens of MW of peak power without chirped pulse amplification. High uptime and insensitivity to back reflections make them ideal tools for efficient and cost-effective industrial micromachining. Frequency-converted versions allow the processing of a large variety of materials. On one hand, thin-disk oscillators deliver more than 30 MW of peak power directly out of the resonator in laboratory setups. These peak power levels are made possible by recent progress in scaling the pulse energy to in excess of 40 μJ. At the corresponding high peak intensity, thin-disk technology profits from the limited amount of material, and hence the manageable nonlinearity, within the resonator. Using new broadband host materials, such as the sesquioxides, will eventually reduce the pulse duration during high-power operation and further increase the peak power. On the other hand, industry-grade amplifier systems deliver even higher peak power levels. At a closed-loop-controlled 100 W, the TruMicro Series 5000 currently offers the highest average ultrafast power in an industry-proven product and enables efficient micromachining of almost any material, in particular glasses, ceramics, and sapphire. Conventional laser cutting of these materials often requires UV laser sources with pulse durations of several nanoseconds and an average power in the 10 W range. Material processing based on high-peak-power laser sources makes use of multi-photon absorption processes. This highly nonlinear absorption enables micromachining driven by the fundamental (1030 nm) or frequency-doubled (515 nm) wavelength of Yb:YAG. Operation in the IR or green spectral range reduces the complexity and running costs of industrial systems initially based on UV light sources.
    Where a UV wavelength is required, the TruMicro 5360, with a specified UV crystal lifetime of more than 10 thousand hours of continuous operation at 15 W, is an excellent choice. Currently this is the world's most powerful industrial sub-10 ps UV laser.

  17. Power-law statistics of neurophysiological processes analyzed using short signals

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Runnova, Anastasiya E.; Pavlov, Alexey N.

    2018-04-01

    We discuss the problem of quantifying power-law statistics of complex processes from short signals. Based on the analysis of electroencephalograms (EEG), we compare three interrelated approaches that enable characterization of the power spectral density (PSD) and show that an application of detrended fluctuation analysis (DFA) or the wavelet-transform modulus maxima (WTMM) method represents a useful way of indirectly characterizing PSD features from short data sets. We conclude that although DFA- and WTMM-based measures can be obtained from the estimated PSD, these tools outperform standard spectral analysis when the analyzed regime must be characterized from a very limited amount of data.
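    As a concrete reference for the fluctuation-based approach, first-order DFA can be sketched in a few lines. This is a generic textbook implementation (integrate, detrend fixed-size windows, fit the log-log slope), not the authors' code; the scales and test signal are illustrative.

    ```python
    import math
    import random

    def dfa_exponent(signal, scales=(4, 8, 16, 32)):
        """First-order DFA: returns the scaling exponent alpha from a
        log-log fit of the fluctuation function F(s) vs. window size s."""
        mean = sum(signal) / len(signal)
        profile, acc = [], 0.0
        for x in signal:                      # integrate the mean-subtracted signal
            acc += x - mean
            profile.append(acc)
        log_s, log_f = [], []
        for s in scales:
            n_win = len(profile) // s
            var_sum = 0.0
            for w in range(n_win):            # linearly detrend each window
                seg = profile[w * s:(w + 1) * s]
                tbar, ybar = (s - 1) / 2, sum(seg) / s
                denom = sum((t - tbar) ** 2 for t in range(s))
                slope = sum((t - tbar) * (y - ybar)
                            for t, y in zip(range(s), seg)) / denom
                var_sum += sum((y - (ybar + slope * (t - tbar))) ** 2
                               for t, y in zip(range(s), seg)) / s
            log_s.append(math.log(s))
            log_f.append(0.5 * math.log(var_sum / n_win))
        sbar, fbar = sum(log_s) / len(log_s), sum(log_f) / len(log_f)
        return (sum((a - sbar) * (b - fbar) for a, b in zip(log_s, log_f))
                / sum((a - sbar) ** 2 for a in log_s))

    random.seed(0)
    white = [random.gauss(0, 1) for _ in range(2048)]
    print(round(dfa_exponent(white), 2))  # white noise should give alpha near 0.5
    ```

    An uncorrelated signal yields alpha near 0.5, while long-range correlated (1/f-like) signals push alpha toward 1; this is the indirect PSD characterization the abstract refers to.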

  18. "We Don't Twitter, We Facebook": An Alternative Pedagogical Space that Enables Critical Practices in Relation to Writing

    ERIC Educational Resources Information Center

    Reid, Jean

    2011-01-01

    This article explores what happens to interpersonal and power dynamics when tutors use closed-group Facebook pages as a social networking tool in their tutorial groups with first and second year Bachelor of Education (BEd) students at the Wits School of Education (WSoE). It argues that this literacy practice creates an alternative pedagogical…

  19. Turbomachinery

    NASA Technical Reports Server (NTRS)

    Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.

    1987-01-01

    The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how time and time-averages are treated, as well as on how space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful, and their numbers are growing at a staggering pace. As a result of this power, a case can be made that information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working up a stairstep hierarchy of mathematical and experimental complexity to build a system of tools which enable one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.

  20. The Python Spectral Analysis Tool (PySAT): A Powerful, Flexible, Preprocessing and Machine Learning Library and Interface

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.

    2017-12-01

    Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. 
The tool is available at https://github.com/USGS-Astrogeology/PySAT_Point_Spectra_GUI. [1] Clegg, S.M., et al. (2017) Spectrochim Acta B. 129, 64-85. [2] Gaddis, L. et al. (2017) 3rd Planetary Data Workshop, #1986. [3] http://scikit-learn.org/ [4] Anderson, R.B., et al. (2017) Spectrochim. Acta B. 129, 49-57.
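    Two of the preprocessing steps listed in the abstract above, normalization and masking, are simple enough to sketch. The wavelengths, intensities, and masked range below are invented examples, and this is not the PySAT implementation:

    ```python
    def normalize_total(spectrum):
        """Total-intensity normalization: scale a spectrum so its channels
        sum to 1, reducing shot-to-shot intensity variation before regression."""
        total = sum(spectrum)
        return [v / total for v in spectrum]

    def mask_channels(wavelengths, spectrum, bad_ranges):
        """Drop detector channels that fall inside known-bad wavelength ranges."""
        keep = [(wl, v) for wl, v in zip(wavelengths, spectrum)
                if not any(lo <= wl <= hi for lo, hi in bad_ranges)]
        return [wl for wl, _ in keep], [v for _, v in keep]

    wl = [240.0, 240.5, 241.0, 241.5]
    spec = [10.0, 30.0, 40.0, 20.0]
    print(normalize_total(spec))                     # [0.1, 0.3, 0.4, 0.2]
    print(mask_channels(wl, spec, [(240.4, 240.6)])) # channel at 240.5 removed
    ```

    After steps like these, the cleaned matrix of spectra is what gets handed to scikit-learn regressors such as PLS or LASSO.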

  1. Frequency Domain Modeling of SAW Devices

    NASA Technical Reports Server (NTRS)

    Wilson, W. C.; Atkinson, G. M.

    2007-01-01

    New SAW sensors for integrated vehicle health monitoring of aerospace vehicles are being investigated. SAW technology is low cost, rugged, lightweight, and extremely low power. However, the lack of design tools for MEMS devices in general, and for Surface Acoustic Wave (SAW) devices specifically, has led to the development of tools that will enable integrated design, modeling, simulation, analysis and automatic layout generation of SAW devices. A frequency domain model has been created. The model is mainly first order, but it includes second order effects from triple transit echoes. This paper presents the model and results from the model for a SAW delay line device.

  2. 100 years of Drosophila research and its impact on vertebrate neuroscience: a history lesson for the future.

    PubMed

    Bellen, Hugo J; Tong, Chao; Tsuda, Hiroshi

    2010-07-01

    Discoveries in fruit flies have greatly contributed to our understanding of neuroscience. The use of an unparalleled wealth of tools, many of which originated between 1910 and 1960, has enabled milestone discoveries in nervous system development and function. Such findings have triggered and guided many research efforts in vertebrate neuroscience. After 100 years, fruit flies continue to be the choice model system for many neuroscientists. The combined use of powerful research tools will ensure that this model organism will continue to lead to key discoveries that will impact vertebrate neuroscience.

  3. [Atomic force microscopy: a tool to analyze the viral cycle].

    PubMed

    Bernaud, Julien; Castelnovo, Martin; Muriaux, Delphine; Faivre-Moskalenko, Cendrine

    2015-05-01

    Each step of the HIV-1 life cycle frequently involves a change in the morphology and/or mechanical properties of the viral particle or core. The atomic force microscope (AFM) constitutes a powerful tool for characterizing these physical changes at the scale of a single virus. Indeed, AFM enables the visualization of viral capsids in a controlled physiological environment and the probing of their mechanical properties by nano-indentation. Finally, AFM force spectroscopy allows the affinities between viral envelope proteins and cell receptors to be characterized at the single-molecule level. © 2015 médecine/sciences – Inserm.

  4. 100 years of Drosophila research and its impact on vertebrate neuroscience: a history lesson for the future

    PubMed Central

    Bellen, Hugo J; Tong, Chao; Tsuda, Hiroshi

    2014-01-01

    Discoveries in fruit flies have greatly contributed to our understanding of neuroscience. The use of an unparalleled wealth of tools, many of which originated between 1910 and 1960, has enabled milestone discoveries in nervous system development and function. Such findings have triggered and guided many research efforts in vertebrate neuroscience. After 100 years, fruit flies continue to be the choice model system for many neuroscientists. The combined use of powerful research tools will ensure that this model organism will continue to lead to key discoveries that will impact vertebrate neuroscience. PMID:20383202

  5. Self-contained microfluidic systems: a review.

    PubMed

    Boyd-Moss, Mitchell; Baratchi, Sara; Di Venere, Martina; Khoshmanesh, Khashayar

    2016-08-16

    Microfluidic systems enable rapid diagnosis, screening and monitoring of diseases and health conditions using small amounts of biological samples and reagents. Despite these remarkable features, conventional microfluidic systems rely on bulky expensive external equipment, which hinders their utility as powerful analysis tools outside of research laboratories. 'Self-contained' microfluidic systems, which contain all necessary components to facilitate a complete assay, have been developed to address this limitation. In this review, we provide an in-depth overview of self-contained microfluidic systems. We categorise these systems based on their operating mechanisms into three major groups: passive, hand-powered and active. Several examples are provided to discuss the structure, capabilities and shortcomings of each group. In particular, we discuss the self-contained microfluidic systems enabled by active mechanisms, due to their unique capability for running multi-step and highly controllable diagnostic assays. Integration of self-contained microfluidic systems with the image acquisition and processing capabilities of smartphones, especially those equipped with accessory optical components, enables highly sensitive and quantitative assays, which are discussed. Finally, the future trends and possible solutions to expand the versatility of self-contained, stand-alone microfluidic platforms are outlined.

  6. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples

    PubMed Central

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-01-01

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell. DOI: http://dx.doi.org/10.7554/eLife.26580.001 PMID:28678007

  7. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results.

    PubMed

    Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu

    2012-06-08

    Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of the R language. Among the few existing software programs that offer a graphical user interface to Bioconductor packages, none has implemented a comprehensive strategy to address the accuracy and reliability issues of microarray data analysis arising from the well-known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn the R language. To enable more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting-edge Bioconductor packages for researchers with no knowledge of the R language. It also enables more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  8. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms require access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters and graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists to solve demanding data analysis problems in IDL that previously required specialized software, and it allows those problems to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution times for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters.
The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
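    The task-farm pattern that TaskDL manages (many independent tasks, no inter-task communication) can be sketched with Python's standard multiprocessing module standing in for the cluster scheduler; `process_frame` is a hypothetical placeholder task, not part of the TaskDL API.

    ```python
    from multiprocessing import Pool

    def process_frame(frame_id):
        """Stand-in for one independent analysis task (e.g. calibrating one
        image frame); tasks in a farm share no state with each other."""
        return frame_id, frame_id ** 2  # placeholder computation

    if __name__ == "__main__":
        # Fan the independent tasks out to a pool of workers and gather results.
        with Pool(processes=4) as pool:
            results = dict(pool.map(process_frame, range(8)))
        print(results[7])  # 49
    ```

    The mpiDL case differs in exactly the property this sketch lacks: tasks that must exchange large amounts of data mid-computation need message passing rather than a farm.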

  9. Quantum machine learning.

    PubMed

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  10. Quantum machine learning

    NASA Astrophysics Data System (ADS)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-01

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  11. Design and Implementation of Real-Time Off-Grid Detection Tool Based on FNET/GridEye

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jiahui; Zhang, Ye; Liu, Yilu

    2014-01-01

    Real-time situational awareness tools are of critical importance to power system operators, especially during emergencies. The availability of electric power has become a linchpin of most post-disaster response efforts, as it is the primary dependency for public- and private-sector services as well as individuals. Knowledge of the scope and extent of facilities impacted, as well as the duration of their dependence on backup power, enables emergency response officials to plan for contingencies and provide better overall response. Based on real-time data acquired by Frequency Disturbance Recorders (FDRs) deployed in the North American power grid, a real-time detection method is proposed. This method monitors critical electrical loads and detects the transition of these loads from an on-grid state, where the loads are fed by the power grid, to an off-grid state, where the loads are fed by an uninterruptible power supply (UPS) or a backup generation system. The details of the proposed detection algorithm are presented, and case studies and off-grid detection scenarios are also provided to verify its effectiveness and robustness. The algorithm has already been implemented based on the Grid Solutions Framework (GSF) and has effectively detected several off-grid situations.
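    A minimal sketch of one plausible detection idea, assuming that off-grid operation shows up as a local frequency measurement diverging from the interconnection-wide reference; the threshold and fleet values are illustrative assumptions, not the published FNET/GridEye algorithm.

    ```python
    import statistics

    def off_grid(local_hz, fleet_hz, threshold_hz=0.05):
        """Flag a monitored load as off-grid when its measured frequency
        diverges from the fleet-wide median by more than a threshold.
        A grid-synchronized load tracks the interconnection frequency;
        a load on a backup generator or UPS inverter generally does not."""
        reference = statistics.median(fleet_hz)
        return abs(local_hz - reference) > threshold_hz

    fleet = [59.98, 59.99, 60.00, 60.01]  # frequencies seen across the grid
    print(off_grid(60.00, fleet))  # False: still synchronized with the grid
    print(off_grid(60.40, fleet))  # True: consistent with islanded backup generation
    ```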

  12. Enabling access to new WHO essential medicines: the case for nicotine replacement therapies

    PubMed Central

    2010-01-01

    Nicotine replacement therapies (NRT) are powerful tools for the successful treatment of nicotine addiction and tobacco use. The medicines are clinically effective, supported by the Framework Convention on Tobacco Control, and are now World Health Organization-approved essential medicines. Enabling global access to NRT remains a challenge given ongoing confusion and misperceptions about their efficacy, cost-effectiveness, and availability with respect to other tobacco control and public health opportunities. In this commentary, we review existing evidence and guidelines to make the case for global access to NRT highlighting the smoker's right to access treatment to sensibly address nicotine addiction. PMID:21092092

  13. A bright future for bioluminescent imaging in viral research

    PubMed Central

    Coleman, Stewart M; McGregor, Alistair

    2015-01-01

Bioluminescence imaging (BLI) has emerged as a powerful tool in the study of animal models of viral disease. BLI enables real-time in vivo study of viral infection, host immune response, and the efficacy of intervention strategies. A substrate-dependent, light-emitting luciferase enzyme, when incorporated into a virus as a reporter gene, enables detection of bioluminescence from infected cells using sensitive charge-coupled device (CCD) camera systems. Advantages of BLI include low background, real-time tracking of infection in the same animal, and a reduction in the number of animals required. Transgenic luciferase-tagged mice enable the use of pre-existing non-tagged viruses in BLI studies. Continued development in luciferase reporter genes, substrates, transgenic animals, and imaging systems will greatly enhance future BLI strategies in viral research. PMID:26413138

  14. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormier, Dallas; Edra, Sherwin; Espinoza, Michael

This project enables utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate operational cost savings for energy marketing for day-ahead generation commitments, real-time operations, load forecasting (at an aggregate system level for day ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  15. An Overview of the Role of Systems Analysis in NASA's Hypersonics Project

    NASA Technical Reports Server (NTRS)

Robinson, Jeffrey S.; Martin, John G.; Bowles, Jeffrey V.; Mehta, Unmeel B.; Snyder, Christopher A.

    2006-01-01

NASA's Aeronautics Research Mission Directorate recently restructured its Vehicle Systems Program, refocusing it towards understanding the fundamental physics that govern flight in all speed regimes. Now called the Fundamental Aeronautics Program, it comprises four new projects: Subsonic Fixed Wing, Subsonic Rotary Wing, Supersonics, and Hypersonics. The Aeronautics Research Mission Directorate has charged the Hypersonics Project with developing a basic understanding of all systems that travel at hypersonic speeds within the Earth's and other planets' atmospheres. This includes both powered and unpowered systems, such as re-entry vehicles and vehicles powered by rocket or airbreathing propulsion that cruise in and accelerate through the atmosphere. The primary objective of the Hypersonics Project is to develop physics-based predictive tools that enable the design, analysis, and optimization of such systems. The Hypersonics Project charges the systems analysis discipline team with providing the decision-making information it needs to properly guide research and technology development. Credible, rapid, and robust multi-disciplinary system analysis processes and design tools are required in order to generate this information. To this end, the principal challenges for the systems analysis team are the introduction of high-fidelity physics into the analysis process and integration into a design environment, quantification of design uncertainty through the use of probabilistic methods, reduction in design cycle time, and the development and implementation of robust processes and tools enabling a wide design space and associated technology assessment capability. This paper discusses the roles and responsibilities of the systems analysis discipline team within the Hypersonics Project as well as the tools, methods, processes, and approach that the team will undertake in order to perform its project-designated functions.

  16. Mobile computing device as tools for college student education: a case on flashcards application

    NASA Astrophysics Data System (ADS)

    Kang, Congying

    2012-04-01

Traditionally, college students have used flash cards as a tool to memorize large bodies of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology have enabled flashcards to be viewed on computers, with tools such as slides and PowerPoint serving as channels for drill and feedback for learners. The current generation of students is more adept with information technology and mobile computing devices; for example, they use their mobile phones much more intensively every day. Trends in using the mobile phone as an educational tool are analyzed, and an educational technology initiative is proposed that uses mobile phone flashcard applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flash cards.

  17. Towards a National Space Weather Predictive Capability

    NASA Astrophysics Data System (ADS)

    Fox, N. J.; Lindstrom, K. L.; Ryschkewitsch, M. G.; Anderson, B. J.; Gjerloev, J. W.; Merkin, V. G.; Kelly, M. A.; Miller, E. S.; Sitnov, M. I.; Ukhorskiy, A. Y.; Erlandson, R. E.; Barnes, R. J.; Paxton, L. J.; Sotirelis, T.; Stephens, G.; Comberiate, J.

    2014-12-01

National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review datasets, tools and models that have resulted from research by scientists at JHU/APL, and examine how they could be applied to support space weather applications in coordination with other community assets and capabilities.

  18. Purity of Vector Vortex Beams through a Birefringent Amplifier

    NASA Astrophysics Data System (ADS)

    Sroor, Hend; Lisa, Nyameko; Naidoo, Darryl; Litvin, Igor; Forbes, Andrew

    2018-04-01

    Creating high-quality vector vortex (VV) beams is possible with a myriad of techniques at low power, and while a few studies have produced such beams at high power, none have considered the impact of amplification on the vector purity. Here we employ tools to study the amplification of VV beams and, in particular, the purity of such modes. We outline a versatile toolbox for such investigations and demonstrate its use in the general case of VV beams through a birefringent gain medium. Intriguingly, we show that it is possible to enhance the purity of such beams during amplification, paving the way for high-brightness VV beams, a requirement for their use in high-power applications such as optical communication and laser-enabled manufacturing.

  19. Lunar South Pole Illumination: Review, Reassessment, and Power System Implications

    NASA Technical Reports Server (NTRS)

    Fincannon, James

    2007-01-01

This paper reviews past analyses and research related to lunar south pole illumination and presents results of independent illumination analyses using an analytical tool and a radar digital elevation model. The analysis tool enables assessment at most locations near the lunar poles for any time and any year. Average illumination fraction, energy storage duration, solar/horizon terrain elevation profiles, and illumination fraction profiles are presented for various highly illuminated sites which have been identified for manned or unmanned operations. The format of the data can be used by power system designers to develop mass-optimized solar and energy storage systems. Data are presented for the worst-case lunar day (a critical power planning bottleneck) as well as three lunar days during lunar south pole winter. The main site under consideration by present lunar mission planners (on the Shackleton crater rim) is shown to have, for the worst-case lunar day, a 0.71 average illumination fraction and 73 to 117 hours of required energy storage (depending on power system type). Linking other sites and including towers at either site are shown not to completely eliminate the need for energy storage.
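The two headline quantities in this record, average illumination fraction and required energy-storage duration, follow directly from an illumination time series at a site: the fraction of time the site is lit, and the longest continuous dark spell the storage system must ride through. A minimal sketch (the sampling interval and boolean representation are assumptions, not the paper's tool):

```python
def illumination_metrics(lit, hours_per_sample=1.0):
    """From a boolean illumination time series at a site, compute the
    average illumination fraction and the longest continuous dark
    period (a proxy for the required energy-storage duration)."""
    fraction = sum(lit) / len(lit)
    longest_dark = current = 0
    for sample in lit:
        if sample:
            current = 0
        else:
            current += 1
            longest_dark = max(longest_dark, current)
    return fraction, longest_dark * hours_per_sample
```

With hourly samples over a full lunar day, `fraction` corresponds to the 0.71 figure quoted for the Shackleton rim site, and the dark-period output is what sizes the 73-117 hour storage requirement (the exact hours depend on power system type, which this sketch does not model).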

  20. Characterization of Lunar Polar Illumination from a Power System Perspective

    NASA Technical Reports Server (NTRS)

    Fincannon, James

    2008-01-01

    This paper presents the results of illumination analyses for the lunar south and north pole regions obtained using an independently developed analytical tool and two types of digital elevation models (DEM). One DEM was based on radar height data from Earth observations of the lunar surface and the other was a combination of the radar data with a separate dataset generated using Clementine spacecraft stereo imagery. The analysis tool enables the assessment of illumination at most locations in the lunar polar regions for any time and any year. Maps are presented for both lunar poles for the worst case winter period (the critical power system design and planning bottleneck) and for the more favorable best case summer period. Average illumination maps are presented to help understand general topographic trends over the regions. Energy storage duration maps are presented to assist in power system design. Average illumination fraction, energy storage duration, solar/horizon terrain elevation profiles and illumination fraction profiles are presented for favorable lunar north and south pole sites which have the potential for manned or unmanned spacecraft operations. The format of the data is oriented for use by power system designers to develop mass optimized solar and energy storage systems.

  1. DNS and Embedded DNS as Tools for Investigating Unsteady Heat Transfer Phenomena in Turbines

    NASA Technical Reports Server (NTRS)

    vonTerzi, Dominic; Bauer, H.-J.

    2010-01-01

DNS is a powerful tool with high potential for investigating unsteady heat transfer and fluid flow phenomena, in particular for cases involving transition to turbulence and/or large coherent structures. DNS of idealized configurations related to turbomachinery components is already possible. For more realistic configurations and the inclusion of more effects, reduction of computational cost is the key issue (e.g., hybrid methods). The approach pursued here is embedded DNS (segregated coupling of DNS with LES and/or RANS). Embedded DNS is an enabling technology for many studies. Pre-transitional heat transfer and trailing-edge cutback film-cooling are good candidates for (embedded) DNS studies.

  2. IMG-ABC: A Knowledge Base To Fuel Discovery of Biosynthetic Gene Clusters and Novel Secondary Metabolites.

    PubMed

    Hadjithomas, Michalis; Chen, I-Min Amy; Chu, Ken; Ratner, Anna; Palaniappan, Krishna; Szeto, Ernest; Huang, Jinghua; Reddy, T B K; Cimermančič, Peter; Fischbach, Michael A; Ivanova, Natalia N; Markowitz, Victor M; Kyrpides, Nikos C; Pati, Amrita

    2015-07-14

    In the discovery of secondary metabolites, analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of computational platforms that enable such a systematic approach on a large scale. In this work, we present IMG-ABC (https://img.jgi.doe.gov/abc), an atlas of biosynthetic gene clusters within the Integrated Microbial Genomes (IMG) system, which is aimed at harnessing the power of "big" genomic data for discovering small molecules. IMG-ABC relies on IMG's comprehensive integrated structural and functional genomic data for the analysis of biosynthetic gene clusters (BCs) and associated secondary metabolites (SMs). SMs and BCs serve as the two main classes of objects in IMG-ABC, each with a rich collection of attributes. A unique feature of IMG-ABC is the incorporation of both experimentally validated and computationally predicted BCs in genomes as well as metagenomes, thus identifying BCs in uncultured populations and rare taxa. We demonstrate the strength of IMG-ABC's focused integrated analysis tools in enabling the exploration of microbial secondary metabolism on a global scale, through the discovery of phenazine-producing clusters for the first time in Alphaproteobacteria. IMG-ABC strives to fill the long-existent void of resources for computational exploration of the secondary metabolism universe; its underlying scalable framework enables traversal of uncovered phylogenetic and chemical structure space, serving as a doorway to a new era in the discovery of novel molecules. IMG-ABC is the largest publicly available database of predicted and experimental biosynthetic gene clusters and the secondary metabolites they produce. The system also includes powerful search and analysis tools that are integrated with IMG's extensive genomic/metagenomic data and analysis tool kits. 
As new research on biosynthetic gene clusters and secondary metabolites is published and more genomes are sequenced, IMG-ABC will continue to expand, with the goal of becoming an essential component of any bioinformatic exploration of the secondary metabolism world. Copyright © 2015 Hadjithomas et al.

  3. Genome-Enabled Molecular Tools for Reductive Dehalogenation

    DTIC Science & Technology

    2011-11-01

Genome-Enabled Molecular Tools for Reductive Dehalogenation - A Shift in Paradigm for Bioremediation. Alfred M. Spormann, Departments of Chemical..., Stanford University. Technical Session No. 3D, C-77.

  4. The IBD interactome: an integrated view of aetiology, pathogenesis and therapy.

    PubMed

    de Souza, Heitor S P; Fiocchi, Claudio; Iliopoulos, Dimitrios

    2017-12-01

    Crohn's disease and ulcerative colitis are prototypical complex diseases characterized by chronic and heterogeneous manifestations, induced by interacting environmental, genomic, microbial and immunological factors. These interactions result in an overwhelming complexity that cannot be tackled by studying the totality of each pathological component (an '-ome') in isolation without consideration of the interaction among all relevant -omes that yield an overall 'network effect'. The outcome of this effect is the 'IBD interactome', defined as a disease network in which dysregulation of individual -omes causes intestinal inflammation mediated by dysfunctional molecular modules. To define the IBD interactome, new concepts and tools are needed to implement a systems approach; an unbiased data-driven integration strategy that reveals key players of the system, pinpoints the central drivers of inflammation and enables development of targeted therapies. Powerful bioinformatics tools able to query and integrate multiple -omes are available, enabling the integration of genomic, epigenomic, transcriptomic, proteomic, metabolomic and microbiome information to build a comprehensive molecular map of IBD. This approach will enable identification of IBD molecular subtypes, correlations with clinical phenotypes and elucidation of the central hubs of the IBD interactome that will aid discovery of compounds that can specifically target the hubs that control the disease.

  5. Development of a smartphone-based pulse oximeter with adaptive SNR/power balancing.

    PubMed

    Phelps, Tom; Haowei Jiang; Hall, Drew A

    2017-07-01

Millions worldwide suffer from diseases that exhibit early warning signs detectable by standard clinical-grade diagnostic tools. Unfortunately, such tools are often prohibitively expensive for the developing world, leading to inadequate healthcare and high mortality rates. To address this problem, a smartphone-based pulse oximeter is presented that interfaces with the phone through the audio jack, enabling point-of-care measurements of heart rate (HR) and oxygen saturation (SpO2). The device is designed to utilize existing phone resources (e.g., the processor, battery, and memory), resulting in a more portable and inexpensive diagnostic tool than standalone equivalents. By adaptively tuning the LED driving signal, the device is less dependent on phone-specific audio jack properties than prior audio jack-based work, making it universally compatible with all smartphones. We demonstrate that the pulse oximeter can adaptively optimize the signal-to-noise ratio (SNR) within the power constraints of a mobile phone (< 10 mW) while maintaining high accuracy (HR error < 3.4% and SpO2 error < 3.7%) against a clinical-grade instrument.
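Pulse oximeters like the one described conventionally estimate SpO2 from the pulsatile (AC) and baseline (DC) components of red and infrared photoplethysmography signals via the "ratio of ratios". A rough sketch of that standard computation follows; the linear calibration `SpO2 = a - b*R` and the peak-to-peak AC estimate are illustrative placeholders, not this device's calibration:

```python
def spo2_ratio_of_ratios(red, ir, a=110.0, b=25.0):
    """Estimate SpO2 (%) from red and infrared PPG samples using the
    classic ratio-of-ratios method. Real devices fit the (a, b)
    calibration empirically against a reference oximeter."""
    def ac_dc(signal):
        dc = sum(signal) / len(signal)        # baseline absorption
        ac = max(signal) - min(signal)        # peak-to-peak pulsatile part
        return ac, dc

    ac_r, dc_r = ac_dc(red)
    ac_ir, dc_ir = ac_dc(ir)
    r = (ac_r / dc_r) / (ac_ir / dc_ir)       # "ratio of ratios"
    return a - b * r
```

In practice the AC component would be extracted by band-pass filtering around the heart-rate band rather than a raw peak-to-peak difference, which is where the paper's adaptive SNR/power balancing matters.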

  6. A Multi-layer, Data-driven Advanced Reasoning Tool for Intelligent Data Mining and Analysis for Smart Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Du, Pengwei; Greitzer, Frank L.

    2012-12-31

This paper presents the multi-layer, data-driven advanced reasoning tool (M-DART), a proof-of-principle decision support tool for improved power system operation. M-DART will cross-correlate and examine different data sources to assess anomalies, infer root causes, and anneal data into actionable information. By performing higher-level reasoning "triage" of diverse data sources, M-DART focuses on early detection of emerging power system events and identifies highest-priority actions for the human decision maker. M-DART represents a significant advancement over today's grid monitoring technologies that apply offline analyses to derive model-based guidelines for online real-time operations and use isolated data processing mechanisms focusing on individual data domains. The development of M-DART will bridge these gaps by reasoning about results obtained from multiple data sources that are enabled by the smart grid infrastructure. This hybrid approach integrates a knowledge base that is trained offline but tuned online to capture model-based relationships while revealing complex causal relationships among data from different domains.

  7. Geographic Visualization of Power-Grid Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R.

    2015-06-18

The visualization enables the simulation analyst to see changes in frequency through time and space. With this technology, the analyst has a bird's-eye view of the frequency at loads and generators as the simulated power system responds to the loss of a generator, spikes in load, and other contingencies. The significance of a contingency to the operation of an electrical power system depends critically on how the resulting transients evolve in time and space. Consequently, these dynamic events can only be understood when seen in their proper geographic context. This understanding is indispensable to engineers working on the next generation of distributed sensing and control systems for the smart grid. By making possible a natural and intuitive presentation of dynamic behavior, our new visualization technology is a situational-awareness tool for power-system engineers.

  8. The Power of CRISPR-Cas9-Induced Genome Editing to Speed Up Plant Breeding

    PubMed Central

    Wang, Wenqin; Le, Hien T. T.

    2016-01-01

    Genome editing with engineered nucleases enabling site-directed sequence modifications bears a great potential for advanced plant breeding and crop protection. Remarkably, the RNA-guided endonuclease technology (RGEN) based on the clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated protein 9 (Cas9) is an extremely powerful and easy tool that revolutionizes both basic research and plant breeding. Here, we review the major technical advances and recent applications of the CRISPR-Cas9 system for manipulation of model and crop plant genomes. We also discuss the future prospects of this technology in molecular plant breeding. PMID:28097123

  9. GEAS Spectroscopy Tools for Authentic Research Investigations in the Classroom

    NASA Astrophysics Data System (ADS)

    Rector, Travis A.; Vogt, Nicole P.

    2018-06-01

Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However, relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed, nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes). Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database. We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).
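One of the measurements the GEAS stellar tool exposes, the equivalent width of a spectral line, is conceptually simple: integrate the normalized flux deficit relative to the fitted continuum over wavelength. A minimal numerical sketch (rectangle-rule integration is chosen for clarity; the GEAS tools' internal method is not specified in the record):

```python
def equivalent_width(wavelengths, flux, continuum):
    """Equivalent width of a spectral line: the integral of
    (1 - F/Fc) d(lambda) over the line region. Positive for
    absorption lines, negative for emission lines."""
    ew = 0.0
    for i in range(len(wavelengths) - 1):
        dlam = wavelengths[i + 1] - wavelengths[i]
        ew += (1.0 - flux[i] / continuum[i]) * dlam
    return ew
```

The result has units of wavelength (e.g., angstroms): it is the width of a rectangle, reaching from the continuum to zero flux, that removes the same amount of light as the actual line profile.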

  10. Generation of a novel next-generation sequencing-based method for the isolation of new human papillomavirus types.

    PubMed

    Brancaccio, Rosario N; Robitaille, Alexis; Dutta, Sankhadeep; Cuenin, Cyrille; Santare, Daiga; Skenders, Girts; Leja, Marcis; Fischer, Nicole; Giuliano, Anna R; Rollison, Dana E; Grundhoff, Adam; Tommasino, Massimo; Gheit, Tarik

    2018-05-07

    With the advent of new molecular tools, the discovery of new papillomaviruses (PVs) has accelerated during the past decade, enabling the expansion of knowledge about the viral populations that inhabit the human body. Human PVs (HPVs) are etiologically linked to benign or malignant lesions of the skin and mucosa. The detection of HPV types can vary widely, depending mainly on the methodology and the quality of the biological sample. Next-generation sequencing is one of the most powerful tools, enabling the discovery of novel viruses in a wide range of biological material. Here, we report a novel protocol for the detection of known and unknown HPV types in human skin and oral gargle samples using improved PCR protocols combined with next-generation sequencing. We identified 105 putative new PV types in addition to 296 known types, thus providing important information about the viral distribution in the oral cavity and skin. Copyright © 2018. Published by Elsevier Inc.

  11. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  12. Development of a recombinase polymerase amplification assay for the detection of pathogenic Leptospira.

    PubMed

    Ahmed, Ahmed; van der Linden, Hans; Hartskeerl, Rudy A

    2014-05-08

    Detection of leptospires based on DNA amplification techniques is essential for the early diagnosis of leptospirosis when anti-Leptospira antibodies are below the detection limit of most serological tests. In middle and low income countries where leptospirosis is endemic, routine implementation of real-time PCR is financially and technically challenging due to the requirement of expensive thermocycler equipment. In this study we report the development and evaluation of a novel isothermal recombinase polymerase amplification assay (RPA) for detection of pathogenic Leptospira based on TwistAmp chemistry. RPA enabled the detection of less than two genome copies per reaction. Retrospective evaluation revealed a high diagnostic accuracy (sensitivity and specificity of 94.7% and 97.7%, respectively) compared to culturing as the reference standard. RPA presents a powerful tool for the early diagnosis of leptospirosis in humans and in animals. Furthermore, it enables the detection of the causative agent in reservoirs and environment, and as such is a valuable adjunct to current tools for surveillance and early outbreak warning.

  13. Development of a Recombinase Polymerase Amplification Assay for the Detection of Pathogenic Leptospira

    PubMed Central

    Ahmed, Ahmed; van der Linden, Hans; Hartskeerl, Rudy A.

    2014-01-01

    Detection of leptospires based on DNA amplification techniques is essential for the early diagnosis of leptospirosis when anti-Leptospira antibodies are below the detection limit of most serological tests. In middle and low income countries where leptospirosis is endemic, routine implementation of real-time PCR is financially and technically challenging due to the requirement of expensive thermocycler equipment. In this study we report the development and evaluation of a novel isothermal recombinase polymerase amplification assay (RPA) for detection of pathogenic Leptospira based on TwistAmp chemistry. RPA enabled the detection of less than two genome copies per reaction. Retrospective evaluation revealed a high diagnostic accuracy (sensitivity and specificity of 94.7% and 97.7%, respectively) compared to culturing as the reference standard. RPA presents a powerful tool for the early diagnosis of leptospirosis in humans and in animals. Furthermore, it enables the detection of the causative agent in reservoirs and environment, and as such is a valuable adjunct to current tools for surveillance and early outbreak warning. PMID:24814943

  14. Approaching Suspicious Substances Safely

    NASA Technical Reports Server (NTRS)

    2004-01-01

A mineral identification tool that was developed for NASA's Mars Rover Technology Development program is now serving as a powerful tool for U.S. law enforcement agencies and military personnel to identify suspicious liquid and solid substances. The tool can measure unknown substances through glass and plastic packaging materials with the RamanProbe™ focused fiber-optic probe. The probe length can be extended up to 200 meters to enable users to analyze potentially dangerous substances at a safe distance. In many cases, the spectrometer and personnel are kept in a safe zone while the probe is positioned next to the sample being analyzed. Being able to identify chemicals in remote locations also saves users time and labor, since otherwise the samples would need to be collected, transported, and prepared prior to measurement in the laboratory.

  15. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acharya, Naresh; Baone, Chaitanya; Veda, Santosh

    2014-12-31

    Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins, proactive real-time control, and improved grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst-case conditions such as summer peak and winter peak days. With widespread deployment of renewable generation, controllable loads, energy storage devices and plug-in hybrid electric vehicles expected in the near future, and with greater integration of cyber infrastructure (communications, computation and control), monitoring and controlling the dynamic performance of the grid in real time will become increasingly important. State-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance on single-processor computers, but simulation still runs several times slower than real time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, expectations have been rising toward more efficient and faster techniques in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when High Performance Computing (HPC) resources were not commonly available.
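    The dynamic behavior being simulated is, at its simplest, governed by the machine swing equation. The sketch below is purely illustrative (it is not the algorithm of any commercial simulator): it integrates a single-machine-infinite-bus swing equation, in per-unit with damping ignored, using a symplectic Euler step. All parameter values are assumed for the example.

```python
import math

def simulate_swing(p_mech=0.8, e=1.05, v=1.0, x=0.5, h=5.0, f0=60.0,
                   dt=0.001, t_end=2.0, delta_offset=0.0):
    """Integrate the classical swing equation
    M * d2(delta)/dt2 = Pm - Pe,  Pe = (E*V/X) * sin(delta),
    for one machine against an infinite bus (per-unit, no damping)."""
    m = 2.0 * h / (2.0 * math.pi * f0)   # inertia constant M = 2H / omega_s
    # start at (or near) the equilibrium angle where Pm = Pe
    delta = math.asin(p_mech * x / (e * v)) + delta_offset
    omega = 0.0                          # rotor speed deviation (rad/s)
    trace = []
    for _ in range(int(t_end / dt)):
        p_elec = e * v / x * math.sin(delta)
        omega += dt * (p_mech - p_elec) / m   # update speed first...
        delta += dt * omega                   # ...then angle (symplectic Euler)
        trace.append(delta)
    return trace

trace = simulate_swing()  # at equilibrium the angle stays constant
```

    A perturbed initial angle (`delta_offset`) produces a bounded oscillation around the equilibrium, which is the kind of trajectory a production simulator must compute for thousands of contingencies faster than real time.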

  16. ITEP: an integrated toolkit for exploration of microbial pan-genomes.

    PubMed

    Benedict, Matthew N; Henriksen, James R; Metcalf, William W; Whitaker, Rachel J; Price, Nathan D

    2014-01-03

    Comparative genomics is a powerful approach for studying variation in physiological traits as well as the evolution and ecology of microorganisms. Recent technological advances have enabled sequencing large numbers of related genomes in a single project, requiring computational tools for their integrated analysis. In particular, accurate annotations and identification of gene presence and absence are critical for understanding and modeling the cellular physiology of newly sequenced genomes. Although many tools are available to compare the gene contents of related genomes, new tools are necessary to enable close examination and curation of protein families from large numbers of closely related organisms, to integrate curation with the analysis of gain and loss, and to generate metabolic networks linking the annotations to observed phenotypes. We have developed ITEP, an Integrated Toolkit for Exploration of microbial Pan-genomes, to curate protein families, compute similarities to externally-defined domains, analyze gene gain and loss, and generate draft metabolic networks from one or more curated reference network reconstructions in groups of related microbial species among which the combination of core and variable genes constitutes their "pan-genomes". The ITEP toolkit consists of: (1) a series of modular command-line scripts for identification, comparison, curation, and analysis of protein families and their distribution across many genomes; (2) a set of Python libraries for programmatic access to the same data; and (3) pre-packaged scripts to perform common analysis workflows on a collection of genomes. ITEP's capabilities include de novo protein family prediction, ortholog detection, analysis of functional domains, identification of core and variable genes and gene regions, sequence alignments and tree generation, annotation curation, and the integration of cross-genome analysis and metabolic networks for study of metabolic network evolution.
ITEP is a powerful, flexible toolkit for generation and curation of protein families. ITEP's modular design allows for straightforward extension as analysis methods and tools evolve. By integrating comparative genomics with the development of draft metabolic networks, ITEP harnesses the power of comparative genomics to build confidence in links between genotype and phenotype and helps disambiguate gene annotations when they are evaluated in both evolutionary and metabolic network contexts.
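    The core/variable partition at the heart of a pan-genome analysis can be illustrated with a toy sketch. The family and genome names below are hypothetical; ITEP's actual pipeline operates on clustered homology results via its command-line scripts, not on hand-built dictionaries.

```python
# Toy pan-genome partition: given protein-family presence across genomes,
# split families into "core" (present in every genome) and "variable"
# (present in only a subset).
presence = {
    "famA": {"genome1", "genome2", "genome3"},
    "famB": {"genome1", "genome3"},
    "famC": {"genome2"},
    "famD": {"genome1", "genome2", "genome3"},
}

genomes = set().union(*presence.values())          # all genomes observed
core = sorted(f for f, g in presence.items() if g == genomes)
variable = sorted(f for f, g in presence.items() if g != genomes)
```

    Here `core` comes out as the families shared by all three genomes and `variable` as the remainder; the same set logic underlies core/variable gene identification at scale.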

  17. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  18. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110

  19. Stratified Charge Rotary Engine Critical Technology Enablement, Volume 1

    NASA Technical Reports Server (NTRS)

    Irion, C. E.; Mount, R. E.

    1992-01-01

    This report summarizes results of a critical technology enablement effort with the stratified charge rotary engine (SCRE), focusing on a power section of 0.67 liters (40 cu. in.) per rotor in single- and two-rotor versions. The work is a continuation of prior NASA contracts NAS3-23056 and NAS3-24628. Technical objectives are multi-fuel capability, including civil and military jet fuel and DF-2; fuel efficiency of 0.355 lb/BHP-hr at best cruise condition above 50 percent power; altitude capability of up to 10 km (33,000 ft.) cruise; 2000-hour TBO; and reduced coolant heat rejection. Critical technologies for SCREs that have the potential for competitive performance and cost in a representative light-aircraft environment were examined. Objectives were the development and utilization of advanced analytical tools (i.e., higher-speed and enhanced three-dimensional combustion modeling); identification of critical technologies; development of improved instrumentation; and isolation and quantitative identification of the contribution to performance and efficiency of critical components or subsystems.

  20. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be included directly in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
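    Step (c) can be sketched as follows. The lognormal fragility form is a common convention in performance-based earthquake engineering rather than this paper's specific formulation, and the median capacity, dispersion `beta`, and demand values are all hypothetical.

```python
import math
import random

def damage_probability(demands, median_capacity, beta, trials=20000, seed=1):
    """Monte Carlo estimate of P(damage): each trial draws a component
    capacity from a lognormal fragility (median, log-std beta) and a demand
    from the response-history sample; damage occurs when demand >= capacity."""
    rng = random.Random(seed)
    damaged = 0
    for _ in range(trials):
        capacity = math.exp(math.log(median_capacity) + beta * rng.gauss(0, 1))
        demand = rng.choice(demands)
        damaged += demand >= capacity
    return damaged / trials

# hypothetical peak component demands (g) from nonlinear response-history runs
demands = [0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44, 0.58]
p = damage_probability(demands, median_capacity=0.5, beta=0.3)
```

    Because demands are resampled from the same response-history set for correlated components, the sampling scheme can preserve response correlation, which is the advantage the abstract highlights.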

  1. Photochemically and Thermally Driven Full-Color Reflection in a Self-Organized Helical Superstructure Enabled by a Halogen-Bonded Chiral Molecular Switch.

    PubMed

    Wang, Hao; Bisoyi, Hari Krishna; Wang, Ling; Urbas, Augustine M; Bunning, Timothy J; Li, Quan

    2018-02-05

    Supramolecular approaches toward the fabrication of functional materials and systems have been an enabling endeavor. Recently, halogen bonding has been harnessed as a promising supramolecular tool. Herein we report the synthesis and characterization of a novel halogen-bonded light-driven axially chiral molecular switch. The photoactive halogen-bonded chiral switch is able to induce a self-organized, tunable helical superstructure, that is, cholesteric liquid crystal (CLC), when doped into an achiral liquid crystal (LC) host. The halogen-bonded switch as a chiral dopant has a high helical twisting power (HTP) and shows a large change of its HTP upon photoisomerization. This light-driven dynamic modulation enables reversible selective reflection color tuning across the entire visible spectrum. The chiral switch also displays a temperature-dependent HTP change that enables thermally driven red, green, and blue (RGB) reflection colors in the self-organized helical superstructure. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A wireless remote high-power laser device for optogenetic experiments

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Gong, Q.; Li, Y. Y.; Li, A. Z.; Zhang, Y. G.; Cao, C. F.; Xu, H. X.; Cui, J.; Gao, J. J.

    2015-04-01

    Optogenetics affords the ability to stimulate genetically targeted neurons in a relatively innocuous manner. Reliable and targetable tools have enabled versatile new classes of investigation in the study of neural systems. However, current hardware systems are generally limited to acute measurements or require external tethering of the system to the light source. Here we provide a low-cost, high-power, remotely controlled blue laser diode (LD) stimulator for the application of optogenetics in neuroscience, focusing on wearable and intelligent devices, which can be carried by monkeys, rats and any other animals under study. Compared with the conventional light emitting diode (LED) device, this LD stimulator has higher efficiency, output power, and stability. Our system is fully wirelessly controlled and suitable for experiments with a large number of animals.

  3. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.

  4. State Policies to Transform Struggling Schools: How Various State Policies Can Be Used to Enable School & District Turnaround. Meeting the Turnaround Challenge: Strategies, Resources & Tools to Transform a Framework into Practice

    ERIC Educational Resources Information Center

    Mass Insight Education (NJ1), 2009

    2009-01-01

    State governments wield significant authority in the management of public schools. As a nexus for federal funding, state funding, and regulatory authority, states have both the legal and financial power to help drive school change. The "No Child Left Behind Act" has required each state to create a system of standards-based assessment and…

  5. PLEXOS Input Data Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data that is in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.
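    The kind of normalization PIDG performs can be illustrated with a minimal sketch. The input schema and the PLEXOS class/property names below are assumptions for the example; the real tool handles several source formats and writes an Excel workbook rather than an in-memory list.

```python
import csv
import io

# Hypothetical generalized input: one generator per row.
raw = """name,max_capacity_mw,fuel
GenA,500,coal
GenB,150,gas
"""

# Flatten each record into (class, object, property, value) rows, the
# long-format shape an import sheet uses (names here are illustrative).
rows = []
for rec in csv.DictReader(io.StringIO(raw)):
    rows.append(("Generator", rec["name"], "Max Capacity",
                 float(rec["max_capacity_mw"])))
    rows.append(("Generator", rec["name"], "Fuel", rec["fuel"]))
```

    Versioning the generalized inputs (the CSV) rather than the generated workbook is what makes collaboration and automated regeneration practical.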

  6. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Barbak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED: in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission.

  7. Synchrotron Radiation Sheds Fresh Light on Plant Research: The Use of Powerful Techniques to Probe Structure and Composition of Plants.

    PubMed

    Vijayan, Permual; Willick, Ian R; Lahlali, Rachid; Karunakaran, Chithra; Tanino, Karen K

    2015-07-01

    While synchrotron radiation is a powerful tool in material and biomedical sciences, it is still underutilized in plant research. This mini review attempts to introduce the potential of synchrotron-based spectroscopic and imaging methods and their applications to plant sciences. Synchrotron-based Fourier transform infrared spectroscopy, X-ray absorption and fluorescence techniques, and two- and three-dimensional imaging techniques are examined. We also discuss the limitations of synchrotron-based research in plant sciences, specifically the types of plant samples that can be used. Despite limitations, the unique features of synchrotron radiation such as high brightness, polarization and pulse properties offer great advantages over conventional spectroscopic and imaging tools and enable the correlation of the structure and chemical composition of plants with biochemical function. Modern detector technologies and experimental methodologies are thus enabling plant scientists to investigate aspects of plant sciences such as ultrafast kinetics of biochemical reactions, mineral uptake, transport and accumulation, and dynamics of cell wall structure and composition during environmental stress in unprecedented ways using synchrotron beamlines. The potential for the automation of some of these synchrotron technologies and their application to plant phenotyping is also discussed. © The Author 2015. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  8. The auxin-inducible degradation (AID) system enables versatile conditional protein depletion in C. elegans

    PubMed Central

    Zhang, Liangyu; Ward, Jordan D.; Cheng, Ze; Dernburg, Abby F.

    2015-01-01

    Experimental manipulation of protein abundance in living cells or organisms is an essential strategy for investigation of biological regulatory mechanisms. Whereas powerful techniques for protein expression have been developed in Caenorhabditis elegans, existing tools for conditional disruption of protein function are far more limited. To address this, we have adapted the auxin-inducible degradation (AID) system discovered in plants to enable conditional protein depletion in C. elegans. We report that expression of a modified Arabidopsis TIR1 F-box protein mediates robust auxin-dependent depletion of degron-tagged targets. We document the effectiveness of this system for depletion of nuclear and cytoplasmic proteins in diverse somatic and germline tissues throughout development. Target proteins were depleted in as little as 20-30 min, and their expression could be re-established upon auxin removal. We have engineered strains expressing TIR1 under the control of various promoter and 3′ UTR sequences to drive tissue-specific or temporally regulated expression. The degron tag can be efficiently introduced by CRISPR/Cas9-based genome editing. We have harnessed this system to explore the roles of dynamically expressed nuclear hormone receptors in molting, and to analyze meiosis-specific roles for proteins required for germ line proliferation. Together, our results demonstrate that the AID system provides a powerful new tool for spatiotemporal regulation and analysis of protein function in a metazoan model organism. PMID:26552885

  9. Novel Power Electronics Three-Dimensional Heat Exchanger: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennion, K.; Cousineau, J.; Lustbader, J.

    2014-08-01

    Electric drive systems for vehicle propulsion enable technologies critical to meeting challenges for energy, environmental, and economic security. Enabling cost-effective electric drive systems requires reductions in inverter power semiconductor area. As critical components of the electric drive system are made smaller, heat removal becomes an increasing challenge. In this paper, we demonstrate an integrated approach to the design of thermal management systems for power semiconductors that matches the passive thermal resistance of the packaging with the active convective cooling performance of the heat exchanger. The heat exchanger concept builds on existing semiconductor thermal management improvements described in the literature and patents, which include improved bonded interface materials, direct cooling of the semiconductor packages, and double-sided cooling. The key difference in the described concept is the achievement of high heat transfer performance with less aggressive cooling techniques by optimizing the passive and active heat transfer paths. An extruded aluminum design was selected because of its lower tooling cost, higher performance, and scalability in comparison to cast aluminum. Results demonstrated a heat flux improvement of a factor of two and a package heat density improvement of over 30%, which achieved the thermal performance targets.

  10. Parametric investigations of plasma characteristics in a remote inductively coupled plasma system

    NASA Astrophysics Data System (ADS)

    Shukla, Prasoon; Roy, Abhra; Jain, Kunal; Bhoj, Ananth

    2016-09-01

    Designing a remote plasma system involves source chamber sizing, selection of coils and/or electrodes to power the plasma, designing the downstream tubes, selection of materials used in the source and downstream regions, locations of inlets and outlets, and finally optimizing the process parameter space of pressure, gas flow rates and power delivery. Simulations can aid in spatial and temporal plasma characterization in what are often inaccessible locations for experimental probes in the source chamber. In this paper, we report on simulations of a remote inductively coupled argon plasma system using the modeling platform CFD-ACE+. The coupled multiphysics model successfully addresses flow, chemistry, electromagnetics, heat transfer and plasma transport in the remote plasma system. The SimManager tool enables easy setup of parametric simulations to investigate the effect of varying the pressure, power, frequency, flow rates and downstream tube lengths. It can also vary these parameters automatically to optimize a user-defined objective function, such as the integral ion and radical fluxes at the wafer. The fast run time, coupled with the parametric and optimization capabilities, can add significant insight and value in design and optimization.
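    The parametric-sweep idea can be sketched generically. The "flux" model below is purely illustrative and stands in for a full multiphysics run; the grid values and the peak location are assumptions, not plasma physics.

```python
import itertools

# Toy parametric sweep: evaluate a user-defined objective over a grid of
# (pressure, power) operating points and keep the best one. In practice
# each evaluation would be a full simulation, not a closed-form expression.
def flux(pressure_mtorr, power_w):
    # illustrative surrogate: peaks near 20 mTorr, grows sub-linearly in power
    return power_w ** 0.5 / (1.0 + ((pressure_mtorr - 20.0) / 10.0) ** 2)

grid = itertools.product([5, 10, 20, 40], [300, 600, 900])
best = max(grid, key=lambda point: flux(*point))  # -> (20, 900)
```

    A real optimizer would replace the exhaustive grid with a smarter search, but the structure (parameter space, objective, argmax) is the same.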

  11. A Data-Driven Approach to Interactive Visualization of Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Jun

    Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools that assist grid operators in mission-critical tasks and enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving business practices in today's electric power industry. The investigation conducted here, however, revealed that existing commercial power grid visualization tools rely heavily on human designers, hindering users' ability to discover. Additionally, for a large grid it is very labor-intensive and costly to build and maintain pre-designed visual displays. This project proposes a data-driven approach to overcome these common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on the fly. It does not require a visualization design stage, completely eliminating or significantly reducing the cost of building and maintaining visual displays. The research and development (R&D) conducted in this project was divided into two phases. The first phase (Phases I and II) focused on developing data-driven techniques for visualization of the power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams and fuzzy-model-based rich data visualization for situational awareness. The R&D conducted during the second phase (Phase IIB) focused on enhancing the prototyped data-driven visualization tool based on the gathered requirements and use cases. The goal is to evolve the prototyped tool developed during the first phase into a commercial-grade product. We use one of the identified application areas as an example to demonstrate how research results achieved in this project have been successfully utilized to address an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven promising for building the next-generation power grid visualization tools. Application of this approach has resulted in a state-of-the-art commercial tool currently leveraged by more than 60 utility organizations in North America and Europe.

  12. Beyond information access: Support for complex cognitive activities in public health informatics tools.

    PubMed

    Sedig, Kamran; Parsons, Paul; Dittmer, Mark; Ola, Oluwakemi

    2012-01-01

    Public health professionals work with a variety of information sources to carry out their everyday activities. In recent years, interactive computational tools have become deeply embedded in such activities. Unlike the early days of computational tool use, the potential of tools nowadays is not limited to simply providing access to information; rather, they can act as powerful mediators of human-information discourse, enabling rich interaction with public health information. If public health informatics tools are designed and used properly, they can facilitate, enhance, and support the performance of complex cognitive activities that are essential to public health informatics, such as problem solving, forecasting, sense-making, and planning. However, the effective design and evaluation of public health informatics tools requires an understanding of the cognitive and perceptual issues pertaining to how humans work and think with information to perform such activities. This paper draws on research that has examined some of the relevant issues, including interaction design, complex cognition, and visual representations, to offer some human-centered design and evaluation considerations for public health informatics tools.

  13. Rendering the Intractable More Tractable: Tools from Caenorhabditis elegans Ripe for Import into Parasitic Nematodes

    PubMed Central

    Ward, Jordan D.

    2015-01-01

    Recent and rapid advances in genetic and molecular tools have brought spectacular tractability to Caenorhabditis elegans, a model that was initially prized because of its simple design and ease of imaging. C. elegans has long been a powerful model in biomedical research, and tools such as RNAi and the CRISPR/Cas9 system allow facile knockdown of genes and genome editing, respectively. These developments have created an additional opportunity to tackle one of the most debilitating burdens on global health and food security: parasitic nematodes. I review how development of nonparasitic nematodes as genetic models informs efforts to import tools into parasitic nematodes. Current tools in three commonly studied parasites (Strongyloides spp., Brugia malayi, and Ascaris suum) are described, as are tools from C. elegans that are ripe for adaptation and the benefits and barriers to doing so. These tools will enable dissection of a huge array of questions that have been all but completely impenetrable to date, allowing investigation into host–parasite and parasite–vector interactions, and the genetic basis of parasitism. PMID:26644478

  14. Dynamically variable negative stiffness structures.

    PubMed

    Churchill, Christopher B; Shahan, David W; Smith, Sloan P; Keefe, Andrew C; McKnight, Geoffrey P

    2016-02-01

    Variable stiffness structures that enable a wide range of efficient load-bearing and dexterous activity are ubiquitous in mammalian musculoskeletal systems but are rare in engineered systems because of their complexity, power, and cost. We present a new negative stiffness-based load-bearing structure with dynamically tunable stiffness. Negative stiffness, traditionally used to achieve novel response from passive structures, is a powerful tool to achieve dynamic stiffness changes when configured with an active component. Using relatively simple hardware and low-power, low-frequency actuation, we show an assembly capable of fast (<10 ms) and useful (>100×) dynamic stiffness control. This approach mitigates limitations of conventional tunable stiffness structures that exhibit either small (<30%) stiffness change, high friction, poor load/torque transmission at low stiffness, or high power active control at the frequencies of interest. We experimentally demonstrate actively tunable vibration isolation and stiffness tuning independent of supported loads, enhancing applications such as humanoid robotic limbs and lightweight adaptive vibration isolators.
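    The reported >100× stiffness change follows directly from placing a tunable negative-stiffness element in parallel with a conventional positive spring, so the stiffnesses add. The numbers below are illustrative only, not the authors' hardware parameters.

```python
# Parallel combination of a fixed positive spring with a tunable
# negative-stiffness element: k_total = k_pos + k_neg (stiffnesses in
# parallel add). Values are illustrative.
k_pos = 100.0  # N/mm, fixed load-bearing spring

def total_stiffness(k_neg):
    return k_pos + k_neg

stiff = total_stiffness(0.0)    # negative element disengaged -> 100 N/mm
soft = total_stiffness(-99.0)   # engaged -> 1 N/mm
ratio = stiff / soft            # a 100x dynamic stiffness change
```

    Because the load path stays through the positive spring, the assembly keeps carrying its static load even in the soft state, which is what distinguishes this approach from simply weakening the structure.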

  15. The power and benefits of concept mapping: measuring use, usefulness, ease of use, and satisfaction

    NASA Astrophysics Data System (ADS)

    Freeman, Lee A.; Jessup, Leonard M.

    2004-02-01

    The power and benefits of concept mapping rest in four arenas: enabling shared understanding, the inclusion of affect, the balance of power, and client involvement. Concept mapping theory and research indicate concept maps (1) are appropriate tools to assist with communication, (2) are easy to use, and (3) are seen as beneficial by their users. An experiment was conducted to test these assertions and analyze the power and benefits of concept mapping using a typical business consulting scenario involving 16 groups of two individuals. The results were analyzed via empirical hypothesis testing and protocol analyses, and indicate an overall support of the theory and prior research and additional support of new measures of usefulness, ease of use, and satisfaction by both parties. A more thorough understanding of concept mapping is gained and available to future practitioners and researchers.

  16. THE ROLE AND PROMOTION OF NURSING.

    PubMed

    Benceković, Željka; Benko, Ivica; Režek, Biserka; Grgas-Bile, Cecilija

    2016-06-01

    Nurses have great influence on the health care system. As they face many different circumstances nowadays, it is necessary to find ways to promote and strengthen their profession. The significant interaction between nurses, patients and other medical professions makes them a powerful marketing tool of a medical institution. Therefore, the promotion of their image is necessary in order to affirm their professional status. There are numerous ways of creating and improving this image, some of which include the implementation of marketing principles and the use of the Internet. This paper presents the web page promotion of nursing, i.e. the Nursing and Related Professions homepage of the Sestre milosrdnice University Hospital Center, as a tool of nursing promotion.

  17. Microsystems, Space Qualified Electronics and Mobile Sensor Platforms for Harsh Environment Applications and Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.; Okojie, Robert S.; Krasowski, Michael J.; Beheim, Glenn M.; Fralick, Gustave C.; Wrbanek, John D.; Greenberg, Paul S.; Xu, Jennifer

    2007-01-01

    NASA Glenn Research Center is presently developing and applying a range of sensor and electronic technologies that can enable future planetary missions. These include space qualified instruments and electronics, high temperature sensors for Venus missions, mobile sensor platforms, and Microsystems for detection of a range of chemical species and particulates. A discussion of each technology area and its level of maturity is given. It is concluded that there is a strong need for low power devices which can be mobile and provide substantial characterization of the planetary environment where and when needed. While a given mission will require tailoring of the technology for the application, basic tools which can enable new planetary missions are being developed.

  18. Laboratory Astrophysics: Enabling Scientific Discovery and Understanding

    NASA Technical Reports Server (NTRS)

    Kirby, K.

    2006-01-01

    NASA's Science Strategic Roadmap for Universe Exploration lays out a series of science objectives on a grand scale and discusses the various missions, over a wide range of wavelengths, that will enable discovery. Astronomical spectroscopy is arguably the most powerful tool we have for exploring the Universe. Experimental and theoretical studies in Laboratory Astrophysics convert "hard-won data into scientific understanding". However, the development of instruments with increasingly high spectroscopic resolution demands atomic and molecular data of unprecedented accuracy and completeness. How to meet these needs in a time of severe budgetary constraints poses a significant challenge to NASA, to astronomical observers and model-builders, and to the laboratory astrophysics community. I will discuss these issues, together with some recent examples of productive astronomy/lab-astro collaborations.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.; McCorkle, D.; Yang, C.

    Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.

  20. Rapid Particle Patterning in Surface Deposited Micro-Droplets of Low Ionic Content via Low-Voltage Electrochemistry and Electrokinetics

    PubMed Central

    Sidelman, Noam; Cohen, Moshik; Kolbe, Anke; Zalevsky, Zeev; Herrman, Andreas; Richter, Shachar

    2015-01-01

    Electrokinetic phenomena are a powerful tool used in various scientific and technological applications for the manipulation of aqueous solutions and the chemical entities within them. However, the use of DC-induced electrokinetics in miniaturized devices is highly limited. This is mainly due to unavoidable electrochemical reactions at the electrodes, which hinder successful manipulation. Here we present experimental evidence that on-chip DC manipulation of particles between closely positioned electrodes inside micro-droplets can be successfully achieved, and at low voltages. We show that such manipulation, which is considered practically impossible, can be used to rapidly concentrate and pattern particles in 2D shapes in inter-electrode locations. We show that this is made possible in low ion content dispersions, which enable low-voltage electrokinetics and an anomalous bubble-free water electrolysis. This phenomenon can serve as a powerful tool in both microflow devices and digital microfluidics for rapid pre-concentration and particle patterning. PMID:26293477

  1. Replacement of SSE with NASA's POWER Project GIS-enabled Web Data Portal

    Atmospheric Science Data Center

    2018-04-30

    Replacement of SSE (Release 6) with NASA's Prediction of Worldwide Energy Resource (POWER) Project GIS-enabled Web Data Portal, March 2018. The POWER Project is funded largely by the NASA Earth Applied Sciences program.

  2. A real-time intercepting beam-profile monitor for a medical cyclotron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendriks, C.; Uittenbosch, T.; Cameron, D.

    2013-11-15

    There is a lack of real-time continuous beam-diagnostic tools for medical cyclotrons due to high power deposition during proton irradiation. To overcome this limitation, we have developed a profile monitor that is capable of providing continuous feedback about beam shape and current in real time while it is inserted in the beam path. This enables users to optimize the beam profile and observe fluctuations in the beam over time with periodic insertion of the monitor.

  3. Matching of renewable source of energy generation graphs and electrical load in local energy system

    NASA Astrophysics Data System (ADS)

    Lezhniuk, Petro; Komar, Vyacheslav; Sobchuk, Dmytro; Kravchuk, Sergiy; Kacejko, Piotr; Zavidsky, Vladislav

    2017-08-01

    The paper presents a method for matching the generation graph of photovoltaic power stations with the consumer load graph. A characteristic feature of this method is the use of morphometric analysis to assess the non-uniformity of the integrated energy-supply graph, together with optimal coefficients of current distribution, which, by refining the powers transferred in accordance with the graph, reduce electric energy losses in the grid; the transport task serves as the optimization tool.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zarkesh, Ryan A.; Foster, Michael E.; Ichimura, Andrew S.

    The ability to tune the steric envelope through redox events post-synthetically or in tandem with other chemical processes is a powerful tool that could assist in enabling new catalytic methodologies and understanding potential pitfalls in ligand design. The α-diimine ligand, dmp-BIAN, exhibits the peculiar and previously unreported feature of varying steric profiles depending on oxidation state when paired with a main group element. A study of the factors that give rise to this behaviour as well as its impact on the incorporation of other ligands is performed.

  5. Batch Proving and Proof Scripting in PVS

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.

    2007-01-01

    The batch execution modes of PVS are powerful, but highly technical, features of the system that are mostly accessible to expert users. This paper presents a PVS tool, called ProofLite, that extends the theorem prover interface with a batch proving utility and a proof scripting notation. ProofLite enables a semi-literate proving style where specification and proof scripts reside in the same file. The goal of ProofLite is to provide batch proving and proof scripting capabilities to regular, non-expert, users of PVS.

  6. Addition of CF3 across unsaturated moieties: a powerful functionalization tool

    PubMed Central

    2014-01-01

    In the last few years, the efficient introduction of trifluoromethyl groups in organic molecules has become a major research focus. This review highlights the recent developments enabling the incorporation of CF3 groups across unsaturated moieties, preferentially alkenes, and the mechanistic scenarios governing these transformations. We have specially focused on methods involving the simultaneous formation of C–CF3 and C–C or C–heteroatom bonds by formal addition reactions across π-systems, as such difunctionalization processes hold valuable synthetic potential. PMID:24789472

  7. Energy efficiency analysis and optimization for mobile platforms

    NASA Astrophysics Data System (ADS)

    Metri, Grace Camille

    The introduction of mobile devices changed the landscape of computing. Gradually, these devices are replacing traditional personal computers (PCs) to become the devices of choice for entertainment, connectivity, and productivity. There are currently at least 45.5 million people in the United States who own a mobile device, and that number is expected to increase to 1.5 billion by 2015. Users of mobile devices expect and demand that their devices deliver maximum performance while consuming as little power as possible. However, due to battery size constraints, the amount of energy stored in these devices is limited and is growing by only 5% annually. As a result, this dissertation focuses on energy efficiency analysis and optimization for mobile platforms. We developed SoftPowerMon, a tool that can power profile Android platforms in order to expose the power consumption behavior of the CPU. We also performed an extensive set of case studies to determine energy inefficiencies of mobile applications. Through these case studies, we propose optimization techniques to increase the energy efficiency of mobile devices, along with guidelines for energy-efficient application development. In addition, we developed BatteryExtender, an adaptive user-guided tool for power management of mobile devices. The tool enables users to extend battery life on demand for a specific duration until a particular task is completed. Moreover, we examined the power consumption of Systems-on-Chip (SoCs) and observed the impact on energy efficiency of offloading tasks from the CPU to specialized custom engines. Based on our case studies, we demonstrate that current software-based power profiling techniques for SoCs can have an error rate close to 12%, which needs to be addressed in order to optimize the energy consumption of the SoC. Finally, we summarize our contributions and outline possible directions for future research in this field.
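
    The CPU power-profiling idea described above can be illustrated with a minimal sketch; this is our illustration, not SoftPowerMon's actual implementation, and the sample values are synthetic:

```python
# Hypothetical sketch of CPU power profiling in the spirit of the tool
# described above (our illustration, not its actual code). On Android,
# instantaneous battery readings are typically exposed under
# /sys/class/power_supply/battery/ (voltage_now in microvolts, current_now
# in microamps); here we use synthetic samples instead of reading sysfs.

def average_power_mw(samples):
    """samples: iterable of (voltage_uV, current_uA) pairs -> mean power in mW."""
    powers = [(v * 1e-6) * (i * 1e-6) * 1e3 for v, i in samples]  # W -> mW
    return sum(powers) / len(powers)

def energy_mwh(avg_power_mw, seconds):
    """Energy drawn over an interval, in milliwatt-hours."""
    return avg_power_mw * seconds / 3600.0

# Two synthetic samples: 3.8 V at 500 mA, then 3.8 V at 700 mA.
samples = [(3_800_000, 500_000), (3_800_000, 700_000)]
avg = average_power_mw(samples)  # about 2280 mW
```

Sampling at a fixed interval and integrating in this way is the usual basis for attributing energy consumption to applications over time.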

  8. Large-scale deep learning for robotically gathered imagery for science

    NASA Astrophysics Data System (ADS)

    Skinner, K.; Johnson-Roberson, M.; Li, J.; Iscar, E.

    2016-12-01

    With the explosion of computing power, the intelligence and capability of mobile robotics has dramatically increased over the last two decades. Today, we can deploy autonomous robots to achieve observations in a variety of environments ripe for scientific exploration. These platforms are capable of gathering a volume of data previously unimaginable. Additionally, optical cameras, driven by mobile phones and consumer photography, have rapidly improved in size, power consumption, and quality making their deployment cheaper and easier. Finally, in parallel we have seen the rise of large-scale machine learning approaches, particularly deep neural networks (DNNs), increasing the quality of the semantic understanding that can be automatically extracted from optical imagery. In concert this enables new science using a combination of machine learning and robotics. This work will discuss the application of new low-cost high-performance computing approaches and the associated software frameworks to enable scientists to rapidly extract useful science data from millions of robotically gathered images. The automated analysis of imagery on this scale opens up new avenues of inquiry unavailable using more traditional manual or semi-automated approaches. We will use a large archive of millions of benthic images gathered with an autonomous underwater vehicle to demonstrate how these tools enable new scientific questions to be posed.

  9. Micro-intestinal robot with wireless power transmission: design, analysis and experiment.

    PubMed

    Shi, Yu; Yan, Guozheng; Chen, Wenwen; Zhu, Bingquan

    2015-11-01

    Video capsule endoscopy is a useful tool for noninvasive intestinal detection, but it is not capable of active movement; wireless power is an effective solution to this problem. The research in this paper consists of two parts: the mechanical structure which enables the robot to move smoothly inside the intestinal tract, and the wireless power supply which ensures efficiency. First, an intestinal robot with leg architectures was developed based on the Archimedes spiral, which mimics the movement of an inchworm. The spiral legs were capable of unfolding to an angle of approximately 155°, which guaranteed stability of clamping, consistency of surface pressure, and avoided the risk of puncturing the intestinal tract. Secondly, the necessary power to operate the robot was far beyond the capacity of button batteries, so a wireless power transmission (WPT) platform was developed. The design of the platform focused on power transfer efficiency and frequency stability. In addition, the safety of human tissue in the alternating electromagnetic field was also taken into consideration. Finally, the assembled robot was tested and verified with the use of the WPT platform. In the isolated intestine, the robot system successfully traveled along the intestine with an average speed of 23 mm per minute. The obtained videos displayed a resolution of 320 × 240 and a transmission rate of 30 frames per second. The WPT platform supplied up to 500 mW of energy to the robot, and achieved a power transfer efficiency of 12%. It has been experimentally verified that the intestinal robot is safe and effective as an endoscopy tool, for which wireless power is feasible. Proposals for further improving the robot and wireless power supply are provided later in this paper. Copyright © 2015 Elsevier Ltd. All rights reserved.
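
    The reported figures imply the transmitter-side power budget; a quick back-of-the-envelope check (our arithmetic, not from the paper):

```python
# Back-of-the-envelope check of the wireless power figures reported above:
# if the robot receives 500 mW at 12% transfer efficiency, the transmitting
# platform must supply roughly 4.2 W. (Our arithmetic, not from the paper.)

delivered_mw = 500.0   # power received by the robot
efficiency = 0.12      # reported power transfer efficiency

transmitted_w = (delivered_mw / 1000.0) / efficiency  # ~4.17 W
```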

  10. Organ-On-A-Chip Platforms: A Convergence of Advanced Materials, Cells, and Microscale Technologies.

    PubMed

    Ahadian, Samad; Civitarese, Robert; Bannerman, Dawn; Mohammadi, Mohammad Hossein; Lu, Rick; Wang, Erika; Davenport-Huyer, Locke; Lai, Ben; Zhang, Boyang; Zhao, Yimu; Mandla, Serena; Korolj, Anastasia; Radisic, Milica

    2018-01-01

    Significant advances in biomaterials, stem cell biology, and microscale technologies have enabled the fabrication of biologically relevant tissues and organs. Such tissues and organs, referred to as organ-on-a-chip (OOC) platforms, have emerged as a powerful tool in tissue analysis and disease modeling for biological and pharmacological applications. A variety of biomaterials are used in tissue fabrication providing multiple biological, structural, and mechanical cues in the regulation of cell behavior and tissue morphogenesis. Cells derived from humans enable the fabrication of personalized OOC platforms. Microscale technologies are specifically helpful in providing physiological microenvironments for tissues and organs. In this review, biomaterials, cells, and microscale technologies are described as essential components to construct OOC platforms. The latest developments in OOC platforms (e.g., liver, skeletal muscle, cardiac, cancer, lung, skin, bone, and brain) are then discussed as functional tools in simulating human physiology and metabolism. Future perspectives and major challenges in the development of OOC platforms toward accelerating clinical studies of drug discovery are finally highlighted. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Supercomputers ready for use as discovery machines for neuroscience.

    PubMed

    Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus

    2012-01-01

    NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience.
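
    The memory model mentioned in the abstract can be sketched in pure Python; the per-object byte costs and process count below are placeholder assumptions for illustration, not NEST's actual figures:

```python
# Sketch of a per-MPI-process memory estimate for a distributed spiking
# network simulation, in the spirit of the memory model mentioned above.
# The byte costs per neuron/synapse are placeholder assumptions, not NEST's.

def memory_per_process_gb(n_neurons, n_synapses, n_procs,
                          bytes_per_neuron=1000, bytes_per_synapse=24):
    # Assume neurons and their incoming synapses are distributed evenly
    # over processes, so each process stores only its local share.
    local_bytes = (n_neurons / n_procs) * bytes_per_neuron \
                  + (n_synapses / n_procs) * bytes_per_synapse
    return local_bytes / 1024**3

# 10^8 neurons and 10^12 synapses on 10,000 processes (example figures):
gb = memory_per_process_gb(10**8, 10**12, 10_000)  # a few GB per process
```

Such a model makes clear that synapse storage dominates at brain-area scale, which is why reducing per-synapse overhead is the key to fitting networks of this size into memory.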

  12. Supercomputers Ready for Use as Discovery Machines for Neuroscience

    PubMed Central

    Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus

    2012-01-01

    NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience. PMID:23129998

  13. IMG-ABC. A knowledge base to fuel discovery of biosynthetic gene clusters and novel secondary metabolites

    DOE PAGES

    Hadjithomas, Michalis; Chen, I-Min Amy; Chu, Ken; ...

    2015-07-14

    In the discovery of secondary metabolites, analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of computational platforms that enable such a systematic approach on a large scale. In this work, we present IMG-ABC (https://img.jgi.doe.gov/abc), an atlas of biosynthetic gene clusters within the Integrated Microbial Genomes (IMG) system, which is aimed at harnessing the power of “big” genomic data for discovering small molecules. IMG-ABC relies on IMG’s comprehensive integrated structural and functional genomic data for the analysis of biosynthetic gene clusters (BCs) and associated secondary metabolites (SMs). SMs and BCs serve as the two main classes of objects in IMG-ABC, each with a rich collection of attributes. A unique feature of IMG-ABC is the incorporation of both experimentally validated and computationally predicted BCs in genomes as well as metagenomes, thus identifying BCs in uncultured populations and rare taxa. We demonstrate the strength of IMG-ABC’s focused integrated analysis tools in enabling the exploration of microbial secondary metabolism on a global scale, through the discovery of phenazine-producing clusters for the first time in Alphaproteobacteria. IMG-ABC strives to fill the long-existent void of resources for computational exploration of the secondary metabolism universe; its underlying scalable framework enables traversal of uncovered phylogenetic and chemical structure space, serving as a doorway to a new era in the discovery of novel molecules. IMG-ABC is the largest publicly available database of predicted and experimental biosynthetic gene clusters and the secondary metabolites they produce. The system also includes powerful search and analysis tools that are integrated with IMG’s extensive genomic/metagenomic data and analysis tool kits. As new research on biosynthetic gene clusters and secondary metabolites is published and more genomes are sequenced, IMG-ABC will continue to expand, with the goal of becoming an essential component of any bioinformatic exploration of the secondary metabolism world.

  14. CRISPR/Cas9 Immune System as a Tool for Genome Engineering.

    PubMed

    Hryhorowicz, Magdalena; Lipiński, Daniel; Zeyland, Joanna; Słomski, Ryszard

    2017-06-01

    CRISPR/Cas (clustered regularly interspaced short palindromic repeats/CRISPR-associated) adaptive immune systems constitute a bacterial defence against invading nucleic acids derived from bacteriophages or plasmids. This prokaryotic system was adapted in molecular biology and became one of the most powerful and versatile platforms for genome engineering. CRISPR/Cas9 is a simple and rapid tool which enables the efficient modification of endogenous genes in various species and cell types. Moreover, a modified version of the CRISPR/Cas9 system with transcriptional repressors or activators allows robust transcription repression or activation of target genes. The simplicity of CRISPR/Cas9 has resulted in the widespread use of this technology in many fields, including basic research, biotechnology and biomedicine.

  15. Strategies to explore functional genomics data sets in NCBI's GEO database.

    PubMed

    Wilhite, Stephen E; Barrett, Tanya

    2012-01-01

    The Gene Expression Omnibus (GEO) database is a major repository that stores high-throughput functional genomics data sets that are generated using both microarray-based and sequence-based technologies. Data sets are submitted to GEO primarily by researchers who are publishing their results in journals that require original data to be made freely available for review and analysis. In addition to serving as a public archive for these data, GEO has a suite of tools that allow users to identify, analyze, and visualize data relevant to their specific interests. These tools include sample comparison applications, gene expression profile charts, data set clusters, genome browser tracks, and a powerful search engine that enables users to construct complex queries.
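
    Complex queries of this kind can also be issued programmatically through NCBI's E-utilities; below is a minimal sketch of constructing an esearch request against the GEO DataSets database (the query terms are illustrative, and no network request is made):

```python
# Minimal sketch of building an NCBI E-utilities search URL for the GEO
# DataSets database (db=gds). Query terms are illustrative; the URL is only
# constructed here, not fetched.
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_geo_query(term, retmax=20):
    params = {"db": "gds", "term": term, "retmax": retmax, "retmode": "json"}
    return EUTILS + "?" + urlencode(params)

url = build_geo_query('breast cancer[Title] AND "Homo sapiens"[Organism]')
```

Fetching the resulting URL returns matching GEO DataSet identifiers, which can then be retrieved individually or fed into the analysis tools described above.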

  16. Chip in a lab: Microfluidics for next generation life science research

    PubMed Central

    Streets, Aaron M.; Huang, Yanyi

    2013-01-01

    Microfluidic circuits are characterized by fluidic channels and chambers with a linear dimension on the order of tens to hundreds of micrometers. Components of this size enable lab-on-a-chip technology that has much promise, for example, in the development of point-of-care diagnostics. Micro-scale fluidic circuits also yield practical, physical, and technological advantages for studying biological systems, enhancing the ability of researchers to make more precise quantitative measurements. Microfluidic technology has thus become a powerful tool in the life science research laboratory over the past decade. Here we focus on chip-in-a-lab applications of microfluidics and survey some examples of how small fluidic components have provided researchers with new tools for life science research. PMID:23460772

  17. Strategies to Explore Functional Genomics Data Sets in NCBI’s GEO Database

    PubMed Central

    Wilhite, Stephen E.; Barrett, Tanya

    2012-01-01

    The Gene Expression Omnibus (GEO) database is a major repository that stores high-throughput functional genomics data sets that are generated using both microarray-based and sequence-based technologies. Data sets are submitted to GEO primarily by researchers who are publishing their results in journals that require original data to be made freely available for review and analysis. In addition to serving as a public archive for these data, GEO has a suite of tools that allow users to identify, analyze and visualize data relevant to their specific interests. These tools include sample comparison applications, gene expression profile charts, data set clusters, genome browser tracks, and a powerful search engine that enables users to construct complex queries. PMID:22130872

  18. Power management system

    DOEpatents

    Algrain, Marcelo C.; Johnson, Kris W.; Akasam, Sivaprasad; Hoff, Brian D.

    2007-10-02

    A method of managing power resources for an electrical system of a vehicle may include identifying enabled power sources from among a plurality of power sources in electrical communication with the electrical system and calculating a threshold power value for the enabled power sources. A total power load placed on the electrical system by one or more power consumers may be measured. If the total power load exceeds the threshold power value, then a determination may be made as to whether one or more additional power sources is available from among the plurality of power sources. At least one of the one or more additional power sources may be enabled, if available.
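
    The claimed method can be sketched in Python; the data shapes and the threshold rule (90% of enabled capacity) are our illustrative assumptions, not the patent's exact formulation:

```python
# Sketch of the power-management method described above: compute a threshold
# for the currently enabled sources, measure the total load, and enable an
# additional source if the load exceeds the threshold. The 90%-of-capacity
# threshold rule is an illustrative assumption, not the patent's formula.

def manage_power(sources, total_load_w, threshold_fraction=0.9):
    """sources: dict name -> {'capacity_w': float, 'enabled': bool}.
    Returns the sorted names of sources enabled after one management step."""
    threshold_w = threshold_fraction * sum(
        s["capacity_w"] for s in sources.values() if s["enabled"])
    if total_load_w > threshold_w:
        for s in sources.values():  # enable one available source, if any
            if not s["enabled"]:
                s["enabled"] = True
                break
    return sorted(n for n, s in sources.items() if s["enabled"])

sources = {
    "alternator": {"capacity_w": 1000.0, "enabled": True},
    "battery":    {"capacity_w": 500.0,  "enabled": False},
}
# A 950 W load exceeds 0.9 * 1000 W, so the battery is brought online.
enabled_now = manage_power(sources, total_load_w=950.0)
```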

  19. Chattanooga Electric Power Board Case Study Distribution Automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glass, Jim; Melin, Alexander M.; Starke, Michael R.

    In 2009, the U.S. Department of Energy under the American Recovery and Reinvestment Act (ARRA) awarded a grant to the Chattanooga, Tennessee, Electric Power Board (EPB) as part of the Smart Grid Investment Grant Program. The grant had the objective “to accelerate the transformation of the nation’s electric grid by deploying smart grid technologies.” This funding award enabled EPB to expedite the original smart grid implementation schedule from an estimated 10-12 years to 2.5 years. With this funding, EPB invested heavily in distribution automation technologies including installing over 1,200 automated circuit switches and sensors on 171 circuits. For utilities considering a commitment to distribution automation, there are underlying questions such as the following: “What is the value?” and “What are the costs?” This case study attempts to answer these questions. The primary benefit of distribution automation is increased reliability or reduced power outage duration and frequency. Power outages directly impact customer economics by interfering with business functions. In the past, this economic driver has been difficult to effectively evaluate. However, as this case study demonstrates, tools and analysis techniques are now available. In this case study, the impact on customer costs associated with power outages before and after the implementation of distribution automation are compared. Two example evaluations are performed to demonstrate the benefits: 1) a savings baseline for customers under normal operations and 2) customer savings for a single severe weather event. Cost calculations for customer power outages are performed using the US Department of Energy (DOE) Interruption Cost Estimate (ICE) calculator. This tool uses standard metrics associated with outages and the customers to calculate cost impact. The analysis shows that EPB customers have seen significant reliability improvements from the implementation of distribution automation. Under normal operations, the investment in distribution automation has enabled a 43.5% reduction in annual outage minutes since 2012. This has led to an estimated total savings of $26.8 million per year. Examining a single severe weather event, the distribution automation was able to restore power to 40,579 (nearly 56%) customers within 1–2 seconds and reduce outage minutes by 29.0%. This saved customers an estimated $23.2 million over the course of the storm.

  20. AirLab: a cloud-based platform to manage and share antibody-based single-cell research.

    PubMed

    Catena, Raúl; Özcan, Alaz; Jacobs, Andrea; Chevrier, Stephane; Bodenmiller, Bernd

    2016-06-29

    Single-cell analysis technologies are essential tools in research and clinical diagnostics. These methods include flow cytometry, mass cytometry, and other microfluidics-based technologies. Most laboratories that employ these methods maintain large repositories of antibodies. These ever-growing collections of antibodies, their multiple conjugates, and the large amounts of data generated in assays using specific antibodies and conditions make a dedicated software solution necessary. We have developed AirLab, a cloud-based tool with web and mobile interfaces, for the organization of these data. AirLab streamlines the processes of antibody purchase, organization, and storage, antibody panel creation, results logging, and antibody validation data sharing and distribution. Furthermore, AirLab enables inventory of other laboratory stocks, such as primers or clinical samples, through user-controlled customization. Thus, AirLab is a flexible tool that harnesses the capabilities of mobile and cloud-based technology to facilitate inventory and sharing of antibody and sample collections and associated validation data.

  1. Fluorescent nucleobases as tools for studying DNA and RNA

    NASA Astrophysics Data System (ADS)

    Xu, Wang; Chan, Ke Min; Kool, Eric T.

    2017-11-01

    Understanding the diversity of dynamic structures and functions of DNA and RNA in biology requires tools that can selectively and intimately probe these biomolecules. Synthetic fluorescent nucleobases that can be incorporated into nucleic acids alongside their natural counterparts have emerged as a powerful class of molecular reporters of location and environment. They are enabling new basic insights into DNA and RNA, and are facilitating a broad range of new technologies with chemical, biological and biomedical applications. In this Review, we will present a brief history of the development of fluorescent nucleobases and explore their utility as tools for addressing questions in biophysics, biochemistry and biology of nucleic acids. We provide chemical insights into the two main classes of these compounds: canonical and non-canonical nucleobases. A point-by-point discussion of the advantages and disadvantages of both types of fluorescent nucleobases is made, along with a perspective into the future challenges and outlook for this burgeoning field.

  2. Towards a National Space Weather Predictive Capability

    NASA Astrophysics Data System (ADS)

    Fox, N. J.; Ryschkewitsch, M. G.; Merkin, V. G.; Stephens, G. K.; Gjerloev, J. W.; Barnes, R. J.; Anderson, B. J.; Paxton, L. J.; Ukhorskiy, A. Y.; Kelly, M. A.; Berger, T. E.; Bonadonna, L. C. M. F.; Hesse, M.; Sharma, S.

    2015-12-01

    National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition and to develop the tools needed by those who rely on this information. In this presentation we will review the space weather system developed for the Van Allen Probes mission, together with other datasets, tools and models that have resulted from research by scientists at JHU/APL. We will look at how these, and results from future missions such as Solar Probe Plus, could be applied to support space weather applications in coordination with other community assets and capabilities.

  3. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    PubMed

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  4. Cell biochemistry studied by single-molecule imaging.

    PubMed

    Mashanov, G I; Nenasheva, T A; Peckham, M; Molloy, J E

    2006-11-01

    Over the last decade, there have been remarkable developments in live-cell imaging. We can now readily observe individual protein molecules within living cells and this should contribute to a systems level understanding of biological pathways. Direct observation of single fluorophores enables several types of molecular information to be gathered. Temporal and spatial trajectories enable diffusion constants and binding kinetics to be deduced, while analyses of fluorescence lifetime, intensity, polarization or spectra give chemical and conformational information about molecules in their cellular context. By recording the spatial trajectories of pairs of interacting molecules, formation of larger molecular complexes can be studied. In the future, multicolour and multiparameter imaging of single molecules in live cells will be a powerful analytical tool for systems biology. Here, we discuss measurements of single-molecule mobility and residency at the plasma membrane of live cells. Analysis of diffusional paths at the plasma membrane gives information about its physical properties and measurement of temporal trajectories enables rates of binding and dissociation to be derived. Meanwhile, close scrutiny of individual fluorophore trajectories enables ideas about molecular dimerization and oligomerization related to function to be tested directly.
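
    The trajectory analysis described, deducing a diffusion constant from temporal and spatial trajectories, can be sketched as a mean-squared-displacement fit; the simulated trajectory and parameter values below are synthetic stand-ins, not data from the paper:

```python
import numpy as np

def diffusion_coefficient(xy, dt, max_lag=10):
    """Estimate D from a 2D trajectory via mean squared displacement.

    For 2D Brownian motion, MSD(tau) = 4*D*tau, so D is the slope of
    MSD versus lag time divided by 4.

    xy : (N, 2) array of positions (um); dt : frame interval (s).
    """
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean(np.sum((xy[lag:] - xy[:-lag])**2, axis=1))
                    for lag in lags])
    slope = np.polyfit(lags * dt, msd, 1)[0]  # linear fit of MSD vs time
    return slope / 4.0  # um^2/s

# Simulate a Brownian trajectory with known D to sanity-check the estimator
rng = np.random.default_rng(0)
D_true, dt, n = 0.5, 0.01, 5000           # um^2/s, s, steps
steps = rng.normal(0, np.sqrt(2 * D_true * dt), size=(n, 2))
traj = np.cumsum(steps, axis=0)
D_est = diffusion_coefficient(traj, dt)
```

    The same displacement statistics, binned over many short trajectories, underlie the binding-kinetics analyses the abstract mentions.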

  5. Chipster: user-friendly analysis software for microarray and other high-throughput data.

    PubMed

    Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I

    2011-10-14

    The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.

  6. Chipster: user-friendly analysis software for microarray and other high-throughput data

    PubMed Central

    2011-01-01

    Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available. PMID:21999641

  7. A new method for finding and characterizing galaxy groups via low-frequency radio surveys

    NASA Astrophysics Data System (ADS)

    Croston, J. H.; Ineson, J.; Hardcastle, M. J.; Mingo, B.

    2017-09-01

    We describe a new method for identifying and characterizing the thermodynamic state of large samples of evolved galaxy groups at high redshifts using high-resolution, low-frequency radio surveys, such as those that will be carried out with LOFAR and the Square Kilometre Array. We identify a sub-population of morphologically regular powerful [Fanaroff-Riley type II (FR II)] radio galaxies and demonstrate that, for this sub-population, the internal pressure of the radio lobes is a reliable tracer of the external intragroup/intracluster medium (ICM) pressure, and that the assumption of a universal pressure profile for relaxed groups enables the total mass and X-ray luminosity to be estimated. Using a sample of well-studied FR II radio galaxies, we demonstrate that our method enables the estimation of group/cluster X-ray luminosities over three orders of magnitude in luminosity to within a factor of ~2 from low-frequency radio properties alone. Our method could provide a powerful new tool for building samples of thousands of evolved galaxy groups at z > 1 and characterizing their ICM.

  8. Soft X-ray and cathodoluminescence measurement, optimisation and analysis at liquid nitrogen temperatures

    NASA Astrophysics Data System (ADS)

    MacRae, C. M.; Wilson, N. C.; Torpy, A.; Delle Piane, C.

    2018-01-01

    Advances in field emission gun electron microprobes have led to significant gains in beam power density, and when analysis at high resolution is required, low voltages are often selected. The resulting beam power can lead to damage; this can be minimised by cooling the sample to cryogenic temperatures, allowing sub-micrometre imaging using a variety of spectrometers. Recent advances in soft X-ray emission spectrometers (SXES) offer a spectral tool to measure both chemistry and bonding, and when combined with spectral cathodoluminescence the complementary techniques enable new knowledge to be gained from both minerals and materials. Magnesium and aluminium metals have been examined at both room and liquid nitrogen temperatures by SXES, and the L-emission Fermi edge has been observed to sharpen at the lower temperatures, directly confirming thermal broadening of the X-ray spectra. Gains in emission intensity and resolution have been observed in cathodoluminescence for liquid-nitrogen-cooled quartz grains compared to ambient-temperature quartz. This has enabled subtle growth features at quartz to quartz-cement boundaries to be imaged for the first time.

  9. Molecular inversion probe assay.

    PubMed

    Absalan, Farnaz; Ronaghi, Mostafa

    2007-01-01

    We have described molecular inversion probe technologies for large-scale genetic analyses. This technique provides a comprehensive and powerful tool for the analysis of genetic variation and enables affordable, large-scale studies that will help uncover the genetic basis of complex disease and explain the individual variation in response to therapeutics. Major applications of the molecular inversion probes (MIP) technologies include targeted genotyping from focused regions to whole-genome studies, and allele quantification of genomic rearrangements. The MIP technology (used in the HapMap project) provides an efficient, scalable, and affordable way to score polymorphisms in case/control populations for genetic studies. The MIP technology provides the highest commercially available multiplexing levels and assay conversion rates for targeted genotyping. This enables more informative, genome-wide studies with either the functional (direct detection) approach or the indirect detection approach.

  10. RATANA MEEKHAM, AN ELECTRICAL INTEGRATION TECHNICIAN FOR QUALIS CORP. OF HUNTSVILLE, ALABAMA, HELPS TEST AVIONICS -- COMPLEX VEHICLE SYSTEMS ENABLING NAVIGATION, COMMUNICATIONS AND OTHER FUNCTIONS CRITICAL TO HUMAN SPACEFLIGHT

    NASA Image and Video Library

    2015-01-08

    RATANA MEEKHAM, AN ELECTRICAL INTEGRATION TECHNICIAN FOR QUALIS CORP. OF HUNTSVILLE, ALABAMA, HELPS TEST AVIONICS -- COMPLEX VEHICLE SYSTEMS ENABLING NAVIGATION, COMMUNICATIONS AND OTHER FUNCTIONS CRITICAL TO HUMAN SPACEFLIGHT -- FOR THE SPACE LAUNCH SYSTEM PROGRAM AT NASA’S MARSHALL SPACE FLIGHT CENTER IN HUNTSVILLE, ALABAMA. HER WORK SUPPORTS THE NASA ENGINEERING & SCIENCE SERVICES AND SKILLS AUGMENTATION CONTRACT LED BY JACOBS ENGINEERING OF HUNTSVILLE. MEEKHAM WORKS FULL-TIME AT MARSHALL WHILE FINISHING HER ASSOCIATE'S DEGREE IN MACHINE TOOL TECHNOLOGY AT CALHOUN COMMUNITY COLLEGE IN DECATUR, ALABAMA. THE SPACE LAUNCH SYSTEM, NASA’S NEXT HEAVY-LIFT LAUNCH VEHICLE, IS THE WORLD’S MOST POWERFUL ROCKET, SET TO FLY ITS FIRST UNCREWED LUNAR ORBITAL MISSION IN 2018.

  11. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.
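
    Spectral analyses of this kind are commonly based on the generalized Planck law; below is a minimal sketch of a high-energy-tail fit in its Boltzmann limit, using a synthetic spectrum and a constant absorptivity (precisely the simplifying assumption the authors caution against):

```python
import numpy as np

kB = 8.617e-5  # Boltzmann constant, eV/K

def fit_tail(E, I_pl, a=1.0):
    """Fit the high-energy tail of a PL spectrum with the generalized
    Planck law in the Boltzmann limit:
        I(E) ~ a(E) * E**2 * exp(-(E - dmu) / (kB*T))
    Linearizing: ln(I / (a*E**2)) = dmu/(kB*T) - E/(kB*T).
    Returns (T, dmu): carrier temperature and chemical potential
    (quasi-Fermi-level splitting). 'a' is the absorptivity (scalar or
    per-energy array); holding it constant is the shortcut the paper
    argues distorts the extracted thermodynamic properties.
    """
    y = np.log(I_pl / (a * E**2))
    slope, intercept = np.polyfit(E, y, 1)
    T = -1.0 / (kB * slope)
    dmu = intercept * kB * T
    return T, dmu

# Synthetic tail for a hot-carrier population: T = 600 K, dmu = 1.1 eV
E = np.linspace(1.5, 1.8, 50)           # photon energies, eV
T_true, dmu_true = 600.0, 1.1
I = E**2 * np.exp(-(E - dmu_true) / (kB * T_true))
T_fit, dmu_fit = fit_tail(E, I)
```

    Passing a measured, energy-dependent absorptivity as `a` instead of a constant is the kind of refinement the abstract advocates.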

  12. Dynamic Gate Product and Artifact Generation from System Models

    NASA Technical Reports Server (NTRS)

    Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris

    2011-01-01

    Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("Power-Point Engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "Power-Point Engineering" from model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one-another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and production of paper-based documents or their "office-productivity" file equivalents.
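
    The core idea, documents generated as views of a model rather than maintained by hand, can be sketched generically; the toy model schema and report format below are invented for illustration and are unrelated to the actual SysML/UML tooling:

```python
# Hypothetical sketch: a textual artifact generated directly from a model
# structure, so regenerating it always keeps it consistent with the model.
model = {
    "name": "Sample Instrument",
    "requirements": [
        {"id": "R-001", "text": "The instrument shall operate at 28 V."},
        {"id": "R-002", "text": "The instrument shall mass less than 4 kg."},
    ],
}

def requirements_report(model):
    """Render a review document as a view of the model contents."""
    lines = [f"Requirements for {model['name']}", ""]
    lines += [f"{r['id']}: {r['text']}" for r in model["requirements"]]
    return "\n".join(lines)

report = requirements_report(model)
```

    Editing the model and regenerating replaces the labor-intensive manual upkeep that "PowerPoint engineering" demands.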

  13. Bringing your tools to CyVerse Discovery Environment using Docker

    PubMed Central

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse’s Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse’s production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared with the earlier method of tool deployment in the DE, but also helps them share their apps with collaborators and release them for public use. PMID:27803802

  14. Bringing your tools to CyVerse Discovery Environment using Docker.

    PubMed

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared with the earlier method of tool deployment in the DE, but also helps them share their apps with collaborators and release them for public use.
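
    As an illustration of the packaging step the paper builds on, here is a minimal Dockerfile for a hypothetical command-line tool; the base image, package name, and entrypoint are invented examples, not the DE's actual integration requirements:

```dockerfile
# Hypothetical example: containerize a command-line analysis tool so the
# same software stack runs identically on any Docker host.
FROM ubuntu:20.04

# Install dependencies in a single layer and clean up the apt cache.
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# 'my-analysis-tool' is a placeholder package name.
RUN pip3 install my-analysis-tool

# The platform invokes the entrypoint with user-supplied arguments.
ENTRYPOINT ["my-analysis-tool"]
```

    Building the image (`docker build -t my-analysis-tool .`) and running it locally with `docker run` before integration is a common sanity check.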

  15. An update on carbon nanotube-enabled X-ray sources for biomedical imaging.

    PubMed

    Puett, Connor; Inscoe, Christina; Hartman, Allison; Calliste, Jabari; Franceschi, Dora K; Lu, Jianping; Zhou, Otto; Lee, Yueh Z

    2018-01-01

    A new imaging technology has emerged that uses carbon nanotubes (CNT) as the electron emitter (cathode) for the X-ray tube. Since the performance of the CNT cathode is controlled by simple voltage manipulation, CNT-enabled X-ray sources are ideal for the repetitive imaging steps needed to capture three-dimensional information. As such, they have allowed the development of a gated micro-computed tomography (CT) scanner for small animal research as well as stationary tomosynthesis, an experimental technology for large field-of-view human imaging. The small animal CT can acquire images at specific points in the respiratory and cardiac cycles. Longitudinal imaging therefore becomes possible and has been applied to many research questions, ranging from tumor response to the noninvasive assessment of cardiac output. Digital tomosynthesis (DT) is a low-dose and low-cost human imaging tool that captures some depth information. Known as three-dimensional mammography, DT is now used clinically for breast imaging. However, the resolution of currently-approved DT is limited by the need to swing the X-ray source through space to collect a series of projection views. An array of fixed and distributed CNT-enabled sources provides the solution and has been used to construct stationary DT devices for breast, lung, and dental imaging. To date, over 100 patients have been imaged on Institutional Review Board-approved study protocols. Early experience is promising, showing an excellent conspicuity of soft-tissue features, while also highlighting technical and post-acquisition processing limitations that are guiding continued research and development. Additionally, CNT-enabled sources are being tested in miniature X-ray tubes that are capable of generating adequate photon energies and tube currents for clinical imaging. 
Although there are many potential applications for these small field-of-view devices, initial experience has been with an X-ray source that can be inserted into the mouth for dental imaging. Conceived less than 20 years ago, CNT-enabled X-ray sources are now being manufactured on a commercial scale and are powering both research tools and experimental human imaging devices. WIREs Nanomed Nanobiotechnol 2018, 10:e1475. doi: 10.1002/wnan.1475

  16. Dynamically variable negative stiffness structures

    PubMed Central

    Churchill, Christopher B.; Shahan, David W.; Smith, Sloan P.; Keefe, Andrew C.; McKnight, Geoffrey P.

    2016-01-01

    Variable stiffness structures that enable a wide range of efficient load-bearing and dexterous activity are ubiquitous in mammalian musculoskeletal systems but are rare in engineered systems because of their complexity, power, and cost. We present a new negative stiffness–based load-bearing structure with dynamically tunable stiffness. Negative stiffness, traditionally used to achieve novel response from passive structures, is a powerful tool to achieve dynamic stiffness changes when configured with an active component. Using relatively simple hardware and low-power, low-frequency actuation, we show an assembly capable of fast (<10 ms) and useful (>100×) dynamic stiffness control. This approach mitigates limitations of conventional tunable stiffness structures that exhibit either small (<30%) stiffness change, high friction, poor load/torque transmission at low stiffness, or high power active control at the frequencies of interest. We experimentally demonstrate actively tunable vibration isolation and stiffness tuning independent of supported loads, enhancing applications such as humanoid robotic limbs and lightweight adaptive vibration isolators. PMID:26989771

  17. Advanced light source technologies that enable high-volume manufacturing of DUV lithography extensions

    NASA Astrophysics Data System (ADS)

    Cacouris, Theodore; Rao, Rajasekhar; Rokitski, Rostislav; Jiang, Rui; Melchior, John; Burfeindt, Bernd; O'Brien, Kevin

    2012-03-01

    Deep UV (DUV) lithography is being applied to pattern increasingly finer geometries, leading to solutions like double- and multiple-patterning. Such process complexities lead to higher costs due to the increasing number of steps required to produce the desired results. One of the consequences is that the lithography equipment needs to provide higher operating efficiencies to minimize the cost increases, especially for producers of memory devices that experience a rapid decline in sales prices of these products over time. In addition to having introduced higher power 193nm light sources to enable higher throughput, we previously described technologies that also enable: higher tool availability via advanced discharge chamber gas management algorithms; improved process monitoring via enhanced on-board beam metrology; and increased depth of focus (DOF) via light source bandwidth modulation. In this paper we will report on the field performance of these technologies with data that supports the desired improvements in on-wafer performance and operational efficiencies.

  18. An inverse method for determining the spatially resolved properties of viscoelastic–viscoplastic three-dimensional printed materials

    PubMed Central

    Chen, X.; Ashcroft, I. A.; Wildman, R. D.; Tuck, C. J.

    2015-01-01

    A method using experimental nanoindentation and inverse finite-element analysis (FEA) has been developed that enables the spatial variation of material constitutive properties to be accurately determined. The method was used to measure property variation in a three-dimensional printed (3DP) polymeric material. The accuracy of the method is dependent on the applicability of the constitutive model used in the inverse FEA, hence four potential material models: viscoelastic, viscoelastic–viscoplastic, nonlinear viscoelastic and nonlinear viscoelastic–viscoplastic were evaluated, with the latter enabling the best fit to experimental data. Significant changes in material properties were seen in the depth direction of the 3DP sample, which could be linked to the degree of cross-linking within the material, a feature inherent in a UV-cured layer-by-layer construction method. It is proposed that the method is a powerful tool in the analysis of manufacturing processes with potential spatial property variation that will also enable the accurate prediction of final manufactured part performance. PMID:26730216

  19. An inverse method for determining the spatially resolved properties of viscoelastic-viscoplastic three-dimensional printed materials.

    PubMed

    Chen, X; Ashcroft, I A; Wildman, R D; Tuck, C J

    2015-11-08

    A method using experimental nanoindentation and inverse finite-element analysis (FEA) has been developed that enables the spatial variation of material constitutive properties to be accurately determined. The method was used to measure property variation in a three-dimensional printed (3DP) polymeric material. The accuracy of the method is dependent on the applicability of the constitutive model used in the inverse FEA, hence four potential material models: viscoelastic, viscoelastic-viscoplastic, nonlinear viscoelastic and nonlinear viscoelastic-viscoplastic were evaluated, with the latter enabling the best fit to experimental data. Significant changes in material properties were seen in the depth direction of the 3DP sample, which could be linked to the degree of cross-linking within the material, a feature inherent in a UV-cured layer-by-layer construction method. It is proposed that the method is a powerful tool in the analysis of manufacturing processes with potential spatial property variation that will also enable the accurate prediction of final manufactured part performance.
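
    The inverse workflow, adjusting constitutive parameters until a simulated indentation curve matches the measurement, can be illustrated with a toy stand-in for the finite-element model; the power-law forward model and all numbers below are illustrative only, not the paper's viscoelastic-viscoplastic constitutive law:

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, depth):
    """Stand-in for the finite-element simulation: predicts indentation
    load from depth given constitutive parameters. A simple power law
    P = C * h**m replaces the real viscoelastic-viscoplastic FEA here."""
    C, m = params
    return C * depth**m

def identify_parameters(depth, load_measured, p0=(1.0, 1.5)):
    """Inverse analysis: adjust parameters until the simulated
    load-depth curve matches the nanoindentation measurement."""
    residual = lambda p: forward_model(p, depth) - load_measured
    return least_squares(residual, p0).x

# Synthetic 'measurement' with known parameters plus a little noise
rng = np.random.default_rng(1)
depth = np.linspace(0.1, 1.0, 40)                             # um
P_meas = 2.5 * depth**1.8 + rng.normal(0, 0.01, depth.size)   # mN
C_fit, m_fit = identify_parameters(depth, P_meas)
```

    In the real method each residual evaluation runs an FE simulation, and repeating the fit at different indent locations yields the spatially resolved property map.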

  20. Nanoimprint of a 3D structure on an optical fiber for light wavefront manipulation.

    PubMed

    Calafiore, Giuseppe; Koshelev, Alexander; Allen, Frances I; Dhuey, Scott; Sassolini, Simone; Wong, Edward; Lum, Paul; Munechika, Keiko; Cabrini, Stefano

    2016-09-16

    Integration of complex photonic structures onto optical fiber facets enables powerful platforms with unprecedented optical functionalities. Conventional nanofabrication technologies, however, do not permit viable integration of complex photonic devices onto optical fibers owing to their low throughput and high cost. In this paper we report the fabrication of a three-dimensional structure achieved by direct nanoimprint lithography on the facet of an optical fiber. Nanoimprint processes and tools were specifically developed to enable a high lithographic accuracy and coaxial alignment of the optical device with respect to the fiber core. To demonstrate the capability of this new approach, a 3D beam splitter has been designed, imprinted and optically characterized. Scanning electron microscopy and optical measurements confirmed the good lithographic capabilities of the proposed approach as well as the desired optical performance of the imprinted structure. The inexpensive solution presented here should enable advancements in areas such as integrated optics and sensing, achieving enhanced portability and versatility of fiber optic components.

  1. Multiplatform Mission Planning and Operations Simulation Environment for Adaptive Remote Sensors

    NASA Astrophysics Data System (ADS)

    Smith, G.; Ball, C.; O'Brien, A.; Johnson, J. T.

    2017-12-01

    We report on the design and development of mission simulator libraries to support the emerging field of adaptive remote sensors. We will outline the current state of the art in adaptive sensing, provide analysis of how the current approach to performing observing system simulation experiments (OSSEs) must be changed to enable adaptive sensors for remote sensing, and present an architecture to enable their inclusion in future OSSEs. The growing potential of sensors capable of real-time adaptation of their operational parameters calls for a new class of mission planning and simulation tools. Existing simulation tools used in OSSEs assume a fixed set of sensor parameters in terms of observation geometry, frequencies used, resolution, or observation time, which allows simplifications to be made in the simulation and allows sensor observation errors to be characterized a priori. Adaptive sensors may vary these parameters depending on the details of the scene observed, so that sensor performance is not simple to model without conducting OSSE simulations that include sensor adaptation in response to a varying observational environment. Adaptive sensors are of significance to resource-constrained, small satellite platforms because they enable the management of power and data volumes while providing methods for multiple sensors to collaborate. The new class of OSSEs required to utilize adaptive sensors located on multiple platforms must answer the question: If the physical act of sensing has a cost, how does the system determine if the science value of a measurement is worth the cost, and how should that cost be shared among the collaborating sensors? Here we propose to answer this question using an architecture structured around three modules: ADAPT, MANAGE and COLLABORATE. 
The ADAPT module is a set of routines to facilitate modeling of adaptive sensors, the MANAGE module will implement a set of routines to facilitate simulations of sensor resource management when power and data volume are constrained, and the COLLABORATE module will support simulations of coordination among multiple platforms with adaptive sensors. When used together, these modules will form a simulation framework for OSSEs that can enable both the design of adaptive algorithms to support remote sensing and the prediction of sensor performance.
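
    A minimal sketch of how the MANAGE and COLLABORATE roles might answer the cost/value question; every class, field, and threshold below is invented for illustration (ADAPT, the sensor-modeling module, is omitted for brevity):

```python
from dataclasses import dataclass

@dataclass
class Observation:
    science_value: float   # estimated science value of the measurement
    power_cost: float      # W consumed by taking it
    data_cost: float       # MB generated

class Manage:
    """MANAGE: tracks a platform's power/data budgets and decides whether
    a proposed measurement is worth its cost (the core OSSE question)."""
    def __init__(self, power_budget, data_budget):
        self.power = power_budget
        self.data = data_budget

    def worth_it(self, obs, value_threshold=1.0):
        affordable = (obs.power_cost <= self.power
                      and obs.data_cost <= self.data)
        valuable = obs.science_value / max(obs.power_cost, 1e-9) >= value_threshold
        return affordable and valuable

    def commit(self, obs):
        self.power -= obs.power_cost
        self.data -= obs.data_cost

class Collaborate:
    """COLLABORATE: shares measurement cost across platforms by assigning
    each observation to the willing platform with the most power left."""
    def assign(self, managers, obs):
        candidates = [m for m in managers if m.worth_it(obs)]
        if not candidates:
            return None
        chosen = max(candidates, key=lambda m: m.power)
        chosen.commit(obs)
        return chosen
```

    A real implementation would replace the value-per-watt ratio with mission-specific science value models and scheduling constraints.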

  2. Multiple-Color Optical Activation, Silencing, and Desynchronization of Neural Activity, with Single-Spike Temporal Resolution

    PubMed Central

    Han, Xue; Boyden, Edward S.

    2007-01-01

    The quest to determine how precise neural activity patterns mediate computation, behavior, and pathology would be greatly aided by a set of tools for reliably activating and inactivating genetically targeted neurons, in a temporally precise and rapidly reversible fashion. Having earlier adapted a light-activated cation channel, channelrhodopsin-2 (ChR2), for allowing neurons to be stimulated by blue light, we searched for a complementary tool that would enable optical neuronal inhibition, driven by light of a second color. Here we report that targeting the codon-optimized form of the light-driven chloride pump halorhodopsin from the archaebacterium Natronomonas pharaonis (hereafter abbreviated Halo) to genetically-specified neurons enables them to be silenced reliably, and reversibly, by millisecond-timescale pulses of yellow light. We show that trains of yellow and blue light pulses can drive high-fidelity sequences of hyperpolarizations and depolarizations in neurons simultaneously expressing yellow light-driven Halo and blue light-driven ChR2, allowing for the first time manipulations of neural synchrony without perturbation of other parameters such as spiking rates. The Halo/ChR2 system thus constitutes a powerful toolbox for multichannel photoinhibition and photostimulation of virally or transgenically targeted neural circuits without need for exogenous chemicals, enabling systematic analysis and engineering of the brain, and quantitative bioengineering of excitable cells. PMID:17375185

  3. Enabling Searches on Wavelengths in a Hyperspectral Indices Database

    NASA Astrophysics Data System (ADS)

    Piñuela, F.; Cerra, D.; Müller, R.

    2017-10-01

    Spectral indices derived from hyperspectral reflectance measurements are powerful tools to estimate physical parameters in a non-destructive and precise way for several fields of application, among them vegetation health analysis, coastal and deep-water constituents, geology, and atmospheric composition. In recent years, several micro-hyperspectral sensors have appeared, with both full-frame and push-broom acquisition technologies, and several hyperspectral spaceborne missions are planned for launch in the near future. This is fostering the use of hyperspectral data in basic and applied research, causing a large number of spectral indices to be defined and used in various applications. Ad hoc search engines are therefore needed to retrieve the most appropriate indices for a given application. In traditional systems, query input parameters are limited to alphanumeric strings, while characteristics such as spectral range/bandwidth are not used in any existing search engine. Such information would be relevant, as it enables an inverse type of search: given the spectral capabilities of a given sensor or a specific spectral band, find all indices which can be derived from it. This paper describes a tool which enables such a search by using the central wavelength or spectral range used by a given index as a search parameter. This offers the ability to manage numeric wavelength ranges in order to select the indices which work best in a given set of wavelengths or wavelength ranges.
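
    The inverse, wavelength-driven search can be sketched in a few lines; the index table, band definitions, and tolerance below are hypothetical stand-ins, not the actual database contents:

```python
# Minimal sketch of a wavelength-aware index search.
# Required central wavelengths (nm) per index; values are illustrative.
INDICES = {
    "NDVI": [660.0, 860.0],
    "NDWI": [860.0, 1240.0],
    "PRI":  [531.0, 570.0],
}

def searchable_indices(sensor_bands, tolerance=10.0):
    """Return the indices whose every required wavelength falls inside
    one of the sensor's bands, each band given as (low, high) in nm,
    with an optional tolerance at the band edges."""
    def covered(wl):
        return any(lo - tolerance <= wl <= hi + tolerance
                   for lo, hi in sensor_bands)
    return [name for name, wavelengths in INDICES.items()
            if all(covered(wl) for wl in wavelengths)]

# A hypothetical VNIR sensor covering 400-1000 nm in a single band:
hits = searchable_indices([(400.0, 1000.0)])
```

    Given a sensor's band list, this answers the inverse query the paper describes: which indices can this instrument compute at all.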

  4. Defining Tolerance: Impacts of Delay and Disruption when Managing Challenged Networks

    NASA Technical Reports Server (NTRS)

    Birrane, Edward J. III; Burleigh, Scott C.; Cerf, Vint

    2011-01-01

    Challenged networks exhibit irregularities in their communication performance stemming from node mobility, power constraints, and impacts from the operating environment. These irregularities manifest as high signal propagation delay and frequent link disruption. Understanding those limits of link disruption and propagation delay beyond which core networking features fail is an ongoing area of research. Various wireless networking communities propose tools and techniques that address these phenomena. Emerging standardization activities within the Internet Research Task Force (IRTF) and the Consultative Committee for Space Data Systems (CCSDS) look to build upon both this experience and scalability analysis. Successful research in this area is predicated upon identifying enablers for common communication functions (notably node discovery, duplex communication, state caching, and link negotiation) and how increased disruptions and delays affect their feasibility within the network. Networks that make fewer assumptions relating to these enablers provide more universal service. Specifically, reliance on node discovery and link negotiation results in network-specific operational concepts rather than scalable technical solutions. Fundamental to this debate are the definitions, assumptions, operational concepts, and anticipated scaling of these networks. This paper presents the commonalities and differences between delay and disruption tolerance, including support protocols and critical enablers. We present where and how these tolerances differ. We propose a set of use cases that must be accommodated by any standardized delay-tolerant network and discuss the implication of these on existing tool development.

  5. IDL Object Oriented Software for Hinode/XRT Image Analysis

    NASA Astrophysics Data System (ADS)

    Higgins, P. A.; Gallagher, P. T.

    2008-09-01

    We have developed a set of object oriented IDL routines that enable users to search, download and analyse images from the X-Ray Telescope (XRT) on-board Hinode. In this paper, we give specific examples of how the object can be used and how multi-instrument data analysis can be performed. The XRT object is a highly versatile and powerful IDL object, which will prove to be a useful tool for solar researchers. This software utilizes the generic Framework object available within the GEN branch of SolarSoft.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.W.; Cook, J.L.

    One of the most powerful tools available for telemedicine is a multimedia medical record accessible over a wide area and simultaneously editable by multiple physicians. The ability to do this through an intuitive interface linking multiple distributed data repositories while maintaining full data integrity is a fundamental enabling technology in healthcare. The authors discuss the role of distributed object technology using Java and CORBA in providing this capability including an example of such a system (TeleMed) which can be accessed through the World Wide Web. Issues of security, scalability, data integrity, and usability are emphasized.

  7. Bio-sensing with butterfly wings: naturally occurring nano-structures for SERS-based malaria parasite detection.

    PubMed

    Garrett, Natalie L; Sekine, Ryo; Dixon, Matthew W A; Tilley, Leann; Bambery, Keith R; Wood, Bayden R

    2015-09-07

    Surface enhanced Raman scattering (SERS) is a powerful tool with great potential to provide improved bio-sensing capabilities. The current 'gold-standard' method for diagnosis of malaria involves visual inspection of blood smears using light microscopy, which is time consuming and can prevent early diagnosis of the disease. We present a novel surface-enhanced Raman spectroscopy substrate based on gold-coated butterfly wings, which enabled detection of malarial hemozoin pigment within lysed blood samples containing 0.005% and 0.0005% infected red blood cells.

  8. An Experimental Investigation of Dextrous Robots Using EVA Tools and Interfaces

    NASA Technical Reports Server (NTRS)

    Ambrose, Robert; Culbert, Christopher; Rehnmark, Frederik

    2001-01-01

    This investigation of robot capabilities with extravehicular activity (EVA) equipment looks at how improvements in dexterity are enabling robots to perform tasks once thought to be beyond machines. The approach is qualitative, using the Robonaut system at the Johnson Space Center (JSC), performing task trials that offer a quick look at this system's high degree of dexterity and the demands of EVA. Specific EVA tools attempted include tether hooks, power torque tools, and rock scoops, as well as conventional tools like scissors, wire strippers, forceps, and wrenches. More complex EVA equipment was also studied, with more complete tasks that mix tools, EVA hand rails, tethers, tool boxes, PIP pins, and EVA electrical connectors. These task trials have been ongoing over an 18-month period, as the Robonaut system evolved to its current 43-degree-of-freedom (DOF) configuration, soon to expand to over 50. In each case, the number of teleoperators is reported, with rough numbers of attempts and their experience level, and a subjective difficulty rating assigned to each piece of EVA equipment and function. JSC's Robonaut system was successful with all attempted EVA hardware, suggesting new options for human and robot teams working together in space.

  9. Parallel Implementation of MAFFT on CUDA-Enabled Graphics Hardware.

    PubMed

    Zhu, Xiangyuan; Li, Kenli; Salah, Ahmad; Shi, Lin; Li, Keqin

    2015-01-01

    Multiple sequence alignment (MSA) constitutes an extremely powerful tool for many biological applications including phylogenetic tree estimation, secondary structure prediction, and critical residue identification. However, aligning large biological sequences with popular tools such as MAFFT requires long runtimes on sequential architectures. Due to the ever increasing sizes of sequence databases, there is increasing demand to accelerate this task. In this paper, we demonstrate how graphics processing units (GPUs), powered by the compute unified device architecture (CUDA), can be used as an efficient computational platform to accelerate the MAFFT algorithm. To fully exploit the GPU's capabilities for accelerating MAFFT, we have optimized the sequence data organization to eliminate the bandwidth bottleneck of memory access, designed a memory allocation and reuse strategy to make full use of the limited memory of GPUs, proposed a new modified-run-length encoding (MRLE) scheme to reduce memory consumption, and used high-performance shared memory to speed up I/O operations. Our implementation, tested on three NVIDIA GPUs, achieves a speedup of up to 11.28 on a Tesla K20m GPU compared to the sequential MAFFT 7.015.
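    The memory-saving idea behind the MRLE scheme can be illustrated with plain run-length encoding: runs of identical residues are replaced by (symbol, count) pairs. This is only a minimal sketch of the principle; the paper's modified scheme is a GPU-oriented variant whose exact layout is not reproduced here.

```python
# Minimal run-length encoding/decoding for biological sequences.
def rle_encode(seq):
    """Collapse runs of identical characters into (symbol, count) pairs."""
    out = []
    for ch in seq:
        if out and out[-1][0] == ch:
            out[-1][1] += 1          # extend the current run
        else:
            out.append([ch, 1])      # start a new run
    return [(c, n) for c, n in out]

def rle_decode(pairs):
    """Expand (symbol, count) pairs back into the original sequence."""
    return "".join(c * n for c, n in pairs)

s = "AAAACCCGT"
enc = rle_encode(s)
print(enc)                           # [('A', 4), ('C', 3), ('G', 1), ('T', 1)]
assert rle_decode(enc) == s          # lossless round trip
```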

  10. A topological multilayer model of the human body.

    PubMed

    Barbeito, Antonio; Painho, Marco; Cabral, Pedro; O'Neill, João

    2015-11-04

    Geographical information systems deal with spatial databases in which topological models are described with alphanumeric information. Their graphical interfaces implement the multilayer concept and provide powerful interaction tools. In this study, we apply these concepts to the human body, creating a representation that would allow an interactive, precise, and detailed anatomical study. A vector surface component of the human body is built using a three-dimensional (3-D) reconstruction methodology. The multilayer concept is implemented by associating raster components with the corresponding vector surfaces, which include neighbourhood topology enabling spatial analysis. A root mean square error of 0.18 mm validated the three-dimensional reconstruction technique for internal anatomical structures. The expanded identification capability and the new neighbourhood analysis function are the tools provided by this model.

  11. Nanowire systems: technology and design

    PubMed Central

    Gaillardon, Pierre-Emmanuel; Amarù, Luca Gaetano; Bobba, Shashikanth; De Marchi, Michele; Sacchetto, Davide; De Micheli, Giovanni

    2014-01-01

    Nanosystems are large-scale integrated systems exploiting nanoelectronic devices. In this study, we consider double independent gate, vertically stacked nanowire field effect transistors (FETs) with gate-all-around structures and a typical diameter of 20 nm. These devices, which we have successfully fabricated and evaluated, control the ambipolar behaviour of the nanostructure by selectively enabling one type of carriers. These transistors work as switches with electrically programmable polarity and thus realize an exclusive-or (XOR) operation. The intrinsically higher expressive power of these FETs, when compared with standard complementary metal oxide semiconductor technology, enables us to realize more efficient logic gates, which we organize as tiles to realize nanowire systems by regular arrays. This article surveys both the technology for double independent gate FETs and the physical and logic design tools used to realize digital systems with this fabrication technology. PMID:24567471

  12. CASL L2 milestone report: VUQ.Y1.03, "Enable statistical sensitivity and UQ demonstrations for VERA."

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Witkowski, Walter R.

    2011-04-01

    The CASL Level 2 Milestone VUQ.Y1.03, 'Enable statistical sensitivity and UQ demonstrations for VERA,' was successfully completed in March 2011. The VUQ focus area led this effort, in close partnership with AMA, and with support from VRI. DAKOTA was coupled to VIPRE-W thermal-hydraulics simulations representing reactors of interest to address crud-related challenge problems in order to understand the sensitivity and uncertainty in simulation outputs with respect to uncertain operating and model form parameters. This report summarizes work coupling the software tools, characterizing uncertainties, selecting sensitivity and uncertainty quantification algorithms, and analyzing the results of iterative studies. These demonstration studies focused on sensitivity and uncertainty of the mass evaporation rate calculated by VIPRE-W, a key predictor for crud-induced power shift (CIPS).

  13. An episomal vector-based CRISPR/Cas9 system for highly efficient gene knockout in human pluripotent stem cells.

    PubMed

    Xie, Yifang; Wang, Daqi; Lan, Feng; Wei, Gang; Ni, Ting; Chai, Renjie; Liu, Dong; Hu, Shijun; Li, Mingqing; Li, Dajin; Wang, Hongyan; Wang, Yongming

    2017-05-24

    Human pluripotent stem cells (hPSCs) represent a unique opportunity for understanding the molecular mechanisms underlying complex traits and diseases. CRISPR/Cas9 is a powerful tool for introducing genetic mutations into hPSCs for loss-of-function studies. Here, we developed an episomal vector-based CRISPR/Cas9 system, which we called epiCRISPR, for highly efficient gene knockout in hPSCs. The epiCRISPR system enables generation of up to 100% insertion/deletion (indel) rates. In addition, the epiCRISPR system enables efficient double-gene knockout and genomic deletion. To minimize off-target cleavage, we combined the episomal vector technology with a double-nicking strategy and a recently developed high-fidelity Cas9. Thus the epiCRISPR system offers a highly efficient platform for genetic analysis in hPSCs.

  14. Modeling of power transmission and stress grading for corona protection

    NASA Astrophysics Data System (ADS)

    Zohdi, T. I.; Abali, B. E.

    2017-11-01

    Electrical high voltage (HV) machines are prone to corona discharges, leading to power losses as well as damage to the insulating layer. Many different techniques are applied for corona protection, and computational methods aid in selecting the best design. In this paper we develop a reduced-order model in 1D estimating the electric field and temperature distribution of a conductor wrapped with different layers, as is usual for HV machines. Since many assumptions and simplifications are involved in this 1D model, we compare its results quantitatively to a direct numerical simulation in 3D. Both models are transient and nonlinear, offering the choice of a quick 1D estimate or a full 3D computation at a correspondingly higher computational cost. Such tools enable understanding, evaluation, and optimization of corona shielding systems for multilayered coils.

  15. Dissecting genetic and environmental mutation signatures with model organisms.

    PubMed

    Segovia, Romulo; Tam, Annie S; Stirling, Peter C

    2015-08-01

    Deep sequencing has impacted on cancer research by enabling routine sequencing of genomes and exomes to identify genetic changes associated with carcinogenesis. Researchers can now use the frequency, type, and context of all mutations in tumor genomes to extract mutation signatures that reflect the driving mutational processes. Identifying mutation signatures, however, may not immediately suggest a mechanism. Consequently, several recent studies have employed deep sequencing of model organisms exposed to discrete genetic or environmental perturbations. These studies exploit the simpler genomes and availability of powerful genetic tools in model organisms to analyze mutation signatures under controlled conditions, forging mechanistic links between mutational processes and signatures. We discuss the power of this approach and suggest that many such studies may be on the horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Analytical thermal model for end-pumped solid-state lasers

    NASA Astrophysics Data System (ADS)

    Cini, L.; Mackenzie, J. I.

    2017-12-01

    Fundamentally power-limited by thermal effects, the design challenge for end-pumped "bulk" solid-state lasers depends upon knowledge of the temperature gradients within the gain medium. We have developed analytical expressions that can be used to model the temperature distribution and thermal-lens power in end-pumped solid-state lasers. Enabled by the inclusion of a temperature-dependent thermal conductivity, applicable from cryogenic to elevated temperatures, typical pumping distributions are explored and the results compared with accepted models. Key insights are gained through these analytical expressions, such as the dependence of the peak temperature rise on the thermal conductance of the boundary to the heat sink. Our generalized expressions provide simple and time-efficient tools for parametric optimization of the heat distribution in the gain medium based upon the material and pumping constraints.
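    The role of a temperature-dependent conductivity and of the boundary conductance can be illustrated numerically. The sketch below is not the paper's analytical expressions: it forward-integrates the steady 1-D relation dT/dx = q/k(T) through a slab, with a contact jump q/h at the heat sink. The conductivity model and all numbers are illustrative assumptions.

```python
# Steady 1-D heat flow with temperature-dependent conductivity k(T).
def k(T):
    """Illustrative crystal-like conductivity, W/(m K), falling as ~1/T."""
    return 10.0 * 300.0 / T

def peak_temperature(q, L, h, T_sink, n=10000):
    """Peak temperature (K) for heat flux q (W/m^2) through a slab of
    thickness L (m), cooled via boundary conductance h (W/m^2 K)."""
    T = T_sink + q / h        # temperature jump across the heat-sink contact
    dx = L / n
    for _ in range(n):        # integrate dT/dx = q / k(T) inward
        T += q / k(T) * dx
    return T

# Higher boundary conductance lowers the peak temperature, as the abstract's
# key insight suggests.
print(round(peak_temperature(q=1e4, L=5e-3, h=1e4, T_sink=300.0), 1))
print(round(peak_temperature(q=1e4, L=5e-3, h=1e5, T_sink=300.0), 1))
```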

  17. ATHLETE: Lunar Cargo Unloading from a High Deck

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2010-01-01

    As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. Each vehicle is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through or at least out of extreme terrain, the wheels and wheel actuators can be sized for nominal terrain. There are substantial mass savings in the wheel and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are at least comparable-to or larger-than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be lighter than a conventional all-terrain mobility chassis. A side benefit of this approach is that each limb has sufficient degrees of freedom to be used as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A power-take-off from the wheel actuates the tools, so that they can take advantage of the 1+ horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.

  18. ATHLETE: a Cargo and Habitat Transporter for the Moon

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2009-01-01

    As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. The vehicle concept is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through (or at least out of) extreme terrain, the wheels and wheel actuators can be sized only for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are comparable-to or larger-than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be about 25 percent lighter than a conventional mobility chassis for planetary exploration. A side benefit of this approach is that each limb has sufficient degrees-of-freedom for use as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A rotating power-take-off from the wheel actuates the tools, so that they can take advantage of the 1-plus-horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.

  19. [Addictions: Motivated or forced care].

    PubMed

    Cottencin, Olivier; Bence, Camille

    2016-12-01

    Patients presenting with addictions are often obliged to consult. This constraint can be explicit (partner, children, parents, doctor, police, justice) or implicit (for their children, for their families, or for their health). Thus, beyond the fact that the caregiver faces the paradox of caring for subjects who do not ask for treatment, he also faces a double bind, being regarded either as a supporter of the social order or as a helper of patients. The transtheoretical model of change is complex, showing that change is neither fixed in time nor perpetual for a given individual. This model includes ambivalence, resistance and even relapse, but it still treats constraint more as a brake than as an effective tool. The therapist must have adequate communication tools to enable everyone (coerced or not) to understand that involvement in care will enable him or her to regain free will, even if that means going through coercion. In this article, we detail the first steps with the patient presenting with an addiction: identifying the constraint (implicit or explicit), working with it, avoiding creating resistances ourselves, and making constraint a powerful motivator for change. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  20. Pluripotent stem cells reveal the developmental biology of human megakaryocytes and provide a source of platelets for clinical application.

    PubMed

    Takayama, Naoya; Eto, Koji

    2012-10-01

    Human pluripotent stem cells [PSCs; including human embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs)] can proliferate indefinitely in vitro and are easily accessible for gene manipulation. Megakaryocytes (MKs) and platelets can be created from human ESCs and iPSCs in vitro and represent a potential source of blood cells for transfusion and a promising tool for studying human thrombopoiesis. Moreover, disease-specific iPSCs are a powerful tool for elucidating the pathogenesis of hematological diseases and for drug screening. In that context, we and other groups have developed in vitro MK and platelet differentiation systems from human PSCs. Combining this co-culture system with a drug-inducible gene expression system enabled us to clarify the novel role played by c-MYC during human thrombopoiesis. In the next decade, technical advances (e.g., high-throughput genomic sequencing) will likely enable the identification of numerous gene mutations associated with abnormal thrombopoiesis. Combined with such technology, an in vitro system for differentiating human PSCs into MKs and platelets could provide a novel platform for studying human gene function associated with thrombopoiesis.

  1. Horizontal technology helps spark Louisiana's Austin chalk trend

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koen, A.D.

    1996-04-29

    A handful of companies paced by some of the most active operators in the US are pressing the limits of horizontal technology to ramp up Cretaceous Austin chalk exploration and development (E and D) across Louisiana. Companies find applications in Louisiana for lessons learned drilling horizontal wells to produce chalk intervals in Texas in Giddings, Pearsall, and Brookeland fields. Continuing advances in horizontal well technology are helping operators deal with deeper, hotter reservoirs in more complex geological settings that typify the chalk in Louisiana. Better horizontal drilling, completion, formation evaluation, and stimulation techniques have enabled operators to produce oil and gas from formations previously thought to be uneconomical. Most of the improved capabilities stem from better horizontal tools. Horizontal drilling breakthroughs include dual powered mud motors and retrievable whipstocks, key links in the ability to drill wells with more than one horizontal lateral. Better geosteering tools have enabled operators to maintain horizontal wellbores in desired intervals by signaling bit positions downhole while drilling. This paper reviews the technology and provides a historical perspective on the various drilling programs which have been completed in this trend. It also makes predictions on future drilling successes.

  2. Unlocking the potential of publicly available microarray data using inSilicoDb and inSilicoMerging R/Bioconductor packages.

    PubMed

    Taminau, Jonatan; Meganck, Stijn; Lazar, Cosmin; Steenhoff, David; Coletta, Alain; Molter, Colin; Duque, Robin; de Schaetzen, Virginie; Weiss Solís, David Y; Bersini, Hugues; Nowé, Ann

    2012-12-24

    With an abundant amount of microarray gene expression data sets available through public repositories, new possibilities lie in combining multiple existing data sets. In this new context, analysis itself is no longer the problem, but retrieving and consistently integrating all this data before delivering it to the wide variety of existing analysis tools becomes the new bottleneck. We present the newly released inSilicoMerging R/Bioconductor package which, together with the earlier released inSilicoDb R/Bioconductor package, allows consistent retrieval, integration and analysis of publicly available microarray gene expression data sets. Inside the inSilicoMerging package, a set of five visual and six quantitative validation measures is available as well. By providing (i) access to uniformly curated and preprocessed data, (ii) a collection of techniques to remove the batch effects between data sets from different sources, and (iii) several validation tools enabling the inspection of the integration process, these packages enable researchers to fully explore the potential of combining gene expression data for downstream analysis. The power of using both packages is demonstrated by programmatically retrieving and integrating gene expression studies from the InSilico DB repository [https://insilicodb.org/app/].
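    The simplest member of the batch-adjustment family mentioned in (ii) is gene-wise mean-centering per batch: each study's constant offset is removed before the studies are merged. This sketch only illustrates that principle (inSilicoMerging implements several such techniques); the two toy "studies" below are illustrative, with rows as genes and columns as samples.

```python
def mean_center(batch):
    """Subtract each gene's (row's) mean expression within one batch."""
    return [[v - sum(row) / len(row) for v in row] for row in batch]

# Same two genes measured in two studies; study_b carries a batch offset of +4.
study_a = [[5.0, 7.0], [2.0, 4.0]]
study_b = [[9.0, 11.0], [6.0, 8.0]]

# Center each study separately, then merge samples gene by gene.
merged = [ra + rb for ra, rb in zip(mean_center(study_a), mean_center(study_b))]
print(merged)  # the constant batch offset is gone from both studies
```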

  3. Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice

    NASA Astrophysics Data System (ADS)

    Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.

    2013-10-01

    Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool for quantifying NP biodistribution compared to conventional analytical methods such as an in vivo imaging system (IVIS). This is due in part to the better curve linearity offered by HPLC than by IVIS. Furthermore, HPLC enables us to fully analyze each gram of NPs present in the organs without compromising the signals and the depth-related sensitivity, as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and the physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics and diagnostics.

  4. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools for combining results across studies to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although several software packages implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
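    The core computation behind pooling results across GAS is inverse-variance weighting of per-study effect sizes. The sketch below shows a fixed-effect pooling of log odds ratios; the study values are made up for illustration, and MetaGenyo itself wraps this step in model selection, heterogeneity and publication-bias tests.

```python
import math

def pool_fixed(log_ors, variances):
    """Fixed-effect (inverse-variance) pooled log odds ratio and its SE."""
    w = [1.0 / v for v in variances]                       # study weights
    pooled = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return pooled, se

log_ors = [0.40, 0.10, 0.25]      # per-study log odds ratios (illustrative)
variances = [0.04, 0.02, 0.05]    # their sampling variances

pooled, se = pool_fixed(log_ors, variances)
print(f"pooled OR = {math.exp(pooled):.2f}, "
      f"95% CI on log scale: {pooled - 1.96*se:.2f} .. {pooled + 1.96*se:.2f}")
```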

  5. Four-to-one power combiner for 20 GHz phased array antenna using RADC MMIC phase shifters

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The design and microwave simulation of two-to-one microstrip power combiners is described. The power combiners were designed for use in a four element phased array receive antenna subarray at 20 GHz. Four test circuits are described which were designed to enable testing of the power combiner and the four element phased array antenna. Test Circuit 1 enables measurement of the two-to-one power combiner. Test Circuit 2 enables measurement of the four-to-one power combiner. Test Circuit 3 enables measurement of a four element antenna array without phase-shifting MMICs in order to characterize the power combiner with the antenna patch-to-microstrip coaxial feedthroughs. Test Circuit 4 is the four element phased array antenna including the RADC MMIC phase shifters and appropriate interconnects to provide bias voltages and control phase bits.

  6. Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool.

    PubMed

    Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S; Oishi, Kenichi; Miller, Michael I; Mori, Susumu; Faria, Andreia V

    2015-01-01

    Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer's Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer's disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases.
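    The record above uses a likelihood-based fusion algorithm; the simplest multi-atlas idea it builds on is per-voxel majority voting over the labels proposed by each registered atlas. The sketch below shows only that baseline, with toy labels and a toy three-voxel "image", not the paper's actual method.

```python
from collections import Counter

def majority_vote(atlas_labels):
    """atlas_labels: list of per-atlas label maps (equal-length lists).
    Returns the consensus (most-voted) label for each voxel."""
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*atlas_labels)]

# Three atlases each propose a tissue label for three voxels.
atlas1 = ["GM", "WM", "CSF"]
atlas2 = ["GM", "GM", "CSF"]
atlas3 = ["WM", "WM", "CSF"]
print(majority_vote([atlas1, atlas2, atlas3]))  # ['GM', 'WM', 'CSF']
```

Likelihood fusion generalizes this by weighting each atlas's vote by how well it matches the target image locally, which is what gives the multi-atlas approach its robustness across protocols.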

  7. Ask-the-expert: Active Learning Based Knowledge Discovery Using the Expert

    NASA Technical Reports Server (NTRS)

    Das, Kamalika; Avrekh, Ilya; Matthews, Bryan; Sharma, Manali; Oza, Nikunj

    2017-01-01

    Often the manual review of large data sets, either for purposes of labeling unlabeled instances or for classifying meaningful results from uninteresting (but statistically significant) ones is extremely resource intensive, especially in terms of subject matter expert (SME) time. Use of active learning has been shown to diminish this review time significantly. However, since active learning is an iterative process of learning a classifier based on a small number of SME-provided labels at each iteration, the lack of an enabling tool can hinder the process of adoption of these technologies in real-life, in spite of their labor-saving potential. In this demo we present ASK-the-Expert, an interactive tool that allows SMEs to review instances from a data set and provide labels within a single framework. ASK-the-Expert is powered by an active learning algorithm for training a classifier in the backend. We demonstrate this system in the context of an aviation safety application, but the tool can be adopted to work as a simple review and labeling tool as well, without the use of active learning.
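    The active-learning loop described above can be sketched as uncertainty sampling: at each round, the expert is asked to label the instance the current model is least certain about. Everything below is a toy stand-in for illustration: a 1-D threshold "classifier", a synthetic pool, and a programmatic oracle in place of the human SME.

```python
def fit_threshold(labeled):
    """Fit a decision threshold halfway between the two class means."""
    pos = [x for x, y in labeled if y == 1]
    neg = [x for x, y in labeled if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def active_learn(pool, oracle, seed, rounds=3):
    """Uncertainty sampling: repeatedly query the oracle (the 'expert')
    on the unlabeled point closest to the current decision threshold."""
    labeled = list(seed)
    unlabeled = [x for x in pool if x not in dict(seed)]
    for _ in range(rounds):
        t = fit_threshold(labeled)
        x = min(unlabeled, key=lambda v: abs(v - t))  # most uncertain point
        labeled.append((x, oracle(x)))                # one expert query
        unlabeled.remove(x)
    return fit_threshold(labeled)

pool = [0.0, 1.0, 2.0, 4.0, 6.0, 7.0, 8.0]
oracle = lambda x: int(x >= 4.0)                      # stand-in for the SME
threshold = active_learn(pool, oracle, seed=[(0.0, 0), (8.0, 1)])
print(round(threshold, 2))
```

With three expert queries the toy model converges near the true class boundary, which is the labor-saving effect the demo exploits.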

  8. Ask-the-Expert: Active Learning Based Knowledge Discovery Using the Expert

    NASA Technical Reports Server (NTRS)

    Das, Kamalika

    2017-01-01

    Manual review of large data sets, whether for labeling unlabeled instances or for separating meaningful results from uninteresting (but statistically significant) ones, is often extremely resource intensive, especially in terms of subject matter expert (SME) time. Use of active learning has been shown to reduce this review time significantly. However, since active learning is an iterative process of learning a classifier based on a small number of SME-provided labels at each iteration, the lack of an enabling tool can hinder the adoption of these technologies in real life, in spite of their labor-saving potential. In this demo we present ASK-the-Expert, an interactive tool that allows SMEs to review instances from a data set and provide labels within a single framework. ASK-the-Expert is powered by an active learning algorithm that trains a classifier in the back end. We demonstrate this system in the context of an aviation safety application, but the tool can also be adapted to work as a simple review and labeling tool, without the use of active learning.

  9. High-Sensitivity Nuclear Magnetic Resonance at Giga-Pascal Pressures: A New Tool for Probing Electronic and Chemical Properties of Condensed Matter under Extreme Conditions

    PubMed Central

    Meier, Thomas; Haase, Jürgen

    2014-01-01

    Nuclear Magnetic Resonance (NMR) is one of the most important techniques for the study of condensed matter systems, their chemical structure, and their electronic properties. The application of high pressure enables one to synthesize new materials, and the response of known materials to high pressure is a very useful tool for studying their electronic structure and developing theories. For example, high-pressure synthesis might be at the origin of life, and understanding the behavior of small molecules under extreme pressure will tell us more about fundamental processes in our universe. It is no wonder that there has always been great interest in having NMR available at high pressures. Unfortunately, the desired pressures are often well into the Giga-Pascal (GPa) range and require special anvil cell devices in which only very small, secluded volumes are available. This has restricted the use of NMR almost entirely in the past; only recently was a new approach to high-sensitivity GPa NMR, with a resonating micro-coil inside the sample chamber, put forward. This approach achieves the high sensitivity needed to bring the power of NMR to Giga-Pascal-pressure condensed matter research. First applications, the detection of a topological electronic transition in ordinary aluminum metal and the closing of the pseudo-gap in high-temperature superconductivity, show the power of such an approach. Meanwhile, the range of achievable pressures has been increased tremendously with a new generation of anvil cells (up to 10.1 GPa) that fit standard-bore NMR magnets. This approach might become a new, important tool for the investigation of many condensed matter systems in chemistry, geochemistry, and physics, since we can now watch structural changes with the eyes of a very versatile probe. PMID:25350694

  10. High-sensitivity nuclear magnetic resonance at Giga-Pascal pressures: a new tool for probing electronic and chemical properties of condensed matter under extreme conditions.

    PubMed

    Meier, Thomas; Haase, Jürgen

    2014-10-10

    Nuclear Magnetic Resonance (NMR) is one of the most important techniques for the study of condensed matter systems, their chemical structure, and their electronic properties. The application of high pressure enables one to synthesize new materials, and the response of known materials to high pressure is a very useful tool for studying their electronic structure and developing theories. For example, high-pressure synthesis might be at the origin of life, and understanding the behavior of small molecules under extreme pressure will tell us more about fundamental processes in our universe. It is no wonder that there has always been great interest in having NMR available at high pressures. Unfortunately, the desired pressures are often well into the Giga-Pascal (GPa) range and require special anvil cell devices in which only very small, secluded volumes are available. This has restricted the use of NMR almost entirely in the past; only recently was a new approach to high-sensitivity GPa NMR, with a resonating micro-coil inside the sample chamber, put forward. This approach achieves the high sensitivity needed to bring the power of NMR to Giga-Pascal-pressure condensed matter research. First applications, the detection of a topological electronic transition in ordinary aluminum metal and the closing of the pseudo-gap in high-temperature superconductivity, show the power of such an approach. Meanwhile, the range of achievable pressures has been increased tremendously with a new generation of anvil cells (up to 10.1 GPa) that fit standard-bore NMR magnets. This approach might become a new, important tool for the investigation of many condensed matter systems in chemistry, geochemistry, and physics, since we can now watch structural changes with the eyes of a very versatile probe.

  11. GROVER: An autonomous vehicle for ice sheet research

    NASA Astrophysics Data System (ADS)

    Trisca, G. O.; Robertson, M. E.; Marshall, H.; Koenig, L.; Comberiate, M. A.

    2013-12-01

    The Goddard Remotely Operated Vehicle for Exploration and Research, or Greenland Rover (GROVER), is a science-enabling autonomous robot specifically designed to carry a low-power, large-bandwidth radar for snow-accumulation mapping over the Greenland Ice Sheet. This new and evolving technology enables reduced cost and increased safety for polar research. GROVER was field-tested at Summit, Greenland in May 2013. The robot traveled over 30 km and was controlled both by line-of-sight wireless and completely autonomously, with commands and telemetry via the Iridium satellite network, from Summit as well as remotely from Boise, Idaho. Here we describe GROVER's unique abilities and design. The software stack features a modular design that can be adapted for any application requiring autonomous behavior, reliable communications using different technologies, and low-level control of peripherals. The modules communicate using the publisher-subscriber design pattern to maximize data reuse and allow for graceful failures at the software level, along with the ability to be loaded or unloaded on the fly, enabling the software to adopt different behaviors based on power constraints or specific processing needs. These modules can also be loaded or unloaded remotely for servicing, and telemetry can be configured to contain any kind of information generated by the sensors or scientific instruments. The hardware design protects the electronic components, and the control system can change functional parameters based on sensor input. Power-failure modes built into the hardware prevent the vehicle from running out of energy permanently by monitoring voltage levels and triggering software reboots when the levels match pre-established conditions. This guarantees that the control software will be operational as soon as there is enough charge to sustain it, giving the vehicle increased longevity in case of a temporary power loss. GROVER demonstrates that autonomous rovers can be a revolutionary tool for data collection, and that both the technology and the software are available and ready to be implemented to create scientific data-collection platforms.
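The publisher-subscriber pattern the GROVER software stack is built on can be illustrated with a minimal in-process message bus. The module names, topic string, and voltage threshold below are hypothetical, not GROVER's actual modules:

```python
from collections import defaultdict

class Bus:
    """Minimal in-process publish/subscribe message bus."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def unsubscribe(self, topic, callback):
        # modules can detach at runtime, analogous to unloading on the fly
        self.subscribers[topic].remove(callback)

    def publish(self, topic, message):
        for callback in list(self.subscribers[topic]):
            callback(message)

bus = Bus()
log = []
# a hypothetical telemetry module listening to battery readings
bus.subscribe("battery/volts", lambda v: log.append(("telemetry", v)))
# a hypothetical power-safety module that reacts only to low voltage
bus.subscribe("battery/volts", lambda v: v < 11.0 and log.append(("reboot", v)))
bus.publish("battery/volts", 12.6)
bus.publish("battery/volts", 10.8)
print(log)  # [('telemetry', 12.6), ('telemetry', 10.8), ('reboot', 10.8)]
```

Because publishers never reference subscribers directly, one module can fail, unload, or be replaced without the rest of the stack noticing, which is the "graceful failure" property the abstract describes.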

  12. Blast2GO goes grid: developing a grid-enabled prototype for functional genomics analysis.

    PubMed

    Aparicio, G; Götz, S; Conesa, A; Segrelles, D; Blanquer, I; García, J M; Hernandez, V; Robles, M; Talon, M

    2006-01-01

    The vast amount and complexity of data generated in genomic research imply that new, dedicated and powerful computational tools need to be developed to meet their analysis requirements. Blast2GO (B2G) is a bioinformatics tool for Gene Ontology-based DNA or protein sequence annotation and function-based data mining. The application has been developed with the aim of offering an easy-to-use tool for functional genomics research. Typical B2G users are middle-size genomics labs carrying out sequencing, EST and microarray projects, handling datasets of up to several thousand sequences. In the current version of B2G, the power and analytical potential of both annotation and function data mining are somewhat restricted by the computational power behind each particular installation. In order to offer the possibility of an enhanced computational capacity within this bioinformatics application, a Grid component is being developed. A prototype has been conceived for the particular problem of speeding up the Blast searches to obtain fast results for large datasets. Many efforts have been made in the literature concerning the speeding up of Blast searches, but few of them deal with the use of large, heterogeneous production Grid infrastructures. These are the infrastructures that could reach the largest number of resources and the best load balancing for data access. The Grid service under development will analyse requests based on the number of sequences, splitting them according to the available resources. Lower-level computation will be performed through mpiBLAST. The software architecture is based on the WSRF standard.
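The request-splitting step described above, dividing a set of query sequences according to the number of available compute resources, can be sketched as follows. The `split_requests` helper and the near-equal chunking rule are illustrative assumptions, not the service's actual scheduler:

```python
def split_requests(sequences, n_workers):
    """Partition query sequences into near-equal chunks, one per worker."""
    n_workers = min(n_workers, len(sequences)) or 1
    size, extra = divmod(len(sequences), n_workers)
    chunks, start = [], 0
    for i in range(n_workers):
        end = start + size + (1 if i < extra else 0)  # spread the remainder
        chunks.append(sequences[start:end])
        start = end
    return chunks

seqs = [f"seq{i}" for i in range(10)]
chunks = split_requests(seqs, 3)
print([len(c) for c in chunks])  # [4, 3, 3]
```

Each chunk would then be submitted to a Grid worker independently, and since BLAST queries are independent of one another, the partial result sets can simply be concatenated on return.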

  13. NASA Missions Enabled by Space Nuclear Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Schmidt, George R.

    2009-01-01

    This viewgraph presentation reviews NASA Space Missions that are enabled by Space Nuclear Systems. The topics include: 1) Space Nuclear System Applications; 2) Trade Space for Electric Power Systems; 3) Power Generation Specific Energy Trade Space; 4) Radioisotope Power Generation; 5) Radioisotope Missions; 6) Fission Power Generation; 7) Solar Powered Lunar Outpost; 8) Fission Powered Lunar Outpost; 9) Fission Electric Power Generation; and 10) Fission Nuclear Thermal Propulsion.

  14. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  15. DS Sentry: an acquisition ASIC for smart, micro-power sensing applications

    NASA Astrophysics Data System (ADS)

    Liobe, John; Fiscella, Mark; Moule, Eric; Balon, Mark; Bocko, Mark; Ignjatovic, Zeljko

    2011-06-01

    Unattended ground monitoring that combines seismic and acoustic information can be a highly valuable tool in intelligence gathering; however, there are several prerequisites for this approach to be viable. The first is high sensitivity, as well as the ability to discriminate real threats from noise and other spurious signals. By combining ground sensing with acoustic and image monitoring, this requirement may be achieved. Moreover, the DS Sentry® provides innate spurious-signal rejection by the "active-filtering" technique employed, as well as by embedding some basic statistical analysis. Another primary requirement is spatial and temporal coverage. The ideal is uninterrupted, long-term monitoring of an area. Therefore, sensors should be densely deployed and consume very little power. Furthermore, sensors must be inexpensive and easily deployed to allow dense placements in critical areas. The ADVIS DS Sentry®, a fully custom integrated circuit that enables smart, micro-power monitoring of dynamic signals, is the foundation of the proposed system. The core premise behind this technology is the use of an ultra-low-power front-end for active monitoring of dynamic signals in conjunction with a high-resolution, ΣΔ-based analog-to-digital converter, which utilizes a novel noise-rejection technique and is employed only when a potential threat has been detected. The DS Sentry® can be integrated with seismic accelerometers and microphones and user-programmed to continuously monitor for signals with specific signatures such as impacts, footsteps, excavation noise, vehicle-induced ground vibrations, or speech, while consuming only microwatts of power. This enables up to several years of continuous monitoring on a single small battery while concurrently mitigating false threats.
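The two-stage acquisition scheme described here, a cheap always-on front-end that wakes the expensive high-resolution converter only when a candidate event crosses a threshold, can be sketched in a few lines. The threshold value and sample stream are illustrative, not the DS Sentry's actual detection logic:

```python
def monitor(samples, threshold):
    """Low-power watch loop: count how often the high-resolution
    ADC path would be woken by a candidate event."""
    wakeups = 0
    events = []
    for t, amplitude in enumerate(samples):
        if abs(amplitude) >= threshold:     # cheap front-end comparison
            wakeups += 1
            events.append((t, amplitude))   # expensive path runs only now
    return wakeups, events

# quiet background with two impact-like spikes
signal = [0.01, -0.02, 0.01, 0.9, 0.02, -0.01, -0.7, 0.01]
wakeups, events = monitor(signal, threshold=0.5)
print(wakeups, events)  # 2 [(3, 0.9), (6, -0.7)]
```

Because the high-resolution path is exercised for only a tiny fraction of the samples, average power consumption is dominated by the comparator-style front-end, which is what makes multi-year battery operation plausible.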

  16. Distributed decision-making in electric power system transmission maintenance scheduling using multi-agent systems (MAS)

    NASA Astrophysics Data System (ADS)

    Zhang, Zhong

    In this work, motivated by the need to coordinate transmission maintenance scheduling among a multiplicity of self-interested entities in restructured power industry, a distributed decision support framework based on multiagent negotiation systems (MANS) is developed. An innovative risk-based transmission maintenance optimization procedure is introduced. Several models for linking condition monitoring information to the equipment's instantaneous failure probability are presented, which enable quantitative evaluation of the effectiveness of maintenance activities in terms of system cumulative risk reduction. Methodologies of statistical processing, equipment deterioration evaluation and time-dependent failure probability calculation are also described. A novel framework capable of facilitating distributed decision-making through multiagent negotiation is developed. A multiagent negotiation model is developed and illustrated that accounts for uncertainty and enables social rationality. Some issues of multiagent negotiation convergence and scalability are discussed. The relationships between agent-based negotiation and auction systems are also identified. A four-step MAS design methodology for constructing multiagent systems for power system applications is presented. A generic multiagent negotiation system, capable of inter-agent communication and distributed decision support through inter-agent negotiations, is implemented. A multiagent system framework for facilitating the automated integration of condition monitoring information and maintenance scheduling for power transformers is developed. Simulations of multiagent negotiation-based maintenance scheduling among several independent utilities are provided. It is shown to be a viable alternative solution paradigm to the traditional centralized optimization approach in today's deregulated environment. 
This multiagent system framework not only facilitates the decision-making among competing power system entities, but also provides a tool to use in studying competitive industry relative to monopolistic industry.

  17. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  18. 'Personalized medicine': what's in a name?

    PubMed

    Pokorska-Bocci, Anna; Stewart, Alison; Sagoo, Gurdeep S; Hall, Alison; Kroese, Mark; Burton, Hilary

    2014-03-01

    Over the last decade genomics and other molecular biosciences have enabled new capabilities that, according to many, have the potential to revolutionize medicine and healthcare. These developments have been associated with a range of terminologies, including 'precision', 'personalized', 'individualized' and 'stratified' medicine. In this article, based on a literature review, we examine how the terms have arisen and their various meanings and definitions. We discuss the impact of the new technologies on disease classification, prevention and management. We suggest that although genomics and molecular biosciences will undoubtedly greatly enhance the power of medicine, they will not lead to a conceptually new paradigm of medical care. What is new is the portfolio of modern tools that medicine and healthcare can use for better targeted approaches to health and disease management, and the sociopolitical contexts within which these tools are applied.

  19. In vivo RNAi: Today and Tomorrow

    PubMed Central

    Perrimon, Norbert; Ni, Jian-Quan; Perkins, Lizabeth

    2010-01-01

    RNA interference (RNAi) provides a powerful reverse-genetics approach to analyze gene functions both in tissue culture and in vivo. Because of its widespread applicability and effectiveness, it has become an essential part of the toolkits of model organisms such as Caenorhabditis elegans, Drosophila, and the mouse. In addition, the use of RNAi in animals in which genetic tools are either poorly developed or nonexistent enables a myriad of fundamental questions to be asked. Here, we review the methods and applications of in vivo RNAi to characterize gene functions in model organisms and discuss their impact on the study of developmental as well as evolutionary questions. Further, we discuss the applications of RNAi technologies to crop improvement, pest control and RNAi therapeutics, thus providing an appreciation of the potential for phenomenal applications of RNAi to agriculture and medicine. PMID:20534712

  20. Field-based Information Technology in Geology Education: GeoPads

    NASA Astrophysics Data System (ADS)

    Knoop, P. A.; van der Pluijm, B.

    2004-12-01

    During the past two summers, we have successfully incorporated a field-based information technology component into our senior-level, field geology course (GS-440) at the University of Michigan's Camp Davis Geology Field Station, near Jackson, WY. Using GeoPads -- rugged TabletPCs equipped with electronic notebook software, GIS, GPS, and wireless networking -- we have significantly enhanced our field mapping exercises and field trips. While fully retaining the traditional approaches and advantages of field instruction, GeoPads offer important benefits in the development of students' spatial reasoning skills. GeoPads enable students to record observations and directly create geologic maps in the field, using a combination of an electronic field notebook (Microsoft OneNote) tightly integrated with pen-enabled GIS software (ArcGIS-ArcMap). Specifically, this arrangement permits students to analyze and manipulate their data in multiple contexts and representations -- while still in the field -- using both traditional 2-D map views, as well as richer 3-D contexts. Such enhancements provide students with powerful exploratory tools that aid the development of spatial reasoning skills, allowing more intuitive interactions with 2-D representations of our 3-D world. Additionally, field-based GIS mapping enables better error-detection, through immediate interaction with current observations in the context of both supporting data (e.g., topographic maps, aerial photos, magnetic surveys) and students' ongoing observations. The overall field-based IT approach also provides students with experience using tools that are increasingly relevant to their future academic or professional careers.

  1. Electronics Environmental Benefits Calculator

    EPA Pesticide Factsheets

    The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase, use and disposal of electronics. The EEBC estimates the environmental and economic benefits of: purchasing Electronic Product Environmental Assessment Tool (EPEAT)-registered products; enabling power management features on computers and monitors above default percentages; extending the life of equipment beyond baseline values; reusing computers, monitors and cell phones; and recycling computers, monitors, cell phones and loads of mixed electronic products. The EEBC may be downloaded as a Microsoft Excel spreadsheet. See https://www.federalelectronicschallenge.net/resources/bencalc.htm for more details.

  2. Simulating Operation of a Large Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Frederick, Dean K.; DeCastro, Jonathan

    2008-01-01

    The Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) is a computer program for simulating transient operation of a commercial turbofan engine that can generate as much as 90,000 lb (~0.4 MN) of thrust. It includes a power-management system that enables simulation of open- or closed-loop engine operation over a wide range of thrust levels throughout the full range of flight conditions. C-MAPSS provides the user with a set of tools for performing open- and closed-loop transient simulations and comparisons of linear and non-linear models throughout its operating envelope, in an easy-to-use graphical environment.
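The open-loop/closed-loop distinction the record mentions can be illustrated with a minimal feedback loop: a proportional controller driving thrust toward a setpoint. The gain, step count, and thrust numbers are illustrative only, not the C-MAPSS control law:

```python
def closed_loop(setpoint, thrust, gain=0.5, steps=20):
    """Minimal proportional controller: at each step the actuator
    responds to the error between commanded and current thrust."""
    history = []
    for _ in range(steps):
        error = setpoint - thrust
        thrust += gain * error        # feedback closes the loop
        history.append(thrust)
    return history

# command a throttle-up from 20,000 lb to 80,000 lb of thrust
history = closed_loop(setpoint=80_000.0, thrust=20_000.0)
print(round(history[-1]))  # 80000
```

In open-loop operation the same model would instead be driven by a prescribed input schedule with no error term; closed-loop operation, as sketched here, is what lets the simulated power-management system hold a commanded thrust across changing conditions.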

  3. Voices from the heart: the use of digital story telling in education.

    PubMed

    Matthews, Jackie

    2014-01-01

    Digital storytelling has emerged as a powerful teaching and learning tool, which presents personal narratives, images and music to create a unique and sometimes emotional snapshot into another person's experience. By offering a platform for sharing and understanding such narratives, professionals may gain insight into a perceived experience and construct their role accordingly. Used effectively, they can engage the listener and offer opportunity to reflect and consider the impact of their professional role on the storyteller. This article looks at how digital storytelling can enhance professional practice and enable vulnerable voices to be heard.

  4. Internal scanning method as unique imaging method of optical vortex scanning microscope

    NASA Astrophysics Data System (ADS)

    Popiołek-Masajada, Agnieszka; Masajada, Jan; Szatkowski, Mateusz

    2018-06-01

    The internal scanning method is specific to the optical vortex microscope. It allows the vortex point to be moved inside the focused vortex beam with nanometer resolution while the whole beam stays in place. Thus the sample illuminated by the focused vortex beam can be scanned by the vortex point alone. We show that this method enables high-resolution imaging. The paper presents the preliminary experimental results obtained with the first basic image-recovery procedure. A prospect of developing more powerful tools for topography recovery with the optical vortex scanning microscope is discussed briefly.

  5. Intravital imaging of a spheroid-based orthotopic model of melanoma in the mouse ear skin

    PubMed Central

    Chan, Keefe T.; Jones, Stephen W.; Brighton, Hailey E.; Bo, Tao; Cochran, Shelly D.; Sharpless, Norman E.; Bear, James E.

    2017-01-01

    Multiphoton microscopy is a powerful tool that enables the visualization of fluorescently tagged tumor cells and their stromal interactions within tissues in vivo. We have developed an orthotopic model of implanting multicellular melanoma tumor spheroids into the dermis of the mouse ear skin without the requirement for invasive surgery. Here, we demonstrate the utility of this approach to observe the primary tumor, single cell actin dynamics, and tumor-associated vasculature. These methods can be broadly applied to investigate an array of biological questions regarding tumor cell behavior in vivo. PMID:28748125

  6. Plastic Surgery Applications Using Three-Dimensional Planning and Computer-Assisted Design and Manufacturing.

    PubMed

    Pfaff, Miles J; Steinbacher, Derek M

    2016-03-01

    Three-dimensional analysis and planning is a powerful tool in plastic and reconstructive surgery, enabling improved diagnosis, patient education and communication, and intraoperative transfer to achieve the best possible results. Three-dimensional planning can increase efficiency and accuracy, and entails five core components: (1) analysis, (2) planning, (3) virtual surgery, (4) three-dimensional printing, and (5) comparison of planned to actual results. The purpose of this article is to provide an overview of three-dimensional virtual planning and to provide a framework for applying these systems to clinical practice. Therapeutic, V.

  7. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a 'next-generation CED' team; in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-development tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefit of this approach.

  8. DNA origami nanopores: developments, challenges and perspectives

    NASA Astrophysics Data System (ADS)

    Hernández-Ainsa, Silvia; Keyser, Ulrich F.

    2014-11-01

    DNA nanotechnology has enabled the construction of DNA origami nanopores; synthetic nanopores that present improved capabilities for the area of single molecule detection. Their extraordinary versatility makes them a new and powerful tool in nanobiotechnology for a wide range of important applications beyond molecular sensing. In this review, we briefly present the recent developments in this emerging field of research. We discuss the current challenges and possible solutions that would enhance the sensing capabilities of DNA origami nanopores. Finally, we anticipate novel avenues for future research and highlight a range of exciting ideas and applications that could be explored in the near future.

  9. Potential use of combining the diffusion equation with the free Schrödinger equation to improve the Optical Coherence Tomography image analysis

    NASA Astrophysics Data System (ADS)

    Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.

    2006-03-01

    Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolution than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of the OCT system, obtained by combining the diffusion equation with the free Schrödinger equation. In such a formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed novel approach performs well in speckle-noise removal, enhancement, and segmentation of the various cellular layers of the retina using the OCT system.
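The two equations named in the title have standard forms; the abstract does not spell out the paper's specific combination, but the formal relation between them is well known and is what permits extending a real-axis smoothing analysis into the complex domain:

```latex
\frac{\partial u}{\partial t} = D\,\nabla^{2} u \quad \text{(diffusion)},
\qquad
i\hbar\,\frac{\partial \psi}{\partial t} = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi
\quad \text{(free Schr\"odinger)},
```

so the free Schrödinger equation is a diffusion equation with the imaginary diffusion coefficient \(D = i\hbar/2m\).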

  10. Quantum computing on encrypted data

    NASA Astrophysics Data System (ADS)

    Fisher, K. A. G.; Broadbent, A.; Shalm, L. K.; Yan, Z.; Lavoie, J.; Prevedel, R.; Jennewein, T.; Resch, K. J.

    2014-01-01

    The ability to perform computations on encrypted data is a powerful tool for protecting privacy. Recently, protocols to achieve this on classical computing systems have been found. Here, we present an efficient solution to the quantum analogue of this problem that enables arbitrary quantum computations to be carried out on encrypted quantum data. We prove that an untrusted server can implement a universal set of quantum gates on encrypted quantum bits (qubits) without learning any information about the inputs, while the client, knowing the decryption key, can easily decrypt the results of the computation. We experimentally demonstrate, using single photons and linear optics, the encryption and decryption scheme on a set of gates sufficient for arbitrary quantum computations. As our protocol requires few extra resources compared with other schemes it can be easily incorporated into the design of future quantum servers. These results will play a key role in enabling the development of secure distributed quantum systems.

  11. A new X-ray fluorescence spectroscopy for extraterrestrial materials using a muon beam

    PubMed Central

    Terada, K.; Ninomiya, K.; Osawa, T.; Tachibana, S.; Miyake, Y.; Kubo, M. K.; Kawamura, N.; Higemoto, W.; Tsuchiyama, A.; Ebihara, M.; Uesugi, M.

    2014-01-01

    The recent development of the intense pulsed muon source at J-PARC MUSE, the Japan Proton Accelerator Research Complex/MUon Science Establishment (10⁶ s⁻¹ for a momentum of 60 MeV/c), enabled us to pioneer a new frontier in the analytical sciences. Here, we report a non-destructive elemental analysis using µ⁻ capture. Controlling the muon momentum from 32.5 to 57.5 MeV/c, we successfully demonstrate a depth-profile analysis of light elements (B, C, N, and O) from several-mm-thick layered materials and non-destructive bulk analyses of meteorites containing organic materials. Muon beam analysis, enabling a bulk analysis of light to heavy elements without severe radioactivation, is a unique analytical method complementary to other non-destructive analyses. Furthermore, this technology can be used as a powerful tool to identify the content and distribution of organic components in future asteroidal return samples. PMID:24861282

  12. poRe: an R package for the visualization and analysis of nanopore sequencing data.

    PubMed

    Watson, Mick; Thomson, Marian; Risse, Judith; Talbot, Richard; Santoyo-Lopez, Javier; Gharbi, Karim; Blaxter, Mark

    2015-01-01

    The Oxford Nanopore MinION device represents a unique sequencing technology. As a mobile sequencing device powered by the USB port of a laptop, the MinION has huge potential applications. To enable these applications, the bioinformatics community will need to design and build a suite of tools specifically for MinION data. Here we present poRe, a package for R that enables users to manipulate, organize, summarize and visualize MinION nanopore sequencing data. As a package for R, poRe has been tested on Windows, Linux and MacOSX. Crucially, the Windows version allows users to analyse MinION data on the Windows laptop attached to the device. poRe is released as a package for R at http://sourceforge.net/projects/rpore/. A tutorial and further information are available at https://sourceforge.net/p/rpore/wiki/Home/. © The Author 2014. Published by Oxford University Press.

  13. Quantum computing on encrypted data.

    PubMed

    Fisher, K A G; Broadbent, A; Shalm, L K; Yan, Z; Lavoie, J; Prevedel, R; Jennewein, T; Resch, K J

    2014-01-01

    The ability to perform computations on encrypted data is a powerful tool for protecting privacy. Recently, protocols to achieve this on classical computing systems have been found. Here, we present an efficient solution to the quantum analogue of this problem that enables arbitrary quantum computations to be carried out on encrypted quantum data. We prove that an untrusted server can implement a universal set of quantum gates on encrypted quantum bits (qubits) without learning any information about the inputs, while the client, knowing the decryption key, can easily decrypt the results of the computation. We experimentally demonstrate, using single photons and linear optics, the encryption and decryption scheme on a set of gates sufficient for arbitrary quantum computations. As our protocol requires few extra resources compared with other schemes, it can be easily incorporated into the design of future quantum servers. These results will play a key role in enabling the development of secure distributed quantum systems.
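    The quantum one-time pad underlying such schemes can be sketched numerically. The following single-qubit illustration in pure Python is not the authors' protocol, only the basic idea: the client pads a qubit with random Pauli operators X^a Z^b, the untrusted server applies a Hadamard gate to the padded state without learning anything about it, and the client recovers H|psi> after updating the key (a, b) -> (b, a).

```python
import math
import random

# Single-qubit states as 2-element complex vectors; gates as 2x2 matrices.
def apply(gate, state):
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]
s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]

# Client: encrypt |psi> with a quantum one-time pad, enc = X^a Z^b |psi>.
psi = [0.6, 0.8]                       # an arbitrary normalized qubit
a, b = random.randint(0, 1), random.randint(0, 1)
enc = apply(X if a else I, apply(Z if b else I, psi))

# Server: applies H to the encrypted qubit; enc looks maximally mixed to it.
served = apply(H, enc)

# Client: H X^a Z^b = Z^a X^b H, so the pad key (a, b) becomes (b, a);
# decrypt by undoing the updated pad: apply Z^a, then X^b.
dec = apply(X if b else I, apply(Z if a else I, served))

expected = apply(H, psi)               # what H|psi> should be
err = sum(abs(d - e) for d, e in zip(dec, expected))
print(round(err, 12))
```

The key-update rule is the whole trick: Pauli pads commute through Clifford gates in a way the client can track classically, so the server never needs the key.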

  14. A new X-ray fluorescence spectroscopy for extraterrestrial materials using a muon beam.

    PubMed

    Terada, K; Ninomiya, K; Osawa, T; Tachibana, S; Miyake, Y; Kubo, M K; Kawamura, N; Higemoto, W; Tsuchiyama, A; Ebihara, M; Uesugi, M

    2014-05-27

    The recent development of the intense pulsed muon source at J-PARC MUSE, Japan Proton Accelerator Research Complex/MUon Science Establishment (10⁶ s⁻¹ for a momentum of 60 MeV/c), enabled us to pioneer a new frontier in analytical sciences. Here, we report a non-destructive elemental analysis using µ⁻ capture. Controlling muon momentum from 32.5 to 57.5 MeV/c, we successfully demonstrate a depth-profile analysis of light elements (B, C, N, and O) from several mm-thick layered materials and non-destructive bulk analyses of meteorites containing organic materials. Muon beam analysis, enabling a bulk analysis of light to heavy elements without severe radioactivation, is a unique analytical method complementary to other non-destructive analyses. Furthermore, this technology can be used as a powerful tool to identify the content and distribution of organic components in future asteroidal return samples.

  15. Custom controls

    NASA Astrophysics Data System (ADS)

    Butell, Bart

    1996-02-01

    Microsoft's Visual Basic (VB) and Borland's Delphi provide an extremely robust programming environment for delivering multimedia solutions for interactive kiosks, games and titles. Their object-oriented use of standard and custom controls enables users to build extremely powerful applications. A multipurpose, database-enabled programming environment that provides an event-driven interface functions as a multimedia kernel. This kernel can support a variety of authoring solutions (e.g. a timeline-based model similar to Macromedia Director or a node-authoring model similar to Icon Author). At the heart of the kernel is a set of low-level multimedia components providing object-oriented interfaces for graphics, audio, video and imaging. Data preparation tools (e.g. layout, palette and sprite editors) could be built to manage the media database. The flexible interface of VB allows the construction of an infinite number of user models. The proliferation of these models within a popular, easy-to-use environment will allow the vast developer segment of 'producer' types to bring their ideas to the market. This is the key to building exciting, content-rich multimedia solutions. Microsoft's VB and Borland's Delphi environments, combined with multimedia components, enable these possibilities.

  16. The mediation of environmental assessment's influence: What role for power?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cashmore, Matthew, E-mail: cashmore@plan.aau.dk; Axelsson, Anna

    2013-02-15

    Considerable empirical research has been conducted on why policy tools such as environmental assessment (EA) often appear to have 'little effect' (after Weiss) on policy decisions. This article revisits this debate but looks at a mediating factor that has received limited attention to date in the context of EA - political power. Using a tripartite analytical framework, a comparative analysis of the influence and significance of power in mediating environmental policy integration is undertaken. Power is analysed, albeit partially, through an exploration of institutions that underpin social order. Empirically, the research examines the case of a new approach to policy-level EA (essentially a form of Strategic Environmental Assessment) developed by the World Bank and its trial application to urban environmental governance and planning in Dhaka mega-city, Bangladesh. The research results demonstrate that power was intimately involved in mediating the influence of the policy EA approach, in both positive (enabling) and negative (constraining) ways. It is suggested that the policy EA approach was ultimately a manifestation of a corporate strategy to maintain the powerful position of the World Bank as a leading authority on international development which focuses on knowledge generation. Furthermore, as constitutive of an institution and reflecting the worldviews of its proponents, the development of a new approach to EA also represents a significant power play. This leads us to, firstly, emphasise the concepts of strategy and intentionality in theorising how and why EA tools are employed, succeed and fail; and secondly, reflect on the reasons why power has received such limited attention to date in EA scholarship.
    Highlights:
    - Conducts empirical research on the neglected issue of power.
    - Employs an interpretation of power in which it is viewed as a productive phenomenon.
    - Analyses the influence of power in the trial application of a new approach to policy environmental assessment.
    - Demonstrates the importance of power dynamics in understanding the successes and failures of environmental assessment.

  17. Dynamics of the line-start reluctance motor with rotor made of SMC material

    NASA Astrophysics Data System (ADS)

    Smółka, Krzysztof; Gmyrek, Zbigniew

    2017-12-01

    Designing and controlling electric motors so as to ensure the expected motor dynamics is a problem that has been studied for many years. Many researchers have tried to solve it, for example through design optimization or through the use of special control algorithms in electronic systems. In the case of low-power and fractional-power motors, the manufacturing cost of the final product is many times lower than the cost of the electronic system powering them. The authors of this paper attempt to improve the dynamics of a 120 W line-start synchronous reluctance motor energized by 50 Hz mains (without any electronic systems). They seek a way to improve the dynamics of the analyzed motor by changing the shape and material of the rotor, so as to minimize the cost of modifying the tools necessary for motor production. After an initial selection, four rotors with different tooth shapes were analyzed.

  18. Scientific Resource EXplorer

    NASA Astrophysics Data System (ADS)

    Xing, Z.; Wormuth, A.; Smith, A.; Arca, J.; Lu, Y.; Sayfi, E.

    2014-12-01

    Inquisitive minds in our society are never satisfied with curated images released by a typical public affairs office. They always want to look deeper and play directly on original data. However, most scientific data products are notoriously hard to use. They are immensely large, highly distributed and diverse in format. In this presentation, we will demonstrate Resource EXplorer (REX), a novel webtop application that allows anyone to conveniently explore and visualize rich scientific data repositories, using only a standard web browser. This tool leverages the power of Webification Science (w10n-sci), a powerful enabling technology that simplifies the use of scientific data on the web platform. W10n-sci is now being deployed at an increasing number of NASA data centers, some of which are the largest digital treasure troves in our nation. With REX, these wonderful scientific resources are open for teachers and students to learn and play.

  19. Electrical safety device

    DOEpatents

    White, David B.

    1991-01-01

    An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.

  20. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
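    The idea of an XML vocabulary for spatiotemporal imaging events can be sketched with Python's standard library. The element and attribute names below are invented for illustration and are not taken from the published cellular imaging markup language schema:

```python
import xml.etree.ElementTree as ET

# Build a hypothetical imaging-event fragment: a mitosis event observed at
# a given pixel location and frame, lasting a number of frames.
root = ET.Element("cellEvent", type="mitosis")
ET.SubElement(root, "location", x="120", y="88", frame="42")
ET.SubElement(root, "duration", frames="12")

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)

# Round-trip: parse the fragment back and read the event type and frame,
# the kind of query an event-matching service would perform.
parsed = ET.fromstring(xml_text)
print(parsed.get("type"), parsed.find("location").get("frame"))
```

Encoding events as structured markup rather than free text is what makes the composition and automated matching described above possible: a matcher can query tags and attributes instead of parsing prose.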

  1. B-MIC: An Ultrafast Three-Level Parallel Sequence Aligner Using MIC.

    PubMed

    Cui, Yingbo; Liao, Xiangke; Zhu, Xiaoqian; Wang, Bingqiang; Peng, Shaoliang

    2016-03-01

    Sequence alignment, in which raw sequencing data are mapped to a reference genome, is the central process in sequence analysis. The large amount of data generated by NGS is far beyond the processing capabilities of existing alignment tools; consequently, sequence alignment has become the bottleneck of sequence analysis, and intensive computing power is required to address this challenge. Intel recently announced the MIC coprocessor, which can provide massive computing power. Tianhe-2, now the world's fastest supercomputer, is equipped with three MIC coprocessors per compute node. A key feature of sequence alignment is that different reads are independent. Exploiting this property, we propose a MIC-oriented three-level parallelization strategy to speed up BWA, a widely used sequence alignment tool, and develop our ultrafast parallel sequence aligner: B-MIC. B-MIC contains three levels of parallelization: first, parallelization of data IO and read alignment via a three-stage parallel pipeline; second, parallelization enabled by MIC coprocessor technology; third, inter-node parallelization implemented with MPI. In this paper, we demonstrate that B-MIC outperforms BWA through a combination of these techniques using an Inspur NF5280M server and the Tianhe-2 supercomputer. To the best of our knowledge, B-MIC is the first sequence alignment tool to run on Intel MIC, and it achieves more than fivefold speedup over the original BWA while maintaining alignment precision.
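    The first parallelization level, a three-stage pipeline over independent reads, can be sketched with standard-library threads and queues. B-MIC itself targets MIC coprocessors and MPI; the stage names and the stand-in "alignment" step below are illustrative only:

```python
import queue
import threading

# Stage 1: read input batches and feed the pipeline.
def reader(reads, q_in):
    for r in reads:
        q_in.put(r)
    q_in.put(None)                      # sentinel: no more input

# Stage 2: "align" each read. Uppercasing stands in for BWA-style alignment.
def aligner(q_in, q_out):
    while (r := q_in.get()) is not None:
        q_out.put(r.upper())
    q_out.put(None)

# Stage 3: collect results (in a real aligner, write output records).
def writer(q_out, results):
    while (r := q_out.get()) is not None:
        results.append(r)

reads = ["acgt", "ttga", "ccag"]
q_in, q_out, results = queue.Queue(), queue.Queue(), []
stages = [threading.Thread(target=reader, args=(reads, q_in)),
          threading.Thread(target=aligner, args=(q_in, q_out)),
          threading.Thread(target=writer, args=(q_out, results))]
for t in stages:
    t.start()
for t in stages:
    t.join()
print(results)                          # ['ACGT', 'TTGA', 'CCAG']
```

Because reads are independent, the IO stages overlap with computation, and the middle stage can be replicated across cores or coprocessors without changing the pipeline's shape.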

  2. Analysis of Facial Injuries Caused by Power Tools.

    PubMed

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. Previous reports have described the increase in hand surgery as a consequence of power tool use, its economic impact, and the characteristics of hand injuries caused by power saws. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data on facial injuries caused by power saws gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of facial injuries caused by power saws. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurement and functional disability. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  3. Enabling Research Tools for Sustained Climate Assessment

    NASA Technical Reports Server (NTRS)

    Leidner, Allison K.; Bosilovich, Michael G.; Jasinski, Michael F.; Nemani, Ramakrishna R.; Waliser, Duane Edward; Lee, Tsengdar J.

    2016-01-01

    The U.S. Global Change Research Program Sustained Assessment process benefits from long-term investments in Earth science research that enable the scientific community to conduct assessment-relevant science. To this end, NASA initiated several research programs over the past five years to support the Earth observation community in developing indicators, datasets, research products, and tools to support ongoing and future National Climate Assessments. These activities complement NASA's ongoing Earth science research programs. One aspect of the assessment portfolio funds four "enabling tools" projects at NASA research centers. Each tool leverages existing capacity within the center, but has developed tailored applications and products for National Climate Assessments. The four projects build on the capabilities of a global atmospheric reanalysis (MERRA-2), a continental U.S. land surface reanalysis (NCA-LDAS), the NASA Earth Exchange (NEX), and a Regional Climate Model Evaluation System (RCMES). Here, we provide a brief overview of each enabling tool, highlighting the ways in which it has advanced assessment science to date. We also discuss how the assessment community can access and utilize these tools for National Climate Assessments and other sustained assessment activities.

  4. Design Optimization of a Variable-Speed Power Turbine

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Jones, Scott M.; Gray, Justin S.

    2014-01-01

    NASA's Rotary Wing Project is investigating technologies that will enable the development of revolutionary civil tilt rotor aircraft. Previous studies have shown that for large tilt rotor aircraft to be viable, the rotor speeds need to be slowed significantly during the cruise portion of the flight. This requirement to slow the rotors during cruise presents an interesting challenge to the propulsion system designer as efficient engine performance must be achieved at two drastically different operating conditions. One potential solution to this challenge is to use a transmission with multiple gear ratios and shift to the appropriate ratio during flight. This solution will require a large transmission that is likely to be maintenance intensive and will require a complex shifting procedure to maintain power to the rotors at all times. An alternative solution is to use a fixed gear ratio transmission and require the power turbine to operate efficiently over the entire speed range. This concept is referred to as a variable-speed power-turbine (VSPT) and is the focus of the current study. This paper explores the design of a variable speed power turbine for civil tilt rotor applications using design optimization techniques applied to NASA's new meanline tool, the Object-Oriented Turbomachinery Analysis Code (OTAC).
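    The core tension described above is that one fixed turbine design must perform well at two very different operating speeds. The toy example below is not OTAC or the authors' method; the single design variable, the quadratic loss model, and the two operating points are all invented for illustration, showing only why a multi-point design must compromise between conditions:

```python
# Hypothetical loss for a single blade-design parameter d at a given rotor
# speed: a quadratic penalty around a speed-dependent optimum. Both the
# model and its constants are invented for this sketch.
def loss(d, speed):
    best_d = 0.5 + 0.3 * speed      # each speed prefers a different design
    return (d - best_d) ** 2

# Multi-point objective: sum the losses at 100% speed (hover/takeoff) and
# at the slowed cruise condition (here 54% speed, an assumed figure).
def combined_loss(d):
    return loss(d, 1.0) + loss(d, 0.54)

# Simple grid search over the design variable stands in for a real
# design-optimization loop.
candidates = [i / 1000 for i in range(1001)]
d_opt = min(candidates, key=combined_loss)
print(d_opt)
```

The optimum lands between the two single-speed optima (0.8 and 0.662), illustrating the compromise a VSPT design study must quantify with a real loss model.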

  5. Spinoff 2009

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Topics covered include: Image-Capture Devices Extend Medicine's Reach; Medical Devices Assess, Treat Balance Disorders; NASA Bioreactors Advance Disease Treatments; Robotics Algorithms Provide Nutritional Guidelines; "Anti-Gravity" Treadmills Speed Rehabilitation; Crew Management Processes Revitalize Patient Care; Hubble Systems Optimize Hospital Schedules; Web-based Programs Assess Cognitive Fitness; Electrolyte Concentrates Treat Dehydration; Tools Lighten Designs, Maintain Structural Integrity; Insulating Foams Save Money, Increase Safety; Polyimide Resins Resist Extreme Temperatures; Sensors Locate Radio Interference; Surface Operations Systems Improve Airport Efficiency; Nontoxic Resins Advance Aerospace Manufacturing; Sensors Provide Early Warning of Biological Threats; Robot Saves Soldiers' Lives Overseas (MarcBot); Apollo-Era Life Raft Saves Hundreds of Sailors; Circuits Enhance Scientific Instruments and Safety Devices; Tough Textiles Protect Payloads and Public Safety Officers; Forecasting Tools Point to Fishing Hotspots; Air Purifiers Eliminate Pathogens, Preserve Food; Fabrics Protect Sensitive Skin from UV Rays; Phase Change Fabrics Control Temperature; Tiny Devices Project Sharp, Colorful Images; Star-Mapping Tools Enable Tracking of Endangered Animals; Nanofiber Filters Eliminate Contaminants; Modeling Innovations Advance Wind Energy Industry; Thermal Insulation Strips Conserve Energy; Satellite Respondent Buoys Identify Ocean Debris; Mobile Instruments Measure Atmospheric Pollutants; Cloud Imagers Offer New Details on Earth's Health; Antennas Lower Cost of Satellite Access; Feature Detection Systems Enhance Satellite Imagery; Chlorophyll Meters Aid Plant Nutrient Management; Telemetry Boards Interpret Rocket, Airplane Engine Data; Programs Automate Complex Operations Monitoring; Software Tools Streamline Project Management; Modeling Languages Refine Vehicle Design; Radio Relays Improve Wireless Products; Advanced Sensors Boost Optical 
Communication, Imaging; Tensile Fabrics Enhance Architecture Around the World; Robust Light Filters Support Powerful Imaging Devices; Thermoelectric Devices Cool, Power Electronics; Innovative Tools Advance Revolutionary Weld Technique; Methods Reduce Cost, Enhance Quality of Nanotubes; Gauging Systems Monitor Cryogenic Liquids; Voltage Sensors Monitor Harmful Static; and Compact Instruments Measure Heat Potential.

  6. A new mask exposure and analysis facility

    NASA Astrophysics Data System (ADS)

    te Sligte, Edwin; Koster, Norbert; Deutz, Alex; Staring, Wilbert

    2014-10-01

    The introduction of ever higher source powers in EUV systems causes increased risks of contamination and degradation of EUV masks and pellicles. Appropriate testing can help to inventory and mitigate these risks. To this end, we propose EBL2: a laboratory EUV exposure system capable of operating at high EUV powers and intensities, and of exposing and analyzing EUV masks. The proposed system architecture is similar to the EBL system which has been operated jointly by TNO and Carl Zeiss SMT since 2005. EBL2 contains an EUV Beam Line, in which samples can be exposed to EUV irradiation in a controlled environment. Attached to this Beam Line is an XPS system, which can be reached from the Beam Line via an in-vacuum transfer system. This enables surface analysis of exposed masks without breaking vacuum. Automated handling with dual pods is foreseen so that exposed EUV masks will still be usable in EUV lithography tools to assess the imaging impact of the exposure. Compared to the existing system, large improvements in EUV power, intensity, reliability, and flexibility are proposed. Also, in-situ measurements by, e.g., ellipsometry are foreseen for real-time monitoring of the sample condition. The system shall be equipped with additional ports for EUVR or other analysis tools. This unique facility will be open to external customers and other research groups.

  7. High Penetration Solar PV Deployment Sunshine State Solar Grid Initiative (SUNGRIN)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meeker, Rick; Steurer, Mischa; Faruque, MD Omar

    The report provides results from the Sunshine State Solar Grid Initiative (SUNGRIN) high penetration solar PV deployment project led by Florida State University’s (FSU) Center for Advanced Power Systems (CAPS). FSU CAPS and industry and university partners have completed a five-year effort aimed at enabling effective integration of high penetration levels of grid-connected solar PV generation. SUNGRIN has made significant contributions in the development of simulation-assisted techniques, tools, insight and understanding associated with solar PV effects on electric power system (EPS) operation and the evaluation of mitigation options for maintaining reliable operation. An important element of the project was the partnership and participation of six major Florida utilities and the Florida Reliability Coordinating Council (FRCC). Utilities provided details and data associated with actual distribution circuits having high-penetration PV to use as case studies. The project also conducted foundational work supporting future investigations of effects at the transmission / bulk power system level. In the final phase of the project, four open-use models with built-in case studies were developed and released, along with synthetic solar PV data sets, and tools and techniques for model reduction and in-depth parametric studies of solar PV impact on distribution circuits. Along with models and data, at least 70 supporting MATLAB functions have been developed and made available, with complete documentation.

  8. BioWord: A sequence manipulation suite for Microsoft Word

    PubMed Central

    2012-01-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326

  9. BioWord: a sequence manipulation suite for Microsoft Word.

    PubMed

    Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan

    2012-06-07

    The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
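    Two of the operations BioWord integrates can be sketched outside VBA. The minimal Python analogue below (function names are illustrative, not BioWord's API) shows reverse complementation and the per-position base counts that underlie a consensus logo:

```python
# Translation table mapping each DNA base to its complement.
COMP = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(seq):
    # Complement every base, then reverse the strand.
    return seq.translate(COMP)[::-1]

def position_counts(seqs):
    # Per-column base counts for equal-length aligned sequences:
    # the raw data a consensus-logo renderer would scale into letters.
    counts = [{} for _ in range(len(seqs[0]))]
    for s in seqs:
        for i, base in enumerate(s):
            counts[i][base] = counts[i].get(base, 0) + 1
    return counts

print(reverse_complement("ATGC"))           # GCAT
cons = position_counts(["ATG", "ATA", "TTG"])
print([max(c, key=c.get) for c in cons])    # consensus: ['A', 'T', 'G']
```

The same few lines of logic, wrapped in a familiar word-processor interface, are what make a tool like BioWord accessible to students while remaining extensible for power users.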

  10. Shared Memory Parallelism for 3D Cartesian Discrete Ordinates Solver

    NASA Astrophysics Data System (ADS)

    Moustafa, Salli; Dutka-Malen, Ivan; Plagne, Laurent; Ponçot, Angélique; Ramet, Pierre

    2014-06-01

    This paper describes the design and the performance of DOMINO, a 3D Cartesian SN solver that implements two nested levels of parallelism (multicore+SIMD) on shared memory computation nodes. DOMINO is written in C++, a multi-paradigm programming language that enables the use of powerful and generic parallel programming tools such as Intel TBB and Eigen. These two libraries allow us to combine multi-thread parallelism with vector operations in an efficient and yet portable way. As a result, DOMINO can exploit the full power of modern multi-core processors and is able to tackle very large simulations, that usually require large HPC clusters, using a single computing node. For example, DOMINO solves a 3D full core PWR eigenvalue problem involving 26 energy groups, 288 angular directions (S16), 46 × 10⁶ spatial cells and 1 × 10¹² DoFs within 11 hours on a single 32-core SMP node. This represents a sustained performance of 235 GFlops and 40.74% of the SMP node peak performance for the DOMINO sweep implementation. The very high Flops/Watt ratio of DOMINO makes it a very interesting building block for a future many-nodes nuclear simulation tool.

  11. Unleashing the Power of Distributed CPU/GPU Architectures: Massive Astronomical Data Analysis and Visualization Case Study

    NASA Astrophysics Data System (ADS)

    Hassan, A. H.; Fluke, C. J.; Barnes, D. G.

    2012-09-01

    Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single-machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with the goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy-to-use tool for the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
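    The map-then-merge pattern behind such distributed analysis, even for the "simple" tasks named above (histogram, minimum, maximum), can be sketched in plain Python. The chunking and bin counts below are illustrative, not the framework's API:

```python
# Each worker ("node") computes partial statistics over its own slice of
# the data, so no single machine ever needs the full dataset in memory.
def partial_stats(chunk, nbins, lo, hi):
    hist = [0] * nbins
    width = (hi - lo) / nbins
    for v in chunk:
        # Clamp the top edge into the last bin.
        hist[min(int((v - lo) / width), nbins - 1)] += 1
    return min(chunk), max(chunk), hist

# The partial results are tiny compared with the data, so merging them on
# one node is cheap regardless of the original dataset size.
def merge(stats):
    mins, maxs, hists = zip(*stats)
    total = [sum(col) for col in zip(*hists)]
    return min(mins), max(maxs), total

data = list(range(100))
chunks = [data[i:i + 25] for i in range(0, 100, 25)]   # 4 simulated nodes
partials = [partial_stats(c, nbins=4, lo=0, hi=100) for c in chunks]
print(merge(partials))                  # (0, 99, [25, 25, 25, 25])
```

The same shape scales from four local chunks to a GPU/CPU cluster: only the per-chunk kernel and the transport change, not the reduction.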

  12. 3D printing technology as innovative tool for math and geometry teaching applications

    NASA Astrophysics Data System (ADS)

    Huleihil, M.

    2017-01-01

    The industrial revolution and automation of production processes have changed the face of the world. Three dimensional (3D) printing has the potential to revolutionize manufacturing and further change methods of production, allowing an increasing number of people to produce products at home. According to a recent OECD (see Backer [1]) publication, “…tapping into the next industrial revolution requires actions on many levels and in many different areas. In particular, unlocking the potential of emerging and enabling technologies requires policy development along a number of fronts, from commercialization to regulation and the supply of skills through education.” In this paper we discuss the role of schools and their responsibility to act as quickly as possible to design a plan of action that will prepare future citizens to deal with this new reality. This requires planning in different directions and on different planes, such as labs, teachers, and curricula. 3D printing requires higher levels of thinking, innovation and creativity. It has the power to develop human imagination and give students the opportunity to visualize numbers, two-dimensional shapes, and three-dimensional objects. The combination of thinking, design, and production has immense power to increase motivation and satisfaction, with a highly probable increase in a student’s math and geometry achievements. The CAD system includes a measure tool which enables an alternative way of calculating properties of the objects under consideration and allows development of reflection and critical thinking. The research method was based on a comparison between a reference group and a test group; it was found that the intervention significantly improved the reflection abilities of 6th grade students in mathematics.

  13. Method for reworkable packaging of high speed, low electrical parasitic power electronics modules through gate drive integration

    DOEpatents

    Passmore, Brandon; Cole, Zach; Whitaker, Bret; Barkley, Adam; McNutt, Ty; Lostetter, Alexander

    2016-08-02

    A multichip power module directly connecting the busboard to a printed-circuit board attached to the power substrate, enabling extremely low loop inductance for extreme environments such as high-temperature operation. Wire bond interconnections are taught from the power die directly to the busboard, further enabling low-parasitic interconnections. Integration of on-board high-frequency bus capacitors provides extremely low loop inductance. An extreme-environment gate driver board allows close physical proximity of the gate driver and power stage to reduce overall volume and reduce impedance in the control circuit. Parallel spring-loaded pin connections to the gate driver PCB allow reliable and reworkable power module to gate driver interconnections.

  14. Application of the Enabler to nuclear electric propulsion

    NASA Astrophysics Data System (ADS)

    Pierce, Bill L.

    This paper describes a power system concept that provides the electric power for a baseline electric propulsion system for a piloted mission to Mars. A 10-MWe space power system is formed by coupling an Enabler reactor with a simple non-recuperated closed Brayton cycle. The Enabler reactor is a gas-cooled reactor based on proven reactor technology developed under the NERVA/Rover programs. The selected power cycle, which uses a helium-xenon mixture at 1920 K at the turbine inlet, is diagramed and described. The specific mass of the power system over the power range from 5 to 70 MWe is given. The impact of operating life on the specific mass of a 10-MWe system is also shown.

  15. Microfluidics as a functional tool for cell mechanics.

    PubMed

    Vanapalli, Siva A; Duits, Michel H G; Mugele, Frieder

    2009-01-05

    Living cells are a fascinating demonstration of nature's most intricate and well-coordinated micromechanical objects. They crawl, spread, contract, and relax, thus performing a multitude of complex mechanical functions. Alternatively, they also respond to physical and chemical cues that lead to remodeling of the cytoskeleton. Understanding this intricate coupling among mechanical properties, mechanical function and force-induced biochemical signaling requires tools that are capable of both controlling and manipulating the cell microenvironment and measuring the resulting mechanical response. In this review, the power of microfluidics as a functional tool for research in cell mechanics is highlighted. In particular, current literature is discussed to show that microfluidics powered by soft lithographic techniques offers the following capabilities that are of significance for understanding the mechanical behavior of cells: (i) Microfluidics enables the creation of in vitro models of physiological environments in which cell mechanics can be probed. (ii) Microfluidics is an excellent means to deliver physical cues that affect cell mechanics, such as cell shape, fluid flow, substrate topography, and stiffness. (iii) Microfluidics can also expose cells to chemical cues, such as growth factors and drugs, which alter their mechanical behavior. Moreover, these chemical cues can be delivered either at the whole cell or subcellular level. (iv) Microfluidic devices offer the possibility of measuring the intrinsic mechanical properties of cells in a high throughput fashion. (v) Finally, microfluidic methods provide exquisite control over drop size, generation, and manipulation. As a result, droplets are being increasingly used to control the physicochemical environment of cells and as biomimetic analogs of living cells.
These powerful attributes of microfluidics should further stimulate novel means of investigating the link between physicochemical cues and the biomechanical response of cells. Insights from such studies will have implications in areas such as drug delivery, medicine, tissue engineering, and biomedical diagnostics.

  16. Advanced Monitoring to Improve Combustion Turbine/Combined Cycle Reliability, Availability & Maintainability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leonard Angello

    2005-09-30

    Power generators are concerned with the maintenance costs associated with the advanced turbines that they are purchasing. Since these machines do not have fully established Operation and Maintenance (O&M) track records, power generators face financial risk due to uncertain future maintenance costs. This risk is of particular concern as the electricity industry transitions to a competitive business environment in which unexpected O&M costs cannot be passed through to consumers. These concerns have accelerated the need for intelligent software-based diagnostic systems that can monitor the health of a combustion turbine in real time and provide valuable information on the machine's performance to its owner/operators. EPRI, Impact Technologies, Boyce Engineering, and Progress Energy have teamed to develop a suite of intelligent software tools integrated with a diagnostic monitoring platform that, in real time, interpret data to assess the 'total health' of combustion turbines. The 'Combustion Turbine Health Management System' (CTHMS) will consist of a series of 'Dynamic Link Library' (DLL) programs residing on a diagnostic monitoring platform that accepts turbine health data from existing monitoring instrumentation. CTHMS interprets sensor and instrument outputs, correlates them to a machine's condition, provides interpretive analyses, projects servicing intervals, and estimates remaining component life. In addition, CTHMS enables real-time anomaly detection and diagnostics of performance and mechanical faults, enabling power producers to more accurately predict critical component remaining useful life and turbine degradation.

  17. Data-driven multi-scale multi-physics models to derive process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-05-01

    Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.

  18. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
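
    The idea of a generative representation can be illustrated with a toy assembly procedure; the command alphabet and the module-reuse rule below are hypothetical, not the encoding actually used for the antennas or 3D objects:

```python
def expand(genotype, modules, depth):
    # generative representation: the genotype is an assembly procedure whose
    # named modules are expanded recursively, so reusing one compact module
    # yields a regular, hierarchical design much larger than its encoding
    out = []
    for sym in genotype:
        if sym in modules and depth > 0:
            out.extend(expand(modules[sym], modules, depth - 1))
        else:
            out.append(sym)
    return "".join(out)

# hypothetical module set: "F" places an element, "[" / "]" push and pop state
MODULES = {"A": "F[F]A"}
```

A single rule applied at increasing depth produces increasingly sophisticated but self-similar designs, which is the modularity/regularity advantage the abstract claims over direct parameterizations.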

  19. Bioanalytical applications of SERS (surface-enhanced Raman spectroscopy).

    PubMed

    Hudson, Stephen D; Chumanov, George

    2009-06-01

    Surface-enhanced Raman scattering (SERS) is a powerful technique for analyzing biological samples as it can rapidly and nondestructively provide chemical and, in some cases, structural information about molecules in aqueous environments. In the Raman scattering process, both visible and near-infrared (NIR) wavelengths of light can be used to induce polarization of Raman-active molecules, leading to inelastic light scattering that yields specific molecular vibrational information. The development of surface enhancement has enabled Raman scattering to be an effective tool for qualitative as well as quantitative measurements with high sensitivity and specificity. Recent advances have led to many novel applications of SERS for biological analyses, resulting in new insights for biochemistry and molecular biology, the detection of biological warfare agents, and medical diagnostics for cancer, diabetes, and other diseases. This trend article highlights many of these recent investigations and provides a brief outlook in order to assess possible future directions of SERS as a bioanalytical tool.

  20. CellCognition: time-resolved phenotype annotation in high-throughput live cell imaging.

    PubMed

    Held, Michael; Schmitz, Michael H A; Fischer, Bernd; Walter, Thomas; Neumann, Beate; Olma, Michael H; Peter, Matthias; Ellenberg, Jan; Gerlich, Daniel W

    2010-09-01

    Fluorescence time-lapse imaging has become a powerful tool to investigate complex dynamic processes such as cell division or intracellular trafficking. Automated microscopes generate time-resolved imaging data at high throughput, yet tools for quantification of large-scale movie data are largely missing. Here we present CellCognition, a computational framework to annotate complex cellular dynamics. We developed a machine-learning method that combines state-of-the-art classification with hidden Markov modeling for annotation of the progression through morphologically distinct biological states. Incorporation of time information into the annotation scheme was essential to suppress classification noise at state transitions and confusion between different functional states with similar morphology. We demonstrate generic applicability in different assays and perturbation conditions, including a candidate-based RNA interference screen for regulators of mitotic exit in human cells. CellCognition is published as open source software, enabling live-cell imaging-based screening with assays that directly score cellular dynamics.
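
    The incorporation of time information can be sketched as standard Viterbi decoding over per-frame classifier scores; the two states, the sticky transition matrix, and the scores below are illustrative assumptions, not CellCognition's actual model:

```python
import math

def viterbi(frame_scores, trans, states):
    # frame_scores[i][s]: per-frame classifier log-likelihood of state s;
    # trans[(p, s)]: log transition probability; returns the smoothed path
    n = len(frame_scores)
    dp = [{s: frame_scores[0][s] for s in states}]
    back = [{}]
    for i in range(1, n):
        dp.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: dp[i - 1][p] + trans[(p, s)])
            dp[i][s] = dp[i - 1][prev] + trans[(prev, s)] + frame_scores[i][s]
            back[i][s] = prev
    last = max(states, key=lambda s: dp[n - 1][s])
    path = [last]
    for i in range(n - 1, 0, -1):
        path.append(back[i][path[-1]])
    return path[::-1]

# "sticky" transitions: staying in a state is far more likely than switching
STATES = ("inter", "mito")
TRANS = {(p, s): math.log(0.9 if p == s else 0.1) for p in STATES for s in STATES}
```

With these transitions, a single weakly mislabeled frame between two confident "inter" frames is decoded as "inter", whereas frame-by-frame argmax would flip it: exactly the classification noise at state transitions the abstract says time information suppresses.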

  1. Methods for quantifying simple gravity sensing in Drosophila melanogaster.

    PubMed

    Inagaki, Hidehiko K; Kamikouchi, Azusa; Ito, Kei

    2010-01-01

    Perception of gravity is essential for animals: most animals possess specific sense organs to detect the direction of the gravitational force. Little is known, however, about the molecular and neural mechanisms underlying their behavioral responses to gravity. Drosophila melanogaster, having a rather simple nervous system and a large variety of molecular genetic tools available, serves as an ideal model for analyzing the mechanisms underlying gravity sensing. Here we describe an assay to measure simple gravity responses of flies behaviorally. This method can be applied for screening genetic mutants of gravity perception. Furthermore, in combination with recent genetic techniques to silence or activate selective sets of neurons, it serves as a powerful tool to systematically identify neural substrates required for the proper behavioral responses to gravity. The assay requires 10 min to perform, and two experiments can be performed simultaneously, enabling 12 experiments per hour.

  2. A Tool for Medical Research

    NASA Technical Reports Server (NTRS)

    1992-01-01

    California Measurements, Inc.'s PC-2 Aerosol Particle Analyzer, developed by William Chiang, a former Jet Propulsion Laboratory (JPL) engineer, was used in a study to measure the size of particles in the medical environment. Chiang has a NASA license for the JPL crystal oscillator technology and originally built the instrument for atmospheric research. In the operating room, it enabled researchers from the University of California to obtain multiple sets of data repeatedly and accurately. The study concluded that significant amounts of aerosols are generated during surgery when power tools are employed, and most of these are in the respirable size range. Almost all contain blood and are small enough to pass through surgical masks. Research on the presence of blood aerosols during oral surgery had similar results. Further studies are planned to determine the possibility of HIV transmission during surgery, and the PC-2H will be used to quantify blood aerosols.

  3. Data-driven multi-scale multi-physics models to derive process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-01-01

    Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.

  4. Intravital microscopy

    PubMed Central

    Masedunskas, Andrius; Milberg, Oleg; Porat-Shliom, Natalie; Sramkova, Monika; Wigand, Tim; Amornphimoltham, Panomwat; Weigert, Roberto

    2012-01-01

    Intravital microscopy is an extremely powerful tool that enables imaging of several biological processes in live animals. Recently, the ability to image subcellular structures in several organs, combined with the development of sophisticated genetic tools, has made it possible to extend this approach to investigate several aspects of cell biology. Here we provide a general overview of intravital microscopy with the goal of highlighting its potential and challenges. Specifically, this review is geared toward researchers who are new to intravital microscopy and focuses on practical aspects of carrying out imaging in live animals. Here we share the know-how that comes from first-hand experience, including topics such as choosing the right imaging platform and modality, surgery and stabilization techniques, and anesthesia and temperature control. Moreover, we highlight some of the approaches that facilitate subcellular imaging in live animals by providing numerous examples of imaging selected organelles and the actin cytoskeleton in multiple organs. PMID:22992750

  5. High-Level Data Races

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus; Biere, Armin; Koga, Dennis (Technical Monitor)

    2003-01-01

    Data races are a common problem in concurrent and multi-threaded programming. They are hard to detect without proper tool support. Despite the successful application of these tools, experience shows that the notion of data race is not powerful enough to capture certain types of inconsistencies occurring in practice. In this paper we investigate data races on a higher abstraction layer. This enables us to detect inconsistent uses of shared variables, even if no classical race condition occurs. For example, a data structure representing a coordinate pair may have to be treated atomically. By lifting the meaning of a data race to a higher level, such problems can now be covered. The paper defines the concepts view and view consistency to give a notation for this novel kind of property. It describes what kinds of errors can be detected with this new definition, and where its limitations are. It also gives a formal guideline for using data structures in a multi-threading environment.
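
    A much-simplified sketch of the view-consistency check, assuming a "view" is just the set of shared fields a thread accesses inside one lock-protected region (the paper's formal definitions are richer):

```python
def _is_chain(sets_):
    # a family of sets is a chain if it is totally ordered by inclusion
    ordered = sorted(sets_, key=len)
    return all(a <= b for a, b in zip(ordered, ordered[1:]))

def _one_way(a_views, b_views):
    # for every maximal view of thread A, thread B's nonempty overlaps with
    # it must form a chain; otherwise B can observe A's fields "torn apart"
    maximal = [v for v in a_views if not any(v < w for w in a_views)]
    for m in maximal:
        overlaps = [v & m for v in b_views if v & m]
        if not _is_chain(overlaps):
            return False
    return True

def view_consistent(a_views, b_views):
    # views: sets of shared fields each thread accesses under a single
    # lock acquisition (a much-simplified reading of the paper's notion)
    return _one_way(a_views, b_views) and _one_way(b_views, a_views)

# writer updates the coordinate pair atomically: one view {x, y};
# reader reads x and y under separate lock acquisitions: views {x}, {y}
writer = [{"x", "y"}]
reader = [{"x"}, {"y"}]
```

Here `view_consistent(writer, reader)` flags the coordinate-pair example from the abstract: no classical data race occurs (every access is locked), yet the reader can observe a pair whose halves come from different atomic updates.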

  6. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2012-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB).
By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team. These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. 
    The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.

  7. Internet Enabled Remote Driving of a Combat Hybrid Electric Power System for Duty Cycle Measurement

    DTIC Science & Technology

    2007-06-01

    Goodell, Jarrett; Compere, Marc. Reference fragments preserved in the indexed record: 1. ... Orlando, FL, April 2006. 2. Compere, M.; Goodell, J.; Simon, M.; Smith, W.; Brudnak, M., “Robust Control Techniques Enabling Duty Cycle...”, SAE 2006-01-3077, SAE Power Systems Conference, Nov. 2006. 3. Compere, M.; Simon, M.; Kajs, J.; Pozolo, M., “Tracked Vehicle Mobility Load Emulation for a...

  8. Resiliency scoring for business continuity plans.

    PubMed

    Olson, Anna; Anderson, Jamie

    Through this paper readers will learn of a scoring methodology, referred to as resiliency scoring, which enables the evaluation of business continuity plans based upon analysis of their alignment with a predefined set of criteria that can be customised and are adaptable to the needs of any organisation. This patent-pending tool has been successful in driving engagement and is a powerful resource to improve reporting capabilities, identify risks and gauge organisational resilience. The role of business continuity professionals is to aid their organisations in planning and preparedness activities aimed at mitigating the impacts of potential disruptions and ensuring critical business functions can continue in the event of unforeseen circumstances. This may seem like a daunting task for what can typically be a small team of individuals. For this reason, it is important to be able to leverage industry standards, documented best practices and effective tools to streamline and support your continuity programme. The resiliency scoring methodology developed and implemented at Target has proven to be a valuable tool in taking the organisation's continuity programme to the next level. This paper will detail how the tool was developed and provide guidance on how it can be customised to fit your organisation's unique needs.
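
    A minimal sketch of criteria-based scoring in the spirit described above; the criteria names, weights, and thresholds are invented for illustration and are not the actual methodology developed at Target:

```python
def resiliency_score(plan, criteria):
    # criteria: {name: (weight, check)}; the score is the weighted share of
    # criteria the plan satisfies, scaled to 0-100
    total = sum(weight for weight, _ in criteria.values())
    earned = sum(weight for weight, check in criteria.values() if check(plan))
    return round(100.0 * earned / total, 1)

# illustrative criteria only; real weights and checks are organisation-specific
CRITERIA = {
    "recovery_strategy_documented": (40, lambda p: p.get("has_strategy", False)),
    "contact_list_current": (30, lambda p: p.get("contacts_age_days", 999) <= 90),
    "plan_exercised_in_last_year": (30, lambda p: p.get("exercise_age_days", 999) <= 365),
}
```

Because the criteria dictionary is data rather than code, each business unit can swap in its own weights and checks while reporting remains comparable, which is the customisability the abstract emphasises.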

  9. 29 CFR 1910.242 - Hand and portable powered tools and equipment, general.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to less than 30 p.s.i. and then only with effective chip guarding and personal protective equipment. ... 29 Labor 5 2011-07-01 2011-07-01 false Hand and portable powered tools and equipment, general... Powered Tools and Other Hand-Held Equipment § 1910.242 Hand and portable powered tools and equipment...

  10. 29 CFR 1910.242 - Hand and portable powered tools and equipment, general.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to less than 30 p.s.i. and then only with effective chip guarding and personal protective equipment. ... 29 Labor 5 2010-07-01 2010-07-01 false Hand and portable powered tools and equipment, general... Powered Tools and Other Hand-Held Equipment § 1910.242 Hand and portable powered tools and equipment...

  11. Capstone: A Geometry-Centric Platform to Enable Physics-Based Simulation and Design of Systems

    DTIC Science & Technology

    2015-10-05

    ...foundation for the air-vehicle early design tool DaVinci being developed by the CREATE™-AV project to enable development of associative models of air... CREATE™-AV solvers Kestrel [11] and Helios [16,17]. Furthermore, it is the foundation for the CREATE™-AV DaVinci [9] tool that provides a... Tools and Environments (CREATE™) program [6] aimed at developing a suite of high-performance physics-based computational tools addressing the needs...

  12. Reconfigurable engineered motile semiconductor microparticles.

    PubMed

    Ohiri, Ugonna; Shields, C Wyatt; Han, Koohee; Tyler, Talmage; Velev, Orlin D; Jokerst, Nan

    2018-05-03

    Locally energized particles form the basis for emerging classes of active matter. The design of active particles has led to their controlled locomotion and assembly. The next generation of particles should demonstrate robust control over their active assembly, disassembly, and reconfiguration. Here we introduce a class of semiconductor microparticles that can be comprehensively designed (in size, shape, electric polarizability, and patterned coatings) using standard microfabrication tools. These custom silicon particles draw energy from external electric fields to actively propel, while interacting hydrodynamically, and sequentially assemble and disassemble on demand. We show that a number of electrokinetic effects, such as dielectrophoresis, induced charge electrophoresis, and diode propulsion, can selectively power the microparticle motions and interactions. The ability to achieve on-demand locomotion, tractable fluid flows, synchronized motility, and reversible assembly using engineered silicon microparticles may enable advanced applications that include remotely powered microsensors, artificial muscles, reconfigurable neural networks and computational systems.

  13. Environmental applications of single collector high resolution ICP-MS.

    PubMed

    Krachler, Michael

    2007-08-01

    The number of environmental applications of single collector high resolution ICP-MS (HR-ICP-MS) has increased rapidly in recent years. Many factors contribute to making HR-ICP-MS a very powerful tool in environmental analysis: the extremely low detection limits achievable, tremendously high sensitivity, the ability to separate ICP-MS signals of the analyte from spectral interferences, enabling the reliable determination of many trace elements, and the reasonable precision of isotope ratio measurements. These assets are improved even further using high-efficiency sample introduction systems. Therefore, external factors such as the stability of laboratory blanks, rather than detection power, are frequently the limiting factor in HR-ICP-MS analysis. This review aims to highlight the most recent applications of HR-ICP-MS in this sector, focusing on matrices and applications where the superior capabilities of the instrumental technique are most useful and often ultimately required.

  14. Small but mighty: Dark matter substructures

    NASA Astrophysics Data System (ADS)

    Cyr-Racine, Francis-Yan; Keeton, Charles; Moustakas, Leonidas

    2018-01-01

    The fundamental properties of dark matter, such as its mass, self-interaction, and coupling to other particles, can have a major impact on the evolution of cosmological density fluctuations on small length scales. Strong gravitational lenses have long been recognized as powerful tools to study the dark matter distribution on these small subgalactic scales. In this talk, we discuss how gravitationally lensed quasars and extended lensed arcs could be used to probe non-minimal dark matter models. We comment on the possibilities enabled by precise astrometry, deep imaging, and time delays to extract information about mass substructures inside lens galaxies. To this end, we introduce a new lensing statistic that allows for a robust diagnostic of the presence of perturbations caused by substructures. We determine which properties of mass substructures are most readily constrained by lensing data and forecast the constraining power of current and future observations.

  15. Steinberg ``AUDIOMAPS'' Music Appreciation-Via-Understanding: Special-Relativity + Expectations ``Quantum-Theory'': a Quantum-ACOUSTO/MUSICO-Dynamics (QA/MD)

    NASA Astrophysics Data System (ADS)

    Fender, Lee; Steinberg, Russell; Siegel, Edward Carl-Ludwig

    2011-03-01

    Steinberg's wildly popular "AUDIOMAPS" music enjoyment/appreciation-via-understanding methodology, versus art, music-dynamics evolves, telling a story in (3+1)-dimensions: trails, frames, timbres, and dynamics amplitude vs. music-score time-series (formal-inverse power-spectrum), surprisingly closely parallels (3+1)-dimensional Einstein (1905) special relativity "+" (with its enjoyment-expectations) a manifestation of quantum-theory expectation-values, together a music quantum-ACOUSTO/MUSICO-dynamics (QA/MD). Analysis via Derrida deconstruction enabled Siegel-Baez "Category-Semantics" "FUZZYICS" = "CATEGORYICS" ("TRIZ") Aristotle SoO DEduction, irrespective of the Boon-Klimontovich vs. Voss-Clark [PRL (77)] music power-spectrum analysis sampling-time/duration controversy (part versus whole), and shows QA/MD reigns supreme as THE music appreciation-via-analysis tool for the listener in musicology!!! The connection to the Deutsch-Hartmann-Levitin [This is Your Brain on Music (06)] brain/mind-barrier brain/mind-music connection is subtle/compelling/immediate!!!

  16. Computational resources for ribosome profiling: from database to Web server and software.

    PubMed

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand.

  17. Development of Radiated Power Diagnostics for NSTX-U

    NASA Astrophysics Data System (ADS)

    Reinke, Matthew; van Eden, G. G.; Lovell, Jack; Peterson, Byron; Gray, Travis; Chandra, Rian; Stratton, Brent; Ellis, Robert; NSTX-U Team

    2016-10-01

    New tools to measure radiated power in NSTX-U are under development to support a range of core and boundary physics research. Multiple resistive bolometer pinhole cameras are being built and calibrated to support FY17 operations, all utilizing standard Au-foil sensors from IPT-Albrecht. The radiation in the lower divertor will be measured using two 8-channel arrays viewing both vertically and radially to enable estimates of the 2D radiation structure. The core radiation will be measured using a 24-channel array viewing tangentially near the midplane, observing the full cross-section from the inner to outer limiter. This enables characterization of the centrifugally-driven in/out radiation asymmetry expected from mid-Z and high-Z impurities in highly rotating NSTX-U plasmas. All sensors utilize novel FPGA-based BOLO8BLF analyzers from D-tAcq Solutions. Resistive bolometer measurements are complemented by an InfraRed Video Bolometer (IRVB), which measures the temperature change of a radiation absorber using an IR camera. A prototype IRVB system viewing the lower divertor was installed on NSTX-U for FY16 operations. Initial results from plasma and benchtop testing are used to demonstrate the relative advantages between IRVB and resistive bolometers. Supported in part by DE-AC05-00OR22725 & DE-AC02-09CH11466.

  18. Pathway-Based Kernel Boosting for the Analysis of Genome-Wide Association Studies

    PubMed Central

    Manitz, Juliane; Burger, Patricia; Amos, Christopher I.; Chang-Claude, Jenny; Wichmann, Heinz-Erich; Kneib, Thomas; Bickeböller, Heike

    2017-01-01

    The analysis of genome-wide association studies (GWAS) benefits from the investigation of biologically meaningful gene sets, such as gene-interaction networks (pathways). We propose an extension to a successful kernel-based pathway analysis approach by integrating kernel functions into a powerful algorithmic framework for variable selection, to enable investigation of multiple pathways simultaneously. We employ genetic similarity kernels from the logistic kernel machine test (LKMT) as base-learners in a boosting algorithm. A model to explain case-control status is created iteratively by selecting pathways that improve its prediction ability. We evaluated our method in simulation studies adopting 50 pathways for different sample sizes and genetic effect strengths. Additionally, we included an exemplary application of kernel boosting to a rheumatoid arthritis and a lung cancer dataset. Simulations indicate that kernel boosting outperforms the LKMT in certain genetic scenarios. Applications to GWAS data on rheumatoid arthritis and lung cancer resulted in sparse models which were based on pathways interpretable in a clinical sense. Kernel boosting is highly flexible in terms of considered variables and overcomes the problem of multiple testing. Additionally, it enables the prediction of clinical outcomes. Thus, kernel boosting constitutes a new, powerful tool in the analysis of GWAS data and towards the understanding of biological processes involved in disease susceptibility. PMID:28785300
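    As a rough illustration of the boosting scheme described above (iteratively selecting the pathway whose base-learner most improves prediction of case-control status, yielding a sparse model), the following is a minimal componentwise-boosting sketch on simulated data. The pathway definitions, the simple burden-score base-learner, and all numeric settings are hypothetical stand-ins for the paper's genetic-similarity kernels, not the authors' method.

```python
import random

random.seed(0)

# toy data: 40 samples, 12 SNP dosages each, binary case-control status
n, p = 40, 12
geno = [[random.randint(0, 2) for _ in range(p)] for _ in range(n)]
# hypothetical pathways = groups of SNP indices
pathways = {"pwA": [0, 1, 2], "pwB": [3, 4, 5, 6],
            "pwC": [7, 8], "pwD": [9, 10, 11]}
# status is driven by the pwA burden, so boosting should select pwA
y = [1.0 if sum(geno[i][j] for j in pathways["pwA"]) >= 3 else 0.0
     for i in range(n)]

def burden(i, snps):
    """Per-sample pathway score (toy stand-in for a kernel base-learner)."""
    return sum(geno[i][j] for j in snps) / len(snps)

f = [0.0] * n                  # current additive predictor
nu, selected = 0.1, []         # shrinkage and selection path
for _ in range(50):            # boosting iterations
    best = None
    for name, snps in pathways.items():
        x = [burden(i, snps) for i in range(n)]
        r = [y[i] - f[i] for i in range(n)]                  # residuals
        beta = sum(a * b for a, b in zip(x, r)) / sum(a * a for a in x)
        sse = sum((b - beta * a) ** 2 for a, b in zip(x, r))
        if best is None or sse < best[0]:
            best = (sse, name, beta, x)
    _, name, beta, x = best
    f = [fi + nu * beta * xi for fi, xi in zip(f, x)]         # update fit
    selected.append(name)

print(sorted(set(selected)))   # sparse set of selected pathways
```

The selection path stays sparse: only pathways that actually reduce the residual error enter the model, which is the property that makes the approach interpretable.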

  19. Pathway-Based Kernel Boosting for the Analysis of Genome-Wide Association Studies.

    PubMed

    Friedrichs, Stefanie; Manitz, Juliane; Burger, Patricia; Amos, Christopher I; Risch, Angela; Chang-Claude, Jenny; Wichmann, Heinz-Erich; Kneib, Thomas; Bickeböller, Heike; Hofner, Benjamin

    2017-01-01

    The analysis of genome-wide association studies (GWAS) benefits from the investigation of biologically meaningful gene sets, such as gene-interaction networks (pathways). We propose an extension to a successful kernel-based pathway analysis approach by integrating kernel functions into a powerful algorithmic framework for variable selection, to enable investigation of multiple pathways simultaneously. We employ genetic similarity kernels from the logistic kernel machine test (LKMT) as base-learners in a boosting algorithm. A model to explain case-control status is created iteratively by selecting pathways that improve its prediction ability. We evaluated our method in simulation studies adopting 50 pathways for different sample sizes and genetic effect strengths. Additionally, we included an exemplary application of kernel boosting to a rheumatoid arthritis and a lung cancer dataset. Simulations indicate that kernel boosting outperforms the LKMT in certain genetic scenarios. Applications to GWAS data on rheumatoid arthritis and lung cancer resulted in sparse models which were based on pathways interpretable in a clinical sense. Kernel boosting is highly flexible in terms of considered variables and overcomes the problem of multiple testing. Additionally, it enables the prediction of clinical outcomes. Thus, kernel boosting constitutes a new, powerful tool in the analysis of GWAS data and towards the understanding of biological processes involved in disease susceptibility.

  20. SBCDDB: Sleeping Beauty Cancer Driver Database for gene discovery in mouse models of human cancers

    PubMed Central

    Mann, Michael B

    2018-01-01

    Abstract Large-scale oncogenomic studies have identified few frequently mutated cancer drivers and hundreds of infrequently mutated drivers. Defining the biological context for rare driving events is fundamentally important to increasing our understanding of the druggable pathways in cancer. Sleeping Beauty (SB) insertional mutagenesis is a powerful gene discovery tool used to model human cancers in mice. Our lab and others have published a number of studies that identify cancer drivers from these models using various statistical and computational approaches. Here, we have integrated SB data from primary tumor models into an analysis and reporting framework, the Sleeping Beauty Cancer Driver DataBase (SBCDDB, http://sbcddb.moffitt.org), which identifies drivers in individual tumors or tumor populations. Unique to this effort, the SBCDDB utilizes a single, scalable, statistical analysis method that enables data to be grouped by different biological properties. This allows for SB drivers to be evaluated (and re-evaluated) under different contexts. The SBCDDB provides visual representations highlighting the spatial attributes of transposon mutagenesis and couples this functionality with analysis of gene sets, enabling users to interrogate relationships between drivers. The SBCDDB is a powerful resource for comparative oncogenomic analyses with human cancer genomics datasets for driver prioritization. PMID:29059366

  1. Climate tools in mainstream Linux distributions

    NASA Astrophysics Data System (ADS)

    McKinstry, Alastair

    2015-04-01

    Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that are normally ignorable. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, integrating libraries and components (e.g. Python modules) requires planning by their writers: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.

  2. Data acquisition and real-time control using spreadsheets: interfacing Excel with external hardware.

    PubMed

    Aliane, Nourdine

    2010-07-01

    Spreadsheets have become a popular computational tool and a powerful platform for performing engineering calculations. Moreover, spreadsheets include a macro language, which permits the inclusion of standard computer code in worksheets and thereby enables developers to greatly extend spreadsheets' capabilities by designing specific add-ins. This paper describes how to use Excel spreadsheets in conjunction with the Visual Basic for Applications programming language to perform data acquisition and real-time control. The paper then presents two Excel applications with interactive user interfaces developed for laboratory demonstrations and experiments in an introductory course in control. 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  3. One-step random mutagenesis by error-prone rolling circle amplification

    PubMed Central

    Fujii, Ryota; Kitaoka, Motomitsu; Hayashi, Kiyoshi

    2004-01-01

    In vitro random mutagenesis is a powerful tool for altering properties of enzymes. We describe here a novel random mutagenesis method using rolling circle amplification, named error-prone RCA. This method consists of only one DNA amplification step followed by transformation of the host strain, without treatment with any restriction enzymes or DNA ligases, and results in a randomly mutated plasmid library with 3–4 mutations per kilobase. Specific primers or special equipment, such as a thermal-cycler, are not required. This method permits rapid preparation of randomly mutated plasmid libraries, enabling random mutagenesis to become a more commonly used technique. PMID:15507684
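    The reported mutation load of 3–4 mutations per kilobase corresponds to a per-base substitution rate of roughly 0.0035. The following sketch simulates what such a library looks like for a hypothetical 4 kb plasmid; the random sequence, the library size, and the substitution-only error model are illustrative assumptions, not part of the published protocol.

```python
import random

random.seed(42)

BASES = "ACGT"
rate = 3.5 / 1000          # ~3-4 mutations per kb, as reported

def mutate(seq, rate):
    """Return a randomly mutated copy of seq (substitutions only)."""
    out = []
    for b in seq:
        if random.random() < rate:
            out.append(random.choice([x for x in BASES if x != b]))
        else:
            out.append(b)
    return "".join(out)

plasmid = "".join(random.choice(BASES) for _ in range(4000))  # 4 kb toy plasmid
library = [mutate(plasmid, rate) for _ in range(100)]         # mutant library

diffs = [sum(a != b for a, b in zip(plasmid, m)) for m in library]
# average mutations per clone (expected ≈ 4000 × 0.0035 = 14)
print(sum(diffs) / len(diffs))
```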

  4. A new look at deep-sea video

    USGS Publications Warehouse

    Chezar, H.; Lee, J.

    1985-01-01

    A deep-towed photographic system with completely self-contained recording instrumentation and power can obtain color-video and still-photographic transects along rough terrane without need for a long electrically conducting cable. Both the video- and still-camera systems utilize relatively inexpensive and proven off-the-shelf hardware adapted for deep-water environments. The small instrument frame makes the towed sled an ideal photographic tool for use on ship or small-boat operations. The system includes a temperature probe and altimeter that relay data acoustically from the sled to the surface ship. This relay enables the operator to monitor simultaneously water temperature and the precise height off the bottom. ?? 1985.

  5. Rapid and tunable method to temporally control gene editing based on conditional Cas9 stabilization. | Office of Cancer Genomics

    Cancer.gov

    The CRISPR/Cas9 system is a powerful tool for studying gene function. Here, we describe a method that allows temporal control of CRISPR/Cas9 activity based on conditional Cas9 destabilization. We demonstrate that fusing an FKBP12-derived destabilizing domain to Cas9 (DD-Cas9) enables conditional Cas9 expression and temporal control of gene editing in the presence of an FKBP12 synthetic ligand. This system can be easily adapted to co-express, from the same promoter, DD-Cas9 with any other gene of interest without co-modulation of the latter.

  6. Transcriptome analysis by strand-specific sequencing of complementary DNA

    PubMed Central

    Parkhomchuk, Dmitri; Borodina, Tatiana; Amstislavskiy, Vyacheslav; Banaru, Maria; Hallen, Linda; Krobitsch, Sylvia; Lehrach, Hans; Soldatov, Alexey

    2009-01-01

    High-throughput complementary DNA sequencing (RNA-Seq) is a powerful tool for whole-transcriptome analysis, supplying information about a transcript's expression level and structure. However, it is difficult to determine the polarity of transcripts, and therefore identify which strand is transcribed. Here, we present a simple cDNA sequencing protocol that preserves information about a transcript's direction. Using Saccharomyces cerevisiae and mouse brain transcriptomes as models, we demonstrate that knowing the transcript's orientation allows more accurate determination of the structure and expression of genes. It also helps to identify new genes and enables studying promoter-associated and antisense transcription. The transcriptional landscapes we obtained are available online. PMID:19620212

  7. Transcriptome analysis by strand-specific sequencing of complementary DNA.

    PubMed

    Parkhomchuk, Dmitri; Borodina, Tatiana; Amstislavskiy, Vyacheslav; Banaru, Maria; Hallen, Linda; Krobitsch, Sylvia; Lehrach, Hans; Soldatov, Alexey

    2009-10-01

    High-throughput complementary DNA sequencing (RNA-Seq) is a powerful tool for whole-transcriptome analysis, supplying information about a transcript's expression level and structure. However, it is difficult to determine the polarity of transcripts, and therefore identify which strand is transcribed. Here, we present a simple cDNA sequencing protocol that preserves information about a transcript's direction. Using Saccharomyces cerevisiae and mouse brain transcriptomes as models, we demonstrate that knowing the transcript's orientation allows more accurate determination of the structure and expression of genes. It also helps to identify new genes and enables studying promoter-associated and antisense transcription. The transcriptional landscapes we obtained are available online.

  8. Recent advances in the chemistry of Rh carbenoids: multicomponent reactions of diazocarbonyl compounds

    NASA Astrophysics Data System (ADS)

    Medvedev, J. J.; Nikolaev, V. A.

    2015-07-01

    Multicomponent reactions of diazo compounds catalyzed by RhII complexes have become a powerful tool for organic synthesis. They enable three- or four-step processes to be carried out as one-pot procedures (actually as one step) with high stereoselectivity to give complex organic molecules, including biologically active compounds. This review addresses recent results in the chemistry of Rh-catalyzed multicomponent reactions of diazocarbonyl compounds with the intermediate formation of N-, O- and C=O-ylides. The diastereo- and enantioselectivity of these reactions and the possibility of using various co-catalysts to increase the efficiency of the processes under consideration are discussed. The bibliography includes 120 references.

  9. Space Nuclear Power Systems

    NASA Technical Reports Server (NTRS)

    Houts, Michael G.

    2012-01-01

    Fission power and propulsion systems can enable exciting space exploration missions. These include bases on the moon and Mars; and the exploration, development, and utilization of the solar system. In the near-term, fission surface power systems could provide abundant, constant, cost-effective power anywhere on the surface of the Moon or Mars, independent of available sunlight. Affordable access to Mars, the asteroid belt, or other destinations could be provided by nuclear thermal rockets. In the further term, high performance fission power supplies could enable both extremely high power levels on planetary surfaces and fission electric propulsion vehicles for rapid, efficient cargo and crew transfer. Advanced fission propulsion systems could eventually allow routine access to the entire solar system. Fission systems could also enable the utilization of resources within the solar system.

  10. Student Agency for Powerful Learning

    ERIC Educational Resources Information Center

    Wiliams, Philip

    2017-01-01

    School libraries play a powerful role in enabling, informing, and sustaining student agency, and nothing engages and motivates students more deeply than enabling them to become the active agents in the process of learning. Students with agency are powerful learners who are prepared to engage with the world with sustained, courageous curiosity.…

  11. Building and Sustaining International Scientific Partnerships Through Data Sharing

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Yoksas, T.

    2008-05-01

    Understanding global environmental processes and their regional linkages has heightened the importance of strong international scientific partnerships. At the same time, the Internet and its myriad manifestations, along with innovative web services, have amply demonstrated the compounding benefits of cyberinfrastructure and the power of networked communities. The increased globalization of science, especially in solving interdisciplinary Earth system science problems, requires that science be conducted collaboratively by distributed teams of investigators, often involving sharing of knowledge and resources like community models and other tools. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. Its understanding requires finding, collecting, integrating, and assimilating data from observations and model simulations from diverse fields and across traditional disciplinary boundaries. For the past two decades, the NSF-sponsored Unidata Program Center has been providing the data services, tools, and cyberinfrastructure leadership that advance Earth system science education and research, and enabled opportunities for broad participation. Beginning as a collection of US-based, mostly atmospheric science departments, the Unidata community now transcends international boundaries and geoscience disciplines. Today, Unidata technologies are used in many countries on all continents in research, education and operational settings, and in many international projects (e.g., IPCC assessments, International Polar Year, and THORPEX). The program places high value on the transformational changes enabled by such international scientific partnerships and continually provides opportunities to share knowledge, data, tools and other resources to advance geoscience research and education. 
This talk will provide an overview of Unidata's ongoing efforts to foster international scientific partnerships toward building a globally-engaged community of educators and researchers in the geosciences. The presentation will discuss how developments in Earth and Space Science Informatics are enabling new approaches to solving geoscientific problems. The presentation will also describe how Unidata resources are being leveraged by broader initiatives in UCAR and elsewhere.

  12. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications

    PubMed Central

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-01-01

    Background Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. Results The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. Conclusion The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools. PMID:18021453
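    The abstract does not spell out the Gaggle microformat's schema, but the general pattern it relies on, embedding machine-readable data in HTML via class attributes and scraping it back out, can be sketched with the standard library alone. The class names and gene identifiers below are hypothetical illustrations, not the actual Gaggle format.

```python
from html.parser import HTMLParser

# hypothetical microformat: a gene list marked up with class attributes;
# the real Gaggle microformat's schema may differ
DOC = """
<html><body>
  <ul class="gaggle-genes">
    <li class="gaggle-gene">VNG1130C</li>
    <li class="gaggle-gene">VNG2293G</li>
  </ul>
</body></html>
"""

class GeneScraper(HTMLParser):
    """Collect the text of every element tagged class="gaggle-gene"."""
    def __init__(self):
        super().__init__()
        self.in_gene = False
        self.genes = []

    def handle_starttag(self, tag, attrs):
        if ("class", "gaggle-gene") in attrs:
            self.in_gene = True

    def handle_data(self, data):
        if self.in_gene and data.strip():
            self.genes.append(data.strip())
            self.in_gene = False

scraper = GeneScraper()
scraper.feed(DOC)
print(scraper.genes)   # ['VNG1130C', 'VNG2293G']
```

The same structured document remains an ordinary web page to a human reader, which is the appeal of the microformat approach over a separate data feed.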

  13. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications.

    PubMed

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-11-19

    Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools.

  14. Diamond tool wear detection method using cutting force and its power spectrum analysis in ultra-precision fly cutting

    NASA Astrophysics Data System (ADS)

    Zhang, G. Q.; To, S.

    2014-08-01

    Cutting force and its power spectrum analysis is considered an effective method for monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little comparable research exists for ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics at different tool wear stages. The results reveal that the cutting force increases as tool wear progresses. The cutting force signals at different tool wear stages were analyzed using power spectrum analysis. The analysis indicates that a characteristic frequency does exist in the power spectrum of the cutting force, whose power spectral density increases with tool wear level; this characteristic frequency could be adopted to monitor diamond tool wear in ultra-precision fly cutting.
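    The analysis pipeline described here (sample the cutting-force signal, compute its power spectrum, and watch the spectral density at a characteristic frequency grow with wear) can be sketched on synthetic data with a naive DFT. The sampling rate, the 125 Hz characteristic frequency, and the wear-dependent signal model are invented for illustration; the paper's actual frequencies are not given in the abstract.

```python
import cmath, math

def power_spectrum(x):
    """Naive DFT power spectrum (O(n^2), fine for a short sketch)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2)]

fs, n, f_char = 1000, 256, 125   # sample rate (Hz), samples, assumed char. freq

def force_signal(wear):
    # synthetic cutting force: static load plus a wear-dependent
    # component at the characteristic frequency
    return [50.0 + wear * math.sin(2 * math.pi * f_char * t / fs)
            for t in range(n)]

for wear in (0.5, 2.0, 5.0):     # increasing tool-wear levels
    spec = power_spectrum(force_signal(wear))
    k = max(range(1, len(spec)), key=spec.__getitem__)   # skip the DC bin
    print(wear, round(k * fs / n, 1), round(spec[k], 1))
```

The peak bin stays at the characteristic frequency while its power spectral density grows with the wear level, which is exactly the signature the monitoring method exploits.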

  15. Radiation Mitigation and Power Optimization Design Tools for Reconfigurable Hardware in Orbit

    NASA Technical Reports Server (NTRS)

    French, Matthew; Graham, Paul; Wirthlin, Michael; Wang, Li; Larchev, Gregory

    2005-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable gate array (FPGA) technology. In the second year of the project, design tools that leverage an established FPGA design environment have been created to visualize and analyze an FPGA circuit for radiation weaknesses and power inefficiencies. For radiation, a single-event upset (SEU) emulator, a persistence analysis tool, and a half-latch removal tool for Xilinx/Virtex-II devices have been created. Research is underway on a persistence mitigation tool and multiple-bit upset (MBU) studies. For power, synthesis-level dynamic power visualization and analysis tools have been completed. Power optimization tools are under development, and preliminary test results are positive.
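    The core idea of an SEU emulator (flip one configuration bit at a time and check whether the design still behaves correctly, so that only the sensitive bits need mitigation) can be illustrated with a toy model. The 32-bit "configuration memory", the 4-input XOR lookup table, and the trial count are hypothetical; real FPGA bitstreams and the RHinO tools are far more involved.

```python
import random

random.seed(1)

# toy "configuration memory": 32 bits, of which the design (a 4-input
# XOR implemented as a 16-entry LUT) only reads the first 16; the rest
# stand in for unused routing/configuration bits
golden = [1 if bin(i).count("1") % 2 else 0 for i in range(16)] + [0] * 16

def run_design(cfg):
    # evaluate the LUT on all 16 possible 4-bit inputs
    return [cfg[i] for i in range(16)]

trials, upsets_with_effect = 200, 0
for _ in range(trials):
    cfg = list(golden)
    bit = random.randrange(len(cfg))
    cfg[bit] ^= 1                              # emulate a single-event upset
    if run_design(cfg) != run_design(golden):
        upsets_with_effect += 1                # this bit is SEU-sensitive

print(upsets_with_effect, "of", trials, "emulated upsets altered the design")
```

Because only half of the toy configuration bits are actually used by the design, only about half of the emulated upsets change its behavior; identifying that sensitive subset is what makes targeted mitigation cheaper than blanket triplication.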

  16. CRISPR Primer Designer: Design primers for knockout and chromosome imaging CRISPR-Cas system.

    PubMed

    Yan, Meng; Zhou, Shi-Rong; Xue, Hong-Wei

    2015-07-01

    The clustered regularly interspaced short palindromic repeats (CRISPR)-associated system enables biologists to edit genomes precisely and provides a powerful tool for perturbing endogenous gene regulation, modulating epigenetic markers, and altering genome architecture. However, there are concerns about the specificity of the system, especially when it is used to knock out a gene. Previous design tools were mostly web-based or ran as command-line programs; none ran locally with a user-friendly interface. In addition, with the development of CRISPR-derived systems such as chromosome imaging, there were still no tools to help users generate specific end-user spacers. We herein present CRISPR Primer Designer for researchers to design primers for CRISPR applications. The program has a user-friendly interface, can analyze BLAST results using multiple parameters, scores each candidate spacer, and generates the primers for use with a given plasmid. In addition, CRISPR Primer Designer runs locally, can be used to search spacer clusters, and exports primers for the CRISPR-Cas system-based chromosome imaging system. © 2014 Institute of Botany, Chinese Academy of Sciences.
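    For the knockout use case, the basic search that a spacer-design tool performs, finding every 20-nt protospacer that sits immediately 5' of an NGG PAM (the SpCas9 requirement), is easy to sketch; the specificity scoring against BLAST results that CRISPR Primer Designer performs is the harder part and is omitted here. The target sequence below is a made-up example.

```python
import re

def find_spacers(seq):
    """Return (position, spacer, PAM) for every 20-nt protospacer
    followed by an NGG PAM on the forward strand."""
    seq = seq.upper()
    # a lookahead makes overlapping candidate sites visible to finditer
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq)]

# toy target sequence (hypothetical)
target = "ATGGCTAGCTAGGATCCGGTACGTTGACTGGAGCTTGGCGGCCGCAAT"
for pos, spacer, pam in find_spacers(target):
    print(pos, spacer, pam)
```

A real tool would additionally search the reverse complement and rank each candidate by its off-target hits, which is where BLAST-based scoring comes in.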

  17. Applications of CRISPR/Cas9 in the Mammalian Central Nervous System



    PubMed Central

    Savell, Katherine E.; Day, Jeremy J.

    2017-01-01

    Within the central nervous system, gene regulatory mechanisms are crucial regulators of cellular development and function, and dysregulation of these systems is commonly observed in major neuropsychiatric and neurological disorders. However, due to a lack of tools to specifically modulate the genome and epigenome in the central nervous system, many molecular and genetic mechanisms underlying cognitive function and behavior are still unknown. Although genome editing tools have been around for decades, the recent emergence of inexpensive, straightforward, and widely accessible CRISPR/Cas9 systems has led to a revolution in gene editing. The development of the catalytically dead Cas9 (dCas9) expanded this flexibility even further by acting as an anchoring system for fused effector proteins, structural scaffolds, and RNAs. Together, these advances have enabled robust, modular approaches for specific targeting and modification of the local chromatin environment at a single gene. This review highlights these advancements and how the combination of powerful modulatory tools paired with the versatility of CRISPR-Cas9-based systems offer great potential for understanding the underlying genetic and epigenetic contributions of neuronal function, behavior, and neurobiological diseases. PMID:29259522

  18. The cultural niche: Why social learning is essential for human adaptation

    PubMed Central

    Boyd, Robert; Richerson, Peter J.; Henrich, Joseph

    2011-01-01

    In the last 60,000 y humans have expanded across the globe and now occupy a wider range than any other terrestrial species. Our ability to successfully adapt to such a diverse range of habitats is often explained in terms of our cognitive ability. Humans have relatively bigger brains and more computing power than other animals, and this allows us to figure out how to live in a wide range of environments. Here we argue that humans may be smarter than other creatures, but none of us is nearly smart enough to acquire all of the information necessary to survive in any single habitat. In even the simplest foraging societies, people depend on a vast array of tools, detailed bodies of local knowledge, and complex social arrangements and often do not understand why these tools, beliefs, and behaviors are adaptive. We owe our success to our uniquely developed ability to learn from others. This capacity enables humans to gradually accumulate information across generations and develop well-adapted tools, beliefs, and practices that are too complex for any single individual to invent during their lifetime. PMID:21690340

  19. Quasistatic Cavity Resonance for Ubiquitous Wireless Power Transfer.

    PubMed

    Chabalko, Matthew J; Shahmohammadi, Mohsen; Sample, Alanson P

    2017-01-01

    Wireless power delivery has the potential to seamlessly power our electrical devices as easily as data is transmitted through the air. However, existing solutions are limited to near contact distances and do not provide the geometric freedom to enable automatic and un-aided charging. We introduce quasistatic cavity resonance (QSCR), which can enable purpose-built structures, such as cabinets, rooms, and warehouses, to generate quasistatic magnetic fields that safely deliver kilowatts of power to mobile receivers contained nearly anywhere within. A theoretical model of a quasistatic cavity resonator is derived, and field distributions along with power transfer efficiency are validated against measured results. An experimental demonstration shows that a 54 m³ QSCR room can deliver power to small coil receivers in nearly any position with 40% to 95% efficiency. Finally, a detailed safety analysis shows that up to 1900 watts can be transmitted to a coil receiver enabling safe and ubiquitous wireless power.

  20. Quasistatic Cavity Resonance for Ubiquitous Wireless Power Transfer

    PubMed Central

    Shahmohammadi, Mohsen; Sample, Alanson P.

    2017-01-01

    Wireless power delivery has the potential to seamlessly power our electrical devices as easily as data is transmitted through the air. However, existing solutions are limited to near contact distances and do not provide the geometric freedom to enable automatic and un-aided charging. We introduce quasistatic cavity resonance (QSCR), which can enable purpose-built structures, such as cabinets, rooms, and warehouses, to generate quasistatic magnetic fields that safely deliver kilowatts of power to mobile receivers contained nearly anywhere within. A theoretical model of a quasistatic cavity resonator is derived, and field distributions along with power transfer efficiency are validated against measured results. An experimental demonstration shows that a 54 m³ QSCR room can deliver power to small coil receivers in nearly any position with 40% to 95% efficiency. Finally, a detailed safety analysis shows that up to 1900 watts can be transmitted to a coil receiver enabling safe and ubiquitous wireless power. PMID:28199321

  1. Low Power LDPC Code Decoder Architecture Based on Intermediate Message Compression Technique

    NASA Astrophysics Data System (ADS)

    Shimizu, Kazunori; Togawa, Nozomu; Ikenaga, Takeshi; Goto, Satoshi

    Reducing power dissipation in LDPC code decoders is a major challenge in applying them to practical digital communication systems. In this paper, we propose a low-power LDPC code decoder architecture based on an intermediate message-compression technique with the following features: (i) an intermediate message compression technique enables the decoder to reduce the required memory capacity and write power dissipation; (ii) a clock-gated, shift-register-based intermediate message memory architecture enables the decoder to decompress the compressed messages in a single clock cycle while reducing the read power dissipation. The combination of these two techniques enables the decoder to reduce power dissipation while maintaining decoding throughput. Simulation results show that the proposed architecture improves power efficiency by up to 52% and 18% compared to decoders based on the overlapped schedule and the rapid-convergence schedule, respectively, without the proposed techniques.
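    The abstract does not detail the compression scheme itself, but a standard observation used in this role can be sketched: in min-sum decoding, every check-node output magnitude is either the smallest or the second-smallest incoming magnitude, so storing (min1, min2, argmin) plus the sign bits reconstructs all messages exactly. This is a generic min-sum property offered as an illustration, not necessarily the authors' exact technique; the message values are arbitrary.

```python
def check_node_update(in_msgs):
    """Plain min-sum check-node update: out[i] takes the minimum magnitude
    and the sign product over all incoming messages except edge i."""
    out = []
    for i in range(len(in_msgs)):
        others = in_msgs[:i] + in_msgs[i + 1:]
        sign = -1 if sum(m < 0 for m in others) % 2 else 1
        out.append(sign * min(abs(m) for m in others))
    return out

def compress(in_msgs):
    """Store only (min1, min2, argmin, sign bits) instead of every magnitude."""
    order = sorted(range(len(in_msgs)), key=lambda i: abs(in_msgs[i]))
    signs = [m < 0 for m in in_msgs]
    return abs(in_msgs[order[0]]), abs(in_msgs[order[1]]), order[0], signs

def decompress(min1, min2, i_min, signs):
    """Reconstruct every outgoing message from the compressed form."""
    total_neg = sum(signs)
    out = []
    for i, s in enumerate(signs):
        mag = min2 if i == i_min else min1   # exclude edge i from the min
        neg = (total_neg - s) % 2            # sign product excluding edge i
        out.append(-mag if neg else mag)
    return out

msgs = [0.9, -0.2, 1.4, -0.6]
assert decompress(*compress(msgs)) == check_node_update(msgs)
print(compress(msgs))   # (0.2, 0.6, 1, [False, True, False, True])
```

For a check node of degree d, the memory per node drops from d magnitudes to two magnitudes, one index, and d sign bits, which is the kind of saving that directly reduces both memory capacity and write power.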

  2. Design of a seismo-acoustic station for Antarctica.

    PubMed

    Contrafatto, Danilo; Fasone, Rosario; Ferro, Angelo; Larocca, Graziano; Laudani, Giuseppe; Rapisarda, Salvatore; Scuderi, Luciano; Zuccarello, Luciano; Privitera, Eugenio; Cannata, Andrea

    2018-04-01

    In recent years, seismological studies in Antarctica have contributed plenty of new knowledge in many fields of earth science. Moreover, acoustic investigations are now also considered a powerful tool that provides insights for many different objectives, such as analyses of regional climate-related changes and studies of volcanic degassing and explosive activities. However, installation and maintenance of scientific instrumentation in Antarctica can be really challenging. Indeed, the instruments have to face the most extreme climate on the planet. They must be tolerant of very low temperatures and robust enough to survive strong winds. Moreover, one of the most critical tasks is powering a remote system year-round at polar latitudes. In this work, we present a novel seismo-acoustic station designed to work reliably in polar regions. To enable year-round seismo-acoustic data collection in such a remote, extreme environment, a hybrid powering system is used, integrating solar panels, a wind generator, and batteries. A power management system was specifically developed to either charge the battery bank or divert energy surplus to warm the enclosure or release the excess energy to the outside environment. Finally, due to the prohibitive environmental conditions at most Antarctic installation sites, the station was designed to be deployed quickly.

  3. Design of a seismo-acoustic station for Antarctica

    NASA Astrophysics Data System (ADS)

    Contrafatto, Danilo; Fasone, Rosario; Ferro, Angelo; Larocca, Graziano; Laudani, Giuseppe; Rapisarda, Salvatore; Scuderi, Luciano; Zuccarello, Luciano; Privitera, Eugenio; Cannata, Andrea

    2018-04-01

    In recent years, seismological studies in Antarctica have contributed plenty of new knowledge in many fields of earth science. Moreover, acoustic investigations are now also considered a powerful tool that provides insights for many different objectives, such as analyses of regional climate-related changes and studies of volcanic degassing and explosive activities. However, installation and maintenance of scientific instrumentation in Antarctica can be really challenging. Indeed, the instruments have to face the most extreme climate on the planet. They must be tolerant of very low temperatures and robust enough to survive strong winds. Moreover, one of the most critical tasks is powering a remote system year-round at polar latitudes. In this work, we present a novel seismo-acoustic station designed to work reliably in polar regions. To enable year-round seismo-acoustic data collection in such a remote, extreme environment, a hybrid powering system is used, integrating solar panels, a wind generator, and batteries. A power management system was specifically developed to either charge the battery bank or divert energy surplus to warm the enclosure or release the excess energy to the outside environment. Finally, due to the prohibitive environmental conditions at most Antarctic installation sites, the station was designed to be deployed quickly.

  4. The ADE scorecards: a tool for adverse drug event detection in electronic health records.

    PubMed

    Chazard, Emmanuel; Băceanu, Adrian; Ferret, Laurie; Ficheur, Grégoire

    2011-01-01

    Although several methods exist for Adverse Drug events (ADE) detection due to past hospitalizations, a tool that could display those ADEs to the physicians does not exist yet. This article presents the ADE Scorecards, a Web tool that enables to screen past hospitalizations extracted from Electronic Health Records (EHR), using a set of ADE detection rules, presently rules discovered by data mining. The tool enables the physicians to (1) get contextualized statistics about the ADEs that happen in their medical department, (2) see the rules that are useful in their department, i.e. the rules that could have enabled to prevent those ADEs and (3) review in detail the ADE cases, through a comprehensive interface displaying the diagnoses, procedures, lab results, administered drugs and anonymized records. The article shows a demonstration of the tool through a use case.

  5. 29 CFR 1926.304 - Woodworking tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Tools-Hand and Power § 1926.304 Woodworking tools. (a) Disconnect switches. All fixed power driven woodworking tools shall be provided with a disconnect..., power-driven circular saws shall be equipped with guards above and below the base plate or shoe. The...

  6. A standard-enabled workflow for synthetic biology.

    PubMed

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce types of design information, including multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  7. Concentrating solar power (CSP) power cycle improvements through application of advanced materials

    NASA Astrophysics Data System (ADS)

    Siefert, John A.; Libby, Cara; Shingledecker, John

    2016-05-01

    Concentrating solar power (CSP) systems with thermal energy storage (TES) capability offer unique advantages to other renewable energy technologies in that solar radiation can be captured and stored for utilization when the sun is not shining. This makes the technology attractive as a dispatchable resource, and as such the Electric Power Research Institute (EPRI) has been engaged in research and development activities to understand and track the technology, identify key technical challenges, and enable improvements to meet future cost and performance targets to enable greater adoption of this carbon-free energy resource. EPRI is also involved with technically leading a consortium of manufacturers, government labs, and research organizations to enable the next generation of fossil fired power plants with advanced ultrasupercritical (A-USC) steam temperatures up to 760°C (1400°F). Materials are a key enabling technology for both of these seemingly opposed systems. This paper discusses how major strides in structural materials for A-USC fossil fired power plants may be translated into improved CSP systems which meet target requirements.

  8. How a future energy world could look?

    NASA Astrophysics Data System (ADS)

    Ewert, M.

    2012-10-01

    The future energy system will change significantly within the next years as a result of the following Mega Trends: de-carbonization, urbanization, fast technology development, individualization, glocalization (globalization and localization) and changing demographics. Increasing fluctuating renewable production will change the role of non-renewable generation. Distributed energy from renewables and micro generation will change the direction of the energy flow in the electricity grids. Production will not follow demand but demand has to follow production. This future system is enabled by the fast technical development of information and communication technologies which will be present in the entire system. In this paper the results of a comprehensive analysis with different scenarios is summarized. Tools were used like the analysis of policy trends in the European countries, modelling of the European power grid, modelling of the European power markets and the analysis of technology developments with cost reduction potentials. With these tools the interaction of the main actors in the energy markets like conventional generation and renewable generation, grid transport, electricity storage including new storage options from E-Mobility, Power to Gas, Compressed Air Energy storage and demand side management were considered. The potential application of technologies and investments in new energy technologies were analyzed within existing frameworks and markets as well as new business models in new markets with different frameworks. In the paper the over all trend of this analysis is presented by describing a potential future energy world. This world represents only one of numerous options with comparable characteristics.

  9. Assessment of quasi-linear effect of RF power spectrum for enabling lower hybrid current drive in reactor plasmas

    NASA Astrophysics Data System (ADS)

    Cesario, Roberto; Cardinali, Alessandro; Castaldo, Carmine; Amicucci, Luca; Ceccuzzi, Silvio; Galli, Alessandro; Napoli, Francesco; Panaccione, Luigi; Santini, Franco; Schettini, Giuseppe; Tuccillo, Angelo Antonio

    2017-10-01

    The main research on the energy from thermonuclear fusion uses deuterium plasmas magnetically trapped in toroidal devices. To suppress the turbulent eddies that impair thermal insulation and pressure tight of the plasma, current drive (CD) is necessary, but tools envisaged so far are unable accomplishing this task while efficiently and flexibly matching the natural current profiles self-generated at large radii of the plasma column [1-5]. The lower hybrid current drive (LHCD) [6] can satisfy this important need of a reactor [1], but the LHCD system has been unexpectedly mothballed on JET. The problematic extrapolation of the LHCD tool at reactor graded high values of, respectively, density and temperatures of plasma has been now solved. The high density problem is solved by the FTU (Frascati Tokamak Upgrade) method [7], and solution of the high temperature one is presented here. Model results based on quasi-linear (QL) theory evidence the capability, w.r.t linear theory, of suitable operating parameters of reducing the wave damping in hot reactor plasmas. Namely, using higher RF power densities [8], or a narrower antenna power spectrum in refractive index [9,10], the obstacle for LHCD represented by too high temperature of reactor plasmas should be overcome. The former method cannot be used for routinely, safe antenna operations, Thus, only the latter key is really exploitable in a reactor. The proposed solutions are ultimately necessary for viability of an economic reactor.

  10. L-Py: An L-System Simulation Framework for Modeling Plant Architecture Development Based on a Dynamic Language

    PubMed Central

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high-level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems and thus enabling to use a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom. PMID:22670147

  11. L-py: an L-system simulation framework for modeling plant architecture development based on a dynamic language.

    PubMed

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high-level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems and thus enabling to use a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.

  12. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges including how to manage such massive amounts of data that can be created by the system. This paper reports on the progress of the creation of cyberinfrastructure tools which hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data-provider and system end-users with secured access to various levels of information of a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental condition (e.g., wind speed, wind direction, temperature, humidity) are uploaded to the database from sensor networks installed in the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework that has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties, etc.) Sample data interrogation clients including those for the detection of faulty sensors, automated modal parameter extraction.

  13. Design Space Toolbox V2: Automated Software Enabling a Novel Phenotype-Centric Modeling Strategy for Natural and Synthetic Biological Systems

    PubMed Central

    Lomnitz, Jason G.; Savageau, Michael A.

    2016-01-01

    Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment, and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype, and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3, and 4 stable steady states. 
In one example, inspection of the basins of attraction reveals that the circuit can count between three stable states by transient stimulation through one of two input channels: a positive channel that increases the count, and a negative channel that decreases the count. This example shows the power of these new automated methods to rapidly identify behaviors of interest and efficiently predict parameter values for their realization. These tools may be applied to understand complex natural circuitry and to aid in the rational design of synthetic circuits. PMID:27462346

  14. SOSPAC- SOLAR SPACE POWER ANALYSIS CODE

    NASA Technical Reports Server (NTRS)

    Selcuk, M. K.

    1994-01-01

    The Solar Space Power Analysis Code, SOSPAC, was developed to examine the solar thermal and photovoltaic power generation options available for a satellite or spacecraft in low earth orbit. SOSPAC is a preliminary systems analysis tool and enables the engineer to compare the areas, weights, and costs of several candidate electric and thermal power systems. The configurations studied include photovoltaic arrays and parabolic dish systems to produce electricity only, and in various combinations to provide both thermal and electric power. SOSPAC has been used for comparison and parametric studies of proposed power systems for the NASA Space Station. The initial requirements are projected to be about 40 kW of electrical power, and a similar amount of thermal power with temperatures above 1000 degrees Centigrade. For objects in low earth orbit, the aerodynamic drag caused by suitably large photovoltaic arrays is very substantial. Smaller parabolic dishes can provide thermal energy at a collection efficiency of about 80%, but at increased cost. SOSPAC allows an analysis of cost and performance factors of five hybrid power generating systems. Input includes electrical and thermal power requirements, sun and shade durations for the satellite, and unit weight and cost for subsystems and components. Performance equations of the five configurations are derived, and the output tabulates total weights of the power plant assemblies, area of the arrays, efficiencies, and costs. SOSPAC is written in FORTRAN IV for batch execution and has been implemented on an IBM PC computer operating under DOS with a central memory requirement of approximately 60K of 8 bit bytes. This program was developed in 1985.

  15. Pellet to Part Manufacturing System for CNCs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roschli, Alex C.; Love, Lonnie J.; Post, Brian K.

    Oak Ridge National Laboratory’s Manufacturing Demonstration Facility worked with Hybrid Manufacturing Technologies to develop a compact prototype composite additive manufacturing head that can effectively extrude injection molding pellets. The head interfaces with conventional CNC machine tools enabling rapid conversion of conventional machine tools to additive manufacturing tools. The intent was to enable wider adoption of Big Area Additive Manufacturing (BAAM) technology and combine BAAM technology with conventional machining systems.

  16. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techiques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  17. 30 CFR 56.14116 - Hand-held power tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...

  18. 30 CFR 56.14116 - Hand-held power tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...

  19. 30 CFR 56.14116 - Hand-held power tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...

  20. 30 CFR 57.14116 - Hand-held power tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...

  1. 30 CFR 56.14116 - Hand-held power tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...

  2. 30 CFR 57.14116 - Hand-held power tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...

  3. 30 CFR 57.14116 - Hand-held power tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...

  4. 30 CFR 56.14116 - Hand-held power tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...

  5. 30 CFR 57.14116 - Hand-held power tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...

  6. 30 CFR 57.14116 - Hand-held power tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...

  7. The Blue LED Nobel Prize: Historical context, current scientific understanding, human benefit

    DOE PAGES

    Tsao, Jeffrey Y.; Han, Jung; Haitz, Roland H.; ...

    2015-06-19

    Here, the paths that connect scientific understanding with tools and technology are rarely linear. Sometimes scientific understanding leads and enables, sometimes tools and technologies lead and enable. But by feeding on each other, they create virtuous spirals of forward and backward innovation.

  8. The Blue LED Nobel Prize: Historical context, current scientific understanding, human benefit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Han, Jung; Haitz, Roland H.

    Here, the paths that connect scientific understanding with tools and technology are rarely linear. Sometimes scientific understanding leads and enables, sometimes tools and technologies lead and enable. But by feeding on each other, they create virtuous spirals of forward and backward innovation.

  9. NMR Methods, Applications and Trends for Groundwater Evaluation and Management

    NASA Astrophysics Data System (ADS)

    Walsh, D. O.; Grunewald, E. D.

    2011-12-01

    Nuclear magnetic resonance (NMR) measurements have a tremendous potential for improving groundwater characterization, as they provide direct detection and measurement of groundwater and unique information about pore-scale properties. NMR measurements, commonly used in chemistry and medicine, are utilized in geophysical investigations through non-invasive surface NMR (SNMR) or downhole NMR logging measurements. Our recent and ongoing research has focused on improving the performance and interpretation of NMR field measurements for groundwater characterization. Engineering advancements have addressed several key technical challenges associated with SNMR measurements. Susceptibility of SNMR measurements to environmental noise has been dramatically reduced through the development of multi-channel acquisition hardware and noise-cancellation software. Multi-channel instrumentation (up to 12 channels) has also enabled more efficient 2D and 3D imaging. Previous limitations in measuring NMR signals from water in silt, clay and magnetic geology have been addressed by shortening the instrument dead-time from 40 ms to 4 ms, and increasing the power output. Improved pulse sequences have been developed to more accurately estimate NMR relaxation times and their distributions, which are sensitive to pore size distributions. Cumulatively, these advancements have vastly expanded the range of environments in which SNMR measurements can be obtained, enabling detection of groundwater in smaller pores, in magnetic geology, in the unsaturated zone, and nearby to infrastructure (presented here in case studies). NMR logging can provide high-resolution estimates of bound and mobile water content and pore size distributions. While NMR logging has been utilized in oil and gas applications for decades, its use in groundwater investigations has been limited by the large size and high cost of oilfield NMR logging tools and services. 
Recently, engineering efforts funded by the US Department of Energy have produced an NMR logging tool that is much smaller and less costly than comparable oilfield NMR logging tools. This system is specifically designed for near surface groundwater investigations, incorporates small diameter probes (as small as 1.67 inches diameter) and man-portable surface stations, and provides NMR data and information content on par with oilfield NMR logging tools. A direct-push variant of this logging tool has also been developed. Key challenges associated with small diameter tools include inherently lower SNR and logging speeds, the desire to extend the sensitive zone as far as possible into unconsolidated formations, and simultaneously maintaining high power and signal fidelity. Our ongoing research in groundwater NMR aims to integrating surface and borehole measurements for regional-scale permeability mapping, and to develop in-place NMR sensors for long term monitoring of contaminant and remediation processes. In addition to groundwater resource characterization, promising new applications of NMR include assessing water content in ice and permafrost, management of groundwater in mining operations, and evaluation and management of groundwater in civil engineering applications.

  10. Spinoff 2013

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information 
for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.

  11. Radiation tolerant compact image sensor using CdTe photodiode and field emitter array (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Masuzawa, Tomoaki; Neo, Yoichiro; Mimura, Hidenori; Okamoto, Tamotsu; Nagao, Masayoshi; Akiyoshi, Masafumi; Sato, Nobuhiro; Takagi, Ikuji; Tsuji, Hiroshi; Gotoh, Yasuhito

    2016-10-01

    A growing demand for incident detection has been recognized since the Great East Japan Earthquake and the subsequent accidents at the Fukushima nuclear power plant in 2011. Radiation-tolerant image sensors are powerful tools for collecting crucial information in the initial stages of such incidents. However, semiconductor-based image sensors such as CMOS and CCD devices have limited tolerance to radiation exposure. Image sensors used in nuclear facilities are conventional vacuum tubes with thermal cathodes, which are large and consume considerable power. In this study, we propose a compact image sensor composed of a CdTe-based photodiode and a matrix-driven Spindt-type electron beam source called a field emitter array (FEA). The basic principle of FEA-based image sensors is similar to that of conventional Vidicon-type camera tubes, but the thermal cathode is replaced with an FEA. The use of a field emitter as the electron source should enable a significant size reduction while maintaining high radiation tolerance. Current research on radiation-tolerant FEAs and the development of CdTe-based photoconductive films will be presented.

  12. Gate-Tuned Thermoelectric Power in Black Phosphorus.

    PubMed

    Saito, Yu; Iizuka, Takahiko; Koretsune, Takashi; Arita, Ryotaro; Shimizu, Sunao; Iwasa, Yoshihiro

    2016-08-10

    The electric field effect is a useful means of elucidating intrinsic material properties as well as of designing functional devices. The electric-double-layer transistor (EDLT) enables control of the carrier density over a wide range and has recently proved to be an effective tool for investigating thermoelectric properties. Here, we report gate-tuning of the thermoelectric power in a black phosphorus (BP) single-crystal flake with a thickness of 40 nm. Using an EDLT configuration, we successfully control the thermoelectric power (S) and find that the S of ion-gated BP reaches +510 μV/K at 210 K in the hole-depleted state, which is much higher than the reported bulk single-crystal value of +340 μV/K at 300 K. We compared these experimental data with first-principles-based calculations and found that the enhancement is qualitatively explained by the effective thinning of the conduction channel of the BP flake and the nonuniformity of the channel owing to gate operation in depletion mode. Our results provide new opportunities for further engineering BP as a thermoelectric material at the nanoscale.
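    The quantity being gated here, the thermoelectric power S, is the open-circuit thermovoltage per unit temperature difference (the standard textbook definition, not a formula taken from the paper):

    ```latex
    % Seebeck coefficient (thermoelectric power): positive S indicates
    % hole-dominated transport, consistent with the +510 uV/K reported
    % for the hole-depleted state.
    S = \frac{\Delta V}{\Delta T}
    ```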

  13. Capacity and reliability analyses with applications to power quality

    NASA Astrophysics Data System (ADS)

    Azam, Mohammad; Tu, Fang; Shlapak, Yuri; Kirubarajan, Thiagalingam; Pattipati, Krishna R.; Karanam, Rajaiah

    2001-07-01

    The deregulation of energy markets, the ongoing advances in communication networks, the proliferation of intelligent metering and protective power devices, and the standardization of software/hardware interfaces are creating a dramatic shift in the way facilities acquire and utilize information about their power usage. Currently available power management systems gather a vast amount of information in the form of power usage, voltages, currents, and their time-dependent waveforms from a variety of devices (for example, circuit breakers, transformers, energy and power quality meters, protective relays, programmable logic controllers, and motor control centers). What is lacking is an information processing and decision support infrastructure to harness this voluminous information into usable operational and management knowledge to manage the health of equipment and power quality, minimize downtime and outages, and optimize operations to improve productivity. This paper considers the problem of capacity and reliability analysis for power systems with very high availability requirements (e.g., systems providing energy to data centers and communication networks with desired availability of up to 0.9999999). Real-time capacity and margin analysis helps operators plan for additional loads and schedule repair/replacement activities. The reliability analysis, based on a computationally efficient sum of disjoint products, enables analysts to decide the optimum levels of redundancy, aids operators in prioritizing maintenance options for a given budget, and supports monitoring the system for capacity margin. The resulting analytical and software tool is demonstrated on a sample data center.
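    The series/parallel arithmetic behind such availability targets can be sketched as follows. This is a minimal illustration with assumed component availabilities; the paper's sum-of-disjoint-products method handles general network topologies, not just series/parallel structures.

    ```python
    # Hypothetical sketch: availability of a redundant power path, illustrating
    # the kind of reliability arithmetic such a tool automates. All component
    # availabilities below are assumed values, not figures from the paper.

    def parallel_availability(a: float, n: int) -> float:
        """Availability of n identical components in parallel:
        the system fails only if all n fail simultaneously."""
        return 1.0 - (1.0 - a) ** n

    def series_availability(*avails: float) -> float:
        """Availability of components in series: all must be up."""
        result = 1.0
        for a in avails:
            result *= a
        return result

    # A data-center feed: utility feed in series with an N+1 UPS bank
    # (2 units, each 99.9% available) and a distribution panel.
    ups_bank = parallel_availability(0.999, 2)            # 1 - 0.001^2 = 0.999999
    system = series_availability(0.9995, ups_bank, 0.9999)
    print(f"system availability: {system:.7f}")
    ```

    Even with a near-perfect UPS bank, the series elements dominate: reaching "seven nines" requires redundancy on every element of the path, which is exactly the redundancy-level trade-off the analysis supports.
    
    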

  14. Enabling lunar and space missions by laser power transmission

    NASA Technical Reports Server (NTRS)

    Deyoung, R. J.; Nealy, J. E.; Humes, D. H.; Meador, W. E.

    1992-01-01

    Applications are proposed for laser power transmission on the Moon. A solar-pumped laser in lunar orbit would beam power to the lunar surface for conversion into either electricity or propulsion needs. For example, lunar rovers could be much more flexible and lighter than rovers using other primary power sources. Also, laser power could be absorbed by lunar soil to create a hard glassy surface for dust-free roadways and launch pads. Laser power could also be used to power small lunar rockets or orbital transfer vehicles, and finally, photovoltaic laser converters could power remote excavation vehicles and human habitats. Laser power transmission is shown to be a highly flexible, enabling primary power source for lunar missions.

  15. Metabolic Engineering for the Production of Natural Products

    PubMed Central

    Pickens, Lauren B.; Tang, Yi; Chooi, Yit-Heng

    2014-01-01

    Natural products and natural product derived compounds play an important role in modern healthcare as frontline treatments for many diseases and as inspiration for chemically synthesized therapeutics. With advances in sequencing and recombinant DNA technology, many of the biosynthetic pathways responsible for the production of these chemically complex and pharmaceutically valuable compounds have been elucidated. With an ever expanding toolkit of biosynthetic components, metabolic engineering is an increasingly powerful method to improve natural product titers and generate novel compounds. Heterologous production platforms have enabled access to pathways from difficult to culture strains; systems biology and metabolic modeling tools have resulted in increasing predictive and analytic capabilities; advances in expression systems and regulation have enabled the fine-tuning of pathways for increased efficiency; and characterization of individual pathway components has facilitated the construction of hybrid pathways for the production of new compounds. These advances in the many aspects of metabolic engineering have not only yielded fascinating scientific discoveries but also make it an increasingly viable approach for the optimization of natural product biosynthesis. PMID:22432617

  16. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  17. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    PubMed

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI utilizes a constant high voltage to remotely induce the generation of single-polarity pulsed electrospray. This method significantly boosts sample economy, sustaining several minutes of MS signal from a sample of merely picoliter volume. The elongated MS signal duration enables us to collect abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. This method was successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, covering 1034 components for Allium cepa cells and 656 components for HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  18. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    A Condition Monitoring System (CMS) delivers substantial economic benefits and enables prognostic maintenance for wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive-train CMS, enabling early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient at diagnosing machine faults under time-varying conditions. Current research in CMS for drive trains focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.
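    The core difficulty the survey addresses can be illustrated with a time-frequency transform. Below is a small sketch (not from the survey) in which a linear chirp stands in for a vibration signal recorded during a turbine run-up; a short-time Fourier transform tracks the drifting dominant frequency that a single global FFT would smear out. The signal parameters are illustrative assumptions.

    ```python
    # Illustrative sketch: tracking the dominant frequency of a synthetic
    # variable-speed vibration signal with an STFT. The chirp (50 -> 400 Hz
    # over 2 s) mimics a shaft accelerating under varying wind conditions.
    import numpy as np
    from scipy.signal import chirp, stft

    fs = 2000.0                                   # sampling rate, Hz
    t = np.arange(0, 2.0, 1.0 / fs)
    x = chirp(t, f0=50.0, f1=400.0, t1=2.0, method="linear")

    # Short-time Fourier transform: 256-sample frames, default 50% overlap.
    f, seg_t, Zxx = stft(x, fs=fs, nperseg=256)
    peak_track = f[np.abs(Zxx).argmax(axis=0)]    # dominant frequency per frame

    print(f"start ~{peak_track[1]:.0f} Hz, end ~{peak_track[-2]:.0f} Hz")
    ```

    The per-frame peak rises monotonically with shaft speed, which is why non-stationary representations (STFT, wavelets, empirical mode decomposition and their refinements surveyed here) are preferred over stationary spectral analysis for variable-speed drive trains.
    
    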

  19. Electromagnetic Design of Feedhorn-Coupled Transition-Edge Sensors for Cosmic Microwave Background Polarimetry

    NASA Technical Reports Server (NTRS)

    Chuss, David T.

    2011-01-01

    Observations of the cosmic microwave background (CMB) provide a powerful tool for probing the evolution of the early universe. Specifically, precision measurement of the polarization of the CMB enables a direct test for cosmic inflation. A key technological element on the path to the measurement of this faint signal is the capability to produce large format arrays of background-limited detectors. We describe the electromagnetic design of feedhorn-coupled, TES-based sensors. Each linear orthogonal polarization from the feed horn is coupled to a superconducting microstrip line via a symmetric planar orthomode transducer (OMT). The symmetric OMT design allows for highly-symmetric beams with low cross-polarization over a wide bandwidth. In addition, this architecture enables a single microstrip filter to define the passband for each polarization. Care has been taken in the design to eliminate stray coupling paths to the absorbers. These detectors will be fielded in the Cosmology Large Angular Scale Surveyor (CLASS).

  20. Using mobile phones as acoustic sensors for high-throughput mosquito surveillance

    PubMed Central

    Mukundarajan, Haripriya; Hol, Felix Jan Hein; Castillo, Erica Araceli; Newby, Cooper

    2017-01-01

    The direct monitoring of mosquito populations in field settings is a crucial input for shaping appropriate and timely control measures for mosquito-borne diseases. Here, we demonstrate that commercially available mobile phones are a powerful tool for acoustically mapping mosquito species distributions worldwide. We show that even low-cost mobile phones with very basic functionality are capable of sensitively acquiring acoustic data on species-specific mosquito wingbeat sounds, while simultaneously recording the time and location of the human-mosquito encounter. We survey a wide range of medically important mosquito species, to quantitatively demonstrate how acoustic recordings supported by spatio-temporal metadata enable rapid, non-invasive species identification. As proof-of-concept, we carry out field demonstrations where minimally-trained users map local mosquitoes using their personal phones. Thus, we establish a new paradigm for mosquito surveillance that takes advantage of the existing global mobile network infrastructure, to enable continuous and large-scale data acquisition in resource-constrained areas. PMID:29087296

  1. Perturbing Tandem Energy Transfer in Luminescent Heterobinuclear Lanthanide Coordination Polymer Nanoparticles Enables Real-Time Monitoring of Release of the Anthrax Biomarker from Bacterial Spores.

    PubMed

    Gao, Nan; Zhang, Yunfang; Huang, Pengcheng; Xiang, Zhehao; Wu, Fang-Ying; Mao, Lanqun

    2018-06-05

    Lanthanide-based luminescent sensors have been widely used for the detection of the anthrax biomarker dipicolinic acid (DPA). However, because most rely mainly on DPA sensitization of the lanthanide core, they have failed to achieve robust detection of DPA in bacterial spores. We propose a new strategy for reliable detection of DPA by perturbing a tandem energy transfer in heterobinuclear lanthanide coordination polymer nanoparticles simply constructed from two kinds of lanthanide ions, Tb3+ and Eu3+, and guanosine 5'-monophosphate. This smart luminescent probe was demonstrated to exhibit a highly sensitive and selective visual luminescence color change upon exposure to DPA, enabling accurate detection of DPA in complex biosystems such as bacterial spores. DPA release from bacterial spores upon physiological germination was also successfully monitored in real time by confocal imaging. This probe is thus expected to be a powerful tool for efficient detection of bacterial spores in response to anthrax threats.

  2. Using mobile phones as acoustic sensors for high-throughput mosquito surveillance.

    PubMed

    Mukundarajan, Haripriya; Hol, Felix Jan Hein; Castillo, Erica Araceli; Newby, Cooper; Prakash, Manu

    2017-10-31

    The direct monitoring of mosquito populations in field settings is a crucial input for shaping appropriate and timely control measures for mosquito-borne diseases. Here, we demonstrate that commercially available mobile phones are a powerful tool for acoustically mapping mosquito species distributions worldwide. We show that even low-cost mobile phones with very basic functionality are capable of sensitively acquiring acoustic data on species-specific mosquito wingbeat sounds, while simultaneously recording the time and location of the human-mosquito encounter. We survey a wide range of medically important mosquito species, to quantitatively demonstrate how acoustic recordings supported by spatio-temporal metadata enable rapid, non-invasive species identification. As proof-of-concept, we carry out field demonstrations where minimally-trained users map local mosquitoes using their personal phones. Thus, we establish a new paradigm for mosquito surveillance that takes advantage of the existing global mobile network infrastructure, to enable continuous and large-scale data acquisition in resource-constrained areas.
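    The acoustic core of this approach is frequency estimation: each species has a characteristic wingbeat frequency band, so identifying the dominant spectral peak of a short clip is the first step toward classification. The sketch below is a hypothetical minimal version of that step (sampling rate and the 600 Hz test tone are illustrative, not values from the paper).

    ```python
    # Hypothetical sketch: estimating a wingbeat frequency from a short audio
    # clip via the peak of its FFT magnitude spectrum.
    import numpy as np

    def dominant_frequency(signal: np.ndarray, sample_rate: float) -> float:
        """Return the frequency (Hz) of the strongest spectral component."""
        windowed = signal * np.hanning(len(signal))   # taper to reduce leakage
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        return float(freqs[np.argmax(spectrum)])

    # A 0.5 s synthetic 600 Hz tone standing in for a recorded wingbeat.
    fs = 8000.0
    t = np.arange(0, 0.5, 1.0 / fs)
    tone = np.sin(2 * np.pi * 600.0 * t)
    print(f"estimated wingbeat: {dominant_frequency(tone, fs):.0f} Hz")
    ```

    In practice the estimated frequency would be matched against per-species reference distributions together with the recording's time and location metadata, as the paper describes.
    
    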

  3. An interactive environment for agile analysis and visualization of ChIP-sequencing data.

    PubMed

    Lerdrup, Mads; Johansen, Jens Vilstrup; Agrawal-Singh, Shuchi; Hansen, Klaus

    2016-04-01

    To empower experimentalists with a means for fast and comprehensive chromatin immunoprecipitation sequencing (ChIP-seq) data analyses, we introduce an integrated computational environment, EaSeq. The software combines the exploratory power of genome browsers with an extensive set of interactive and user-friendly tools for genome-wide abstraction and visualization. It enables experimentalists to easily extract information and generate hypotheses from their own data and public genome-wide datasets. For demonstration purposes, we performed meta-analyses of public Polycomb ChIP-seq data and established a new screening approach to analyze more than 900 datasets from mouse embryonic stem cells for factors potentially associated with Polycomb recruitment. EaSeq, which is freely available and works on a standard personal computer, can substantially increase the throughput of many analysis workflows, facilitate transparency and reproducibility by automatically documenting and organizing analyses, and enable a broader group of scientists to gain insights from ChIP-seq data.

  4. Intravital imaging by simultaneous label-free autofluorescence-multiharmonic microscopy.

    PubMed

    You, Sixian; Tu, Haohua; Chaney, Eric J; Sun, Yi; Zhao, Youbo; Bower, Andrew J; Liu, Yuan-Zhi; Marjanovic, Marina; Sinha, Saurabh; Pu, Yang; Boppart, Stephen A

    2018-05-29

    Intravital microscopy (IVM) emerged and matured as a powerful tool for elucidating pathways in biological processes. Although label-free multiphoton IVM is attractive for its non-perturbative nature, its wide application has been hindered, mostly due to the limited contrast of each imaging modality and the challenge to integrate them. Here we introduce simultaneous label-free autofluorescence-multiharmonic (SLAM) microscopy, a single-excitation source nonlinear imaging platform that uses a custom-designed excitation window at 1110 nm and shaped ultrafast pulses at 10 MHz to enable fast (2-orders-of-magnitude improvement), simultaneous, and efficient acquisition of autofluorescence (FAD and NADH) and second/third harmonic generation from a wide array of cellular and extracellular components (e.g., tumor cells, immune cells, vesicles, and vessels) in living tissue using only 14 mW for extended time-lapse investigations. Our work demonstrates the versatility and efficiency of SLAM microscopy for tracking cellular events in vivo, and is a major enabling advance in label-free IVM.

  5. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    PubMed

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of cell mechanisms using different technologies, in order to explain the relationships among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated to be an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat, in a combined way, these different microarray formats coupled with clinical data. In fact, the resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs, regarding molecular data) as well as temporal data (e.g. the response to a drug, time to progression and survival rate, regarding clinical data). Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays, which were not supported in μ-CS. Micro-Analyzer is provided as a standalone Java tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power Tools), (ii) the manual loading of preprocessing libraries, and (iii) the management of intermediate files, such as results and metadata. Micro-Analyzer users can directly manage Affymetrix binary data without worrying about locating and invoking the proper preprocessing tools and chip-specific libraries. Moreover, users can load the preprocessed data directly into the well-known TM4 platform, in this way also extending the TM4 capabilities. Consequently, Micro-Analyzer offers the following advantages: (i) it reduces possible errors in the preprocessing and further analysis phases, e.g. due to the incorrect choice of parameters or the use of old libraries; (ii) it enables the combined and centralized preprocessing of different arrays; (iii) it may enhance the quality of further analysis by storing the workflow, i.e. information about the preprocessing steps; and (iv) Micro-Analyzer is freely available as a standalone application at the project web site http://sourceforge.net/projects/microanalyzer/. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. The effect of magnesium ions on chromosome structure as observed by helium ion microscopy.

    PubMed

    Dwiranti, Astari; Hamano, Tohru; Takata, Hideaki; Nagano, Shoko; Guo, Hongxuan; Onishi, Keiko; Wako, Toshiyuki; Uchiyama, Susumu; Fukui, Kiichi

    2014-02-01

    One of the few established conclusions about chromosome structure is that Mg2+ is required for the organization of chromosomes. Scanning electron microscopy is a powerful tool for studying chromosome morphology, but because chromosomes are nonconductive, they require a metal/carbon coating that may conceal information about the detailed surface structure of the sample. Helium ion microscopy (HIM), which has recently been developed, does not require sample coating due to its charge compensation system. Here we investigated the structure of isolated human chromosomes under different Mg2+ concentrations by HIM. The high-contrast, high-resolution images obtained by HIM from uncoated samples enabled investigation of the effects of Mg2+ on chromosome structure. Chromatin fiber information was obtained more clearly with uncoated than with coated chromosomes. Our results suggest that both the overall features and the detailed structure of chromatin are significantly affected by different Mg2+ concentrations. Chromosomes were more condensed, and a globular chromatin structure 30 nm in diameter was visualized, with 5 mM Mg2+ treatment, while 0 mM Mg2+ resulted in a less compact and more fibrous structure 11 nm in diameter. We conclude that HIM is a powerful tool for investigating chromosomes and other biological samples without requiring metal/carbon coating.

  7. Knowledge to Action - Understanding Natural Hazards-Induced Power Outage Scenarios for Actionable Disaster Responses

    NASA Astrophysics Data System (ADS)

    Kar, B.; Robinson, C.; Koch, D. B.; Omitaomu, O.

    2017-12-01

    The Sendai Framework for Disaster Risk Reduction 2015-2030 identified the following four priorities to prevent and reduce disaster risks: i) understanding disaster risk; ii) strengthening governance to manage disaster risk; iii) investing in disaster risk reduction for resilience; and iv) enhancing disaster preparedness for effective response, and to "Build Back Better" in recovery, rehabilitation and reconstruction. While forecasting and decision making tools are in place to predict and understand future impacts of natural hazards, the current knowledge-to-action approach fails to provide the updated information decision makers need to undertake response and recovery efforts following a hazard event. For instance, during a tropical storm event, advisories are released every two to three hours, but manual analysis of geospatial data to determine the event's potential impacts tends to be time-consuming and a post-event process. Researchers at Oak Ridge National Laboratory have developed a Spatial Decision Support System that enables real-time analysis of storm impact based on each updated advisory. A prototype of the tool, which focuses on determining projected power outage areas and projected outage durations, demonstrates the feasibility of integrating science with decision making so that emergency management personnel can act in real time to protect communities and reduce risk.

  8. EMU battery/SMM power tool characterization study

    NASA Technical Reports Server (NTRS)

    Palandati, C.

    1982-01-01

    The power tool that will be used to replace the attitude control system in the SMM spacecraft was modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery was tested for the power tool application. The results show that the EMU battery is capable of operating the power tool within a pulse current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.

  9. Probabilistic Weather Information Tailored to the Needs of Transmission System Operators

    NASA Astrophysics Data System (ADS)

    Alberts, I.; Stauch, V.; Lee, D.; Hagedorn, R.

    2014-12-01

    Reliable and accurate forecasts of wind and photovoltaic (PV) power production are essential for stable transmission systems. A high potential for improving wind and PV power forecasts lies in optimizing the weather forecasts, since these energy sources are highly weather dependent. For this reason, the main objective of the German research project EWeLiNE is to improve the quality of the underlying numerical weather predictions for energy operations. In this project, the German Meteorological Service (DWD), the Fraunhofer Institute for Wind Energy and Energy System Technology, and three of the German transmission system operators (TSOs) are working together to improve the weather and power forecasts. Probabilistic predictions are of particular interest, as the quantification of uncertainties provides an important tool for risk management. Theoretical considerations suggest that it can be advantageous to use probabilistic information to represent and respond to the remaining uncertainties in the forecasts. However, it remains a challenge to integrate this information into the decision making processes related to market participation and power systems operations. The project is planned and carried out in close cooperation with the involved TSOs in order to ensure the usability of the products developed. It will conclude with a demonstration phase, in which the improved models and newly developed products are combined into a process chain and used to provide information to TSOs in a real-time decision support tool. The use of a web-based development platform enables short development cycles and agile adaptation to evolving user needs. This contribution will present the EWeLiNE project and discuss ideas on how to incorporate probabilistic information into the users' current decision making processes.

  10. Range and Endurance Tradeoffs on Personal Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher A.

    2016-01-01

    Rotorcraft design has always been a challenging tradeoff among overall size, capabilities, complexity, and other factors based on available technology and customer requirements. Advancements in propulsion, energy systems and other technologies have enabled new vehicles and missions; complementary advances in analysis methods and tools enable exploration of these enhanced vehicles and the evolving mission design space. A system study was performed to better understand the interdependency between vehicle design and propulsion system capabilities versus hover/loiter requirements and range capability. Three representative vertical lift vehicles were developed to explore the tradeoff in capability between hover efficiency versus range and endurance capability. The vehicles were a single-main rotor helicopter, a tilt rotor, and a vertical take-off and landing (VTOL) aircraft. Vehicle capability was limited to two or three people (including pilot or crew) and maximum range within one hour of flight (100-200 miles, depending on vehicle). Two types of propulsion and energy storage systems were used in this study. First was traditional hydrocarbon-fueled cycles (such as Otto, diesel or gas turbine cycles). Second was an all-electric system using electric motors, power management and distribution, assuming batteries for energy storage, with the possibility of hydrocarbon-fueled range extenders. The high power requirements for hover significantly reduced mission radius capability. Loiter was less power intensive, resulting in about 1/2 the equivalent mission radius penalty. With so many design variables, the VTOL aircraft has the potential to perform well for a variety of missions. This vehicle is a good candidate for additional study; component model development is also required to adequately assess performance over the design space of interest.

  11. Range and Endurance Tradeoffs on Personal Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher A.

    2016-01-01

    Rotorcraft design has always been a challenging tradeoff among overall size, capabilities, complexity, and other factors based on available technology and customer requirements. Advancements in propulsion, energy systems and other technologies have enabled new vehicles and missions; complementary advances in analysis methods and tools enable exploration of these enhanced vehicles and the evolving mission design space. A system study was performed to better understand the interdependency between vehicle design and propulsion system capabilities versus hover / loiter requirements and range capability. Three representative vertical lift vehicles were developed to explore the tradeoff in capability between hover efficiency versus range and endurance capability. The vehicles were a single-main rotor helicopter, a tilt rotor, and a vertical take-off and landing (VTOL) aircraft. Vehicle capability was limited to two or three people (including pilot or crew) and maximum range within one hour of flight (100-200 miles, depending on vehicle). Two types of propulsion and energy storage systems were used in this study. First was traditional hydrocarbon-fueled cycles (such as Otto, diesel or gas turbine cycles). Second was an all-electric system using electric motors, power management and distribution, assuming batteries for energy storage, with the possibility of hydrocarbon-fueled range extenders. The high power requirements for hover significantly reduced mission radius capability. Loiter was less power intensive, resulting in about 1/2 the equivalent mission radius penalty. With so many design variables, the VTOL aircraft has the potential to perform well for a variety of missions. This vehicle is a good candidate for additional study; component model development is also required to adequately assess performance over the design space of interest.
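    The dominance of hover in the power budget can be seen from classical momentum theory, in which ideal induced hover power scales as thrust to the 3/2 power and inversely with the square root of rotor disk area. The sketch below uses that textbook relation, not the study's sizing model, and the vehicle numbers are illustrative assumptions.

    ```python
    # Momentum-theory sketch of ideal hover power: P = T^(3/2) / sqrt(2*rho*A).
    # Illustrative numbers for a light two/three-seat rotorcraft; not the
    # study's sizing model.
    import math

    def ideal_hover_power(mass_kg: float, rotor_radius_m: float,
                          rho: float = 1.225, g: float = 9.81) -> float:
        """Ideal induced power in hover, in watts."""
        thrust = mass_kg * g                          # N, thrust = weight
        disk_area = math.pi * rotor_radius_m ** 2     # m^2
        return thrust ** 1.5 / math.sqrt(2.0 * rho * disk_area)

    # A 600 kg vehicle: a large rotor hovers far more cheaply than a small one,
    # which is why disk loading drives the helicopter-vs-VTOL-aircraft tradeoff.
    p_large = ideal_hover_power(600.0, 4.0)   # helicopter-like rotor
    p_small = ideal_hover_power(600.0, 1.5)   # small-disk VTOL lift system
    print(f"{p_large/1e3:.0f} kW vs {p_small/1e3:.0f} kW")
    ```

    Real hover power is higher still (figure of merit, profile losses), so battery-powered variants pay the hover penalty in mission radius exactly as the study observes.
    
    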

  12. Earth to Orbit Beamed Energy Experiment

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Montgomery, Edward E.

    2017-01-01

    As a means of primary propulsion, beamed energy propulsion offers the benefit of offloading much of the propulsion system mass from the vehicle, increasing its potential performance and freeing it from the constraints of the rocket equation. For interstellar missions, beamed energy propulsion is arguably the most viable option in the near to mid term. A near-term demonstration showing the feasibility of beamed energy propulsion is necessary and, fortunately, feasible using existing technologies. Key enabling technologies are large-area, low-mass spacecraft and efficient, safe, high-power laser systems capable of long-distance propagation. NASA is currently developing the spacecraft technology through the Near Earth Asteroid Scout solar sail mission and has signed agreements with the Planetary Society to study the feasibility of precursor laser propulsion experiments using their LightSail-2 solar sail spacecraft. The capabilities of Space Situational Awareness assets and the advanced analytical tools available for fine-resolution orbit determination now make it possible to investigate the practicalities of an Earth-to-orbit Beamed Energy eXperiment (EBEX) - a demonstration at delivered power levels that only illuminate a spacecraft without causing damage to it. The degree to which this can be expected to produce a measurable change in the orbit of a low-ballistic-coefficient spacecraft is investigated. Key system characteristics and estimated performance are derived for a near-term mission opportunity involving the LightSail-2 spacecraft and laser power levels modest in comparison to those proposed previously. While the technology demonstrated by such an experiment is not sufficient to enable an interstellar precursor mission, if approved, it would be the next step toward that goal.
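    The scale of force such an experiment involves can be bounded with the ideal radiation-pressure relation F = 2P/c for a perfectly reflective sail. The sketch below uses illustrative numbers (a LightSail-class mass and a laser power chosen here for illustration, not values from the paper):

```python
# Radiation-pressure force on a reflective sail: F = (1 + reflectivity) * P / c.
C = 299_792_458.0  # speed of light, m/s

def sail_force(delivered_power_w, reflectivity=1.0):
    return (1.0 + reflectivity) * delivered_power_w / C

# Illustrative numbers (not from the EBEX study): 10 kW delivered
# to a ~5 kg LightSail-class spacecraft.
force = sail_force(10e3)   # ~6.7e-5 N
accel = force / 5.0        # ~1.3e-5 m/s^2
```

    Accelerations this small are why fine-resolution orbit determination is needed to detect the resulting orbit change.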

  13. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    PubMed

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.

  14. Future Opportunities for Dynamic Power Systems for NASA Missions

    NASA Technical Reports Server (NTRS)

    Shaltens, Richard K.

    2007-01-01

    Dynamic power systems have the potential to be used in Radioisotope Power Systems (RPS) and Fission Surface Power Systems (FSPS) to provide high efficiency, reliable and long life power generation for future NASA applications and missions. Dynamic power systems have been developed by NASA over the decades, but none have ever operated in space. Advanced Stirling convertors are currently being developed at the NASA Glenn Research Center. These systems have demonstrated high efficiencies to enable high system specific power (>8 W(sub e)/kg) for 100 W(sub e) class Advanced Stirling Radioisotope Generators (ASRG). The ASRG could enable significant extended and expanded operation on the Mars surface and on long-life deep space missions. In addition, advanced high power Stirling convertors (>150 W(sub e)/kg), for use with surface fission power systems, could provide power ranging from 30 to 50 kWe, and would be enabling for both lunar and Mars exploration. This paper will discuss the status of various energy conversion options currently under development by NASA Glenn for the Radioisotope Power System Program for NASA's Science Mission Directorate (SMD) and the Prometheus Program for the Exploration Systems Mission Directorate (ESMD).

  15. J-Earth: An Essential Resource for Terrestrial Remote Sensing and Data Analysis

    NASA Astrophysics Data System (ADS)

    Dunn, S.; Rupp, J.; Cheeseman, S.; Christensen, P. R.; Prashad, L. C.; Dickenshied, S.; Anwar, S.; Noss, D.; Murray, K.

    2011-12-01

    There is a need for a software tool that can display and analyze various types of earth science and social data through a simple, user-friendly interface. The J-Earth software tool has been designed to be easily accessible for download and intuitive to use, regardless of the technical background of the user base. The tool can be learned without courses or textbooks, yet is powerful enough to allow a general community of users to perform complex data analysis. Professions that will benefit from this tool range from geologists, geographers, and climatologists to sociologists, economists, and ecologists, as well as policy makers. J-Earth was developed by the Arizona State University Mars Space Flight Facility as part of the JMARS (Java Mission-planning and Analysis for Remote Sensing) suite of open-source tools. The program is a Geographic Information Systems (GIS) application used for viewing and processing satellite and airborne remote sensing data. While the functionality of JMARS has historically focused on the research needs of the planetary science community, J-Earth has been designed for a much broader Earth-based user audience. NASA instrument products accessible within J-Earth include data from ASTER, GOES, Landsat, MODIS, and TIMS. While J-Earth contains exceptionally comprehensive and high-resolution satellite-derived data and imagery, this tool also includes many socioeconomic data products from projects led by international organizations and universities. Datasets used in J-Earth take the form of grids, rasters, remote sensor "stamps", maps, and shapefiles. Some highly demanded global datasets available within J-Earth include five levels of administrative/political boundaries, climate data for current conditions as well as models for future climates, population counts and densities, land cover/land use, and poverty indicators.
While this application shares the same powerful functionality as JMARS, J-Earth's appearance is enhanced for much easier data analysis. J-Earth utilizes a layering system to view data from different sources, which can then be exported, scaled, colored and superimposed for quick comparisons. Users may now perform spatial analysis over several diverse datasets with respect to a defined geographic area or the entire globe. In addition, several newly acquired global datasets contain a temporal dimension which, when accessed through J-Earth, makes this a unique and powerful tool for spatial analysis over time. The functionality and ease of use set J-Earth apart from other terrestrial GIS software packages and enable endless social, political, and scientific possibilities.

  16. Development of Asset Management Decision Support Tools for Power Equipment

    NASA Astrophysics Data System (ADS)

    Okamoto, Tatsuki; Takahashi, Tsuguhiro

    Development of asset management decision support tools has become very active as a means to reduce the maintenance costs of power equipment following the liberalization of the power business. This article reviews the present status of asset management decision support tool development for power equipment, based on papers published in international conferences, domestic conventions, and several journals.

  17. miRNet - dissecting miRNA-target interactions and functional associations through network-based visual analysis

    PubMed Central

    Fan, Yannan; Siklenka, Keith; Arora, Simran K.; Ribeiro, Paula; Kimmins, Sarah; Xia, Jianguo

    2016-01-01

    MicroRNAs (miRNAs) can regulate nearly all biological processes and their dysregulation is implicated in various complex diseases and pathological conditions. Recent years have seen a growing number of functional studies of miRNAs using high-throughput experimental technologies, which have produced a large amount of high-quality data regarding miRNA target genes and their interactions with small molecules, long non-coding RNAs, epigenetic modifiers, disease associations, etc. These rich sets of information have enabled the creation of comprehensive networks linking miRNAs with various biologically important entities to shed light on their collective functions and regulatory mechanisms. Here, we introduce miRNet, an easy-to-use web-based tool that offers statistical, visual and network-based approaches to help researchers understand miRNA functions and regulatory mechanisms. The key features of miRNet include: (i) a comprehensive knowledge base integrating high-quality miRNA-target interaction data from 11 databases; (ii) support for differential expression analysis of data from microarray, RNA-seq and quantitative PCR; (iii) a flexible interface for data filtering, refinement and customization during network creation; (iv) a powerful, fully featured network visualization system coupled with enrichment analysis. miRNet offers a comprehensive tool suite to enable statistical analysis and functional interpretation of various data generated from current miRNA studies. miRNet is freely available at http://www.mirnet.ca. PMID:27105848
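    miRNet itself is a web tool, but the network reasoning it supports can be sketched in a few lines: build a miRNA-target graph, rank miRNAs by degree to find hubs, and collect genes under shared regulation. The interaction list below is illustrative only, not data from the miRNet knowledge base:

```python
from collections import defaultdict

# Illustrative miRNA -> target-gene interactions (not real database content).
interactions = [
    ("miR-21", "PTEN"), ("miR-21", "PDCD4"), ("miR-21", "TPM1"),
    ("miR-155", "SOCS1"), ("miR-155", "PTEN"),
    ("let-7a", "MYC"),
]

degree = defaultdict(int)       # number of targets per miRNA
targets = defaultdict(set)      # miRNAs regulating each gene
for mirna, gene in interactions:
    degree[mirna] += 1
    targets[gene].add(mirna)

# miRNAs ranked by target count ("hubs" of the network).
hubs = sorted(degree, key=degree.get, reverse=True)
# Genes targeted by more than one miRNA (shared regulation).
shared = {g for g, m in targets.items() if len(m) > 1}
```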

  18. The ESA Gaia Archive: Data Release 1

    NASA Astrophysics Data System (ADS)

    Salgado, J.; González-Núñez, J.; Gutiérrez-Sánchez, R.; Segovia, J. C.; Durán, J.; Hernández, J. L.; Arviset, C.

    2017-10-01

    The ESA Gaia mission is producing the most accurate source catalogue in astronomy to date. Due to the size and complexity of the data, this presents an archiving challenge: making the information and data accessible to astronomers in an efficient way. New astronomical missions, which collect ever larger volumes of data, are reinforcing this shift in how archives are developed. Archives are evolving from simple applications for accessing data into complex data centre structures where computing power services are available to users and data mining tools are integrated on the server side. For astronomy missions that involve large catalogues, such as Gaia (or Euclid to come), the usual ways of working on the data need to change to a new paradigm: "move the code close to the data". This implies that data mining functionalities are becoming a must to allow the maximum scientific exploitation of the data. To enable these capabilities, a TAP+ interface, crossmatch capabilities, full catalogue histograms, and serialisation of intermediate results in cloud resources such as VOSpace have been implemented for Gaia Data Release 1 (DR1), so that the community can exploit these science resources without bottlenecks in connection bandwidth. We present the architecture, infrastructure and tools already available in the Gaia Archive for DR1 (http://archives.esac.esa.int/gaia/) and describe their capabilities.
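    "Move the code close to the data" means expressing the analysis as a server-side ADQL query rather than downloading the catalogue. A sketch, assuming the DR1 table and column names `gaiadr1.gaia_source`, `ra`, `dec` and `phot_g_mean_mag`:

```python
def cone_search_adql(ra_deg, dec_deg, radius_deg, limit=100):
    """Build an ADQL cone-search query to be executed server-side via TAP."""
    return (
        f"SELECT TOP {limit} source_id, ra, dec, phot_g_mean_mag "
        "FROM gaiadr1.gaia_source "
        "WHERE 1=CONTAINS(POINT('ICRS', ra, dec), "
        f"CIRCLE('ICRS', {ra_deg}, {dec_deg}, {radius_deg}))"
    )

query = cone_search_adql(56.75, 24.12, 0.5)  # Pleiades region
```

    The resulting string can be submitted to the archive's TAP+ endpoint with any TAP client (for example astroquery's Gaia module), so only the selected rows cross the network.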

  19. Data management routines for reproducible research using the G-Node Python Client library

    PubMed Central

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J.; Garbers, Christian; Rautenberg, Philipp L.; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654
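    The selection-by-metadata and slicing pattern described here can be shown in plain Python (an illustration of the pattern only, not the actual G-Node client API):

```python
# Toy recordings: metadata dict plus a signal array (illustrative data).
recordings = [
    {"meta": {"subject": "rat01", "stimulus": "grating"}, "signal": list(range(1000))},
    {"meta": {"subject": "rat01", "stimulus": "blank"},   "signal": list(range(1000))},
    {"meta": {"subject": "rat02", "stimulus": "grating"}, "signal": list(range(1000))},
]

def select(records, **criteria):
    """Return recordings whose metadata match all given key/value pairs."""
    return [r for r in records
            if all(r["meta"].get(k) == v for k, v in criteria.items())]

grating = select(recordings, stimulus="grating")
segment = grating[0]["signal"][100:200]   # slice a region of interest
```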

  20. PeakML/mzMatch: a file format, Java library, R library, and tool-chain for mass spectrometry data analysis.

    PubMed

    Scheltema, Richard A; Jankevics, Andris; Jansen, Ritsert C; Swertz, Morris A; Breitling, Rainer

    2011-04-01

    The recent proliferation of high-resolution mass spectrometers has generated a wealth of new data analysis methods. However, flexible integration of these methods into configurations best suited to the research question is hampered by heterogeneous file formats and monolithic software development. The mzXML, mzData, and mzML file formats have enabled uniform access to unprocessed raw data. In this paper we present our efforts to produce an equally simple and powerful format, PeakML, to uniformly exchange processed intermediary and result data. To demonstrate the versatility of PeakML, we have developed an open source Java toolkit for processing, filtering, and annotating mass spectra in a customizable pipeline (mzMatch), as well as a user-friendly data visualization environment (PeakML Viewer). The PeakML format in particular enables the flexible exchange of processed data between software created by different groups or companies, as we illustrate by providing a PeakML-based integration of the widely used XCMS package with mzMatch data processing tools. As an added advantage, downstream analysis can benefit from direct access to the full mass trace information underlying summarized mass spectrometry results, providing the user with the means to rapidly verify results. The PeakML/mzMatch software is freely available at http://mzmatch.sourceforge.net, with documentation, tutorials, and a community forum.
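    The value of a shared intermediary format is that processed results survive a round trip between tools from different groups. The sketch below uses a deliberately simplified, hypothetical peak record, not the real PeakML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified peak list (NOT the actual PeakML schema): the
# point is that processed results travel in one declared, parseable format.
def peaks_to_xml(peaks):
    root = ET.Element("peaklist")
    for mz, rt, intensity in peaks:
        ET.SubElement(root, "peak", mz=str(mz), rt=str(rt),
                      intensity=str(intensity))
    return ET.tostring(root, encoding="unicode")

def xml_to_peaks(text):
    return [(float(p.get("mz")), float(p.get("rt")), float(p.get("intensity")))
            for p in ET.fromstring(text).iter("peak")]

doc = peaks_to_xml([(301.1412, 65.2, 1.8e5), (449.2307, 120.7, 9.3e4)])
roundtrip = xml_to_peaks(doc)   # lossless round trip between "tools"
```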

  1. Broad spectrum microarray for fingerprint-based bacterial species identification

    PubMed Central

    2010-01-01

    Background Microarrays are powerful tools for DNA-based molecular diagnostics and identification of pathogens. Most target a limited range of organisms and are based on only one or a very few genes for specific identification. Such microarrays are limited to organisms for which specific probes are available, and often have difficulty discriminating closely related taxa. We have developed an alternative broad-spectrum microarray that employs hybridisation fingerprints generated by high-density anonymous markers distributed over the entire genome for identification based on comparison to a reference database. Results A high-density microarray carrying 95,000 unique 13-mer probes was designed. Optimized methods were developed to deliver reproducible hybridisation patterns that enabled confident discrimination of bacteria at the species, subspecies, and strain levels. High correlation coefficients were achieved between replicates. A sub-selection of 12,071 probes, determined by ANOVA and class prediction analysis, enabled the discrimination of all samples in our panel. Mismatch probe hybridisation was observed but was found to have no effect on the discriminatory capacity of our system. Conclusions These results indicate the potential of our genome chip for reliable identification of a wide range of bacterial taxa at the subspecies level without laborious prior sequencing and probe design. With its high resolution capacity, our proof-of-principle chip demonstrates great potential as a tool for molecular diagnostics of broad taxonomic groups. PMID:20163710
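    Fingerprint-based identification reduces to comparing a sample's hybridisation pattern against a reference database and reporting the best correlate. A toy sketch with made-up intensities and far fewer probes than the 95,000 on the chip:

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length intensity profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Illustrative reference fingerprints (5 probes instead of thousands).
reference = {
    "E. coli":   [0.9, 0.1, 0.8, 0.2, 0.7],
    "S. aureus": [0.1, 0.9, 0.2, 0.8, 0.1],
}
sample = [0.85, 0.15, 0.75, 0.25, 0.65]

best = max(reference, key=lambda k: pearson(sample, reference[k]))
```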

  2. Unleashing Empirical Equations with "Nonlinear Fitting" and "GUM Tree Calculator"

    NASA Astrophysics Data System (ADS)

    Lovell-Smith, J. W.; Saunders, P.; Feistel, R.

    2017-10-01

    Empirical equations having large numbers of fitted parameters, such as the international standard reference equations published by the International Association for the Properties of Water and Steam (IAPWS), which form the basis of the "Thermodynamic Equation of Seawater—2010" (TEOS-10), provide the means to calculate many quantities very accurately. The parameters of these equations are found by least-squares fitting to large bodies of measurement data. However, the usefulness of these equations is limited since uncertainties are not readily available for most of the quantities able to be calculated, the covariance of the measurement data is not considered, and further propagation of the uncertainty in the calculated result is restricted since the covariance of calculated quantities is unknown. In this paper, we present two tools developed at MSL that are particularly useful in unleashing the full power of such empirical equations. "Nonlinear Fitting" enables propagation of the covariance of the measurement data into the parameters using generalized least-squares methods. The parameter covariance then may be published along with the equations. Then, when using these large, complex equations, "GUM Tree Calculator" enables the simultaneous calculation of any derived quantity and its uncertainty, by automatic propagation of the parameter covariance into the calculated quantity. We demonstrate these tools in exploratory work to determine and propagate uncertainties associated with the IAPWS-95 parameters.
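    The two steps described, propagating the data covariance into the parameters and then into any derived quantity, can be sketched for a toy linear model (illustrative data, not IAPWS-95):

```python
import numpy as np

# 1) Generalized least squares: the measurement covariance cov_y is
#    propagated into the parameter covariance cov_b.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 3.0 * x                      # exact data for a deterministic demo
cov_y = 0.01 * np.eye(4)               # assumed measurement covariance

X = np.column_stack([np.ones_like(x), x])   # design matrix for y = b0 + b1*x
W = np.linalg.inv(cov_y)
cov_b = np.linalg.inv(X.T @ W @ X)     # publishable parameter covariance
beta = cov_b @ X.T @ W @ y             # GLS estimate, here exactly [2, 3]

# 2) Propagate cov_b into a derived quantity q = b0 + b1*x0 via its Jacobian.
x0 = 1.5
J = np.array([1.0, x0])                # dq/db0, dq/db1
q = float(J @ beta)
u_q = float(np.sqrt(J @ cov_b @ J))    # standard uncertainty of q
```

    In the tools described above, the first step is the role of "Nonlinear Fitting" and the second is automated by "GUM Tree Calculator"; the sketch only shows the underlying linear algebra.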

  3. Data management routines for reproducible research using the G-Node Python Client library.

    PubMed

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J; Garbers, Christian; Rautenberg, Philipp L; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow.

  4. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory (EMSL) construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high-performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem-solving environment for all phases of computational chemistry research: setting up calculations with a sophisticated GUI and direct-manipulation visualization tools, submitting and monitoring calculations on remote high-performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  5. Constraint-based modeling in microbial food biotechnology

    PubMed Central

    Rau, Martin H.

    2018-01-01

    Genome-scale metabolic network reconstruction offers a means to leverage the value of the exponentially growing genomics data and integrate it with other biological knowledge in a structured format. Constraint-based modeling (CBM) enables both the qualitative and quantitative analyses of the reconstructed networks. The rapid advancements in these areas can benefit both the industrial production of microbial food cultures and their application in food processing. CBM provides several avenues for improving our mechanistic understanding of physiology and genotype–phenotype relationships. This is essential for the rational improvement of industrial strains, which can further be facilitated through various model-guided strain design approaches. CBM of microbial communities offers a valuable tool for the rational design of defined food cultures, where it can catalyze hypothesis generation and provide unintuitive rationales for the development of enhanced community phenotypes and, consequently, novel or improved food products. In the industrial-scale production of microorganisms for food cultures, CBM may enable a knowledge-driven bioprocess optimization by rationally identifying strategies for growth and stability improvement. Through these applications, we believe that CBM can become a powerful tool for guiding the areas of strain development, culture development and process optimization in the production of food cultures. Nevertheless, in order to make the correct choice of the modeling framework for a particular application and to interpret model predictions in a biologically meaningful manner, one should be aware of the current limitations of CBM. PMID:29588387
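    The quantitative analysis that CBM enables can be illustrated with flux balance analysis on a toy three-reaction network (a sketch of the method, not a real genome-scale reconstruction):

```python
# Toy network:  uptake: -> A,   conversion: A -> B,   biomass: B ->
# Maximize biomass flux subject to steady state S v = 0 and flux bounds.
import numpy as np
from scipy.optimize import linprog

S = np.array([[ 1, -1,  0],    # metabolite A balance
              [ 0,  1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units
c = [0, 0, -1]                 # linprog minimizes, so negate biomass flux

res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)
biomass_flux = res.x[2]        # optimum is limited by uptake, i.e. 10
```

    Real reconstructions have thousands of reactions and are usually handled with dedicated packages such as COBRApy, but the optimization problem has exactly this shape.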

  6. Web-Based Architecture to Enable Compute-Intensive CAD Tools and Multi-user Synchronization in Teleradiology

    NASA Astrophysics Data System (ADS)

    Mehta, Neville; Kompalli, Suryaprakash; Chaudhary, Vipin

    Teleradiology is the electronic transmission of radiological patient images, such as x-rays, CT, or MR across multiple locations. The goal could be interpretation, consultation, or medical records keeping. Information technology solutions have enabled electronic records and their associated benefits are evident in health care today. However, salient aspects of collaborative interfaces, and computer assisted diagnostic (CAD) tools are yet to be integrated into workflow designs. The Computer Assisted Diagnostics and Interventions (CADI) group at the University at Buffalo has developed an architecture that facilitates web-enabled use of CAD tools, along with the novel concept of synchronized collaboration. The architecture can support multiple teleradiology applications and case studies are presented here.

  7. Flat-plate photovoltaic power systems handbook for Federal agencies

    NASA Technical Reports Server (NTRS)

    Cochrane, E. H.; Lawson, A. C.; Savage, C. H.

    1984-01-01

    The primary purpose is to provide a tool for personnel in Federal agencies to evaluate the viability of potential photovoltaic applications. A second objective is to provide descriptions of various photovoltaic systems installed by different Federal agencies under the Federal Photovoltaic Utilization Program so that other agencies may consider similar applications. A third objective is to share lessons learned to enable more effective procurement, design, installation, and operation of future photovoltaic systems. The intent is not to provide a complete handbook, but rather to provide a guide for Federal agency personnel with additional information incorporated by references. The steps to be followed in selecting, procuring, and installing a photovoltaic application are given.

  8. Energy management system turns data into market info

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Traynor, P.J.; Ackerman, W.J.

    1996-09-01

    The designers claim that Wisconsin Power & Light Co's new energy management system is the first system of its type in the world in terms of the comprehensiveness and scope of its stored and retrievable data. Furthermore, the system's link to the utility's generating assets enables power plant management to dispatch generation resources based on up-to-date unit characteristics. That means that the new system gives WP&L a competitive tool to optimize operations as well as fine-tune its EMS based on timely load and unit response information. Additionally, the EMS gives WP&L insight into the complex issues related to the unbundling of generation resources.

  9. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
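    The simplest of the calibration approaches mentioned, external calibration with pure standards, amounts to fitting a straight line to the standards and inverting it for the unknown; a sketch with exactly linear, made-up count data:

```python
# Illustrative calibration data (not real ICP-MS measurements).
concs  = [0.0, 10.0, 20.0, 50.0]           # standard concentrations, ng/g
counts = [50.0, 5050.0, 10050.0, 25050.0]  # detector counts: 500*c + 50

# Ordinary least-squares line through the standards.
n = len(concs)
mx, my = sum(concs) / n, sum(counts) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(concs, counts))
         / sum((x - mx) ** 2 for x in concs))
intercept = my - slope * mx

# Invert the calibration line for an unknown sample.
sample_counts = 12550.0
sample_conc = (sample_counts - intercept) / slope
```

    An internal-standard variant would divide analyte counts by internal-standard counts before fitting, correcting for instrument drift and matrix effects.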

  10. An introduction to the new Productivity Information Management System (PIMS)

    NASA Technical Reports Server (NTRS)

    Hull, R.

    1982-01-01

    The Productivity Information Management System (PIMS) is described. The main objective of this computerized system is to enable management scientists to interactively explore data concerning DSN operations, maintenance, and repairs, and to develop and verify models for management planning. PIMS will provide a powerful set of tools for iteratively manipulating data sets in a wide variety of ways. The initial version of PIMS will be a small-scale pilot system. The following topics are discussed: (1) the motivation for developing PIMS; (2) the various data sets that will be integrated by PIMS; (3) the overall design of PIMS; and (4) how PIMS will be used. A survey of relevant databases concerning DSN operations at Goldstone is also included.

  11. Europe PMC: a full-text literature database for the life sciences and platform for innovation

    PubMed Central

    2015-01-01

    This article describes recent developments of Europe PMC (http://europepmc.org), the leading database for life science literature. Formerly known as UKPMC, the service was rebranded in November 2012 as Europe PMC to reflect the scope of the funding agencies that support it. Several new developments have enriched Europe PMC considerably since then. Europe PMC now offers RESTful web services to access both articles and grants, powerful search tools such as citation-count sort order and data citation features, a service to add publications to your ORCID, a variety of export formats, and an External Links service that enables any related resource to be linked from Europe PMC content. PMID:25378340
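    As a sketch of how the RESTful web services mentioned above are typically called, the snippet below builds a search URL (endpoint path and parameter names as assumed here; consult the Europe PMC web-service documentation for the authoritative interface):

```python
from urllib.parse import urlencode

# Assumed Europe PMC search endpoint (verify against the service docs).
BASE = "https://www.ebi.ac.uk/europepmc/webservices/rest/search"

def search_url(query, fmt="json", page_size=25):
    """Build a Europe PMC search request URL without performing the call."""
    return BASE + "?" + urlencode(
        {"query": query, "format": fmt, "pageSize": page_size})

url = search_url('"open access" AND SRC:MED')
# The URL can be fetched with any HTTP client; the response carries the
# article records and citation information used for the sort orders above.
```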

  12. Motility, Force Generation, and Energy Consumption of Unicellular Parasites.

    PubMed

    Hochstetter, Axel; Pfohl, Thomas

    2016-07-01

    Motility is a key factor for pathogenicity of unicellular parasites, enabling them to infiltrate and evade host cells, and perform several of their life-cycle events. State-of-the-art methods of motility analysis rely on a combination of optical tweezers with high-resolution microscopy and microfluidics. With this technology, propulsion forces, energies, and power generation can be determined so as to shed light on the motion mechanisms, chemotactic behavior, and specific survival strategies of unicellular parasites. With these new tools in hand, we can elucidate the mechanisms of motility and force generation of unicellular parasites, and identify ways to manipulate and eventually inhibit them. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Data Movement Dominates: Advanced Memory Technology to Address the Real Exascale Power Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Keren

    Energy is the fundamental barrier to Exascale supercomputing and is dominated by the cost of moving data from one point to another, not computation. Similarly, performance is dominated by data movement, not computation. The solution to this problem requires three critical technologies: 3D integration, optical chip-to-chip communication, and a new communication model. The central goal of the Sandia-led "Data Movement Dominates" project was to develop memory systems and new architectures based on these technologies that have the potential to lower the cost of local memory accesses by orders of magnitude and provide substantially more bandwidth. Only through these transformational advances can future systems reach the goals of Exascale computing with manageable power budgets. The Sandia-led team included co-PIs from Columbia University, Lawrence Berkeley Lab, and the University of Maryland. The Columbia effort of Data Movement Dominates focused on developing a physically accurate simulation environment and experimental verification for optically-connected memory (OCM) systems that can enable continued performance scaling through high-bandwidth capacity, energy-efficient bit-rate transparency, and time-of-flight latency. With OCM, memory device parallelism and total capacity can scale to match future high-performance computing requirements without sacrificing data-movement efficiency. When we consider systems with integrated photonics, links to memory can be seamlessly integrated with the interconnection network - in a sense, memory becomes a primary aspect of the interconnection network. At the core of the Columbia effort, toward expanding our understanding of OCM-enabled computing, we have created an integrated modeling and simulation environment that uniquely integrates the physical behavior of the optical layer.
The PhoenxSim suite of design and software tools developed under this effort has enabled the co-design of and performance evaluation photonics-enabled OCM architectures on Exascale computing systems.« less
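    As a back-of-envelope illustration of the time-of-flight latency the record mentions (not taken from the PhoenixSim tools themselves), the propagation delay of an optical link follows directly from its length and the group index of the medium; the group index below is an assumed value typical of silica fiber.

```python
# Illustrative sketch: one-way time-of-flight latency of an optical link,
# a factor the record cites for optically-connected memory (OCM).
# GROUP_INDEX is an assumed value typical of silica fiber, not a project number.

C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
GROUP_INDEX = 1.468        # assumed group index of the optical medium

def time_of_flight_ns(link_length_m: float, group_index: float = GROUP_INDEX) -> float:
    """Propagation delay in nanoseconds for a link of the given length."""
    return link_length_m * group_index / C_VACUUM * 1e9

# A 1 m optical link to memory adds roughly 4.9 ns of one-way latency.
print(round(time_of_flight_ns(1.0), 2))
```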

  14. A midas plugin to enable construction of reproducible web-based image processing pipelines

    PubMed Central

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web-based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based user interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK-based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016

  15. A midas plugin to enable construction of reproducible web-based image processing pipelines.

    PubMed

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web-based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based user interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK-based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.
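    The pipeline idea the record describes (an expert encapsulates processing steps with sensible defaults so a non-expert can run them end to end) can be sketched generically. The following is a hypothetical illustration, not the Midas plugin's actual API; the step names and toy data are invented.

```python
# Hypothetical sketch (not the Midas plugin API): an "expert" composes
# processing steps into a pipeline that a non-expert can run as one call.

def make_pipeline(*steps):
    """Compose callables left to right into a single pipeline function."""
    def run(data):
        for step in steps:
            data = step(data)
        return data
    return run

# Toy "image" as a list of intensities, with two invented steps:
denoise = lambda img: [round(v, 1) for v in img]      # stand-in smoothing step
normalize = lambda img: [v / max(img) for v in img]   # scale intensities to [0, 1]

pipeline = make_pipeline(denoise, normalize)
print(pipeline([1.0, 2.0, 4.0]))   # [0.25, 0.5, 1.0]
```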

  16. Freeform Fluidics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehoff, Ryan R; Love, Lonnie J; Lind, Randall F

    This work explores the integration of miniaturized fluid power and additive manufacturing. Oak Ridge National Laboratory (ORNL) has been developing an approach to miniaturized fluidic actuation and control that enables high dexterity, low cost, and a pathway towards energy efficiency. Previous work focused on mesoscale digital control valves (high pressure, low flow) and the integration of actuation and fluid passages directly with the structure, the primary application being fluid-powered robotics. The fundamental challenge was part complexity. ORNL's new additive manufacturing technologies (e-beam, laser, and ultrasonic deposition) enable freeform manufacturing using conventional metal alloys with excellent mechanical properties. The combination of these two technologies, miniaturized fluid power and additive manufacturing, can enable a paradigm shift in fluid power, increasing efficiency while simultaneously reducing weight, size, complexity, and cost. This paper focuses on the impact additive manufacturing can have on new forms of fluid power components and systems. We begin with a description of additive manufacturing processes, highlighting the strengths and weaknesses of each technology. Next we describe fundamental results of material characterization to understand the design and mechanical limits of parts made with the e-beam process. A novel design approach is introduced that enables integration of fluid-powered actuation with mechanical structure. Finally, we describe a proof-of-principle demonstration: an anthropomorphic (human-like) hydraulically powered hand with integrated power supply and actuation.

  17. APT, The Phase I Tool for HST Cycle 12

    NASA Astrophysics Data System (ADS)

    Blacker, B.; Berch, M.; Curtis, G.; Douglas, R.; Downes, R.; Krueger, A.; O'Dea, C.

    2002-12-01

    In our continuing effort to streamline our systems and improve service to the science community, the Space Telescope Science Institute (STScI) is developing and releasing APT, the Astronomer's Proposal Tool, as the new interface for Hubble Space Telescope (HST) Phase I and Phase II proposal submissions for HST Cycle 12. The goal of APT is to bring state-of-the-art technology, more visual tools, and power into the hands of proposers so that they can optimize the scientific return of their HST programs. Proposing for HST and other missions consists of requesting observing time and/or archival research funding. This step is called Phase I, where the scientific merit of a proposal is considered by a community-based peer-review process. Accepted proposals then proceed through Phase II, where the observations are specified in sufficient detail to enable scheduling on the telescope. In this paper we present our concept and implementation plans for our Phase I development and submission tool, APT. In addition, we go behind the scenes and discuss the implications for the Science Policies Division (SPD) and other groups at STScI caused by a new submission tool and submission output products. The Space Telescope Science Institute (STScI) is operated by the Association of Universities for Research in Astronomy, Inc., for the National Aeronautics and Space Administration.

  18. Nuclear Energy for Space Exploration

    NASA Technical Reports Server (NTRS)

    Houts, Michael G.

    2010-01-01

    Nuclear power and propulsion systems can enable exciting space exploration missions. These include bases on the Moon and Mars and the exploration, development, and utilization of the solar system. In the near term, fission surface power systems could provide abundant, constant, cost-effective power anywhere on the surface of the Moon or Mars, independent of available sunlight. Affordable access to Mars, the asteroid belt, or other destinations could be provided by nuclear thermal rockets. In the longer term, high-performance fission power supplies could enable both extremely high power levels on planetary surfaces and fission electric propulsion vehicles for rapid, efficient cargo and crew transfer. Advanced fission propulsion systems could eventually allow routine access to the entire solar system. Fission systems could also enable the utilization of resources within the solar system. Fusion and antimatter systems may also be viable in the future.

  19. An Adjunct Galilean Satellite Orbiter Using a Small Radioisotope Power Source

    NASA Technical Reports Server (NTRS)

    Abelson, Robert Dean; Randolph, J.; Alkalai, L.; Collins, D.; Moore, W.

    2005-01-01

    This is a conceptual mission study intended to demonstrate the range of possible missions and applications that could be enabled were a new generation of Small Radioisotope Power Systems to be developed by NASA and DOE. While such systems are currently being considered by NASA and DOE, they do not currently exist. This study is one of several small RPS-enabled mission concepts that were studied and presented in the NASA/JPL document "Enabling Exploration with Small Radioisotope Power Systems" available at: http://solarsystem.nasa.gov/multimedia/download-detail.cfm?DL_ID=82

  20. Lifecycle Prognostics Architecture for Selected High-Cost Active Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Lybeck; B. Pham; M. Tawfik

    There is an extensive body of knowledge, and some commercial products available, for calculating prognostics, remaining useful life, and damage index parameters. The application of these technologies within the nuclear power community is still in its infancy. Online monitoring and condition-based maintenance are seeing increasing acceptance and deployment, and these activities provide the technological bases for expanding to add predictive/prognostic capabilities. In looking to deploy prognostics, three key aspects of systems are presented and discussed: (1) component/system/structure selection, (2) prognostic algorithms, and (3) prognostic architectures. Criteria are presented for component selection: feasibility, failure probability, consequences of failure, and benefits of the prognostics and health management (PHM) system. The basis and methods commonly used for prognostic algorithms are reviewed and summarized. Criteria for evaluating PHM architectures are presented: open, modular architecture; platform independence; graphical user interface for system development and/or results viewing; web-enabled tools; scalability; and standards compatibility. Thirteen software products were identified and discussed in the context of being potentially useful for deployment in a PHM program applied to systems in a nuclear power plant (NPP). These products were evaluated by using information available from company websites, product brochures, fact sheets, scholarly publications, and direct communication with vendors. The thirteen products were classified into four groups of software: (1) research tools, (2) PHM system development tools, (3) deployable architectures, and (4) peripheral tools. Eight software tools fell into the deployable architectures category. Of those eight, only two employ all six modules of a full PHM system.
Five systems did not offer prognostic estimates, and one system employed the full health monitoring suite but lacked operations and maintenance support. Each product is briefly described in Appendix A. Selection of the most appropriate software package for a particular application will depend on the chosen component, system, or structure. Ongoing research will determine the most appropriate choices for a successful demonstration of PHM systems in aging NPPs.

  1. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    NASA Astrophysics Data System (ADS)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.
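    A minimal sketch of how the "flows" component of the telecoupling framework might be quantified as directed exchanges between a focal system and other systems. The system names, flow types, and magnitudes below are invented for illustration and are not part of the Telecoupling Toolbox.

```python
# Hypothetical sketch of the telecoupling "flows" component: directed flows
# (e.g., trade, migration) between a focal system and other systems, reduced
# to a net exchange per partner. All names and values are invented.

from collections import defaultdict

# (sending system, receiving system, flow magnitude)
flows = [
    ("focal", "A", 120.0),   # e.g., exported goods
    ("A", "focal", 45.0),    # e.g., remittances back
    ("focal", "B", 30.0),
    ("B", "focal", 80.0),
]

def net_flows(flow_list, focal="focal"):
    """Net outflow from the focal system to each partner (negative = net inflow)."""
    net = defaultdict(float)
    for src, dst, amount in flow_list:
        if src == focal:
            net[dst] += amount
        elif dst == focal:
            net[src] -= amount
    return dict(net)

print(net_flows(flows))   # {'A': 75.0, 'B': -50.0}
```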

  2. Simulink/PARS Integration Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vacaliuc, B.; Nakhaee, N.

    2013-12-18

    The state of the art for signal processor hardware has far outpaced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple processors, a heterogeneous set of processors, into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS”, whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real-time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high-level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA describes the collaboration between ORNL and Sundance DSP, Inc.

  3. Web-based metabolic network visualization with a zooming user interface

    PubMed Central

    2011-01-01

    Background Displaying complex metabolic-map diagrams in Web browsers, and allowing users to interact with them to query and to overlay expression data, is challenging. Description We present a Web-based metabolic-map diagram, which can be interactively explored by the user, called the Cellular Overview. The main characteristic of this application is the zooming user interface, enabling the user to focus on appropriate granularities of the network at will. Various searching commands are available to visually highlight sets of reactions, pathways, enzymes, metabolites, and so on. Expression data from single or multiple experiments can be overlaid on the diagram, which we call the Omics Viewer capability. The application provides Web services to highlight the diagram and to invoke the Omics Viewer. This application is entirely written in JavaScript for the client browsers and connects to a Pathway Tools Web server to retrieve data and diagrams. It uses the OpenLayers library to display tiled diagrams. Conclusions This new online tool is capable of displaying large and complex metabolic-map diagrams in a very interactive manner. This application is available as part of the Pathway Tools software that powers multiple metabolic databases including BioCyc.org; the Cellular Overview is accessible under the Tools menu. PMID:21595965

  4. Enhancing knowledge discovery from cancer genomics data with Galaxy

    PubMed Central

    Albuquerque, Marco A.; Grande, Bruno M.; Ritch, Elie J.; Pararajalingam, Prasath; Jessa, Selin; Krzywinski, Martin; Grewal, Jasleen K.; Shah, Sohrab P.; Boutros, Paul C.

    2017-01-01

    Abstract The field of cancer genomics has demonstrated the power of massively parallel sequencing techniques to inform on the genes and specific alterations that drive tumor onset and progression. Although large comprehensive sequence data sets continue to be made increasingly available, data analysis remains an ongoing challenge, particularly for laboratories lacking dedicated resources and bioinformatics expertise. To address this, we have produced a collection of Galaxy tools that represent many popular algorithms for detecting somatic genetic alterations from cancer genome and exome data. We developed new methods for parallelization of these tools within Galaxy to accelerate runtime and have demonstrated their usability and summarized their runtimes on multiple cloud service providers. Some tools represent extensions or refinement of existing toolkits to yield visualizations suited to cohort-wide cancer genomic analysis. For example, we present Oncocircos and Oncoprintplus, which generate data-rich summaries of exome-derived somatic mutation. Workflows that integrate these to achieve data integration and visualizations are demonstrated on a cohort of 96 diffuse large B-cell lymphomas and enabled the discovery of multiple candidate lymphoma-related genes. Our toolkit is available from our GitHub repository as Galaxy tool and dependency definitions and has been deployed using virtualization on multiple platforms including Docker. PMID:28327945

  5. Software tools for interactive instruction in radiologic anatomy.

    PubMed

    Alvarez, Antonio; Gold, Garry E; Tobin, Brian; Desser, Terry S

    2006-04-01

    To promote active learning in an introductory Radiologic Anatomy course through the use of computer-based exercises. DICOM datasets from our hospital PACS system were transferred to a networked cluster of desktop computers in a medical school classroom. Medical students in the Radiologic Anatomy course were divided into four small groups and assigned to work on a clinical case for 45 minutes. The groups used iPACS viewer software, a free DICOM viewer, to view images and annotate anatomic structures. The classroom instructor monitored and displayed each group's work sequentially on the master screen by running SynchronEyes, a software tool for controlling PC desktops remotely. Students were able to execute the assigned tasks using the iPACS software with minimal oversight or instruction. Course instructors displayed each group's work on the main display screen of the classroom as the students presented the rationale for their decisions. The interactive component of the course received high ratings from the students and overall course ratings were higher than in prior years when the course was given solely in lecture format. DICOM viewing software is an excellent tool for enabling students to learn radiologic anatomy from real-life clinical datasets. Interactive exercises performed in groups can be powerful tools for stimulating students to learn radiologic anatomy.

  6. Enhancing knowledge discovery from cancer genomics data with Galaxy.

    PubMed

    Albuquerque, Marco A; Grande, Bruno M; Ritch, Elie J; Pararajalingam, Prasath; Jessa, Selin; Krzywinski, Martin; Grewal, Jasleen K; Shah, Sohrab P; Boutros, Paul C; Morin, Ryan D

    2017-05-01

    The field of cancer genomics has demonstrated the power of massively parallel sequencing techniques to inform on the genes and specific alterations that drive tumor onset and progression. Although large comprehensive sequence data sets continue to be made increasingly available, data analysis remains an ongoing challenge, particularly for laboratories lacking dedicated resources and bioinformatics expertise. To address this, we have produced a collection of Galaxy tools that represent many popular algorithms for detecting somatic genetic alterations from cancer genome and exome data. We developed new methods for parallelization of these tools within Galaxy to accelerate runtime and have demonstrated their usability and summarized their runtimes on multiple cloud service providers. Some tools represent extensions or refinement of existing toolkits to yield visualizations suited to cohort-wide cancer genomic analysis. For example, we present Oncocircos and Oncoprintplus, which generate data-rich summaries of exome-derived somatic mutation. Workflows that integrate these to achieve data integration and visualizations are demonstrated on a cohort of 96 diffuse large B-cell lymphomas and enabled the discovery of multiple candidate lymphoma-related genes. Our toolkit is available from our GitHub repository as Galaxy tool and dependency definitions and has been deployed using virtualization on multiple platforms including Docker. © The Author 2017. Published by Oxford University Press.

  7. Application priority of GSHP systems in the climate conditions of the United States

    DOE PAGES

    Cho, Soolyeon; Ray, Saurabh; Im, Piljae; ...

    2017-05-15

    Building energy-performance simulation programs are powerful tools for many aspects of feasibility studies regarding ground source heat pumps (GSHPs). However, the limitations of energy modelling programs, the difficulty of predicting energy performance early in the design process, and the complicated functionality of these programs make them harder to use and less practical. The interactive tool developed in this study seeks to provide analysis information in a manner that is straightforward, inexpensive, convenient, and sophisticated. This tool uses an inclusive approach to assess the feasibility of GSHPs by prescreening critical factors such as climate conditions, ground temperatures, energy use, and cost savings. It is interactive and enables the user to perform a feasibility analysis with a weighting factor for each feasibility criterion based on the user's preference and interests. The application of the tool is illustrated with feasibility scores for 15 representative cities in various climatic conditions across the US. Results for commercial buildings show that GSHP systems are more feasible in cold and dry, cool and humid, and very cold areas than in warm and dry, very hot and humid, and mixed marine areas, and that most feasibility levels fall in the good and moderate ranges.
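    The weighting-factor scoring the abstract describes can be sketched as a weighted average over criterion scores. The criterion names, scores, and weights below are invented and do not come from the actual tool.

```python
# Minimal sketch (not the actual GSHP tool) of weighted-criteria scoring:
# each feasibility criterion gets a 0-1 score and a user-chosen weight;
# the overall feasibility is the weighted average. All values are invented.

def feasibility_score(scores: dict, weights: dict) -> float:
    """Weighted average of criterion scores; weights need not sum to 1."""
    total_weight = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Invented example values for one city; "climate" is weighted most heavily:
scores = {"climate": 0.9, "ground_temp": 0.8, "energy_use": 0.6, "cost_savings": 0.7}
weights = {"climate": 2.0, "ground_temp": 1.0, "energy_use": 1.0, "cost_savings": 1.0}

print(round(feasibility_score(scores, weights), 3))   # 0.78
```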

  8. Exploring the use of concept chains to structure teacher trainees' understanding of science

    NASA Astrophysics Data System (ADS)

    Machin, Janet; Varleys, Janet; Loxley, Peter

    2004-12-01

    This paper reports on a paper and pencil concept-sorting strategy that enables trainee teachers to restructure their knowledge in any one domain of science. It is used as a self-study tool, mainly to enable them to break down and understand the progression of concepts beyond the level at which they have to teach. The strategy involves listing key ideas in an increasingly complex and inclusive fashion such that a 'chain' is developed where the initial statements are simple and the final ones more complex. Evaluation of the strategy with trainees over a five-year period revealed promising potential for the strategy as a self-study tool, as well as an audit tool, enabling tutors to more easily identify misconceptions. There was some evidence that trainees found the strategy useful in preparing themselves to teach in the classroom, possibly by enabling meaningful learning to take place according to the Ausubel-Novak-Gowin theory.

  9. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.
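    The kind of tradeoff a conceptual design tool like the DSS evaluates, e.g. how conversion and distribution efficiencies determine delivered bus power, can be illustrated with a simple calculation. The source power and stage efficiencies below are assumed values, not DSS outputs.

```python
# Illustrative tradeoff calculation of the kind a conceptual power-system
# design tool evaluates: bus power delivered after a chain of conversion and
# distribution stages. Source power and efficiencies are invented values.

def bus_power_w(source_w: float, *stage_efficiencies: float) -> float:
    """Power delivered to the bus after each stage's efficiency is applied."""
    p = source_w
    for eta in stage_efficiencies:
        p *= eta
    return p

# A 10 kW source through a 95% converter and a 92% distribution network:
print(round(bus_power_w(10_000, 0.95, 0.92)))   # 8740
```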

  10. CD control with defect inspection: you can teach an old dog a new trick

    NASA Astrophysics Data System (ADS)

    Utzny, Clemens; Ullrich, Albrecht; Heumann, Jan; Mohn, Elias; Meusemann, Stefan; Seltmann, Rolf

    2012-11-01

    Achieving the required critical dimensions (CD) with the best possible uniformity (CDU) on photo-masks has always played a pivotal role in enabling chip technology. Current control strategies are based on scanning electron microscopy (SEM) measurements, implying a sparse spatial resolution on the order of ~10^-2 m to 10^-1 m. A higher spatial resolution could be reached with an adequate measurement sampling; however, the increase in the number of measurements makes this approach infeasible in a production environment. With the advent of more powerful defect inspection tools, a significantly higher spatial resolution of 10^-4 m can be achieved by also measuring CD during the regular defect inspection. This method is not limited to specific measurement features, thus paving the way to a CD assessment of all electrically relevant mask patterns. Enabling such a CD measurement opens new realms of CD control. Deterministic short-range CD effects which were previously interpreted as noise can be resolved and addressed by CD compensation methods. This in turn can lead to substantial improvements of the CD uniformity. Thus the defect-inspection-mediated CD control closes a substantial gap in the mask manufacturing process by allowing the control of short-range CD effects which were until now beyond the reach of regular CD SEM based control strategies. This increase in spatial resolution also counters the decrease in measurement precision due to the usage of an optical system. In this paper we present detailed results on a) the CD data generated during the inspection process, b) the analytical tools needed for relating this data to CD SEM measurements, and c) how the CD inspection process enables a new dimension of CD compensation within the mask manufacturing process. We find that the inspection-based CD measurement typically generates around 500,000 measurements with a homogeneous covering of the active mask area.
In comparing the CD inspection results with CD SEM measurements on a single-measurement-point basis, we find that optical limitations of the inspection tool play a substantial role within the photon-based inspection process. Once these shifts are characterized and removed, a correlation coefficient of 0.9 between the two CD measurement techniques is found. This finding agrees well with a signature-based matching approach. Based on these findings we set up a dedicated pooling algorithm which performs outlier removal for all CD inspections together with a data clustering according to feature-specific tool-induced shifts. This way tool-induced shift effects can be removed and CD signature computation is enabled. A statistical model of the CD signatures relates the mask design parameters on the relevant length scales to CD effects, thus enabling the computation of CD compensation maps. The compensation maps address the CD effects on various distinct length scales, and we show that long- and short-range contributions to the CD variation are decreased. We find that the CD uniformity is improved by 25% using this novel CD compensation strategy.
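    The outlier removal and correlation analysis described in this record can be sketched with standard statistics. The CD values below are invented, and this is not the authors' pooling algorithm, only an illustration of the two basic steps.

```python
# Sketch of the two statistical steps the record describes, on invented data:
# (1) drop measurements beyond n-sigma of the mean, (2) Pearson-correlate
# inspection-based CD against CD SEM reference values.

from statistics import mean, stdev

def drop_outliers(xs, n_sigma=3.0):
    """Keep values within n_sigma sample standard deviations of the mean."""
    m, s = mean(xs), stdev(xs)
    return [x for x in xs if abs(x - m) <= n_sigma * s]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

inspection_cd = [100.1, 100.4, 99.8, 100.9, 99.5, 100.2]   # invented values, nm
cd_sem        = [100.0, 100.5, 99.7, 101.0, 99.6, 100.3]   # invented values, nm
print(round(pearson(inspection_cd, cd_sem), 3))
```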

  11. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a database for storing system designs and results of analysis.

  12. Freeform Fluidics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Love, Lonnie J; Richardson, Bradley S; Lind, Randall F

    This work explores the integration of miniaturized fluid power and additive manufacturing. Oak Ridge National Laboratory (ORNL) has been developing an approach to miniaturized fluidic actuation and control that enables high dexterity, low cost, and a pathway towards energy efficiency. Previous work focused on mesoscale digital control valves (high pressure, low flow) and the integration of actuation and fluid passages directly with the structure, the primary application being fluid-powered robotics. The fundamental challenge was part complexity. Additive manufacturing technologies (e-beam, laser, and ultrasonic deposition) enable freeform manufacturing using conventional metal alloys with excellent mechanical properties. The combination of these two technologies (miniaturized fluid power and additive manufacturing) can enable a paradigm shift in fluid power, increasing efficiency while simultaneously reducing weight, size, complexity, and cost.

  13. Functional specifications for AI software tools for electric power applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, W.S.

    1985-08-01

    The principal barrier to the introduction of artificial intelligence (AI) technology to the electric power industry has not been a lack of interest or appropriate problems, for the industry abounds in both. Like most others, however, the electric power industry lacks the personnel - knowledge engineers - with the special combination of training and skills that AI programming demands. Conversely, very few AI specialists are conversant with electric power industry problems and applications. The recent availability of sophisticated AI programming environments is doing much to alleviate this shortage. These products provide a set of powerful and usable software tools that enable even non-AI scientists to rapidly develop AI applications. The purpose of this project was to develop functional specifications for programming tools that, when integrated with existing general-purpose knowledge engineering tools, would expedite the production of AI applications for the electric power industry. Twelve potential applications, representative of major problem domains within the nuclear power industry, were analyzed in order to identify those tools that would be of greatest value in application development. Eight tools were specified, including facilities for power plant modeling, database inquiry, simulation, and machine-machine interface.

  14. Non-invasive red light optogenetic pacing and optical coherence microscopy (OCM) imaging for drosophila melanogaster (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Men, Jing; Li, Airong; Jerwick, Jason; Tanzi, Rudolph E.; Zhou, Chao

    2017-02-01

    Cardiac pacing could be a powerful tool for investigating mammalian cardiac electrical conduction systems as well as for treatment of certain cardiac pathologies. However, traditional electrical pacing using a pacemaker requires an invasive surgical procedure. Electrical currents from the implanted electrodes can also damage heart tissue, further restricting its utility. Optogenetic pacing has been developed as a promising, non-invasive alternative to electrical stimulation for controlling animal heart rhythms. It induces heart contractions by shining pulsed light on transgene-generated microbial opsins, which in turn activate light-gated ion channels in animal hearts. However, commonly used opsins in optogenetic pacing, such as channelrhodopsin-2 (ChR2), require short-wavelength stimulation light (475 nm), which is strongly absorbed and scattered by tissue. Here, we performed optogenetic pacing by expressing recently engineered red-shifted microbial opsins, ReaChR and CsChrimson, in a well-established animal model, Drosophila melanogaster, using 617 nm stimulation light pulses. The OCM technique enables non-invasive optical imaging of animal hearts at high speed and with ultrahigh axial and transverse resolutions. We integrated a customized OCM system with the optical stimulation system to monitor the optogenetic pacing noninvasively. The use of red-shifted opsins enabled deeper penetration of the stimulating light at lower power, which is promising for applications of optogenetic pacing in mammalian cardiac pathology studies or clinical treatments in the future.

  15. Next-generation technologies for spatial proteomics: Integrating ultra-high speed MALDI-TOF and high mass resolution MALDI FTICR imaging mass spectrometry for protein analysis.

    PubMed

    Spraggins, Jeffrey M; Rizzo, David G; Moore, Jessica L; Noto, Michael J; Skaar, Eric P; Caprioli, Richard M

    2016-06-01

    MALDI imaging mass spectrometry is a powerful analytical tool enabling the visualization of biomolecules in tissue. However, there are unique challenges associated with protein imaging experiments including the need for higher spatial resolution capabilities, improved image acquisition rates, and better molecular specificity. Here we demonstrate the capabilities of ultra-high speed MALDI-TOF and high mass resolution MALDI FTICR IMS platforms as they relate to these challenges. High spatial resolution MALDI-TOF protein images of rat brain tissue and cystic fibrosis lung tissue were acquired at image acquisition rates >25 pixels/s. Structures as small as 50 μm were spatially resolved and proteins associated with host immune response were observed in cystic fibrosis lung tissue. Ultra-high speed MALDI-TOF enables unique applications including megapixel molecular imaging as demonstrated for lipid analysis of cystic fibrosis lung tissue. Additionally, imaging experiments using MALDI FTICR IMS were shown to produce data with high mass accuracy (<5 ppm) and resolving power (∼75 000 at m/z 5000) for proteins up to ∼20 kDa. Analysis of clear cell renal cell carcinoma using MALDI FTICR IMS identified specific proteins localized to healthy tissue regions, within the tumor, and also in areas of increased vascularization around the tumor. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Green Power Partner Resources

    EPA Pesticide Factsheets

    EPA Green Power Partners can access tools and resources to help promote their green power commitments. Partners use these tools to communicate the benefits of their green power use to their customers, stakeholders, and the general public.

  17. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of the text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
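
    The "parallelization approach" above amounts to a map-and-merge pattern over document chunks. As a minimal sketch (assuming nothing about the engine's actual API, which the abstract does not describe), term counting can be parallelized like this:

```python
# Toy data-parallel term counting: each worker counts terms in its share of
# documents, then the partial counts are merged. Illustrative only -- the
# actual text processing engine and its interfaces are not shown in the text.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_terms(doc):
    """Map step: term frequencies for one document."""
    return Counter(doc.lower().split())

def parallel_term_counts(documents, workers=2):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(count_terms, documents))
    total = Counter()                  # Reduce step: merge partial counts
    for partial in partials:
        total.update(partial)
    return total

docs = ["scalable visual analytics", "scalable text processing"]
counts = parallel_term_counts(docs)
print(counts["scalable"])  # 2
```

    Near-linear scaling in such a scheme depends on the merge step staying cheap relative to the per-chunk work, which is why the multi-gigabyte runs are the interesting case.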

  18. Technology Advances Enabling a New Class of Hybrid Underwater Vehicles

    NASA Astrophysics Data System (ADS)

    Bowen, A.

    2016-02-01

    Both tethered (ROV) and untethered (AUV) systems have proven to be highly valuable tools for a range of applications undersea. Certain enabling technologies, coupled with recent advances in robotic systems, make it possible to consider supplementing many of the functions performed by these platforms with appropriately designed semi-autonomous vehicles that may be less expensive to operate than traditional deep-water ROVs. Such vehicles can be deployed from smaller ships and may lead to sea-floor resident systems able to perform a range of interventions under direct human control when required. These systems are effectively a hybrid cross between ROV and AUV vehicles and are poised to enable an important new class of undersea vehicle. It is now possible to radically redefine the meaning of the words "tethered vehicle" to include virtual tethering via acoustic and optical means or through the use of small-diameter re-useable tethers, providing not power but only high-bandwidth communications. Recent developments at Woods Hole Oceanographic Institution (WHOI) pave the way for a derivative vehicle type able to perform a range of interventions in deep water. Such battery-powered, hybrid-tethered vehicles will be able to perform tasks that might otherwise require a conventional ROV. These functions will be possible from less complex ships because of a greatly reduced dependence on large, heavy tethers and associated vehicle handling equipment. In certain applications, such vehicles can be resident within subsea facilities, able to provide operators with near-instant access when required. Several key emerging technologies and capabilities make such a vehicle possible. Advances in both acoustic and optical "wireless" underwater communications and micro-tethers, as pioneered by the HROV Nereus, offer the potential to transform ROV-type operations and thus offer planners and designers an important new dimension to subsea robotic intervention.

  19. Direct analysis in real time high resolution mass spectrometry as a tool for rapid characterization of mind-altering plant materials and revelation of supplement adulteration--The case of Kanna.

    PubMed

    Lesiak, Ashton D; Cody, Robert B; Ubukata, Masaaki; Musah, Rabi A

    2016-03-01

    We demonstrate the utility of direct analysis in real time ionization coupled with high resolution time-of-flight mass spectrometry (DART-HRTOFMS) in revealing the adulteration of commercially available Sceletium tortuosum, a mind-altering plant-based drug commonly known as Kanna. Accurate masses consistent with alkaloids previously isolated from S. tortuosum plant material enabled identification of the products as Kanna, and in-source collision-induced dissociation (CID) confirmed the presence of one of these alkaloids, hordenine, while simultaneously revealing the presence of an adulterant. The stimulant ephedrine, which has been banned in herbal products and supplements, was confirmed to be present in a sample through the use of in-source CID. High-throughput DART-HRTOFMS was shown to be a powerful tool to not only screen plant-based drugs of abuse for psychotropic alkaloids, but also to reveal the presence of scheduled substances and adulterants. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Time-gated detection of protein-protein interactions with transcriptional readout

    PubMed Central

    Sanchez, Mateo I; Coukos, Robert; von Zastrow, Mark

    2017-01-01

    Transcriptional assays, such as yeast two-hybrid and TANGO, that convert transient protein-protein interactions (PPIs) into stable expression of transgenes are powerful tools for PPI discovery, screens, and analysis of cell populations. However, such assays often have high background and lose information about PPI dynamics. We have developed SPARK (Specific Protein Association tool giving transcriptional Readout with rapid Kinetics), in which proteolytic release of a membrane-tethered transcription factor (TF) requires both a PPI to deliver a protease proximal to its cleavage peptide and blue light to uncage the cleavage site. SPARK was used to detect 12 different PPIs in mammalian cells, with 5 min temporal resolution and signal ratios up to 37. By shifting the light window, we could reconstruct PPI time-courses. Combined with FACS, SPARK enabled 51-fold enrichment of PPI-positive over PPI-negative cells. Due to its high specificity and sensitivity, SPARK has the potential to advance PPI analysis and discovery. PMID:29189201

  1. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
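
    To make the first flaw concrete: in CPM the duration of each task is a fixed input and the schedule is simply the longest path through the dependency graph. A minimal sketch (the task graph below is hypothetical, and this is not the authors' tool):

```python
# Minimal Critical Path Method sketch. Durations are fixed inputs -- exactly
# the limitation the resource-based simulation approach is meant to remove.
# The task graph is hypothetical.

def critical_path(tasks):
    """tasks: {name: (duration, [predecessor names])} -> (length, path)."""
    finish = {}     # earliest finish time per task
    best_pred = {}  # predecessor on the longest chain into each task

    def earliest_finish(name):
        if name not in finish:
            duration, preds = tasks[name]
            start = 0
            for p in preds:
                if earliest_finish(p) > start:
                    start = earliest_finish(p)
                    best_pred[name] = p
            finish[name] = start + duration
        return finish[name]

    end = max(tasks, key=earliest_finish)
    path = [end]
    while path[-1] in best_pred:          # walk back along the critical chain
        path.append(best_pred[path[-1]])
    return finish[end], list(reversed(path))

tasks = {
    "design": (3, []),
    "build":  (5, ["design"]),
    "test":   (2, ["build"]),
    "docs":   (4, ["design"]),
}
length, path = critical_path(tasks)
print(length, path)  # 10 ['design', 'build', 'test']
```

    A resource-based model would instead derive each duration from assigned staff and productivity, so the critical path itself can change as the simulation runs.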

  2. In situ visualization and data analysis for turbidity currents simulation

    NASA Astrophysics Data System (ADS)

    Camata, Jose J.; Silva, Vítor; Valduriez, Patrick; Mattoso, Marta; Coutinho, Alvaro L. G. A.

    2018-01-01

    Turbidity currents are underflows responsible for sediment deposits that generate geological formations of interest for the oil and gas industry. LibMesh-sedimentation is an application built upon the libMesh library to simulate turbidity currents. In this work, we present the integration of libMesh-sedimentation with in situ visualization and in transit data analysis tools. DfAnalyzer is a solution based on provenance data to extract and relate strategic simulation data in transit from multiple data for online queries. We integrate libMesh-sedimentation and ParaView Catalyst to perform in situ data analysis and visualization. We present a parallel performance analysis for two turbidity currents simulations showing that the overhead for both in situ visualization and in transit data analysis is negligible. We show that our tools enable monitoring the sediments appearance at runtime and steer the simulation based on the solver convergence and visual information on the sediment deposits, thus enhancing the analytical power of turbidity currents simulations.

  3. The Reactome Pathway Knowledgebase

    PubMed Central

    Jupe, Steven; Matthews, Lisa; Sidiropoulos, Konstantinos; Gillespie, Marc; Garapati, Phani; Haw, Robin; Jassal, Bijay; Korninger, Florian; May, Bruce; Milacic, Marija; Roca, Corina Duenas; Rothfels, Karen; Sevilla, Cristoffer; Shamovsky, Veronica; Shorser, Solomon; Varusai, Thawfeek; Viteri, Guilherme; Weiser, Joel

    2018-01-01

    Abstract The Reactome Knowledgebase (https://reactome.org) provides molecular details of signal transduction, transport, DNA replication, metabolism, and other cellular processes as an ordered network of molecular transformations—an extended version of a classic metabolic map, in a single consistent data model. Reactome functions both as an archive of biological processes and as a tool for discovering unexpected functional relationships in data such as gene expression profiles or somatic mutation catalogues from tumor cells. To support the continued brisk growth in the size and complexity of Reactome, we have implemented a graph database, improved performance of data analysis tools, and designed new data structures and strategies to boost diagram viewer performance. To make our website more accessible to human users, we have improved pathway display and navigation by implementing interactive Enhanced High Level Diagrams (EHLDs) with an associated icon library, and subpathway highlighting and zooming, in a simplified and reorganized web site with adaptive design. To encourage re-use of our content, we have enabled export of pathway diagrams as ‘PowerPoint’ files. PMID:29145629

  4. Simulation of an Asynchronous Machine by using a Pseudo Bond Graph

    NASA Astrophysics Data System (ADS)

    Romero, Gregorio; Felez, Jesus; Maroto, Joaquin; Martinez, M. Luisa

    2008-11-01

    For engineers, computer simulation is a basic tool, since it enables them to understand how systems work without actually needing to see them. They can learn how systems behave in different circumstances and optimize their design with considerably less cost in terms of time and money than if they had to carry out tests on a physical system. However, if computer simulation is to be reliable, it is essential for the simulation model to be validated. There is a wide range of commercial brands on the market offering products for electrical-domain simulation (SPICE, LabVIEW, PSCAD, Dymola, Simulink, Simplorer, ...). These are powerful tools, but they require the engineer to have a perfect knowledge of the electrical field. This paper shows an alternative methodology to simulate an asynchronous machine using the multidomain Bond Graph technique and to apply it in any program that permits the simulation of models based on this technique; no extraordinary knowledge of the technique or of the electrical field is required to understand the process.

  5. Direct Metal Deposition of H13 Tool Steel on Copper Alloy Substrate: Parametric Investigation

    NASA Astrophysics Data System (ADS)

    Imran, M. Khalid; Masood, S. H.; Brandt, Milan

    2015-12-01

    Over the past decade, researchers have demonstrated interest in tribology and prototyping by the laser-aided material deposition process. Laser-aided direct metal deposition (DMD) enables the formation of a uniform clad by melting powder to form the desired component from metal powder materials. In this research, H13 tool steel was clad onto a copper alloy substrate using DMD. The effects of laser parameters on the quality of the DMD-deposited clad have been investigated and acceptable processing parameters have been determined, largely through trial-and-error approaches. The relationships between DMD process parameters and product characteristics such as porosity, micro-cracks and microhardness have been analysed using a scanning electron microscope (SEM), image analysis software (ImageJ) and a microhardness tester. It has been found that DMD parameters such as laser power, powder mass flow rate, feed rate and focus size play an important role in clad quality and crack formation.

  6. Using the Browser for Science: A Collaborative Toolkit for Astronomy

    NASA Astrophysics Data System (ADS)

    Connolly, A. J.; Smith, I.; Krughoff, K. S.; Gibson, R.

    2011-07-01

    Astronomical surveys have yielded hundreds of terabytes of catalogs and images that span many decades of the electromagnetic spectrum. Even when observatories provide user-friendly web interfaces, exploring these data resources remains a complex and daunting task. In contrast, gadgets and widgets have become popular in social networking (e.g. iGoogle, Facebook). They provide a simple way to make complex data easily accessible that can be customized based on the interest of the user. With ASCOT (an AStronomical COllaborative Toolkit) we expand on these concepts to provide a customizable and extensible gadget framework for use in science. Unlike iGoogle, where all of the gadgets are independent, the gadgets we develop communicate and share information, enabling users to visualize and interact with data through multiple, simultaneous views. With this approach, web-based applications for accessing and visualizing data can be generated easily and, by linking these tools together, integrated and powerful data analysis and discovery tools can be constructed.

  7. XLinkDB 2.0: integrated, large-scale structural analysis of protein crosslinking data

    PubMed Central

    Schweppe, Devin K.; Zheng, Chunxiang; Chavez, Juan D.; Navare, Arti T.; Wu, Xia; Eng, Jimmy K.; Bruce, James E.

    2016-01-01

    Motivation: Large-scale chemical cross-linking with mass spectrometry (XL-MS) analyses are quickly becoming a powerful means for high-throughput determination of protein structural information and protein–protein interactions. Recent studies have garnered thousands of cross-linked interactions, yet the field lacks an effective tool to compile experimental data or access the network and structural knowledge for these large scale analyses. We present XLinkDB 2.0 which integrates tools for network analysis, Protein Databank queries, modeling of predicted protein structures and modeling of docked protein structures. The novel, integrated approach of XLinkDB 2.0 enables the holistic analysis of XL-MS protein interaction data without limitation to the cross-linker or analytical system used for the analysis. Availability and Implementation: XLinkDB 2.0 can be found here, including documentation and help: http://xlinkdb.gs.washington.edu/. Contact: jimbruce@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153666

  8. Education and Outreach with the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Eisenhamer, B.; Raddick, M. J.; Mattson, B. J.; Harris, J.

    2012-01-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. Many future missions will also be incorporated into the VAO tools when they launch. The Education and Public Outreach (E/PO) program for the VAO is led by the Space Telescope Science Institute in collaboration with the HEASARC E/PO program and Johns Hopkins University. VAO E/PO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public, formal education, and informal education communities. Our E/PO efforts will be structured to provide uniform access to VAO information, enabling educational opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that many VO programs have built powerful tools for E/PO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. We are building partnerships with Microsoft, Zooniverse, and NASA's Night Sky Network to leverage the communities and tools that already exist to meet the needs of our audiences. Our formal education program is standards-based and aims to give teachers the tools to use real astronomical data to teach the STEM subjects. To determine which tools the VAO will incorporate into the formal education program, needs assessments will be conducted with educators across the U.S.

  9. Enabling Real-Time Volume Rendering of Functional Magnetic Resonance Imaging on an iOS Device.

    PubMed

    Holub, Joseph; Winer, Eliot

    2017-12-01

    Powerful non-invasive imaging technologies like computed tomography (CT), ultrasound, and magnetic resonance imaging (MRI) are used daily by medical professionals to diagnose and treat patients. While 2D slice viewers have long been the standard, many tools allowing 3D representations of digital medical data are now available. The newest imaging advancement, functional MRI (fMRI) technology, has changed medical imaging from viewing static to dynamic physiology (4D) over time, particularly to study brain activity. Add this to the rapid adoption of mobile devices for everyday work and the need to visualize fMRI data on tablets or smartphones arises. However, there are few mobile tools available to visualize 3D MRI data, let alone 4D fMRI data. Building volume rendering tools on mobile devices to visualize 3D and 4D medical data is challenging given the limited computational power of the devices. This paper describes research that explored the feasibility of performing real-time 3D and 4D volume raycasting on a tablet device. The prototype application was tested on a 9.7" iPad Pro using two different fMRI datasets of brain activity. The results show that mobile raycasting is able to achieve between 20 and 40 frames per second for traditional 3D datasets, depending on the sampling interval, and up to 9 frames per second for 4D data. While the prototype application did not always achieve true real-time interaction, these results clearly demonstrated that visualizing 3D and 4D digital medical data is feasible with a properly constructed software framework.
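
    The sampling-interval/frame-rate tradeoff reported above can be seen even in a toy CPU raycaster. The sketch below performs an orthographic maximum-intensity projection with NumPy (the paper's iOS GPU raycaster is of course far more involved): a coarser step reads fewer samples per ray, but can miss thin features entirely.

```python
# Orthographic max-intensity-projection raycast: one ray per (y, x) pixel,
# sampled along z every `step` voxels. A toy sketch of the sampling loop,
# not the paper's GPU implementation.
import numpy as np

def mip_raycast(volume, step=1):
    samples = volume[::step, :, :]   # coarser step -> fewer samples per ray
    return samples.max(axis=0)       # max intensity along each ray

vol = np.zeros((64, 8, 8), dtype=np.float32)
vol[10, 2, 3] = 1.0                  # a single bright voxel at depth 10

img_fine = mip_raycast(vol, step=1)    # hits the voxel: img_fine[2, 3] is 1.0
img_coarse = mip_raycast(vol, step=4)  # samples z = 0, 4, 8, ... and misses it
```

    Increasing the sampling interval is therefore a direct speed-versus-fidelity knob, which matches the reported dependence of frame rate on sampling interval.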

  10. Verification and Validation Strategy for LWRS Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carl M. Stoots; Richard R. Schultz; Hans D. Gougar

    2012-09-01

    One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203 the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high fidelity models / codes, and scaling of aging effects.

  11. Anthropogenic Sulphur Dioxide Load over China as Observed from Different Satellite Sensors

    NASA Technical Reports Server (NTRS)

    Koukouli, M. E.; Balis, D. S.; Johannes Van Der A, Ronald; Theys, N.; Hedelt, P.; Richter, A.; Krotkov, N.; Li, Can; Taylor, M.

    2016-01-01

    China, with its rapid economic growth and immense exporting power, has been the focus of many studies during this previous decade quantifying its increasing emissions contribution to the Earth's atmosphere. With a population slowly shifting towards enlarged power and purchasing needs, the ceaseless inauguration of new power plants, smelters, refineries and industrial parks leads infallibly to increases in sulphur dioxide, SO2, emissions. The recent capability of next generation algorithms as well as new space-borne instruments to detect anthropogenic SO2 loads has enabled a fast advancement in this field. In the following work, algorithms providing total SO2 columns over China based on SCIAMACHY/Envisat, OMI/Aura and GOME2/MetopA observations are presented. The need for post-processing and gridding of the SO2 fields is further revealed in this work, following the path of previous publications. Further, it is demonstrated that the usage of appropriate statistical tools permits studying parts of the datasets typically excluded, such as the winter months loads. Focusing on actual point sources, such as megacities and known power plant locations, instead of entire provinces, monthly mean time series have been examined in detail. The sharp decline in SO2 emissions in more than 90% - 95% of the locations studied confirms the recent implementation of government desulphurisation legislation; however, locations with increases, even for the previous five years, are also identified. These belong to provinces with emerging economies which are in haste to install power plants and are possibly viewed leniently by the authorities, in favour of growth. The SO2 load seasonality has also been examined in detail with a novel mathematical tool, with 70% of the point sources having a statistically significant annual cycle with highs in winter and lows in summer, following the heating requirements of the Chinese population.

  12. Anthropogenic sulphur dioxide load over China as observed from different satellite sensors

    NASA Astrophysics Data System (ADS)

    Koukouli, M. E.; Balis, D. S.; van der A, Ronald Johannes; Theys, N.; Hedelt, P.; Richter, A.; Krotkov, N.; Li, C.; Taylor, M.

    2016-11-01

    China, with its rapid economic growth and immense exporting power, has been the focus of many studies during this previous decade quantifying its increasing emissions contribution to the Earth's atmosphere. With a population slowly shifting towards enlarged power and purchasing needs, the ceaseless inauguration of new power plants, smelters, refineries and industrial parks leads infallibly to increases in sulphur dioxide, SO2, emissions. The recent capability of next generation algorithms as well as new space-borne instruments to detect anthropogenic SO2 loads has enabled a fast advancement in this field. In the following work, algorithms providing total SO2 columns over China based on SCIAMACHY/Envisat, OMI/Aura and GOME2/MetopA observations are presented. The need for post-processing and gridding of the SO2 fields is further revealed in this work, following the path of previous publications. Further, it is demonstrated that the usage of appropriate statistical tools permits studying parts of the datasets typically excluded, such as the winter months loads. Focusing on actual point sources, such as megacities and known power plant locations, instead of entire provinces, monthly mean time series have been examined in detail. The sharp decline in SO2 emissions in more than 90%-95% of the locations studied confirms the recent implementation of government desulphurisation legislation; however, locations with increases, even for the previous five years, are also identified. These belong to provinces with emerging economies which are in haste to install power plants and are possibly viewed leniently by the authorities, in favour of growth. The SO2 load seasonality has also been examined in detail with a novel mathematical tool, with 70% of the point sources having a statistically significant annual cycle with highs in winter and lows in summer, following the heating requirements of the Chinese population.
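
    The abstract does not name its "novel mathematical tool", but a standard way to test for an annual cycle in monthly series like these is a least-squares fit of a single annual harmonic. A sketch with synthetic data (not the paper's SO2 measurements, and not necessarily the authors' method):

```python
# Fit y ~ a + b*cos(2*pi*t/12) + c*sin(2*pi*t/12) to a monthly series and
# report the cycle's amplitude and peak month. Generic illustration only;
# the paper's actual seasonality test is not specified in the abstract.
import numpy as np

def fit_annual_cycle(monthly_values):
    y = np.asarray(monthly_values, dtype=float)
    t = np.arange(y.size)
    X = np.column_stack([np.ones(y.size),
                         np.cos(2 * np.pi * t / 12),
                         np.sin(2 * np.pi * t / 12)])
    a, b, c = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = np.hypot(b, c)
    peak_month = (np.arctan2(c, b) * 12 / (2 * np.pi)) % 12  # 0 = first month
    return amplitude, peak_month

# Synthetic 4-year series peaking at t = 0 (winter high, summer low).
t = np.arange(48)
series = (10 + 4 * np.cos(2 * np.pi * t / 12)
          + 0.1 * np.random.default_rng(0).normal(size=48))
amp, peak = fit_annual_cycle(series)
```

    Statistical significance of the cycle would then be judged against the residual variance, for example with an F-test on the two harmonic coefficients.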

  13. Investigating Data Motion Power Trends to Enable Power-Efficient OpenSHMEM Implementations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mintz, Tiffany M; D'Azevedo, Eduardo F.; Gorentla Venkata, Manjunath

    2016-01-01

    As we continue to develop extreme-scale systems, it is becoming increasingly important to be mindful and more in control of power consumed by these systems. With high performance requirements being more constrained by power and data movement quickly becoming the critical concern for both power and performance, now is an opportune time for OpenSHMEM implementations to address the need for more power-efficient data movement. In order to enable power efficient OpenSHMEM implementations, we have formulated power trend studies that emphasize power consumption for one-sided communications and the disparities in power consumption across multiple implementations. In this paper, we present power trend analysis, generate targeted hypotheses for increasing power efficiency with OpenSHMEM, and discuss prospective research for power efficient OpenSHMEM implementations.

  14. Use of high-power diode lasers for hardening and thermal conduction welding of metals

    NASA Astrophysics Data System (ADS)

    Klocke, Fritz; Demmer, Axel; Zaboklicki, A.

    1997-08-01

    CO2 and Nd:YAG high-power lasers have become established as machining tools in industrial manufacturing over the last few years. The most important advantages compared to conventional processing techniques lie in the absence of forces introduced by the laser into the workpiece and in the simple and highly accurate control in terms of positioning and timing, making the laser a universally applicable, wear-free and extremely flexible tool /1,2/. The laser can be utilised cost-effectively in numerous manufacturing processes, but there are also further applications for the laser which produce excellent results from a technical point of view, but are not justified in terms of cost. The extensive use of lasers, particularly in small companies and workshops, is hindered by two main reasons: the complexity and size of the laser source and plant, and the high investment costs /3/. A new generation of lasers, the high-power diode lasers (HDL), combines high performance with a compact design, making the laser a cheap and easy-to-use tool with many applications /3,4,5,6/. In the diode laser, the laser beam is generated by a microelectronic diode which transforms electrical energy directly into laser energy. Diode lasers with low power outputs have, for some time, been making their mark in our everyday lives: they are used in CD players, laser printers and scanners at cash tills. Modern telecommunications would be impossible without these lasers, which enable information to be transmitted in the form of light impulses through optical fibres. They can also be found in compact precision measurement instrumentation - range finders, interferometers and pollutant analysis devices /3,6/. In the field of material processing, the first applications of the laser, such as for soldering, inscribing, surface hardening and plastic or heat conduction welding, will exceed the limits of the relatively low performance output currently available. The diode laser has a shorter wavelength than the CO2 and Nd:YAG lasers, making it more favourable in terms of the absorption behaviour of the laser beam - an advantage that will soon have a significant effect on the range of its applications.

  15. Neuroscience imaging enabled by new highly tunable and high peak power femtosecond lasers

    NASA Astrophysics Data System (ADS)

    Hakulinen, T.; Klein, J.

    2017-02-01

Neuroscience applications benefit from recent developments in industrial femtosecond laser technology. New laser sources provide several megawatts of peak power at a wavelength of 1040 nm, which enables simultaneous optogenetic photoactivation of tens or even hundreds of neurons using red-shifted opsins. Another recent imaging trend is the move towards longer wavelengths, which enables access to deeper layers of tissue due to lower scattering and lower absorption in the tissue. Femtosecond lasers pumping a non-collinear optical parametric amplifier (NOPA) provide access to these longer wavelengths with high peak powers. High peak powers of >10 MW at 1300 nm and 1700 nm allow effective 3-photon excitation of green and red-shifted calcium indicators, respectively, and access to deeper, sub-cortex layers of the brain. Early results include in vivo detection of spontaneous activity in the hippocampus within an intact mouse brain, where neurons express GCaMP6 activated in a 3-photon process at 1320 nm.

  16. Low-cost wireless voltage & current grid monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hines, Jacqueline

This report describes the development and demonstration of a novel low-cost wireless power distribution line monitoring system. This system measures voltage, current, and relative phase on power lines of up to the 35 kV class. The line units operate without any batteries, and without harvesting energy from the power line. Thus, data on grid condition is provided even in outage conditions, when line current is zero. This enhances worker safety by detecting the presence of voltage and current that may appear from stray sources on nominally isolated lines. Availability of low-cost power line monitoring systems will enable widespread monitoring of the distribution grid. Real-time data on local grid operating conditions will enable grid operators to optimize grid operation, implement grid automation, and understand the impact of solar and other distributed sources on grid stability. The latter will enable utilities to implement energy storage and control systems that allow greater penetration of solar into the grid.

  17. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automated Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration and several methods of recording and playback are investigated that include: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.

  18. Broadband sidebands generated by parametric instability in lower hybrid current drive experiments on EAST

    NASA Astrophysics Data System (ADS)

    Amicucci, L.; Ding, B. J.; Castaldo, C.; Cesario, R.; Giovannozzi, E.; Li, M. H.; Tuccillo, A. A.

    2015-12-01

Modern research on nuclear fusion energy, based on the tokamak concept, has a strong need for tools that actively drive non-inductive current, especially at the periphery of the plasma column, where the tools available so far have poor efficiency. This is essential for solving one of the most critical problems for a thermonuclear reactor: achieving the required fusion gain while maintaining sufficient stability. The lower hybrid current drive (LHCD) effect has the potential capability of driving current at large radii of reactor plasmas with high efficiency [1]. Experiments recently carried out on EAST showed that strong activity of LH sideband waves (from the RF probe spectra), accompanied by weak core penetration of the coupled LH power, is present when operating at relatively high plasma densities. Previous theoretical results, confirmed by experiments on FTU, showed that the LH sideband phenomenon is produced by parametric instability (PI), which is mitigated by higher plasma edge temperatures. This condition is thus useful for enabling LH power propagation when operating with profiles having high plasma densities even at the edge. In the present work, we show new PI modeling of EAST plasma data, obtained under conditions of higher plasma edge temperature due to lithium conditioning (lithisation) of the chamber. The obtained trend of the PI frequencies and growth rates is consistent with RF probe spectra data, available for both lithisated and non-lithisated vessel regimes. Moreover, these spectra are interpreted as a PI effect occurring at the periphery of the plasma column, on the low-field side where the LH power is coupled.

  19. Ultra-hard amorphous AlMgB14 films RF sputtered onto curved substrates

    NASA Astrophysics Data System (ADS)

    Grishin, A. M.; Putrolaynen, V. V.; Yuzvyuk, M. H.

    2017-03-01

Recently, hard AlMgB14 (BAM) coatings were deposited for the first time by RF magnetron sputtering using a single stoichiometric ceramic target. High target sputtering power and a sufficiently short target-to-substrate distance were found to be critical processing conditions. They enabled fabrication of stoichiometric, in-depth compositionally homogeneous films with peak values of nanohardness of 88 GPa and Young’s modulus of 517 GPa at a penetration depth of 26 nm and, respectively, 35 GPa and 275 GPa at 200 nm depth in a 2 µm thick film (Grishin et al 2014 JETP Lett. 100 680). The narrow range of sufficiently short target-to-substrate distances makes it impossible to coat non-flat specimens. To achieve ultimate BAM film characteristics on curved surfaces, we developed a two-step sputtering process. The first thin layer is deposited as a template at low RF power, which facilitates layered Frank-van der Merwe growth of a smooth film. The next layer is grown at high RF target sputtering power. The affinity of the subsequent flow of sputtered atoms to the already evenly condensed template fosters the development of a smooth film surface. As an example, we applied a BAM coating to a hemispherical, 5 mm diameter ball made from a hard tool steel and used as the head of a special gauge. Very smooth (6.6 nm RMS surface roughness) and hard AlMgB14 films fabricated onto commercial ball-shaped items enhance the hardness of tool steel specimens by a factor of four.

  20. Next-generation fiber lasers enabled by high-performance components

    NASA Astrophysics Data System (ADS)

    Kliner, D. A. V.; Victor, B.; Rivera, C.; Fanning, G.; Balsley, D.; Farrow, R. L.; Kennedy, K.; Hampton, S.; Hawke, R.; Soukup, E.; Reynolds, M.; Hodges, A.; Emery, J.; Brown, A.; Almonte, K.; Nelson, M.; Foley, B.; Dawson, D.; Hemenway, D. M.; Urbanek, W.; DeVito, M.; Bao, L.; Koponen, J.; Gross, K.

    2018-02-01

    Next-generation industrial fiber lasers enable challenging applications that cannot be addressed with legacy fiber lasers. Key features of next-generation fiber lasers include robust back-reflection protection, high power stability, wide power tunability, high-speed modulation and waveform generation, and facile field serviceability. These capabilities are enabled by high-performance components, particularly pump diodes and optical fibers, and by advanced fiber laser designs. We summarize the performance and reliability of nLIGHT diodes, fibers, and next-generation industrial fiber lasers at power levels of 500 W - 8 kW. We show back-reflection studies with up to 1 kW of back-reflected power, power-stability measurements in cw and modulated operation exhibiting sub-1% stability over a 5 - 100% power range, and high-speed modulation (100 kHz) and waveform generation with a bandwidth 20x higher than standard fiber lasers. We show results from representative applications, including cutting and welding of highly reflective metals (Cu and Al) for production of Li-ion battery modules and processing of carbon fiber reinforced polymers.

  1. Characterization of a novel bioreactor system for 3D cellular mechanobiology studies.

    PubMed

    Cook, Colin A; Huri, Pinar Y; Ginn, Brian P; Gilbert-Honick, Jordana; Somers, Sarah M; Temple, Joshua P; Mao, Hai-Quan; Grayson, Warren L

    2016-08-01

    In vitro engineering systems can be powerful tools for studying tissue development in response to biophysical stimuli as well as for evaluating the functionality of engineered tissue grafts. It has been challenging, however, to develop systems that adequately integrate the application of biomimetic mechanical strain to engineered tissue with the ability to assess functional outcomes in real time. The aim of this study was to design a bioreactor system capable of real-time conditioning (dynamic, uniaxial strain, and electrical stimulation) of centimeter-long 3D tissue engineered constructs simultaneously with the capacity to monitor local strains. The system addresses key limitations of uniform sample loading and real-time imaging capabilities. Our system features an electrospun fibrin scaffold, which exhibits physiologically relevant stiffness and uniaxial alignment that facilitates cell adhesion, alignment, and proliferation. We have demonstrated the capacity for directly incorporating human adipose-derived stromal/stem cells into the fibers during the electrospinning process and subsequent culture of the cell-seeded constructs in the bioreactor. The bioreactor facilitates accurate pre-straining of the 3D constructs as well as the application of dynamic and static uniaxial strains while monitoring bulk construct tensions. The incorporation of fluorescent nanoparticles throughout the scaffolds enables in situ monitoring of local strain fields using fluorescent digital image correlation techniques, since the bioreactor is imaging compatible, and allows the assessment of local sample stiffness and stresses when coupled with force sensor measurements. In addition, the system is capable of measuring the electromechanical coupling of skeletal muscle explants by applying an electrical stimulus and simultaneously measuring the force of contraction. 
The packaging of these technologies, biomaterials, and analytical methods into a single bioreactor system has produced a powerful tool that will enable improved engineering of functional 3D ligaments, tendons, and skeletal muscles. Biotechnol. Bioeng. 2016;113: 1825-1837. © 2016 Wiley Periodicals, Inc.
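The local-strain monitoring idea behind the fluorescent-marker imaging can be illustrated with a minimal 1D sketch: track fiducial marker positions before and after loading and compute the engineering strain of each segment between neighbors. Real digital image correlation operates on 2D/3D image subsets; the marker coordinates below are made-up numbers, not data from this bioreactor.

```python
import numpy as np

def local_strains(ref_positions, def_positions):
    """Estimate local axial strain between consecutive fiducial markers.

    ref_positions : marker x-coordinates in the reference (unstrained) image
    def_positions : the same markers' x-coordinates after deformation
    Returns one engineering strain value per segment between neighboring markers.
    """
    ref = np.asarray(ref_positions, dtype=float)
    dfm = np.asarray(def_positions, dtype=float)
    l0 = np.diff(ref)        # undeformed segment lengths
    l = np.diff(dfm)         # deformed segment lengths
    return (l - l0) / l0     # strain per segment

# Hypothetical markers at 0, 10, 20, 30 um stretched uniformly by 5%
strains = local_strains([0, 10, 20, 30], [0, 10.5, 21.0, 31.5])
```

Non-uniform entries in the returned array would indicate strain concentrations along the construct, which is exactly what a local strain field is meant to reveal.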

  2. GeoPad: Innovative Applications of Information Technology in Field Science Education

    NASA Astrophysics Data System (ADS)

    Knoop, P. A.; van der Pluijm, B.

    2003-12-01

A core requirement for most undergraduate degrees in the Earth sciences is a course in field geology, which provides students with training in field science methodologies, including geologic mapping. The University of Michigan Geological Sciences' curriculum includes a seven-week, summer field course, GS-440, based out of the university's Camp Davis Geologic Field Station, near Jackson, WY. Such field-based courses stand to benefit tremendously from recent innovations in Information Technology (IT), especially in the form of increasing portability, new haptic interfaces for personal computers, and advancements in Geographic Information System (GIS) software. Such innovations are enabling in-the-field, real-time access to powerful data collection, analysis, visualization, and interpretation tools. The benefits of these innovations, however, can only be realized on a broad basis when the IT reaches a level of maturity at which users can easily employ it to enhance their learning experience and scientific activities, rather than the IT itself being a primary focus of the curriculum or a constraint on field activities. The GeoPad represents a combination of these novel technologies that achieves that goal. The GeoPad concept integrates a ruggedized Windows XP TabletPC equipped with wireless networking, a portable GPS receiver, digital camera, microphone-headset, voice-recognition software, GIS, and supporting, digital, geo-referenced data-sets. A key advantage of the GeoPad is enabling field-based usage of visualization software and data focusing on 3D geospatial relationships (developed as part of the complementary GeoWall initiative), which provides a powerful new tool for enhancing and facilitating undergraduate field geology education, as demonstrated during the summer 2003 session of GS-440.
In addition to an education in field methodologies, students also gain practical experience using IT that they will encounter during their continued educational, research, or professional careers. This approach is immediately applicable to field geology courses elsewhere and indeed to other field-oriented programs (e.g., in biology, archeology, ecology), given similar needs.

  3. Final Report to the National Energy Technology Laboratory on FY14- FY15 Cooperative Research with the Consortium for Electric Reliability Technology Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vittal, Vijay; Lampis, Anna Rosa

The Power System Engineering Research Center (PSERC) engages in technological, market, and policy research for an efficient, secure, resilient, adaptable, and economic U.S. electric power system. PSERC, as a founding partner of the Consortium for Electric Reliability Technology Solutions (CERTS), conducted a multi-year program of research for the U.S. Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) to develop new methods, tools, and technologies to protect and enhance the reliability and efficiency of the U.S. electric power system as competitive electricity market structures evolve, and as the grid moves toward wide-scale use of decentralized generation (such as renewable energy sources) and demand-response programs. Phase I of OE’s funding for PSERC, under cooperative agreement DE-FC26-09NT43321, started in fiscal year (FY) 2009 and ended in FY2013. It was administered by DOE’s National Energy Technology Laboratory (NETL) through a cooperative agreement with Arizona State University (ASU). ASU provided sub-awards to the participating PSERC universities. This document is PSERC’s final report to NETL on the activities for OE, conducted through CERTS, from September 2015 through September 2017 utilizing FY 2014 to FY 2015 funding under cooperative agreement DE-OE0000670. PSERC is a thirteen-university consortium with over 30 industry members. Since 1996, PSERC has been engaged in research and education efforts with the mission of “empowering minds to engineer the future electric energy system.” Its work is focused on achieving: • An efficient, secure, resilient, adaptable, and economic electric power infrastructure serving society • A new generation of educated technical professionals in electric power • Knowledgeable decision-makers on critical energy policy issues • Sustained, quality university programs in electric power engineering.
PSERC core research is funded by industry, with a budget supporting approximately 30 principal investigators and some 70 graduate students and other researchers. Its researchers are multi-disciplinary, conducting research in three principal areas: power systems, power markets and policy, and transmission and distribution technologies. The research is collaborative; each project involves researchers typically at two universities working with industry advisors who have expressed interest in the project. Examples of topics for recent PSERC research projects include grid integration of renewables and energy storage, new tools for taking advantage of increased penetration of real-time system measurements, advanced system protection methods to maintain grid reliability, and risk and reliability assessment of increasingly complex cyber-enabled power systems. PSERC’s objective is to proactively address the technical and policy challenges of U.S. electric power systems. To achieve this objective, PSERC works with CERTS to conduct technical research on advanced applications and investigate the design of fair and transparent electricity markets; these research topics align with CERTS research areas 1 and 2: Real-time Grid Reliability Management (Area 1), and Reliability and Markets (Area 2). The CERTS research areas overlap with the PSERC research stems: Power Systems, Power Markets, and Transmission and Distribution Technologies, as described on the PSERC website (see http://www.pserc.org/research/research_program.aspx). The performers were with Arizona State University (ASU), Cornell University (CU), University of California at Berkeley (UCB), and University of Illinois at Urbana-Champaign (UIUC). PSERC research activities in the area of reliability and markets focused on electric market and power policy analyses. The resulting studies suggest ways to frame best practices using organized markets for managing U.S. grid assets reliably and to identify the highest-priority areas for improvement.
PSERC research activities in the area of advanced applications focused on mid- to long-term software research and development, with anticipated outcomes that move innovative ideas toward real-world application. Under the CERTS research area of Real-time Grid Reliability Management, PSERC has been focused on Advanced Applications Research and Development (AARD), a subgroup of activities that works to develop advanced applications and tools to more effectively operate the electricity delivery system, by enabling advanced analysis, visualization, monitoring and alarming, and decision support capabilities for grid operators.

  4. Grid Stability Awareness System (GSAS) Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feuerborn, Scott; Ma, Jian; Black, Clifton

The project team developed a software suite named Grid Stability Awareness System (GSAS) for power system near real-time stability monitoring and analysis based on synchrophasor measurements. The software suite consists of five analytical tools: an oscillation monitoring tool, a voltage stability monitoring tool, a transient instability monitoring tool, an angle difference monitoring tool, and an event detection tool. These tools have been integrated into one framework to provide power grid operators with real-time or near real-time stability status of a power grid as well as historical information about system stability status. These tools are being considered for real-time use in the operation environment.
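The core idea behind an oscillation monitoring tool of this kind can be sketched in a few lines: take a stream of synchrophasor angle-difference samples and locate the dominant oscillatory mode in its spectrum. The 30 samples/s reporting rate, the 0.3 Hz mode, and the signal amplitudes below are illustrative assumptions, not details of GSAS.

```python
import numpy as np

fs = 30.0                        # assumed PMU reporting rate, samples/s
t = np.arange(0, 20, 1 / fs)     # a 20 s analysis window (600 samples)

# Synthetic angle-difference stream: a 0.3 Hz inter-area oscillation plus noise
rng = np.random.default_rng(1)
signal = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * rng.standard_normal(t.size)

# Remove the mean, then find the dominant frequency in the magnitude spectrum
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spectrum)]   # detected oscillation frequency, Hz
```

A production tool would add mode-damping estimation and alarm thresholds; this sketch only shows the detection of the oscillation frequency itself.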

  5. Hand and power tools: A compilation

    NASA Technical Reports Server (NTRS)

    1976-01-01

This compilation describes hand and power tools. Section One covers several tools and shop techniques that may be useful in the home or commercial shop. Section Two contains descriptions of tools that are particularly applicable to industrial work, and Section Three presents a number of metalworking tools.

  6. Exosome separation using microfluidic systems: size-based, immunoaffinity-based and dynamic methodologies.

    PubMed

    Yang, Fang; Liao, Xiangzhi; Tian, Yuan; Li, Guiying

    2017-04-01

Exosomes, nanovesicles secreted by most types of cells, exist in virtually all bodily fluids. Their rich nucleic acid and protein content makes them potentially valuable biomarkers for noninvasive molecular diagnostics. They also show promise, after further development, to serve as a drug delivery system. Unfortunately, existing exosome separation technologies, such as ultracentrifugation and methods incorporating magnetic beads, are time-consuming and laborious, and yield exosomes of only low purity. Thus, a more effective separation method is highly desirable. Microfluidic platforms are ideal tools for exosome separation, since they enable fast, cost-efficient, portable and precise processing of nanoparticles and small volumes of liquid samples. Recently, several microfluidic-based exosome separation technologies have been studied. In this article, the advantages of the most recent technologies, as well as their limitations, challenges and potential uses in novel microfluidic exosome separation and collection applications, are reviewed. This review outlines the uses of new powerful microfluidic exosome detection tools for biologists and clinicians, as well as exosome separation tools for microfluidic engineers. Current challenges of exosome separation methodologies are also described, in order to highlight areas for future research and development. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Martian resource locations: Identification and optimization

    NASA Astrophysics Data System (ADS)

    Chamitoff, Gregory; James, George; Barker, Donald; Dershowitz, Adam

    2005-04-01

The identification and utilization of in situ Martian natural resources is the key to enabling cost-effective long-duration missions and permanent human settlements on Mars. This paper presents a powerful software tool for analyzing Martian data from all sources, and for optimizing mission site selection based on resource collocation. This program, called the Planetary Resource Optimization and Mapping Tool (PROMT), provides a wide range of analysis and display functions that can be applied to raw data or imagery. Thresholds, contours, custom algorithms, and graphical editing are some of the various methods that can be used to process data. Output maps can be created to identify surface regions on Mars that meet any specific criteria. The use of this tool for analyzing data, generating maps, and collocating features is demonstrated using data from the Mars Global Surveyor and the Odyssey spacecraft. The overall mission design objective is to maximize a combination of scientific return and self-sufficiency based on utilization of local materials. Landing site optimization involves maximizing accessibility to collocated science and resource features within a given mission radius. Mission types are categorized according to duration, energy resources, and in situ resource utilization. Preliminary optimization results are shown for a number of mission scenarios.
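The resource-collocation scoring described above can be illustrated with a minimal sketch: score every grid cell by the weighted resources reachable within a mission radius. The map names, weights, grid resolution, and the circular footprint with edge wraparound are illustrative assumptions, not details of PROMT.

```python
import numpy as np

def site_scores(resource_maps, weights, radius):
    """Score each grid cell by the weighted resources reachable within a radius.

    resource_maps : dict of name -> 2D array of per-cell resource likelihood (0..1)
    weights       : dict of name -> importance weight for that resource
    radius        : mission radius, in grid cells
    """
    shape = next(iter(resource_maps.values())).shape
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disk = (yy ** 2 + xx ** 2) <= radius ** 2  # circular accessibility footprint

    score = np.zeros(shape)
    for name, m in resource_maps.items():
        # Sum resource likelihood over the footprint via shifted copies of the map.
        # The footprint is symmetric, so the roll direction does not matter;
        # np.roll wraps at the map edges (a simplification for this sketch).
        reachable = np.zeros(shape)
        for dy, dx in zip(*np.nonzero(disk)):
            reachable += np.roll(np.roll(m, dy - radius, axis=0), dx - radius, axis=1)
        score += weights[name] * reachable
    return score

# Toy example: a water signature and a science target one cell apart
water = np.zeros((10, 10)); water[2, 2] = 1.0
science = np.zeros((10, 10)); science[2, 3] = 1.0
scores = site_scores({"water": water, "science": science},
                     {"water": 1.0, "science": 1.0}, radius=1)
best = np.unravel_index(np.argmax(scores), scores.shape)  # a cell reaching both
```

The highest-scoring cells are those whose mission radius covers both features at once, which is the collocation criterion the abstract describes.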

  8. Toward a Real-Time Measurement-Based System for Estimation of Helicopter Engine Degradation Due to Compressor Erosion

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Simo, Donald L.

    2007-01-01

This paper presents a preliminary demonstration of an automated health assessment tool, capable of real-time on-board operation using existing engine control hardware. The tool allows operators to discern how rapidly individual turboshaft engines are degrading. As the compressor erodes, performance is lost, and with it the ability to generate power. Thus, such a tool would provide an instant assessment of the engine's fitness to perform a mission, and would help to pinpoint any abnormal wear or performance anomalies before they became serious, thereby decreasing uncertainty and enabling improved maintenance scheduling. The research described in the paper utilized test stand data from a T700-GE-401 turboshaft engine that underwent sand-ingestion testing to scale a model-based compressor efficiency degradation estimation algorithm. This algorithm was then applied to real-time Health Usage and Monitoring System (HUMS) data from a T700-GE-701C to track compressor efficiency on-line. The approach uses an optimal estimator called a Kalman filter. The filter is designed to estimate the compressor efficiency using only data from the engine's sensors as input.
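The Kalman-filter approach can be illustrated with a minimal scalar sketch: the filter treats compressor efficiency as a nearly constant (random-walk) state and blends each noisy sensor-derived observation into the running estimate. The noise variances and the simulated erosion profile below are illustrative assumptions, not values from the T700 work.

```python
import numpy as np

def kalman_track_efficiency(measurements, q=1e-6, r=1e-3, eta0=1.0, p0=1.0):
    """Track a slowly drifting compressor efficiency with a scalar Kalman filter.

    measurements : noisy efficiency observations derived from engine sensors
    q : process noise variance (how fast degradation is allowed to drift)
    r : measurement noise variance
    """
    eta, p = eta0, p0
    estimates = []
    for z in measurements:
        # Predict: random-walk model, efficiency assumed nearly constant per step
        p = p + q
        # Update: blend the prediction with the new measurement
        k = p / (p + r)            # Kalman gain
        eta = eta + k * (z - eta)
        p = (1.0 - k) * p
        estimates.append(eta)
    return np.array(estimates)

# Hypothetical run: efficiency eroding from 1.00 to 0.97 under sensor noise
rng = np.random.default_rng(0)
true_eta = np.linspace(1.00, 0.97, 200)
noisy = true_eta + rng.normal(0, 0.01, 200)
est = kalman_track_efficiency(noisy)
```

The filtered trace follows the slow erosion trend while suppressing the sample-to-sample sensor noise, which is what makes the degradation rate readable on-line.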

  9. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data.

    PubMed

    Zeng, Ying-Xu; Mjøs, Svein Are; David, Fabrice P A; Schmid, Adrien W

    2016-03-31

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
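The least-squares resolution step might be sketched as follows: the observed isotope cluster is modeled as a linear combination of the theoretical isotope distributions of candidate species, and the abundance of each species is recovered by solving the resulting least-squares problem. The two isotope patterns below are made-up illustrations, not library values from the tool.

```python
import numpy as np

# Hypothetical theoretical isotope distributions (M, M+1, M+2, M+3 abundances)
# for two candidate lipid species whose patterns overlap on the m/z axis.
pattern_a = np.array([1.00, 0.42, 0.11, 0.02])  # species A, monoisotopic peak at M
pattern_b = np.array([0.00, 1.00, 0.45, 0.12])  # species B, monoisotopic peak 1 Da higher

# Columns of the design matrix are the isotope patterns aligned on a common axis
A = np.column_stack([pattern_a, pattern_b])

# Simulated observed spectrum: 3 parts species A plus 2 parts species B
observed = 3.0 * pattern_a + 2.0 * pattern_b

# Least-squares resolution: recover the abundance of each overlapping species
abundances, *_ = np.linalg.lstsq(A, observed, rcond=None)
```

With real data the observed vector also contains noise and chromatographic structure, but the same linear model underlies the deconvolution and quantification described above.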

  10. Levelized cost of energy (LCOE) metric to characterize solar absorber coatings for the CSP industry

    DOE PAGES

    Boubault, Antoine; Ho, Clifford K.; Hall, Aaron; ...

    2015-07-08

The contribution of each component of a power generation plant to the levelized cost of energy (LCOE) can be estimated and used to increase the power output while reducing system operation and maintenance costs. Here, the LCOE framework is used to quantify the influence of solar receiver coatings on the cost of energy from solar power towers. Two new parameters are introduced: the absolute levelized cost of coating (LCOC) and the LCOC efficiency. Depending on the material properties, aging, costs, and temperature, the absolute LCOC enables quantifying the cost-effectiveness of absorber coatings, as well as finding optimal operating conditions. The absolute LCOC is investigated for different hypothetical coatings and is demonstrated on Pyromark 2500 paint. Results show that absorber coatings yield lower LCOE values in most cases, even at significant costs. Optimal reapplication intervals range from one to five years. At receiver temperatures greater than 700 °C, non-selective coatings are not always worthwhile, while durable selective coatings consistently reduce the LCOE, by up to 12% of the value obtained for an uncoated receiver. Moreover, the absolute LCOC is a powerful tool to characterize and compare different coatings, considering not only their initial efficiencies but also their durability.
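The LCOE/LCOC bookkeeping can be sketched in a few lines. In this simplified model (which omits degradation, reapplication scheduling, and financing details treated in the paper), the absolute LCOC is taken as the change in LCOE caused by adding the coating; all plant numbers below are hypothetical.

```python
def lcoe(capital, annual_om, annual_energy_kwh, years=30, discount=0.07):
    """Simplified levelized cost of energy: discounted costs / discounted energy."""
    costs = capital + sum(annual_om / (1 + discount) ** t for t in range(1, years + 1))
    energy = sum(annual_energy_kwh / (1 + discount) ** t for t in range(1, years + 1))
    return costs / energy  # $/kWh

# Hypothetical plant: the coating adds capital and O&M cost but raises receiver
# efficiency, so the plant delivers more energy per year.
base = lcoe(capital=100e6, annual_om=2e6, annual_energy_kwh=180e6)
coated = lcoe(capital=101e6, annual_om=2.1e6, annual_energy_kwh=195e6)
lcoc = coated - base  # absolute LCOC: negative means the coating pays for itself
```

A negative value means the extra energy outweighs the coating's cost over the plant lifetime, which is the sense in which the paper calls a coating "worthwhile".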

  11. The Biomolecule Sequencer Project: Nanopore Sequencing as a Dual-Use Tool for Crew Health and Astrobiology Investigations

    NASA Technical Reports Server (NTRS)

John, K. K.; Botkin, D. S.; Burton, A. S.; Castro-Wallace, S. L.; Chaput, J. D.; Dworkin, J. P.; Lehman, N.; Lupisella, M. L.; Mason, C. E.; Smith, D. J.; et al.

    2016-01-01

    Human missions to Mars will fundamentally transform how the planet is explored, enabling new scientific discoveries through more sophisticated sample acquisition and processing than can currently be implemented in robotic exploration. The presence of humans also poses new challenges, including ensuring astronaut safety and health and monitoring contamination. Because the capability to transfer materials to Earth will be extremely limited, there is a strong need for in situ diagnostic capabilities. Nucleotide sequencing is a particularly powerful tool because it can be used to: (1) mitigate microbial risks to crew by allowing identification of microbes in water, in air, and on surfaces; (2) identify optimal treatment strategies for infections that arise in crew members; and (3) track how crew members, microbes, and mission-relevant organisms (e.g., farmed plants) respond to conditions on Mars through transcriptomic and genomic changes. Sequencing would also offer benefits for science investigations occurring on the surface of Mars by permitting identification of Earth-derived contamination in samples. If Mars contains indigenous life, and that life is based on nucleic acids or other closely related molecules, sequencing would serve as a critical tool for the characterization of those molecules. Therefore, spaceflight-compatible nucleic acid sequencing would be an important capability for both crew health and astrobiology exploration. Advances in sequencing technology on Earth have been driven largely by needs for higher throughput and read accuracy. Although some reduction in size has been achieved, nearly all commercially available sequencers are not compatible with spaceflight due to size, power, and operational requirements. Exceptions are nanopore-based sequencers that measure changes in current caused by DNA passing through pores; these devices are inherently much smaller and require significantly less power than sequencers using other detection methods. 
Consequently, nanopore-based sequencers could be made flight-ready with only minimal modifications.

  12. Integral Full Core Multi-Physics PWR Benchmark with Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forget, Benoit; Smith, Kord; Kumar, Shikhar

In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering in the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation is essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g. critical experiments, flow loops, etc.), and there is a lack of relevant multiphysics benchmark measurements that are necessary to validate the high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools.
This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.

  13. Modernizing Distribution System Restoration to Achieve Grid Resiliency Against Extreme Weather Events: An Integrated Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chen; Wang, Jianhui; Ton, Dan

    Recent severe power outages caused by extreme weather hazards have highlighted the importance and urgency of improving the resilience of the electric power grid. As the distribution grids still remain vulnerable to natural disasters, the power industry has focused on methods of restoring distribution systems after disasters in an effective and quick manner. The current distribution system restoration practice for utilities is mainly based on predetermined priorities and tends to be inefficient and suboptimal, and the lack of situational awareness after the hazard significantly delays the restoration process. As a result, customers may experience an extended blackout, which causes large economic loss. On the other hand, the emerging advanced devices and technologies enabled through grid modernization efforts have the potential to improve the distribution system restoration strategy. However, utilizing these resources to aid the utilities in better distribution system restoration decision-making in response to extreme weather events is a challenging task. Therefore, this paper proposes an integrated solution: a distribution system restoration decision support tool designed by leveraging resources developed for grid modernization. We first review the current distribution restoration practice and discuss why it is inadequate in response to extreme weather events. Then we describe how the grid modernization efforts could benefit distribution system restoration, and we propose an integrated solution in the form of a decision support tool to achieve the goal. The advantages of the solution include improving situational awareness of the system damage status and facilitating survivability for customers. The paper provides a comprehensive review of how the existing methodologies in the literature could be leveraged to achieve the key advantages.
The benefits of the developed system restoration decision support tool include the optimal and efficient allocation of repair crews and resources, the expediting of the restoration process, and the reduction of outage durations for customers, in response to severe blackouts due to extreme weather hazards.

  14. Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, Jennifer; Cappers, Peter

    The Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs research describes a variety of DR opportunities and the various bulk power system services they can provide. The bulk power system services are mapped to a generalized taxonomy of DR “service types”, which allows us to discuss DR opportunities and bulk power system services in fewer yet broader categories that share similar technological requirements, which mainly drive DR enablement costs. The research presents a framework for the costs to automate DR and provides descriptions of the various elements that drive enablement costs. The report introduces the various DR enabling technologies and end-uses, identifies the various services that each can provide to the grid, and provides the cost assessment for each enabling technology. In addition to a report, this research includes a Demand Response Advanced Controls Database and User Manual. They are intended to provide users with the data that underlies this research and instructions for how to use that database more effectively and efficiently.

  15. Sub-cycle light transients for attosecond, X-ray, four-dimensional imaging

    NASA Astrophysics Data System (ADS)

    Fattahi, Hanieh

    2016-10-01

    This paper reviews the revolutionary development of ultra-short, multi-TW laser pulse generation made possible by current laser technology. The design of the unified laser architecture discussed in this paper, based on the synthesis of ultrabroadband optical parametric chirped-pulse amplifiers, promises to provide powerful light transients with electromagnetic forces engineerable on the electron time scale. By coherent combination of multiple amplifiers operating in different wavelength ranges, pulses with wavelength spectra extending from less than 1 μm to more than 10 μm, with sub-cycle duration at unprecedented peak and average power levels, can be generated. It is shown theoretically that these light transients enable the efficient generation of attosecond X-ray pulses with photon flux sufficient to image, for the first time, picometre-attosecond trajectories of electrons by means of X-ray diffraction, and to record electron dynamics by attosecond spectroscopy. The proposed system leads to a tool with sub-atomic spatio-temporal resolution for studying different processes deep inside matter.
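    The coherent multi-band synthesis described in this abstract can be illustrated with a toy calculation: adding two phase-locked Gaussian pulses at different carrier wavelengths confines the combined field to a narrower central transient than either band alone. All wavelengths and durations below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

c = 3e8  # speed of light, m/s
t = np.linspace(-20e-15, 20e-15, 4001)  # time axis, +/- 20 fs

def band(t, lam, tau):
    """Gaussian pulse with carrier wavelength lam and duration tau (assumed values)."""
    return np.exp(-(t / tau) ** 2) * np.cos(2 * np.pi * c / lam * t)

e1 = band(t, 1.0e-6, 10e-15)   # short-wavelength channel (1 um, illustrative)
e2 = band(t, 2.0e-6, 10e-15)   # long-wavelength channel (2 um, illustrative)
e_sum = e1 + e2                # phase-locked coherent combination

def half_max_span(t, field):
    """Span between the outermost points where intensity reaches half its maximum."""
    i = field ** 2
    above = t[i >= i.max() / 2]
    return above[-1] - above[0]

# Away from t = 0 the two carriers dephase, so the synthesized waveform is
# temporally confined far more tightly than either single-band pulse.
assert half_max_span(t, e_sum) < half_max_span(t, e1)
```

    This only demonstrates the dephasing argument; real sub-cycle synthesis additionally requires carrier-envelope phase control across the amplifier channels.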

  16. Enhancing the usability and performance of structured association mapping algorithms using automation, parallelization, and visualization in the GenAMap software system

    PubMed Central

    2012-01-01

    Background Structured association mapping is proving to be a powerful strategy to find genetic polymorphisms associated with disease. However, these algorithms are often distributed as command line implementations that require expertise and effort to customize and put into practice. Because of the difficulty required to use these cutting-edge techniques, geneticists often revert to simpler, less powerful methods. Results To make structured association mapping more accessible to geneticists, we have developed an automatic processing system called Auto-SAM. Auto-SAM enables geneticists to run structured association mapping algorithms automatically, using parallelization. Auto-SAM includes algorithms to discover gene-networks and find population structure. Auto-SAM can also run popular association mapping algorithms, in addition to five structured association mapping algorithms. Conclusions Auto-SAM is available through GenAMap, a front-end desktop visualization tool. GenAMap and Auto-SAM are implemented in JAVA; binaries for GenAMap can be downloaded from http://sailing.cs.cmu.edu/genamap. PMID:22471660

  17. Multi-photon microscopy with a low-cost and highly efficient Cr:LiCAF laser

    PubMed Central

    Sakadić, Sava; Demirbas, Umit; Mempel, Thorsten R.; Moore, Anna; Ruvinskaya, Svetlana; Boas, David A.; Sennaroglu, Alphan; Kartner, Franz X.; Fujimoto, James G.

    2009-01-01

    Multi-photon microscopy (MPM) is a powerful tool for biomedical imaging, enabling molecular contrast and integrated structural and functional imaging on the cellular and subcellular level. However, the cost and complexity of femtosecond laser sources that are required in MPM are significant hurdles to widespread adoption of this important imaging modality. In this work, we describe femtosecond diode pumped Cr:LiCAF laser technology as a low cost alternative to femtosecond Ti:Sapphire lasers for MPM. Using single mode pump diodes which cost only $150 each, a diode pumped Cr:LiCAF laser generates ~70-fs duration, 1.8-nJ pulses at ~800 nm wavelengths, with a repetition rate of 100 MHz and average output power of 180 mW. Representative examples of MPM imaging in neuroscience, immunology, endocrinology and cancer research using Cr:LiCAF laser technology are presented. These studies demonstrate the potential of this laser source for use in a broad range of MPM applications. PMID:19065223
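    The laser figures quoted in this abstract are internally consistent: average power equals pulse energy times repetition rate, which is easy to verify.

```python
# Consistency check of the abstract's numbers: 1.8 nJ pulses at 100 MHz.
pulse_energy_j = 1.8e-9      # 1.8 nJ per pulse
rep_rate_hz = 100e6          # 100 MHz repetition rate

avg_power_w = pulse_energy_j * rep_rate_hz
assert abs(avg_power_w - 0.180) < 1e-9  # 180 mW, matching the stated output power
```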

  18. Wide-area situation awareness in electric power grid

    NASA Astrophysics Data System (ADS)

    Greitzer, Frank L.

    2010-04-01

    Two primary elements of the US energy policy are demand management and efficiency and renewable sources. Major objectives are clean energy transmission and integration, reliable energy transmission, and grid cyber security. Development of the Smart Grid seeks to achieve these goals by lowering energy costs for consumers, achieving energy independence and reducing greenhouse gas emissions. The Smart Grid is expected to enable real time wide-area situation awareness (SA) for operators. Requirements for wide-area SA have been identified among interoperability standards proposed by the Federal Energy Regulatory Commission and the National Institute of Standards and Technology to ensure smart-grid functionality. Wide-area SA and enhanced decision support and visualization tools are key elements in the transformation to the Smart Grid. This paper discusses human factors research to promote SA in the electric power grid and the Smart Grid. Topics that will be discussed include the role of human factors in meeting US energy policy goals, the impact and challenges for Smart Grid development, and cyber security challenges.

  19. Cross-Population Joint Analysis of eQTLs: Fine Mapping and Functional Annotation

    PubMed Central

    Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Mapping expression quantitative trait loci (eQTLs) has been shown to be a powerful tool to uncover the genetic underpinnings of many complex traits at the molecular level. In this paper, we present an integrative analysis approach that leverages eQTL data collected from multiple population groups. In particular, our approach effectively identifies multiple independent cis-eQTL signals that are consistent across populations, accounting for population heterogeneity in allele frequencies and linkage disequilibrium patterns. Furthermore, by integrating genomic annotations, our analysis framework enables high-resolution functional analysis of eQTLs. We applied our statistical approach to analyze the GEUVADIS data consisting of samples from five population groups. From this analysis, we concluded that i) joint analysis across population groups greatly improves the power of eQTL discovery and the resolution of fine mapping of causal eQTLs; ii) many genes harbor multiple independent eQTLs in their cis regions; and iii) genetic variants that disrupt transcription factor binding are significantly enriched in eQTLs (p-value = 4.93 × 10⁻²²). PMID:25906321

  20. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    PubMed

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. Copyright © 2011 Wiley Periodicals, Inc.
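    The abstract's key design point is that trajectory data are exposed as NumPy arrays, so analyses reduce to ordinary array code. The sketch below illustrates that style with a synthetic coordinate array standing in for trajectory frames; it deliberately uses plain NumPy rather than calling MDAnalysis itself, and the per-frame radius-of-gyration analysis is just one representative example.

```python
import numpy as np

# Synthetic stand-in for the (n_frames, n_atoms, 3) coordinate data a
# trajectory reader would expose as NumPy arrays.
rng = np.random.default_rng(0)
coords = rng.normal(scale=5.0, size=(10, 100, 3))  # 10 frames, 100 atoms
masses = np.ones(100)                              # equal masses for simplicity

def radius_of_gyration(frame, masses):
    """Mass-weighted radius of gyration for one frame of coordinates."""
    com = np.average(frame, axis=0, weights=masses)
    sq_dist = np.sum((frame - com) ** 2, axis=1)
    return np.sqrt(np.average(sq_dist, weights=masses))

# Once coordinates are arrays, a per-frame observable is a one-line loop.
rg = np.array([radius_of_gyration(f, masses) for f in coords])
assert rg.shape == (10,) and np.all(rg > 0)
```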

  1. High-throughput screening of a CRISPR/Cas9 library for functional genomics in human cells.

    PubMed

    Zhou, Yuexin; Zhu, Shiyou; Cai, Changzu; Yuan, Pengfei; Li, Chunmei; Huang, Yanyi; Wei, Wensheng

    2014-05-22

    Targeted genome editing technologies are powerful tools for studying biology and disease, and have a broad range of research applications. In contrast to the rapid development of toolkits to manipulate individual genes, large-scale screening methods based on the complete loss of gene expression are only now beginning to be developed. Here we report the development of a focused CRISPR/Cas-based (clustered regularly interspaced short palindromic repeats/CRISPR-associated) lentiviral library in human cells and a method of gene identification based on functional screening and high-throughput sequencing analysis. Using knockout library screens, we successfully identified the host genes essential for the intoxication of cells by anthrax and diphtheria toxins, which were confirmed by functional validation. The broad application of this powerful genetic screening strategy will not only facilitate the rapid identification of genes important for bacterial toxicity but will also enable the discovery of genes that participate in other biological processes.
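    The screening logic in this abstract rests on comparing sgRNA read counts between a selected population and a control. A generic, hedged sketch of that scoring step follows; the counts and the simple log2 fold-change metric are illustrative, not the authors' exact pipeline.

```python
import numpy as np

# Hypothetical sgRNA read counts (control vs. toxin-selected population);
# purely illustrative, not data from the study.
counts_ctrl = np.array([1000, 800, 1200, 950], dtype=float)
counts_sel  = np.array([  40, 9000,   35, 1100], dtype=float)

# Normalize to library size, add a small pseudocount, and score enrichment.
norm_ctrl = counts_ctrl / counts_ctrl.sum()
norm_sel = counts_sel / counts_sel.sum()
log2fc = np.log2((norm_sel + 1e-6) / (norm_ctrl + 1e-6))

# sgRNAs enriched after selection point to genes whose knockout confers
# resistance (e.g., host factors required for toxin entry).
enriched = log2fc > 1.0
assert enriched[1]       # guide 2 is strongly enriched in this toy data
assert not enriched[0]   # guide 1 is depleted
```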

  2. On-chip manipulation of single microparticles, cells, and organisms using surface acoustic waves.

    PubMed

    Ding, Xiaoyun; Lin, Sz-Chin Steven; Kiraly, Brian; Yue, Hongjun; Li, Sixing; Chiang, I-Kao; Shi, Jinjie; Benkovic, Stephen J; Huang, Tony Jun

    2012-07-10

    Techniques that can dexterously manipulate single particles, cells, and organisms are invaluable for many applications in biology, chemistry, engineering, and physics. Here, we demonstrate standing surface acoustic wave based "acoustic tweezers" that can trap and manipulate single microparticles, cells, and entire organisms (i.e., Caenorhabditis elegans) in a single-layer microfluidic chip. Our acoustic tweezers utilize the wide resonance band of chirped interdigital transducers to achieve real-time control of a standing surface acoustic wave field, which enables flexible manipulation of most known microparticles. The power density required by our acoustic device is significantly lower than its optical counterparts (10,000,000 times less than optical tweezers and 100 times less than optoelectronic tweezers), which renders the technique more biocompatible and amenable to miniaturization. Cell-viability tests were conducted to verify the tweezers' compatibility with biological objects. With its advantages in biocompatibility, miniaturization, and versatility, the acoustic tweezers presented here will become a powerful tool for many disciplines of science and engineering.

  3. A Foldable Lithium-Sulfur Battery.

    PubMed

    Li, Lu; Wu, Zi Ping; Sun, Hao; Chen, Deming; Gao, Jian; Suresh, Shravan; Chow, Philippe; Singh, Chandra Veer; Koratkar, Nikhil

    2015-11-24

    The next generation of deformable and shape-conformable electronics devices will need to be powered by batteries that are not only flexible but also foldable. Here we report a foldable lithium-sulfur (Li-S) rechargeable battery, with the highest areal capacity (∼3 mAh cm(-2)) reported to date among all types of foldable energy-storage devices. The key to this result lies in the use of fully foldable and superelastic carbon nanotube current-collector films and impregnation of the active materials (S and Li) into the current-collectors in a checkerboard pattern, enabling the battery to be folded along two mutually orthogonal directions. The carbon nanotube films also serve as the sulfur entrapment layer in the Li-S battery. The foldable battery showed <12% loss in specific capacity over 100 continuous folding and unfolding cycles. Such shape-conformable Li-S batteries with significantly greater energy density than traditional lithium-ion batteries could power the flexible and foldable devices of the future including laptops, cell phones, tablet computers, surgical tools, and implantable biomedical devices.

  4. Steinberg ``AUDIOMAPS" Music Appreciation-Via-Understanding: Special-Relativity + Expectations "Quantum-Theory": a Quantum-ACOUSTO/MUSICO-Dynamics (QA/MD)

    NASA Astrophysics Data System (ADS)

    Steinberg, R.; Siegel, E.

    2010-03-01

    ``AUDIOMAPS'' music enjoyment/appreciation-via-understanding methodology, versus art, music-dynamics evolves, telling a story in (3+1)-dimensions: trails, frames, timbres, + dynamics amplitude vs. music-score time-series (formal-inverse power-spectrum) surprisingly closely parallels (3+1)-dimensional Einstein(1905) special-relativity ``+'' (with its enjoyment-expectations) a manifestation of quantum-theory expectation-values, together a music quantum-ACOUSTO/MUSICO-dynamics (QA/MD). Analysis via Derrida deconstruction enabled Siegel-Baez ``Category-Semantics'' ``FUZZYICS''=``CATEGORYICS (``SON of 'TRIZ") classic Aristotle ``Square-of-Opposition" (SoO) DEduction-logic, irrespective of Boon-Klimontovich versus Voss-Clark[PRL(77)] music power-spectrum analysis sampling-time/duration controversy: part versus whole, shows that ``AUDIOMAPS" QA/MD reigns supreme as THE music appreciation-via-analysis tool for the listener in musicology!!! Connection to Deutsch-Hartmann-Levitin[This is Your Brain on Music,(2006)] brain/mind-barrier brain/mind-music connection is both subtle and compelling and immediate!!!

  5. Driving ATHLETE: Analysis of Operational Efficiency

    NASA Technical Reports Server (NTRS)

    Townsend, Julie; Mittman, David

    2012-01-01

    The All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) is a modular mobility and manipulation platform being developed to support NASA operations in a variety of missions, including exploration of planetary surfaces. The agile system consists of a symmetrical arrangement of six limbs, each with seven articulated degrees of freedom and a powered wheel. This design enables transport of bulky payloads over a wide range of terrain and is envisioned as a tool to mobilize habitats, power-generation equipment, and other supplies for long-range exploration and outpost construction. In FY2010, ATHLETE traversed more than 80 km in field environments over eight weeks of testing, demonstrating that the concept is well suited to long-range travel. Although ATHLETE is designed to travel at speeds of up to 5 kilometers per hour, the observed average traverse rate during field-testing rarely exceeded 1.5 kilometers per hour. This paper investigates sources of inefficiency in ATHLETE traverse operations and identifies targets for improvement of overall traverse rate.

  7. A short review of radiation-induced raft-mediated graft copolymerization: A powerful combination for modifying the surface properties of polymers in a controlled manner

    NASA Astrophysics Data System (ADS)

    Barsbay, Murat; Güven, Olgun

    2009-12-01

    Surface grafting of polymeric materials is attracting increasing attention as it enables the preparation of new materials from known and commercially available polymers having desirable bulk properties such as thermal stability, elasticity, permeability, etc., in conjunction with advantageous newly tailored surface properties such as biocompatibility, biomimicry, adhesion, etc. Ionizing radiation, particularly γ radiation is one of the most powerful tools for preparing graft copolymers as it generates radicals on most substrates. With the advent of living free-radical polymerization techniques, application of γ radiation has been extended to a new era of grafting; grafting in a controlled manner to achieve surfaces with tailored and well-defined properties. This report presents the current use of γ radiation in living free-radical polymerization and highlights the use of both techniques together as a combination to present an advance in the ability to prepare surfaces with desired, tunable and well-defined properties.

  8. COINSTAC: A Privacy Enabled Model and Prototype for Leveraging and Processing Decentralized Brain Imaging Data.

    PubMed

    Plis, Sergey M; Sarwate, Anand D; Wood, Dylan; Dieringer, Christopher; Landis, Drew; Reed, Cory; Panta, Sandeep R; Turner, Jessica A; Shoemaker, Jody M; Carter, Kim W; Thompson, Paul; Hutchison, Kent; Calhoun, Vince D

    2016-01-01

    The field of neuroimaging has embraced the need for sharing and collaboration. Data sharing mandates from public funding agencies and major journal publishers have spurred the development of data repositories and neuroinformatics consortia. However, efficient and effective data sharing still faces several hurdles. For example, open data sharing is on the rise but is not suitable for sensitive data that are not easily shared, such as genetics. Current approaches can be cumbersome (such as negotiating multiple data sharing agreements). There are also significant data transfer, organization and computational challenges. Centralized repositories only partially address the issues. We propose a dynamic, decentralized platform for large scale analyses called the Collaborative Informatics and Neuroimaging Suite Toolkit for Anonymous Computation (COINSTAC). The COINSTAC solution can include data missing from central repositories, allows pooling of both open and "closed" repositories by developing privacy-preserving versions of widely-used algorithms, and incorporates the tools within an easy-to-use platform enabling distributed computation. We present an initial prototype system which we demonstrate on two multi-site data sets, without aggregating the data. In addition, by iterating across sites, the COINSTAC model enables meta-analytic solutions to converge to "pooled-data" solutions (i.e., as if the entire data were in hand). More advanced approaches such as feature generation, matrix factorization models, and preprocessing can be incorporated into such a model. In sum, COINSTAC enables access to the many currently unavailable data sets, a user friendly privacy enabled interface for decentralized analysis, and a powerful solution that complements existing data sharing solutions.

  9. COINSTAC: A Privacy Enabled Model and Prototype for Leveraging and Processing Decentralized Brain Imaging Data

    PubMed Central

    Plis, Sergey M.; Sarwate, Anand D.; Wood, Dylan; Dieringer, Christopher; Landis, Drew; Reed, Cory; Panta, Sandeep R.; Turner, Jessica A.; Shoemaker, Jody M.; Carter, Kim W.; Thompson, Paul; Hutchison, Kent; Calhoun, Vince D.

    2016-01-01

    The field of neuroimaging has embraced the need for sharing and collaboration. Data sharing mandates from public funding agencies and major journal publishers have spurred the development of data repositories and neuroinformatics consortia. However, efficient and effective data sharing still faces several hurdles. For example, open data sharing is on the rise but is not suitable for sensitive data that are not easily shared, such as genetics. Current approaches can be cumbersome (such as negotiating multiple data sharing agreements). There are also significant data transfer, organization and computational challenges. Centralized repositories only partially address the issues. We propose a dynamic, decentralized platform for large scale analyses called the Collaborative Informatics and Neuroimaging Suite Toolkit for Anonymous Computation (COINSTAC). The COINSTAC solution can include data missing from central repositories, allows pooling of both open and “closed” repositories by developing privacy-preserving versions of widely-used algorithms, and incorporates the tools within an easy-to-use platform enabling distributed computation. We present an initial prototype system which we demonstrate on two multi-site data sets, without aggregating the data. In addition, by iterating across sites, the COINSTAC model enables meta-analytic solutions to converge to “pooled-data” solutions (i.e., as if the entire data were in hand). More advanced approaches such as feature generation, matrix factorization models, and preprocessing can be incorporated into such a model. In sum, COINSTAC enables access to the many currently unavailable data sets, a user friendly privacy enabled interface for decentralized analysis, and a powerful solution that complements existing data sharing solutions. PMID:27594820
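    The decentralized-analysis idea behind COINSTAC can be sketched in a few lines: each site shares only summary statistics, never raw data, yet the combined result matches the "pooled-data" answer. The example below uses a simple mean as the statistic; it illustrates the principle only and is not COINSTAC's actual algorithms or privacy machinery.

```python
import numpy as np

# Three hypothetical sites, each holding data that never leaves the site.
rng = np.random.default_rng(42)
site_data = [rng.normal(loc=3.0, size=n) for n in (50, 80, 120)]

# Each site computes local summaries (sum and count)...
summaries = [(x.sum(), x.size) for x in site_data]

# ...and the coordinator combines them without seeing any raw values.
total, count = map(sum, zip(*summaries))
decentralized_mean = total / count

# The decentralized answer converges to the pooled-data answer,
# as if the entire data were in hand.
pooled_mean = np.concatenate(site_data).mean()
assert abs(decentralized_mean - pooled_mean) < 1e-9
```

    Iterative schemes (e.g., for regression or matrix factorization) follow the same pattern, exchanging gradients or factors across sites instead of sums.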

  10. Space power systems technology enablement study. [for the space transportation system

    NASA Technical Reports Server (NTRS)

    Smith, L. D.; Stearns, J. W.

    1978-01-01

    The power system technologies which enable or enhance future space missions requiring a few kilowatts or less and using the space shuttle were assessed. The advances in space power systems necessary for supporting the capabilities of the space transportation system were systematically determined and benefit/cost/risk analyses were used to identify high payoff technologies and technological priorities. The missions that are enhanced by each development are discussed.

  11. CFD Multiphysics Tool

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.

    2005-01-01

    The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts addressing both the governing equations and practical solution methods for, e.g., electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level-two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles.
This report describes results of collaboration between ACAO and Embry-Riddle Aeronautical University (ERAU) to begin building a set of level-two design tools for PARSEC. The "CFD Multiphysics Tool" will be the propulsive element of the tool set. The name acknowledges that space propulsion performance assessment is primarily a fluid mechanics problem. At the core of the CFD Multiphysics Tool is an open-source CFD code, HYP, under development at ERAU. ERAU is renowned for its undergraduate degree program in Aerospace Engineering, the largest in the nation. The strength of the program is its applications-oriented curriculum, which culminates in one of three two-course Engineering Design sequences: Aerospace Propulsion, Spacecraft, or Aircraft. This same philosophy applies to the HYP Project, albeit with fluid physics modeling commensurate with graduate research. HYP's purpose, like the Multiphysics Tool's, is to enable calculations of real (three-dimensional; geometrically complex; intended for hardware development) applications of high speed and propulsive fluid flows.

  12. Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.

    2014-06-08

    High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts, such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been developed that have the potential to mitigate many power quality concerns. However, local closed-loop control may lead to unintended behavior in deployed systems as complex interactions can occur between numerous operating devices. To enable the study of the performance of advanced control schemes in a detailed distribution system environment, a test platform has been developed that integrates Power Hardware-in-the-Loop (PHIL) with concurrent time-series electric distribution system simulation. In the test platform, GridLAB-D, a distribution system simulation tool, runs a detailed simulation of a distribution feeder in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling. At the National Renewable Energy Laboratory (NREL), a hardware inverter interacts with grid and PV simulators emulating an operational distribution system. Power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of inverter control modes—constant power factor and active Volt/VAr control—when integrated into a simulated IEEE 8500-node test feeder. We demonstrate that this platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, results are used to validate GridLAB-D simulations of advanced inverter controls.
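    The active Volt/VAr control mode tested on the platform is typically a piecewise droop: the inverter injects reactive power when terminal voltage is low, absorbs it when high, and does nothing inside a deadband. The breakpoints and slope below are illustrative assumptions, not the settings used in the study.

```python
# Hedged sketch of a Volt/VAr droop characteristic (per-unit quantities).
def volt_var(v_pu, deadband=0.01, slope=20.0, q_max=0.44):
    """Reactive-power command (p.u.) as a function of terminal voltage (p.u.).

    Flat inside the deadband around 1.0 p.u., linear droop outside it,
    saturating at +/- q_max. All parameter values are illustrative.
    """
    err = v_pu - 1.0
    if abs(err) <= deadband:
        return 0.0
    q = -slope * (err - deadband if err > 0 else err + deadband)
    return max(-q_max, min(q_max, q))

assert volt_var(1.00) == 0.0    # inside deadband: no response
assert volt_var(0.97) > 0.0     # low voltage: inject reactive power
assert volt_var(1.05) == -0.44  # high voltage: absorb, saturated at q_max
```

    In a PHIL setup, a curve like this runs in the inverter's local closed loop, which is exactly why feeder-level simulation is needed to catch interactions between many such controllers.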

  13. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy to use, analytic tools that are broadly accessible to the biological community need to be developed. While technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
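    One of the building blocks the suite lists is principal component analysis of a sample-by-feature matrix. The sketch below reproduces that step with plain NumPy on synthetic data; it does not call SECIMTools itself, and the matrix shape is an arbitrary illustration.

```python
import numpy as np

# Synthetic stand-in for a metabolomics feature table:
# 30 samples x 8 metabolite features (illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 8))

# PCA via SVD of the column-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                   # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)      # fraction of variance per component

assert scores.shape == (30, 8)
# Singular values are returned in descending order, so variance explained
# is non-increasing across components.
assert np.all(np.diff(explained) <= 1e-12)
```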

  14. Genetic screens in human cells using the CRISPR-Cas9 system.

    PubMed

    Wang, Tim; Wei, Jenny J; Sabatini, David M; Lander, Eric S

    2014-01-03

    The bacterial clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 system for genome editing has greatly expanded the toolbox for mammalian genetics, enabling the rapid generation of isogenic cell lines and mice with modified alleles. Here, we describe a pooled, loss-of-function genetic screening approach suitable for both positive and negative selection that uses a genome-scale lentiviral single-guide RNA (sgRNA) library. sgRNA expression cassettes were stably integrated into the genome, which enabled a complex mutant pool to be tracked by massively parallel sequencing. We used a library containing 73,000 sgRNAs to generate knockout collections and performed screens in two human cell lines. A screen for resistance to the nucleotide analog 6-thioguanine identified all expected members of the DNA mismatch repair pathway, whereas another for the DNA topoisomerase II (TOP2A) poison etoposide identified TOP2A, as expected, and also cyclin-dependent kinase 6, CDK6. A negative selection screen for essential genes identified numerous gene sets corresponding to fundamental processes. Last, we show that sgRNA efficiency is associated with specific sequence motifs, enabling the prediction of more effective sgRNAs. Collectively, these results establish Cas9/sgRNA screens as a powerful tool for systematic genetic analysis in mammalian cells.
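    The core scoring step of such a pooled screen, comparing sgRNA abundance before and after selection, can be sketched as a normalized log2 fold-change. The guide names and read counts below are invented for illustration, and this is not the authors' published pipeline:

```python
import math

def log2_enrichment(initial, final, pseudocount=1.0):
    """Per-sgRNA log2 fold-change of final vs. initial abundance,
    after scaling each sample to reads per million."""
    n0, n1 = sum(initial.values()), sum(final.values())
    scores = {}
    for guide, c in initial.items():
        rpm0 = c / n0 * 1e6 + pseudocount
        rpm1 = final.get(guide, 0) / n1 * 1e6 + pseudocount
        scores[guide] = math.log2(rpm1 / rpm0)
    return scores

# Invented counts: guides against a resistance gene should enrich under
# selection (e.g. mismatch repair genes under 6-thioguanine), while
# control guides deplete.
initial = {"sgMSH2_1": 500, "sgMSH2_2": 480, "sgCTRL_1": 510}
final = {"sgMSH2_1": 2000, "sgMSH2_2": 1900, "sgCTRL_1": 120}
scores = log2_enrichment(initial, final)
print({g: round(s, 2) for g, s in scores.items()})
```

    Averaging such scores over the several guides targeting each gene is what turns per-sgRNA counts into a gene-level hit list.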

  15. Building Safer Systems With SpecTRM

    NASA Technical Reports Server (NTRS)

    2003-01-01

    System safety, an integral component in software development, often poses a challenge to engineers designing computer-based systems. While the relaxed constraints on software design allow for increased power and flexibility, this flexibility introduces more possibilities for error. As a result, system engineers must identify the design constraints necessary to maintain safety and ensure that the system and software design enforces them. Safeware Engineering Corporation, of Seattle, Washington, provides the information, tools, and techniques to accomplish this task with its Specification Tools and Requirements Methodology (SpecTRM). NASA assisted in developing this engineering toolset by awarding the company several Small Business Innovation Research (SBIR) contracts with Ames Research Center and Langley Research Center. The technology benefits NASA through its applications for Space Station rendezvous and docking. SpecTRM aids system and software engineers in developing specifications for large, complex safety-critical systems. The product enables engineers to find errors early in development so that they can be fixed with the lowest cost and impact on the system design. SpecTRM traces both the requirements and design rationale (including safety constraints) throughout the system design and documentation, allowing engineers to build required system properties into the design from the beginning, rather than emphasizing assessment at the end of the development process when changes are limited and costly.

  16. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. TU-E-BRD-01: President’s Symposium: The Necessity of Innovation in Medical Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bayouth, J; Siewerdsen, J; Wahl, E

    This abstract will not blow you away, but speed-painting presenter Erik Wahl will certainly make a truly unique AAPM symposium that you will not want to miss. Along with clinical director John Bayouth and scientific leader Jeff Siewerdsen, this session will highlight innovation. To avoid being button pushers and irrelevant investigators of yesterday’s science, we must innovate. This is particularly challenging in the changing landscape of declining research funding and healthcare reimbursement. But all hope is not lost: Medical Physics is a field born of innovation. As scientists we quickly translated the man-made and natural phenomena of radiation into a tool that could diagnose broken bones, locate foreign objects embedded within the body, and treat a spectrum of diseases. As hyperbole surrounding the curative powers of radiation gripped society, physicists continued their systematic pursuit of a fundamental understanding of radiation and applied their knowledge to enable the diagnostic and therapeutic power of this new tool. Health economics and the decline in research funding have put the Medical Physicist in a precarious position: how do we optimally participate in medical research and advanced patient care in the face of many competing needs? Today's diagnostic imaging and therapeutic approaches are tremendously sophisticated. Researchers and commercial vendors are producing technologies at a remarkable rate; to enable their safe and effective implementation, Medical Physicists must work from a fundamental understanding of these technologies. This requires all of us, clinically practicing Medical Physicists, Researchers and Educators alike, to combine our training in scientific methods with innovation. Innovation is the key to our past, a necessity for our contemporary challenges, and critical for the future of Medical Physics.
The keynote speakers for the 2014 AAPM Presidential Symposium will address how we can turn these vitally important technologies for diagnosis and therapy into opportunities to innovate. The speed-painting artist and lecturer Erik Wahl will finish the symposium with a fast-paced and entertaining presentation on embracing the future by creating disruptive innovation strategies. Learning Objectives: Identify the connection between Medical Physics and Innovation. Understand how Innovation enables Clinical Medical Physicists to implement novel technologies. Learn how innovative Medical Physics solutions can address significant and relevant challenges in science. Become inspired to pursue a new scientific understanding, positive change in clinical practice, and benefit to patients.

  18. Development of a component design tool for metal hydride heat pumps

    NASA Astrophysics Data System (ADS)

    Waters, Essene L.

    Given current demands for more efficient and environmentally friendly energy sources, hydrogen based energy systems are an increasingly popular field of interest. Within the field, metal hydrides have become a prominent focus of research due to their large hydrogen storage capacity and relative system simplicity and safety. Metal hydride heat pumps constitute one such application, in which heat and hydrogen are transferred to and from metal hydrides. While a significant amount of work has been done to study such systems, the scope of materials selection has been quite limited. Typical studies compare only a few metal hydride materials and provide limited justification for the choice of those few. In this work, a metal hydride component design tool has been developed to enable the targeted down-selection of an extensive database of metal hydrides to identify the most promising materials for use in metal hydride thermal systems. The material database contains over 300 metal hydrides with various physical and thermodynamic properties included for each material. Sub-models for equilibrium pressure, thermophysical data, and default properties are used to predict the behavior of each material within the given system. For a given thermal system, this tool can be used to identify optimal materials out of over 100,000 possible hydride combinations. The selection tool described herein has been applied to a stationary combined heat and power system containing a high-temperature proton exchange membrane (PEM) fuel cell, a hot water tank, and two metal hydride beds used as a heat pump. A variety of factors can be used to select materials including efficiency, maximum and minimum system pressures, pressure difference, coefficient of performance (COP), and COP sensitivity. The targeted down-selection of metal hydrides for this system focuses on the system's COP for each potential pair. 
The values of COP and COP sensitivity have been used to identify pairs of highest interest for use in this application. The metal hydride component design tool developed in this work selects between metal hydride materials on an unprecedented scale. It can be easily applied to other hydrogen-based thermal systems, making it a powerful and versatile tool.
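    The equilibrium-pressure sub-model mentioned above is commonly written as a van 't Hoff relation, and screening a hydride for heat-pump duty then reduces to evaluating that relation at the system's operating temperatures. A minimal sketch, with roughly LaNi5-like property values standing in for a database entry:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def p_eq(dH_des, dS_des, T):
    """Equilibrium H2 pressure (bar) from the van 't Hoff relation
    ln(P/1 bar) = -dH_des/(R*T) + dS_des/R, where dH_des and dS_des
    are the desorption enthalpy (J/mol H2) and entropy (J/(mol*K))."""
    return math.exp(-dH_des / (R * T) + dS_des / R)

# Illustrative values only; the design tool draws measured dH/dS
# for each material from its 300+ entry hydride database.
dH, dS = 30_800.0, 108.0
p_cold = p_eq(dH, dS, 298.0)  # near ambient
p_hot = p_eq(dH, dS, 353.0)   # driven by the heat source
print(f"{p_cold:.2f} bar at 298 K, {p_hot:.2f} bar at 353 K")
```

    A hydride pair is only a heat-pump candidate if, over the operating temperature swing, the driving bed's desorption pressure stays above the absorbing bed's equilibrium pressure; checks of this kind underlie the pressure-based selection factors listed above.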

  19. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    PubMed Central

    Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA

    2008-01-01

    Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real-time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy-to-use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools, have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. 
Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776

  20. Enabling Rapid Naval Architecture Design Space Exploration

    NASA Technical Reports Server (NTRS)

    Mueller, Michael A.; Dufresne, Stephane; Balestrini-Robinson, Santiago; Mavris, Dimitri

    2011-01-01

    Well-accepted conceptual ship design tools can be used to explore a design space, but more precise results can be found using detailed models in full-feature computer-aided design programs. However, defining a detailed model can be a time-intensive task, and hence there is an incentive for time-sensitive projects to use conceptual design tools to explore the design space. In this project, the combination of advanced aerospace systems design methods and an accepted conceptual design tool facilitates the creation of a tool that enables the user not only to visualize ship geometry but also to determine design feasibility and estimate the performance of a design.

  1. The use of power tools in the insertion of cortical bone screws.

    PubMed

    Elliott, D

    1992-01-01

    Cortical bone screws are commonly used in fracture surgery; most patterns are non-self-tapping and require a thread to be pre-cut. This is traditionally performed using hand tools rather than their powered counterparts. Reasons given usually imply that power tools are more dangerous and cut a less precise thread, but there is no evidence to support this supposition. A series of experiments has been performed which shows that the thread pattern cut with either method is identical and that over-penetration with the powered tap is easy to control. The conclusion reached is that both methods produce consistently reliable results, but use of power tools is much faster.

  2. High efficiency III-nitride light-emitting diodes

    DOEpatents

    Crawford, Mary; Koleske, Daniel; Cho, Jaehee; Zhu, Di; Noemaun, Ahmed; Schubert, Martin F; Schubert, E. Fred

    2013-05-28

    Tailored doping of barrier layers enables balancing of the radiative recombination among the multiple-quantum-wells in III-Nitride light-emitting diodes. This tailored doping enables more symmetric carrier transport and uniform carrier distribution which help to reduce electron leakage and thus reduce the efficiency droop in high-power III-Nitride LEDs. Mitigation of the efficiency droop in III-Nitride LEDs may enable the pervasive market penetration of solid-state-lighting technologies in high-power lighting and illumination.

  3. Cry-Bt identifier: a biological database for PCR detection of Cry genes present in transgenic plants.

    PubMed

    Singh, Vinay Kumar; Ambwani, Sonu; Marla, Soma; Kumar, Anil

    2009-10-23

    We describe the development of a user-friendly tool that assists in the retrieval of information relating to Cry genes in transgenic crops. The tool also helps in the detection of transformed Cry genes from Bacillus thuringiensis present in transgenic plants by providing suitably designed primers for PCR identification of these genes. The tool, designed on a relational database model, enables easy retrieval of information from the database with simple user queries. The tool also enables users to access related information about Cry genes present in various databases by interacting with different sources (nucleotide sequences, protein sequences, sequence comparison tools, published literature, conserved domains, evolutionary and structural data). http://insilicogenomics.in/Cry-btIdentifier/welcome.html.
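    As one back-of-envelope check a primer-design step typically applies, the Wallace rule estimates a short primer's melting temperature from its base composition. The sequence below is invented for illustration; the database's actual primer-design criteria are not given in this abstract:

```python
def wallace_tm(primer):
    """Melting temperature (deg C) of a short PCR primer by the
    Wallace rule: Tm = 2*(A+T) + 4*(G+C)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

primer = "ATGGATAACAATCCGAACAT"  # hypothetical 20-mer, not a real cry primer
print(wallace_tm(primer), "C")
```

    A pair of primers whose Wallace Tm values sit within a few degrees of each other is one simple criterion for a usable PCR assay.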

  4. Raising Virtual Laboratories in Australia onto global platforms

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Barker, M.; Fraser, R.; Evans, B. J. K.; Moloney, G.; Proctor, R.; Moise, A. F.; Hamish, H.

    2016-12-01

    Across the globe, Virtual Laboratories (VLs), Science Gateways (SGs), and Virtual Research Environments (VREs) are being developed that enable users who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, etc. Outcomes range from enabling `long tail' researchers to more easily access specific data collections, to facilitating complex workflows on powerful supercomputers. In Australia, government funding has facilitated the development of a range of VLs through the National eResearch Collaborative Tools and Resources (NeCTAR) program. The VLs provide highly collaborative, research-domain oriented, integrated software infrastructures that meet user community needs. Twelve VLs have been funded since 2012, including the Virtual Geophysics Laboratory (VGL); Virtual Hazards, Impact and Risk Laboratory (VHIRL); Climate and Weather Science Laboratory (CWSLab); Marine Virtual Laboratory (MarVL); and Biodiversity and Climate Change Virtual Laboratory (BCCVL). These VLs share similar technical challenges, with common issues emerging on the integration of tools, applications and access to data collections via both cloud-based environments and other distributed resources. While each VL began with a focus on a specific research domain, communities of practice have now formed across the VLs around common issues, facilitating the identification of best-practice case studies and new standards. As a result, tools are now being shared where the VLs access data via data services using international standards such as ISO, OGC, W3C. The sharing of these approaches is starting to facilitate re-usability of infrastructure and is a step towards supporting interdisciplinary research. Whilst the focus of the VLs is Australia-centric, by using standards, these environments can be extended to analysis of other international datasets. 
Many VL datasets are subsets of global datasets and so extension to global is a small (and often requested) step. Similarly, most of the tools, software, and other technologies could be shared across infrastructures globally. Therefore, it is now time to better connect the Australian VLs with similar initiatives elsewhere to create international platforms that can contribute to global research challenges.

  5. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Emma; Kiliccote, Sila; McParland, Charles

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid’s increasingly complex loads that include features such as large volumes of distributed generation. Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power flow characteristics and active generation, with its own protection and control capabilities. Using µPMU data on change in voltage phase angle between two points in conjunction with new and existing distribution-grid planning and operational tools is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors’ sensors and advanced measurement devices. 
In addition, data from advanced sources such as µPMUs could be used to validate models to improve/ensure accuracy, providing information on normally estimated values such as underground conductor impedance, and characterization of complex loads. Although the input of high-fidelity data to existing tools will be challenging, µPMU data on phase angle (as well as other data from advanced sensors) will be useful for basic operational decisions that are based on a trend of changing data.
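    To see what phase-angle data buys an operator, a textbook two-bus approximation relates the µPMU-measured angle difference across a line segment to the active power flowing over it. This is a generic illustration with invented numbers, not the report's method, and it neglects line resistance (often significant on distribution feeders):

```python
import math

def line_power_flow(v1, v2, delta_deg, x_ohm):
    """Approximate active power (W) across a mostly-reactive line:
    P ~ V1 * V2 * sin(delta) / X, with delta the voltage phase-angle
    difference between the two measurement points."""
    return v1 * v2 * math.sin(math.radians(delta_deg)) / x_ohm

# A 0.5 degree angle difference across a feeder section (invented values)
p = line_power_flow(7200.0, 7180.0, 0.5, 2.0)
print(f"approx. {p / 1000:.0f} kW")
```

    The tiny angles involved are one reason the report stresses measurement accuracy: at these voltages, a tenth of a degree of angle error shifts the implied power flow by tens of kilowatts.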

  6. Triboelectric-Based Transparent Secret Code.

    PubMed

    Yuan, Zuqing; Du, Xinyu; Li, Nianwu; Yin, Yingying; Cao, Ran; Zhang, Xiuling; Zhao, Shuyu; Niu, Huidan; Jiang, Tao; Xu, Weihua; Wang, Zhong Lin; Li, Congju

    2018-04-01

    Private and security information for personal identification requires an encrypted tool to extend communication channels between human and machine through a convenient and secure method. Here, a triboelectric-based transparent secret code (TSC) that enables self-powered sensing and information identification simultaneously in a rapid process is reported. The transparent and hydrophobic TSC can conform to any cambered surface due to its high flexibility, which greatly extends the application scenarios. Independent of a power source, the TSC can induce obvious electric signals only by surface contact. The TSC is velocity-dependent and capable of achieving a peak voltage of ≈4 V at a resistance load of 10 MΩ and a sliding speed of 0.1 m s⁻¹ for a 2 mm × 20 mm rectangular stripe. The fabricated TSC can maintain its performance after about 5000 reciprocating rolling cycles. The applications of TSC as a self-powered code device are demonstrated, and the ordered signals can be recognized through the height of the electric peaks, which can be further transferred into specific information by the processing program. The designed TSC has great potential in personal identification, commodity circulation, valuables management, and security defense applications.
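    The peak-height recognition step can be sketched as a simple threshold classifier over local maxima in the voltage trace. This is a toy decoder for the idea described above; the thresholds and trace values are invented:

```python
def decode_peaks(signal, low=1.0, high=3.0):
    """Turn local maxima in a voltage trace into binary symbols:
    short peaks (>= low volts) -> '0', tall peaks (>= high volts) -> '1'."""
    symbols = []
    prev, rising = 0.0, False
    for v in signal:
        if v > prev:
            rising = True
        elif rising and v < prev:  # prev was a local maximum
            if prev >= high:
                symbols.append("1")
            elif prev >= low:
                symbols.append("0")
            rising = False
        prev = v
    return "".join(symbols)

# Invented trace with peaks at 1.5 V, 3.8 V and 1.2 V
trace = [0.0, 0.5, 1.5, 0.2, 0.4, 3.8, 0.3, 1.2, 0.1]
print(decode_peaks(trace))
```

    The resulting symbol string is what a processing program would then map to the encoded information.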

  7. Trade studies for nuclear space power systems

    NASA Technical Reports Server (NTRS)

    Smith, John M.; Bents, David J.; Bloomfield, Harvey S.

    1991-01-01

    As human visions of space applications expand and as we probe further out into the universe, our needs for power will also expand, and missions will evolve which are enabled by nuclear power. A broad spectrum of missions which are enhanced or enabled by nuclear power sources have been defined. These include Earth orbital platforms, deep space platforms, planetary exploration, and terrestrial resource exploration. The recently proposed Space Exploration Initiative (SEI) to the Moon and Mars has more clearly defined these missions and their power requirements. Presented here are results of recent studies of radioisotope and nuclear reactor energy sources, combined with various energy conversion devices for Earth orbital applications, SEI lunar/Mars rovers, surface power, and planetary exploration.

  8. Tritium-powered radiation sensor network

    NASA Astrophysics Data System (ADS)

    Litz, Marc S.; Russo, Johnny A.; Katsis, Dimos

    2016-05-01

    Isotope power supplies offer long-lived (100 years using 63Ni), low-power energy sources, enabling sensors or communications nodes for the lifetime of infrastructure. A tritium beta-source (12.5-year half-life) encapsulated in a phosphor-lined vial couples directly to a photovoltaic (PV) to generate a trickle current into an electrical load. An inexpensive design is described using commercial-off-the-shelf (COTS) components that generates 100 μWe for next-generation compact electronics/sensors. A matched radiation sensor has been built for long-duration missions utilizing microprocessor-controlled sleep modes, low-power electronic components, and a passive interrupt-driven environmental wake-up. The low-power early-warning radiation detector network and isotope power source enable no-maintenance mission lifetimes.
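    The value of a 100 μW trickle source becomes concrete with a simple energy-budget calculation: the average draw of a duty-cycled sensor must stay within the supply. Only the 100 μW figure comes from the text; the sleep and active power numbers below are invented for illustration:

```python
def max_duty_cycle(source_uW, sleep_uW, active_mW):
    """Largest fraction of time the node can be active while keeping the
    average draw within the isotope supply:
        source >= active*d + sleep*(1 - d)
        =>  d <= (source - sleep) / (active - sleep)"""
    source, sleep, active = source_uW * 1e-6, sleep_uW * 1e-6, active_mW * 1e-3
    return (source - sleep) / (active - sleep)

# 100 uW supply, 5 uW sleep draw, 20 mW while sampling/transmitting
d = max_duty_cycle(100.0, 5.0, 20.0)
print(f"active about {d * 100:.2f}% of the time")
```

    Sub-percent duty cycles of this kind are why the sensor described above leans on microprocessor-controlled sleep modes and interrupt-driven wake-up.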

  9. MESA: Message-Based System Analysis Using Runtime Verification

    NASA Technical Reports Server (NTRS)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
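    The duplicate and out-of-order properties mentioned above can be expressed as a small trace monitor. In MESA such properties are written in TraceContract's DSL; the plain-function version below is only a sketch of the same idea, with an invented message shape:

```python
def check_stream(messages):
    """Scan (message_id, sequence_number) pairs and report property
    violations: repeated ids and decreasing sequence numbers."""
    seen, last_seq, errors = set(), None, []
    for msg_id, seq in messages:
        if msg_id in seen:
            errors.append(("duplicate", msg_id))
        seen.add(msg_id)
        if last_seq is not None and seq < last_seq:
            errors.append(("out-of-order", msg_id))
        last_seq = seq
    return errors

trace = [("a", 1), ("b", 2), ("b", 3), ("c", 2)]
print(check_stream(trace))
```

    A nonintrusive monitor of this kind only observes the message queue; it never instruments the system under verification.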

  10. A cyber infrastructure for the SKA Telescope Manager

    NASA Astrophysics Data System (ADS)

    Barbosa, Domingos; Barraca, João. P.; Carvalho, Bruno; Maia, Dalmiro; Gupta, Yashwant; Natarajan, Swaminathan; Le Roux, Gerhard; Swart, Paul

    2016-07-01

    The Square Kilometre Array Telescope Manager (SKA TM) will be responsible for assisting the SKA Operations and Observation Management, carrying out System diagnosis and collecting Monitoring and Control data from the SKA subsystems and components. To provide adequate compute resources, scalability, operation continuity and high availability, as well as strict Quality of Service, the TM cyber-infrastructure (embodied in the Local Infrastructure - LINFRA) consists of COTS hardware and infrastructural software (for example: server monitoring software, host operating system, virtualization software, device firmware), providing a specially tailored Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) solution. The TM infrastructure provides services in the form of computational power, software defined networking, power, storage abstractions, and high level, state of the art IaaS and PaaS management interfaces. This cyber platform will be tailored to each of the two SKA Phase 1 telescopes (SKA_MID in South Africa and SKA_LOW in Australia) instances, each presenting different computational and storage infrastructures and conditioned by location. This cyber platform will provide a compute model enabling TM to manage the deployment and execution of its multiple components (observation scheduler, proposal submission tools, M&C components, forensic tools, several databases, etc.). In this sense, the TM LINFRA is primarily focused towards the provision of isolated instances, mostly resorting to virtualization technologies, while defaulting to bare hardware if specifically required due to performance, security, availability, or other requirements.

  11. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451
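    Once traces are collected by either apparatus, scoring a colony reduces to estimating the dominant period of its bioluminescence time series. Below is a generic sketch using lag-domain autocorrelation on a synthetic 24 h rhythm; this is not the labs' actual analysis pipeline, and the search window is an assumption:

```python
import math

def circadian_period(series, dt_hours, min_period=18.0, max_period=30.0):
    """Estimate the dominant period (hours) as the lag with maximal
    autocorrelation, searched over a circadian window."""
    n = len(series)
    mean = sum(series) / n
    dev = [x - mean for x in series]
    var = sum(d * d for d in dev)
    best_lag, best_r = None, -2.0
    for lag in range(int(min_period / dt_hours), int(max_period / dt_hours) + 1):
        r = sum(dev[i] * dev[i + lag] for i in range(n - lag)) / var
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag * dt_hours

# Synthetic trace: a 24 h rhythm sampled every 2 h for 6 days
trace = [100 + 20 * math.cos(2 * math.pi * t / 12) for t in range(72)]
print(circadian_period(trace, dt_hours=2.0))
```

    Quantitative TopCount data would additionally support comparing amplitude and baseline levels between strains, which a pure period estimate ignores.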

  12. Artificial Intelligence for Diabetes Management and Decision Support: Literature Review

    PubMed Central

    Contreras, Ivan

    2018-01-01

    Background Artificial intelligence methods in combination with the latest technologies, including medical devices, mobile computing, and sensor technologies, have the potential to enable the creation and delivery of better management services to deal with chronic diseases. One of the most lethal and prevalent chronic diseases is diabetes mellitus, which is characterized by dysfunction of glucose homeostasis. Objective The objective of this paper is to review recent efforts to use artificial intelligence techniques to assist in the management of diabetes, along with the associated challenges. Methods A review of the literature was conducted using PubMed and related bibliographic resources. Analyses of the literature from 2010 to 2018 yielded 1849 pertinent articles, of which we selected 141 for detailed review. Results We propose a functional taxonomy for diabetes management and artificial intelligence. Additionally, a detailed analysis of each subject category was performed using related key outcomes. This approach revealed that the experiments and studies reviewed yielded encouraging results. Conclusions We obtained evidence of an acceleration of research activity aimed at developing artificial intelligence-powered tools for prediction and prevention of complications associated with diabetes. Our results indicate that artificial intelligence methods are being progressively established as suitable for use in clinical daily practice, as well as for the self-management of diabetes. Consequently, these methods provide powerful tools for improving patients’ quality of life. PMID:29848472

  13. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that support progress initiatives at international and sub-national levels aimed at measuring, through statistical indicators, economic, social, and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of this task, the “dream” of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions, or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet: for example, dynamic web-enabled animation that enables statisticians to explore temporal, spatial, and multivariate demographic data from multiple perspectives, discover interesting relationships, share incremental discoveries with colleagues, and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of domain experts and are precious in a creative analytic reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses, and shares task-related explorative events.

  14. The current state of the art of quantitative phosphoproteomics and its applications to diabetes research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Chi Yuet X’avia; Gritsenko, Marina A.; Smith, Richard D.

    Protein phosphorylation is a fundamental regulatory mechanism in many cellular processes, and aberrant perturbation of phosphorylation has been revealed in various human diseases. Kinases and their cognate inhibitors have been hotspots for drug development. Therefore, emerging tools that enable system-wide quantitative profiling of the phosphoproteome offer a powerful impetus for unveiling novel signaling pathways, drug targets, and/or biomarkers for the disease of interest. In this review, we will highlight recent advances in phosphoproteomics, the current state-of-the-art of the technologies, and the challenges and future perspectives of this research area. Finally, we will underscore some exemplary applications of phosphoproteomics in diabetes research.

  15. Nationwide Databases in Orthopaedic Surgery Research.

    PubMed

    Bohl, Daniel D; Singh, Kern; Grauer, Jonathan N

    2016-10-01

    The use of nationwide databases to conduct orthopaedic research has expanded markedly in recent years. Nationwide databases offer large sample sizes, sampling of patients who are representative of the country as a whole, and data that enable investigation of trends over time. The most common use of nationwide databases is to study the occurrence of postoperative adverse events. Other uses include the analysis of costs and the investigation of critical hospital metrics, such as length of stay and readmission rates. Although nationwide databases are powerful research tools, readers should be aware of the differences between them and their limitations. These include variations and potential inaccuracies in data collection, imperfections in patient sampling, insufficient postoperative follow-up, and lack of orthopaedic-specific outcomes.
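    The typical analyses described above (adverse-event occurrence, length of stay, readmission rates) reduce to grouped aggregation over patient-level records. A minimal sketch, using entirely hypothetical record fields and values rather than any real database schema (NSQIP, NIS, and similar registries each define their own variables and sampling weights):

    ```python
    from collections import defaultdict

    # Hypothetical, simplified records for illustration only.
    records = [
        {"procedure": "ACDF", "los_days": 2, "readmit_30d": False, "adverse_event": False},
        {"procedure": "ACDF", "los_days": 5, "readmit_30d": True,  "adverse_event": True},
        {"procedure": "TKA",  "los_days": 3, "readmit_30d": False, "adverse_event": False},
        {"procedure": "TKA",  "los_days": 4, "readmit_30d": True,  "adverse_event": False},
    ]

    def metrics_by_procedure(rows):
        """Aggregate hospital metrics per procedure code."""
        groups = defaultdict(list)
        for r in rows:
            groups[r["procedure"]].append(r)
        out = {}
        for proc, rs in groups.items():
            n = len(rs)
            out[proc] = {
                "n": n,
                "mean_los": sum(r["los_days"] for r in rs) / n,
                "readmit_rate": sum(r["readmit_30d"] for r in rs) / n,
                "adverse_rate": sum(r["adverse_event"] for r in rs) / n,
            }
        return out

    print(metrics_by_procedure(records)["ACDF"]["readmit_rate"])  # 0.5
    ```

    Real nationwide analyses additionally apply the database's sampling weights and adjust for case mix, which this sketch omits.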

  16. Logic integer programming models for signaling networks.

    PubMed

    Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert

    2009-05-01

    We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems, the logic models reduce to a polynomial-time solvable satisfiability problem. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
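    In the static, propositional-logic view, each species' activity is a Boolean variable constrained by rules, and consistent network states are the satisfying assignments. The following toy network is an illustrative sketch (not the authors' formulation), checked here by brute-force enumeration rather than a SAT or integer-programming solver:

    ```python
    from itertools import product

    # Toy signaling network: ligand binds receptor, receptor activates a
    # kinase unless an inhibitor is present, and the kinase switches on
    # its target. All species names are hypothetical.
    species = ["ligand", "receptor", "kinase", "inhibitor", "target"]

    def consistent(s):
        # Each rule is a propositional constraint on the state dict s.
        return (s["receptor"] == s["ligand"]
                and s["kinase"] == (s["receptor"] and not s["inhibitor"])
                and s["target"] == s["kinase"])

    # Enumerate all 2^5 assignments and keep the satisfying ones.
    states = [s for bits in product([False, True], repeat=len(species))
              if consistent(s := dict(zip(species, bits)))]

    # Only ligand and inhibitor are free inputs, so 4 consistent states remain.
    print(len(states))  # 4
    ```

    An integer-programming encoding would replace each Boolean rule with linear inequalities over 0/1 variables, which scales to networks far too large for exhaustive enumeration.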

  17. Implementation of an Ultra-Bright Thermographic Phosphor for Gas Turbine Engine Temperature Measurements

    NASA Technical Reports Server (NTRS)

    Eldridge, Jeffrey I.; Bencic, Timothy J.; Zhu, Dongming; Cuy, Michael D.; Wolfe, Douglas E.; Allison, Stephen W.; Beshears, David L.; Jenkins, Thomas P.; Heeg, Bauke; Howard, Robert P.

    2014-01-01

    The overall goal of the Aeronautics Research Mission Directorate (ARMD) Seedling Phase II effort was to build on the promising temperature-sensing characteristics of the ultrabright thermographic phosphor Cr-doped gadolinium aluminum perovskite (Cr:GAP) demonstrated in Phase I by transitioning towards an engine environment implementation. The strategy adopted was to take advantage of the unprecedented retention of ultra-bright luminescence from Cr:GAP at temperatures over 1000 C to enable fast 2D temperature mapping of actual component surfaces as well as to utilize inexpensive low-power laser-diode excitation suitable for on-wing diagnostics. A special emphasis was placed on establishing Cr:GAP luminescence-based surface temperature mapping as a new tool for evaluating engine component surface cooling effectiveness.

  18. Diode Lasers used in Plastic Welding and Selective Laser Soldering - Applications and Products

    NASA Astrophysics Data System (ADS)

    Reinl, S.

    Aside from conventional welding methods, laser welding of plastics has established itself as a proven bonding method. The component-conserving, clean process offers numerous advantages and enables welding of sensitive assemblies in the automotive, electronics, medical, human care, food packaging, and consumer electronics markets. Diode lasers have been established for years in plastic welding applications. Soft soldering using laser radiation is also becoming more and more significant in the field of direct diode laser applications. Fast power controllability, combined with contactless temperature measurement to minimize thermal damage, makes the diode laser an ideal tool for this application. These advantages come into full effect when soldering increasingly small parts in temperature-sensitive environments.

  19. Transitioning Rationally Designed Catalytic Materials to Real 'Working' Catalysts Produced at Commercial Scale: Nanoparticle Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaidle, Joshua A.; Habas, Susan E.; Baddour, Frederick G.

    Catalyst design, from idea to commercialization, requires multi-disciplinary scientific and engineering research and development over 10-20 year time periods. Historically, the identification of new or improved catalyst materials has largely been an empirical trial-and-error process. However, advances in computational capabilities (new tools and increased processing power) coupled with new synthetic techniques have started to yield rationally-designed catalysts with controlled nano-structures and tailored properties. This technological advancement represents an opportunity to accelerate the catalyst development timeline and to deliver new materials that outperform existing industrial catalysts or enable new applications, once a number of unique challenges associated with the scale-up of nano-structured materials are overcome.

  20. Multimodality hard-x-ray imaging of a chromosome with nanoscale spatial resolution

    DOE PAGES

    Yan, Hanfei; Nazaretski, Evgeny; Lauer, Kenneth R.; ...

    2016-02-05

    Here, we developed a scanning hard x-ray microscope using a new class of x-ray nano-focusing optics called a multilayer Laue lens, and imaged a chromosome with nanoscale spatial resolution. The combination of hard x-rays' superior penetration power, high sensitivity to elemental composition, high spatial resolution, and quantitative analysis creates a unique tool with capabilities that other microscopy techniques cannot provide. Using this microscope, we simultaneously obtained absorption-, phase-, and fluorescence-contrast images of Pt-stained human chromosome samples. The high spatial resolution of the microscope and its multi-modality imaging capabilities enabled us to observe the internal ultra-structures of a thick chromosome without sectioning it.
