On the next generation of reliability analysis tools
NASA Technical Reports Server (NTRS)
Babcock, Philip S., IV; Leong, Frank; Gai, Eli
1987-01-01
The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space and providing an independent validation of the system's operation. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system being analyzed; the expertise in reliability analysis techniques is supplied.
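To make the proposal concrete, the following is a minimal sketch of automatic Markov reliability-model construction from a declarative system description. The two-of-three processor architecture, the component names, and the failure rates are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: build a Markov reliability model automatically from a
# component list and a user-supplied design rule. The 2-of-3 system and
# the failure rates are illustrative assumptions only.
from itertools import product

import numpy as np
from scipy.linalg import expm

rates = {"cpu_a": 1e-4, "cpu_b": 1e-4, "cpu_c": 1e-4}  # failures per hour
names = list(rates)

def system_ok(state):
    """Top-down design rule: at least 2 of 3 components must work."""
    return sum(state) >= 2

# Enumerate failure states (1 = working) and build the generator matrix Q.
states = list(product([1, 0], repeat=len(names)))
index = {s: i for i, s in enumerate(states)}
Q = np.zeros((len(states), len(states)))
for s in states:
    for k, comp in enumerate(names):
        if s[k] == 1:  # this component can still fail
            t = list(s)
            t[k] = 0
            Q[index[s], index[tuple(t)]] += rates[comp]
    Q[index[s], index[s]] = -Q[index[s]].sum()

# Reliability at time t: probability mass remaining in operational states.
p0 = np.zeros(len(states))
p0[index[(1, 1, 1)]] = 1.0
p = p0 @ expm(Q * 1000.0)  # solve dP/dt = P Q for t = 1000 hours
print("R(1000 h) =", sum(p[index[s]] for s in states if system_ok(s)))
```

The point of the sketch is that the user supplies only the component list and the design rule; the state enumeration and generator-matrix assembly, the tedious and error-prone part, are mechanical.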
Benchmarking and Self-Assessment in the Wine Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galitsky, Christina; Radspieler, Anthony; Worrell, Ernst
2005-12-01
Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities is an important barrier to improving efficiency. Benchmarking programs in the U.S. and abroad have been shown to improve knowledge of the energy performance of industrial facilities and buildings and to fuel energy management practices. Benchmarking provides a fair way to compare the energy intensity of plants while accounting for structural differences (e.g., the mix of products produced, climate conditions) between different facilities. In California, the winemaking industry is not only one of the pillars of the economy; it is also a large energy consumer, with a considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed the first benchmarking tool for the California wine industry, called "BEST (Benchmarking and Energy and water Savings Tool) Winery". BEST Winery enables a winery to compare its energy efficiency to a best-practice reference winery. Besides overall performance, the tool enables the user to evaluate the impact of implementing efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs, and savings. The tool will raise awareness of current energy intensities and offer an efficient way to evaluate the impact of future efficiency measures.
D. Evan Mercer; Frederick W. Cubbage; Gregory E. Frey
2014-01-01
This chapter provides principles, literature and a case study about the economics of agroforestry. We examine necessary conditions for achieving efficiency in agroforestry system design and economic analysis tools for assessing efficiency and adoptability of agroforestry. The tools presented here (capital budgeting, linear programming, production frontier analysis...
New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiss, T.; Chaney, L.; Meyer, J.
Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a tool for developing more efficient A/C systems through a thorough consideration of the transient A/C system performance. The dynamic system simulation software MATLAB/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail. Comparison with measured data is provided to demonstrate the validity of the model.
Information Power Grid (IPG) Tutorial 2003
NASA Technical Reports Server (NTRS)
Meyers, George
2003-01-01
For NASA and the general community today, Grid middleware: a) provides tools to access/use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); c) is an enabler of large scale collaboration. Dynamically responding to needs is a key selling point of a grid. Independent resources can be joined as appropriate to solve a problem. The grid provides tools to enable the building of frameworks for applications, and provides value-added services to the NASA user base for utilizing resources on the grid in new and more efficient ways.
Tool to Prioritize Energy Efficiency Investments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farese, P.; Gelman, R.; Hendron, R.
2012-08-01
To provide analytic support to the U.S. Department of Energy's Building Technologies Program (BTP), NREL developed a Microsoft Excel-based tool that offers an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and the cost of achieving those savings.
Sadeghi, Samira; Sadeghi, Leyla; Tricot, Nicolas; Mathieu, Luc
2017-12-01
Accident reports are published in order to communicate the information and lessons learned from accidents. An efficient accident recording and analysis system is a necessary step towards improvement of safety. However, currently there is a shortage of efficient tools to support such recording and analysis. In this study we introduce a flexible and customizable tool that allows structuring and analysis of this information. This tool has been implemented under TEEXMA®. We named our prototype TEEXMA®SAFETY. This tool provides an information management system to facilitate data collection, organization, query, analysis and reporting of accidents. A predefined information retrieval module provides ready access to data which allows the user to quickly identify the possible hazards for specific machines and provides information on the source of hazards. The main target audience for this tool includes safety personnel, accident reporters and designers. The proposed data model has been developed by analyzing different accident reports.
Benchmarking: A Study of School and School District Effect and Efficiency.
ERIC Educational Resources Information Center
Swanson, Austin D.; Engert, Frank
The "New York State School Report Card" provides a vehicle for benchmarking with respect to student achievement. In this study, additional tools were developed for making external comparisons with respect to achievement, and tools were added for assessing fiscal policy and efficiency. Data from school years 1993-94 through 1995-96 were…
Knowledge management: An abstraction of knowledge base and database management systems
NASA Technical Reports Server (NTRS)
Riedesel, Joel D.
1990-01-01
Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.
Design handbook : energy efficiency and water conservation in NAS facilities
DOT National Transportation Integrated Search
1997-09-30
This handbook was created to provide definitive energy efficiency and water conservation design criteria for the design of NAS facilities. FAA-HDBK-001 provides implementation strategies and tools to comply with E.O. 12902, Energy and Water Conservat...
BEST Winery Guidebook: Benchmarking and Energy and Water SavingsTool for the Wine Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galitsky, Christina; Worrell, Ernst; Radspieler, Anthony
2005-10-15
Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities is an important barrier to improving efficiency. Benchmarking has been demonstrated to help energy users understand energy use and the potential for energy efficiency improvement, reducing the information barrier. In California, the winemaking industry is not only one of the pillars of the economy; it is also a large energy consumer, with a considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed an integrated benchmarking and self-assessment tool for the California wine industry called "BEST (Benchmarking and Energy and water Savings Tool) Winery". BEST Winery enables a winery to compare its energy efficiency to a best-practice winery, accounting for differences in product mix and other characteristics of the winery. The tool enables the user to evaluate the impact of implementing energy and water efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs, and savings. BEST Winery is available as a software tool in an Excel environment. This report serves as background material, documenting assumptions and information on the included energy and water efficiency measures. It also serves as a user guide for the software package.
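The benchmarking arithmetic at the heart of such a tool is simple to illustrate. In this sketch, the best-practice intensities and production volumes are made-up placeholders, not values from BEST Winery; the tool's actual reference model and measure database are documented in the report itself.

```python
# Illustrative energy-benchmarking calculation in the spirit of BEST
# Winery. All numbers are hypothetical placeholders.
best_practice = {"white_wine": 0.12, "red_wine": 0.09}     # kWh per liter
throughput = {"white_wine": 800_000, "red_wine": 400_000}  # liters per year

actual_use_kwh = 150_000
reference_kwh = sum(best_practice[p] * throughput[p] for p in throughput)

# An index above 100 means the winery uses more energy than a
# best-practice winery with the same product mix would.
eii = 100 * actual_use_kwh / reference_kwh
print(f"Energy Intensity Index: {eii:.0f}")
```

Normalizing by the facility's own product mix is what makes the comparison fair across wineries of different sizes and portfolios.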
Electronic tools for infectious diseases and microbiology
Burdette, Steven D
2007-01-01
Electronic tools for infectious diseases and medical microbiology can change the way the diagnosis and treatment of infectious diseases are approached. Medical information today can be dynamic, keeping up with the latest research or clinical issues, instead of being static and years behind, as many textbooks are. The ability to rapidly disseminate information around the world opens up the possibility of communicating with people thousands of miles away to quickly and efficiently learn about emerging infections. Electronic tools have expanded beyond the desktop computer and the Internet, and now include personal digital assistants and other portable devices such as cellular phones. These pocket-sized devices can provide access to clinical information at the point of care. New electronic tools include e-mail listservs, electronic drug databases and search engines that allow focused clinical questions. The goal of the present article is to provide an overview of how electronic tools can impact infectious diseases and microbiology, while providing links and resources to allow users to maximize their efficiency in accessing this information. Links to the mentioned Web sites and programs are provided along with other useful electronic tools. PMID:18978984
Leroy, Gondy; Xu, Jennifer; Chung, Wingyan; Eggers, Shauna; Chen, Hsinchun
2007-01-01
Retrieving sufficient relevant information online is difficult for many people because they use too few keywords to search and search engines do not provide many support tools. To further complicate the search, users often ignore support tools when available. Our goal is to evaluate in a realistic setting when users use support tools and how they perceive these tools. We compared three medical search engines with support tools that require more or less effort from users to form a query and evaluate results. We carried out an end user study with 23 users who were asked to find information, i.e., subtopics and supporting abstracts, for a given theme. We used a balanced within-subjects design and report on the effectiveness, efficiency and usability of the support tools from the end user perspective. We found significant differences in efficiency but did not find significant differences in effectiveness between the three search engines. Dynamic user support tools requiring less effort led to higher efficiency. Fewer searches were needed and more documents were found per search when both query reformulation and result review tools dynamically adjust to the user query. The query reformulation tool that provided a long list of keywords, dynamically adjusted to the user query, was used most often and led to more subtopics. As hypothesized, the dynamic result review tools were used more often and led to more subtopics than static ones. These results were corroborated by the usability questionnaires, which showed that support tools that dynamically optimize output were preferred.
Govorunova, Elena G; Sineshchekov, Oleg A; Janz, Roger; Liu, Xiaoqin; Spudich, John L
2015-08-07
Light-gated rhodopsin cation channels from chlorophyte algae have transformed neuroscience research through their use as membrane-depolarizing optogenetic tools for targeted photoactivation of neuron firing. Photosuppression of neuronal action potentials has been limited by the lack of equally efficient tools for membrane hyperpolarization. We describe anion channel rhodopsins (ACRs), a family of light-gated anion channels from cryptophyte algae that provide highly sensitive and efficient membrane hyperpolarization and neuronal silencing through light-gated chloride conduction. ACRs strictly conducted anions, completely excluding protons and larger cations, and hyperpolarized the membrane of cultured animal cells with much faster kinetics at less than one-thousandth of the light intensity required by the most efficient currently available optogenetic proteins. Natural ACRs provide optogenetic inhibition tools with unprecedented light sensitivity and temporal precision. Copyright © 2015, American Association for the Advancement of Science.
Overview of Virtual Observatory Tools
NASA Astrophysics Data System (ADS)
Allen, M. G.
2009-07-01
I provide a brief introduction and tour of selected Virtual Observatory tools to highlight some of the core functions provided by the VO, and the way that astronomers may use the tools and services for doing science. VO tools provide advanced functions for searching and using images, catalogues and spectra that have been made available in the VO. The tools may work together by providing efficient and innovative browsing and analysis of data, and I also describe how many VO services may be accessed by a scripting or command line environment. Early science usage of the VO provides important feedback on the development of the system, and I show how VO portals try to address early user comments about the navigation and use of the VO.
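As an example of the scripted access mentioned above, many VO catalogue services implement the IVOA Simple Cone Search protocol: a plain HTTP GET with right ascension, declination, and a search radius, returning results as a VOTable document. The sketch below uses a placeholder URL rather than any specific service.

```python
# Minimal sketch of command-line/scripted VO access via the IVOA Simple
# Cone Search protocol. The service URL is a hypothetical placeholder.
import requests

service = "https://example.org/conesearch"
params = {"RA": 180.0, "DEC": 2.5, "SR": 0.1}  # degrees

response = requests.get(service, params=params, timeout=30)
response.raise_for_status()
votable_xml = response.text  # results arrive as a VOTable (XML) document
print(votable_xml[:200])
```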
Computer tools for systems engineering at LaRC
NASA Technical Reports Server (NTRS)
Walters, J. Milam
1994-01-01
The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.
Post-Flight Data Analysis Tool
NASA Technical Reports Server (NTRS)
George, Marina
2018-01-01
A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.
Building Efficiency Evaluation and Uncertainty Analysis with DOE's Asset Score Preview
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
Building Energy Asset Score Tool, developed by the U.S. Department of Energy (DOE), is a program to encourage energy efficiency improvement by helping building owners and managers assess a building's energy-related systems independent of operations and maintenance. Asset Score Tool uses a simplified EnergyPlus model to provide an assessment of building systems through minimum user inputs of basic building characteristics. Asset Score Preview is a newly developed option that allows users to assess their building's systems and the potential value of a more in-depth analysis via an even more simplified approach. This methodology provides a preliminary approach to estimating a building's energy efficiency and potential for improvement. This paper provides an overview of the methodology used for the development of Asset Score Preview and the scoring methodology.
ERIC Educational Resources Information Center
Fuller, Scott; Davis, Jason
2003-01-01
The Multimedia Tool Box Talk is a web-based quick reference safety guide and training tool for construction personnel. An intended outcome of this effort was to provide an efficient and effective way to locate and interpret crucial safety information while at the job site. The tool includes information from the Occupational Safety and Health…
Provide Views | Efficient Windows Collaborative
Fundamental Aeronautics Program: Overview of Project Work in Supersonic Cruise Efficiency
NASA Technical Reports Server (NTRS)
Castner, Raymond
2011-01-01
The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2011) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.
NASA Technical Reports Server (NTRS)
Castner, Ray
2012-01-01
The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2012) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.
A tool to measure nurse efficiency and value.
Landry, M T; Landry, H T; Hebert, W
2001-07-01
Home care nurses who have multiple roles can increase their value by validating their contributions and work efficiency. This article presents a method for tracking nurse efficiency for those who are paid on an hourly basis, and provides a mechanism to document their contributions to the home care agency.
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
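To illustrate what these tools produce, the fragment below shows a toy forward model together with a hand-written adjoint of the kind a source-transformation AD tool would emit: each statement is differentiated and replayed in reverse order, accumulating sensitivities. The model is a toy, not a geophysical code.

```python
# Hand-written reverse-mode (adjoint) differentiation of a toy model,
# mimicking the output of a source-transformation AD tool.

def forward(a, b):
    u = a * b          # statement 1
    v = u + a ** 2     # statement 2
    J = v * v          # cost function
    return J, (a, b, u, v)  # saved values ("tape") needed by the adjoint

def adjoint(a, b, u, v, J_bar=1.0):
    # Statements reversed; x_bar holds dJ/dx.
    v_bar = 2.0 * v * J_bar    # from J = v * v
    u_bar = v_bar              # from v = u + a**2
    a_bar = 2.0 * a * v_bar    # a appears in statement 2 ...
    a_bar += b * u_bar         # ... and in statement 1
    b_bar = a * u_bar
    return a_bar, b_bar

J, tape = forward(3.0, 4.0)
print(adjoint(*tape))  # exact gradient (dJ/da, dJ/db) = (420.0, 126.0)
```

Operator-overloading tools obtain the same derivatives by recording operations at run time instead of generating code like this ahead of time, which is the root of the efficiency gap the study measures.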
3D TRUMP - A GBI launch window tool
NASA Astrophysics Data System (ADS)
Karels, Steven N.; Hancock, John; Matchett, Gary
3D TRUMP is a novel GPS and communications-link software analysis tool developed for the SDIO's Ground-Based Interceptor (GBI) program. It is a computationally efficient analysis tool that provides key GPS-based performance measures for an entire GBI mission's reentry vehicle and interceptor trajectories. Algorithms and sample outputs are presented.
Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs
NASA Astrophysics Data System (ADS)
Ringenburg, Michael F.
Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
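A minimal version of the online monitoring idea can be sketched as follows: audit a small random sample of approximate results against a precise recomputation, and adjust the approximation level if the observed error exceeds the quality goal. The float16 "approximation", the 1% audit rate, and the threshold are illustrative choices, not the mechanisms used in the thesis.

```python
# Sketch of low-cost online quality monitoring for approximate
# computation. Approximation style, audit rate, and threshold are
# illustrative assumptions.
import random

import numpy as np

def precise(x):
    return np.sqrt(np.float64(x)) * 2.0

def approximate(x):
    return np.sqrt(np.float16(x)) * np.float16(2.0)

errors = []
for _ in range(10_000):
    x = random.uniform(0.0, 100.0)
    y = approximate(x)
    if random.random() < 0.01:  # audit ~1% of results at full precision
        ref = precise(x)
        errors.append(abs(float(y) - ref) / max(abs(ref), 1e-12))

mean_err = sum(errors) / len(errors)
print(f"sampled mean relative error: {mean_err:.2e}")
if mean_err > 1e-2:  # quality goal violated:
    print("dial back approximation")  # e.g., switch to float32
```

Because only a sampled fraction of results is re-verified, the monitor's cost stays well below the savings from approximation, which is the constraint the thesis sets for deployable monitors.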
DOT National Transportation Integrated Search
2007-03-01
This course provides INDOT staff with foundational knowledge and skills in project management principles and methodologies. INDOT's project management processes provide the tools for interdisciplinary teams to efficiently and effectively deliver pr...
Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp
2016-11-18
ChIP-seq and related high-throughput chromatin profiling assays generate ever increasing volumes of highly valuable biological data. To make sense out of it, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory-efficiency and speed, thus allowing for processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/ .
Evaluation of the Terminal Precision Scheduling and Spacing System for Near-Term NAS Application
NASA Technical Reports Server (NTRS)
Thipphavong, Jane; Martin, Lynne Hazel; Swenson, Harry N.; Lin, Paul; Nguyen, Jimmy
2012-01-01
NASA has developed a capability for terminal area precision scheduling and spacing (TAPSS) to provide higher capacity and more efficiently manage arrivals during peak demand periods. This advanced technology is NASA's vision for the NextGen terminal metering capability. A set of human-in-the-loop experiments was conducted to evaluate the performance of the TAPSS system for near-term implementation. The experiments evaluated the TAPSS system under the current terminal routing infrastructure to validate operational feasibility. A second goal of the study was to measure the benefit of the Center and TRACON advisory tools to help prioritize the requirements for controller radar display enhancements. Simulation results indicate that using the TAPSS system provides benefits under current operations, supporting a 10% increase in airport throughput. Enhancements to Center decision support tools had limited impact on improving the efficiency of terminal operations, but did provide more fuel-efficient advisories to achieve scheduling conformance within 20 seconds. The TRACON controller decision support tools were found to provide the most benefit, by improving the precision in schedule conformance to within 20 seconds, reducing the number of arrivals having lateral path deviations by 50% and lowering subjective controller workload. Overall, the TAPSS system was found to successfully develop an achievable terminal arrival metering plan that was sustainable under heavy traffic demand levels and reduce the complexity of terminal operations when coupled with the use of the terminal controller advisory tools.
JAtlasView: a Java atlas-viewer for browsing biomedical 3D images and atlases.
Feng, Guangjie; Burton, Nick; Hill, Bill; Davidson, Duncan; Kerwin, Janet; Scott, Mark; Lindsay, Susan; Baldock, Richard
2005-03-09
Many three-dimensional (3D) images are routinely collected in biomedical research and a number of digital atlases with associated anatomical and other information have been published. A number of tools are available for viewing this data, ranging from commercial visualization packages to freely available, typically system architecture dependent, solutions. Here we discuss an atlas viewer implemented to run on any workstation using the architecture neutral Java programming language. We report the development of a freely available Java based viewer for 3D image data, describe the structure and functionality of the viewer, and explain how automated tools can be developed to manage the Java Native Interface code. The viewer allows arbitrary re-sectioning of the data and interactive browsing through the volume. With appropriately formatted data, for example as provided for the Electronic Atlas of the Developing Human Brain, a 3D surface view and anatomical browsing is available. The interface is developed in Java with Java3D providing the 3D rendering. For efficiency the image data is manipulated using the Woolz image-processing library provided as a dynamically linked module for each machine architecture. We conclude that Java provides an appropriate environment for efficient development of these tools and techniques exist to allow computationally efficient image-processing libraries to be integrated relatively easily.
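The arbitrary re-sectioning the viewer offers can be illustrated with a short fragment. JAtlasView itself does this in Java via the Woolz library; the NumPy/SciPy version below is only a sketch of the underlying resampling operation, with a synthetic volume standing in for real image data.

```python
# Sketch of arbitrary re-sectioning of a 3D volume: sample the data on
# a plane spanned by two direction vectors. Volume and plane are
# synthetic placeholders.
import numpy as np
from scipy.ndimage import map_coordinates

volume = np.random.rand(64, 64, 64)  # stand-in for 3D image data

origin = np.array([32.0, 32.0, 32.0])        # plane through the centre
u = np.array([1.0, 0.0, 0.0])                # first in-plane axis
v = np.array([0.0, 0.70710678, 0.70710678])  # oblique 45-degree axis

ii, jj = np.meshgrid(np.arange(-20, 20), np.arange(-20, 20), indexing="ij")
coords = (origin[:, None, None]
          + u[:, None, None] * ii
          + v[:, None, None] * jj)            # shape (3, 40, 40)

section = map_coordinates(volume, coords, order=1)  # trilinear resample
print(section.shape)  # a (40, 40) oblique slice through the volume
```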
Measuring Security Effectiveness and Efficiency at U.S. Commercial Airports
2013-03-01
formative program evaluation and policy analysis to investigate current airport security programs. It identifies innovative public administration and policy-analysis tools that could provide potential benefits to airport security. These tools will complement the System Based Risk Management framework if
Induction Consolidation of Thermoplastic Composites Using Smart Susceptors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsen, Marc R
2012-06-14
This project has focused on the area of energy efficient consolidation and molding of fiber reinforced thermoplastic composite components as an energy efficient alternative to the conventional processing methods such as autoclave processing. The expanding application of composite materials in wind energy, automotive, and aerospace provides an attractive energy efficiency target for process development. The intent is to have this efficient processing along with the recyclable thermoplastic materials ready for large scale application before these high production volume levels are reached. Therefore, the process can be implemented in a timely manner to realize the maximum economic, energy, and environmental efficiencies. Under this project an increased understanding of the use of induction heating with smart susceptors applied to consolidation of thermoplastics has been achieved. This was done by the establishment of processing equipment and tooling and the subsequent demonstration of this fabrication technology by consolidating/molding of entry level components for each of the participating industrial segments: wind energy, aerospace, and automotive. This understanding adds to the nation's capability to affordably manufacture high quality lightweight high performance components from advanced recyclable composite materials in a lean and energy efficient manner. The use of induction heating with smart susceptors is a precisely controlled low energy method for the consolidation and molding of thermoplastic composites. The smart susceptor provides intrinsic thermal control based on the interaction with the magnetic field from the induction coil, thereby producing highly repeatable processing. The low energy usage is enabled by the fact that only the smart susceptor surface of the tool is heated, not the entire tool. Therefore much less mass is heated, resulting in significantly less required energy to consolidate/mold the desired composite components. This energy efficiency results in potential energy savings of ~75% as compared to autoclave processing in aerospace, ~63% as compared to compression molding in automotive, and ~42% as compared to convectively heated tools in wind energy. The ability to make parts in a rapid and controlled manner provides significant economic advantages for each of the industrial segments. These attributes were demonstrated during the processing of the demonstration components on this project.
Java web tools for PCR, in silico PCR, and oligonucleotide assembly and analysis.
Kalendar, Ruslan; Lee, David; Schulman, Alan H
2011-08-01
The polymerase chain reaction is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. We have developed and tested efficient tools for PCR primer and probe design, which also predict oligonucleotide properties based on experimental studies of PCR efficiency. The tools provide comprehensive facilities for designing primers for most PCR applications and their combinations, including standard, multiplex, long-distance, inverse, real-time, unique, group-specific, bisulphite modification assays, Overlap-Extension PCR Multi-Fragment Assembly, as well as a programme to design oligonucleotide sets for long sequence assembly by ligase chain reaction. The in silico PCR primer or probe search includes comprehensive analyses of individual primers and primer pairs. It calculates the melting temperature for standard and degenerate oligonucleotides including LNA and other modifications, provides analyses for a set of primers with prediction of oligonucleotide properties, dimer and G-quadruplex detection, linguistic complexity, and provides a dilution and resuspension calculator. Copyright © 2011 Elsevier Inc. All rights reserved.
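The melting-temperature prediction mentioned here can be illustrated with two textbook estimates. These are standard published formulas, not the specific model implemented by the tool, which additionally handles degenerate bases and LNA modifications.

```python
# Two standard Tm estimates for PCR primers. These illustrate the idea;
# the published tool uses a more sophisticated model.

def tm_wallace(primer: str) -> float:
    """Wallace rule, Tm = 2(A+T) + 4(G+C), for short primers (< ~14 nt)."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def tm_gc(primer: str) -> float:
    """GC-content rule for longer oligonucleotides."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(p)

primer = "AGCGGATAACAATTTCACACAGGA"  # example 24-mer
print(f"Wallace: {tm_wallace(primer):.1f} C, GC rule: {tm_gc(primer):.1f} C")
```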
Using Learning Analytics to Support Engagement in Collaborative Writing
ERIC Educational Resources Information Center
Liu, Ming; Pardo, Abelardo; Liu, Li
2017-01-01
Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…
SUNREL Related Links | Buildings | NREL
DOE Simulation Software Tools Directory: a directory of 301 building software tools for evaluation of energy efficiency, renewable energy, and sustainability in buildings. TREAT Software Program: a computer program that uses SUNREL and is designed to provide
NASA Astrophysics Data System (ADS)
Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.
2016-12-01
Three major obstacles facing big Earth data users are data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which retrieves only the specific data requested, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.
Laing, Karen; Baumgartner, Katherine
2005-01-01
Many endoscopy units are looking for ways to improve their efficiency without increasing the number of staff, purchasing additional equipment, or making the patients feel as if they have been rushed through the care process. To accomplish this, a few hospitals have looked to other industries for help. Recently, "lean" methods and tools from the manufacturing industry have been applied successfully in health care systems and have proven to be an effective way to eliminate waste and redundancy in workplace processes. "Lean" methods and tools in service organizations focus on providing the most efficient and effective flow of services and products. This article will describe the journey of one endoscopy department within a community hospital to illustrate the application of "lean" methods and tools and the results.
Balikuddembe, Michael S; Wakholi, Peter K; Tumwesigye, Nazarius M; Tylleskär, Thorkild
2018-01-01
A third of women in childbirth are inadequately monitored, partly due to the tools used. Some stakeholders assert that the current labour monitoring tools are not efficient and need improvement to become more relevant to childbirth attendants. The study objective was to explore the expectations of maternity service providers for a mobile childbirth monitoring tool in maternity facilities in a low-income country like Uganda. Semi-structured interviews of purposively selected midwives and doctors in rural-urban childbirth facilities in Uganda were conducted before thematic data analysis. The childbirth providers expected a tool that enabled fast and secure childbirth record storage and sharing. They desired a tool that would automatically and conveniently register patient clinical findings, and actively provide interactive clinical decision support on a busy ward. The tool ought to support agreed upon standards for good pregnancy outcomes but also adaptable to the patient and their difficult working conditions. The tool functionality should include clinical data management and real-time decision support to the midwives, while the non-functional attributes include versatility and security.
Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif
2008-03-01
High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services: the first level provides tools for extracting spatiotemporal knowledge from image sets, and the second level provides high-level knowledge management and reasoning services. We then present the Cellular Imaging Markup Language, an XML-based language for modeling biological images and representing spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
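To give a flavor of what an XML representation of a spatiotemporal imaging event looks like, the fragment below builds a toy record with Python's standard library. The element and attribute names are invented for illustration; the actual schema is the one defined in the paper.

```python
# Toy XML encoding of a spatiotemporal imaging event. Element names are
# hypothetical, not the paper's actual markup vocabulary.
import xml.etree.ElementTree as ET

event = ET.Element("event", type="cell_division")
ET.SubElement(event, "object", id="cell_42")
ET.SubElement(event, "location", x="118", y="240", frame="37")
ET.SubElement(event, "interval", start="00:12:30", end="00:14:10")

print(ET.tostring(event, encoding="unicode"))
```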
Hewett, Rafe; VanCuren, Anne; Trocio, Loralee; Beaudrault, Sara; Gund, Anona; Luther, Mimi; Groom, Holly
2013-01-01
This project's objective was to enhance efforts to improve vaccine-ordering efficiencies among targeted clinics using publicly purchased vaccines. Using an assessment of ordering behavior developed by the Centers for Disease Control and Prevention, we selected and trained immunization providers and assessed improvements in ordering behavior by comparing ordering patterns before and after the intervention. The setting comprised 144 Vaccines for Children program providers in Oregon, all trained in the Economic Order Quantity process between January and November 2010. Providers were invited to participate in regional trainings. Trainings included assignment of ordering frequency and dissemination of tools to support adherence to the recommended ordering frequency. The outcome measures were the percent increase in targeted clinics ordering according to the recommended order frequency and the resulting decrease in orders placed, as an outcome of training and ordering tools. Only 35% of targeted providers were ordering according to the recommended ordering frequency before the project began. After completing training, utilizing ordering tools and ordering over a 7-month period, 78% of the targeted clinics were ordering according to the recommended frequency, a 120% increase in the number of clinics ordering with the recommended frequency. At baseline, targeted clinics placed 915 total vaccine orders over a 7-month period. After completing training and participating in the Economic Order Quantity process, only 645 orders were placed, a reduction of 30%. The initiative was successful in reducing the number of orders placed by Vaccines for Children providers in Oregon. A previous effort to reduce ordering, without the use of training or tools, did not achieve the same levels of provider compliance, suggesting that the addition of staff and development of tools were helpful in supporting behavior change and improving providers' ability to adhere to assigned order frequencies. Reducing order frequency results in more efficient vaccine ordering patterns and benefits vaccine distributors, Oregon Immunization Program staff, and provider staff.
EPA's solvent substitution software tool, PARIS III, is provided free of charge and can be effectively and efficiently used to help environmentally conscious individuals find better and greener solvent mixtures for many different common industrial processes. People can downlo...
Increasing Student Engagement through Paired Technologies
ERIC Educational Resources Information Center
Basko, Lynn; Hartman, Jillian
2017-01-01
This article highlights efficient ways to combine tech tools, such as Remind and video conferencing, to increase student engagement and faculty/student communication. Using Remind is a great way to provide information to students outside of LoudCloud, and video conferencing is a tool for having synchronous meetings and conferences with students.…
ERIC Educational Resources Information Center
Borgmeier, Chris; Horner, Robert H.
2006-01-01
Faced with limited resources, schools require tools that increase the accuracy and efficiency of functional behavioral assessment. Yarbrough and Carr (2000) provided evidence that informant confidence ratings of the likelihood of problem behavior in specific situations offered a promising tool for predicting the accuracy of function-based…
Women's Energy Tool Kit: Home Heating, Cooling and Weatherization.
ERIC Educational Resources Information Center
Byalin, Joan
This book is the first in a series of Energy Tool Kits designed for women by Consumer Action Now, a non-profit organization devoted to promoting energy efficiency and renewable energy resources. Information is provided in 16 sections: introduction, home energy survey; caulking; weatherstripping (double-hung and sliding windows, and casement,…
Pricing: A Normative Strategy in the Delivery of Human Services.
ERIC Educational Resources Information Center
Moore, Stephen T.
1995-01-01
Discusses a normative strategy toward pricing human services, which will allow providers to develop pricing strategies within the context of organizational missions, goals, and values. Pricing is an effective tool for distributing resources and improving efficiency, and can be used as a tool for encouraging desired patterns of service utilization.…
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1992-01-01
The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, resulting in streamlined exchange of data between programs, reduced errors, and improved efficiency.
The future challenge for aeropropulsion
NASA Technical Reports Server (NTRS)
Rosen, Robert; Bowditch, David N.
1992-01-01
NASA's research in aeropropulsion is focused on improving the efficiency, capability, and environmental compatibility for all classes of future aircraft. The development of innovative concepts, and theoretical, experimental, and computational tools provide the knowledge base for continued propulsion system advances. Key enabling technologies include advances in internal fluid mechanics, structures, light-weight high-strength composite materials, and advanced sensors and controls. Recent emphasis has been on the development of advanced computational tools in internal fluid mechanics, structural mechanics, reacting flows, and computational chemistry. For subsonic transport applications, very high bypass ratio turbofans with increased engine pressure ratio are being investigated to increase fuel efficiency and reduce airport noise levels. In a joint supersonic cruise propulsion program with industry, the critical environmental concerns of emissions and community noise are being addressed. NASA is also providing key technologies for the National Aerospaceplane, and is studying propulsion systems that provide the capability for aircraft to accelerate to and cruise in the Mach 4-6 speed range. The combination of fundamental, component, and focused technology development underway at NASA will make possible dramatic advances in aeropropulsion efficiency and environmental compatibility for future aeronautical vehicles.
Implementing electronic handover: interventions to improve efficiency, safety and sustainability.
Alhamid, Sharifah Munirah; Lee, Desmond Xue-Yuan; Wong, Hei Man; Chuah, Matthew Bingfeng; Wong, Yu Jun; Narasimhalu, Kaavya; Tan, Thuan Tong; Low, Su Ying
2016-10-01
Effective handovers are critical for patient care and safety. Electronic handover tools are increasingly used today to provide an effective and standardized platform for information exchange. The implementation of an electronic handover system in tertiary hospitals can be a major challenge. Previous efforts in implementing an electronic handover tool failed due to poor compliance and buy-in from end-users. A new electronic handover tool was developed and incorporated into the existing electronic medical records (EMRs) for medical patients in Singapore General Hospital (SGH). There was poor compliance by on-call doctors in acknowledging electronic handovers, and lack of adherence to safety rules, raising concerns about the safety and efficiency of the electronic handover tool. Urgent measures were needed to ensure its safe and sustained use. A quality improvement group comprising stakeholders, including end-users, developed multi-faceted interventions using rapid PDSA (Plan-Do-Study-Act) cycles to address these issues. Innovative solutions using media and online software provided cost-efficient measures to improve compliance. The percentage of unacknowledged handovers per day was used as the main outcome measure throughout all PDSA cycles. Doctors were also assessed for improvement in their knowledge of safety rules and their perception of the electronic handover tool. An electronic handover tool complementing daily clinical practice can be successfully implemented using solutions devised through close collaboration with end-users supported by the senior leadership. A combined 'bottom-up' and 'top-down' approach with regular process evaluations is crucial for its long-term sustainability. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Development of microsatellite markers in Parthenium ssp.
USDA-ARS?s Scientific Manuscript database
Molecular markers provide the most efficient means to study genetic diversity within and among species of a particular genus. In addition, molecular markers can facilitate breeding efforts by providing tools necessary to reduce the time required to obtain recombinant genotypes with improved agricu...
Preparing Effective Special Education Teachers. What Works for Special-Needs Learners Series
ERIC Educational Resources Information Center
Mamlin, Nancy
2012-01-01
What tools are in the toolkit of an excellent special educator, and how can teacher preparation programs provide these tools in the most efficient, effective way possible? This practical, clearly written book is grounded in current research and policy as well as the author's extensive experience as a teacher educator. It identifies what special…
Increasing efficiency of information dissemination and collection through the World Wide Web
Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott
2000-01-01
Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...
ERIC Educational Resources Information Center
Cann, Cynthia W.; Brumagim, Alan L.
2008-01-01
The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…
Tracking the fate of watershed nitrogen: The “N-Sink” Web Tool and Two Case Studies
This product describes the application of a web-based decision support tool, N-Sink, in two case study watersheds. N-Sink is a customized ArcMap© program that provides maps of N sources and sinks within a watershed, and estimates the delivery efficiency of N movement from sou...
NASA Technical Reports Server (NTRS)
deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher
2013-01-01
This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software, with improved flexibility and documentation. It offers flexibility in the type of data that can be stored, and efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., data from the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.
Development of optimal grinding and polishing tools for aspheric surfaces
NASA Astrophysics Data System (ADS)
Burge, James H.; Anderson, Bill; Benjamin, Scott; Cho, Myung K.; Smith, Koby Z.; Valente, Martin J.
2001-12-01
The ability to grind and polish steep aspheric surfaces to high quality is limited by the tools used for working the surface. The optician prefers to use large, stiff tools to get good natural smoothing, avoiding small scale surface errors. This is difficult for steep aspheres because the tools must have sufficient compliance to fit the aspheric surface, yet we wish the tools to be stiff so they wear down high regions on the surface. This paper presents a toolkit for designing optimal tools that provide large scale compliance to fit the aspheric surface, yet maintain small scale stiffness for efficient polishing.
Sousa, V; Matos, J P; Almeida, N; Saldanha Matos, J
2014-01-01
Operation, maintenance and rehabilitation comprise the main concerns of wastewater infrastructure asset management. Given the nature of the service provided by a wastewater system and the characteristics of the supporting infrastructure, technical issues are relevant to support asset management decisions. In particular, in densely urbanized areas served by large, complex and aging sewer networks, the sustainability of the infrastructures largely depends on the implementation of an efficient asset management system. The efficiency of such a system may be enhanced with technical decision support tools. This paper describes the role of artificial intelligence tools such as artificial neural networks and support vector machines for assisting the planning of operation and maintenance activities of wastewater infrastructures. A case study of the application of this type of tool to the wastewater infrastructures of Sistema de Saneamento da Costa do Estoril is presented.
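As a hedged illustration of how such artificial intelligence tools can support maintenance planning, the sketch below trains a support vector machine to flag sewer pipes likely to need attention. The features, the labeling rule, and the data are synthetic placeholders, not the models or variables used for the Costa do Estoril network.

```python
# Illustrative SVM-based condition classifier for sewer asset
# management. All features, labels, and data are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Columns: pipe age (years), diameter (mm), burial depth (m)
X = rng.uniform([0, 150, 1], [100, 2000, 8], size=(200, 3))
# Synthetic rule: old, shallow pipes are more likely to need maintenance.
y = ((X[:, 0] > 60) & (X[:, 2] < 4)).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
model.fit(X, y)

# Rank a candidate pipe by predicted maintenance risk.
print(model.predict_proba([[75.0, 300.0, 2.5]]))
```

In practice the training data would come from inspection records (e.g., CCTV condition grades), and the predicted risk would feed the prioritization of operation and maintenance activities.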
Promoting climate literacy through social engagement: the Green Ninja Project
NASA Astrophysics Data System (ADS)
Cordero, E. C.; Todd, A.
2012-12-01
One of the challenges of communicating climate change to younger audiences is the disconnect between global issues and local impacts. The Green Ninja is a climate-action superhero that aims to energize young people about climate science through media and social engagement tools. In this presentation, we'll highlight two of the tools designed to help K-12 students implement appropriate local mitigation strategies. A mobile phone application builds and supports a social community around taking action at local businesses regarding themes such as food, packaging and energy efficiency. An energy efficiency contest in local schools utilizes smart meter technology to provide feedback on household energy use and conservation. These tools are supported by films and lesson plans that link formal and informal education channels. The effectiveness of these methodologies as tools to engage young people in climate science and action will be discussed.
Francisco Rodríguez y Silva; Armando González-Cabán
2016-01-01
We propose an economic analysis using utility, productivity, and efficiency theories to provide fire managers a decision support tool to determine the most efficient fire management program levels. By incorporating managers' accumulated fire suppression experience (capitalized experience) in the analysis we help fire managers...
Using S-P Chart and Bloom Taxonomy to Develop Intelligent Formative Assessment Tool
ERIC Educational Resources Information Center
Chang, Wen-Chih; Yang, Hsuan-Che; Shih, Timothy K.; Chao, Louis R.
2009-01-01
E-learning provides a convenient and efficient way for learning. Formative assessment not only guides students in instruction and learning and diagnoses skill or knowledge gaps, but also measures progress and supports evaluation. An efficient and convenient e-learning formative assessment system is a key component of e-learning. However, most e-learning…
BEopt-CA (Ex): A Tool for Optimal Integration of EE, DR and PV in Existing California Homes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christensen, Craig; Horowitz, Scott; Maguire, Jeff
2014-04-01
This project targeted the development of a software tool, BEopt-CA (Ex) (Building Energy Optimization Tool for California Existing Homes), that aims to facilitate balanced integration of energy efficiency (EE), demand response (DR), and photovoltaics (PV) in the residential retrofit market. The intent is to provide utility program managers and contractors in the EE/DR/PV marketplace with a means of balancing the integration of EE, DR, and PV.
NASA Technical Reports Server (NTRS)
Peters, R. L.
1969-01-01
Improved cutting fluid completely controls the heat generated from machining operations, thus providing longer tool life. Fluid is especially useful in the working of plastics and replaces less efficient contaminating oils.
IVisTMSA: Interactive Visual Tools for Multiple Sequence Alignments.
Pervez, Muhammad Tariq; Babar, Masroor Ellahi; Nadeem, Asif; Aslam, Naeem; Naveed, Nasir; Ahmad, Sarfraz; Muhammad, Shah; Qadri, Salman; Shahid, Muhammad; Hussain, Tanveer; Javed, Maryam
2015-01-01
IVisTMSA is a software package of seven graphical tools for multiple sequence alignments. MSApad is an editing and analysis tool. It can load 409% more data than Jalview, STRAP, CINEMA, and Base-by-Base. MSA comparator allows the user to visualize consistent and inconsistent regions of reference and test alignments of more than 21-MB size in less than 12 seconds. MSA comparator is 5,200% and more than 40% more efficient than the BAliBASE C program and FastSP, respectively. MSA reconstruction tool provides graphical user interfaces for four popular aligners and allows the user to load several sequence files at a time. FASTA generator converts seven formats of alignments of unlimited size into FASTA format in a few seconds. MSA ID calculator calculates the identity matrix of more than 11,000 sequences with a sequence length of 2,696 base pairs in less than 100 seconds. Tree and Distance Matrix calculation tools generate a phylogenetic tree and a distance matrix, respectively, using neighbor joining, percent identity, and the BLOSUM 62 matrix.
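For context on what an identity-matrix calculation involves, here is a minimal pure-Python sketch (not IVisTMSA's implementation) that computes percent identity over gap-free columns of an alignment:

```python
from itertools import combinations

def percent_identity(a, b):
    """Pairwise % identity over aligned columns where neither sequence has a gap."""
    cols = [(x, y) for x, y in zip(a, b) if x != '-' and y != '-']
    if not cols:
        return 0.0
    return 100.0 * sum(x == y for x, y in cols) / len(cols)

def identity_matrix(seqs):
    """Symmetric identity matrix for a dict {name: aligned_sequence}."""
    names = list(seqs)
    m = {n1: {n2: 100.0 for n2 in names} for n1 in names}
    for n1, n2 in combinations(names, 2):
        m[n1][n2] = m[n2][n1] = percent_identity(seqs[n1], seqs[n2])
    return m

aln = {"s1": "ACGT-ACGT", "s2": "ACGTTACGA", "s3": "ACCT-ACGT"}
print(identity_matrix(aln))
```

The quadratic number of pairs is what makes the 11,000-sequence benchmark above nontrivial.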
The Integrated Waste Tracking System - A Flexible Waste Management Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Robert Stephen
2001-02-01
The US Department of Energy (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) has fully embraced a flexible, computer-based tool to help increase waste management efficiency and integrate multiple operational functions from waste generation through waste disposition while reducing cost. The Integrated Waste Tracking System (IWTS) provides comprehensive information management for containerized waste during generation, storage, treatment, transport, and disposal. The IWTS provides all information necessary for facilities to properly manage and demonstrate regulatory compliance. As a platform-independent, client-server and Web-based inventory and compliance system, the IWTS has proven to be a successful tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of management flexibility.
Tools for visually exploring biological networks.
Suderman, Matthew; Hallett, Michael
2007-10-15
Many tools exist for visually exploring biological networks including well-known examples such as Cytoscape, VisANT, Pathway Studio and Patika. These systems play a key role in the development of integrative biology, systems biology and integrative bioinformatics. The trend in the development of these tools is to go beyond 'static' representations of cellular state, towards a more dynamic model of cellular processes through the incorporation of gene expression data, subcellular localization information and time-dependent behavior. We provide a comprehensive review of the relative advantages and disadvantages of existing systems with two goals in mind: to aid researchers in efficiently identifying the appropriate existing tools for data visualization; to describe the necessary and realistic goals for the next generation of visualization tools. In view of the first goal, we provide in the Supplementary Material a systematic comparison of more than 35 existing tools in terms of over 25 different features. Supplementary data are available at Bioinformatics online.
White, David B.
1991-01-01
An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.
Program audit, A management tool
NASA Technical Reports Server (NTRS)
Miller, T. J.
1971-01-01
Program gives in-depth view of organizational performance at all levels of the management structure, and provides means by which managers can effectively and efficiently evaluate adequacy of management direction, policies, and procedures.
Google-Earth Based Visualizations for Environmental Flows and Pollutant Dispersion in Urban Areas
Liu, Daoming; Kenjeres, Sasa
2017-01-01
In the present study, we address the development and application of an efficient tool for converting results obtained by an integrated computational fluid dynamics (CFD) and computational reaction dynamics (CRD) approach and visualizing them in Google Earth. We focus on results typical for environmental fluid mechanics studies at a city scale that include characteristic wind flow patterns and dispersion of reactive scalars. This is achieved by developing a Java-based code that converts the typical four-dimensional structure (spatial and temporal dependency) of the results into the Keyhole Markup Language (KML) format. The visualization techniques most often used are revisited and implemented into the conversion tool. The potential of the tool is demonstrated in a case study of smog formation due to intense traffic emission in Rotterdam (The Netherlands). It is shown that Google Earth can provide a computationally efficient and user-friendly means of data representation. This feature can be very useful for visualization of pollution at street level, which is of great importance for city residents. Various meteorological and traffic emission scenarios can be easily visualized and analyzed, providing a powerful, user-friendly tool for traffic regulation and urban climate adaptation. PMID:28257078
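A minimal sketch of the kind of conversion described, assuming a flat list of sampled values; the field names and layout are illustrative, not the paper's actual data model:

```python
# Emit a time-stamped KML placemark per sample point; Google Earth's time
# slider then animates the sequence. Tags used are standard KML 2.2.
def to_kml(samples):
    """samples: iterable of (lon, lat, alt_m, iso_time, value) tuples."""
    placemarks = "\n".join(
        "  <Placemark>\n"
        f"    <TimeStamp><when>{t}</when></TimeStamp>\n"
        f"    <description>concentration = {v}</description>\n"
        "    <Point>\n"
        "      <altitudeMode>relativeToGround</altitudeMode>\n"
        f"      <coordinates>{lon},{lat},{alt}</coordinates>\n"
        "    </Point>\n"
        "  </Placemark>"
        for lon, lat, alt, t, v in samples)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            '<Document>\n' + placemarks + '\n</Document>\n</kml>')

print(to_kml([(4.47, 51.92, 10.0, "2017-01-01T12:00:00Z", 0.42)]))
```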
OPPL-Galaxy, a Galaxy tool for enhancing ontology exploitation as part of bioinformatics workflows
2013-01-01
Background Biomedical ontologies are key elements for building up the Life Sciences Semantic Web. Reusing and building biomedical ontologies requires flexible and versatile tools to manipulate them efficiently, in particular for enriching their axiomatic content. The Ontology Pre Processor Language (OPPL) is an OWL-based language for automating the changes to be performed in an ontology. OPPL augments the ontologists’ toolbox by providing a more efficient, and less error-prone, mechanism for enriching a biomedical ontology than that obtained by a manual treatment. Results We present OPPL-Galaxy, a wrapper for using OPPL within Galaxy. The functionality delivered by OPPL (i.e. automated ontology manipulation) can be combined with the tools and workflows devised within the Galaxy framework, resulting in an enhancement of OPPL. Use cases are provided in order to demonstrate OPPL-Galaxy’s capability for enriching, modifying and querying biomedical ontologies. Conclusions Coupling OPPL-Galaxy with other bioinformatics tools of the Galaxy framework results in a system that is more than the sum of its parts. OPPL-Galaxy opens a new dimension of analyses and exploitation of biomedical ontologies, including automated reasoning, paving the way towards advanced biological data analyses. PMID:23286517
An efficient framework for Java data processing systems in HPC environments
NASA Astrophysics Data System (ADS)
Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül
2011-11-01
Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).
Field Assessment of Energy Audit Tools for Retrofit Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, J.; Bohac, D.; Nelson, C.
2013-07-01
This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home's asset performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Home rating systems can help motivate homeowners in several ways. Ratings can clearly communicate a home's achievable energy efficiency potential, provide a quantitative assessment of energy savings after retrofits are completed, and show homeowners how they rate compared to their neighbors, thus creating an incentive to conform to a social standard. An important consideration is how rating tools for the retrofit market will integrate with existing home energy service programs. For residential programs that target energy savings only, home visits should be focused on key efficiency measures for that home. In order to gain wide adoption, a rating tool must be easily integrated into the field process, demonstrate consistency and reasonable accuracy to earn the trust of home energy technicians, and have a low monetary cost and time hurdle for homeowners. Along with the Home Energy Score, this project also evaluated the energy modeling performance of SIMPLE and REM/Rate.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
..., preferences, or experiences of customers or other stakeholders relating to existing or future services or... effective, efficient, and satisfying experience with the Agency's programs. This feedback will provide insights into customer or stakeholder perceptions, experiences and expectations, provide an early warning...
Insect transformation with piggyBac: getting the number of injections just right
Morrison, N. I.; Shimeld, S. M.
2016-01-01
The insertion of exogenous genetic cargo into insects using transposable elements is a powerful research tool with potential applications in meeting food security and public health challenges facing humanity. piggyBac is the transposable element most commonly utilized for insect germline transformation. The described efficiency of this process is variable in the published literature, and a comprehensive review of transformation efficiency in insects is lacking. This study compared and contrasted all available published data with a comprehensive data set provided by a biotechnology group specializing in insect transformation. Based on analysis of these data, with particular focus on the more complete observational data from the biotechnology group, we designed a decision tool to aid researchers' decision-making when using piggyBac to transform insects by microinjection. A combination of statistical techniques was used to define appropriate summary statistics of piggyBac transformation efficiency by species and insect order. Publication bias was assessed by comparing the data sets. The bias was assessed using strategies co-opted from the medical literature. The work culminated in building the Goldilocks decision tool, a Markov-Chain Monte-Carlo simulation operated via a graphical interface and providing guidance on best practice for those seeking to transform insects using piggyBac. PMID:27027400
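As a toy version of the statistical machinery behind such a tool, the following sketch (assuming a binomial model with a uniform prior; not the Goldilocks tool's actual model) draws posterior samples of transformation efficiency with a simple Metropolis sampler:

```python
import math, random

def metropolis_efficiency(k, n, steps=20000, step_size=0.05, seed=1):
    """Posterior draws of transformation efficiency p for k transformants out
    of n injections (binomial likelihood, uniform prior); toy sampler only."""
    rng = random.Random(seed)
    def log_post(p):
        if not 0.0 < p < 1.0:
            return -math.inf
        return k * math.log(p) + (n - k) * math.log(1.0 - p)
    p, lp, draws = 0.5, log_post(0.5), []
    for _ in range(steps):
        q = p + rng.gauss(0.0, step_size)   # random-walk proposal
        lq = log_post(q)
        if math.log(rng.random()) < lq - lp:  # Metropolis accept/reject
            p, lp = q, lq
        draws.append(p)
    return draws[steps // 2:]  # discard burn-in

draws = metropolis_efficiency(k=12, n=300)
print(sum(draws) / len(draws))  # posterior mean, near 12/300
```

The posterior spread, not just the point estimate, is what lets such a tool say how many injections are "just right".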
C3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs
NASA Astrophysics Data System (ADS)
Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio
2017-02-01
Modern Astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence, the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the era of the petabyte scale. Furthermore, multi-band data have often very different angular resolution, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing the maximum flexibility to the end-user, in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data, extracted from public surveys, we discuss the cross-matching capabilities and computing time efficiency also through a direct comparison with some publicly available tools, chosen among the most used within the community, and representative of different interface paradigms. We verified that the C3 tool has excellent capabilities to perform an efficient and reliable cross-matching between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C3 competitive in the context of public astronomical tools.
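For comparison, a basic circular cross-match can be written in a few lines with astropy; this nearest-neighbour sketch illustrates the task but none of C3's elliptical-region or parallel features:

```python
import numpy as np
from astropy import units as u
from astropy.coordinates import SkyCoord

ra1, dec1 = np.array([10.68, 83.82]), np.array([41.27, -5.39])
ra2, dec2 = np.array([10.6805, 150.0]), np.array([41.2695, 2.2])

cat1 = SkyCoord(ra=ra1 * u.deg, dec=dec1 * u.deg)
cat2 = SkyCoord(ra=ra2 * u.deg, dec=dec2 * u.deg)

idx, sep, _ = cat1.match_to_catalog_sky(cat2)  # nearest neighbour in cat2
matched = sep < 2.0 * u.arcsec                 # keep pairs closer than 2"
for i, (j, ok) in enumerate(zip(idx, matched)):
    print(i, j if ok else None, sep[i].to(u.arcsec))
```

Generalizing the acceptance region from a circle to a per-source ellipse with arbitrary orientation is precisely where tools like C3 add value.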
CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool
del Sol Keyer, Maria; Wittbrodt, Joachim; Mateo, Juan L.
2015-01-01
Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5' end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, an efficient and reliable in silico selection and evaluation of the targeting site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation, non-homologous end-joining as well as homology-directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high quality target sites. PMID:25909470
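The core of candidate-site identification is a PAM-adjacent sequence scan. A minimal sketch for SpCas9 NGG sites on the forward strand (off-target scoring and reverse-strand search, which CCTop also performs, are omitted):

```python
import re

def candidate_targets(seq):
    """Yield (position, protospacer, pam) for 20-nt protospacers followed by
    an NGG PAM on the forward strand; lookahead allows overlapping sites."""
    for m in re.finditer(r'(?=([ACGT]{20})([ACGT]GG))', seq.upper()):
        yield m.start(), m.group(1), m.group(2)

seq = "TTGACCTAGCTAGCTAGGATCCAGCTTGGAATTCCGGAGGTACCTT"
for pos, proto, pam in candidate_targets(seq):
    print(pos, proto, pam)
```

Ranking these candidates by predicted off-target behaviour genome-wide is the hard part that dedicated tools address.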
Jian, Bo; Hou, Wensheng; Wu, Cunxiang; Liu, Bin; Liu, Wei; Song, Shikui; Bi, Yurong; Han, Tianfu
2009-06-25
Transgenic approaches provide a powerful tool for gene function investigations in plants. However, some legumes are still recalcitrant to current transformation technologies, limiting the extent to which functional genomic studies can be performed. Superroot of Lotus corniculatus is a continuous root cloning system allowing direct somatic embryogenesis and mass regeneration of plants. Recently, a technique to obtain transgenic L. corniculatus plants from Superroot-derived leaves through A. tumefaciens-mediated transformation was described. However, transformation efficiency was low and it took about six months from gene transfer to PCR identification. In the present study, we developed an A. rhizogenes-mediated transformation of Superroot-derived L. corniculatus for gene function investigation, combining the efficient A. rhizogenes-mediated transformation and the rapid regeneration system of Superroot. The transformation system using A. rhizogenes K599 harbouring pGFPGUSPlus was improved by validating some parameters which may influence the transformation frequency. Using stem sections with one node as explants, a 2-day pre-culture of explants, infection with K599 at OD(600) = 0.6, and co-cultivation on medium (pH 5.4) at 22 degrees C for 2 days enhanced the transformation frequency significantly. As proof of concept, Superroot-derived L. corniculatus was transformed with a gene from wheat encoding an Na+/H+ antiporter (TaNHX2) using the described system. Transgenic Superroot plants were obtained and had increased salt tolerance, as expected from the expression of TaNHX2. A rapid and efficient tool for gene function investigation in L. corniculatus was developed, combining the simplicity and high efficiency of the Superroot regeneration system and the availability of A. rhizogenes-mediated transformation. This system was improved by validating some parameters influencing the transformation frequency, which could reach 92% based on GUS detection. The combination of the highly efficient transformation and the regeneration system of Superroot provides a valuable tool for functional genomics studies in L. corniculatus.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, Alberta; Mann, Margaret; Gelman, Rachel
In evaluating next-generation materials and processes, the supply chain can have a large effect on life cycle energy impacts. The Materials Flow through Industry (MFI) tool was developed for the Department of Energy's Advanced Manufacturing Office to evaluate the energy impacts of the U.S. supply chain. The tool allows users to perform process comparisons, material substitutions, and grid modifications, and to see the effects of implementing sector efficiency potentials (Masanet et al. 2009). This paper reviews the methodology of the tool and provides results for specific scenarios.
Leveraging business intelligence to make better decisions: Part I.
Reimers, Mona
2014-01-01
Data is the new currency. Business intelligence tools will provide better-performing practices with a competitive intelligence advantage that will separate the high performers from the rest of the pack. Given the investments of time and money in our data systems, practice leaders must work to take every advantage and look at the datasets as a potential goldmine of business intelligence decision tools. A fresh look at decision tools created from practice data will create efficiencies and improve effectiveness for end-users and managers.
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Wunderlich, Dana A.; Willoughby, John K.
1992-01-01
New and innovative software technology is presented that provides a cost effective bridge for smoothly transitioning prototype software, in the field of planning and scheduling, into an operational environment. Specifically, this technology mixes the flexibility and human design efficiency of dynamic data typing with the rigor and run-time efficiencies of static data typing. This new technology provides a very valuable tool for conducting the extensive, up-front system prototyping that leads to specifying the correct system and producing a reliable, efficient version that will be operationally effective and will be accepted by the intended users.
Spectral analysis for GNSS coordinate time series using chirp Fourier transform
NASA Astrophysics Data System (ADS)
Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan
2017-12-01
Spectral analysis of global navigation satellite system (GNSS) coordinate time series provides a principal tool for understanding the intrinsic mechanisms that affect tectonic movements. Spectral analysis methods such as the fast Fourier transform, Lomb-Scargle spectrum, evolutionary power spectrum, wavelet power spectrum, etc. are used to find periodic characteristics in time series. Among these methods, the chirp Fourier transform (CFT), which has less stringent requirements, is tested with synthetic and actual GNSS coordinate time series, demonstrating the accuracy and efficiency of the method. With the series length limited only to even numbers, CFT provides a convenient tool for windowed spectral analysis. The results on ideal synthetic data prove CFT accurate and efficient, while the results on actual data show that CFT can derive periodic information from GNSS coordinate time series.
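A plain numpy stand-in for the frequency zooming that makes CFT attractive here: evaluating the DFT directly on a narrow band of frequencies around the periods of interest, demonstrated on a synthetic daily series with annual and semi-annual terms:

```python
import numpy as np

def zoom_dft(x, fs, f0, f1, m):
    """Direct evaluation of the DFT on m frequencies in [f0, f1] (Hz-like
    units); a simple stand-in for the chirp transform's frequency zooming."""
    n = np.arange(len(x))
    freqs = np.linspace(f0, f1, m)
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
    return freqs, kernel @ x

# Daily GNSS-like series: annual plus semi-annual signal with noise.
fs = 1.0                       # samples per day
t = np.arange(3650)
x = (3.0 * np.sin(2 * np.pi * t / 365.25)
     + 1.2 * np.sin(2 * np.pi * t / 182.6)
     + np.random.default_rng(0).normal(0, 0.5, t.size))

freqs, X = zoom_dft(x, fs, 1/500, 1/100, 400)  # zoom on 100-500 day periods
print(1.0 / freqs[np.abs(X).argmax()])          # ~365 days
```

The direct evaluation costs O(N·m) but resolves the band of interest far more finely than a length-N FFT bin spacing would.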
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.; Olariu, Stephen
1995-01-01
The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes. This results in streamlined data exchange between programs, reducing errors and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.
ERIC Educational Resources Information Center
Joyner, Amy
2003-01-01
Handheld computers provide students tremendous computing and learning power at about a tenth of the cost of a regular computer. Describes the evolution of handhelds; provides some examples of their uses; and cites research indicating they are effective classroom tools that can improve efficiency and instruction. A sidebar lists handheld resources.…
Educators Beware: Avoiding the Scams
ERIC Educational Resources Information Center
DuBoff, Leonard D.; King, Christy O.
2009-01-01
The technology boom has provided opportunities for practitioners to work more efficiently by making available a host of timesaving, as well as cost-effective, tools. Sadly, modern technology has also provided the less scrupulous segments of society with opportunities to take advantage of others. All too familiar are the Nigerian money scams that…
Processing MPI Datatypes Outside MPI
NASA Astrophysics Data System (ADS)
Ross, Robert; Latham, Robert; Gropp, William; Lusk, Ewing; Thakur, Rajeev
The MPI datatype functionality provides a powerful tool for describing structured memory and file regions in parallel applications, enabling noncontiguous data to be operated on by MPI communication and I/O routines. However, no facilities are provided by the MPI standard to allow users to efficiently manipulate MPI datatypes in their own codes.
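A brief mpi4py sketch of the datatype functionality being discussed: a vector datatype describing every other element of a buffer, moved in a single call (run with mpiexec -n 2; the buffer contents are illustrative):

```python
# Run with: mpiexec -n 2 python this_script.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# 8 doubles; describe the 4 at even indices (blocklength 1, stride 2).
vtype = MPI.DOUBLE.Create_vector(count=4, blocklength=1, stride=2)
vtype.Commit()

buf = np.arange(8, dtype='d') if rank == 0 else np.zeros(8, dtype='d')
if rank == 0:
    comm.Send([buf, 1, vtype], dest=1, tag=0)
elif rank == 1:
    comm.Recv([buf, 1, vtype], source=0, tag=0)
    print(buf)  # even slots filled: [0. 0. 2. 0. 4. 0. 6. 0.]

vtype.Free()
```

Manipulating such a datatype's layout outside MPI's own calls is exactly the facility the paper notes the standard does not provide.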
Optimal designs for copula models
Perrone, E.; Müller, W.G.
2016-01-01
Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments, particularly whether the estimation of copula parameters can be enhanced by optimizing experimental conditions and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616
Intelligent control system based on ARM for lithography tool
NASA Astrophysics Data System (ADS)
Chen, Changlong; Tang, Xiaoping; Hu, Song; Wang, Nan
2014-08-01
The control system of a traditional lithography tool is based on a PC and an MCU. The PC handles the complex algorithms and human-computer interaction, and communicates with the MCU via a serial port; the MCU controls motors and electromagnetic valves, etc. This mode has shortcomings such as large volume, high power consumption, and wasted PC resources. In this paper, an embedded intelligent control system for a lithography tool, based on ARM, is presented. The control system uses an S5PV210 as its processor, performing the functions of the PC in a traditional lithography tool, and provides good human-computer interaction through an LCD and a capacitive touch screen. Using Android 4.0.3 as the operating system, the equipment provides a clean and easy UI which makes the control more user-friendly, and implements remote control and debugging, pushing product video information over the network. As a result, it is convenient for the equipment vendor to provide technical support for users. Finally, compared with a traditional lithography tool, this design removes the PC, making efficient use of the hardware resources and reducing cost and volume. Introducing an embedded OS and concepts from the Internet of Things into the design of lithography tools can be a development trend.
CoCoNUT: an efficient system for the comparison and analysis of genomes
2008-01-01
Background Comparative genomics is the analysis and comparison of genomes from different species. This area of research is driven by the large number of sequenced genomes and heavily relies on efficient algorithms and software to perform pairwise and multiple genome comparisons. Results Most of the software tools available are tailored for one specific task. In contrast, we have developed a novel system CoCoNUT (Computational Comparative geNomics Utility Toolkit) that allows solving several different tasks in a unified framework: (1) finding regions of high similarity among multiple genomic sequences and aligning them, (2) comparing two draft or multi-chromosomal genomes, (3) locating large segmental duplications in large genomic sequences, and (4) mapping cDNA/EST to genomic sequences. Conclusion CoCoNUT is competitive with other software tools w.r.t. the quality of the results. The use of state of the art algorithms and data structures allows CoCoNUT to solve comparative genomics tasks more efficiently than previous tools. With the improved user interface (including an interactive visualization component), CoCoNUT provides a unified, versatile, and easy-to-use software tool for large scale studies in comparative genomics. PMID:19014477
War Gamers Handbook: A Guide for Professional War Gamers
2015-11-01
more complex games led us to integrate knowledge management, web tools, and multitouch, multiuser technologies in order to more efficiently and... ◊ Multitouch, multiuser (MTMU) and communications operating picture (COP) interfaces ◊ Web development: web tools and player interfaces. Now that the game... hurricane or flood scenario to provide a plausible backdrop to facilitate player interaction toward game objectives. Scenarios should include only the...
Recipe for Success: Digital Viewables
NASA Technical Reports Server (NTRS)
LaPha, Steven; Gaydos, Frank
2014-01-01
The Engineering Services Contract (ESC) and Information Management Communication Support (IMCS) contract at Kennedy Space Center (KSC) provide services to NASA for flight and ground systems design and development. These groups provide the necessary tools, aid, and best-practice methodologies required for efficient, optimized design and process development. The team is responsible for configuring and implementing systems and software, along with training, documentation, and standards administration. The team supports over 200 engineers and design specialists in the use of Windchill, Creo Parametric, NX, AutoCAD, and a variety of other design and analysis tools.
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Granat, Robert A.; Norton, Charles D.; Rundle, John B.; Pierce, Marlon E.; Fox, Geoffrey C.; McLeod, Dennis; Ludwig, Lisa Grant
2012-01-01
QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually, and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based and visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle, but important, features in large data sets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system, and are accessible by users or various model applications. UAVSAR repeat pass interferometry data products are added to the QuakeTables database, and are available through a browseable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables, or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes, and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. Tools provide flexibility to science users for exploring data in new ways through download links, but also facilitate standard, intuitive, and routine uses for science users and end users such as emergency responders.
Circuit design tool. User's manual, revision 2
NASA Technical Reports Server (NTRS)
Miyake, Keith M.; Smith, Donald E.
1992-01-01
The CAM chip design was produced in a UNIX software environment using a design tool that supports definition of digital electronic modules, composition of these modules into higher level circuits, and event-driven simulation of these circuits. Our design tool provides an interface whose goals include straightforward but flexible primitive module definition and circuit composition, efficient simulation, and a debugging environment that facilitates design verification and alteration. The tool provides a set of primitive modules which can be composed into higher level circuits. Each module is a C-language subroutine that uses a set of interface protocols understood by the design tool. Primitives can be altered simply by recoding their C-code image; in addition new primitives can be added allowing higher level circuits to be described in C-code rather than as a composition of primitive modules--this feature can greatly enhance the speed of simulation.
Proteomic analyses of the environmental toxicity of carcinogenic chemicals
Protein expression and posttranslational modifications consistently change in response to exposure to environmental chemicals. Recent technological advances in proteomics provide new tools for more efficient characterization of protein expression and posttranslational modific...
Using COPE To Improve Quality of Care: The Experience of the Family Planning Association of Kenya.
ERIC Educational Resources Information Center
Bradley, Janet
1998-01-01
COPE (Client-Oriented, Provider-Efficient) methodology, a self-assessment tool that has been used in 35 countries around the world, was used to improve the quality of care in family planning clinics in Kenya. COPE involves a process that legitimately invests power with providers and clinic-level staff. It gives providers more control over their…
RTU Comparison Calculator Enhancement Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, James D.; Wang, Weimin; Katipamula, Srinivas
Over the past two years, the Department of Energy's Building Technologies Office (BTO) has been investigating ways to increase the operating efficiency of packaged rooftop units (RTUs) in the field. First, by issuing a challenge to RTU manufacturers to increase the integrated energy efficiency ratio (IEER) by 60% over the existing ASHRAE 90.1-2010 standard. Second, by evaluating the performance of an advanced RTU controller that reduces energy consumption by over 40%. BTO has previously also funded development of an RTU comparison calculator (RTUCC). RTUCC is a web-based tool that provides the user a way to compare energy and cost savings for two units with different efficiencies. However, the RTUCC currently cannot compare savings associated with either the RTU Challenge unit or the advanced RTU controls retrofit. Therefore, BTO has asked PNNL to enhance the tool so building owners can compare energy and savings associated with this new class of products. This document provides the details of the enhancements that are required to support estimating energy savings from use of RTU challenge units or advanced controls on existing RTUs.
Conditions Database for the Belle II Experiment
NASA Astrophysics Data System (ADS)
Wood, L.; Elsethagen, T.; Schram, M.; Stephan, E.
2017-10-01
The Belle II experiment at KEK is preparing for first collisions in 2017. Processing the large amounts of data that will be produced will require conditions data to be readily available to systems worldwide in a fast and efficient manner that is straightforward for both the user and maintainer. The Belle II conditions database was designed with a straightforward goal: make it as easily maintainable as possible. To this end, HEP-specific software tools were avoided as much as possible and industry standard tools used instead. HTTP REST services were selected as the application interface, which provide a high-level interface to users through the use of standard libraries such as curl. The application interface itself is written in Java and runs in an embedded Payara-Micro Java EE application server. Scalability at the application interface is provided by use of Hazelcast, an open source In-Memory Data Grid (IMDG) providing distributed in-memory computing and supporting the creation and clustering of new application interface instances as demand increases. The IMDG provides fast and efficient access to conditions data via in-memory caching.
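Client access would look like an ordinary HTTP GET; the endpoint path and parameter names below are hypothetical stand-ins, not the Belle II service's actual API:

```python
# Hedged sketch of reading conditions data over an HTTP REST interface; the
# base URL, path, and query parameters are invented for illustration only.
import requests

BASE = "https://conditions.example.org/rest"   # hypothetical host

def get_payload(global_tag, run):
    resp = requests.get(f"{BASE}/payloads",
                        params={"gtName": global_tag, "run": run},
                        timeout=10)
    resp.raise_for_status()
    return resp.json()

print(get_payload("release-01", 1234))
```

Because the interface is plain HTTP, the same request works from curl, a batch job, or any language with an HTTP client, which is the portability argument made above.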
ADAM: analysis of discrete models of biological systems using computer algebra.
Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard
2011-07-20
Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
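To illustrate the underlying idea on a toy scale: over F2, Boolean rules become polynomials and steady states are solutions of f_i(x) = x_i. The sketch below finds them by exhaustive search, the step ADAM replaces with computer-algebra solving (the three-node network is made up for illustration):

```python
from itertools import product

# Made-up three-node network; over F2, AND is x*y and OR is x + y + x*y,
# so every update rule is a polynomial and steady states solve f_i(x) = x_i.
def step(state):
    x1, x2, x3 = state
    return (x2 & x3,   # f1 = x2*x3
            x1 | x3,   # f2 = x1 + x3 + x1*x3 (mod 2)
            x1)        # f3 = x1

# Exhaustive search over all 2^3 states; algebraic solving avoids this
# exponential enumeration for the sparse systems typical in biology.
fixed_points = [s for s in product((0, 1), repeat=3) if step(s) == s]
print(fixed_points)  # [(0, 0, 0), (1, 1, 1)]
```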
ParCAT: A Parallel Climate Analysis Toolkit
NASA Astrophysics Data System (ADS)
Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.
2012-12-01
Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10's, 100's and even 1000's of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather, reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation and a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
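A serial sketch of the kind of reduction ParCAT parallelizes: a per-cell time mean and a difference between two runs (file paths and the variable name "TSA" are assumptions):

```python
import numpy as np
from netCDF4 import Dataset

def time_mean(path, var):
    """Mean over the time axis of a (time, lat, lon) variable."""
    with Dataset(path) as nc:
        return np.asarray(nc.variables[var][:]).mean(axis=0)

# Difference of time means between two runs, written to a flat CSV.
mean_a = time_mean("run_a.nc", "TSA")
mean_b = time_mean("run_b.nc", "TSA")
np.savetxt("tsa_diff.csv", mean_a - mean_b, delimiter=",")
```

In ParCAT the read and reduction are distributed via parallel-netcdf across ranks, which is what turns a multi-hour serial diagnostic into minutes.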
Energy-Saving Opportunities for Manufacturing Enterprises (International English Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This fact sheet provides information about the Industrial Technologies Program Save Energy Now energy audit process, software tools, training, energy management standards, and energy efficient technologies to help U.S. companies identify energy cost savings.
Use of Electronic Health Record Tools to Facilitate and Audit Infliximab Prescribing.
Sharpless, Bethany R; Del Rosario, Fernando; Molle-Rios, Zarela; Hilmas, Elora
2018-01-01
The objective of this project was to assess a pediatric institution's use of infliximab and develop and evaluate electronic health record tools to improve safety and efficiency of infliximab ordering through auditing and improved communication. Best use of infliximab was defined through a literature review, analysis of baseline use of infliximab at our institution, and distribution and analysis of a national survey. Auditing and order communication were optimized through implementation of mandatory indications in the infliximab orderable and creation of an interactive flowsheet that collects discrete and free-text data. The value of the implemented electronic health record tools was assessed at the conclusion of the project. Baseline analysis determined that 93.8% of orders were dosed appropriately according to the findings of a literature review. After implementation of the flowsheet and indications, the time to perform an audit of use was reduced from 60 minutes to 5 minutes per month. Four months post implementation, data were entered by 60% of the pediatric gastroenterologists at our institution on 15.3% of all encounters for infliximab. Users were surveyed on the value of the tools, with 100% planning to continue using the workflow, and 82% stating the tools frequently improve the efficiency and safety of infliximab prescribing. Creation of a standard workflow by using an interactive flowsheet has improved auditing ability and facilitated the communication of important order information surrounding infliximab. Providers and pharmacists feel these tools improve the safety and efficiency of infliximab ordering, and auditing data reveal that the tools are being used.
Engineering With Nature Geographic Project Mapping Tool (EWN ProMap)
2015-07-01
EWN ProMap database provides numerous case studies for infrastructure projects such as breakwaters, river engineering dikes, and seawalls that have... the EWN Project Mapping Tool (EWN ProMap) is to assist users in their search for case study information that can be valuable for developing EWN ideas... Essential elements of EWN include: (1) using science and engineering to produce operational efficiencies supporting sustainable delivery of...
Application of the gene editing tool, CRISPR-Cas9, for treating neurodegenerative diseases.
Kolli, Nivya; Lu, Ming; Maiti, Panchanan; Rossignol, Julien; Dunbar, Gary L
2018-01-01
Increased accumulation of transcribed protein from damaged DNA and reduced DNA repair capability contribute to numerous neurological diseases for which effective treatments are lacking. Gene editing techniques provide new hope for replacing defective genes and DNA associated with neurological diseases. With advancements in such editing tools as zinc finger nucleases (ZFNs), meganucleases, and transcription activator-like effector nucleases (TALENs), scientists are able to design DNA-binding proteins which can make precise double-strand breaks (DSBs) at the target DNA. Recent developments with the CRISPR-Cas9 gene-editing technology have proven to be more precise and efficient than most other gene-editing techniques. Two repair pathways, non-homologous end joining (NHEJ) and homology-directed repair (HDR), are exploited in the CRISPR-Cas9 system to efficiently excise defective genes and incorporate exogenous DNA at the target site. In this review article, we provide an overview of the CRISPR-Cas9 methodology, including its molecular mechanism, with a focus on how this gene-editing tool can be used to counteract certain genetic defects associated with neurological diseases. Detailed understanding of this new tool could help researchers design specific gene editing strategies to repair genetic disorders in selective neurological diseases.
Assessment of wear dependence parameters in complex model of cutting tool wear
NASA Astrophysics Data System (ADS)
Antsev, A. V.; Pasko, N. I.; Antseva, N. V.
2018-03-01
This paper addresses the wear dependence of the efficient life period of cutting tools, treated as an aggregate of the tool wear rate distribution law and the dependence of this law's parameters on the cutting mode, factoring in random variation, as exemplified by the complex model of wear. The complex model of wear takes into account the variance of cutting properties within one batch of tools, the variance in machinability within one batch of workpieces, and the stochastic nature of the wear process itself. A technique for assessing wear dependence parameters in a complex model of cutting tool wear is provided. The technique is supported by a numerical example.
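As a hedged illustration of coupling a life law's parameters to the cutting mode while keeping the random factor, the sketch below samples tool life from a lognormal spread around a Taylor-law median; all constants are illustrative, not the paper's fitted values:

```python
import numpy as np

def sample_tool_life(v, n=0.25, c=400.0, sigma=0.2, size=1000, seed=0):
    """Taylor's law V * T^n = C gives the median tool life at cutting speed v
    (m/min); a lognormal spread stands in for batch-to-batch variance of tool
    and workpiece properties. All parameters here are made up for illustration."""
    median_life = (c / v) ** (1.0 / n)     # minutes, from V*T^n = C
    rng = np.random.default_rng(seed)
    return median_life * rng.lognormal(0.0, sigma, size)

lives = sample_tool_life(v=180.0)
print(np.median(lives), np.percentile(lives, 10))  # typical and pessimistic life
```

The low percentile, not the median, is what matters for scheduling conservative tool changes, which is why the distribution law and its mode dependence are assessed together.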
Patient-oriented interactive E-health tools on U.S. hospital Web sites.
Huang, Edgar; Chang, Chiu-Chi Angela
2012-01-01
The purpose of this study is to provide evidence for strategic planning regarding e-health development in U.S. hospitals. A content analysis of a representative sample of U.S. hospital Web sites has revealed how U.S. hospitals have taken advantage of the 21 patient-oriented interactive tools identified in this study. Significant gaps between various types of hospitals have also been found. It is concluded that although the majority of U.S. hospitals have adopted traditional functional tools, they need to make significant inroads in implementing the core e-business tools to serve their patients/users, making their Web sites more efficient marketing tools.
A meta-analysis of pedagogical tools used in introductory programming courses
NASA Astrophysics Data System (ADS)
Trees, Frances P.
Programming is recognized as being challenging for teachers to teach and difficult for students to learn. For decades, computer science educators have looked at innovative approaches by creating pedagogical software tools that attempt to facilitate both the teaching of and the learning of programming. This dissertation investigates the motivations for the integration of pedagogical tools in introductory programming courses and the characteristics that are perceived to contribute to the effectiveness of these tools. The study employs three research stages that examine the tool characteristics and their use. The first stage surveys teachers who use pedagogical tools in an introductory programming course. The second interviews teachers to explore the survey results in more detail and to add greater depth into the choice and use of pedagogical tools in the introductory programming class. The third interviews tool developers to provide an explanatory insight of the tool and the motivation for its creation. The results indicate that the pedagogical tools perceived to be effective share common characteristics: They provide an environment that is manageable, flexible and visual; they provide for active engagement in learning activities and support programming in small pieces; they allow for an easy transition to subsequent courses and more robust environments; they provide technical support and resource materials. The results of this study also indicate that recommendations from other computer science educators have a strong impact on a teacher's initial tool choice for an introductory programming course. This study informs present and future tool developers of the characteristics that the teachers perceive to contribute to the effectiveness of a pedagogical tool and how to present their tools to encourage a more efficient and more effective widespread adoption of the tool into the teacher's curriculum. The teachers involved in this study are actively involved in the computer science education community. The results of this study, based on the perceptions of these computer science educators, provide guidance to those educators choosing to introduce a new pedagogical tool into their programming course.
Live minimal path for interactive segmentation of medical images
NASA Astrophysics Data System (ADS)
Chartrand, Gabriel; Tang, An; Chav, Ramnada; Cresson, Thierry; Chantrel, Steeve; De Guise, Jacques A.
2015-03-01
Medical image segmentation is nowadays required for medical device development and in a growing number of clinical and research applications. Since dedicated automatic segmentation methods are not always available, generic and efficient interactive tools can alleviate the burden of manual segmentation. In this paper we propose an interactive segmentation tool based on image warping and minimal path segmentation that is efficient for a wide variety of segmentation tasks. While the user roughly delineates the desired organ's boundary, a narrow band along the cursor's path is straightened, providing an ideal subspace for feature-aligned filtering and the minimal path algorithm. Once the segmentation is performed on the narrow band, the path is warped back onto the original image, precisely delineating the desired structure. This tool was found to have a highly intuitive dynamic behavior. It is especially efficient against misleading edges and requires only coarse interaction from the user to achieve good precision. The proposed segmentation method was tested on 10 difficult liver segmentations on CT and MRI images, and the resulting 2D overlap Dice coefficient was 99% on average.
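The minimal-path step itself is available off the shelf; here is a sketch with scikit-image on a synthetic cost image (the paper's band-straightening and feature-aligned filtering are not reproduced):

```python
import numpy as np
from skimage.graph import route_through_array

rng = np.random.default_rng(0)
cost = rng.uniform(1.0, 2.0, size=(64, 64))
cost[32, :] = 0.01   # a cheap ridge standing in for an organ boundary

# Cheapest 8-connected path between the two ends of the band.
path, total_cost = route_through_array(
    cost, start=(32, 0), end=(32, 63), fully_connected=True)
print(len(path), round(total_cost, 3))
```

Straightening the band first means the path can be constrained to run left to right, which keeps the search space small and the interaction responsive.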
The impact of a novel resident leadership training curriculum.
Awad, Samir S; Hayley, Barbara; Fagan, Shawn P; Berger, David H; Brunicardi, F Charles
2004-11-01
Today's complex health care environment coupled with the 80-hour workweek mandate has required that surgical resident team interactions evolve from a military command-and-control style to a collaborative leadership style. A novel educational curriculum was implemented with the objective of training residents to create and manage powerful teams through alignment, communication, and integrity, tools integral to practicing a collaborative leadership style while working 80 hours per week. Specific strategies were as follows: (1) to focus on quality of patient care and service while receiving a high education-to-service ratio, and (2) to maximize efficiency through time management. This article shows that leadership training as part of a resident curriculum can significantly increase a resident's view of leadership in the areas of alignment, communication, and integrity; tools previously shown in business models to be vital for effective and efficient teams. This curriculum, over the course of the surgical residency, can provide residents with the necessary tools to deliver efficient, quality care while working within the 80-hour workweek mandate in a more collaborative environment.
Common Methodology for Efficient Airspace Operations
NASA Technical Reports Server (NTRS)
Sridhar, Banavar
2012-01-01
Topics include: a) Developing a common methodology to model and avoid disturbances affecting airspace. b) Integrating contrail and emission models into a national-level airspace simulation. c) Developing the capability to visualize and evaluate technology and alternate operational concepts, and providing inputs for policy-analysis tools to reduce the impact of aviation on the environment. d) Collaborating with the Volpe Research Center, NOAA, and DLR to leverage expertise and tools in aircraft emissions and weather/climate modeling. Airspace operations are a trade-off balancing safety, capacity, efficiency, and environmental considerations. Ideal flight: an unimpeded wind-optimal route with optimal climb and descent. Operations are degraded by reductions in airport and airspace capacity caused by inefficient procedures and disturbances.
An integrated knowledge system for wind tunnel testing - Project Engineers' Intelligent Assistant
NASA Technical Reports Server (NTRS)
Lo, Ching F.; Shi, George Z.; Hoyt, W. A.; Steinle, Frank W., Jr.
1993-01-01
The Project Engineers' Intelligent Assistant (PEIA) is an integrated knowledge system developed using artificial intelligence technology, including hypertext, expert systems, and dynamic user interfaces. This system integrates documents, engineering codes, databases, and knowledge from domain experts into an enriched hypermedia environment and was designed to assist project engineers in planning and conducting wind tunnel tests. PEIA is a modular system consisting of an intelligent user interface, seven modules, and an integrated tool facility. Hypermedia technology is discussed and the seven PEIA modules are described. System maintenance and updating are straightforward due to the modular structure, and the integrated tool facility provides user access to commercial software shells for documentation, reporting, or database updating. PEIA is expected to provide project engineers with technical information, increase efficiency and productivity, and provide a realistic tool for personnel training.
Jang, Mihue; Han, Hee Dong; Ahn, Hyung Jun
2016-01-01
Incorporating multiple copies of two RNAi molecules into a single nanostructure in a precisely controlled manner can provide an efficient delivery tool to regulate multiple, mutually dependent gene pathways. Here, we show an RNA nanotechnology platform for a two-in-one RNAi delivery system that contains polymeric copies of two RNAi molecules within the same RNA nanoparticles, without the aid of polyelectrolyte condensation reagents. As our RNA nanoparticles lead to the simultaneous silencing of two targeted mRNAs whose biological functions are highly interdependent, combination therapy for multi-drug-resistant cancer cells, which was studied as a specific application of our two-in-one RNAi delivery system, demonstrates efficient synergistic effects for cancer therapy. Therefore, this RNA nanoparticle approach provides an efficient tool for the simultaneous co-delivery of RNAi molecules in RNAi-based biomedical applications, and our current studies present an efficient strategy to overcome multi-drug resistance caused by the malfunction of genes in chemotherapy. PMID:27562435
EggLib: processing, analysis and simulation tools for population genetics and genomics
De Mita, Stéphane; Siol, Mathieu
2012-04-11
Background With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. Results In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy-to-use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high-level Python interface to the C++ library; and the egglib script, which provides direct access to pre-programmed Python applications. Conclusions EggLib has been designed to be both efficient and easy to use. A wide array of methods is implemented, including file format conversion, sequence alignment editing, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included in the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where full documentation and a manual can also be found and downloaded. PMID:22494792
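To make the kind of polymorphism analysis concrete, here is a minimal sketch of one classic statistic such a library computes, Watterson's estimator of the population mutation rate. This is plain illustrative Python, not EggLib's actual API.

```python
def watterson_theta(alignment):
    """Watterson's estimator of the population mutation rate (per sequence).

    alignment: list of equal-length DNA strings. S counts segregating sites;
    the harmonic number a_n normalizes for the sample size n.
    """
    n = len(alignment)
    sites = zip(*alignment)
    S = sum(1 for site in sites if len(set(site) - {"-", "N"}) > 1)
    a_n = sum(1.0 / i for i in range(1, n))
    return S / a_n

seqs = ["ATGCCGTA", "ATGACGTA", "ATGCCGTT"]  # toy alignment
print(watterson_theta(seqs))  # 2 segregating sites, n = 3 -> 2 / 1.5
```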
Application of data envelopment analysis in measuring the efficiency of mutual fund
NASA Astrophysics Data System (ADS)
Nik, Marzieh Geramian; Mihanzadeh, Hooman; Izadifar, Mozhgan; Nik, Babak Geramian
2015-05-01
The growth of the mutual fund industry during the past decades emphasizes the importance of this investment vehicle, particularly for the prosperity of financial markets and, in turn, the financial growth of each country. Therefore, evaluating the relative efficiency of mutual funds as an investment tool is important. In this study, a combined model of DEA (data envelopment analysis) and goal programming (GoDEA) is used to analyze the return efficiency of mutual funds, separating efficient from inefficient funds and identifying the sources of inefficiency. Mixed-asset local funds, which are managed jointly by CIMB and Public Mutual Berhad, were selected for the purpose of this paper. As a result, the Public Small Cap Fund (P Small Cap) is regarded as the most efficient mutual fund during the period of study. The integrated model aims, first, to guide investors in choosing the best-performing fund among mutual funds; second, to provide a realistic and appropriate benchmark compared with classic methods; and finally, to confirm the utility of data envelopment analysis (DEA) as a decision-making tool.
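As an illustration of the underlying DEA machinery, the sketch below solves the plain input-oriented CCR envelopment linear program with SciPy; it is not the paper's combined GoDEA formulation, and the input/output numbers are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k.

    X: (m inputs x n units), Y: (s outputs x n units). Returns theta in (0, 1],
    where theta = 1 means the unit lies on the efficient frontier.
    """
    m, n = X.shape
    s, _ = Y.shape
    c = np.zeros(n + 1)
    c[0] = 1.0                                 # minimize theta
    # inputs: sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack([-X[:, [k]], X])
    # outputs: -sum_j lambda_j * y_rj <= -y_rk (outputs at least those of unit k)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

# two inputs, one output, four funds (hypothetical numbers)
X = np.array([[2.0, 3.0, 4.0, 5.0], [1.0, 2.0, 1.0, 3.0]])
Y = np.array([[1.0, 2.0, 2.0, 2.0]])
print([round(ccr_efficiency(X, Y, k), 3) for k in range(4)])
```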
Hollingsworth, Bruce; Parkin, David
2003-10-01
Several tools are available to health care organisations in England to measure efficiency, but these are widely reported to be unpopular and unusable. Moreover, they do not have a sound conceptual basis. This paper describes the development and evaluation of a user-friendly tool that organisations can use to measure their efficiency, based on the technique of data envelopment analysis (DEA), which has a firm basis in economic theory. Routine data from 57 providers and 14 purchasing organisations in one region of the English National Health Service (NHS) for 1994-1996 were used to create information on efficiency based on DEA. This was presented to them using guides that explained the information and how it was to be used. They were surveyed to elicit their views on current measures of efficiency and on the potential use of the DEA-based information. The DEA measure demonstrated considerable scope for improvements in health service efficiency. There was a very small improvement over time with larger changes in some hospitals than others. Overall, 80% of those surveyed gave high scores for the potential usefulness of the DEA-based measures compared with 9-45% for existing methods. The quality of presentation of the information was also consistently high. There is dissatisfaction with efficiency information currently available to the NHS. DEA produces potentially useful information, which is easy to use and can be easily explained to and understood by potential users. The next step would be the implementation, on a developmental basis, of a routine DEA-based information system.
NASA Technical Reports Server (NTRS)
Mercer, Joey; Callantine, Todd; Martin, Lynne
2012-01-01
A recent human-in-the-loop simulation in the Airspace Operations Laboratory (AOL) at NASA's Ames Research Center investigated the robustness of Controller-Managed Spacing (CMS) operations. CMS refers to AOL-developed controller tools and procedures for enabling arrivals to conduct efficient Optimized Profile Descents with sustained high throughput. The simulation provided a rich data set for examining how a traffic management supervisor and terminal-area controller participants used the CMS tools and coordinated to respond to off-nominal events. This paper proposes quantitative measures for characterizing the participants' responses. Case studies of go-around events, replicated during the simulation, provide insights into the strategies employed and the role the CMS tools played in supporting them.
Tools for Administration of a UNIX-Based Network
NASA Technical Reports Server (NTRS)
LeClaire, Stephen; Farrar, Edward
2004-01-01
Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.
Using Kepler for Tool Integration in Microarray Analysis Workflows.
Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C
Increasing numbers of genomic technologies are producing massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses.
Enhancements to Demilitarization Process Maps Program (ProMap)
2016-10-14
map tool, ProMap, was improved by implementing new features, and sharing data with MIDAS and AMDIT databases. Specifically, process efficiency was...improved by 1) providing access to APE information contained in the AMDIT database directly from inside ProMap when constructing a process map, 2...what equipment can be efficiently used to demil a particular munition. Associated with this task was the upgrade of the AMDIT database so that
Dong, Fengping; Xie, Kabin; Chen, Yueying; Yang, Yinong; Mao, Yingwei
2016-01-01
CRISPR/Cas9 has been widely used for genomic editing in many organisms. Many human diseases are caused by multiple mutations. The CRISPR/Cas9 system provides a potential tool to introduce multiple mutations in a genome. To mimic complicated genomic variants in human diseases, such as multiple gene deletions or mutations, two or more small guide RNAs (sgRNAs) need to be introduced together. This can be achieved with separate Pol III promoters in one construct. However, limited enzyme sites and increased insertion size lower the efficiency of making such a construct. Here, we report a strategy to quickly assemble multiple sgRNAs in one construct using a polycistronic-tRNA-gRNA (PTG) approach. Taking advantage of the endogenous tRNA processing system in mammalian cells, we efficiently express multiple sgRNAs driven by only one Pol III promoter. Using an all-in-one construct carrying a PTG, we disrupt the deacetylase domain in multiple histone deacetylases (HDACs) in human cells simultaneously. We demonstrate that multiple HDAC deletions significantly affect the activation of the Wnt-signaling pathway. Thus, this method enables efficient targeting of multiple genes and provides a useful tool for establishing mutant cells that mimic human diseases. PMID:27890617
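The PTG layout itself is a tandem concatenation of tRNA-spacer-scaffold units; the sketch below builds such an insert as a string. The sequences shown are shortened placeholders (a real construct uses the full pre-tRNA and complete sgRNA scaffold), and the trailing tRNA that lets the last guide's 3' end be processed is an assumption based on published PTG designs.

```python
def build_ptg(spacers, trna, scaffold):
    """Concatenate tandem tRNA-spacer-scaffold units into one PTG insert.

    Endogenous RNase P/Z cleavage at each tRNA releases the individual
    sgRNAs, so a single Pol III promoter can drive all of them.
    """
    return "".join(trna + sp + scaffold for sp in spacers) + trna

# Placeholder sequences: substitute the real pre-tRNA and the full sgRNA
# scaffold in an actual construct.
TRNA = "AACAAAGCACCAGTGGTCTAGTGGTAGAATAGTACCC"
SCAFFOLD = "GTTTTAGAGCTAGAAATAGCAAGTTAAAATAAGGC"
spacers = ["GATTACAGATTACAGATTAC", "CATCATCATCATCATCATCA"]  # hypothetical 20-nt guides
print(build_ptg(spacers, TRNA, SCAFFOLD))
```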
OSLay: optimal syntenic layout of unfinished assemblies.
Richter, Daniel C; Schuster, Stephan C; Huson, Daniel H
2007-07-01
The whole genome shotgun approach to genome sequencing results in a collection of contigs that must be ordered and oriented to facilitate efficient gap closure. We present a new tool, OSLay, that uses synteny between matching sequences in a target assembly and a reference assembly to lay out the contigs (or scaffolds) in the target assembly. The underlying algorithm is based on maximum weight matching. The tool provides an interactive visualization of the computed layout, and the result can be imported into the assembly editing tool Consed to support the design of primer pairs for gap closure. To enhance efficiency in the gap closure phase of a genome project it is crucial to know which contigs are adjacent in the target genome. Related genome sequences can be used to lay out contigs in an assembly. OSLay is freely available from: http://www-ab.informatik.uni-tuebingen.de/software/oslay.
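A toy version of the maximum weight matching step follows, using networkx; the contig-end graph and synteny weights are hypothetical, and OSLay's own matching implementation is not shown in the abstract.

```python
import networkx as nx

# Hypothetical synteny support: weight = reference bases linking the right
# end of one target contig to the left end of another.
edges = [("c1.right", "c2.left", 120),
         ("c2.right", "c3.left", 95),
         ("c1.right", "c3.left", 40)]

G = nx.Graph()
G.add_weighted_edges_from(edges)

# A maximum weight matching picks a set of non-conflicting adjacencies,
# i.e. each contig end is joined to at most one neighbour.
matching = nx.max_weight_matching(G)
print(sorted(tuple(sorted(e)) for e in matching))
```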
Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs
NASA Astrophysics Data System (ADS)
Pianese, C.; Sorrentino, M.
2009-08-01
Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications due to their high energy conversion efficiency, modularity, high fuel flexibility, and low emissions and noise. Moreover, the high working temperatures enable their use in efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs destined for a wide application area, ranging from automotive to marine and airplane APUs.
Energy Tracking Software Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan Davis; Nathan Bird; Rebecca Birx
2011-04-04
Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And, social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.
Tsang, Michael P; Kikuchi-Uehara, Emi; Sonnemann, Guido W; Aymonier, Cyril; Hirao, Masahiko
2017-08-04
It has been some 15 years since the topics of sustainability and nanotechnologies first appeared together in the scientific literature and became a focus of organizations' research and policy developments. On the one hand, this focus is directed towards approaches and tools for risk assessment and management and on the other hand towards life-cycle thinking and assessment. Comparable to their application for regular chemicals, each tool is seen to serve separate objectives as it relates to evaluating nanotechnologies' safety or resource efficiency, respectively. While nanomaterials may provide resource efficient production and consumption, this must balance any potential hazards they pose across their life-cycles. This Perspective advocates for integrating these two tools at the methodological level for achieving this objective, and it explains what advantages and challenges this offers decision-makers while highlighting what research is needed to further enhance integration.
3Drefine: an interactive web server for efficient protein structure refinement
Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin
2016-01-01
3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen-bonding network combined with atomic-level energy minimization of the optimized model, using composite physics- and knowledge-based force fields, for efficient protein structure refinement. The method has been extensively evaluated in blind CASP experiments as well as on large-scale and diverse benchmark datasets, and it exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through text or file input submission, email notification, and a provided example submission, and it is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet, which is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371
Open|SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis
Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...
2008-01-01
Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open|SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open|SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open|SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.
GIS-MODFLOW: A small open-source tool for linking GIS data to MODFLOW
NASA Astrophysics Data System (ADS)
Gossel, Wolfgang
2013-06-01
The numerical model MODFLOW (Harbaugh 2005) is an efficient and up-to-date tool for groundwater flow modelling. Geo-Information Systems (GIS), on the other hand, provide useful tools for data preparation and visualization that can also be incorporated into numerical groundwater modelling. An interface between the two would therefore be useful for many hydrogeological investigations. To date, several integrated stand-alone tools have been developed that rely on MODFLOW, MODPATH and transport modelling tools. Simultaneously, several open-source GIS codes were developed with improved functionality and ease of use. These GIS tools can be used as pre- and post-processors of the numerical model MODFLOW via a suitable interface. Here we present GIS-MODFLOW, an open-source tool that provides a new universal interface based on the ESRI ASCII GRID data format, which it converts into MODFLOW input data. The tool can also process MODFLOW results. Such a combination of MODFLOW and open-source GIS opens new possibilities to make groundwater flow modelling and simulation results available to wider circles of hydrogeologists.
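The ESRI ASCII GRID format that GIS-MODFLOW relies on is simple enough to parse in a few lines. Here is an illustrative reader (not the tool's own code), assuming the common six-line header that includes a NODATA_value entry:

```python
import numpy as np

def read_esri_ascii(path):
    """Read an ESRI ASCII GRID into (header dict, 2D numpy array).

    Assumes the six standard header lines: ncols, nrows, xllcorner,
    yllcorner, cellsize, NODATA_value (NODATA is optional in some files).
    """
    header = {}
    with open(path) as fh:
        for _ in range(6):
            key, value = fh.readline().split()
            header[key.lower()] = float(value)
        data = np.loadtxt(fh)          # remaining lines: nrows x ncols values
    nodata = header.get("nodata_value")
    if nodata is not None:
        data[data == nodata] = np.nan  # mask no-data cells
    return header, data
```

From there, the array could be written out row by row as an external array file for a MODFLOW package; the exact output formatting depends on the MODFLOW version and package in use.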
Digitizing the Facebow: A Clinician/Technician Communication Tool.
Kalman, Les; Chrapka, Julia; Joseph, Yasmin
2016-01-01
Communication between the clinician and the technician has been an ongoing problem in dentistry. To improve the issue, a dental software application has been developed--the Virtual Facebow App. It is an alternative to the traditional analog facebow, used to orient the maxillary cast in mounting. Comparison data of the two methods indicated that the digitized virtual facebow provided increased efficiency in mounting, increased accuracy in occlusion, and lower cost. Occlusal accuracy, lab time, and total time were statistically significant (P<.05). The virtual facebow provides a novel alternative for cast mounting and another tool for clinician-technician communication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Na; Goel, Supriya; Gorrissen, Willy J.
2013-06-24
The U.S. Department of Energy (DOE) is developing a national voluntary energy asset score system to help building owners evaluate the as-built physical characteristics (including the building envelope and the mechanical and electrical systems) and overall building energy efficiency, independent of occupancy and operational choices. The energy asset score breaks down building energy use information by simulating building performance under typical operating and occupancy conditions for a given use type. A web-based modeling tool, the energy asset score tool, facilitates the implementation of the asset score system. The tool consists of a simplified user interface built on a centralized simulation engine (EnergyPlus). It is intended to reduce the implementation cost for users and increase modeling standardization compared with an approach that requires users to build their own energy models. A pilot project with forty-two buildings (consisting mostly of offices and schools) was conducted in 2012. This paper reports the findings. Participants were asked to collect a minimum set of building data and enter it into the asset score tool. Participants also provided their utility bills, existing ENERGY STAR scores, and previous energy audit/modeling results if available. The results from the asset score tool were compared with the building energy use data provided by the pilot participants. Three comparisons were performed. First, the actual building energy use, either from the utility bills or via ENERGY STAR Portfolio Manager, was compared with the modeled energy use. This was intended to examine how well the energy asset score represents a building's system efficiencies, and how well it correlates with a building's actual energy consumption. Second, calibrated building energy models (where they existed) were used to examine any discrepancies between the asset score model and the pilot participant buildings' known energy use patterns. This comparison examined the end-use breakdowns and more detailed time series data. Third, ASHRAE 90.1 prototype buildings were also used, as an industry-standard modeling approach, to test the accuracy of the asset score tool. Our analysis showed that the asset score tool, which uses simplified building simulation, could provide results comparable to a more detailed energy model. A building's as-built efficiency can be reflected in the energy asset score. An analysis of the modeled energy use from the asset score tool against the actual energy use from utility bills can further inform building owners about the effectiveness of their building's operation and maintenance.
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
Wan, Shixiang; Zou, Quan
2017-01-01
Multiple sequence alignment (MSA) plays a key role in biological sequence analysis, especially in phylogenetic tree construction. The extreme growth of next-generation sequencing data has created a shortage of efficient approaches for ultra-large-scale alignment of different sequence types. Distributed and parallel computing is a crucial technique for accelerating ultra-large (e.g., files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient tool, HAlign-II, to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on DNA and protein data sets larger than 1 GB showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II, with open-source code and datasets, is available at http://lab.malab.cn/soft/halign.
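To show the flavor of distributing alignment work with Spark (a hedged sketch, not HAlign-II's actual implementation), the snippet below scores every sequence against a fixed centre sequence in parallel; the HDFS path and the toy identity-count kernel are assumptions.

```python
from pyspark import SparkContext

def align_to_centre(seq, centre):
    """Stand-in for a pairwise alignment kernel (a dummy identity count here);
    a real implementation would run banded dynamic programming."""
    return sum(a == b for a, b in zip(seq, centre))

if __name__ == "__main__":
    sc = SparkContext(appName="msa-sketch")
    centre = "ATGCCGTA"
    # hypothetical input: one sequence per line on HDFS
    seqs = sc.textFile("hdfs:///data/seqs.txt")
    scores = seqs.map(lambda s: (s, align_to_centre(s, centre)))
    print(scores.take(5))
    sc.stop()
```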
Experience with Using Multiple Types of Visual Educational Tools during Problem-Based Learning.
Kang, Bong Jin
2012-06-01
This study describes the experience of using multiple types of visual educational tools in the setting of problem-based learning (PBL). The author intends to demonstrate their role in supporting diverse and efficient clinical reasoning and problem solving. Visual educational tools were introduced in a lecture that covered their various types, possible benefits, and some examples. Each group made one mechanistic case diagram per week, and each student designed one diagnostic schema or therapeutic algorithm per week, based on their learning issues. The students were also asked to provide commentary, which was intended to give insight into the sincerity of their efforts. Subsequently, the author administered a questionnaire about the usefulness and weaknesses of visual educational tools and the difficulties of performing the work. The quality of the products was also assessed by the author. There were many complaints about the adequacy of the introduction to visual educational tools, a problem also revealed by the many initially inappropriate products. However, the exercise presentation in the first week improved understanding of their purposes and the method of design. In general, students agreed that the tools helped provide a deep understanding of the cases and made it possible to solve clinical problems efficiently. The commentary was helpful in evaluating the sincerity of their efforts. Students suggested that these products should account for a larger share of their scores, given the effort involved. Using multiple types of visual educational tools during PBL can be useful for understanding the diverse routes of clinical reasoning and clinical features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Benton, Nathanael; Burns, Patrick
Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: a high-efficiency/variable speed drive (VSD) compressor replacing a modulating, load/unload, or constant-speed compressor; and a compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.
2011-09-20
optimal portfolio point on the efficient frontier, for example, Portfolio B on the chart in Figure A1. Then, by subsequently changing some of the ...optimized portfolio controlling for risk using the IRM methodology and tool suite. Results indicate that both rapid and incremental implementation... Results of the KVA and SD scenario analysis provided the financial information required to forecast an optimized
Integrated multidisciplinary analysis tool IMAT users' guide
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
Self-actuating and self-diagnosing plastically deforming piezo-composite flapping wing MAV
NASA Astrophysics Data System (ADS)
Harish, Ajay B.; Harursampath, Dineshkumar; Mahapatra, D. Roy
2011-04-01
In this work, we propose a constitutive model to describe the behavior of a Piezoelectric Fiber Reinforced Composite (PFRC) material consisting of an elasto-plastic matrix reinforced by strong elastic piezoelectric fibers. Computational efficiency is achieved using analytical solutions for the elastic stiffness matrix derived from the Variational Asymptotic Method (VAM). This is extended to provide Structural Health Monitoring (SHM) based on plasticity-induced degradation of the flapping frequency of the PFRC. Overall, this work provides an effective mathematical tool that can be used for structural self-health monitoring of plasticity-induced flapping degradation of PFRC flapping wing MAVs. The developed tool can be re-calibrated to also provide SHM for other forms of failure such as fatigue, matrix cracking, etc.
Technology Planning Strategies
ERIC Educational Resources Information Center
Decker, Kathy
2004-01-01
Effective planning strategies drive achievement of an overall technology goal to increase access to electronic information in real time in order to increase efficiency, productivity, and communication across campus. Planning relies on providing access, 'Anytime Anywhere' to student information, calendar, email, course management tools, and the…
DOT National Transportation Integrated Search
2016-09-30
Many transit agencies provide real-time operational information and trip-planning tools through phone, Web, and smartphone applications. These services utilize a one-way information flow from transit agencies to transit users. Current smartphone tech...
Center for Corporate Climate Leadership Leveraging Third-party Programs for Supplier Outreach
Third-party programs maximize efficient use of resources by helping companies request and analyze emissions information from suppliers and then provide suppliers with additional tools to develop their own GHG inventories and manage their GHG emissions.
SR-52 PROGRAMMABLE CALCULATOR PROGRAMS FOR VENTURI SCRUBBERS AND ELECTROSTATIC PRECIPITATORS
The report provides useful tools for estimating particulate removal by venturi scrubbers and electrostatic precipitators. Detailed descriptions are given for programs to predict the penetration (one minus efficiency) for each device. These programs are written specifically for th...
A robot end effector exchange mechanism for space applications
NASA Technical Reports Server (NTRS)
Gorin, Barney F.
1990-01-01
Efficient robot operation requires the use of specialized end effectors or tools for different tasks. In spacecraft applications, the microgravity environment precludes the use of gravitational forces to retain the tools in a holding fixture. As a result, a retention mechanism forming part of the tool storage container is required. A unique approach to this problem has resulted in the development of an end effector exchange mechanism that meets the requirements of spaceflight applications while avoiding the complexity usually involved. This mechanism uses multiple latching cams, both on the manipulator and in the tool storage container, combined with a system of catch rings to provide retention in both locations and the required failure tolerance. Because of the cam configuration, the mechanism operates passively, requiring no electrical commands except those needed to move the manipulator into position. Similarly, it inherently provides interlocks to prevent the release of one cam before its opposite number is engaged.
Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning
NASA Astrophysics Data System (ADS)
Cui, J.; Dong, B.; Li, J.; Li, L.
2017-09-01
As a fundamental part of urban planning, the intensity analysis of construction land involves much repetitive data processing that is prone to errors or loss of data precision, and current urban planning lacks efficient methods and tools for visualizing the analysis results. In this research a portable tool is developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for this work. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values and zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be dispatched quickly between planning teams.
The Application Potential of Eco-Efficiency for Greening Company
NASA Astrophysics Data System (ADS)
Prasaja, Lukman Eka; Hadiyanto
2018-02-01
Eco-efficiency emerged in the 1990s as a measure of "the efficiency with which ecological resources are used to meet human needs." As a tool for economic and environmental integration, eco-efficiency needs to be promoted further so that government regulation and industrial management can include it as an important instrument. This paper provides several approaches that can help various industries develop effective eco-efficiency principles. The approach used is to maximize the role of the steering committee for the company's internal environment. The use of natural resources such as water, forests, mines and energy needs to be balanced with eco-efficiency so that the exploitation of nature can be well controlled and the sustainable development the world aspires to can be realized.
SCALING AN URBAN EMERGENCY EVACUATION FRAMEWORK: CHALLENGES AND PRACTICES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karthik, Rajasekar; Lu, Wei
2014-01-01
Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems under critical infrastructure disruption in order to aid real-time emergency evacuation. This framework uses large-scale datasets to provide a scalable tool for emergency planning and management. Our framework, World-Wide Emergency Evacuation (WWEE), integrates population distribution and urban infrastructure networks to model travel demand in emergency situations at the global level. In addition, a computational model of agent-based traffic simulation is used to provide an optimal evacuation plan for traffic operation purposes [1]. The framework also provides a web-based high-resolution visualization tool for emergency evacuation modelers and practitioners. We have successfully tested our framework with scenarios in both the United States (Alexandria, VA) and Europe (Berlin, Germany) [2]. However, there are still some major drawbacks to scaling this framework to handle big data workloads in real time. On the back end, lack of proper infrastructure limits our ability to process large amounts of data, run the simulation efficiently and quickly, and provide fast retrieval and serving of data. On the front end, the visualization performance for microscopic evacuation results is still not efficient enough, due to the high volume of data communicated between server and client. We are addressing these drawbacks by using cloud computing and next-generation web technologies, namely Node.js, NoSQL, WebGL, OpenLayers 3 and HTML5. We describe each briefly and explain how we are using and leveraging these technologies to provide an efficient tool for emergency management organizations. Our early experimentation demonstrates that using the above technologies is a promising approach to building a scalable and high-performance urban emergency evacuation framework that can improve traffic mobility and safety under critical infrastructure disruption in today's socially connected world.
Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun
2006-08-25
We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and how to map PETs correctly yet efficiently to reference genome sequences. To accommodate and streamline analysis of the large volume of PET sequences generated in each PET experiment, an automated PET data processing pipeline is desirable. We designed an integrated computational program package, PET-Tool, to automatically process PET sequences and map them to genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analysis of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. On a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checking and a system for multi-layer data management.
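As a rough illustration of what an Extractor-style step does, the sketch below splits a ditag read into its 5' and 3' tags around a fixed spacer. The spacer sequence, tag length, and read layout are simplified assumptions, not the actual PET protocol, which keys extraction on restriction-site and linker sequences.

```python
def extract_pet(read, spacer="CATG", tag_len=18):
    """Split a raw ditag read into its 5' and 3' tags.

    Hypothetical layout: [5' tag][spacer][3' tag]. Returns None when the
    spacer is missing or either tag is too short (an artifact read).
    """
    idx = read.find(spacer)
    if idx < tag_len:
        return None
    five_p = read[idx - tag_len:idx]
    three_p = read[idx + len(spacer):idx + len(spacer) + tag_len]
    if len(three_p) < tag_len:
        return None
    return five_p, three_p

print(extract_pet("A" * 18 + "CATG" + "G" * 18))
```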
Gillespie-Bennett, Julie; Keall, Michael; Howden-Chapman, Philippa; Baker, Michael G
2013-08-02
Substandard housing is a problem in New Zealand. Historically there has been little recognition of the important aspects of housing quality that affect people's health and safety. In this viewpoint article we outline the importance of assessing these factors as an essential step toward improving the health and safety of New Zealanders and household energy efficiency. A practical risk assessment tool adapted to New Zealand conditions, the Healthy Housing Index (HHI), measures the physical characteristics of houses that affect the health and safety of the occupants. This instrument is also the only tool that has been validated against health and safety outcomes and reported in the international peer-reviewed literature. The HHI provides a framework on which a housing warrant of fitness (WOF) can be based. The HHI inspection takes about one hour and is performed by a trained building inspector. To maximise the effectiveness of this housing quality assessment we envisage the output having two parts. The first would be a pass/fail WOF assessment showing whether or not the house meets basic health, safety and energy efficiency standards. The second component would rate each main assessment area (health, safety and energy efficiency), potentially on a five-point scale. This WOF system would establish a good minimum standard for rental accommodation as well as encouraging improved housing performance over time. In this article we argue that the HHI is an important, validated housing assessment tool that will improve housing quality, leading to better health for the occupants, fewer home injuries, and greater energy efficiency. If required, this tool could be extended to also cover resilience to natural hazards, broader aspects of sustainability, and the suitability of the dwelling for occupants with particular needs.
DyNAVacS: an integrative tool for optimized DNA vaccine design.
Harish, Nagarajan; Gupta, Rekha; Agarwal, Parul; Scaria, Vinod; Pillai, Beena
2006-07-01
DNA vaccines have slowly emerged as keystones in preventive immunology due to their versatility in inducing both cell-mediated and humoral immune responses. The design of an efficient DNA vaccine involves the choice of a suitable expression vector, ensuring optimal expression by codon optimization, engineering CpG motifs for enhancing immune responses, and providing additional sequence signals for efficient translation. DyNAVacS is a web-based tool created for rapid and easy design of DNA vaccines. It follows a step-wise design flow, which guides the user through the sequential steps in the design of the vaccine. Further, it allows restriction enzyme mapping and design of primers spanning user-specified sequences, and it provides information regarding the vectors currently used for generation of DNA vaccines. The web version uses the Apache HTTP server. The interface was written in HTML and utilizes Common Gateway Interface scripts written in Perl for functionality. DyNAVacS is an integrated tool consisting of user-friendly programs, which require minimal information from the user. The software is available free of cost as a web-based application at URL: http://miracle.igib.res.in/dynavac/.
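Codon optimization, one of the steps listed above, can be illustrated with a naive back-translation that always picks the host's most frequent codon. The partial codon table below is for illustration only; a real tool uses a full, organism-specific usage table and also screens for unwanted motifs.

```python
# Most-used human codon per amino acid (partial table for illustration;
# not DyNAVacS's internal data).
PREFERRED = {"M": "ATG", "K": "AAG", "L": "CTG", "S": "AGC", "*": "TGA"}

def codon_optimize(protein):
    """Naive codon optimization: pick the host's most frequent codon."""
    return "".join(PREFERRED[aa] for aa in protein)

print(codon_optimize("MKLS*"))  # ATGAAGCTGAGCTGA
```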
Coherent optimal control of photosynthetic molecules
NASA Astrophysics Data System (ADS)
Caruso, F.; Montangero, S.; Calarco, T.; Huelga, S. F.; Plenio, M. B.
2012-04-01
We demonstrate theoretically that open-loop quantum optimal control techniques can provide efficient tools for the verification of various quantum coherent transport mechanisms in natural and artificial light-harvesting complexes under realistic experimental conditions. To assess the feasibility of possible biocontrol experiments, we introduce the main settings and derive optimally shaped and robust laser pulses that allow for the faithful preparation of specified initial states of the photosystem (such as localized excitations or coherent superpositions, i.e., propagating and nonpropagating states) and efficiently probe the subsequent dynamics. With these tools, different transport pathways can be discriminated, which should facilitate the elucidation of genuine quantum dynamical features of photosystems and therefore enhance our understanding of the role that coherent processes may play in actual biological complexes.
NONMEMory: a run management tool for NONMEM.
Wilkins, Justin J
2005-06-01
NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.
Experiences on developing digital down conversion algorithms using Xilinx system generator
NASA Astrophysics Data System (ADS)
Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi
2013-07-01
The Digital Down Conversion (DDC) algorithm is a classical signal processing method that is widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA with the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. It makes it convenient for programmers to manipulate the design and debug the function, especially for complex algorithms. The development of the DDC function based on System Generator shows that System Generator is a very fast and efficient tool for FPGA design.
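The DDC signal chain itself (mix to baseband with a numerically controlled oscillator, low-pass filter, then decimate) can be prototyped in a few lines of NumPy/SciPy before committing it to FPGA fabric. This software sketch only mirrors the chain; it is not the System Generator model, and the sample rates and filter length are arbitrary choices.

```python
import numpy as np
from scipy import signal

def ddc(x, fs, f_c, decim):
    """Basic digital down-conversion: mix to baseband, low-pass, decimate."""
    n = np.arange(len(x))
    baseband = x * np.exp(-2j * np.pi * f_c / fs * n)   # NCO mixing stage
    # anti-alias low-pass: pass 80% of the decimated Nyquist band
    taps = signal.firwin(101, cutoff=0.8 * (fs / decim) / 2, fs=fs)
    filtered = signal.lfilter(taps, 1.0, baseband)
    return filtered[::decim]                            # decimation stage

fs, f_c = 100e6, 20e6
t = np.arange(4096) / fs
x = np.cos(2 * np.pi * (f_c + 1e5) * t)   # tone 100 kHz above the carrier
y = ddc(x, fs, f_c, decim=16)
print(len(y), np.abs(np.fft.fft(y)).argmax())  # peak near the 100 kHz bin
```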
Characteristics of a semi-custom library development system
NASA Technical Reports Server (NTRS)
Yancey, M.; Cannon, R.
1990-01-01
Standard cell and gate array macro libraries are in common use with workstation computer-aided design (CAD) tools for semi-custom application-specific integrated circuit (ASIC) design and have resulted in significant improvements in overall design efficiency as contrasted with custom design methodologies. Similar design methodology enhancements that provide for the efficient development of the library cells themselves are an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for the library development engineer as he provides libraries in state-of-the-art process technologies are presented. An overview of Gould's library development system ('Accolade') is also presented.
Liquid Pipeline Operator's Control Room Human Factors Risk Assessment and Management Guide
DOT National Transportation Integrated Search
2008-11-26
The purpose of this guide is to document methodologies, tools, procedures, guidance, and instructions that have been developed to provide liquid pipeline operators with an efficient and effective means of managing the human factors risks in their con...
RELIABILITY OF BIOMARKERS OF PESTICIDE EXPOSURE AMONG CHILDREN AND ADULTS IN CTEPP OHIO
Urinary biomarkers offer the potential for providing an efficient tool for exposure classification by reflecting the aggregate of all exposure routes. Substantial variability observed in urinary pesticide metabolite concentrations over short periods of time, however, has cast so...
technologies and operational practices which increase fuel efficiency and reduce emissions from goods movement. EPA provides partners with performance benchmarking tools, fleet management best practices, and is working with partners to test and verify advanced technologies and operational practices that save
NASA Astrophysics Data System (ADS)
Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng
2017-06-01
A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency is poorer than expected, and existing smoothing models cannot explicitly specify how to improve it. We present an explicit time-dependent smoothing evaluation model that contains specific smoothing parameters directly derived from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we propose a strategy to improve the RC-lap smoothing efficiency, which incorporates the theoretical model, tool optimization, and determination of the efficiency limit. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were obtained.
DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.
Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques
2008-09-08
Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.
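To give a feel for the parallelization scheme (a hedged sketch, not DOVIS's actual code), the snippet below fans docking jobs out over a process pool. The ligand file stems are hypothetical; the autodock4 flags shown (-p for the parameter file, -l for the log) follow AutoDock 4's documented command-line usage.

```python
import subprocess
from multiprocessing import Pool

def dock_one(stem):
    """Run one docking job; assumes a pre-generated .dpf parameter file
    per ligand, as AutoDock 4 requires."""
    out = stem + ".dlg"
    subprocess.run(["autodock4", "-p", stem + ".dpf", "-l", out], check=True)
    return out

if __name__ == "__main__":
    ligands = ["lig_%04d" % i for i in range(100)]  # hypothetical ligand stems
    with Pool(processes=8) as pool:                 # one worker per core
        for result in pool.imap_unordered(dock_one, ligands):
            print("finished", result)
```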
RNAmutants: a web server to explore the mutational landscape of RNA secondary structures
Waldispühl, Jerome; Devadas, Srinivas; Berger, Bonnie; Clote, Peter
2009-01-01
The history and mechanism of molecular evolution in DNA have been greatly elucidated by contributions from genetics, probability theory and bioinformatics—indeed, mathematical developments such as Kimura's neutral theory, Kingman's coalescent theory and efficient software such as BLAST, ClustalW, Phylip, etc., provide the foundation for modern population genetics. In contrast to DNA, the function of most noncoding RNA depends on tertiary structure, experimentally known to be largely determined by secondary structure, for which dynamic programming can efficiently compute the minimum free energy secondary structure. For this reason, understanding the effect of pointwise mutations in RNA secondary structure could reveal fundamental properties of structural RNA molecules and improve our understanding of molecular evolution of RNA. The web server RNAmutants provides several efficient tools to compute the ensemble of low-energy secondary structures for all k-mutants of a given RNA sequence, where k is bounded by a user-specified upper bound. As we have previously shown, these tools can be used to predict putative deleterious mutations and to analyze regulatory sequences from the hepatitis C and human immunodeficiency genomes. Web server is available at http://bioinformatics.bc.edu/clotelab/RNAmutants/, and downloadable binaries at http://rnamutants.csail.mit.edu/. PMID:19531740
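The dynamic programming mentioned above can be illustrated with the classic Nussinov base-pair maximization recurrence, a simplified stand-in for the minimum-free-energy computation that RNAmutants performs over all k-mutants:

```python
def nussinov(seq, min_loop=3):
    """Nussinov base-pair maximization over an RNA sequence.

    dp[i][j] = max pairs in seq[i..j]; either i is unpaired, or i pairs
    with some k, splitting the problem into two independent subintervals.
    """
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
             ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                      # i left unpaired
            for k in range(i + min_loop + 1, j + 1):
                if (seq[i], seq[k]) in pairs:
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + dp[i + 1][k - 1] + right)
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))  # 3 base pairs: a small hairpin
```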
Portalés, Cristina; Casas, Sergio; Gimeno, Jesús; Fernández, Marcos; Poza, Montse
2018-04-19
Energy-efficient Buildings (EeB) are in demand in today's construction, fulfilling the requirements for green cities. Pre-fab buildings, which are fully built in modules in factories, are a good example of this. Although this kind of building is quite new, in situ inspection is documented using traditional tools, mainly based on paper annotations. Thus, the inspection process is not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid the regular processes of pre-fab building inspection to detect and eliminate possible quality and energy-efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data gathering, allowing inspectors to use it quickly without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes.
NDE of logs and standing trees using new acoustic tools : technical application and results
Peter Carter; Xiping Wang; Robert J. Ross; David Briggs
2005-01-01
The new Director ST300 provides a means to efficiently assess stands for stiffness and related wood properties based on standing-tree acoustic velocity measures, and can be easily integrated with pre-harvest and earlier stand assessments. This provides for effective valuation for forest sale, stumpage purchase, harvest planning, and ranking of progeny or clones in tree...
A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James
2011-11-01
Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
NASA Technical Reports Server (NTRS)
Litt, Jonathan S.; Simo, Donald L.
2007-01-01
This paper presents a preliminary demonstration of an automated health assessment tool, capable of real-time on-board operation using existing engine control hardware. The tool allows operators to discern how rapidly individual turboshaft engines are degrading. As the compressor erodes, performance is lost, and with it the ability to generate power. Thus, such a tool would provide an instant assessment of the engine's fitness to perform a mission, and would help to pinpoint any abnormal wear or performance anomalies before they became serious, thereby decreasing uncertainty and enabling improved maintenance scheduling. The research described in the paper utilized test stand data from a T700-GE-401 turboshaft engine that underwent sand-ingestion testing to scale a model-based compressor efficiency degradation estimation algorithm. This algorithm was then applied to real-time Health and Usage Monitoring System (HUMS) data from a T700-GE-701C to track compressor efficiency on-line. The approach uses an optimal estimator called a Kalman filter. The filter is designed to estimate the compressor efficiency using only data from the engine's sensors as input.
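A minimal scalar analogue of the estimation idea, assuming a random-walk state model and hypothetical noise variances (this is not the NASA T700 filter design):

```python
# Scalar Kalman filter sketch: estimate a slowly drifting compressor
# efficiency x from noisy sensor-derived observations z.
import numpy as np

def kalman_track(z, q=1e-6, r=1e-3, x0=1.0, p0=1.0):
    """z: noisy efficiency observations; q: process-noise variance
    (how fast efficiency may drift); r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for zk in z:
        p = p + q                # predict: random-walk state model
        k = p / (p + r)          # Kalman gain
        x = x + k * (zk - x)     # update with the innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Example: efficiency eroding from 1.00 to 0.95 under sensor noise.
rng = np.random.default_rng(0)
truth = np.linspace(1.00, 0.95, 500)
obs = truth + rng.normal(0.0, 0.03, size=truth.size)
est = kalman_track(obs)
print(f"final estimate: {est[-1]:.3f} (truth 0.950)")
```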
Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K
2015-01-01
Automated genome sequencing procedures are enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analyses, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is most efficient in that it i] extracts physicochemical, window-dependent and homologous-position-based-substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare the BLOCK-FASTA file by use of the Automated Block Preparation Tool of the program, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of methods. Overall the program acts as an efficient PWS analyzer and finds application in sequence bioinformatics. PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users.
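As an illustration of one window-dependent physicochemical computation of the kind such tools automate, here is a sketch using the published Kyte-Doolittle hydropathy scale; PHYSICO2's actual property set and implementation may differ:

```python
# Sliding-window hydropathy profile (illustrative sketch, not
# PHYSICO2 code), using the Kyte-Doolittle scale.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
      'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
      'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
      'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

def hydropathy_profile(seq: str, window: int = 9) -> list[float]:
    """Mean hydropathy over each full window along the sequence."""
    vals = [KD[aa] for aa in seq.upper()]
    return [sum(vals[i:i + window]) / window
            for i in range(len(vals) - window + 1)]

print(hydropathy_profile("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"))
```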
Design as a marketing tool: cater to your clients.
Falick, J
1982-09-01
Competing successfully in the market and functioning efficiently often depend on a reassessment of the environment. Accordingly, upgraded convenience, comfort, and atmosphere have become major marketing mechanisms for hospitals. This article presents several examples of how hospitals have used design to provide marketing advantages.
Application of PKI in health care--needs, ambitions, prospects.
Suselj, Marjan; Marcun, Tomaz; Trcek, Denis; Kandus, Gorazd
2003-01-01
Through continual development and considerable investment over the past years, Slovenia has established an information infrastructure providing efficient data links between all the health care actors. This includes furnishing all the citizens and health workers with microprocessor cards--health insurance card and health professional card. These tools have significantly simplified different procedures in the health care and brought services closer to insured persons. The know-how and experiences gathered to date have given rise to vivid discussions of further development steps: introduction of new contents on the infrastructure in place and technological upgrading, in particular progressive incorporation of the PKI concept and thereby integration of card and network solutions to provide an efficient and secure communication environment. This paper outlines key perspectives of the future developments in this segment. With the volume of health care data communications through internet growing steeply, and with the paramount importance of patient--doctor trust and confidence, security tools and solutions in the health care are a critical need.
Measuring Efficiency of Secondary Healthcare Providers in Slovenia
Blatnik, Patricia; Bojnec, Štefan; Tušak, Matej
2017-01-01
The chief aim of this study was to analyze the efficiency of secondary healthcare providers, focusing on Slovene general hospitals. We intended to present a complete picture of the technical, allocative, and cost (economic) efficiency of general hospitals. Methods: We researched these aspects of efficiency with two econometric methods. First, we calculated the necessary efficiency quotients with stochastic frontier analysis (SFA), realized by econometric estimation of stochastic frontier functions; then, with data envelopment analysis (DEA), we calculated the quotients based on linear programming. Results: The two chosen methods produced two different conclusions: the SFA method identified Celje General Hospital as the most efficient general hospital, whereas the DEA method identified Brežice General Hospital as the most efficient. Conclusion: Our results are a useful tool that can aid managers, payers, and designers of healthcare policy to better understand how general hospitals operate. The participants can accordingly decide with less difficulty on any further business operations of general hospitals, having the best practices of general hospitals at their disposal. PMID:28730180
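For readers unfamiliar with DEA, a toy input-oriented CCR model can be solved as one linear program per hospital. The sketch below is generic (not the authors' implementation) and uses SciPy:

```python
# Input-oriented CCR DEA (illustrative sketch): for each unit o,
# minimize theta subject to an envelopment of its inputs and outputs
# by a nonnegative combination (lambda) of all units.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (m inputs, n units), Y: (s outputs, n units).
    Returns efficiency theta per unit (1.0 = efficient)."""
    m, n = X.shape
    s, _ = Y.shape
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]              # minimize theta
        # inputs:  sum_j lam_j x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[:, [o]], X])
        # outputs: -sum_j lam_j y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[:, o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.x[0])
    return scores

# Toy example: 4 units, 1 input (staff), 1 output (treated cases).
X = np.array([[20.0, 30.0, 40.0, 30.0]])
Y = np.array([[100.0, 150.0, 160.0, 90.0]])
print([round(t, 3) for t in dea_ccr_input(X, Y)])  # [1.0, 1.0, 0.8, 0.6]
```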
A dielectric logging tool with insulated collar for formation fluid detection around borehole
NASA Astrophysics Data System (ADS)
Wang, Bin; Li, Kang; Kong, Fan-Min; Zhao, Jia
2015-08-01
A dielectric tool with an insulated collar for analyzing fluid saturation outside a borehole was introduced. The UWB (ultra-wideband) antenna mounted on the tool was optimized to launch a transient pulse. The broadband evaluation method provided more advantages when compared with traditional dielectric tools. The EM (electromagnetic) power distribution outside the borehole was studied, and it was shown that energy was propagated in two modes. Furthermore, the mechanism of the modes was discussed. In order to increase this tool's investigation depth, a novel insulated collar was introduced. In addition, operation in different formations was discussed, and this tool proved able to efficiently launch lateral EM waves. Response voltages indicated that the proposed scheme was able to evaluate the fluid saturation of reservoir formations and dielectric dispersion properties. It may be used as an alternative tool for imaging logging applications.
Cooperative problem solving with personal mobile information tools in hospitals.
Buchauer, A; Werner, R; Haux, R
1998-01-01
Health-care professionals have a broad range of needs for information and cooperation while working at different points of care (e.g., outpatient departments, wards, and functional units such as operating theaters). Patient-related data and medical knowledge have to be widely available to support high-quality patient care. Furthermore, due to the increased specialization of health-care professionals, efficient collaboration is required. Personal mobile information tools have considerable potential to realize almost ubiquitous information and collaborative support. They make it possible to unite the functionality of conventional tools such as paper forms, dictating machines, and pagers in a single device. Moreover, they can extend the support already provided by clinical workstations. An approach is described for the integration of mobile information tools with heterogeneous hospital information systems. This approach includes identification of functions which should be provided on mobile tools. Major functions are the presentation of medical records and reports, electronic mailing to support interpersonal communication, and the provision of editors for structured clinical documentation. To realize those functions on mobile tools, we propose a document-based client-server architecture that enables mobile information tools to interoperate with existing computer-based application systems. Open application systems and powerful, partially wireless, hospital-wide networks are the prerequisites for the introduction of mobile information tools.
The Status and Promise of Advanced M&V: An Overview of “M&V 2.0” Methods, Tools, and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franconi, Ellen; Gee, Matt; Goldberg, Miriam
Advanced measurement and verification (M&V) of energy efficiency savings, often referred to as M&V 2.0 or advanced M&V, is currently an object of much industry attention. Thus far, however, there has been a lack of clarity about what techniques M&V 2.0 includes, how those techniques differ from traditional approaches, what the key considerations are for their use, and what value propositions M&V 2.0 presents to different stakeholders. The objective of this paper is to provide background information and frame key discussion points related to advanced M&V. The paper identifies the benefits, methods, and requirements of advanced M&V and outlines key technical issues for applying these methods. It presents an overview of the distinguishing elements of M&V 2.0 tools and of how the industry is addressing needs for tool testing, consistency, and standardization, and it identifies opportunities for collaboration. In this paper, we consider two key features of M&V 2.0: (1) automated analytics that can provide ongoing, near-real-time savings estimates, and (2) increased data granularity in terms of frequency, volume, or end-use detail. Greater data granularity for large numbers of customers, such as that derived from comprehensive implementation of advanced metering infrastructure (AMI) systems, leads to very large data volumes. This drives interest in automated processing systems. It is worth noting, however, that automated processing can provide value even when applied to less granular data, such as monthly consumption data series. Likewise, more granular data, such as interval or end-use data, delivers value with or without automated processing, provided the processing is manageable. But it is the combination of greater data detail with automated processing that offers the greatest opportunity for value. M&V methods that capture load shapes, combined with automated processing, can determine savings in near-real time and provide stakeholders with more timely and detailed information. This information can be used to inform ongoing building operations, provide early input on energy efficiency program design, or assess the impact of efficiency by location and time of day. Stakeholders who can make use of such information include regulators, energy efficiency program administrators, program evaluators, contractors and aggregators, building owners, the investment community, and grid planners. Although each stakeholder has its own priorities and challenges related to savings measurement and verification, the potential exists for all to draw from a single set of efficiency valuation data. Such an integrated approach could provide a base consistency across stakeholder uses.
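The core savings computation behind such automated analytics can be sketched generically (this is not from the paper; a real M&V 2.0 tool would use richer baseline models, e.g. time-of-week and change-point terms, running continuously on AMI data):

```python
# Generic avoided-energy-use sketch: fit a baseline model on
# pre-retrofit data, project it into the reporting period, and
# compare against metered use. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(3)
temp_base = rng.uniform(0, 30, 500)                        # baseline temps
use_base = 50 + 2.0 * temp_base + rng.normal(0, 5, 500)    # metered kWh

coef = np.polyfit(temp_base, use_base, 1)                  # baseline model

temp_post = rng.uniform(0, 30, 200)                        # reporting period
use_post = 40 + 1.6 * temp_post + rng.normal(0, 5, 200)    # after retrofit

predicted = np.polyval(coef, temp_post)                    # counterfactual
savings = (predicted - use_post).sum()
print(f"estimated avoided energy: {savings:.0f} kWh")
```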
ADAM: Analysis of Discrete Models of Biological Systems Using Computer Algebra
2011-01-01
Background: Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. Results: We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Conclusions: Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics. PMID:21774817
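The exhaustive enumeration that ADAM's algebraic approach avoids can be seen in a toy example (illustrative only, not ADAM's algorithm): brute force must scan all 2^n states, which is exactly what becomes infeasible for complex models:

```python
# Brute-force fixed-point search for a 3-gene synchronous Boolean
# network (a toy contrast to ADAM's polynomial-algebra encoding).
from itertools import product

def step(state):
    """One synchronous update of a hypothetical 3-gene network."""
    a, b, c = state
    return (b and not c,   # A is activated by B, repressed by C
            a,             # B copies A
            a or b)        # C integrates A and B

fixed_points = [s for s in product([False, True], repeat=3)
                if step(s) == s]
print(fixed_points)        # steady states of the toy network
```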
NASA Astrophysics Data System (ADS)
Bompard, E.; Ma, Y. C.; Ragazzi, E.
2006-03-01
Competition has been introduced in electricity markets with the goal of reducing prices and improving efficiency. The basic idea behind this choice is that, in competitive markets, a greater quantity of the good is exchanged at a lower price, leading to higher market efficiency. Electricity markets differ from other commodity markets mainly in the physical constraints related to the network structure, which may impact market performance. The network structure of the system on which the economic transactions take place poses strict physical and operational constraints. Strategic interactions among producers that game the market with the objective of maximizing their producer surplus must be taken into account when modeling competitive electricity markets. The physical constraints specific to electricity markets provide additional opportunities for gaming by market players. Game theory provides a tool to model such a context. This paper discusses the application of game theory to physically constrained electricity markets with the goal of providing tools for assessing market performance and pinpointing the critical network constraints that may impact market efficiency. The basic models of game theory specifically designed to represent electricity markets are presented. The IEEE 30-bus test system is used to show the network impacts on market performance in the presence of strategic bidding behavior by producers.
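A worked toy example of the kind of producer gaming such models capture, using an unconstrained Cournot duopoly rather than the paper's network-constrained IEEE 30-bus setting (all numbers hypothetical):

```python
# Cournot duopoly: two producers choose quantities; iterating best
# responses converges to the Nash equilibrium.
# Inverse demand P = a - b*(q1 + q2); constant marginal costs c1, c2.
a, b = 100.0, 1.0
c1, c2 = 10.0, 20.0

def best_response(c_own, q_other):
    # maximize (a - b*(q + q_other) - c_own) * q
    # => q = (a - c_own - b*q_other) / (2b)
    return max(0.0, (a - c_own - b * q_other) / (2.0 * b))

q1 = q2 = 0.0
for _ in range(100):                   # fixed-point iteration
    q1, q2 = best_response(c1, q2), best_response(c2, q1)

# Closed form for comparison: qi* = (a - 2*ci + cj) / (3*b)
print(round(q1, 2), round(q2, 2))      # 33.33, 23.33
print(round(a - b * (q1 + q2), 2))     # equilibrium price: 43.33
```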
3Drefine: an interactive web server for efficient protein structure refinement.
Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin
2016-07-08
3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of hydrogen bonding network combined with atomic-level energy minimization on the optimized model using a composite physics and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Got Graphs? An Assessment of Data Visualization Tools
NASA Technical Reports Server (NTRS)
Schaefer, C. M.; Foy, M.
2015-01-01
Graphs are powerful tools for simplifying complex data. They are useful for quickly assessing patterns and relationships among one or more variables from a dataset. As the amount of data increases, it becomes more difficult to visualize potential associations. Lifetime Surveillance of Astronaut Health (LSAH) was charged with assessing its current visualization tools along with others on the market to determine whether new tools would be useful for supporting NASA's occupational surveillance effort. It was concluded by members of LSAH that the current tools hindered their ability to provide quick results to researchers working with the department. Due to the high volume of data requests and the many iterations of visualizations requested by researchers, software with a better ability to replicate graphs and edit quickly could improve LSAH's efficiency and lead to faster research results.
Gilliam, Meredith; Krein, Sarah L; Belanger, Karen; Fowler, Karen E; Dimcheff, Derek E; Solomon, Gabriel
2017-01-01
Background: Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. Objective: To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. Methods: In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. Results: The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). Conclusion: The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process. PMID:28491308
Gilliam, Meredith; Krein, Sarah L; Belanger, Karen; Fowler, Karen E; Dimcheff, Derek E; Solomon, Gabriel
2017-01-01
Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process.
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
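A minimal sketch of a spectral phasor transform applied to a 3-band image follows; the harmonic choice and normalization are assumptions, not the paper's exact pipeline:

```python
# Spectral phasor sketch: each pixel's spectrum I(c), c = 0..2, maps
# to coordinates (G, S) via the first Fourier harmonic; pixels with
# distinct spectral shapes land at distinct phasor points.
import numpy as np

def spectral_phasor(img):
    """img: (H, W, C) nonnegative float array; returns (G, S) maps."""
    h, w, c = img.shape
    k = np.arange(c)
    cosk = np.cos(2.0 * np.pi * k / c)
    sink = np.sin(2.0 * np.pi * k / c)
    total = img.sum(axis=2) + 1e-12        # avoid divide-by-zero
    G = (img * cosk).sum(axis=2) / total
    S = (img * sink).sum(axis=2) / total
    return G, S

rgb = np.random.default_rng(1).random((4, 4, 3))
G, S = spectral_phasor(rgb)
print(G.shape, S.shape)                    # the (G, S) scatter is the plot
```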
Inverse Statistics and Asset Allocation Efficiency
NASA Astrophysics Data System (ADS)
Bolgorian, Meysam
In this paper, inverse statistics analysis is used to examine the effect of investment horizon on the efficiency of portfolio selection. Inverse statistics analysis, also known as the probability distribution of exit times, is a general tool for determining the distribution of the time at which a stochastic process first exits a given zone; it was applied to financial return time series in Refs. 1 and 2. This distribution yields an optimal investment horizon, the most likely horizon for gaining a specific return. Using samples of stocks from the Tehran Stock Exchange (TSE) as an emerging market and the S&P 500 as a developed market, the effect of the optimal investment horizon on asset allocation is assessed. It is found that taking the optimal investment horizon into account in the TSE leads to more efficiency for large portfolios, while for stocks selected from the S&P 500, regardless of portfolio size, this strategy not only fails to produce more efficient portfolios, but longer investment horizons provide more efficiency.
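A generic sketch of the exit-time computation (not the author's code; the drift and volatility values are hypothetical):

```python
# Inverse statistics: for a log-price series, record the first time
# each starting day's cumulative return reaches a target level rho;
# the histogram of these exit times peaks at the optimal horizon.
import numpy as np

def exit_times(log_price, rho):
    """First-passage times (in steps) to a gain of rho, per start index."""
    times = []
    n = len(log_price)
    for start in range(n - 1):
        moves = log_price[start + 1:] - log_price[start]
        hits = np.nonzero(moves >= rho)[0]
        if hits.size:
            times.append(hits[0] + 1)
    return np.array(times)

rng = np.random.default_rng(2)
logp = np.cumsum(rng.normal(0.0005, 0.02, 5000))  # synthetic log-prices
t = exit_times(logp, rho=0.05)
print(f"most likely horizon ~ {np.bincount(t).argmax()} steps")
```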
Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations
NASA Technical Reports Server (NTRS)
Callantine, Todd J.
2011-01-01
Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations and plans for recovering from them using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans.
IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
Radiolysis of ethanol and ethanol-water solutions: A tool for studying bioradical reactions
NASA Astrophysics Data System (ADS)
Jore, D.; Champion, B.; Kaouadji, N.; Jay-Gerin, J.-P.; Ferradini, C.
Radiolysis of pure ethanol and ethanol-water solutions is examined in view of its relevance to the study of biological radical mechanisms. On the basis of earlier studies, a consistent reaction scheme is adopted. New data on radical yields are obtained from the radiolysis of dilute solutions of vitamins E and C in these solvents. It is shown that the radiolysis of ethanolic solutions provides an efficient tool to study radical reactions of water-insoluble biomolecules.
Bamidis, P D; Lithari, C; Konstantinidis, S T
2010-01-01
With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces. PMID:21487489
ERIC Educational Resources Information Center
Kingiri, Ann N.
2013-01-01
Purpose: To reflect on the opportunities that a systems understanding of innovation provides for addressing gender issues relevant to women, and to provide some insight on how these might be tackled. Approach: Review of literature relating to gender issues and how they relate to achieving, on the one hand, equity and efficiency goals, and on the…
Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A
2010-01-01
Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.
USDA-ARS?s Scientific Manuscript database
Hyperspectral microscope imaging is presented as a rapid and efficient tool to classify foodborne bacteria species. The spectral data were obtained from five different species of Staphylococcus spp. with a hyperspectral microscope imaging system that provided a maximum of 89 contiguous spectral imag...
Physician efficiency and reimbursement: a case study.
Cantrell, L E; Flick, J A
1986-01-01
Joint ventures between hospitals and doctors are being widely developed and reported as the most promising mechanism for building alliances, providing financial rewards, and accessing new markets. However, joint ventures cannot be structured to involve an entire medical staff directly. Likewise, they cannot motivate a medical staff to change medical practice patterns in order to improve a hospital's reimbursement efficiency. This article describes a system of physician economic efficiency criteria that is being used by one hospital in making medical staff reappointment decisions and has the effect of placing all physicians at risk individually for the hospital's reimbursement performance. Although somewhat controversial, this economic efficiency program has proven a remarkably effective tool for change.
NREL's Building-Integrated Supercomputer Provides Heating and Efficient Computing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-09-01
NREL's Energy Systems Integration Facility (ESIF) is meant to investigate new ways to integrate energy sources so they work together efficiently, and one of the key tools in that investigation, a new supercomputer, is itself a prime example of energy systems integration. NREL teamed with Hewlett-Packard (HP) and Intel to develop the innovative warm-water, liquid-cooled Peregrine supercomputer, which not only operates efficiently but also serves as the primary source of building heat for ESIF offices and laboratories. This innovative high-performance computer (HPC) can perform more than a quadrillion calculations per second as part of the world's most energy-efficient HPC data center.
Chemical annotation of small and peptide-like molecules at the Protein Data Bank
Young, Jasmine Y.; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M.
2013-01-01
Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org PMID:24291661
Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised
NASA Technical Reports Server (NTRS)
Key, Jeffrey R.; Schweiger, Axel J.
1998-01-01
Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes, while the efficiency and simplicity of FluxNet make it ideal for operational use.
Schroy, Paul C; Mylvaganam, Shamini; Davidson, Peter
2014-02-01
Decision aids for colorectal cancer (CRC) screening have been shown to enable patients to identify a preferred screening option, but the extent to which such tools facilitate shared decision making (SDM) from the perspective of the provider is less well established. Our goal was to elicit provider feedback regarding the impact of a CRC screening decision aid on SDM in the primary care setting. Design: cross-sectional survey. Participants: primary care providers participating in a clinical trial evaluating the impact of a novel CRC screening decision aid on SDM and adherence. Measures: perceptions of the impact of the tool on decision making and implementation issues. Results: Twenty-nine of 42 (71%) eligible providers responded, including 27 internists and two nurse practitioners. The majority (>60%) felt that use of the tool complemented their usual approach, increased patient knowledge, helped patients identify a preferred screening option, improved the quality of decision making, saved time, and increased patients' desire to get screened. Respondents were more neutral in their assessment of whether the tool improved the overall quality of the patient visit or patient satisfaction. Fewer than 50% felt that the tool would be easy to implement into their practices or that it would be widely used by their colleagues. Conclusions: Decision aids for CRC screening can improve the quality and efficiency of SDM from the provider perspective, but future use is likely to depend on the extent to which barriers to implementation can be addressed. © 2011 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
3D Slicer as a tool for interactive brain tumor segmentation.
Kikinis, Ron; Pieper, Steve
2011-01-01
User interaction is required for reliable segmentation of brain tumors in clinical practice and in clinical research. By incorporating current research tools, 3D Slicer provides a set of interactive, easy to use tools that can be efficiently used for this purpose. One of the modules of 3D Slicer is an interactive editor tool, which contains a variety of interactive segmentation effects. Use of these effects for fast and reproducible segmentation of a single glioblastoma from magnetic resonance imaging data is demonstrated. The innovation in this work lies not in the algorithm, but in the accessibility of the algorithm because of its integration into a software platform that is practical for research in a clinical setting.
Zaidan, A A; Zaidan, B B; Kadhem, Z; Larbani, M; Lakulu, M B; Hashim, M
2015-02-01
This paper discusses the possibility of promoting public health and implementing educational health services using Facebook. We discuss the challenges and strengths of using such a platform as a tool for public health care systems from two different perspectives, namely, the view of IT developers and that of physicians. We present a new way of evaluating user interactivity in health care systems using tools provided by Facebook that measure Internet traffic statistics. Findings show that Facebook is a very promising tool for promoting e-health services in Web 2.0. Results from traffic statistics show that a Facebook page is more efficient than other pages in promoting public health.
Kharchenko, Maria S; Teslya, Petr N; Babaeva, Maria N; Zakataeva, Natalia P
2018-05-01
Bacillus subtilis pheS was genetically modified to obtain a counter-selection marker with high selection efficiency in Bacillus amyloliquefaciens. The application of the new replication-thermosensitive integrative vector pNZTM1, containing this marker, pheS BsT255S/A309G, with a two-step replacement recombination procedure provides an effective tool for the genetic engineering of industrially important Bacillus species. Copyright © 2018. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hotchkiss, E.
This fact sheet provides information on how private industry; federal, state, and local governments; non-profit organizations; and communities can utilize NREL's expertise, tools, and innovations to incorporate energy efficiency and renewable energy into the planning, recovery, and rebuilding stages of disaster.
USDA-ARS?s Scientific Manuscript database
Evaluating the effectiveness of conservation practices (CPs) is an important step to achieving efficient and successful water quality management. Watershed-scale simulation models can provide useful and convenient tools for this evaluation, but simulated conservation practice effectiveness should be...
ERIC Educational Resources Information Center
Olson, Gary A.
2007-01-01
Many professors, staff members, and even administrators see campus computers and e-mail accounts as their own private property--a type of employment benefit provided with no constraints on use. The fact is, universities "assign" computer equipment to personnel as tools to help them perform their jobs more effectively and efficiently, in the same…
INTERPRETATIONS AND LIMITATION OF PULMONARY FUNCTION TESTING IN SMALL LABORATORY ANIMALS
Pulmonary function tests are tools available to the researcher and clinician to evaluate the ability of the lung to perform its essential function of gas exchange. To meet this principal function, the lung needs to operate efficiently with minimal mechanical work as well as provid...
Improving Learning Object Quality: Moodle HEODAR Implementation
ERIC Educational Resources Information Center
Munoz, Carlos; Garcia-Penalvo, Francisco J.; Morales, Erla Mariela; Conde, Miguel Angel; Seoane, Antonio M.
2012-01-01
Automation toward efficiency is the aim of most intelligent systems in an educational context, in which automating the calculation of results allows experts to spend most of their time on important tasks rather than on retrieving, ordering, and interpreting information. In this paper, the authors provide a tool that easily evaluates Learning Objects quality…
The microcomputer scientific software series 4: testing prediction accuracy.
H. Michael Rauscher
1986-01-01
A computer program, ATEST, is described in this combination user's guide / programmer's manual. ATEST provides users with an efficient and convenient tool to test the accuracy of predictors. As input, ATEST requires observed-predicted data pairs. The output reports the two components of accuracy: bias and precision.
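The two accuracy components follow the standard definitions, which a minimal computation illustrates (a generic sketch of those definitions, not ATEST's implementation):

```python
# Bias = mean prediction error; precision = spread of errors about
# that bias. Data values are hypothetical.
import statistics

observed  = [10.2, 11.5, 9.8, 12.0, 10.9]
predicted = [10.0, 11.9, 9.5, 12.4, 11.2]

errors = [p - o for p, o in zip(predicted, observed)]
bias = statistics.mean(errors)         # systematic offset
precision = statistics.stdev(errors)   # scatter around the bias
print(f"bias = {bias:+.3f}, precision (SD of errors) = {precision:.3f}")
```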
Mobile phone tools for field-based health care workers in low-income countries.
Derenzi, Brian; Borriello, Gaetano; Jackson, Jonathan; Kumar, Vikram S; Parikh, Tapan S; Virk, Pushwaz; Lesh, Neal
2011-01-01
In low-income regions, mobile phone-based tools can improve the scope and efficiency of field health workers. They can also address challenges in monitoring and supervising a large number of geographically distributed health workers. Several tools have been built and deployed in the field, but little comparison has been done to help understand their effectiveness. This is largely because no framework exists in which to analyze the different ways in which the tools help strengthen existing health systems. In this article we highlight 6 key functions that health systems currently perform where mobile tools can provide the most benefit. Using these 6 health system functions, we compare existing applications for community health workers, an important class of field health workers who use these technologies, and discuss common challenges and lessons learned about deploying mobile tools. © 2011 Mount Sinai School of Medicine.
ARX - A Comprehensive Tool for Anonymizing Biomedical Data
Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.
2014-01-01
Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
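As a toy illustration of one privacy model in the family ARX implements, here is a k-anonymity check on a generalized table; this sketch is unrelated to ARX's actual engine, and the records are invented:

```python
# A table is k-anonymous when every combination of quasi-identifier
# values (here, generalized zip and age range) is shared by at least
# k records.
from collections import Counter

records = [
    {"zip": "537**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "537**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "537**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "538**", "age": "40-49", "diagnosis": "diabetes"},
    {"zip": "538**", "age": "40-49", "diagnosis": "flu"},
]
QUASI_IDENTIFIERS = ("zip", "age")

def is_k_anonymous(rows, k):
    groups = Counter(tuple(r[q] for q in QUASI_IDENTIFIERS) for r in rows)
    return min(groups.values()) >= k

print(is_k_anonymous(records, k=2))   # True
print(is_k_anonymous(records, k=3))   # False (second group has 2 rows)
```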
NASA Astrophysics Data System (ADS)
Guillet, S.; Gosmain, A.; Ducoux, W.; Ponçon, M.; Fontaine, G.; Desseix, P.; Perraud, P.
2012-05-01
The increasing use of composite materials in aircraft primary structures has raised new problems in the field of flight safety in lightning conditions. The consequences of this technological shift, which occurs in a parallel context of extension of electrified critical functions, are addressed by aircraft manufacturers through the enhancement of their available means of assessing lightning transients. On the one hand, simulation tools, provided an accurate description of the aircraft design is available, are today valuable assessment tools, in both predictive and operative terms. On the other hand, in-house test means allow confirmation and consolidation of design-office hardening solutions. The combined use of predictive simulation tools and in-house test means offers efficient and reliable support for all aircraft developments in their various life-time stages. The present paper provides PREFACE research project results that illustrate the above-introduced strategy on the de-icing system of the NH90 composite main rotor blade.
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains related to spatial properties. We tested the performance of the platform based on taxi trajectory analysis. Results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
Interactive Visual Analysis within Dynamic Ocean Models
NASA Astrophysics Data System (ADS)
Butkiewicz, T.
2012-12-01
The many observation and simulation based ocean models available today can provide crucial insights for all fields of marine research and can serve as valuable references when planning data collection missions. However, the increasing size and complexity of these models makes leveraging their contents difficult for end users. Through a combination of data visualization techniques, interactive analysis tools, and new hardware technologies, the data within these models can be made more accessible to domain scientists. We present an interactive system that supports exploratory visual analysis within large-scale ocean flow models. The currents and eddies within the models are illustrated using effective, particle-based flow visualization techniques. Stereoscopic displays and rendering methods are employed to ensure that the user can correctly perceive the complex 3D structures of depth-dependent flow patterns. Interactive analysis tools are provided which allow the user to experiment through the introduction of their customizable virtual dye particles into the models to explore regions of interest. A multi-touch interface provides natural, efficient interaction, with custom multi-touch gestures simplifying the otherwise challenging tasks of navigating and positioning tools within a 3D environment. We demonstrate the potential applications of our visual analysis environment with two examples of real-world significance: Firstly, an example of using customized particles with physics-based behaviors to simulate pollutant release scenarios, including predicting the oil plume path for the 2010 Deepwater Horizon oil spill disaster. Secondly, an interactive tool for plotting and revising proposed autonomous underwater vehicle mission pathlines with respect to the surrounding flow patterns predicted by the model; as these survey vessels have extremely limited energy budgets, designing more efficient paths allows for greater survey areas.
GEsture: an online hand-drawing tool for gene expression pattern search.
Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning
2018-01-01
Gene expression profiling data provide useful information for the investigation of biological functions and processes. However, identifying a specific expression pattern in extensive time-series gene expression data is not an easy task. Clustering, a popular method, is often used to group genes with similar expression profiles; however, genes with a 'desirable' or 'user-defined' pattern cannot be detected efficiently by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw a curve using a mouse instead of inputting abstract clustering parameters. GEsture takes a gene expression curve as input and searches time-series datasets for genes showing similar, opposite, and time-delayed expression patterns. We present three examples that illustrate the capacity of GEsture to hunt for genes matching users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps, and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, functions, and biological processes of the genes involved.
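A minimal sketch of the search idea described above, assuming a simple correlation-based matcher (the abstract does not specify GEsture's actual algorithm): genes are ranked by Pearson correlation against the drawn curve, with negative correlations flagging "opposite" patterns and shifted copies of the curve capturing time delays.

```python
import numpy as np

def pattern_search(curve, expr, max_lag=2):
    """curve: (T,) drawn pattern; expr: (genes, T) expression matrix.
    Returns the best signed correlation and lag for every gene."""
    best_r = np.zeros(expr.shape[0])
    best_lag = np.zeros(expr.shape[0], dtype=int)
    for lag in range(max_lag + 1):
        c = curve[:curve.size - lag] if lag else curve
        e = expr[:, lag:]
        cz = (c - c.mean()) / (c.std() + 1e-12)
        ez = (e - e.mean(axis=1, keepdims=True)) / (e.std(axis=1, keepdims=True) + 1e-12)
        r = (ez * cz).mean(axis=1)            # row-wise Pearson correlation
        upd = np.abs(r) > np.abs(best_r)      # keep the strongest match per gene
        best_r[upd], best_lag[upd] = r[upd], lag
    return best_r, best_lag

expr = np.random.rand(1000, 8)                # toy matrix: 1000 genes, 8 time points
curve = np.array([0, 1, 3, 6, 6, 3, 1, 0], float)
r, lag = pattern_search(curve, expr)
similar = np.argsort(-r)[:10]                 # top matches; np.argsort(r) gives 'opposite'
```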
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
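Echo itself is MATLAB-based, but two of its stated principles, self-describing data and automatic history tracking, can be illustrated with a small hypothetical Python class; the names and structure below are ours, not Echo's.

```python
# Conceptual sketch of self-describing data with provenance tracking.
import numpy as np

class SelfDescribing:
    def __init__(self, data, meta, history=()):
        self.data = np.asarray(data)
        self.meta = dict(meta)        # units, channel names, test ID, ...
        self.history = list(history)  # complete processing provenance

    def apply(self, fn, **params):
        """Run an operation and record it, merging parameters into history."""
        out = fn(self.data, **params)
        step = {"op": fn.__name__, "params": params}
        return SelfDescribing(out, self.meta, self.history + [step])

def detrend(x, order=1):
    """Remove a low-order polynomial trend from a 1D signal."""
    t = np.arange(x.size)
    return x - np.polyval(np.polyfit(t, x, order), t)

acc = SelfDescribing(np.random.randn(4096),
                     {"units": "g", "channel": "accel_03"})
clean = acc.apply(detrend, order=1)
print(clean.history)  # [{'op': 'detrend', 'params': {'order': 1}}]
```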
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. Availability: http://www.maizegenetics.net/GAPIT. Contact: zhiwu.zhang@cornell.edu. Supplementary data are available at Bioinformatics online.
Vandenberg, Ann E; Vaughan, Camille P; Stevens, Melissa; Hastings, Susan N; Powers, James; Markland, Alayne; Hwang, Ula; Hung, William; Echt, Katharina V
2017-02-01
Clinical decision support (CDS) may improve prescribing for older adults in the Emergency Department (ED) if adopted by providers. Existing prescribing order entry processes were mapped at an initial Veterans Administration Medical Center site, demonstrating cognitive burden, effort and safety concerns. Geriatric order sets incorporating 2012 Beers guidelines and including geriatric prescribing advice and prepopulated order options were developed. Geriatric order sets were implemented at two sites as part of the multicomponent 'Enhancing Quality of Prescribing Practices for Older Veterans Discharged from the Emergency Department' quality improvement initiative. Facilitators of and barriers to order set use at the two sites were evaluated. Phone interviews were conducted with two provider groups (n = 20): those 'EQUiPPED' with the interventions (n = 10, 5 at each site) and Comparison providers who were only exposed to order sets through a clickable option on the ED order menu within the patient's medical record (n = 10, 5 at each site). All providers were asked about order set 'use' and 'usefulness'. Users (n = 11) were asked about 'usability'. Order set adopters described 'usefulness' in terms of 'safety' and 'efficiency', whereas order set consultants and order set non-users described 'usefulness' in terms of 'information' or 'training'. Provider 'autonomy', 'comfort' level with existing tools, and 'learning curve' were stated as barriers to use. Quantifying efficiency advantages and communicating safety benefit over preexisting practices and tools may improve adoption of CDS in the ED and in other settings of care.
Metadata Authoring with Versatility and Extensibility
NASA Technical Reports Server (NTRS)
Pollack, Janine; Olsen, Lola
2004-01-01
NASA's Global Change Master Directory (GCMD) assists the scientific community in the discovery of and linkage to Earth science data sets and related services. The GCMD holds over 13,800 data set descriptions in Directory Interchange Format (DIF) and 700 data service descriptions in Service Entry Resource Format (SERF), encompassing the disciplines of geology, hydrology, oceanography, meteorology, and ecology. Data descriptions also contain geographic coverage information and direct links to the data, thus allowing researchers to discover data pertaining to a geographic location of interest, then quickly acquire those data. The GCMD strives to be the preferred data locator for world-wide directory-level metadata. In this vein, scientists and data providers must have access to intuitive and efficient metadata authoring tools. Existing GCMD tools are attracting widespread usage; however, a need for tools that are portable, customizable and versatile still exists. With tool usage directly influencing metadata population, it has become apparent that new tools are needed to fill these voids. As a result, the GCMD has released a new authoring tool allowing for both web-based and stand-alone authoring of descriptions. Furthermore, this tool incorporates the ability to plug-and-play the metadata format of choice, offering users options of DIF, SERF, FGDC, ISO or any other defined standard. Allowing data holders to work with their preferred format, as well as an option of a stand-alone application or web-based environment, docBUILDER will assist the scientific community in efficiently creating quality data and services metadata.
Kim, Dae Wook; Kim, Sug-Whan
2005-02-07
We present a novel simulation technique that offers efficient mass-fabrication strategies for 2 m class hexagonal mirror segments of extremely large telescopes. As the first of two studies in series, we establish the theoretical basis of the tool influence function (TIF) for precessing-tool polishing simulation of non-rotating workpieces. These theoretical TIFs were then used to confirm the reproducibility of the material removal footprints (measured TIFs) of the bulged precessing tooling reported elsewhere. This is followed by a reverse-computation technique that uses the simplex search method to trace the real polishing pressure from the empirical TIF. The technical details, together with the results and implications described here, provide the theoretical material-removal tool essential to the successful polishing simulation that will be reported in the second study.
Microgrid Analysis Tools Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jimenez, Antonio; Haase, Scott G; Mathur, Shivani
2018-03-05
The over-arching goal of the Alaska Microgrid Partnership is to reduce the total imported fuel used to secure all energy services in Alaska's remote microgrid communities by at least 50%, without increasing system life-cycle costs, while also improving overall system reliability, security, and resilience. One aim is to investigate whether a combination of energy efficiency and high-contribution (renewable energy) power systems can meet this target. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is drawn from the respective tool websites, tool developers, and author experience.
Design of Center-TRACON Automation System
NASA Technical Reports Server (NTRS)
Erzberger, Heinz; Davis, Thomas J.; Green, Steven
1993-01-01
A system for the automated management and control of terminal area traffic, referred to as the Center-TRACON Automation System (CTAS), is being developed at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver area and Dallas/Ft. Worth area air traffic control facilities. This paper reviews the CTAS architecture and automation functions, as well as the integration of CTAS into the existing operational system. CTAS consists of three types of integrated tools that provide computer-generated advisories for both en-route and terminal area controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), generates runway assignments, landing sequences and landing times for all arriving aircraft, including those originating from nearby feeder airports. TMA also assists in runway configuration control and flow management. Another tool, the Descent Advisor (DA), generates clearances for the en-route controllers handling arrival flows to metering gates. The DA's clearances ensure fuel-efficient and conflict-free descents to the metering gates at specified crossing times. In the terminal area, the Final Approach Spacing Tool (FAST) provides heading and speed advisories that help controllers produce an accurately spaced flow of aircraft on the final approach course. Databases consisting of several hundred aircraft performance models, airline preferred operational procedures, and a three-dimensional wind model support the operation of CTAS. The first component of CTAS, the Traffic Management Advisor, is being evaluated at the Denver TRACON and the Denver Air Route Traffic Control Center. The second component, the Final Approach Spacing Tool, will be evaluated in several stages at the Dallas/Fort Worth Airport beginning in October 1993. An initial stage of the Descent Advisor tool is being prepared for testing at the Denver Center in late 1994. Operational evaluations of all three integrated CTAS tools are expected to begin at the two field sites in 1995.
Nagata, Tomohisa; Mori, Koji; Aratake, Yutaka; Ide, Hiroshi; Ishida, Hiromi; Nobori, Junichiro; Kojima, Reiko; Odagami, Kiminori; Kato, Anna; Tsutsumi, Akizumi; Matsuda, Shinya
2014-01-01
The aim of the present study was to develop standardized cost estimation tools that provide employers with information about occupational safety and health (OSH) activities for effective and efficient decision making in Japanese companies. We interviewed OSH staff members, including full-time professional occupational physicians, to list all OSH activities. Using activity-based costing, cost data were obtained from retrospective analyses of OSH costs over a 1-year period in three manufacturing workplaces and of occupational health services costs in four manufacturing workplaces. We additionally verified the tools in four workplaces, including service businesses. We created standardized cost estimation tools for OSH and for occupational health. OSH costs consisted of personnel costs, expenses, outsourcing costs and investments for 15 OSH activities. The tools provided accurate, relevant information on OSH activities and occupational health services. The standardized information obtained from our cost estimation tools can be used to manage OSH costs, compare OSH costs between companies and organizations, and help occupational health physicians and employers determine the best course of action.
Liu, Nan; D'Aunno, Thomas
2012-04-01
To develop simple stylized models for evaluating the productivity and cost-efficiency of different practice models that involve nurse practitioners (NPs) in primary care, and in particular to generate insights into what affects the performance of these models and how. The productivity of a practice model is defined as the maximum number of patients that can be accounted for by the model under a given timeliness-to-care requirement; cost-efficiency is measured by the corresponding annual cost per patient in that model. Appropriate queueing analysis is conducted to generate formulas and values for these two performance measures. Model parameters for the analysis are extracted from the previous literature and survey reports. Sensitivity analysis is conducted to investigate the model performance under different scenarios and to verify the robustness of findings. Employing an NP, whose salary is usually lower than a primary care physician's, may not be cost-efficient, in particular when the NP's capacity is underutilized. Besides provider service rates, workload allocation among providers is one of the most important determinants of the cost-efficiency of a practice model involving NPs. Capacity pooling among providers could be a helpful strategy to improve efficiency in care delivery. The productivity and cost-efficiency of a practice model depend heavily on how providers organize their work and a variety of other factors related to the practice environment. Queueing theory provides useful tools to take these factors into account in making strategic decisions on staffing and panel size selection for a practice model.
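As a rough illustration of the kind of queueing calculation involved, the sketch below uses a plain M/M/1 approximation (the authors' actual model may differ): the largest sustainable panel is the one whose implied appointment request rate keeps the expected queueing delay under the timeliness target.

```python
def max_panel_size(mu, visits_per_patient_year, max_wait_days, workdays=250):
    """mu: visits/day the provider can complete; wait target in days."""
    # M/M/1 mean queueing delay: Wq = lam / (mu * (mu - lam)).
    # Setting Wq = W and solving for the arrival rate gives
    # lam = W * mu**2 / (1 + W * mu).
    lam = max_wait_days * mu**2 / (1 + max_wait_days * mu)
    return int(lam * workdays / visits_per_patient_year)

# e.g. an NP completing 20 visits/day, patients averaging 3 visits/year,
# and a 2-day timeliness target -> a panel of roughly 1600 patients
print(max_panel_size(mu=20, visits_per_patient_year=3, max_wait_days=2))
```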
NASA Astrophysics Data System (ADS)
Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao
2011-09-01
Many regions of China are still threatened by frequent floods and water shortages. Consequently, reproducing and predicting the hydrological processes in watersheds is a hard but unavoidable task for reducing the risks of damage and loss, and it is necessary to develop an efficient and cost-effective hydrological tool for the many areas of China that must be modeled. Established hydrological tools such as Mike SHE and ArcSWAT (the soil and water assessment tool based on ArcGIS) show significant power in improving the precision of hydrological modeling in China by considering spatial variability in both land cover and soil type. However, adopting commercial tools in such a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that can make it difficult for the user to carry out a simulation, lowering the efficiency of the modeling process. Moreover, commercial hydrological models usually cannot be modified or improved to suit some of the special hydrological conditions in China. Some other hydrological models are open source but integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed based on the open-source MapWindow GIS. Its purpose is to establish the first open-source, GIS-based distributed hydrological model tool in China by integrating modules for preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool. The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of the fully open-source GIS software MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules: a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module supports parallel computation, real-time computation, and visualization. The postprocessing module provides spatial visualization of model results in tabular form and on spatial grids. MWEasyDHM makes efficient modeling and calibration of EasyDHM possible, and promises further development of cost-effective applications in various watersheds.
MITHRA 1.0: A full-wave simulation tool for free electron lasers
NASA Astrophysics Data System (ADS)
Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.
2018-07-01
Free-electron lasers (FELs) are a solution for providing intense, coherent and bright radiation in the hard X-ray regime. Because of the low wall-plug efficiency of FEL facilities, it is crucial, and additionally very useful, to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in the FEL process was the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on finite-difference time-domain/particle-in-cell (FDTD/PIC) methods in a Lorentz-boosted coordinate system, which enables full-wave simulation of the FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without any of the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, optical wavelength and undulator period to values of the same order. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make full-wave simulation feasible with the available computational resources. Several examples of free-electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.
Sawja: Static Analysis Workshop for Java
NASA Astrophysics Data System (ADS)
Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine
Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. Efficiency and precision of such a tool rely partly on low level components which only depend on the syntactic structure of the language and therefore should not be redesigned for each implementation of a new static analysis. This paper describes the Sawja library: a static analysis workshop fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including i) efficient functional data-structures for representing a program with implicit sharing and lazy parsing, ii) an intermediate stack-less representation, and iii) fast computation and manipulation of complete programs. We provide experimental evaluations of the different features with respect to time, memory and precision.
Operationally Efficient Propulsion System Study (OEPSS): OEPSS Video Script
NASA Technical Reports Server (NTRS)
Wong, George S.; Waldrop, Glen S.; Trent, Donnie (Editor)
1992-01-01
The OEPSS video film, along with the OEPSS Databooks, provides a database of current launch experience that will be useful for the design of future expendable and reusable launch systems. The focus is on the launch processing of propulsion systems. A brief 15-minute overview of the OEPSS study results is found at the beginning of the film. The remainder of the film discusses in more detail: current ground operations at the Kennedy Space Center; typical operations issues and problems; critical operations technologies; and the efficiency of booster and space propulsion systems. The impact of system architecture on the launch site and its facility infrastructure is emphasized. Finally, a particularly valuable analytical tool, developed during the OEPSS study, is described that provides for the first time a quantitative measure of operations efficiency for a propulsion system.
Advanced control design for hybrid turboelectric vehicle
NASA Technical Reports Server (NTRS)
Abban, Joseph; Norvell, Johnesta; Momoh, James A.
1995-01-01
The new environmental standards are a challenge and an opportunity for the industry and government organizations that manufacture and operate urban mass transit vehicles. A research investigation to provide a control scheme for efficient power management of the vehicle is in progress. Design requirements have been established using functional analysis and trade studies of alternative power sources and controls. The design issues include portability, weight, and emission/fuel efficiency for the induction motor, permanent magnet, and battery options. A strategic design scheme to manage power requirements using advanced control systems is presented. It exploits fuzzy logic and a rule-based decision support scheme. The results of our study will enhance the economic and technical feasibility of providing a low-emission, fuel-efficient urban mass transit bus. The design team includes undergraduate researchers in our department. Sample results using the NASA HTEV simulation tool are presented.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knittel, Christopher; Wolfram, Catherine; Gandhi, Raina
A wide range of climate plans rely on energy efficiency to generate energy and carbon emissions reductions, but conventional wisdom holds that consumers have historically underinvested in energy efficiency upgrades. This underinvestment may occur for a variety of reasons, one of which is that consumers are not adequately informed about the benefits of energy efficiency. To address this, the U.S. Department of Energy created a tool called the Home Energy Score (HEScore) to act as a simple, low-cost means of providing clear information about a home's energy efficiency and motivating homeowners and homebuyers to invest in energy efficiency. The Department of Energy is conducting four evaluations assessing the impact of the Home Energy Score on residential energy efficiency investments and program participation. This paper describes one of these evaluations: a randomized controlled trial conducted in New Jersey in partnership with New Jersey Natural Gas. The evaluation randomly provides the Home Energy Score to homeowners who received an audit between May 2014 and October 2015, either because they had recently replaced their furnace, boiler, and/or gas water heater with a high-efficiency model and participated in a free audit to access an incentive, or because they requested an independent audit.
Design and Evaluation of the Terminal Area Precision Scheduling and Spacing System
NASA Technical Reports Server (NTRS)
Swenson, Harry N.; Thipphavong, Jane; Sadovsky, Alex; Chen, Liang; Sullivan, Chris; Martin, Lynne
2011-01-01
This paper describes the design, development and results from a high-fidelity human-in-the-loop simulation of an integrated set of trajectory-based automation tools providing precision scheduling, sequencing and controller merging and spacing functions. These integrated functions are combined into a system called the Terminal Area Precision Scheduling and Spacing (TAPSS) system. It is a strategic and tactical planning tool that provides Traffic Management Coordinators and En Route and Terminal Radar Approach Control air traffic controllers the ability to efficiently optimize the arrival capacity of a demand-impacted airport while simultaneously enabling fuel-efficient descent procedures. The TAPSS system consists of four-dimensional trajectory prediction, arrival runway balancing, aircraft separation constraint-based scheduling, traffic flow visualization and trajectory-based advisories to assist controllers in efficient metering, sequencing and spacing. The TAPSS system was evaluated and compared to today's ATC operations through an extensive series of human-in-the-loop simulations for arrival flows into Los Angeles International Airport. The test conditions varied aircraft demand from a baseline of today's capacity-constrained periods through 5%, 10% and 20% increases. Performance data were collected for engineering and human-factors analysis and compared with similar operations both with and without the TAPSS system. The engineering data indicate that operations with TAPSS show up to a 10% increase in airport throughput during capacity-constrained periods while maintaining fuel-efficient aircraft descent profiles from cruise to landing.
Method for Evaluating Energy Use of Dishwashers, Clothes Washers, and Clothes Dryers: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eastment, M.; Hendron, R.
Building America teams are researching opportunities to improve energy efficiency for some of the more challenging end-uses, such as lighting (both fixed and occupant-provided), appliances (clothes washer, dishwasher, clothes dryer, refrigerator, and range), and miscellaneous electric loads, which are all heavily dependent on occupant behavior and product choices. These end-uses have grown to be a much more significant fraction of total household energy use (as much as 50% for very efficient homes) as energy-efficient homes have become more commonplace through programs such as ENERGY STAR and Building America. As modern appliances become more sophisticated, the residential energy analyst is faced with a daunting task in trying to calculate the energy savings of high-efficiency appliances. Unfortunately, most whole-building simulation tools do not allow the input of detailed appliance specifications. Using DOE test procedures, the method outlined in this paper presents a reasonable way to generate inputs for whole-building energy-simulation tools. The information necessary to generate these inputs is available on EnergyGuide labels, the ENERGY STAR website, the California Energy Commission's appliance website and manufacturers' literature. Building America has developed a standard method for analyzing the effect of high-efficiency appliances on whole-building energy consumption compared to the Building America Research Benchmark building.
The elusive Heisenberg limit in quantum-enhanced metrology
Demkowicz-Dobrzański, Rafał; Kołodyński, Jan; Guţă, Mădălin
2012-01-01
Quantum precision enhancement is of fundamental importance for the development of advanced metrological optical experiments, such as gravitational wave detection and frequency calibration with atomic clocks. Precision in these experiments is strongly limited by the 1/√N shot noise factor with N being the number of probes (photons, atoms) employed in the experiment. Quantum theory provides tools to overcome the bound by using entangled probes. In an idealized scenario this gives rise to the Heisenberg scaling of precision 1/N. Here we show that when decoherence is taken into account, the maximal possible quantum enhancement in the asymptotic limit of infinite N amounts generically to a constant factor rather than quadratic improvement. We provide efficient and intuitive tools for deriving the bounds based on the geometry of quantum channels and semi-definite programming. We apply these tools to derive bounds for models of decoherence relevant for metrological applications including: depolarization, dephasing, spontaneous emission and photon loss. PMID:22990859
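In standard notation, the three precision scalings contrasted in this abstract can be written as follows (a sketch, with c an illustrative constant factor):

```latex
\[
  \Delta\varphi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}}
  \qquad\text{(shot noise, $N$ independent probes)}
\]
\[
  \Delta\varphi_{\mathrm{HL}} \sim \frac{1}{N}
  \qquad\text{(ideal entangled probes, Heisenberg scaling)}
\]
\[
  \Delta\varphi_{\mathrm{dec}} \sim \frac{c}{\sqrt{N}},\quad c < 1
  \qquad\text{(with decoherence: only a constant-factor gain survives)}
\]
```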
Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2013-09-01
Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.
Challenges to Integrating Geographically-Dispersed Data and Expertise at U.S. Volcano Observatories
NASA Astrophysics Data System (ADS)
Murray, T. L.; Ewert, J. W.
2010-12-01
During the past 10 years the data and information available to volcano observatories to assess hazards and forecast activity have grown dramatically, a trend that will likely continue. Similarly, the ability of observatories to draw upon external specialists who can provide needed expertise is also increasing. Though technology easily provides the ability to move large amounts of information to the observatory, the challenge remains to efficiently and quickly integrate useful information and expertise into the decision-making process. The problem is further exacerbated by the use of new research techniques during times of heightened activity. Eruptive periods typically accelerate research into volcanic processes as scientists use the opportunity to test new hypotheses and develop new tools. Such experimental methods can be extremely insightful, but may be less easily integrated into the normal data streams that inform decisions. Similarly, there is an increased need for collaborative tools that allow efficient and effective communication between the observatory and external experts. Observatories will continue to be the central focus for integrating information, assessing hazards, and communicating with the public, but will increasingly draw on experts at other observatories, government agencies, academia and even the private sector, both foreign and domestic, to provide analysis and assistance. Fostering efficient communication among such a diverse and geographically dispersed group is a challenge. Addressing these challenges is one of the goals of the U.S. National Volcano Early Warning System, falling under the effort to improve interoperability among the five U.S. volcano observatories and their collaborators. In addition to providing the mechanisms to handle the flow of data, efforts will be directed at simplifying - though retaining the required nuance - information and merging data streams while developing tools that enable observatory staff to quickly integrate the data into the decision-making process. Also, advances in the use of collaborative tools and organizational structure will be required if observatories are to tap into the intellectual resources throughout the volcanological community. The last 10 years saw a continuing explosion in the quantity and quality of data and expertise available to address volcano hazards and volcanic activity; the challenge over the next 10 years will be for us to make the best use of it.
Insight into efficient image registration techniques and the demons algorithm.
Vercauteren, Tom; Pennec, Xavier; Malis, Ezio; Perchant, Aymeric; Ayache, Nicholas
2007-01-01
As image registration becomes more and more central to many biomedical imaging applications, the efficiency of the algorithms becomes a key issue. Image registration is classically performed by optimizing a similarity criterion over a given spatial transformation space. Even if this problem is considered as almost solved for linear registration, we show in this paper that some tools that have recently been developed in the field of vision-based robot control can outperform classical solutions. The adequacy of these tools for linear image registration leads us to revisit non-linear registration and allows us to provide interesting theoretical roots to the different variants of Thirion's demons algorithm. This analysis predicts a theoretical advantage to the symmetric forces variant of the demons algorithm. We show that, on controlled experiments, this advantage is confirmed, and yields a faster convergence.
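For illustration, one update step of the symmetric-forces variant discussed above can be sketched in NumPy as below; the field smoothing (regularization) and multi-resolution handling of a production implementation are omitted.

```python
import numpy as np

def demons_step(fixed, moving, sigma2=1.0):
    """One displacement update (uy, ux) pushing `moving` toward `fixed`."""
    diff = fixed - moving
    gfy, gfx = np.gradient(fixed)
    gmy, gmx = np.gradient(moving)
    jy, jx = 0.5 * (gfy + gmy), 0.5 * (gfx + gmx)   # symmetric forces:
    denom = jy**2 + jx**2 + sigma2 * diff**2        # average both gradients
    denom[denom == 0] = np.inf                      # no update without signal
    return diff * jy / denom, diff * jx / denom

F = np.random.rand(64, 64)
M = np.roll(F, 2, axis=0)          # toy misaligned image pair
uy, ux = demons_step(F, M)         # one iteration; in practice the update is
                                   # smoothed and composed with previous ones
```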
Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites
NASA Astrophysics Data System (ADS)
Borkowski, Luke; Chattopadhyay, Aditi
2014-03-01
Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small scale damage; however due to the complex nature of GWs, accurate and efficient computation tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.
Microseismic Full Waveform Modeling in Anisotropic Media with Moment Tensor Implementation
NASA Astrophysics Data System (ADS)
Shi, Peidong; Angus, Doug; Nowacki, Andy; Yuan, Sanyi; Wang, Yanyan
2018-03-01
Seismic anisotropy, which is common in shale and fractured rocks, causes travel-time and amplitude discrepancies in different propagation directions. For microseismic monitoring, which is often implemented in shale or fractured rocks, seismic anisotropy needs to be carefully accounted for in source location and mechanism determination. We have developed an efficient finite-difference full waveform modeling tool with an arbitrary moment tensor source. The modeling tool is suitable for simulating wave propagation in anisotropic media for microseismic monitoring. As both dislocation and non-double-couple sources are often observed in microseismic monitoring, an arbitrary moment tensor source is implemented in our forward modeling tool. The increments of shear stress are equally distributed on the staggered grid to implement an accurate and symmetric moment tensor source. Our modeling tool provides an efficient way to obtain the Green's function in anisotropic media, which is key to anisotropic moment tensor inversion and source mechanism characterization in microseismic monitoring. In our research, wavefields in anisotropic media have been carefully simulated and analyzed for both surface arrays and downhole arrays. The variation characteristics of travel-time and amplitude of the direct P- and S-waves in vertical transverse isotropic media and horizontal transverse isotropic media are distinct, thus providing a feasible way to distinguish and identify the anisotropy type of the subsurface. Analyzing the travel-times and amplitudes of the microseismic data is a feasible way to estimate the orientation and density of the induced cracks in hydraulic fracturing. Our anisotropic modeling tool can be used to generate and analyze microseismic full wavefields with a full moment tensor source in anisotropic media, which can help promote the anisotropic interpretation and inversion of field data.
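The equal distribution of the shear-stress increment over staggered-grid nodes can be sketched schematically as below; this is a simplified 2D illustration with an ad hoc normalization, not the authors' 3D code.

```python
import numpy as np

nx = nz = 200
txx = np.zeros((nz, nx))                     # normal stress components
tzz = np.zeros((nz, nx))
txz = np.zeros((nz, nx))                     # shear stress on staggered nodes

def add_moment_tensor(ix, iz, M, stf, dt, dx):
    """M = [[Mxx, Mxz], [Mxz, Mzz]]; stf = source-time-function value."""
    s = stf * dt / dx**2                     # simplified normalization
    txx[iz, ix] -= M[0][0] * s               # diagonal terms at the cell center
    tzz[iz, ix] -= M[1][1] * s
    for dzi in (0, -1):                      # off-diagonal (shear) increment split
        for dxi in (0, -1):                  # equally over 4 surrounding txz nodes
            txz[iz + dzi, ix + dxi] -= 0.25 * M[0][1] * s

# toy explosive-plus-shear source injected mid-grid at one time step
add_moment_tensor(100, 100, [[1.0, 0.5], [0.5, -1.0]], stf=1.0, dt=1e-3, dx=5.0)
```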
Exploration Medical System Trade Study Tools Overview
NASA Technical Reports Server (NTRS)
Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.
2018-01-01
ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are part of the tool ecosystem. Initially, the presentation will focus on the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information about how medicine will be practiced is captured and categorized for efficient utilization in the tool suite. For example, this includes capturing which conditions will be planned for in-mission treatment, planned medical activities (e.g., periodic physical exams), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of how well medical system implementation options satisfy requirements and of per-capability contributions toward achieving requirements.
REACT Real-Time Emergency Action Coordination Tool
NASA Technical Reports Server (NTRS)
2004-01-01
Recently the Emergency Management Operations Center (EMOC) of St. Tammany Parish turned to the Technology Development and Transfer Office (TDTO) of NASA's Stennis Space Center (SSC) for help in combating the problems associated with water inundation. Working through a dual-use development agreement, the TDTO, EMOC, and a small geospatial applications company named Nvision provided the parish with a new front-line defense. REACT, the Real-Time Emergency Action Coordination Tool, is a decision support system that integrates disparate information to enable more efficient decision making by emergency management personnel.
NASA Astrophysics Data System (ADS)
Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina
2018-01-01
In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.
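A minimal sketch of computing such a CDF from an idealized trajectory: for each ligand pair in the ion's first shell, an ion-ligand distance is paired with the ligand-ion-ligand angle and accumulated in a 2D histogram whose peaks fingerprint the coordination polyhedron. The frame layout, ion index, and shell cutoff below are assumptions, not the authors' setup.

```python
import numpy as np

def cdf_histogram(frames, shell_cut=3.0, rbins=60, abins=60):
    """frames: iterable of (n_atoms, 3) arrays with the ion at index 0."""
    r_all, theta_all = [], []
    for xyz in frames:
        vec = xyz[1:] - xyz[0]                 # ion -> ligand vectors
        dist = np.linalg.norm(vec, axis=1)
        shell = np.where(dist < shell_cut)[0]  # first-shell ligands
        for i in shell:
            for j in shell:
                if i < j:
                    cosang = vec[i] @ vec[j] / (dist[i] * dist[j])
                    theta = np.degrees(np.arccos(np.clip(cosang, -1, 1)))
                    r_all += [dist[i], dist[j]]
                    theta_all += [theta, theta]
    return np.histogram2d(r_all, theta_all, bins=(rbins, abins),
                          range=((0, shell_cut), (0, 180)))

frames = np.random.rand(10, 50, 3) * 10        # toy stand-in for an MD run
H, redges, aedges = cdf_histogram(frames)      # peak pattern -> polyhedron type
```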
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
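The mechanism described, analytic derivatives enabling stable gradient-based optimization, can be illustrated with a toy OpenMDAO component (the framework Pycycle is built on); the component below is a hypothetical placeholder model, not an engine cycle.

```python
import openmdao.api as om

class Toy(om.ExplicitComponent):
    def setup(self):
        self.add_input("x", val=1.0)
        self.add_output("f", val=0.0)
        self.declare_partials("f", "x")          # analytic, not finite difference

    def compute(self, inputs, outputs):
        outputs["f"] = (inputs["x"] - 3.0) ** 2

    def compute_partials(self, inputs, partials):
        partials["f", "x"] = 2.0 * (inputs["x"] - 3.0)  # exact gradient

prob = om.Problem()
prob.model.add_subsystem("toy", Toy(), promotes=["*"])
prob.model.add_design_var("x", lower=-10, upper=10)
prob.model.add_objective("f")
prob.driver = om.ScipyOptimizeDriver(optimizer="SLSQP")
prob.setup()
prob.run_driver()
print(prob.get_val("x"))   # converges to ~3.0 using the analytic derivative
```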
The Alba ray tracing code: ART
NASA Astrophysics Data System (ADS)
Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi
2013-09-01
The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray-tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its use as part of optimization routines as well as easy control from a graphical user interface. Additional tools for slope-error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, while still providing normalized values of flux and resolution in physically meaningful units.
SAGE: The Self-Adaptive Grid Code. 3
NASA Technical Reports Server (NTRS)
Davies, Carol B.; Venkatapathy, Ethiraj
1999-01-01
The multi-dimensional self-adaptive grid code, SAGE, is an important tool in the field of computational fluid dynamics (CFD). It provides an efficient method to improve the accuracy of flow solutions while simultaneously reducing computer processing time. Briefly, SAGE enhances an initial computational grid by redistributing the mesh points into more appropriate locations. The movement of these points is driven by an equal-error-distribution algorithm that utilizes the relationship between high flow gradients and excessive solution errors. The method also provides a balance between clustering points in the high-gradient regions and maintaining the smoothness and continuity of the adapted grid. The latest version, Version 3, includes the ability to change the boundaries of a given grid to more efficiently enclose flow structures and provides alternative redistribution algorithms.
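The equal-error-distribution idea can be sketched in one dimension: choose a weight that grows with the flow gradient, then place the new points so that each interval carries equal cumulative weight. This is a simplified illustration of equidistribution, not SAGE's multi-dimensional algorithm.

```python
import numpy as np

def adapt_grid(x, f, npts=None, alpha=5.0):
    """Redistribute nodes x so each interval carries equal weight."""
    npts = npts or len(x)
    w = 1.0 + alpha * np.abs(np.gradient(f, x))   # weight: 1 + alpha*|df/dx|;
                                                  # the floor of 1 keeps smoothness
    # cumulative weight via the trapezoid rule, then invert it
    W = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    targets = np.linspace(0.0, W[-1], npts)       # equal weight per interval
    return np.interp(targets, W, x)

x = np.linspace(0.0, 1.0, 101)
f = np.tanh(50 * (x - 0.5))                       # sharp gradient at x = 0.5
x_new = adapt_grid(x, f)                          # points cluster near 0.5
```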
BamTools: a C++ API and toolkit for analyzing and managing BAM files.
Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T
2011-06-15
Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools.
Occupational health management: an audit tool.
Shelmerdine, L; Williams, N
2003-03-01
Organizations must manage occupational health risks in the workplace and the UK Health & Safety Executive (HSE) has published guidance on successful health and safety management. This paper describes a method of using the published guidance to audit the management of occupational health and safety, first at an organizational level and, secondly, to audit an occupational health service provider's role in the management of health risks. The paper outlines the legal framework in the UK for health risk management and describes the development and use of a tool for qualitative auditing of the efficiency, effectiveness and reliability of occupational health service provision within an organization. The audit tool is presented as a question set and the paper concludes with discussion of the strengths and weaknesses of using this tool, and recommendations on its use.
Developing Healthcare Data Analytics APPs with Open Data Science Tools.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong
2017-01-01
Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many tools, however, require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not mastered. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
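A minimal sketch of the practice described, assuming ipywidgets in a Jupyter Notebook: wrapping an existing analysis function with an interactive control turns the cell into a small APP that re-runs on every input change. The data frame and column names are hypothetical.

```python
import pandas as pd
from ipywidgets import interact

# stand-in for an existing analytics pipeline's input data
df = pd.DataFrame({"age": [30, 45, 60, 72], "ldl": [110, 150, 135, 160]})

@interact(min_age=(0, 90, 5))          # slider becomes the APP's control
def cohort_summary(min_age=40):
    """Re-runs on every slider move -- the 'real-time analysis' feature."""
    cohort = df[df["age"] >= min_age]
    return cohort.describe()
```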
Generating community-built tools for data sharing and analysis in environmental networks
Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David
2016-01-01
Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.
Lonsdale, Jemma; Nicholson, Rose; Weston, Keith; Elliott, Michael; Birchenough, Andrew; Sühring, Roxana
2018-02-01
Estuaries are amongst the most socio-economically and ecologically important environments; however, due to competing and conflicting demands, management is often challenging, with a complex legislative framework managed by multiple agencies. To facilitate understanding of this legislative framework, we have developed a GIS-based Estuarine Planning Support System tool. The tool integrates the requirements of the relevant legislation and provides a basis for assessing the current environmental state of an estuary as well as informing and assessing new plans to ensure a healthy estuarine state. The tool ensures that the information is easily accessible for regulators, managers, developers and the public. The tool is intended to be adaptable, but is assessed using the Humber Estuary, United Kingdom, as a case study area. The successful application of the tool to complex socio-economic and environmental systems demonstrates that it can efficiently guide users through the complex requirements needed to support sustainable development.
Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle
NASA Astrophysics Data System (ADS)
Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.
2017-06-01
The low-Reynolds number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high altitude, long endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed under a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to facilitate efficient computations, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted for application to flying-wing configurations to predict the wing weight and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern in the low-density environment of high-altitude flight for flying-wing aircraft and so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified Genetic Algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated using higher-order methods to provide confidence in the use of these computationally-efficient tools in the present design-optimization framework. This paper includes the results of employing the present optimization tools in the design of a HALE, flying-wing UAV to indicate that this is a viable design configuration option.
NASA Astrophysics Data System (ADS)
Versini, Pierre-Antoine; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2016-04-01
Concentrating buildings and socio-economic activities, urban areas are particularly vulnerable to hydrological risks. Changes in climate may intensify existing issues in stormwater management (due to impervious areas) and water supply (due to population growth). In this context, water-use efficiency and best water management practices are key issues in already-stressed urban environments. Blue and green infrastructures are nature-based solutions that combine blue and green systems to deliver multifunctional solutions and multiple benefits: increased amenity, urban heat island mitigation, biodiversity, reduced energy requirements, and more. They are particularly efficient at reducing the potential impact of new and existing developments with respect to stormwater and/or water supply issues. The Multi-Hydro distributed rainfall-runoff model is a tool adapted to managing the impacts of such infrastructures at the urban basin scale. It is a numerical platform that couples several models, each representing a specific portion of the water cycle in an urban environment: surface runoff and infiltration depending on a land-use classification, sub-surface processes, and sewer network drainage. Multi-Hydro is still being developed at the Ecole des Ponts (open access from https://hmco.enpc.fr/Tools-Training/Tools/Multi-Hydro.php) to take into account the wide complexity of urban environments. The latest advancements have made possible the representation of several blue and green infrastructures (green roofs, basins, swales). Applied to a new urban development project located in the Paris region, Multi-Hydro has been used to simulate the impact of implementing blue and green infrastructures, focusing in particular on their ability to fulfil the regulation rules established by local stormwater managers for connecting a parcel to the sewer network. The results show that a combination of several blue and green infrastructures, if widely implemented, could represent an efficient tool to ensure regulation rules at the parcel scale.
Googling DNA sequences on the World Wide Web.
Hajibabaei, Mehrdad; Singer, Gregory A C
2009-11-10
New web-based technologies provide an excellent opportunity for sharing and accessing information, and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm, and implemented it, for searching species-specific genomic sequences (DNA barcodes) using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages, and developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data and provides a robust and efficient solution for sequence search on the web. Integrating our search method with large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
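The core idea — cutting barcode and query sequences into fixed-length words so that a conventional text search engine can index and match them — can be sketched as follows; the overlapping tokenization and the word length of 8 are illustrative assumptions, not necessarily the authors' exact scheme.

```python
# Sketch: turn a DNA sequence into space-separated k-letter "words"
# so each barcode becomes a text document for a desktop search tool.
def to_words(seq: str, k: int = 8) -> str:
    seq = seq.upper().replace("\n", "")
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

barcode = "ACGTACGTGGCCTTAAGGCA"
print(to_words(barcode))
# A query sequence is tokenized the same way; shared words between
# the query and indexed barcodes drive the alignment-free match.
```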
Knowledge management in a waste based biorefinery in the QbD paradigm.
Rathore, Anurag S; Chopda, Viki R; Gomes, James
2016-09-01
Shifting the resource base from fossil feedstocks to renewable raw materials for the production of chemical products has opened up an area of novel applications for industrial biotechnology-based process tools. This review aims to provide a concise and focused discussion of recent advances in knowledge management that facilitate efficient and optimal operation of a biorefinery. The application of quality by design (QbD) and process analytical technology (PAT) as tools for knowledge creation and management at different levels is highlighted. The roles of process integration, government policies, knowledge exchange through collaboration, and the use of databases and computational tools are also touched upon. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, S.; Yin, H.; Kline, D. M.
2006-08-01
This paper describes a joint effort of the Institute for Electrical Engineering of the Chinese Academy of Sciences (IEE) and the U.S. National Renewable Energy Laboratory (NREL) to support China's rural electrification program. This project developed a design tool that provides guidelines both for off-grid renewable energy system designs and for cost-based tariff and finance schemes to support them. The tool was developed to capitalize on lessons learned from the Township Electrification Program that preceded the Village Electrification Program. We describe the methods used to develop the analysis, some indicative results, and the planned use of the tool in the Village Electrification Program.
Learning Technology: Enhancing Learning in New Designs for the Comprehensive High School.
ERIC Educational Resources Information Center
Damyanovich, Mike; And Others
Technology, directed to each of the parts that collectively give shape and direction to the school, should provide the critical mass necessary to realize the specifications for the New Designs for the Comprehensive High School project. Learners should have access to personal productivity tools that increase effectiveness and efficiency in the…
An intercomparison study of TSM, SEBS, and SEBAL using high-resolution imagery and lysimetric data
USDA-ARS?s Scientific Manuscript database
Over the past three decades, numerous remote-sensing-based ET mapping algorithms have been developed. These algorithms provide a robust, economical, and efficient tool for ET estimation at field and regional scales. The Two Source Model (TSM), Surface Energy Balance System (SEBS), and Surface Energy Ba...
Organize, Communicate, Empower! How Principals Can Make Time for Leadership
ERIC Educational Resources Information Center
Shaver, Heidi
2004-01-01
Instructional leaders need a wide range of skills and talents to be effective in today's schools and this resource will provide a variety of practical strategies and tools for efficiently handling all the details in order to increase productivity. This text highlights techniques, skills, and strategies related to Organization, Communication, and…
Computer Grading As an Instructional Tool.
ERIC Educational Resources Information Center
Rottmann, Ray M.; Hudson, H. T.
1983-01-01
Describes computer grading system providing/storing scores and giving feedback to instructors on how students are performing on a day-to-day basis and how they are handling course concepts. Focuses on the hardware and software of this efficient computerized grading package, which can be used with classes of 250 students (or larger). (Author/JN)
N-Sink: A Tool to Identify Nitrogen Sources and Sinks within a Watershed Framework
N-Sink is a customized ArcMap© program that provides maps of N sources and sinks within a watershed, and estimates the delivery efficiency of N movement from sources to the watershed outlet. The primary objective of N-Sink is to assist land use planners, watershed managers, and la...
A Model School Facility for Energy (with Related Video)
ERIC Educational Resources Information Center
Spangler, Seth; Crutchfield, Dave
2011-01-01
Energy modeling can be a powerful tool for managing energy-reduction concepts for an institution. Different types of energy models are developed at various stages of a project to provide data that can verify or disprove suggested energy-efficiency measures. Education institutions should understand what an energy model can do and, more important,…
A Multi-Language System for Knowledge Extraction in E-Learning Videos
ERIC Educational Resources Information Center
Sood, Aparesh; Mittal, Ankush; Sarthi, Divya
2006-01-01
The existing multimedia software in E-Learning does not provide first-rate multimedia data services to the common user; hence, E-Learning services still lack intelligence and sophisticated end-user tools for visualization and retrieval. An efficient approach to achieve tasks such as regional language narration, regional language…
Using data mining to segment healthcare markets from patients' preference perspectives.
Liu, Sandra S; Chen, Jie
2009-01-01
This paper aims to provide an example of how to use data mining techniques to identify patient segments regarding preferences for healthcare attributes and their demographic characteristics. Data were derived from a number of individuals who received in-patient care at a health network in 2006. Data mining and conventional hierarchical clustering with average linkage and Pearson correlation procedures are employed and compared to show how each procedure best determines segmentation variables. Data mining tools identified three differentiable segments by means of cluster analysis. These three clusters have significantly different demographic profiles. The study reveals, when compared with traditional statistical methods, that data mining provides an efficient and effective tool for market segmentation. When there are numerous cluster variables involved, researchers and practitioners need to incorporate factor analysis for reducing variables to clearly and meaningfully understand clusters. Interests and applications in data mining are increasing in many businesses. However, this technology is seldom applied to healthcare customer experience management. The paper shows that efficient and effective application of data mining methods can aid the understanding of patient healthcare preferences.
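A minimal sketch of the clustering step named above — hierarchical clustering with average linkage and a Pearson-correlation-based distance — using scipy; the patient preference ratings are fabricated for illustration.

```python
# Hierarchical clustering of patient preference ratings with average
# linkage and a correlation distance (1 - Pearson r between rows).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

ratings = np.array([  # rows = patients, cols = rated attributes
    [5, 1, 4, 2],
    [4, 2, 5, 1],
    [1, 5, 2, 4],
    [2, 4, 1, 5],
])

Z = linkage(ratings, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")  # request 2 segments
print(labels)  # two preference-based segments emerge
```

As the abstract notes, with many rating variables a factor analysis step would typically precede this so the resulting clusters stay interpretable.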
Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company
NASA Technical Reports Server (NTRS)
Lores, M. E.
1978-01-01
Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs, by performing computations using Navier-Stokes solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers also needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.
Sarritzu, Valerio; Sestu, Nicola; Marongiu, Daniela; Chang, Xueqing; Masi, Sofia; Rizzo, Aurora; Colella, Silvia; Quochi, Francesco; Saba, Michele; Mura, Andrea; Bongiovanni, Giovanni
2017-01-01
Metal-halide perovskite solar cells rival the best inorganic solar cells in power conversion efficiency, providing the outlook for efficient, cheap devices. In order for the technology to mature and approach the ideal Shockley-Queisser efficiency, experimental tools are needed to diagnose which processes limit performance, beyond simply measuring electrical characteristics that are often affected by parasitic effects and difficult to interpret. Here we study the microscopic origin of the recombination currents causing photoconversion losses with an all-optical technique, measuring the electron-hole free energy as a function of the exciting light intensity. Our method allows assessing the ideality factor and breaks down the electron-hole recombination current into bulk-defect and interface contributions, providing an estimate of the limit photoconversion efficiency without any real charge current flowing through the device. We identify Shockley-Read-Hall recombination as the main decay process in insulated perovskite layers and quantify the additional performance degradation due to interface recombination in heterojunctions. PMID:28317883
Coherent transport and energy flow patterns in photosynthesis under incoherent excitation.
Pelzer, Kenley M; Can, Tankut; Gray, Stephen K; Morr, Dirk K; Engel, Gregory S
2014-03-13
Long-lived coherences have been observed in photosynthetic complexes after laser excitation, inspiring new theories regarding the extreme quantum efficiency of photosynthetic energy transfer. Whether coherent (ballistic) transport occurs in nature and whether it improves photosynthetic efficiency remain topics of debate. Here, we use a nonequilibrium Green's function analysis to model exciton transport after excitation from an incoherent source (as opposed to coherent laser excitation). We find that even with an incoherent source, the rate of environmental dephasing strongly affects exciton transport efficiency, suggesting that the relationship between dephasing and efficiency is not an artifact of coherent excitation. The Green's function analysis provides a clear view of both the pattern of excitonic fluxes among chromophores and the multidirectionality of energy transfer that is a feature of coherent transport. We see that even in the presence of an incoherent source, transport occurs by qualitatively different mechanisms as dephasing increases. Our approach can be generalized to complex synthetic systems and may provide a new tool for optimizing synthetic light harvesting materials.
Protein structural similarity search by Ramachandran codes
Lo, Wei-Cheng; Huang, Po-Jung; Chang, Chih-Hung; Lyu, Ping-Chiang
2007-01-01
Background: Protein structural data have increased exponentially, such that fast and accurate tools are necessary for structure similarity searches. To improve search speed, several methods have been designed to reduce three-dimensional protein structures to one-dimensional text strings that are then analyzed by traditional sequence alignment methods; however, accuracy is usually sacrificed, and the speed is still unable to match that of sequence similarity search tools. Here, we aimed to improve the linear encoding methodology and develop efficient search tools that can rapidly retrieve structural homologs from large protein databases. Results: We propose a new linear encoding method, SARST (Structural similarity search Aided by Ramachandran Sequential Transformation). SARST transforms protein structures into text strings through a Ramachandran map organized by nearest-neighbor clustering and uses a regenerative approach to produce substitution matrices. Classical sequence similarity search methods can then be applied to the structural similarity search. Its accuracy is similar to Combinatorial Extension (CE), and it works over 243,000 times faster, searching 34,000 proteins in 0.34 sec with a 3.2-GHz CPU. SARST provides statistically meaningful expectation values to assess the retrieved information. It has been implemented as a web service and a stand-alone Java program able to run on many different platforms. Conclusion: As a database search method, SARST can rapidly distinguish high from low similarities and efficiently retrieve homologous structures. It demonstrates that the easily accessible linear encoding methodology has the potential to serve as a foundation for efficient protein structural similarity search tools. Such search tools should be applicable to automated, high-throughput functional annotation and prediction for the ever-increasing number of published protein structures in this post-genomic era. PMID:17716377
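A toy version of the linear-encoding idea — binning each residue's (phi, psi) backbone angles into a letter so that a 3D structure becomes a searchable string — might look like the following; the uniform 4x4 grid is an illustrative simplification of SARST's nearest-neighbor-clustered Ramachandran map.

```python
# Toy Ramachandran encoding: one letter per residue, chosen by which
# coarse (phi, psi) grid cell the backbone angles fall into.
import string

def ramachandran_code(phi_psi, bins=4):
    letters = string.ascii_uppercase
    out = []
    for phi, psi in phi_psi:  # angles in degrees, range -180..180
        i = min(int((phi + 180.0) / 360.0 * bins), bins - 1)
        j = min(int((psi + 180.0) / 360.0 * bins), bins - 1)
        out.append(letters[i * bins + j])
    return "".join(out)

# One helical-like and one extended-like residue:
print(ramachandran_code([(-60.0, -45.0), (-120.0, 130.0)]))  # "FD"
# The resulting strings can be fed to ordinary sequence-alignment
# tools, which is the source of the reported speedup.
```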
Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.
2016-01-01
Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research. PMID:28066459
GlycReSoft: A Software Package for Automated Recognition of Glycans from LC/MS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Evan; Tan, Yan; Tan, Yuxiang
2012-09-26
Glycosylation modifies the physicochemical properties and protein binding functions of glycoconjugates. These modifications are biosynthesized in the endoplasmic reticulum and Golgi apparatus by a series of enzymatic transformations that are under complex control. As a result, mature glycans on a given site are heterogeneous mixtures of glycoforms. This gives rise to a spectrum of adhesive properties that strongly influences interactions with binding partners and resultant biological effects. In order to understand the roles glycosylation plays in normal and disease processes, efficient structural analysis tools are necessary. In the field of glycomics, liquid chromatography/mass spectrometry (LC/MS) is used to profile the glycans present in a given sample. This technology enables comparison of glycan compositions and abundances among different biological samples, i.e. normal versus disease, normal versus mutant, etc. Manual analysis of the glycan profiling LC/MS data is extremely time-consuming, and efficient software tools are needed to eliminate this bottleneck. In this work, we have developed a tool to computationally model LC/MS data to enable efficient profiling of glycans. Using LC/MS data deconvoluted by Decon2LS/DeconTools, we built a list of unique neutral masses corresponding to candidate glycan compositions summarized over their various charge states, adducts and range of elution times. Our work aims to provide confident identification of true compounds in complex data sets that are not amenable to manual interpretation. This capability is an essential part of glycomics work flows. We demonstrate this tool, GlycReSoft, using an LC/MS dataset on tissue-derived heparan sulfate oligosaccharides. The software, code and a test data set are publicly archived under an open source license.
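The summarization step described here — collapsing deconvoluted observations over charge states into candidate neutral masses — can be roughly sketched as below; the proton mass is standard, but the tolerance and peak values are invented.

```python
# Rough sketch: derive neutral masses from (m/z, charge) pairs and
# group observations that agree within a tolerance (positive mode).
PROTON = 1.007276  # Da

def neutral_mass(mz: float, z: int) -> float:
    return (mz - PROTON) * z

peaks = [(667.23, 3), (1000.34, 2), (2000.68, 1)]  # invented (m/z, z)
masses = sorted(neutral_mass(mz, z) for mz, z in peaks)

tol, groups, current = 0.02, [], [masses[0]]
for m in masses[1:]:
    if m - current[-1] <= tol:
        current.append(m)      # same candidate composition
    else:
        groups.append(current)
        current = [m]
groups.append(current)
print(groups)  # the first two charge states collapse to one candidate
```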
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
A Fixed-Wing Aircraft Simulation Tool for Improving the Efficiency of DoD Acquisition
2015-10-05
...CREATE™-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE™-AV DaVinci [15-16], a conceptual through... Kestrel is a multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows... (report period Oct 2008-Sep 2015; authors Scott A. Morton and David R...)
NASA Astrophysics Data System (ADS)
Pascoe, Stephen; Iwi, Alan; kershaw, philip; Stephens, Ag; Lawrence, Bryan
2014-05-01
The advent of large-scale data and the consequent analysis problems have led to two new challenges for the research community: how to share such data to get the maximum value, and how to carry out efficient analysis. Solving both challenges requires a form of parallelisation: the first is social parallelisation (involving trust and information sharing), the second data parallelisation (involving new algorithms and tools). The JASMIN infrastructure supports both kinds of parallelism by providing a multi-tenant environment with petabyte-scale storage, VM provisioning and batch cluster facilities. The JASMIN Analysis Platform (JAP) is an analysis software layer for JASMIN which emphasises ease of transition from a researcher's local environment to JASMIN. JAP brings together tools traditionally used by multiple communities and configures them to work together, enabling users to move analysis from their local environment to JASMIN without rewriting code. JAP also provides facilities to exploit JASMIN's parallel capabilities whilst maintaining a familiar analysis environment wherever possible. Modern open-source analysis tools typically have many dependent packages, increasing the installation burden on system administrators. When one considers a suite of tools, often with both common and conflicting dependencies, analysis pipelines can become locked to a particular installation simply because of the effort required to reconstruct the dependency tree. JAP addresses this problem by providing a consistent suite of RPMs compatible with Red Hat Enterprise Linux and CentOS 6.4. Researchers can install JAP locally, either as RPMs or through a pre-built VM image, giving them the confidence that moving analysis to JASMIN will not disrupt their environment. Analysis parallelisation is in its infancy in the climate sciences, with few tools capable of exploiting any parallel environment beyond manual scripting across multiple processors. JAP begins to bridge this gap through a variety of higher-level tools for parallelisation and job scheduling, such as IPython-parallel, and MPI support for interactive analysis languages. We find that enabling even simple parallelisation of workflows, together with the state-of-the-art I/O performance of JASMIN storage, provides many users with the large increases in efficiency they need to scale their analyses to contemporary data volumes and tackle new, previously inaccessible problems.
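As a flavor of the "simple parallelisation" described, here is a hedged sketch using ipyparallel (IPython-parallel), one of the tools named above; the cluster setup, file paths, and per-file analysis function are assumptions for illustration.

```python
# Farm an embarrassingly parallel per-file analysis over workers
# (assumes a running cluster, e.g. started with `ipcluster start -n 8`).
import ipyparallel as ipp

rc = ipp.Client()                 # connect to the running cluster
view = rc.load_balanced_view()

def analyze(path):
    # Placeholder analysis; a real task might compute a statistic
    # from one model-output file on shared storage.
    import hashlib
    return path, hashlib.md5(path.encode()).hexdigest()

files = [f"/data/model/run_{i}.nc" for i in range(100)]  # hypothetical
results = view.map_sync(analyze, files)
print(results[:2])
```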
Shared control of a medical robot with haptic guidance.
Xiong, Linfei; Chng, Chin Boon; Chui, Chee Kong; Yu, Peiwu; Li, Yao
2017-01-01
Tele-operation in robotic surgery reduces radiation exposure during interventional radiological operations. However, endoscope vision without force feedback on the surgical tool increases the difficulty of precise manipulation and the risk of tissue damage. Shared control of vision and force provides a novel approach to enhanced control with haptic guidance, which could lead to subtle dexterity and better maneuverability during minimally invasive surgery (MIS). This paper provides an innovative shared control method for a robotic minimally invasive surgery system, in which vision and haptic feedback are incorporated to provide guidance cues to the clinician during surgery. The incremental potential field (IPF) method is utilized to generate a guidance path based on the anatomy of tissue and surgical tool interaction. Haptic guidance is provided at the master end to assist the clinician during tele-operated surgical robotic tasks. The approach has been validated with path-following and virtual tumor-targeting experiments. The experimental results demonstrate that, compared with vision-only guidance, shared control with vision and haptics improved the accuracy and efficiency of surgical robotic manipulation, reducing the tool-position error distance and execution time. The validation experiments demonstrate that the shared control approach could help the surgical robot system provide stable assistance and precise performance in executing the designated surgical task. The methodology could also be implemented with other surgical robots with different surgical tools and applications.
Design of a final approach spacing tool for TRACON air traffic control
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh
1989-01-01
This paper describes an automation tool that assists air traffic controllers at Terminal Radar Approach Control (TRACON) facilities in providing safe and efficient sequencing and spacing of arrival traffic. The automation tool, referred to as the Final Approach Spacing Tool (FAST), allows the controller to interactively choose various levels of automation and advisory information, ranging from predicted time errors to speed and heading advisories for controlling time error. FAST also uses a timeline to display current scheduling and sequencing information for all aircraft in the TRACON airspace. FAST combines accurate predictive algorithms and state-of-the-art mouse and graphical interface technology to present advisory information to the controller. Furthermore, FAST exchanges various types of traffic information and communicates with automation tools being developed for the Air Route Traffic Control Center. Thus it is part of an integrated traffic management system for arrival traffic at major terminal areas.
Mobile Autonomous Humanoid Assistant
NASA Technical Reports Server (NTRS)
Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.
2004-01-01
A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool-handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway™ Robotic Mobility Platform, yielding a dexterous, maneuverable humanoid well suited to aiding human co-workers in a range of environments. The system uses stereo vision to locate human teammates and tools, and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human-assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.
Laser-Assisted Stir Welding of 25-mm-Thick HSLA-65 Plate
NASA Astrophysics Data System (ADS)
Williamson, Keith M.
2002-12-01
Laser-assisted stir welding is a hybrid process that combines energy from a laser with frictional heating and mechanical energy to join materials in the solid state. The technology is an adaptation of friction stir welding, which is particularly suited to joining thick plates. Aluminum plates up to 75 mm thick have been successfully joined using friction stir welding. Since joining occurs in the solid state, stir technology offers the capability of fabricating full-penetration joints in thick plates with better mechanical properties and less weld distortion than is possible by fusion processes. Currently, friction stir welding is being used in several industries to improve productivity, reduce weight, and increase the strength of welded structures. Examples include: (a) the aircraft/aerospace industry, where stir technology is currently being used to fabricate the space shuttle's external tank as well as components of the Delta family of rockets; (b) the shipping industry, where container manufacturers are using stir technology to produce lighter containers with more payload capacity; and (c) the oil industry, where offshore platform manufacturers are using automated stir welding plants to fabricate large panels and structures up to 16 meters long, with widths as required. In all these cases, stir technology has been restricted to aluminum alloys; however, stainless and HSLA-65 steels have recently been stir welded with friction as the primary heat source. One of the difficulties in adapting stir welding to steel is tool wear, aggravated by the high tool rubbing velocities needed to provide frictional heat input into the material. Early work showed that the tool shoulder reached temperatures above 1000 °C and that the weld seam stayed within this temperature range for up to 25 mm behind the tool. Cross-sections of stir-welded samples showed that the heat-affected zone is relatively wide and follows the profile of the tool shoulder. Besides minimizing tool wear by increasing the energy into the material, another benefit of the proposed laser-assisted stir welding (LASW) process is to reduce the width of the heat-affected zone, which typically has the lowest hardness in the weld region. Additionally, thermal modeling of the friction stir process shows that the heat input is asymmetric and suggests that the degree of asymmetry could improve the efficiency of the process. These asymmetries occur because the leading edge of the tool supplies heat to cold material while the trailing edge provides heat to material already preheated by the leading edge. As a result, flow stresses on the advancing side of the joint are lower than the corresponding values on the retreating side. The proposed LASW process enhances these asymmetries by providing directional heating to increase the differential in flow stress across the joint and improve the stir-tool efficiency. Theoretically, the LASW process can provide the energy input to allow the flow stresses on the advancing side to approach zero and the stir efficiency to approach 100 percent. Reducing the flow stresses on the advancing side of the weld creates the greatest pressure differential across the stir weld and eliminates the possibility of voids on the advancing side of the joint. Small pressure differentials result in poor stir welds because voids on the advancing side are not filled by the plastic flow of material from the retreating side.
Development of High Efficiency (14%) Solar Cell Array Module
NASA Technical Reports Server (NTRS)
Iles, P. A.; Khemthong, S.; Olah, S.; Sampson, W. J.; Ling, K. S.
1979-01-01
The high-efficiency solar cells required for the low-cost modules were developed, and production tooling for the manufacture of the cells and modules was designed. The tooling consisted of: (1) a back-contact soldering machine; (2) a vacuum pickup; (3) antireflective coating tooling; and (4) a test fixture.
Patient Populations, Clinical Associations, and System Efficiency in Healthcare Delivery System
NASA Astrophysics Data System (ADS)
Liu, Yazhuo
Efforts to improve healthcare delivery usually involve studies and analysis of patient populations and healthcare systems. In this dissertation, I present research conducted in the following areas: identifying patient groups, improving treatments for specific conditions using statistical as well as data mining techniques, and developing new operations research models to increase system efficiency from the health institutions' perspective. The results provide a better understanding of high-risk patient groups, more accuracy in detecting disease correlations, and practical scheduling tools that consider uncertain operation durations and real-life constraints.
NASA transmission research and its probable effects on helicopter transmission design
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.
1983-01-01
Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.
NASA Technical Reports Server (NTRS)
Prevot, Thomas
2012-01-01
This paper describes the underlying principles and algorithms for computing the primary controller-managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications, and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1), which will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance-Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.
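To make the trajectory-based indications concrete, here is a schematic (not NASA's actual CMS algorithm) of how a delay indication and a crude speed advisory could follow from comparing estimated and scheduled times to a meter fix; all numbers are invented.

```python
# Schematic delay indication and speed advisory from the scheduled
# (STA) and estimated (ETA) times-to-go to a meter fix.
def advisory(dist_nm: float, eta_s: float, sta_s: float):
    delay_s = sta_s - eta_s                  # > 0: aircraft is early
    advised_kt = dist_nm / (sta_s / 3600.0)  # speed to arrive on time
    return delay_s, advised_kt

# 80 nm to the fix, arriving 100 s ahead of the slot at current speed:
delay, speed = advisory(dist_nm=80.0, eta_s=1000.0, sta_s=1100.0)
print(f"absorb {delay:.0f} s by slowing to ~{speed:.0f} kt")
```

A slot marker is essentially the along-path position an aircraft would occupy if it were exactly on schedule; the delay indication above is the time gap between that marker and the aircraft.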
Scheduling and Separating Departures Crossing Arrival Flows in Shared Airspace
NASA Technical Reports Server (NTRS)
Chevalley, Eric; Parke, Bonny K.; Lee, Paul; Omar, Faisal; Lee, Hwasoo; Beinert, Nancy; Kraut, Joshua M.; Palmer, Everett
2013-01-01
Flight efficiency and reduction of flight delays are among the primary goals of NextGen. In this paper, we propose a concept of shared airspace where departures fly across arrival flows, provided gaps are available in these flows. We explored solutions for temporally separating departures from arrival traffic, and pre-arranged procedures to support controllers' decisions. We conducted a Human-in-the-Loop simulation and assessed the efficiency and safety of 96 departures from the San Jose airport (SJC) climbing across the arrival airspace of the Oakland and San Francisco arrival flows. In our simulation, the SJC tower had a tool to schedule departures to fly across predicted gaps in the arrival flow. When departures were mistimed and separation could not be ensured, a safe but less efficient route was provided to the departures to fly under the arrival flows. Coordination using a point-out procedure allowed the arrival controller to control the SJC departures right after takeoff. We manipulated the accuracy of departure time (accurate vs. inaccurate) as well as which sector took control of the departures after takeoff (departure vs. arrival sector) in a 2x2 full factorial design. Results show that coordination time decreased and climb efficiency increased when the arrival sector controlled the aircraft right after takeoff. Climb efficiency also increased when departure times were more accurate. Coordination was shown to be a critical component of tactical operations in shared airspace. Although workload, coordination, and safety were judged acceptable by controllers in the simulation, it appears that in the field controllers would need improved tools and coordination procedures to support this procedure.
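A toy sketch of the gap-identification logic such a tower tool needs — find windows in the predicted arrival stream wide enough for a departure crossing plus separation buffers — follows; the crossing time and buffers are invented values.

```python
# Toy gap finder: windows between predicted arrival crossings wide
# enough for a departure needing `crossing` s plus buffers each side.
def find_gaps(arrival_times, crossing=60.0, buffer=30.0):
    need = crossing + 2 * buffer
    times = sorted(arrival_times)
    return [(a + buffer, b - buffer)
            for a, b in zip(times, times[1:]) if b - a >= need]

arrivals = [0, 90, 300, 360, 600]  # seconds from now (invented)
print(find_gaps(arrivals))         # [(120.0, 270.0), (390.0, 570.0)]
```

Mistimed departures that miss such a window would then be routed under the arrival flows, as in the simulation.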
Building Diagnostic Market Deployment - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katipamula, S.; Gayeski, N.
2012-04-30
Operational faults are pervasive across the commercial buildings sector, wasting energy and increasing energy costs by up to about 30% (Mills 2009, Liu et al. 2003, Claridge et al. 2000, Katipamula and Brambley 2008, and Brambley and Katipamula 2009). Automated fault detection and diagnostic (AFDD) tools provide capabilities essential for detecting and correcting these problems and eliminating the associated energy waste and costs. The U.S. Department of Energy's (DOE) Building Technology Program (BTP) has previously invested in developing and testing such diagnostic tools for whole-building (and major system) energy use, air handlers, chillers, cooling towers, chilled-water distribution systems, and boilers. These diagnostic processes can be used to make commercial buildings more energy efficient. The work described in this report was done as part of a Cooperative Research and Development Agreement (CRADA) between the U.S. Department of Energy's Pacific Northwest National Laboratory (PNNL) and KGS Building LLC (KGS). PNNL and KGS both believe that the widespread adoption of AFDD tools will result in significant reductions in energy and peak energy consumption. The report provides an introduction and summary of the various tasks performed under the CRADA. The CRADA project had three major focus areas: (1) Technical Assistance for Whole Building Energy Diagnostician (WBE) Commercialization, (2) Market Transfer of the Outdoor Air/Economizer Diagnostician (OAE), and (3) Development and Deployment of Automated Diagnostics to Improve Large Commercial Building Operations. PNNL has previously developed two diagnostic tools: (1) the whole building energy (WBE) diagnostician and (2) the outdoor air/economizer (OAE) diagnostician. The WBE diagnostician is currently licensed non-exclusively to one company. As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite, Clockworks. PNNL also provided validation data sets and the WBE software tool to validate the KGS implementation. The OAE diagnostician automatically detects and diagnoses problems with outdoor air ventilation and economizer operation for air handling units (AHUs) in commercial buildings using data available from building automation systems (BASs). As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite. PNNL also provided validation data sets and the OAE software tool to validate the KGS implementation. Finally, as part of this CRADA project, PNNL developed new processes to automate parts of the re-tuning process and transferred those processes to KGS for integration into their software product. The transfer of DOE-funded technologies will transform the commercial buildings sector by making buildings more energy efficient and reducing their carbon footprint. As part of the CRADA with PNNL, KGS implemented the whole building energy diagnostician, a portion of the outdoor air economizer diagnostician, and a number of measures that automate the identification of re-tuning measures.
Exploring oral nanoemulsions for bioavailability enhancement of poorly water-soluble drugs.
Kotta, Sabna; Khan, Abdul Wadood; Pramod, Kannissery; Ansari, Shahid H; Sharma, Rakesh Kumar; Ali, Javed
2012-05-01
More than 40% of new chemical entities discovered are poorly water soluble and suffer from low oral bioavailability. In recent years, nanoemulsions have received increasing attention as a tool for delivering these poorly bioavailable moieties in an efficient manner. This review gives a brief description of how oral nanoemulsions act as a tool to improve the bioavailability of poorly water-soluble drugs. The recurrent confusion found in the literature regarding the theory behind the formation of nanoemulsions is clarified, along with the difference between the nanoemulsion and lyotropic 'microemulsion' phases. This paper gives a clear overview of the possible methods for the preparation of nanoemulsions, with the advantages and disadvantages of each method described. A description of the stability problems of nanoemulsions and their prevention methods is also provided, in addition to a comprehensive update on the patents and research work done in the arena of oral nanoemulsions. Low-energy emulsification techniques can also produce stable nanoemulsions. Oral nanoemulsions thus promise to be a potential tool for the delivery of poorly water-soluble therapeutic moieties in a very efficient manner.
IDEAL: Images Across Domains, Experiments, Algorithms and Learning
NASA Astrophysics Data System (ADS)
Ushizima, Daniela M.; Bale, Hrishikesh A.; Bethel, E. Wes; Ercius, Peter; Helms, Brett A.; Krishnan, Harinarayan; Grinberg, Lea T.; Haranczyk, Maciej; Macdowell, Alastair A.; Odziomek, Katarzyna; Parkinson, Dilworth Y.; Perciano, Talita; Ritchie, Robert O.; Yang, Chao
2016-11-01
Research across science domains is increasingly reliant on image-centric data. Software tools are in high demand to uncover relevant, but hidden, information in digital images, such as those coming from faster next generation high-throughput imaging platforms. The challenge is to analyze the data torrent generated by the advanced instruments efficiently, and provide insights such as measurements for decision-making. In this paper, we overview work performed by an interdisciplinary team of computational and materials scientists, aimed at designing software applications and coordinating research efforts connecting (1) emerging algorithms for dealing with large and complex datasets; (2) data analysis methods with emphasis in pattern recognition and machine learning; and (3) advances in evolving computer architectures. Engineering tools around these efforts accelerate the analyses of image-based recordings, improve reusability and reproducibility, scale scientific procedures by reducing time between experiments, increase efficiency, and open opportunities for more users of the imaging facilities. This paper describes our algorithms and software tools, showing results across image scales, demonstrating how our framework plays a role in improving image understanding for quality control of existent materials and discovery of new compounds.
Massively parallel multicanonical simulations
NASA Astrophysics Data System (ADS)
Gross, Jonathan; Zierenberg, Johannes; Weigel, Martin; Janke, Wolfhard
2018-03-01
Generalized-ensemble Monte Carlo simulations such as the multicanonical method and similar techniques are among the most efficient approaches for simulations of systems undergoing discontinuous phase transitions or with rugged free-energy landscapes. As Markov chain methods, they are inherently serial computationally. It was demonstrated recently, however, that a combination of independent simulations that communicate weight updates at variable intervals allows for the efficient utilization of parallel computational resources for multicanonical simulations. Implementing this approach for the many-thread architecture provided by current generations of graphics processing units (GPUs), we show how it can be efficiently employed with of the order of 10^4 parallel walkers and beyond, thus constituting a versatile tool for Monte Carlo simulations in the era of massively parallel computing. We provide the fully documented source code for the approach applied to the paradigmatic example of the two-dimensional Ising model as starting point and reference for practitioners in the field.
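The weight iteration at the heart of the multicanonical method can be sketched in a few lines; this serial toy only illustrates the flat-histogram update rule that the paper's parallel walkers jointly feed at communication points, not the authors' GPU implementation.

```python
# Toy multicanonical recursion: after each pass, divide the sampling
# weights by the visit histogram so rarely visited energies are
# boosted, flattening the energy histogram over iterations.
import numpy as np

rng = np.random.default_rng(0)
n_bins = 10
log_w = np.zeros(n_bins)            # ln W(E), start flat

for it in range(20):
    # Stand-in for a (parallel) simulation pass: visits skewed
    # toward low-energy bins, as near a first-order transition.
    visits = rng.poisson(lam=np.linspace(100, 5, n_bins)) + 1
    log_w -= np.log(visits)         # W(E) <- W(E) / H(E)
    log_w -= log_w.max()            # fix normalization

print(np.round(log_w, 2))           # weights now favor the rare bins
```

In the parallel scheme, each walker samples with the current W(E), and the pooled histograms from all walkers drive the update.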
FASTSim: A Model to Estimate Vehicle Efficiency, Cost and Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooker, A.; Gonder, J.; Wang, L.
2015-05-04
The Future Automotive Systems Technology Simulator (FASTSim) is a high-level advanced vehicle powertrain systems analysis tool supported by the U.S. Department of Energy's Vehicle Technologies Office. FASTSim provides a quick and simple approach to compare powertrains and estimate the impact of technology improvements on light- and heavy-duty vehicle efficiency, performance, cost, and battery life over batches of real-world drive cycles. FASTSim's calculation framework and balance among detail, accuracy, and speed enable it to simulate thousands of driven miles in minutes. The key components and vehicle outputs have been validated by comparing the model outputs to test data for many different vehicles to provide confidence in the results. A graphical user interface makes FASTSim easy and efficient to use. FASTSim is freely available for download from the National Renewable Energy Laboratory's website (see www.nrel.gov/fastsim).
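For a feel of the kind of high-level, speed-oriented calculation such a tool performs (this is not FASTSim's actual model), here is a back-of-the-envelope road-load energy estimate over a toy drive cycle; all vehicle parameters are illustrative.

```python
# Back-of-the-envelope tractive energy over a toy drive cycle:
# aerodynamic drag + rolling resistance + inertia.
import numpy as np

mass, cd, area, crr = 1500.0, 0.30, 2.2, 0.009  # illustrative car
rho, g = 1.2, 9.81

t = np.arange(0.0, 60.0, 1.0)                   # 60 s toy cycle
v = np.clip(15.0 * np.sin(t / 10.0) + 10.0, 0.0, None)  # speed, m/s
a = np.gradient(v, t)

p = (0.5 * rho * cd * area * v**3   # aero drag power
     + crr * mass * g * v           # rolling resistance power
     + mass * a * v)                # inertial power
energy_kwh = np.trapz(np.maximum(p, 0.0), t) / 3.6e6
print(f"{energy_kwh:.3f} kWh over the toy cycle")
```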
Combining Induced Pluripotent Stem Cells and Genome Editing Technologies for Clinical Applications.
Chang, Chia-Yu; Ting, Hsiao-Chien; Su, Hong-Lin; Jeng, Jing-Ren
2018-01-01
In this review, we introduce current developments in induced pluripotent stem cells (iPSCs), site-specific nuclease (SSN)-mediated genome editing tools, and the combined application of these two novel technologies in biomedical research and therapeutic trials. The sustained pluripotency of iPSCs in vitro not only provides unlimited cell sources for basic research but also benefits precision medicine for human diseases. In addition, rapidly evolving SSN tools efficiently tailor genetic manipulations for exploring gene functions and can be utilized to correct genetic defects of congenital diseases in the near future. Combining iPSC and SSN technologies will create new, reliable human disease models with isogenic backgrounds in vitro and provide new solutions for cell replacement and precise therapies.
Social Media As a Leadership Tool for Pharmacists
Toney, Blake; Goff, Debra A.; Weber, Robert J.
2015-01-01
The profession of pharmacy is currently experiencing transformational change in health system practice models with pharmacists’ provider status. Gaining buy-in and support of stakeholders in medicine, nursing, and other advocates for patient care is critical. To this end, building momentum to advance the profession will require experimentation with and utilization of more efficient ways to disseminate relevant information. Traditional methods to communicate can be inefficient and painstakingly slow. Health care providers are turning to social media to network, connect, engage, educate, and learn. Pharmacy leaders can use social media as an additional tool in the leadership toolkit. This article of the Director’s Forum shows how social media can assist pharmacy leaders in further developing patient-centered pharmacy services. PMID:26448676
Liu, Nan; D'Aunno, Thomas
2012-01-01
Objective: To develop simple stylized models for evaluating the productivity and cost-efficiency of different practice models for involving nurse practitioners (NPs) in primary care, and in particular to generate insights on what affects the performance of these models and how. Data Sources and Study Design: The productivity of a practice model is defined as the maximum number of patients that can be accounted for by the model under a given timeliness-to-care requirement; cost-efficiency is measured by the corresponding annual cost per patient in that model. Appropriate queueing analysis is conducted to generate formulas and values for these two performance measures. Model parameters for the analysis are extracted from the previous literature and survey reports. Sensitivity analysis is conducted to investigate model performance under different scenarios and to verify the robustness of findings. Principal Findings: Employing an NP, whose salary is usually lower than a primary care physician's, may not be cost-efficient, in particular when the NP's capacity is underutilized. Besides provider service rates, workload allocation among providers is one of the most important determinants of the cost-efficiency of a practice model involving NPs. Capacity pooling among providers could be a helpful strategy to improve efficiency in care delivery. Conclusions: The productivity and cost-efficiency of a practice model depend heavily on how providers organize their work and a variety of other factors related to the practice environment. Queueing theory provides useful tools to take these factors into account when making strategic decisions on staffing and panel size selection for a practice model. PMID:22092009
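A hedged sketch of the style of queueing calculation behind these measures: treat one provider as an M/M/1 queue and find the largest demand rate whose expected wait stays under a timeliness target. The service rate and target below are invented, and the paper's models are more detailed.

```python
# M/M/1 sketch: largest demand one provider can absorb while the
# expected queue wait W_q = lam / (mu * (mu - lam)) stays on target.
def mm1_wait(lam: float, mu: float) -> float:
    assert lam < mu, "queue must be stable"
    return lam / (mu * (mu - lam))

mu = 4.0        # patients served per hour (invented)
target = 0.5    # tolerated expected wait, hours

lam = 0.0
while lam + 0.01 < mu and mm1_wait(lam + 0.01, mu) <= target:
    lam += 0.01
print(f"max sustainable demand ~ {lam:.2f} patients/hour")
```

The productivity measure in the abstract is this maximal rate translated into a panel size; cost-efficiency then divides annual practice cost by that panel.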
GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan
2015-04-01
Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such basic tools can be combined into complex analysis tools, e.g., in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g., ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but geologic mapping data can also be stored and accessed more efficiently using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface-roughness estimates are two common concepts of particular interest, and for both, a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on the extraction of landing-site characteristics using established criteria. We provide working examples and particularly focus on the concepts of terrain roughness as it is interpreted in geomorphology and engineering studies.
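One of the co-existing roughness definitions mentioned above is the local standard deviation of elevation in a sliding window. The sketch below computes that definition over a synthetic DEM; the window size and the random terrain are illustrative assumptions, not part of the toolset described.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_roughness(dem, window=5):
    """Per-pixel standard deviation of elevation in a (window x window) patch."""
    mean = uniform_filter(dem, size=window)
    mean_sq = uniform_filter(dem * dem, size=window)
    var = np.maximum(mean_sq - mean * mean, 0.0)  # guard against round-off
    return np.sqrt(var)

dem = np.random.default_rng(0).normal(0.0, 1.0, (256, 256))  # synthetic terrain
rough = local_roughness(dem, window=5)
print(rough.mean())
```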
An Interactive, Web-Based Approach to Metadata Authoring
NASA Technical Reports Server (NTRS)
Pollack, Janine; Wharton, Stephen W. (Technical Monitor)
2001-01-01
NASA's Global Change Master Directory (GCMD) serves a growing number of users by assisting the scientific community in the discovery of and linkage to Earth science data sets and related services. The GCMD holds over 8000 data set descriptions in Directory Interchange Format (DIF) and 200 data service descriptions in Service Entry Resource Format (SERF), encompassing the disciplines of geology, hydrology, oceanography, meteorology, and ecology. Data descriptions also contain geographic coverage information, thus allowing researchers to discover data pertaining to a particular geographic location as well as to a subject of interest. The GCMD strives to be the preeminent data locator for world-wide directory-level metadata. In this vein, scientists and data providers must have access to intuitive and efficient metadata authoring tools. Existing GCMD tools are not currently attracting widespread usage. With usage being the prime indicator of utility, it has become apparent that the current tools must be improved. As a result, the GCMD has released a new suite of web-based authoring tools that enable a user to create new data and service entries, as well as modify existing data entries. With these tools, a more interactive approach to metadata authoring is taken, as they feature a visual "checklist" of data/service fields that automatically updates when a field is completed. In this way, the user can quickly gauge which of the required and optional fields have not been populated. With the release of these tools, the Earth science community will be further assisted in efficiently creating quality data and services metadata. Keywords: metadata, Earth science, metadata authoring tools
Okabe, Yoshihiro; Asamizu, Erika; Saito, Takeshi; Matsukura, Chiaki; Ariizumi, Tohru; Brès, Cécile; Rothan, Christophe; Mizoguchi, Tsuyoshi; Ezura, Hiroshi
2011-01-01
To accelerate functional genomic research in tomato, we developed a Micro-Tom TILLING (Targeting Induced Local Lesions In Genomes) platform. DNA pools were constructed from 3,052 ethyl methanesulfonate (EMS) mutant lines treated with 0.5 or 1.0% EMS. The mutation frequency was calculated by screening 10 genes. The 0.5% EMS population had a mild mutation frequency of one mutation per 1,710 kb, whereas the 1.0% EMS population had a frequency of one mutation per 737 kb, a frequency suitable for producing an allelic series of mutations in the target genes. The overall mutation frequency was one mutation per 1,237 kb, which affected an average of three alleles per kilobase screened. To assess whether a Micro-Tom TILLING platform could be used for efficient mutant isolation, six ethylene receptor genes in tomato (SlETR1–SlETR6) were screened. Two allelic mutants of SlETR1 (Sletr1-1 and Sletr1-2) that resulted in reduced ethylene responses were identified, indicating that our Micro-Tom TILLING platform provides a powerful tool for the rapid detection of mutations in an EMS mutant library. This work provides a practical and publicly accessible tool for the study of fruit biology and for obtaining novel genetic material that can be used to improve important agronomic traits in tomato. PMID:21965606
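The reported frequencies are simple ratios of screened sequence to mutations found. A toy check of that arithmetic follows; the mutation counts are assumed purely for illustration so that the ratios reproduce the figures quoted above, and are not the platform's actual counts.

```python
def kb_per_mutation(total_screened_kb, mutations_found):
    """Kilobases of screened sequence per mutation detected."""
    return total_screened_kb / mutations_found

# Illustrative numbers only, chosen to reproduce the reported frequencies
# (one mutation per 1,710 kb and one per 737 kb), not real screening counts.
print(kb_per_mutation(17_100, 10))  # 0.5% EMS population -> 1,710 kb/mutation
print(kb_per_mutation(7_370, 10))   # 1.0% EMS population ->   737 kb/mutation
```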
Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.
Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M
2012-04-01
We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis. Through its custom-made and open-source design, the programme provides an efficient, reliable and easily expandable tool that can help a growing community of environmental microbiologists and researchers from other disciplines process and analyse their nanoSIMS data. © 2012 Society for Applied Microbiology and Blackwell Publishing Ltd.
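A minimal sketch of the accumulate-then-extract ROI workflow described above, on synthetic count planes; drift correction and interactive ROI drawing are omitted, and the arrays merely stand in for real nanoSIMS planes loaded from instrument files.

```python
import numpy as np

rng = np.random.default_rng(1)
planes_13c = rng.poisson(5.0, (10, 64, 64))    # 10 scanned planes of 13C counts
planes_12c = rng.poisson(100.0, (10, 64, 64))  # matching 12C counts

acc_13c = planes_13c.sum(axis=0)  # plane accumulation (drift correction omitted)
acc_12c = planes_12c.sum(axis=0)

roi_mask = np.zeros((64, 64), dtype=bool)
roi_mask[20:30, 20:30] = True  # a hand-drawn or classified ROI would replace this

ratio = acc_13c[roi_mask].sum() / acc_12c[roi_mask].sum()
print(f"13C/12C ratio in ROI: {ratio:.4f}")  # the kind of value exported per ROI
```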
Colombet, B; Woodman, M; Badier, J M; Bénar, C G
2015-03-15
The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
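As an illustration of the kind of computation the time-frequency plug-in performs, here is a self-contained Morlet-wavelet power transform in plain numpy. The signal, frequency grid and cycle count are assumptions, and AnyWave's actual plug-in API is not reproduced here.

```python
import numpy as np

fs = 256.0
t = np.arange(0.0, 4.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)

def morlet_power(signal, fs, freqs, n_cycles=5.0):
    """Wavelet power: convolve the signal with one complex Morlet per frequency."""
    power = np.empty((len(freqs), signal.size))
    for i, f in enumerate(freqs):
        sigma = n_cycles / (2 * np.pi * f)               # Gaussian width, seconds
        wt = np.arange(-4 * sigma, 4 * sigma, 1.0 / fs)  # wavelet support
        wavelet = np.exp(2j * np.pi * f * wt) * np.exp(-(wt ** 2) / (2 * sigma ** 2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))
        power[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

tf = morlet_power(sig, fs, freqs=np.arange(2.0, 40.0, 2.0))
print(tf.shape)  # (frequencies, samples); peak power should sit near 10 Hz
```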
Collision detection and modeling of rigid and deformable objects in laparoscopic simulator
NASA Astrophysics Data System (ADS)
Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru
2015-03-01
Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at no less than 30 fps. Our current laparoscopic simulator detects the collision between a point on the tool tip and the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
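A toy sketch of the multi-point idea: sample several points along the rigid rod and query each against the organ surface, here approximated by a KD-tree over mesh vertices rather than full triangle-level tests. All geometry and the contact threshold are invented; a real simulator would use the actual organ mesh and tool pose.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
organ_vertices = rng.normal(0.0, 1.0, (5000, 3))  # stand-in for organ mesh vertices
tree = cKDTree(organ_vertices)                    # spatial index for fast queries

tip = np.array([0.0, 0.0, 0.0])
handle = np.array([0.0, 0.0, 3.0])
samples = np.linspace(0.0, 1.0, 16)[:, None]
rod_points = tip + samples * (handle - tip)       # 16 points along the rigid rod

dist, idx = tree.query(rod_points)                # nearest vertex per rod point
contact_threshold = 0.05                          # assumed contact distance
contacts = np.nonzero(dist < contact_threshold)[0]
print("colliding rod samples:", contacts)         # feeds the deformation response
```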
Validation and Diagnostic Efficiency of the Mini-SPIN in Spanish-Speaking Adolescents
Garcia-Lopez, Luis Joaquín; Moore, Harry T. A.
2015-01-01
Objectives Social Anxiety Disorder (SAD) is one of the most common mental disorders in adolescence. Many validated psychometric tools are available to diagnose individuals with SAD efficaciously. However, there is a demand for shortened self-report instruments that identify adolescents at risk of developing SAD. We validate the Mini-SPIN and assess its diagnostic efficiency for this purpose in Spanish-speaking adolescents in Spain. Methods The psychometric properties of the 3-item Mini-SPIN scale for adolescents were assessed in a community sample (study 1) and a clinical sample (study 2). Results Study 1, consisting of 573 adolescents, found the Mini-SPIN to have appropriate internal consistency and high construct validity. Study 2 consisted of 354 adolescents (147 participants diagnosed with SAD and 207 healthy controls). Data revealed that the Mini-SPIN has good internal consistency, high construct validity and adequate diagnostic efficiency. Conclusions Our findings suggest that the Mini-SPIN has good psychometric properties in clinical and healthy control adolescents and in the general population, indicating that it can be used as a screening tool in Spanish-speaking adolescents. Cut-off scores are provided. PMID:26317695
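Diagnostic efficiency of a short screener comes down to the sensitivity/specificity trade-off at each candidate cut-off. The sketch below illustrates that calculation on simulated Mini-SPIN-like totals (3 items scored 0-4); apart from the group sizes taken from the abstract, the scores and the cut-off range are invented, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)
scores_sad = rng.binomial(12, 0.65, 147)   # simulated SAD group, totals 0-12
scores_ctrl = rng.binomial(12, 0.30, 207)  # simulated healthy controls

def sens_spec(cases, controls, cutoff):
    sensitivity = np.mean(cases >= cutoff)    # true positives / all cases
    specificity = np.mean(controls < cutoff)  # true negatives / all controls
    return sensitivity, specificity

for cutoff in range(4, 9):  # candidate cut-offs to tabulate
    se, sp = sens_spec(scores_sad, scores_ctrl, cutoff)
    print(f"cutoff {cutoff}: sensitivity={se:.2f}, specificity={sp:.2f}")
```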
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fathi Boukadi
2011-02-05
In this report, technologies for petroleum production and exploration enhancement in deepwater and mature fields are developed through basic and applied research by: (1) Designing new fluids to efficiently drill deepwater wells that cannot be cost-effectively drilled with current technologies. The new fluids will be heavy liquid foams that have low density at shallow depth to avoid formation breakdown and high density at drilling depth to control formation pressure. The goal of this project is to provide industry with formulations of new fluids for reducing casing programs and thus well construction cost in deepwater development. (2) Studying the effects of flue gas/CO2 huff-n-puff on incremental oil recovery in Louisiana oilfields bearing light oil. An artificial neural network (ANN) model will be developed and used to map recovery efficiencies for candidate reservoirs in Louisiana. (3) Arriving at a quantitative understanding of the three-dimensional controlled-source electromagnetic (CSEM) geophysical response of typical Gulf of Mexico hydrocarbon reservoirs. We will seek to make available tools for the qualitative, rapid interpretation of marine CSEM signatures, and tools for efficient, three-dimensional subsurface conductivity modeling.
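For point (2), the report proposes an ANN that maps reservoir descriptors to recovery efficiency. A hedged sketch of such a regressor follows; the features, the synthetic target and the network size are all invented for illustration, since the project's actual model and data are not described in this abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
X = rng.uniform(0.0, 1.0, (200, 4))  # e.g. porosity, permeability, depth, pressure
y = 0.3 * X[:, 0] + 0.5 * X[:, 1] - 0.2 * X[:, 2] + 0.1 * rng.normal(size=200)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X, y)                # train on the synthetic reservoir descriptors
print(model.predict(X[:3]))    # predicted incremental recovery for 3 reservoirs
```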
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hansen, Morten W.; Korosov, Anton
2017-04-01
Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the CF (Climate and Forecast) metadata conventions, which are designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This in turn allows the development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for the integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog that stores granular metadata describing the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew
2015-03-03
In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.
A tool for simulating parallel branch-and-bound methods
NASA Astrophysics Data System (ADS)
Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail
2016-01-01
The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution; the design and study of load balancing algorithms is therefore a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
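A toy version of the two ideas named above: subproblem evaluation is replaced by a stochastic branching process, and progress is counted in logical-time steps. The work-stealing rule and the branching probability are invented stand-ins for the load balancing strategies the real simulator studies.

```python
import random

def simulate(n_workers=4, p_branch=0.49, seed=7, max_steps=10_000):
    rng = random.Random(seed)
    queues = [[0]] + [[] for _ in range(n_workers - 1)]  # worker 0 holds the root
    solved = 0
    for step in range(max_steps):          # one iteration = one unit of logical time
        for w in range(n_workers):
            if queues[w]:
                queues[w].pop()            # "solve" one subproblem
                solved += 1
                if rng.random() < p_branch:
                    queues[w] += [0, 0]    # subproblem branches into two children
            else:
                donor = max(range(n_workers), key=lambda i: len(queues[i]))
                if len(queues[donor]) > 1: # idle worker steals from the longest queue
                    queues[w].append(queues[donor].pop())
        if not any(queues):
            return solved, step + 1        # nodes processed, logical steps used
    return solved, max_steps

print(simulate())
```

Varying `n_workers` and the stealing rule is exactly the kind of experiment such a simulator makes cheap: the search tree is random, so no real optimization problem needs to be solved.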
Laurent, Pélozuelo; Frérot, Brigitte
2007-12-01
Since the identification of the pheromone of the female European corn borer, Ostrinia nubilalis (Hübner), pheromone-baited traps have been regarded as a promising tool for monitoring populations of this pest. This article reviews the literature produced on this topic since the 1970s. Its aim is to provide extension entomologists and other researchers with the information necessary to establish an efficient trapping procedure for this moth. The different pheromone races of the European corn borer are described, and research results relating to the optimization of pheromone blend, pheromone bait, trap design, and trap placement are summarized, followed by a state-of-the-art summary of data comparing blacklight-trap and pheromone-baited-trap techniques for monitoring European corn borer flight. Finally, we identify the information required to definitively validate or invalidate pheromone-baited traps as an efficient decision-support tool in European corn borer control.
Jérôme, Marc; Martinsohn, Jann Thorsten; Ortega, Delphine; Carreau, Philippe; Verrez-Bagnis, Véronique; Mouchel, Olivier
2008-05-28
Traceability in the fish food sector plays an increasingly important role in consumer protection and confidence building. This is reflected by the introduction of legislation and rules covering traceability at national and international levels. Although traceability through labeling is well established and supported by the respective regulations, monitoring and enforcement of these rules are still hampered by the lack of efficient diagnostic tools. We describe protocols using a direct sequencing method based on 212-274-bp diagnostic sequences derived from species-specific mitochondrial DNA cytochrome b, 16S rRNA, and cytochrome oxidase subunit I sequences, which can be applied to unambiguously determine even closely related fish species in processed food products labeled "anchovy". Traceability of anchovy-labeled products is supported by the public online database AnchovyID (http://anchovyid.jrc.ec.europa.eu), which provides the data obtained during our study together with tools for analytical purposes.
Watts, Brook; Lawrence, Renée H; Drawz, Paul; Carter, Cameron; Shumaker, Amy Hirsch; Kern, Elizabeth F
2016-08-01
Effective team-based models of care, such as the Patient-Centered Medical Home, require electronic tools to support proactive population management strategies that emphasize care coordination and quality improvement. Despite the spread of electronic health records (EHRs) and vendors marketing population health tools, clinical practices still may lack the ability to have: (1) local control over types of data collected/reports generated, (2) timely data (eg, up-to-date data, not several months old), and accordingly (3) the ability to efficiently monitor and improve patient outcomes. This article describes a quality improvement project at the hospital system level to develop and implement a flexible panel management (PM) tool to improve care of subpopulations of patients (eg, panels of patients with diabetes) by clinical teams. An in-depth case analysis approach is used to explore barriers and facilitators in building a PM registry tool for team-based management needs using standard data elements (eg, laboratory values, pharmacy records) found in EHRs. Also described are factors that may contribute to sustainability; to date the tool has been adapted to 6 disease-focused subpopulations encompassing more than 200,000 patients. Two key lessons emerged from this initiative: (1) though challenging, team-based clinical end users and information technology needed to work together consistently to refine the product, and (2) locally developed population management tools can provide efficient data tracking for frontline clinical teams and leadership. The preliminary work identified critical gaps that were successfully addressed by building local PM registry tools from EHR-derived data and offers lessons learned for others engaged in similar work. (Population Health Management 2016;19:232-239).
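Once EHR-derived standard data elements are in a table, a panel-management query reduces to a filter over the panel. The sketch below flags diabetes-panel patients whose last HbA1c is overdue or above goal; the column names, values and thresholds are invented for illustration and do not reflect the system described in the article.

```python
import pandas as pd

# Toy registry rows; a real PM tool would populate these from EHR lab and
# pharmacy feeds for the whole subpopulation.
panel = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "last_a1c": [6.8, 9.4, 7.9],        # most recent HbA1c result (%)
    "days_since_a1c": [90, 400, 30],    # recency of that result
})

flagged = panel[(panel["last_a1c"] > 9.0) | (panel["days_since_a1c"] > 365)]
print(flagged["patient_id"].tolist())   # patients needing team outreach
```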
A Modeling Tool for Household Biogas Burner Flame Port Design
NASA Astrophysics Data System (ADS)
Decker, Thomas J.
Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
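The design targets named at the end, port hydraulic diameter among them, are simple geometric quantities. A sketch using the standard definition D_h = 4A/P for the two port shapes studied follows; the dimensions are examples, not the Lotus burner's actual port sizes.

```python
def hydraulic_diameter_circle(d):
    """For a circle, D_h = 4*(pi*d^2/4) / (pi*d) = d."""
    return d

def hydraulic_diameter_rect(a, b):
    """For an a x b rectangle, D_h = 4ab / (2(a+b)) = 2ab/(a+b)."""
    return 2 * a * b / (a + b)

print(hydraulic_diameter_circle(2.5e-3))      # 2.5 mm circular port (m)
print(hydraulic_diameter_rect(4e-3, 1.5e-3))  # 4.0 mm x 1.5 mm slot port (m)
```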
An Overview of Tools for Creating, Validating and Using PDS Metadata
NASA Astrophysics Data System (ADS)
King, T. A.; Hardman, S. H.; Padams, J.; Mafi, J. N.; Cecconi, B.
2017-12-01
NASA's Planetary Data System (PDS) has defined information models for creating metadata to describe bundles, collections and products for all the assets acquired by planetary science projects. Version 3 of the PDS Information Model (commonly known as "PDS3") is widely used and describes most of the existing planetary archive. Recently PDS has released version 4 of the Information Model (commonly known as "PDS4"), which is designed to improve the consistency, efficiency and discoverability of information. To aid in creating, validating and using PDS4 metadata, the PDS and a few associated groups have developed a variety of tools. In addition, some commercial tools, both free and for a fee, can be used to create and work with PDS4 metadata. We present an overview of these tools, describe those currently under development, and provide guidance as to which tools may be most useful for missions, instrument teams and the individual researcher.
Teo, Chin Chye; Tan, Swee Ngin; Yong, Jean Wan Hong; Hew, Choy Sin; Ong, Eng Shi
2009-02-01
An approach that combined green-solvent extraction methods with chromatographic chemical fingerprints and pattern recognition tools such as principal component analysis (PCA) was used to evaluate the quality of medicinal plants. Pressurized hot water extraction (PHWE) and microwave-assisted extraction (MAE) were used, and their efficiencies in extracting two bioactive compounds, namely stevioside (SV) and rebaudioside A (RA), from Stevia rebaudiana Bertoni (SB) grown under different cultivation conditions were compared. The proposed methods showed that SV and RA could be extracted from SB using pure water under optimized conditions. The extraction efficiency of the methods was observed to be higher than or comparable to heating under reflux with water. The method precision (RSD, n = 6) was found to vary from 1.91 to 2.86% for the two different methods on different days. Compared to PHWE, MAE has higher extraction efficiency with a shorter extraction time. MAE was also found to extract more chemical constituents and to provide distinctive chemical fingerprints for quality control purposes. Thus, a combination of MAE with chromatographic chemical fingerprints and PCA provided a simple and rapid approach for the comparison and classification of medicinal plants from different growth conditions. Hence, the current work highlights the importance of the extraction method in chemical fingerprinting for the classification of medicinal plants from different cultivation conditions with the aid of pattern recognition tools.
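The pattern-recognition step reduces each chromatogram to a point in a low-dimensional score space, where cultivation conditions should separate. A minimal PCA sketch on a simulated fingerprint matrix follows; real rows would be peak-area profiles from the PHWE or MAE extracts, and the two simulated groups are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
cond_a = rng.normal(1.0, 0.1, (8, 50))  # 8 samples x 50 retention-time features
cond_b = rng.normal(1.3, 0.1, (8, 50))  # second cultivation condition
X = np.vstack([cond_a, cond_b])         # fingerprint matrix, one row per extract

scores = PCA(n_components=2).fit_transform(X)
print(scores[:, 0])  # PC1 should separate the two growth conditions
```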
MilxXplore: a web-based system to explore large imaging datasets.
Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J
2013-01-01
As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As cohort sizes keep increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. MilxXplore is an open-source visualization platform which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user-friendly, collaborative and efficient way. Compared to existing software solutions that often provide an overview of the results at the subject level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparison of the results against the rest of the population. MilxXplore is fast and flexible, allows remote quality checks of processed imaging data, facilitates data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important for sharing and publishing results of imaging analyses.
Hallgren, Kevin A; Bauer, Amy M; Atkins, David C
2017-06-01
Clinical decision making encompasses a broad set of processes that contribute to the effectiveness of depression treatments. There is emerging interest in using digital technologies to support effective and efficient clinical decision making. In this paper, we provide "snapshots" of research and current directions on ways that digital technologies can support clinical decision making in depression treatment. Practical facets of clinical decision making are reviewed, then research, design, and implementation opportunities where technology can potentially enhance clinical decision making are outlined. Discussions of these opportunities are organized around three established movements designed to enhance clinical decision making for depression treatment, including measurement-based care, integrated care, and personalized medicine. Research, design, and implementation efforts may support clinical decision making for depression by (1) improving tools to incorporate depression symptom data into existing electronic health record systems, (2) enhancing measurement of treatment fidelity and treatment processes, (3) harnessing smartphone and biosensor data to inform clinical decision making, (4) enhancing tools that support communication and care coordination between patients and providers and within provider teams, and (5) leveraging treatment and outcome data from electronic health record systems to support personalized depression treatment. The current climate of rapid changes in both healthcare and digital technologies facilitates an urgent need for research, design, and implementation of digital technologies that explicitly support clinical decision making. Ensuring that such tools are efficient, effective, and usable in frontline treatment settings will be essential for their success and will require engagement of stakeholders from multiple domains. © 2017 Wiley Periodicals, Inc.
Structural-Vibration-Response Data Analysis
NASA Technical Reports Server (NTRS)
Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.
1983-01-01
Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.
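The time-domain least-squares idea is easy to sketch: fit a damped sinusoid to a free-decay record and read off the frequency and decay rate. The parameters below are synthetic, and this single-mode toy does not attempt the simultaneous multi-mode, multi-response identification the Shuttle-era program performed.

```python
import numpy as np
from scipy.optimize import curve_fit

def free_decay(t, amp, decay, wd, phase):
    """Single-mode free decay: amp * exp(-decay*t) * cos(wd*t + phase)."""
    return amp * np.exp(-decay * t) * np.cos(wd * t + phase)

t = np.linspace(0.0, 2.0, 1000)
true = free_decay(t, 1.0, 0.8, 2 * np.pi * 5.0, 0.3)      # 5 Hz mode, known decay
noisy = true + 0.05 * np.random.default_rng(8).normal(size=t.size)

popt, _ = curve_fit(free_decay, t, noisy, p0=[1.0, 1.0, 2 * np.pi * 4.5, 0.0])
print(popt)  # recovered amplitude, decay rate, damped frequency, phase
```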
Visualization of Concurrent Program Executions
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Honiden, Shinichi
2007-01-01
Various program analysis techniques are efficient at discovering failures and properties. However, it is often difficult to evaluate results, such as program traces. This calls for abstraction and visualization tools. We propose an approach based on UML sequence diagrams, addressing shortcomings of such diagrams for concurrency. The resulting visualization is expressive and provides all the necessary information at a glance.
Design Aids for Real-Time Systems (DARTS)
NASA Technical Reports Server (NTRS)
Szulewski, P. A.
1982-01-01
Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.
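The "Software Science parameters" DARTS counts are Halstead's: distinct and total operators and operands, from which volume, difficulty and effort follow. A toy counter is sketched below with a naive tokenizer and a deliberately truncated operator set; a real metrics tool would use a proper lexer for the target language.

```python
import math
import re

OPERATORS = {"+", "-", "*", "/", "=", "==", "<", ">", "(", ")", "if", "while"}

def halstead(code):
    tokens = re.findall(r"[A-Za-z_]\w*|==|[-+*/=<>()]|\d+", code)
    ops = [t for t in tokens if t in OPERATORS]
    operands = [t for t in tokens if t not in OPERATORS]
    n1, n2 = len(set(ops)), len(set(operands))      # distinct operators/operands
    N1, N2 = len(ops), len(operands)                # total occurrences
    vocabulary, length = n1 + n2, N1 + N2
    volume = length * math.log2(vocabulary)         # Halstead volume
    difficulty = (n1 / 2) * (N2 / n2)               # Halstead difficulty
    return {"volume": volume, "difficulty": difficulty, "effort": volume * difficulty}

print(halstead("x = a + b * (c - 1)"))
```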
Development of Diagnostic Analytical and Mechanical Ability Tests through Facet Design and Analysis.
ERIC Educational Resources Information Center
Guttman, Louis; Schlesinger, I. M.
Methodology based on facet theory (modified set theory) was used in test construction and analysis to provide an efficient tool of evaluation for vocational guidance and vocational school use. The type of test development undertaken was limited to the use of nonverbal pictorial items. Items for testing ability to identify elements belonging to an…
A Preliminary Study on Building an E-Education Platform for Indian School-Level Curricula
ERIC Educational Resources Information Center
Kanth, Rajeev Kumar; Laakso, Mikko-Jussi
2016-01-01
In this study, we explore the possibilities of utilizing and implementing an e-Education platform for Indian school-level curricula. This study will demonstrate how the e-Education platform has a positive effect on students' learning and how this tool helps in managing the overall teaching processes efficiently. Before describing the…
Introduction to the Use of Computers in Libraries: A Textbook for the Non-Technical Student.
ERIC Educational Resources Information Center
Ogg, Harold C.
This book outlines computing and information science from the perspective of what librarians and educators need to do with computer technology and how it can help them perform their jobs more efficiently. It provides practical explanations and library applications for non-technical users of desktop computers and other library automation tools.…
2006-05-30
Final Report fragment (Technical Plan and Results). Task 1: Initiate the Project Management System. Two senior NGSS production management… The system is hosted on a handheld unit, which provides the foremen with an efficient daily planning tool. The Pilot System, which entails…
USDA-ARS?s Scientific Manuscript database
A rapid method for extracting eriophyoid mites was adapted from previous studies to provide growers and IPM consultants with a practical, efficient, and reliable tool to monitor for rust mites in vineyards. The rinse in bag (RIB) method allows quick extraction of mites from collected plant parts (sh...
Applying a Framework to Evaluate Assignment Marking Software: A Case Study on Lightwork
ERIC Educational Resources Information Center
Heinrich, Eva; Milne, John
2012-01-01
This article presents the findings of a qualitative evaluation on the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, combines with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation a framework has…
ERIC Educational Resources Information Center
Riley, Olive L.; And Others
This manual is intended to help elementary teachers: (1) recognize and achieve the objectives of the art program; (2) organize the classroom (including the equipment, art materials, and tools) to provide an atmosphere conducive to congenial living, and efficient working and learning; (3) plan painting, drawing, and craft activities so that…
The Development of Word Processing and Its Implications for the Business Education Profession.
ERIC Educational Resources Information Center
Ober, B. Scot
In an attempt to deal with the paperwork explosion occurring in business offices, administrative management has developed the concept of word processing as a means of increasing office efficiency. Thus, the purpose of this study was to provide business educators with information on this new management tool and to identify those skills needed by…
BamTools: a C++ API and toolkit for analyzing and managing BAM files
Barnett, Derek W.; Garrison, Erik K.; Quinlan, Aaron R.; Strömberg, Michael P.; Marth, Gabor T.
2011-01-01
Motivation: Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. Results: We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. Availability: BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools. Contact: barnetde@bc.edu PMID:21493652
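The abstract describes BamTools' C++ API and command-line toolkit; purely as a sketch of the same read-iterate-filter pattern, and keeping to a single example language for this document, here is the analogous loop using the separate pysam library, which is not part of BamTools. The file name is a placeholder and the quality threshold is an arbitrary example.

```python
import pysam

# Open a compressed binary BAM file; until_eof=True avoids needing an index.
with pysam.AlignmentFile("example.bam", "rb") as bam:
    high_quality = 0
    for read in bam.fetch(until_eof=True):
        if not read.is_unmapped and read.mapping_quality >= 30:  # simple filter
            high_quality += 1
print(high_quality)
```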
Robonaut Mobile Autonomy: Initial Experiments
NASA Technical Reports Server (NTRS)
Diftler, M. A.; Ambrose, R. O.; Goza, S. M.; Tyree, K. S.; Huber, E. L.
2006-01-01
A mobile version of the NASA/DARPA Robonaut humanoid recently completed initial autonomy trials working directly with humans in cluttered environments. This compact robot combines the upper body of the Robonaut system with a Segway Robotic Mobility Platform yielding a dexterous, maneuverable humanoid ideal for interacting with human co-workers in a range of environments. This system uses stereovision to locate human teammates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form complex behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.
The Role of the Clinical Laboratory in the Future of Health Care: Lean Microbiology
Samuel, Linoj
2014-01-01
This commentary will introduce lean concepts into the clinical microbiology laboratory. The practice of lean in the clinical microbiology laboratory can remove waste, increase efficiency, and reduce costs. Lean, Six Sigma, and other such management initiatives are useful tools and can provide dividends but must be accompanied by organizational leadership commitment to sustaining the lean culture in the laboratory setting and providing resources and time to work through the process. PMID:24574289
Khanna, Vishesh; Sambandam, Senthil N; Gul, Arif; Mounasamy, Varatharaj
2015-07-01
Smartphones have emerged as essential tools providing assistance in patient care, monitoring, rehabilitation, communication, diagnosis, teaching, research and reference. Among innumerable communication apps, WhatsApp has been widely popular and cost-effective. The aim of our study was to report the impact of introducing the smartphone app WhatsApp as an intradepartmental communication tool on (1) awareness of patient-related information, (2) efficiency of the handover process and (3) duration of traditional morning handovers among orthopedic residents in a 300-bed tertiary care teaching center. The written handovers and paging used for communication at our center led to occasional inefficiencies among residents. Widespread use, low cost, availability and double password protection (phone lock and WhatsApp lock) made WhatsApp's group conversation feature an ideal tool for intradepartmental patient-related communication. Twenty-five consecutive admissions before and after WhatsApp (BW, AW) were included in the study. Eight orthopedic residents attempted fifty randomly arranged questions based on the twenty-five patients in each study period. A null hypothesis was assumed that the introduction of the WhatsApp group would neither increase awareness of patient-related information nor improve the efficiency of handovers among residents. A significant improvement observed in the scores obtained by residents in the AW group led to rejection of the null hypothesis. The residents also reported swifter and more efficient handovers after the introduction of WhatsApp. Our results indicate that the introduction of a smartphone app such as WhatsApp as an intradepartmental communication tool can improve patient-related awareness, communication and handovers among orthopedic residents.
An efficient visualization method for analyzing biometric data
NASA Astrophysics Data System (ADS)
Rahmes, Mark; McGonagle, Mike; Yates, J. Harlan; Henning, Ronda; Hackett, Jay
2013-05-01
We introduce a novel application for biometric data analysis. This technology can be used as part of a unique and systematic approach designed to augment existing processing chains. Our system provides image quality control and analysis capabilities. We show how analysis and efficient visualization are used as part of an automated process. The goal of this system is to provide a unified platform for the analysis of biometric images that reduces manual effort and increases the likelihood of a match being brought to an examiner's attention from either a manual or lights-out application. We discuss the functionality of FeatureSCOPE™, which provides an efficient tool for feature analysis and quality control of extracted biometric features. Biometric databases must be checked for accuracy across a large volume of data attributes. Our solution accelerates the review of features by up to a factor of 100. Qualitative results and cost reduction are demonstrated by using efficient parallel visual review for quality control. Our process automatically sorts and filters features for examination and packs these into a condensed view. An analyst can then rapidly page through screens of features and flag and annotate outliers as necessary.
VariantSpark: population scale clustering of genotype information.
O'Brien, Aidan R; Saunders, Neil F W; Guo, Yi; Buske, Fabian A; Scott, Rodney J; Bauer, Denis C
2015-12-10
Genomic information is increasingly used in medical practice, giving rise to the need for efficient analysis methodology able to cope with thousands of individuals and millions of variants. The widely used Hadoop MapReduce architecture and its associated machine learning library, Mahout, provide the means for tackling computationally challenging tasks. However, many genomic analyses do not fit the Map-Reduce paradigm. We therefore utilise the recently developed SPARK engine, along with its associated machine learning library, MLlib, which offers more flexibility in the parallelisation of population-scale bioinformatics tasks. The resulting tool, VARIANTSPARK, provides an interface from MLlib to the standard variant format (VCF), offers seamless genome-wide sampling of variants and provides a pipeline for visualising results. To demonstrate the capabilities of VARIANTSPARK, we clustered more than 3,000 individuals with 80 million variants each to determine the population structure in the dataset. VARIANTSPARK is 80% faster than the Spark-based genome clustering approach ADAM, the comparable implementation using Hadoop/Mahout, and ADMIXTURE, a commonly used tool for determining individual ancestries. It is over 90% faster than traditional implementations using R and Python. These benefits of speed, resource consumption and scalability enable VARIANTSPARK to open up the usage of advanced, efficient machine learning algorithms to genomic data.
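A sketch of population clustering in Spark's MLlib, the engine VariantSpark builds on: k-means over a per-individual variant matrix. The tiny in-memory matrix stands in for genotypes parsed from a VCF; VariantSpark's own VCF interface is not reproduced here, and this requires a local pyspark installation.

```python
from pyspark.sql import SparkSession
from pyspark.ml.clustering import KMeans
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.appName("genotype-kmeans").getOrCreate()

rows = [  # one row per individual: allele counts (0/1/2) at a few variant sites
    (Vectors.dense([0, 0, 1, 2, 0]),),
    (Vectors.dense([0, 1, 1, 2, 0]),),
    (Vectors.dense([2, 2, 0, 0, 1]),),
    (Vectors.dense([2, 1, 0, 0, 1]),),
]
df = spark.createDataFrame(rows, ["features"])

model = KMeans(k=2, seed=1).fit(df)   # cluster individuals into 2 populations
print(model.clusterCenters())         # inferred population centroids
spark.stop()
```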
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
2000-01-01
The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with the FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a comprehensive set of advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for the Dallas-Fort Worth airport, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and climb to cruise altitude along the most efficient routes.
FITSManager: Management of Personal Astronomical Data
NASA Astrophysics Data System (ADS)
Cui, Chenzhou; Fan, Dongwei; Zhao, Yongheng; Kembhavi, Ajit; He, Boliang; Cao, Zihuang; Li, Jian; Nandrekar, Deoyani
2011-07-01
With the increase of personal storage capacity, it is easy to find hundreds to thousands of FITS files on the personal computer of an astrophysicist. Because the Flexible Image Transport System (FITS) is a professional data format initiated by astronomers and used mainly within that small community, data management toolkits for FITS files are very few. Astronomers need a powerful tool to help them manage their local astronomical data. Although the Virtual Observatory (VO) is a network-oriented astronomical research environment, its applications and related technologies provide useful solutions for enhancing the management and utilization of astronomical data hosted on an astronomer's personal computer. FITSManager is such a tool, providing astronomers with efficient management and utilization of their local data and bringing the VO to astronomers in a seamless and transparent way. FITSManager provides a rich set of functions for FITS file management, such as thumbnails, previews, type-dependent icons, header keyword indexing and search, and collaborative working with other tools and online services. The development of FITSManager is an effort to fill the gap between the management and analysis of astronomical data.
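Header keyword indexing of the kind FITSManager performs is straightforward with astropy: walk a folder of FITS files and record a few header keywords per file. The folder path and the chosen keywords below are illustrative assumptions, not FITSManager's implementation.

```python
from pathlib import Path
from astropy.io import fits

def index_fits(folder, keywords=("OBJECT", "DATE-OBS", "TELESCOP")):
    """Build a small {filename: {keyword: value}} index over a folder."""
    index = {}
    for path in Path(folder).glob("*.fits"):
        with fits.open(path) as hdul:
            header = hdul[0].header           # primary HDU header
            index[path.name] = {k: header.get(k) for k in keywords}
    return index

print(index_fits("./data"))  # e.g. {'m31.fits': {'OBJECT': 'M31', ...}, ...}
```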
Pharmacological Tools to Study the Role of Astrocytes in Neural Network Functions.
Peña-Ortega, Fernando; Rivera-Angulo, Ana Julia; Lorea-Hernández, Jonathan Julio
2016-01-01
Although astrocytes and microglia do not communicate by electrical impulses, they can efficiently communicate with each other and with neurons to participate in complex neural functions requiring broad cell communication and long-lasting regulation of brain function. Glial cells express many receptors in common with neurons and secrete gliotransmitters as well as neurotrophic and neuroinflammatory factors, which allow them to modulate synaptic transmission and neural excitability. All these properties allow glial cells to influence the activity of neuronal networks. Thus, the incorporation of glial cell function into the understanding of nervous system dynamics will provide a more accurate view of brain function. Our current knowledge of glial cell biology is providing us with experimental tools to explore their participation in neural network modulation. In this chapter, we review some of the classical, as well as some recent, pharmacological tools developed for the study of astrocytes' influence on neural function. We also provide some examples of the use of these pharmacological agents to understand the role of astrocytes in neural network function and dysfunction.
The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.
Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A
2010-03-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).
Enhancing population pharmacokinetic modeling efficiency and quality using an integrated workflow.
Schmidt, Henning; Radivojevic, Andrijana
2014-08-01
Population pharmacokinetic (popPK) analyses are at the core of pharmacometrics and need to be performed regularly. Although these analyses are relatively standard, large variability can be observed both in the time they take (efficiency) and in the way they are performed (quality). The main reasons for this variability include a modeler's level of experience, personal preferences and tools. This paper examines how the process of popPK model building can be supported in order to increase its efficiency and quality. The presented approach to the conduct of popPK analyses is centered around three key components: (1) identification of the most common and important popPK model features, (2) the required information content and formatting of the data for modeling, and (3) methodology, workflow and workflow-supporting tools. This approach has been used in several popPK modeling projects, and a documented example is provided in the supplementary material. Efficiency of model building is improved by avoiding repetitive coding and other labor-intensive tasks and by putting the emphasis on a fit-for-purpose model. Quality is improved by ensuring that the workflow and tools are in alignment with a popPK modeling guidance established within the organization. The main conclusion of this paper is that workflow-based approaches to popPK modeling are feasible and have significant potential to ameliorate its various aspects. However, the implementation of such an approach in a pharmacometric organization requires openness towards innovation and change, the key ingredient for the evolution of integrative and quantitative drug development in the pharmaceutical industry.
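As a sketch of the kind of structural model such workflows standardize, here is a one-compartment oral-absorption PK model evaluated analytically. The parameter values are illustrative, not from any analysis in the paper, and real popPK work would add between-subject variability and residual error on top of this structural model.

```python
import numpy as np

def conc_one_cmt_oral(t, dose, ka, cl, v, f=1.0):
    """C(t) = F*D*ka / (V*(ka-ke)) * (exp(-ke*t) - exp(-ka*t)), with ke = CL/V."""
    ke = cl / v
    return f * dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 24.0, 25)  # hours post-dose
c = conc_one_cmt_oral(t, dose=100.0, ka=1.0, cl=5.0, v=50.0)
print(c.max(), t[c.argmax()])   # Cmax and Tmax for the assumed parameters
```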
Performance evaluation of nonhomogeneous hospitals: the case of Hong Kong hospitals.
Li, Yongjun; Lei, Xiyang; Morton, Alec
2018-02-14
Throughout the world, hospitals are under increasing pressure to become more efficient. Efficiency analysis tools can play a role in giving policymakers insight into which units are less efficient and why. Many researchers have studied the efficiencies of hospitals using data envelopment analysis (DEA) as an efficiency analysis tool. However, in the existing literature on DEA-based performance evaluation, a standard assumption of the constant returns to scale (CRS) and variable returns to scale (VRS) DEA models is that decision-making units (DMUs) use a similar mix of inputs to produce a similar set of outputs. In fact, hospitals with different primary goals supply different services and provide different outputs; that is, hospitals are nonhomogeneous, and the standard assumption of the DEA model is not applicable to their performance evaluation. This paper considers the nonhomogeneity among hospitals in performance evaluation and takes hospitals in Hong Kong as a case study. An extension of Cook et al. (2013) [1] based on the VRS assumption is developed to evaluate nonhomogeneous hospitals' efficiencies, since the inputs of hospitals vary greatly. Following the philosophy of Cook et al. (2013) [1], hospitals are divided into homogeneous groups and the production process of each hospital is divided into subunits. The performance of hospitals is measured on the basis of subunits. The proposed approach can be applied to measure the performance of other nonhomogeneous entities that exhibit variable returns to scale.
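For readers unfamiliar with the underlying machinery, an input-oriented VRS (BCC) DEA score for one unit is a small linear program. The sketch below solves it with scipy for toy single-input, single-output data; the numbers are invented, and the paper's subunit-based extension for nonhomogeneous groups is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[5.0, 8.0, 6.0, 9.0]])      # inputs (e.g. staff), 4 hospitals
Y = np.array([[40.0, 50.0, 45.0, 70.0]])  # outputs (e.g. discharges)

def bcc_efficiency(j0):
    """Input-oriented BCC score of hospital j0: minimize theta over [theta, lambda]."""
    n = X.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0  # objective: minimize theta
    # Inputs:  X @ lam - theta * x_j0 <= 0 ; Outputs: -Y @ lam <= -y_j0
    A_ub = np.vstack([np.hstack([-X[:, [j0]], X]),
                      np.hstack([np.zeros((Y.shape[0], 1)), -Y])])
    b_ub = np.concatenate([np.zeros(X.shape[0]), -Y[:, j0]])
    A_eq = np.hstack([[0.0], np.ones(n)])[None, :]  # VRS convexity: sum(lambda) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.fun  # efficiency score in (0, 1]

print([round(bcc_efficiency(j), 3) for j in range(4)])
```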
Freiburg RNA tools: a central online resource for RNA-focused research and teaching.
Raden, Martin; Ali, Syed M; Alkhnbashi, Omer S; Busch, Anke; Costa, Fabrizio; Davis, Jason A; Eggenhofer, Florian; Gelhausen, Rick; Georg, Jens; Heyne, Steffen; Hiller, Michael; Kundu, Kousik; Kleinkauf, Robert; Lott, Steffen C; Mohamed, Mostafa M; Mattheis, Alexander; Miladi, Milad; Richter, Andreas S; Will, Sebastian; Wolff, Joachim; Wright, Patrick R; Backofen, Rolf
2018-05-21
The Freiburg RNA tools webserver is a well established online resource for RNA-focused research. It provides a unified user interface and comprehensive result visualization for efficient command line tools. The webserver includes RNA-RNA interaction prediction (IntaRNA, CopraRNA, metaMIR), sRNA homology search (GLASSgo), sequence-structure alignments (LocARNA, MARNA, CARNA, ExpaRNA), CRISPR repeat classification (CRISPRmap), sequence design (antaRNA, INFO-RNA, SECISDesign), structure aberration evaluation of point mutations (RaSE), RNA/protein-family model visualization (CMV), and other methods. Open education resources offer interactive visualizations of RNA structure and RNA-RNA interaction prediction as well as basic and advanced sequence alignment algorithms. The services are freely available at http://rna.informatik.uni-freiburg.de.
Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: SmaggIce 2D
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.
2001-01-01
Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary demarcation, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work in progress, and planned features of the software toolkit are presented here.
Improving Cognitive Abilities and e-Inclusion in Children with Cerebral Palsy
NASA Astrophysics Data System (ADS)
Martinengo, Chiara; Curatelli, Francesco
Besides overcoming the motor barriers to accessing computers and the Internet, ICT tools can provide a very useful, and often necessary, support for the cognitive development of motor-impaired children with cerebral palsy. In fact, software tools for computation and communication allow teachers to put into effect, in a more complete and efficient way, the learning methods and the educational plans studied for the child. In the present article, after a brief analysis of the general objectives to be pursued to support learning in children with cerebral palsy, we consider some specific difficulties in the logical-linguistic and logical-mathematical fields, and we show how they can be overcome using general ICT tools and specifically implemented software programs.
Intelligent Processing Equipment Within the Environmental Protection Agency
NASA Technical Reports Server (NTRS)
Greathouse, Daniel G.; Nalesnik, Richard P.
1992-01-01
Protection of the environment and environmental remediation requires the cooperation, at all levels, of government and industry. Intelligent processing equipment, in addition to other artificial intelligence based tools, was used by the Environmental Protection Agency to provide personnel safety and improve the efficiency of those responsible for protection and remediation of the environment. These exploratory efforts demonstrate the feasibility and utility of expanding development and widespread use of these tools. A survey of current intelligent processing equipment applications in the Agency is presented and is followed by a brief discussion of possible uses in the future.
Quality indexing with computer-aided lexicography
NASA Technical Reports Server (NTRS)
Buchan, Ronald L.
1992-01-01
Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented with detailed illustrations from NASA activity. Applications from techniques mentioned, such as Retrospective Indexing (RI), can be made to many indexing systems. In addition to improving the quality of indexing with computers, the improved efficiency with which certain tasks can be done is demonstrated.
High accurate interpolation of NURBS tool path for CNC machine tools
NASA Astrophysics Data System (ADS)
Liu, Qiang; Liu, Huan; Yuan, Songmei
2016-09-01
Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but few methods can at present efficiently eliminate or reduce it to a satisfactory level without sacrificing computing efficiency. In order to solve this problem, a highly accurate interpolation method for NURBS tool paths is proposed. The proposed method efficiently reduces the feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be solved by analytic methods in real time. Theoretically, the proposed method can totally eliminate the feedrate fluctuation for any 2nd degree NURBS curves and can interpolate 3rd degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion while considering multiple constraints and scheduling errors by an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfactory computing efficiency.
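The abstract does not reproduce the quartic itself; one natural construction, assumed here for illustration, equates the squared chord length of a second-order Taylor step along the curve to the squared feed step per interpolation period. A minimal sketch under that assumption, where the curve derivatives c1, c2 and the feed step delta_l are supplied by the interpolator:

```python
import numpy as np

def parameter_increment(c1, c2, delta_l):
    """Smallest positive curve-parameter increment du satisfying
    |C'(u) du + 0.5 C''(u) du^2|^2 = delta_l^2, which is a quartic in du."""
    a4 = 0.25 * np.dot(c2, c2)
    a3 = np.dot(c1, c2)
    a2 = np.dot(c1, c1)
    # quartic: a4 du^4 + a3 du^3 + a2 du^2 + 0 du - delta_l^2 = 0
    roots = np.roots([a4, a3, a2, 0.0, -delta_l ** 2])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return real[real > 0].min()

# toy usage with hand-picked derivative vectors and a 10 um feed step
du = parameter_increment(np.array([1.0, 0.5]), np.array([0.0, -0.2]), 0.01)
print(du)
```

Because the polynomial is negative at du = 0 and grows without bound, a positive real root always exists; picking the smallest keeps the step consistent with the commanded feedrate.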
Efficient 10 kW diode-pumped Nd:YAG rod laser
NASA Astrophysics Data System (ADS)
Akiyama, Yasuhiro; Takada, Hiroyuki; Sasaki, Mitsuo; Yuasa, Hiroshi; Nishida, Naoto
2003-03-01
As a tool for high speed and high precision material processing such as cutting and welding, we developed a rod-type all-solid-state laser with an average power of more than 10 kW, an electrical-optical efficiency of more than 20%, and a laser head volume of less than 0.05 m3. We developed a highly efficient diode-pumped module, and successfully obtained electrical-optical efficiencies of 22% in CW operation and 26% in QCW operation at multi-kW output powers. We also succeeded in reducing the laser head volume, obtaining an output power of 12 kW with an efficiency of 23% and a laser head volume of 0.045 m3. We transferred the technology to SHIBAURA mechatronics corp., which began providing the LD-pumped Nd:YAG laser system with output power up to 4.5 kW. We are continuing development of higher-power laser equipment.
Pydna: a simulation and documentation tool for DNA assembly strategies using python.
Pereira, Filipa; Azevedo, Flávio; Carvalho, Ângela; Ribeiro, Gabriela F; Budde, Mark W; Johansson, Björn
2015-05-02
Recent advances in synthetic biology have provided tools to efficiently construct complex DNA molecules which are an important part of many molecular biology and biotechnology projects. The planning of such constructs has traditionally been done manually using a DNA sequence editor which becomes error-prone as scale and complexity of the construction increase. A human-readable formal description of cloning and assembly strategies, which also allows for automatic computer simulation and verification, would therefore be a valuable tool. We have developed pydna, an extensible, free and open source Python library for simulating basic molecular biology DNA unit operations such as restriction digestion, ligation, PCR, primer design, Gibson assembly and homologous recombination. A cloning strategy expressed as a pydna script provides a description that is complete, unambiguous and stable. Execution of the script automatically yields the sequence of the final molecule(s) and that of any intermediate constructs. Pydna has been designed to be understandable for biologists with limited programming skills by providing interfaces that are semantically similar to the description of molecular biology unit operations found in literature. Pydna simplifies both the planning and sharing of cloning strategies and is especially useful for complex or combinatorial DNA molecule construction. An important difference compared to existing tools with similar goals is the use of Python instead of a specifically constructed language, providing a simulation environment that is more flexible and extensible by the user.
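Pydna's own API is not reproduced here; as a plain-Python stand-in, the sketch below simulates one of the unit operations the abstract lists (restriction digestion of a linear sequence) to show how a scripted description deterministically yields the intermediate fragments. The EcoRI site and cut offset are standard.

```python
def digest(sequence, site, cut_offset):
    """Toy simulation of one cloning unit operation: a restriction digest of
    a linear top strand, cutting cut_offset bases into each recognition site."""
    fragments, start = [], 0
    pos = sequence.find(site)
    while pos != -1:
        fragments.append(sequence[start:pos + cut_offset])
        start = pos + cut_offset
        pos = sequence.find(site, pos + 1)
    fragments.append(sequence[start:])
    return fragments

# EcoRI recognises GAATTC and cuts the top strand after the G (offset 1)
print(digest("AAGAATTCGGGAATTCTT", "GAATTC", 1))   # ['AAG', 'AATTCGGG', 'AATTCTT']
```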
PathCase-SB architecture and database design
2011-01-01
Background Integrating metabolic pathway resources and regulatory metabolic network models, and deploying new tools on the integrated platform, can help researchers perform more effective and more efficient systems biology research on the regulation of metabolic networks. Therefore, the tasks of (a) integrating regulatory metabolic networks and existing models under a single database environment, and (b) building tools to help with modeling and analysis are desirable and intellectually challenging computational tasks. Description PathCase Systems Biology (PathCase-SB) has been built and released. The PathCase-SB database provides data and an API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate data from selected biological data sources on the web (currently, the BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes the architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions The PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889
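The paper states that the database exposes an API for user interfaces and tools but the record does not document its URLs; the sketch below only illustrates the access pattern, and the endpoint and parameter names are invented placeholders.

```python
import requests

# Placeholder base URL and endpoint names: NOT the real PathCase-SB API.
BASE = "http://example.org/pathcase-sb/api"

resp = requests.get(f"{BASE}/models", params={"source": "BioModels"}, timeout=30)
resp.raise_for_status()
for model in resp.json():   # e.g. one entry per integrated kinetic model
    print(model)
```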
Progress Toward an Efficient and General CFD Tool for Propulsion Design/Analysis
NASA Technical Reports Server (NTRS)
Cox, C. F.; Cinnella, P.; Westmoreland, S.
1996-01-01
The simulation of propulsive flows inherently involves chemical activity. Recent years have seen substantial strides made in the development of numerical schemes for reacting flowfields, in particular those involving finite-rate chemistry. However, finite-rate calculations are computationally intensive and require knowledge of the actual kinetics, which are not always known with sufficient accuracy. Alternatively, flow simulations based on the assumption of local chemical equilibrium are capable of obtaining physically reasonable results at far less computational cost. The present study summarizes the development of efficient numerical techniques for the simulation of flows in local chemical equilibrium, whereby a 'black box' chemical equilibrium solver is coupled to the usual gasdynamic equations. The generality of the methods enables the modelling of any arbitrary mixture of thermally perfect gases, including air, combustion mixtures and plasmas. As a demonstration of the potential of these methodologies, several solutions, involving reacting and perfect gas flows, will be presented, including a preliminary simulation of the SSME startup transient. Future enhancements to the proposed techniques will be discussed, including more efficient finite-rate and hybrid (partial equilibrium) schemes. The algorithms that have been developed and are being optimized provide an efficient and general tool for the design and analysis of propulsion systems.
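As a schematic of the coupling described (not the authors' implementation), the sketch below recovers pressure from the conserved gasdynamic variables by delegating the thermodynamic closure to a "black box" routine; a calorically perfect gas with placeholder properties stands in for the real equilibrium-chemistry solve.

```python
import numpy as np

def equilibrium_state(rho, e_int):
    """'Black box' stand-in: given density and specific internal energy,
    return pressure and temperature of the mixture. A calorically perfect
    gas replaces the real equilibrium-chemistry solve in this sketch."""
    gamma, cv = 1.4, 717.0                   # placeholder gas properties
    T = e_int / cv
    p = (gamma - 1.0) * rho * e_int
    return p, T

def pressures(U):
    """U rows: conserved variables [rho, rho*u, rho*E] per cell; the closure
    (pressure) is delegated cell by cell to the equilibrium routine."""
    rho, mom, ener = U[:, 0], U[:, 1], U[:, 2]
    e_int = ener / rho - 0.5 * (mom / rho) ** 2
    return np.array([equilibrium_state(r, e)[0] for r, e in zip(rho, e_int)])

U = np.array([[1.2, 120.0, 3.0e5], [1.0, 50.0, 2.5e5]])
print(pressures(U))
```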
Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Mcmanus, John William
1992-01-01
Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
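For orientation, a minimal sketch of the blackboard pattern the thesis formalizes: independent knowledge sources post partial results to a shared blackboard under a control shell. Class and method names are illustrative, not taken from the Formal Model.

```python
class Blackboard:
    """Shared store that all knowledge sources read from and write to."""
    def __init__(self):
        self.data = {}

class KnowledgeSource:
    def can_contribute(self, bb):   # precondition test
        raise NotImplementedError
    def contribute(self, bb):       # posts a partial result to the blackboard
        raise NotImplementedError

class Doubler(KnowledgeSource):
    """Toy knowledge source: contributes 'y' once 'x' is on the blackboard."""
    def can_contribute(self, bb):
        return "x" in bb.data and "y" not in bb.data
    def contribute(self, bb):
        bb.data["y"] = 2 * bb.data["x"]

def control_loop(bb, sources, max_cycles=100):
    """Sequential control shell: fire any source whose precondition holds
    until quiescence; a concurrent design schedules sources independently."""
    for _ in range(max_cycles):
        fired = [ks for ks in sources if ks.can_contribute(bb)]
        if not fired:
            break
        for ks in fired:
            ks.contribute(bb)

bb = Blackboard()
bb.data["x"] = 21
control_loop(bb, [Doubler()])
print(bb.data)   # {'x': 21, 'y': 42}
```

The performance point in the abstract corresponds to how independent the `can_contribute` predicates are: the less the sources contend for the same blackboard entries, the more of the loop can run concurrently.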
CTAS: Computer intelligence for air traffic control in the terminal area
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1992-01-01
A system for the automated management and control of arrival traffic, referred to as the Center-TRACON Automation System (CTAS), has been designed by the ATC research group at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver and Dallas/Ft. Worth airports. CTAS consists of three types of integrated tools that provide computer-generated intelligence for both Center and TRACON controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), establishes optimized landing sequences and landing times for aircraft arriving in the Center airspace several hundred miles from the airport. In the TRACON, TMA resequences missed-approach aircraft and unanticipated arrivals. Another tool, the Descent Advisor (DA), generates clearances for the Center controllers that deliver aircraft to the TRACON boundary at the crossing times provided by TMA. In the TRACON, the final approach spacing tool (FAST) provides heading and speed clearances that produce an accurately spaced flow of aircraft on the final approach course. A database consisting of aircraft performance models, airline-preferred operational procedures and real-time wind measurements contributes to the effective operation of CTAS. Extensive simulator evaluations of CTAS have demonstrated controller acceptance, delay reductions, and fuel savings.
A service-based BLAST command tool supported by cloud infrastructures.
Carrión, Abel; Blanquer, Ignacio; Hernández, Vicente
2012-01-01
Notwithstanding the benefits of distributed-computing infrastructures for empowering bioinformatics analysis tools with the needed computing and storage capability, the actual use of these infrastructures is still low. Learning curves and deployment difficulties have reduced their impact on the wider research community. This article presents a porting strategy for BLAST based on a multiplatform client and a service that provides the same interface as sequential BLAST, thus reducing the learning curve and minimizing the impact on integration into existing workflows. The porting has been done using the execution and data access components from the EC project Venus-C and the Windows Azure infrastructure provided in this project. The results obtained demonstrate a low overhead on the global execution framework and reasonable speed-up and cost-efficiency with respect to a sequential version.
NASA Astrophysics Data System (ADS)
Masseroli, Marco; Pinciroli, Francesco
2000-12-01
To provide easy retrieval, integration and evaluation of multimodal cardiology images and data in a web browser environment, distributed application technologies and Java programming were used to implement a client-server architecture based on software agents. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. The client side is a Java applet running in a web browser that provides a friendly medical user interface to perform queries on patient and medical test data and to integrate and properly visualize the various query results. A set of tools based on the Java Advanced Imaging API enables users to process and analyze the retrieved cardiology images and quantify their features in different regions of interest. The platform independence of Java technology makes the developed prototype easy to manage centrally and to deploy at any site with an intranet or Internet connection. By giving healthcare providers effective tools for comprehensively querying, visualizing and evaluating cardiology images and records in all the locations where they may need them (i.e., emergency rooms, operating theaters, wards, or even outpatient clinics), the developed prototype represents an important aid in providing more efficient diagnoses and medical treatments.
SaaS Platform for Time Series Data Handling
NASA Astrophysics Data System (ADS)
Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail
2018-02-01
The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.
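The analysis methods listed map directly onto standard numerical building blocks. A small sketch on synthetic two-channel data, with numpy and scikit-learn standing in (this is not MathBrain's code):

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# synthetic two-channel recording: a sinusoid and a square wave, mixed
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
sources = np.vstack([np.sin(2 * np.pi * 10 * t),
                     np.sign(np.sin(2 * np.pi * 3 * t))])
mixing = np.array([[1.0, 0.5], [0.4, 1.0]])
mixed = mixing @ sources + 0.05 * rng.standard_normal((2, t.size))

spectrum = np.abs(np.fft.rfft(mixed, axis=1))          # direct Fourier transform
pcs = PCA(n_components=2).fit_transform(mixed.T)       # principal components
ics = FastICA(n_components=2, random_state=0).fit_transform(mixed.T)  # independent components
print(spectrum.shape, pcs.shape, ics.shape)
```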
[The Internet and shared decision-making between patients and healthcare providers].
Silber, Denise
2009-10-01
Insurance companies like Kaiser Permanente in the United States remunerate physicians for their email correspondence with patients, increasing the efficiency of office visits. A survey by the French National Board of Physicians regarding the computerization of medical practices, conducted in April 2009, confirms that both physicians and patients in France are very favorable to the development of these tools. When patients can manage and/or access their medical files and determine which providers can access them, they become true partners.
Jian, Weiyan; Huang, Yinmin; Hu, Mu; Zhang, Xiumei
2009-04-30
The medical performance evaluation, which provides a basis for rational decision-making, is an important part of medical service research. Current progress with health services reform in China is far from satisfactory, without sufficient regulation. To achieve better progress, an effective tool for evaluating medical performance needs to be established. In view of this, this study attempted to develop such a tool appropriate for the Chinese context. Data were collected from the front pages of medical records (FPMR) of all large general public hospitals (21 hospitals) in the third and fourth quarters of 2007. Locally developed Diagnosis Related Groups (DRGs) were introduced as a tool for risk adjustment, and performance evaluation indicators were established: the Charge Efficiency Index (CEI), the Time Efficiency Index (TEI) and inpatient mortality of low-risk group cases (IMLRG), reflecting work efficiency and medical service quality respectively. Using these indicators, the inpatient services' performance was horizontally compared among hospitals. The Case-Mix Index (CMI) was used to adjust the efficiency indices and produce the adjusted CEI (aCEI) and adjusted TEI (aTEI). Poisson distribution analysis was used to test the statistical significance of the IMLRG differences between hospitals. Using the aCEI, aTEI and IMLRG scores for the 21 hospitals, Hospitals A and C had relatively good overall performance because their medical charges were lower, length of stay (LOS) shorter and IMLRG smaller. The performance of Hospitals P and Q was the worst due to their relatively high charge level, long LOS and high IMLRG. Various performance problems also existed in the other hospitals. It is possible to develop an accurate and easy-to-run performance evaluation system using Case-Mix as the tool for risk adjustment, choosing indicators close to consumers and managers, and utilizing routine report forms as the basic information source. To keep such a system running effectively, it is necessary to improve the reliability of clinical information and the risk-adjustment ability of Case-Mix.
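The abstract names the indices without giving formulas; assuming the usual ratio-style definitions (observed totals divided by DRG-expected totals, so values below 1 indicate better-than-reference efficiency, with the DRG expectation supplying the case-mix adjustment), a minimal sketch:

```python
def efficiency_indices(cases, expected_charge, expected_los):
    """cases: list of (drg, charge, los); expected_*: DRG -> reference value.
    CEI and TEI computed as observed/expected ratios (assumed definitions)."""
    cei = sum(c for _, c, _ in cases) / sum(expected_charge[d] for d, _, _ in cases)
    tei = sum(l for _, _, l in cases) / sum(expected_los[d] for d, _, _ in cases)
    return cei, tei

cases = [("DRG1", 9000, 4), ("DRG2", 15000, 7)]
print(efficiency_indices(cases,
                         {"DRG1": 10000, "DRG2": 14000},   # reference charges
                         {"DRG1": 5, "DRG2": 6}))          # reference LOS
```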
Automated visual imaging interface for the plant floor
NASA Astrophysics Data System (ADS)
Wutke, John R.
1991-03-01
The paper provides an overview of the challenges facing a user of automated visual imaging ("AVI") machines and the philosophies that should be employed in designing them. As manufacturing tools and equipment become more sophisticated, it is increasingly difficult to maintain an efficient interaction between the operator and machine. The typical user of an AVI machine in a production environment is technically unsophisticated. Operator and machine ergonomics are also often a neglected or poorly addressed part of an efficient manufacturing process. This paper presents a number of man-machine interface design techniques and philosophies that effectively solve these problems.
Efficient propagation of the hierarchical equations of motion using the matrix product state method
NASA Astrophysics Data System (ADS)
Shi, Qiang; Xu, Yang; Yan, Yaming; Xu, Meng
2018-05-01
We apply the matrix product state (MPS) method to propagate the hierarchical equations of motion (HEOM). It is shown that the MPS approximation works well in different types of problems, including boson and fermion baths. The MPS method based on the time-dependent variational principle is also found to be applicable to HEOM with over one thousand effective modes. Combining the flexibility of the HEOM in defining the effective modes and the efficiency of the MPS method may thus provide a promising tool for simulating quantum dynamics in condensed phases.
Forsberg, Daniel; Gupta, Amit; Mills, Christopher; MacAdam, Brett; Rosipko, Beverly; Bangert, Barbara A; Coffey, Michael D; Kosmas, Christos; Sunshine, Jeffrey L
2017-03-01
The purpose of this study was to investigate how the use of multi-modal rigid image registration integrated within a standard picture archiving and communication system affects the efficiency of a radiologist performing routine interpretations of cases that include prior examinations. Six radiologists were recruited to read a set of cases (either 16 neuroradiology or 14 musculoskeletal cases) during two crossover reading sessions. Each radiologist read each case twice, once with synchronized navigation, which enables spatial synchronization across examinations from different study dates, and once without. Efficiency was evaluated based upon time to read a case and amount of scrolling while browsing a case, using the Wilcoxon signed-rank test. Significant improvements in efficiency were found when considering all radiologists simultaneously, the two sections separately, and the majority of individual radiologists, for both time to read and amount of scrolling. The relative improvement for each individual radiologist ranged from 4 to 32% for time to read and from 14 to 38% for amount of scrolling. Image registration providing synchronized navigation across examinations from different study dates is a tool that enables radiologists to work more efficiently while reading cases with one or more prior examinations.
NASA Astrophysics Data System (ADS)
Kyriacou, S; Kontoleontos, E; Weissenberger, S; Mangani, L; Casartelli, E; Skouteropoulou, I; Gattringer, M; Gehrer, A; Buchmayr, M
2014-03-01
An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (the EASY software), a fast solver (block-coupled CFD) and a flexible geometry generation tool. The EASY optimization software is a PCA-driven metamodel-assisted evolutionary algorithm (MAEA(PCA)) that can be used in both single-objective (SOO) and multi-objective optimization (MOO) problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously; besides being robust and fast, it yields a large saving in computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on B-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
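A compact sketch of the two ideas named, PCA-driven variation and surrogate pre-screening, with a cheap analytic function standing in for the CFD evaluation; scikit-learn's PCA and a k-nearest-neighbour regressor play the role of the metamodel, and all settings are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsRegressor

def expensive_eval(x):
    """Stand-in for the costly CFD evaluation of one candidate design."""
    return float(np.sum((x - 0.3) ** 2))

rng = np.random.default_rng(1)
pop = rng.random((20, 5))                       # 20 designs, 5 design variables
scores = np.array([expensive_eval(x) for x in pop])

for generation in range(10):
    # PCA captures dependences among design variables; offspring are
    # perturbed in the rotated basis (the PCA-driven part of the MAEA)
    pca = PCA().fit(pop)
    z = pca.transform(pop)
    children = np.clip(pca.inverse_transform(z + 0.1 * rng.standard_normal(z.shape)), 0.0, 1.0)
    # a low-cost surrogate screens out non-promising offspring so only a few
    # reach the expensive, problem-specific evaluation
    surrogate = KNeighborsRegressor(n_neighbors=3).fit(pop, scores)
    promising = children[np.argsort(surrogate.predict(children))[:5]]
    pop = np.vstack([pop, promising])
    scores = np.concatenate([scores, [expensive_eval(x) for x in promising]])
    order = np.argsort(scores)[:20]             # elitist survivor selection
    pop, scores = pop[order], scores[order]

print(scores.min(), pop[0])
```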
Pointo - a Low Cost Solution to Point Cloud Processing
NASA Astrophysics Data System (ADS)
Houshiar, H.; Winkler, S.
2017-11-01
With advances in technology, access to data, and especially to 3D point cloud data, is becoming an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as large packages containing a variety of methods and tools. The result is software that is expensive to acquire and difficult to use, the difficulty being caused by the complicated user interfaces required to accommodate a long list of features. These complex packages aim to provide a powerful tool for a specific group of specialists, but such power is not required by the majority of upcoming average users of point clouds. In addition to their complexity and high cost, they generally rely on expensive, modern hardware and are compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts and are not willing to bear the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of the software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. Our simple, user-oriented design improves the user experience and enables us to optimize our methods to create efficient software. In this paper we introduce the Pointo family as a series of connected programs that provide easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.
Aerospace Power Systems Design and Analysis (APSDA) Tool
NASA Technical Reports Server (NTRS)
Truong, Long V.
1998-01-01
The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PCs), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle. The tool has a user-friendly interface and operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.
SModelS v1.1 user manual: Improving simplified model constraints with efficiency maps
NASA Astrophysics Data System (ADS)
Ambrogi, Federico; Kraml, Sabine; Kulkarni, Suchita; Laa, Ursula; Lessa, Andre; Magerl, Veronika; Sonneveld, Jory; Traub, Michael; Waltenberger, Wolfgang
2018-06-01
SModelS is an automatized tool for the interpretation of simplified-model results from the LHC. It decomposes models of new physics obeying a Z2 symmetry into simplified-model components and compares these against a large database of experimental results. The first release of SModelS, v1.0, used only the cross section upper limit maps provided by the experimental collaborations. In this new release, v1.1, we extend the functionality of SModelS to efficiency maps. This increases the constraining power of the software, as efficiency maps allow contributions to the same signal region from different simplified models to be combined. Other new features of version 1.1 include likelihood and χ2 calculations, extended information on the topology coverage, an extended database of experimental results, as well as major speed upgrades for both the code and the database. We describe in detail the concepts and procedures used in SModelS v1.1, explaining in particular how upper limits and efficiency map results are dealt with in parallel. Detailed instructions for code usage are also provided.
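This is not SModelS's likelihood code, but a toy rendering of the statement that efficiency maps let contributions from different simplified-model topologies be combined in one signal region: the predicted signal is cross section times luminosity times efficiency, summed over contributing topologies, and compared to the observed and expected background counts via a simple Poisson test statistic.

```python
import numpy as np
from scipy.stats import poisson

def signal_region_test(sigma_pb, lumi_invpb, efficiencies, n_obs, n_bkg):
    """Predicted signal s = sigma * luminosity * efficiency, summed over all
    topologies feeding the same signal region; returns a Poisson
    log-likelihood-ratio test statistic against the saturated model."""
    s = sigma_pb * lumi_invpb * float(np.sum(efficiencies))
    lam = n_bkg + s
    lam_sat = max(n_obs, 1e-9)              # saturated (best-fit) model
    return 2.0 * (poisson.logpmf(n_obs, lam_sat) - poisson.logpmf(n_obs, lam))

# two topologies of one model contribute to the same region (toy numbers)
print(signal_region_test(0.01, 3.0e4, [0.02, 0.05], n_obs=12, n_bkg=9.5))
```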
Intracellular ROS mediates gas plasma-facilitated cellular transfection in 2D and 3D cultures
Xu, Dehui; Wang, Biqing; Xu, Yujing; Chen, Zeyu; Cui, Qinjie; Yang, Yanjie; Chen, Hailan; Kong, Michael G.
2016-01-01
This study reports the potential of cold atmospheric plasma (CAP) as a versatile tool for delivering oligonucleotides into mammalian cells. Compared to lipofection and electroporation methods, plasma transfection showed a better uptake efficiency and less cell death in the transfection of oligonucleotides. We demonstrated that the level of extracellular aqueous reactive oxygen species (ROS) produced by gas plasma is correlated with the uptake efficiency and that this is achieved through an increase of intracellular ROS levels and the resulting increase in cell membrane permeability. This finding was supported by the use of ROS scavengers, which reduced CAP-based uptake efficiency. In addition, we found that cold atmospheric plasma could transfer oligonucleotides such as siRNA and miRNA into cells even in 3D cultures, thus suggesting the potential for unique applications of CAP beyond those provided by standard transfection techniques. Together, our results suggest that cold plasma might provide an efficient technique for the delivery of siRNA and miRNA in 2D and 3D culture models. PMID:27296089
Quality drying of softwood lumber : guidebook - checklist
M. R. Milota; J. D. Danielson; R. S. Boone; D. W. Huber
The IMPROVE Lumber Drying Program is intended to increase awareness of the lumber drying system as a critical component in the manufacture of quality lumber. One objective of the program is to provide easy-to-use tools that a kiln operator can use to maintain an efficient kiln operation and therefore contribute to lumber drying quality. This report is one component of...
SARDA - Technologies for NextGen
2015-04-22
The Spot and Runway Departure Advisor, or SARDA, is NASA's contribution to improving the efficiency of airport surface operations. SARDA comprises software-based decision support tools for controllers in the FAA tower and in the airline ramp towers. It uses intelligent schedulers to provide surface management capabilities, including departure metering and advisories for individual aircraft movement at various locations on the airport surface.
USDA-ARS?s Scientific Manuscript database
Development of an analytical method for the simultaneous determination of multifarious skin whitening agents will provide an efficient tool to analyze skin whitening cosmetics. An HPLC-UV method was developed for quantitative analysis of six commonly used whitening agents, α-arbutin, β-arbutin, koji...
Educating the medical community through a teratology newsletter.
Guttmacher, A E; Allen, E F
1993-01-01
To educate a geographically and professionally diverse group of health care providers about teratology in an economical and efficient manner, we developed a locally written and distributed teratology newsletter. Response to the newsletter, from readers as well as from our staff and funding agencies, suggests that such a newsletter can be a valuable tool in educating medical communities about teratology. PMID:8434594
Quality drying of hardwood lumber : guidebook -- checklist
R. S. Boone; M. R. Milota; J. D. Danielson; D. W. Huber
The IMPROVE Lumber Drying Program is intended to increase awareness of the lumber drying system as a critical component in the manufacture of quality lumber. One objective of the program is to provide easy-to-use tools that a kiln operator can use to maintain an efficient kiln operation and therefore improve lumber drying quality. This report is one component of the...
Quality drying in a hardwood lumber predryer : guidebook--checklist
E. M. Wengert; R. S. Boone
The IMPROVE Lumber Drying Program is intended to increase awareness of the lumber drying system as a critical component in the manufacture of quality lumber. One objective of the program is to provide easy-to-use tools that a kiln/predryer operator can use to maintain an efficient drying operation and therefore improve lumber drying quality. This report is one...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radousky, H B
This month's issue has the following articles: (1) Innovative Solutions Reap Rewards--Commentary by George H. Miller; (2) Surveillance on the Fly--An airborne surveillance system can track up to 8,000 moving objects in an area the size of a small city; (3) A Detector Radioactive Particles Can't Evade--An ultrahigh-resolution spectrometer can detect the minute thermal energy deposited by a single gamma ray or neutron; (4) Babel Speeds Communication among Programming Languages--The Babel program allows software applications in different programming languages to communicate quickly; (5) A Gem of a Software Tool--The data-mining software Sapphire allows scientists to analyze enormous data sets generated by diverse applications; (6) Interferometer Improves the Search for Planets--With externally dispersed interferometry, astronomers can use an inexpensive, compact instrument to search for distant planets; (7) Efficiently Changing the Color of Laser Light--Yttrium-calcium-oxyborate crystals provide an efficient, compact approach to wavelength conversion for high-average-power lasers; (8) Pocket-Sized Test Detects Trace Explosives--A detection kit sensitive to more than 30 explosives provides an inexpensive, easy-to-use tool for security forces everywhere; (9) Tailor-Made Microdevices Serve Big Needs--The Center for Micro- and Nanotechnology develops tiny devices for national security.
Empowering a safer practice: PDAs are integral tools for nursing and health care.
Hudson, Kathleen; Buell, Virginia
2011-04-01
This study's purpose was to assess the characteristics of personal digital assistant (PDA) uptake and use in both clinical and classroom work for baccalaureate student nurses (BSN) within a rural Texas university. Patient care has become more complicated, risk-prone, automated and costly. Efficiencies at the bedside are needed to continue to provide safe and successful care within this environment. A purposive sample of nursing students using PDAs throughout their educational process was studied at three campus sites. The initial sample size was 105 students, followed by 94 students at the end of the first semester and 75 students at curriculum completion at the end of a 2-year period. Students completed structured and open-ended questions to assess their perspectives on PDA usage. Student uptake varied in relation to overall competency, from minimal to high utilization, and was influenced by product costs. PDAs are developing into useful clinical tools by providing quick access to important information for safer care. Using bedside PDAs effectively assists with maintaining patient safety, efficiency of care delivery and staff satisfaction. This study evaluates the initial implementation of PDAs by students, our future multitasking nurses.
Are prices enough? The economics of material demand reduction
Aidt, Toke; Jia, Lili; Low, Hamish
2017-01-01
Recent policy proposals to achieve carbon targets have emphasized material demand reduction strategies aimed at achieving material efficiency. We provide a bridge between the way economists and engineers think about efficiency. We use the tools of economics to think about policies directed at material efficiency and to evaluate the role and rationale for such policies. The analysis highlights when prices (or taxes) can be used to induce changes in material use and when taxes may not work. We argue that the role of taxes is limited by concerns about their distributional consequences, by international trade and the lack of international agreement on carbon prices, and by investment failures. This article is part of the themed issue ‘Material demand reduction’. PMID:28461434
Blijleven, Vincent; Koelemeijer, Kitty; Wetzels, Marijntje; Jaspers, Monique
2017-10-05
Health care providers resort to informal temporary practices known as workarounds for handling exceptions to normal workflow unintentionally imposed by electronic health record systems (EHRs). Although workarounds may seem favorable at first sight, they are generally suboptimal and may jeopardize patient safety, effectiveness of care, and efficiency of care. Research into the scope and impact of EHR workarounds on patient care processes is scarce. This paper provides insight into the effects of EHR workarounds on organizational workflows and outcomes of care services by identifying EHR workarounds and determining their rationales, scope, and impact on health care providers' workflows, patient safety, effectiveness of care, and efficiency of care. Knowing the rationale of a workaround provides valuable clues about the source of origin of each workaround and how each workaround could most effectively be resolved. Knowing the scope and impact a workaround has on EHR-related safety, effectiveness, and efficiency provides insight into how to address related concerns. Direct observations and follow-up semistructured interviews with 31 physicians, 13 nurses, and 3 clerks, together with qualitative bottom-up coding techniques, were used to identify, analyze, and classify EHR workarounds. The research was conducted within 3 specialties and settings at a large university hospital. Rationales were associated with work system components (persons, technology and tools, tasks, organization, and physical environment) of the Systems Engineering Initiative for Patient Safety (SEIPS) framework to reveal their source of origin as well as to determine the scope and the impact of each EHR workaround from a structure-process-outcome perspective. A total of 15 rationales for EHR workarounds were identified, of which 5 were associated with persons, 4 with technology and tools, 4 with the organization, and 2 with the tasks. Three of these 15 rationales have not been identified in prior research: data migration policy, enforced data entry, and task interference. EHR workaround rationales associated with different SEIPS work system components demand different approaches to be resolved. Person-related workarounds may most effectively be resolved through personal training, organization-related workarounds through reviewing organizational policy and regulations, task-related workarounds through process redesign, and technology- and tools-related workarounds through EHR redesign efforts. Furthermore, insights gained from knowing a workaround's degree of influence as well as its impact on patient safety, effectiveness of care, and efficiency of care can inform design and redesign of EHRs to further align EHR design with work contexts, subsequently leading to better organization and (safe) provision of care. In doing so, a research team in collaboration with all stakeholders could use the SEIPS framework to reflect on current and potential future configurations of the work system, to prevent unfavorable workarounds from occurring, and to assess how a redesign of the EHR would impact interactions between the work system components.
Prykhozhij, Sergey V; Rajan, Vinothkumar; Berman, Jason N
2016-02-01
The development of clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9 technology for mainstream biotechnological use based on its discovery as an adaptive immune mechanism in bacteria has dramatically improved the ability of molecular biologists to modify genomes of model organisms. The zebrafish is highly amenable to applications of CRISPR/Cas9 for mutation generation and a variety of DNA insertions. Cas9 protein in complex with a guide RNA molecule recognizes where to cut the homologous DNA based on a short stretch of DNA termed the protospacer-adjacent motif (PAM). Rapid and efficient identification of target sites immediately preceding PAM sites, quantification of genomic occurrences of similar (off target) sites and predictions of cutting efficiency are some of the features where computational tools play critical roles in CRISPR/Cas9 applications. Given the rapid advent and development of this technology, it can be a challenge for researchers to remain up to date with all of the important technological developments in this field. We have contributed to the armamentarium of CRISPR/Cas9 bioinformatics tools and trained other researchers in the use of appropriate computational programs to develop suitable experimental strategies. Here we provide an in-depth guide on how to use CRISPR/Cas9 and other relevant computational tools at each step of a host of genome editing experimental strategies. We also provide detailed conceptual outlines of the steps involved in the design and execution of CRISPR/Cas9-based experimental strategies, such as generation of frameshift mutations, larger chromosomal deletions and inversions, homology-independent insertion of gene cassettes and homology-based knock-in of defined point mutations and larger gene constructs.
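The target-site identification step the abstract describes reduces to a simple scan for 20-nt protospacers immediately 5' of an NGG PAM; a minimal sketch of that search (off-target counting and cutting-efficiency scoring, which the dedicated tools add, are layered on top of this):

```python
import re

def find_cas9_targets(seq):
    """Return (position, protospacer) pairs for 20-nt stretches immediately
    preceding an NGG PAM on the given strand, the basic search behind most
    CRISPR/Cas9 guide-design tools. Overlapping sites are all reported."""
    seq = seq.upper()
    return [(m.start(1), m.group(1))
            for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq)]

print(find_cas9_targets("ACGT" * 12 + "TGG" + "ACGT" * 3))
```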
Kalendar, Ruslan; Tselykh, Timofey V; Khassenov, Bekbolat; Ramanculov, Erlan M
2017-01-01
This chapter introduces the FastPCR software as an integrated tool environment for PCR primer and probe design, which predicts properties of oligonucleotides based on experimental studies of PCR efficiency. The software provides comprehensive facilities for designing primers for most PCR applications and their combinations. These include standard PCR as well as multiplex, long-distance, inverse, real-time, group-specific, unique, and overlap-extension PCR for multi-fragment assembly cloning, and loop-mediated isothermal amplification (LAMP). It also contains a built-in program to design oligonucleotide sets both for long sequence assembly by ligase chain reaction and for the design of amplicons that tile across a region(s) of interest. The software calculates the melting temperature for standard and degenerate oligonucleotides, including locked nucleic acid (LNA) and other modifications. It also provides analyses for a set of primers, with prediction of oligonucleotide properties, dimer and G/C-quadruplex detection, and linguistic complexity, as well as a primer dilution and resuspension calculator. The program includes various bioinformatics tools for analysis of sequences with GC or AT skew, CG% and GA% content, and purine-pyrimidine skew. It also analyzes linguistic sequence complexity and performs generation of random DNA sequences as well as restriction endonuclease analysis. The program allows the user to find or create restriction enzyme recognition sites for coding sequences and supports the clustering of sequences. It performs efficient and complete detection of various repeat types with visual display. FastPCR supports batch processing of sequence files, which is essential for automation. The program is available for download at http://primerdigital.com/fastpcr.html , and its online version is located at http://primerdigital.com/tools/pcr.html .
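FastPCR's melting temperatures come from experimentally calibrated parameters; as the simplest illustration of the kind of oligonucleotide property such a tool reports, the classic Wallace 2+4 rule and GC content for a short primer:

```python
def wallace_tm(primer):
    """Wallace (2+4) rule, the simplest melting-temperature estimate for
    short oligos: Tm = 2*(A+T) + 4*(G+C) degrees C. Production tools use
    nearest-neighbour thermodynamics instead."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def gc_content(primer):
    """GC percentage, one of the basic primer properties reported."""
    p = primer.upper()
    return 100.0 * (p.count("G") + p.count("C")) / len(p)

primer = "ATGGCTAGCTAGGCTT"
print(wallace_tm(primer), round(gc_content(primer), 1))
```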
The center for causal discovery of biomedical knowledge from big data
Bahar, Ivet; Becich, Michael J; Benos, Panayiotis V; Berg, Jeremy; Espino, Jeremy U; Glymour, Clark; Jacobson, Rebecca Crowley; Kienholz, Michelle; Lee, Adrian V; Lu, Xinghua; Scheines, Richard
2015-01-01
The Big Data to Knowledge (BD2K) Center for Causal Discovery is developing and disseminating an integrated set of open source tools that support causal modeling and discovery of biomedical knowledge from large and complex biomedical datasets. The Center integrates teams of biomedical and data scientists focused on the refinement of existing and the development of new constraint-based and Bayesian algorithms based on causal Bayesian networks, the optimization of software for efficient operation in a supercomputing environment, and the testing of algorithms and software developed using real data from 3 representative driving biomedical projects: cancer driver mutations, lung disease, and the functional connectome of the human brain. Associated training activities provide both biomedical and data scientists with the knowledge and skills needed to apply and extend these tools. Collaborative activities with the BD2K Consortium further advance causal discovery tools and integrate tools and resources developed by other centers. PMID:26138794
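For a flavour of the constraint-based algorithms mentioned, here is a sketch of the order-0 step of a PC-style search (not the Center's software): an edge between two variables survives only if their marginal correlation is significant under Fisher's z test; full algorithms continue with higher-order conditional-independence tests and edge-orientation rules.

```python
import numpy as np
from itertools import combinations
from scipy import stats

def skeleton(data, alpha=0.01):
    """Order-0 skeleton step: keep edge (i, j) only if the marginal
    correlation between columns i and j is significant by Fisher's z."""
    n, p = data.shape
    edges = set(combinations(range(p), 2))
    for i, j in list(edges):
        r = np.corrcoef(data[:, i], data[:, j])[0, 1]
        z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - 3)   # Fisher z
        p_value = 2 * (1 - stats.norm.cdf(abs(z)))
        if p_value > alpha:           # independence not rejected: drop edge
            edges.discard((i, j))
    return edges

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = 0.8 * x + 0.3 * rng.standard_normal(500)   # y depends on x
z = rng.standard_normal(500)                   # z independent of both
print(skeleton(np.column_stack([x, y, z])))    # expect {(0, 1)}
```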
Web Audio/Video Streaming Tool
NASA Technical Reports Server (NTRS)
Guruvadoo, Eranna K.
2003-01-01
In order to promote NASA-wide educational outreach program to educate and inform the public of space exploration, NASA, at Kennedy Space Center, is seeking efficient ways to add more contents to the web by streaming audio/video files. This project proposes a high level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, the prototype of a web-based tool is designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-enabled users interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database while the assets reside on separate repository. The prototype tool is designed using ColdFusion 5.0.
Examining Researcher Needs and Barriers for using Electronic Health Data for Translational Research
Stephens, Kari A.; Lee, E. Sally; Estiri, Hossein; Jung, Hyunggu
2015-01-01
To achieve the Learning Health Care System, we must harness electronic health data (EHD) by providing effective tools for researchers to access data efficiently. EHD is proliferating and researchers are relying on these data to pioneer discovery. Tools must be user-centric to ensure their utility. To this end, we conducted a qualitative study to assess researcher needs and barriers to using EHD. Researchers expressed the need to be confident about the data and have easy access, a clear process for exploration and access, and adequate resources, while barriers included difficulties in finding datasets, usability of the data, cumbersome processes, and lack of resources. These needs and barriers can inform the design process for innovating tools to increase utility of EHD. Understanding researcher needs is key to building effective user-centered EHD tools to support translational research. PMID:26306262
Development of the HD-Teen Inventory
Driessnack, Martha; Williams, Janet K.; Barnette, J. Jackson; Sparbel, Kathleen J.; Paulsen, Jane S.
2013-01-01
Adolescents, who have a parent with Huntington Disease (HD), not only are at genetic risk for HD but also are witness to its onset and devastating clinical progression as their parent declines. To date, no mechanism has been developed to direct health care providers to the atypical adolescent experiences of these teens. The purpose of this report is to describe the process of developing the HD-Teen Inventory clinical assessment tool. Forty-eight teens and young adults from 19 U.S. states participated in the evaluation of the HD-Teen Inventory tool. Following item analysis, the number of items was reduced and item frequency and reaction scales were combined, based on the strong correlation (r = .94). The resultant tool contains 15 inventory and 2 open-ended response items. The HD-Teen Inventory emerged as a more compact and efficient tool for identifying the most salient concerns of at-risk teens in HD families in research and/or clinical practice. PMID:21632913
EasyModeller: A graphical interface to MODELLER
2010-01-01
Background MODELLER is a program for automated protein homology modeling. It is one of the most widely used tools for homology or comparative modeling of protein three-dimensional structures, but most users find it difficult to get started with because it is command-line based and requires knowledge of basic Python scripting to use efficiently. Findings The study was designed with the aim of developing "EasyModeller" as a frontend graphical interface to MODELLER using Perl/Tk, which can be used as a standalone tool on the Windows platform with MODELLER and Python preinstalled. It helps inexperienced users to perform modeling, assessment, visualization, and optimization of protein models in a simple and straightforward way. Conclusion EasyModeller provides a straightforward graphical interface and functions as a stand-alone tool which can be used on a standard personal computer with Microsoft Windows as the operating system. PMID:20712861
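For context, the kind of script MODELLER expects, and which EasyModeller spares the user from writing by hand, follows the classic MODELLER tutorial pattern below (older-style lowercase class names); the alignment file and structure codes are placeholders:

```python
# Classic-style MODELLER automodel script; 'target-template.ali',
# 'templateA' and 'target' are placeholder names for illustration.
from modeller import *
from modeller.automodel import automodel

env = environ()
a = automodel(env,
              alnfile='target-template.ali',  # PIR-format alignment
              knowns='templateA',             # known template structure code
              sequence='target')              # sequence to be modeled
a.starting_model = 1
a.ending_model = 5                            # build five candidate models
a.make()
```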
NASA Technical Reports Server (NTRS)
Witoff, Robert J.; Doody, David F.
2012-01-01
At the time of this reporting, there are 2,589 rich mobile devices used at JPL, including 1,550 iPhones and 968 Blackberrys. Considering a total JPL population of 5,961 employees, mobile applications have a total addressable market of 43 percent of the employees at JPL, and that number is rising. While it was found that no existing desktop tools can realistically be replaced by a mobile application, there is certainly a need to improve access to these desktop tools. When an alarm occurs and an engineer is away from his desk, a convenient means of accessing relevant data can save an engineer a great deal of time and improve his job efficiency. To identify which data is relevant, an engineer benefits from a succinct overview of the data housed in 13+ tools. This need can be well met by a single, rich, mobile application that provides access to desired data across tools in the ops infrastructure.
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
Christison, Amy L; Daley, Brendan M; Asche, Carl V; Ren, Jinma; Aldag, Jean C; Ariza, Adolfo J; Lowry, Kelly W
2014-10-01
Recommendations to screen and counsel for lifestyle behaviors can be challenging to implement during well-child visits in the primary care setting. A practice intervention was piloted using the Family Nutrition and Physical Activity (FNPA) Screening Tool paired with a motivational interviewing (MI)-based counseling tool during well-child visits. Acceptability and feasibility of this intervention were assessed. Its impact on parent-reported obesogenic behavior change and provider efficacy in lifestyle counseling were also examined. This was an observational study in a pediatric primary care office. During well-child visits of 100 patients (ages 4-16 years), the FNPA tool was implemented and providers counseled patients in an MI-consistent manner based on its results. Duration of implementation, patient satisfaction with the intervention, and success of stated lifestyle goals were measured. Provider self-efficacy and acceptability were also surveyed. The FNPA assessment was efficient to administer, requiring minutes to complete and score. Patient acceptability was high, ranging from 4.0 to 4.8 on a 5-point scale. Provider acceptability was good, with the exception of duration of counseling; self-efficacy in assessing patient "readiness for change" was improved. Parent-reported success of the primary lifestyle goal was 68% at 1 month and 46% at 6 months. The FNPA assessment with an MI-based counseling tool shows promise as an approach to identify and address obesogenic behaviors during pediatric well-child visits. It has the potential to improve provider efficacy in obesity prevention and also influence patient health behaviors, which can possibly reduce excessive childhood weight gain. After refinement, this practice intervention will be used in a larger trial.
Experimental statistical signature of many-body quantum interference
NASA Astrophysics Data System (ADS)
Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio
2018-03-01
Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.
Visual display aid for orbital maneuvering - Design considerations
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Ellis, Stephen R.
1993-01-01
This paper describes the development of an interactive proximity operations planning system that allows on-site planning of fuel-efficient multiburn maneuvers in a potential multispacecraft environment. Although this display system most directly assists planning by providing visual feedback to aid visualization of the trajectories and constraints, its most significant features include: (1) the use of an 'inverse dynamics' algorithm that removes control nonlinearities facing the operator, and (2) a trajectory planning technique that separates, through a 'geometric spreadsheet', the normally coupled complex problems of planning orbital maneuvers and allows solution by an iterative sequence of simple independent actions. The visual feedback of trajectory shapes and operational constraints, provided by user-transparent and continuously active background computations, allows the operator to make fast, iterative design changes that rapidly converge to fuel-efficient solutions. The planning tool provides an example of operator-assisted optimization of nonlinear cost functions.
Trust and Privacy Solutions Based on Holistic Service Requirements.
Sánchez Alcón, José Antonio; López, Lourdes; Martínez, José-Fernán; Rubio Cifuentes, Gregorio
2015-12-24
The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens' information about their activity, preferences, habits, etc. opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one of the possible solutions to manage this heterogeneity, bearing in mind these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate the collaboration between the business, legal and technological areas. This will ease the implementation of adequate specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert system processing.
Simple Tools to Facilitate Project Management of a Nursing Research Project.
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
2016-07-01
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Tool for cutting insulation from electrical cables
Harless, Charles E.; Taylor, Ward G.
1978-01-01
This invention is an efficient hand tool for precisely slitting the sheath of insulation on an electrical cable--e.g., a cable two inches in diameter--in a manner facilitating subsequent peeling or stripping of the insulation. The tool includes a rigid frame which is slidably fitted on an end section of the cable. The frame carries a rigidly affixed handle and an opposed, elongated blade-and-handle assembly. The blade-and-handle assembly is pivotally supported by a bracket which is slidably mounted on the frame for movement toward and away from the cable, thus providing an adjustment for the depth of cut. The blade-and-handle assembly is mountable to the bracket in two pivotable positions. With the assembly mounted in the first position, the tool is turned about the cable to slit the insulation circumferentially. With the assembly mounted in the second position, the tool is drawn along the cable to slit the insulation axially. When cut both circumferentially and axially, the insulation can easily be peeled from the cable.
Benchmarking CRISPR on-target sgRNA design.
Yan, Jifang; Chuai, Guohui; Zhou, Chi; Zhu, Chenyu; Yang, Jing; Zhang, Chao; Gu, Feng; Xu, Han; Wei, Jia; Liu, Qi
2017-02-15
CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-based gene editing has been widely implemented in various cell types and organisms. A major challenge in the effective application of the CRISPR system is the need to design highly efficient single-guide RNA (sgRNA) with minimal off-target cleavage. Several tools are available for sgRNA design, but few have been systematically compared. In our opinion, benchmarking the performance of the available tools and indicating their applicable scenarios are important issues. Moreover, whether the reported sgRNA design rules are reproducible across different sgRNA libraries, cell types and organisms remains unclear. In our study, a systematic and unbiased benchmark of sgRNA efficacy prediction was performed on nine representative on-target design tools, based on six benchmark data sets covering five different cell types. The benchmark study presented here provides novel quantitative insights into the available CRISPR tools. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.
2006-01-01
This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high-fidelity geometry-based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing studies and weight optimization. The high-quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high-fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.
Lin, Hongli; Wang, Weisheng; Luo, Jiawei; Yang, Xuedong
2014-12-01
The aim of this study was to develop a personalized training system using the Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) database, because collecting, annotating, and marking a large number of appropriate computed tomography (CT) scans, and providing the capability of dynamically selecting suitable training cases based on the performance levels of trainees and the characteristics of cases, are critical for developing an efficient training system. A novel approach is proposed to develop a personalized radiology training system for the interpretation of lung nodules in CT scans using the LIDC/IDRI database. It provides a Content-Boosted Collaborative Filtering (CBCF) algorithm for predicting the difficulty level of each case for each trainee when selecting suitable cases to meet individual needs, and a diagnostic simulation tool to enable trainees to analyze and diagnose lung nodules with the help of an image processing tool and a nodule retrieval tool. Preliminary evaluation of the system shows that developing a personalized training system for the interpretation of lung nodules is needed and useful for enhancing the professional skills of trainees. The approach of developing personalized training systems using the LIDC/IDRI database is a feasible solution to the challenges of constructing specific training programs in terms of cost and training efficiency. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
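The CBCF step can be sketched roughly as follows, assuming the common content-boosted formulation (fill the sparse trainee-case difficulty matrix with content-based predictions, then run user-based collaborative filtering on the densified matrix); the paper's exact weighting may differ.

```python
import numpy as np

def cbcf_predict(ratings, content_pred, trainee, case, k=10):
    """Content-Boosted Collaborative Filtering sketch: missing difficulty
    ratings are filled with content-based predictions, then a user-based
    collaborative filter predicts the target trainee's rating for a case."""
    pseudo = np.where(np.isnan(ratings), content_pred, ratings)  # dense pseudo-ratings
    target = pseudo[trainee]
    # Pearson similarity between the target trainee and every other trainee
    sims = np.array([np.corrcoef(target, row)[0, 1] for row in pseudo])
    sims[trainee] = -np.inf                       # exclude self
    neighbors = np.argsort(sims)[-k:]             # k most similar trainees
    w = sims[neighbors]
    dev = pseudo[neighbors, case] - pseudo[neighbors].mean(axis=1)
    return target.mean() + np.dot(w, dev) / np.abs(w).sum()
```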
ParBiBit: Parallel tool for binary biclustering on modern distributed-memory systems.
González-Domínguez, Jorge; Expósito, Roberto R
2018-01-01
Biclustering techniques are gaining attention in the analysis of large-scale datasets as they identify two-dimensional submatrices where both rows and columns are correlated. In this work we present ParBiBit, a parallel tool to accelerate the search for interesting biclusters in binary datasets, which are very popular in different fields such as genetics, marketing and text mining. It is based on the state-of-the-art sequential Java tool BiBit, which has been proved accurate by several studies, especially in scenarios that result in many large biclusters. ParBiBit uses the same methodology as BiBit (grouping the binary information into patterns) and provides the same results. Nevertheless, our tool significantly improves performance thanks to an efficient implementation based on C++11 that includes support for threads and MPI processes in order to exploit the compute capabilities of modern distributed-memory systems, which provide several multicore CPU nodes interconnected through a network. Our performance evaluation with 18 representative input datasets on two different eight-node systems shows that our tool is significantly faster than the original BiBit. Source code in C++ and MPI running on Linux systems as well as a reference manual are available at https://sourceforge.net/projects/parbibit/.
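The bit-pattern methodology that ParBiBit parallelizes can be sketched compactly: encode each binary row as a bit pattern, AND pairs of rows to form candidate patterns, and collect all rows matching each pattern (in ParBiBit the pair loop is what gets distributed across threads and MPI processes). A rough sketch:

```python
from itertools import combinations

def bibit_biclusters(matrix, min_rows=2, min_cols=2):
    """Sketch of the BiBit methodology: each binary row becomes an integer
    bit pattern; ANDing every pair of rows yields a candidate pattern, and
    all rows whose pattern contains it form a bicluster."""
    n_cols = len(matrix[0])
    rows = [int(''.join(map(str, r)), 2) for r in matrix]   # row -> bit pattern
    seen, biclusters = set(), []
    for i, j in combinations(range(len(rows)), 2):
        pattern = rows[i] & rows[j]                         # shared 1-columns of the pair
        if pattern in seen or bin(pattern).count('1') < min_cols:
            continue
        seen.add(pattern)
        members = [r for r, bits in enumerate(rows) if bits & pattern == pattern]
        if len(members) >= min_rows:
            cols = [c for c in range(n_cols) if pattern >> (n_cols - 1 - c) & 1]
            biclusters.append((members, cols))
    return biclusters
```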
GIS tool to locate major Sikh temples in USA
NASA Astrophysics Data System (ADS)
Sharma, Saumya
This tool is a GIS-based interactive and graphical user interface tool which locates the major Sikh temples of the USA on a map. It uses the Java programming language along with MOJO (Map Object Java Object) provided by ESRI, the organization that develops the GIS software, and also integrates with Google APIs such as the Google Translator API. This application tells users about the origin of Sikhism in India and the USA and, for the major Sikh temples in each state of the USA, provides location, name, and detailed information through their websites. The primary purpose of this application is to make people aware of this religion and culture. The tool can also measure the distance between two temple points on a map and display the result in miles and kilometers. There is added support to convert each temple's website from English to Punjabi or any other language using a language converter tool, so that people from different nationalities can understand the culture. Clicking on each point on the map opens a new window showing a picture of the temple and a hyperlink that redirects to the website of that particular temple. The application also contains links to their dance, music, and history, as well as a help menu to guide users in using the software efficiently.
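The distance feature reduces to a great-circle computation. A sketch in Python (the tool itself is Java/MOJO), with hypothetical coordinates:

```python
import math

def temple_distance(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two temple locations,
    returned in both kilometers and miles, as the tool displays."""
    R_KM = 6371.0                                    # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    km = 2 * R_KM * math.asin(math.sqrt(a))
    return km, km * 0.621371

# e.g. distance between two hypothetical temple coordinates
km, mi = temple_distance(37.69, -121.77, 34.05, -118.24)
```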
NASA Astrophysics Data System (ADS)
Melton, F. S.; Huntington, J. L.; Johnson, L.; Guzman, A.; Morton, C.; Zaragoza, I.; Dexter, J.; Rosevelt, C.; Michaelis, A.; Nemani, R. R.; Cahn, M.; Temesgen, B.; Trezza, R.; Frame, K.; Eching, S.; Grimm, R.; Hall, M.
2017-12-01
In agricultural regions around the world, threats to water supplies from drought and groundwater depletion are driving increased demand for tools to advance agricultural water use efficiency and support sustainable groundwater management. Satellite mapping of evapotranspiration (ET) from irrigated agricultural lands can provide agricultural producers and water resource managers with information that can be used to both optimize ag water use and improve estimates of groundwater withdrawals for irrigation. We describe the development of two remote sensing-based tools for ET mapping in California, including important lessons in terms of system design, partnership development, and transition to operations. For irrigation management, the integration of satellite data and surface sensor networks to provide timely delivery of information on crop water requirements can make irrigation scheduling more practical, convenient, and accurate. Developed through a partnership between NASA and the CA Department of Water Resources, the Satellite Irrigation Management Support (SIMS) framework integrates satellite data with information from agricultural weather networks to map crop canopy development and crop water requirements at the scale of individual fields. Information is distributed to agricultural producers and water managers via a web-based interface and web data services. SIMS also provides an API that facilitates integration with other irrigation decision support tools, such as CropManage and IrriQuest. Field trials using these integrated tools have shown that they can be used to sustain yields while improving water use efficiency and nutrient management. For sustainable groundwater management, the combination of satellite-derived estimates of ET and data on surface water deliveries for irrigation can increase the accuracy of estimates of groundwater pumping. We are developing an OpenET platform to facilitate access to ET data from multiple models and accelerate operational use of ET data in support of a range of water management applications, including implementation of the Sustainable Groundwater Management Act in CA. By providing a shared basis for decision making, we anticipate that the OpenET platform will accelerate implementation of solutions for sustainable groundwater management.
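A field-scale crop water requirement of the kind SIMS maps can be sketched as reference ET scaled by an NDVI-derived crop coefficient; the linear coefficients below are illustrative assumptions, not the operational SIMS relationships.

```python
def crop_water_requirement(ndvi, eto_mm_day):
    """Hedged sketch of a SIMS-style field-scale estimate: satellite NDVI is
    converted to fractional canopy cover, cover to a crop coefficient, and
    the coefficient scales reference ET from an agricultural weather network.
    The linear fits below are illustrative, not the operational SIMS values."""
    fc = max(0.0, min(1.0, 1.26 * ndvi - 0.18))   # NDVI -> fractional cover (assumed fit)
    kc = 0.15 + 1.05 * fc                         # cover -> crop coefficient (assumed fit)
    return kc * eto_mm_day                        # crop ET, mm/day

# daily crop water requirement for a field with NDVI 0.75 and ETo of 6 mm/day
etc = crop_water_requirement(0.75, 6.0)
```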
Tool Efficiency Analysis model research in SEMI industry
NASA Astrophysics Data System (ADS)
Lei, Ma; Nana, Zhang; Zhongqiu, Zhang
2018-06-01
One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA (Tool Efficiency Analysis) system model that analyzes tool performance automatically using a finite state machine. The system was applied to fab tools and its effectiveness was verified; it yielded the parameter values used to measure equipment performance, along with advice for improvement.
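A sketch of such a state model, assuming the six SEMI E10 equipment states; the transition subset and the utilization metric are illustrative, not the paper's actual TEA model.

```python
# Hedged sketch of a TEA-style finite state machine. The six states follow
# SEMI E10; the allowed-transition subset and the metric are illustrative.
E10_STATES = {"productive", "standby", "engineering",
              "scheduled_downtime", "unscheduled_downtime", "non_scheduled"}

ALLOWED = {  # (from_state, to_state) pairs the model accepts (illustrative)
    ("standby", "productive"), ("productive", "standby"),
    ("productive", "unscheduled_downtime"),
    ("unscheduled_downtime", "engineering"), ("engineering", "standby"),
    ("standby", "scheduled_downtime"), ("scheduled_downtime", "standby"),
}

def is_valid_transition(prev_state, new_state):
    """Reject event-log transitions the state model does not define."""
    return (prev_state, new_state) in ALLOWED

def utilization(events):
    """events: (state, duration_hours) records from the equipment log;
    returns the fraction of tracked time spent in the productive state."""
    total = sum(d for _, d in events)
    productive = sum(d for s, d in events if s == "productive")
    return productive / total if total else 0.0
```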
An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets.
Hosseini, Parsa; Tremblay, Arianne; Matthews, Benjamin F; Alkharouf, Nadim W
2010-07-02
An Illumina flow cell with all eight lanes occupied produces well over a terabyte worth of images, with gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance. One can very easily be flooded with such a great volume of textual, unannotated data, irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, enables INDEL detection, SNP identification, and allele calling. Extracting from such analyses a measure of gene expression in the form of tag counts, and furthermore annotating those reads, is therefore of significant value. We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag counts while annotating sequenced reads with each gene's presumed function, from any given CASAVA build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag counting and annotation. The end result is output containing the homology-based functional annotation and the respective gene expression measure, signifying how many times sequenced reads were found within the genomic ranges of functional annotations. TASE is a powerful tool that facilitates the process of annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are achieved in very efficient times, allowing researchers to delve deep into a given CASAVA build and maximize information extraction from a sequencing dataset. TASE is specially designed to translate sequence data in a CASAVA build into functional annotations while producing corresponding gene expression measurements. Such analysis is executed in an ultrafast and highly efficient manner, whether the analysis is a single-read or paired-end sequencing experiment. TASE is a user-friendly and freely available application, allowing rapid analysis and annotation of any given Illumina Solexa sequencing dataset with ease.
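The tag-counting component reduces to counting aligned read positions that fall inside annotated genomic ranges. A rough sketch, assuming non-overlapping annotation intervals and hypothetical gene names:

```python
from collections import defaultdict
import bisect

def count_tags(read_positions, annotations):
    """TASE-style tag counting sketch: annotations are (start, end, gene)
    genomic ranges; each aligned read position falling inside a range
    increments that gene's expression tag count. Assumes the ranges do
    not overlap one another."""
    annotations = sorted(annotations)
    starts = [a[0] for a in annotations]
    counts = defaultdict(int)
    for pos in read_positions:
        i = bisect.bisect_right(starts, pos) - 1   # candidate interval to the left
        if i >= 0 and annotations[i][0] <= pos <= annotations[i][1]:
            counts[annotations[i][2]] += 1
    return counts

# e.g. reads at positions 120 and 450 against two hypothetical annotated genes
tags = count_tags([120, 450], [(100, 300, "geneA"), (400, 900, "geneB")])
```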
NASA Astrophysics Data System (ADS)
Sanan, P.; Tackley, P. J.; Gerya, T.; Kaus, B. J. P.; May, D.
2017-12-01
StagBL is an open-source parallel solver and discretization library for geodynamic simulation, encapsulating and optimizing operations essential to staggered-grid finite volume Stokes flow solvers. It provides a parallel staggered-grid abstraction with a high-level interface in C and Fortran. On top of this abstraction, tools are available to define boundary conditions and interact with particle systems. Tools and examples to efficiently solve Stokes systems defined on the grid are provided in small (direct solver), medium (simple preconditioners), and large (block factorization and multigrid) model regimes. By working directly with leading application codes (StagYY, I3ELVIS, and LaMEM) and providing an API and examples to integrate with others, StagBL aims to become a community tool supplying scalable, portable, reproducible performance toward novel science in regional- and planet-scale geodynamics and planetary science. By implementing kernels used by many research groups beneath a uniform abstraction layer, the library will enable optimization for modern hardware, thus reducing community barriers to large- or extreme-scale parallel simulation on modern architectures. In particular, the library will include CPU-, manycore-, and GPU-optimized variants of matrix-free operators and multigrid components. The common layer provides a framework upon which to introduce innovative new tools. StagBL will leverage p4est to provide distributed adaptive meshes, and incorporate a multigrid convergence analysis tool. These options, in addition to a wealth of solver options provided by an interface to PETSc, will make the most modern solution techniques available from a common interface. StagBL in turn provides a PETSc interface, DMStag, to its central staggered grid abstraction. We present public version 0.5 of StagBL, including preliminary integration with application codes and demonstrations with its own demonstration application, StagBLDemo. Central to StagBL is the notion of an uninterrupted pipeline from toy/teaching codes to high-performance, extreme-scale solves. StagBLDemo replicates the functionality of an advanced MATLAB-style regional geodynamics code, thus providing users with a concrete procedure to exceed the performance and scalability limitations of smaller-scale tools.
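The staggered-grid abstraction at the heart of the library places pressure at cell centers and velocity components on cell faces. A toy illustration of that layout and of the finite-volume divergence a Stokes solver drives to zero (array sizes and spacing are illustrative; StagBL itself exposes this through its C/Fortran API, not Python):

```python
import numpy as np

# Minimal staggered-grid layout: pressure at cell centers, velocities on
# cell faces. Grid size and spacing are placeholder values.
nx, ny, h = 8, 8, 1.0
p  = np.zeros((ny, nx))        # cell centers
vx = np.zeros((ny, nx + 1))    # x-velocities on vertical faces
vy = np.zeros((ny + 1, nx))    # y-velocities on horizontal faces

def divergence(vx, vy, h):
    """Finite-volume divergence evaluated at cell centers, the quantity a
    Stokes solver drives to zero to enforce incompressibility."""
    return (vx[:, 1:] - vx[:, :-1]) / h + (vy[1:, :] - vy[:-1, :]) / h
```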
Recov'Heat: An estimation tool of urban waste heat recovery potential in sustainable cities
NASA Astrophysics Data System (ADS)
Goumba, Alain; Chiche, Samuel; Guo, Xiaofeng; Colombert, Morgane; Bonneau, Patricia
2017-02-01
Waste heat recovery is considered an efficient way to increase carbon-free green energy utilization and to reduce greenhouse gas emissions. Especially in urban areas, several sources such as sewage water, industrial processes, waste incinerator plants, etc., are still rarely exploited. Their integration into a district heating system providing heating and/or domestic hot water could be beneficial for both energy companies and local governments. EFFICACITY, a French research institute focused on urban energy transition, has developed an estimation tool for the different waste heat sources that could potentially be exploited in a sustainable city. This article presents the development method of this decision-making tool which, by providing both energetic and economic analyses, helps local communities and energy service companies carry out preliminary studies for heat recovery projects.
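A first-order estimate of the kind such a tool produces for a water-borne source follows from Q = ṁ·cp·ΔT; the heat-exchanger efficiency and the flow and temperature figures below are illustrative assumptions.

```python
def recoverable_heat_kw(flow_m3_per_h, t_source_c, t_return_c, efficiency=0.8):
    """First-order estimate of recoverable waste heat from a water-borne
    source such as sewage: Q = m_dot * cp * dT, derated by an assumed
    heat-exchanger efficiency."""
    rho, cp = 1000.0, 4.186                      # kg/m3 and kJ/(kg.K) for water
    m_dot = flow_m3_per_h * rho / 3600.0         # mass flow, kg/s
    q_kw = m_dot * cp * (t_source_c - t_return_c)
    return q_kw * efficiency

# e.g. 100 m3/h of sewage cooled from 15 degC to 10 degC -> roughly 465 kW
q = recoverable_heat_kw(100, 15, 10)
```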
U.S. Geological Survey science for the Wyoming Landscape Conservation Initiative—2014 annual report
Bowen, Zachary H.; Aldridge, Cameron L.; Anderson, Patrick J.; Assal, Timothy J.; Bartos, Timothy T.; Biewick, Laura R; Boughton, Gregory K.; Chalfoun, Anna D.; Chong, Geneva W.; Dematatis, Marie K.; Eddy-Miller, Cheryl A.; Garman, Steven L.; Germaine, Stephen S.; Homer, Collin G.; Huber, Christopher; Kauffman, Matthew J.; Latysh, Natalie; Manier, Daniel; Melcher, Cynthia P.; Miller, Alexander; Miller, Kirk A.; Olexa, Edward M.; Schell, Spencer; Walters, Annika W.; Wilson, Anna B.; Wyckoff, Teal B.
2015-01-01
Finally, capabilities of the WLCI Web site and the USGS ScienceBase infrastructure were maintained and upgraded to help ensure access to and efficient use of all the WLCI data, products, assessment tools, and outreach materials that have been developed. Of particular note is the completion of three Web applications developed for mapping (1) the 1900−2008 progression of oil and gas development; (2) the predicted distributions of Wyoming’s Species of Greatest Conservation Need; and (3) the locations of coal and wind energy production, sage-grouse distribution and core management areas, and alternative routes for transmission lines within the WLCI region. Collectively, these applications provide WLCI planners and managers with powerful tools for better understanding the distributions of wildlife species and potential alternatives for energy development.
A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data
NASA Astrophysics Data System (ADS)
Li, Z.; Hodgson, M.; Li, W.
2016-12-01
Light detection and ranging (LiDAR) technologies have proven efficient for quickly obtaining very detailed Earth surface data over large spatial extents. Such data are important for scientific discoveries in the Earth and ecological sciences and for natural disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to both data intensity and computational intensity. Previous studies achieved notable success in parallel processing of LiDAR data to address these challenges. However, these studies either relied on high-performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, the framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework are evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools; and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe that the proposed framework provides valuable references for developing a collaborative cyberinfrastructure for processing big Earth science data in a highly scalable environment.
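The tile-based index can be sketched in a few lines: each point maps to the key of the fixed-size tile containing it, and in a Hadoop workflow that key groups points so each tile can be processed in parallel. The tile size here is an arbitrary placeholder.

```python
import math

def tile_key(x, y, tile_size=500.0):
    """Tile-based spatial index sketch: map a LiDAR point to the key of the
    fixed-size tile containing it. In a MapReduce workflow this key becomes
    the map-phase output key, so points in one tile land on one reducer."""
    return (math.floor(x / tile_size), math.floor(y / tile_size))

def map_phase(points):
    # emit (tile, point) pairs; the framework groups them by tile for
    # parallel per-tile processing
    for x, y, z in points:
        yield tile_key(x, y), (x, y, z)
```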
MilxXplore: a web-based system to explore large imaging datasets
Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J
2013-01-01
Objective As large-scale medical imaging studies become more common, there is an increasing reliance on automated software to extract quantitative information from these images. As cohort sizes keep increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. Materials and methods MilxXplore is an open source visualization platform which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user-friendly, collaborative and efficient way. Discussion Compared to existing software solutions that often provide an overview of the results at the subject level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparison of the results against the rest of the population. Conclusions MilxXplore is fast and flexible, allows remote quality checks of processed imaging data, facilitates data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important for sharing and publishing results of imaging analysis. PMID:23775173
Efficient design of multituned transmission line NMR probes: the electrical engineering approach.
Frydel, J A; Krzystyniak, M; Pienkowski, D; Pietrzak, M; de Sousa Amadeu, N; Ratajczyk, T; Idzik, K; Gutmann, T; Tietze, D; Voigt, S; Fenn, A; Limbach, H H; Buntkowsky, G
2011-01-01
Transmission line-based multi-channel solid-state NMR probes have many advantages regarding the cost of construction, number of RF channels, and achievable RF power levels. Nevertheless, these probes are only rarely employed in solid-state NMR labs, mainly owing to the difficult experimental determination of the necessary RF parameters. Here, the efficient design of multi-channel solid-state MAS-NMR probes employing transmission line theory and modern techniques of electrical engineering is presented. As a technical realization, a five-channel (¹H, ³¹P, ¹³C, ²H and ¹⁵N) probe for operation at 7 Tesla is described. The very cost-efficient design is a multi-port, single-coil transmission line probe based on the design developed by Schaefer and McKay. The electrical performance of the probe is determined by measuring scattering matrix parameters (S-parameters) at particular input/output ports. These parameters are compared to the calculated parameters of the design employing the S-matrix formalism. It is shown that the S-matrix formalism provides an excellent tool for the examination of transmission line probes and thus for the rational design of these probes. Moreover, the resulting design provides excellent electrical performance. From the point of view of Nuclear Magnetic Resonance (NMR), calibration spectra for the particular ports (channels) are of great importance. The estimation of the π/2 pulse lengths for all five NMR channels is presented. Copyright © 2011 Elsevier Inc. All rights reserved.
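As a small worked example of the S-parameter quantities involved, the reflection coefficient looking into a port follows directly from the port impedance and the reference line impedance; the impedance value below is illustrative.

```python
import math

def s11(z_in, z0=50.0):
    """Reflection coefficient (the S11 scattering parameter) looking into a
    probe port with input impedance z_in, referenced to a z0 = 50 ohm line;
    this is the kind of quantity compared between measurement and the
    transmission line calculation when examining a multituned probe."""
    return (z_in - z0) / (z_in + z0)

# a port tuned close to 50 ohm reflects little power (illustrative impedance)
g = s11(complex(48.0, 3.0))
return_loss_db = -20 * math.log10(abs(g))   # larger value = better match
```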
From field notes to data portal - An operational QA/QC framework for tower networks
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.
2016-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
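The automated half of such a framework typically starts with simple per-sample tests whose flags a reviewer later confirms or overrides visually. A minimal sketch, with thresholds as placeholder assumptions:

```python
def auto_flags(series, lo, hi, max_step):
    """Sketch of the automated-test half of a tower QA/QC framework: a
    plausible-range test and a spike (step) test produce per-sample flags
    that a reviewer can later confirm or override in a visual check."""
    flags = []
    prev = None
    for v in series:
        bad_range = not (lo <= v <= hi)
        bad_spike = prev is not None and abs(v - prev) > max_step
        flags.append("suspect" if (bad_range or bad_spike) else "ok")
        prev = v
    return flags

# e.g. an air temperature stream with one spike (thresholds are assumptions)
f = auto_flags([12.1, 12.3, 25.9, 12.4], lo=-40, hi=50, max_step=5)
```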
Sadick, Maliha; Dally, Franz Josef; Schönberg, Stefan O; Stroszczynski, Christian; Wohlgemuth, Walter A
2017-10-01
Background Radiology is an interdisciplinary field dedicated to the diagnosis and treatment of numerous diseases and is involved in the development of multimodal treatment concepts. Method Interdisciplinary case management, a broad spectrum of diagnostic imaging facilities and dedicated endovascular radiological treatment options are valuable tools that allow radiology to set up an interdisciplinary center for vascular anomalies. Results Image-based diagnosis combined with endovascular treatment options is an essential tool for the treatment of patients with highly complex vascular diseases. These vascular anomalies can affect numerous parts of the body so that a multidisciplinary treatment approach is required for optimal patient care. Conclusion This paper discusses the possibilities and challenges regarding effective and efficient patient management in connection with the formation of an interdisciplinary center for vascular anomalies with strengthening of the clinical role of radiologists. Key points:
· Vascular anomalies, which include vascular tumors and malformations, are complex to diagnose and treat.
· There are far more patients with vascular anomalies requiring therapy than interdisciplinary centers for vascular anomalies - there is currently a shortage of dedicated interdisciplinary centers for vascular anomalies in Germany that can provide dedicated care for affected patients.
· Radiology includes a broad spectrum of diagnostic and minimally invasive therapeutic tools which allow the formation of an interdisciplinary center for vascular anomalies for effective, efficient and comprehensive patient management.
Citation Format: Sadick M, Dally FJ, Schönberg SO et al. Strategies in Interventional Radiology: Formation of an Interdisciplinary Center of Vascular Anomalies - Chances and Challenges for Effective and Efficient Patient Management. Fortschr Röntgenstr 2017; 189: 957-966. © Georg Thieme Verlag KG Stuttgart · New York.
Design and Control of Modular Spine-Like Tensegrity Structures
NASA Technical Reports Server (NTRS)
Mirletz, Brian T.; Park, In-Won; Flemons, Thomas E.; Agogino, Adrian K.; Quinn, Roger D.; SunSpiral, Vytas
2014-01-01
We present a methodology enabled by the NASA Tensegrity Robotics Toolkit (NTRT) for the rapid structural design of tensegrity robots in simulation and an approach for developing control systems using central pattern generators, local impedance controllers, and parameter optimization techniques to determine effective locomotion strategies for the robot. Biomimetic tensegrity structures provide advantageous properties to robotic locomotion and manipulation tasks, such as their adaptability and force distribution properties, flexibility, energy efficiency, and access to extreme terrains. While strides have been made in designing insightful static biotensegrity structures, gaining a clear understanding of how a particular structure can efficiently move has been an open problem. The tools in the NTRT enable the rapid exploration of the dynamics of a given morphology, and the links between structure, controllability, and resulting gait efficiency. To highlight the effectiveness of the NTRT at this exploration of morphology and control, we will provide examples from the designs and locomotion of four different modular spine-like tensegrity robots.
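A central pattern generator of the general kind described can be sketched as a chain of phase oscillators coupled with a fixed lag, producing a traveling wave along the spine; this is a generic formulation, not the controllers implemented in the NTRT.

```python
import numpy as np

def cpg_step(phases, dt=0.01, omega=2*np.pi, coupling=1.0, lag=np.pi/2):
    """One integration step of a simple phase-oscillator central pattern
    generator: each segment's oscillator is coupled to its neighbors with a
    fixed phase lag, so the chain settles into a traveling wave."""
    n = len(phases)
    dphi = np.full(n, omega)
    for i in range(n):
        if i > 0:
            dphi[i] += coupling * np.sin(phases[i-1] - phases[i] + lag)
        if i < n - 1:
            dphi[i] += coupling * np.sin(phases[i+1] - phases[i] - lag)
    return phases + dt * dphi

# actuator targets for a hypothetical 4-segment tensegrity spine
phases = np.zeros(4)
for _ in range(1000):
    phases = cpg_step(phases)
targets = 0.5 + 0.1 * np.sin(phases)   # rest-length commands (illustrative scaling)
```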
Using RFID Positioning Technology to Construct an Automatic Rehabilitation Scheduling Mechanism.
Wang, Ching-Sheng; Hung, Lun-Ping; Yen, Neil Y
2016-01-01
Accurately and efficiently identifying the location of patients during the course of rehabilitation is an important issue, and wireless transmission technology can achieve this goal. Tracking technologies such as RFID (radio frequency identification) can support process improvement and improve the efficiency of rehabilitation. However, few published models or methods address the positioning problem and apply this technology in rehabilitation centers. We propose a mechanism that enhances the accuracy of positioning technology, provides information about turns and obstacles on the path, and delivers user-centered, location-aware services for enhanced quality of care in the rehabilitation environment. This paper outlines the requirements and the role of RFID in assisting the rehabilitation environment. A prototype RFID hospital support tool is established, designed to provide assistance in monitoring rehabilitation patients. It can simultaneously calculate the rehabilitant's location and the duration of treatment, and automatically record the rehabilitant's rehabilitation course, so as to improve the management efficiency of the rehabilitation program.
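One plausible reading of such a mechanism is a LANDMARC-style scheme, in which the patient tag's signal strengths are compared with fixed reference tags at known positions; this is an assumption for illustration, as the abstract does not spell out the algorithm.

```python
def locate(patient_rssi, reference_tags, k=3):
    """LANDMARC-style sketch (an assumption, not necessarily the paper's
    mechanism): compare the patient tag's RSSI vector, as seen by the
    readers, with fixed reference tags at known positions, and average
    the positions of the k nearest-matching references."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(reference_tags,
                     key=lambda t: dist(patient_rssi, t["rssi"]))[:k]
    x = sum(t["pos"][0] for t in nearest) / k
    y = sum(t["pos"][1] for t in nearest) / k
    return x, y

# hypothetical reference tags seen by two readers (RSSI in dBm)
refs = [{"pos": (0, 0), "rssi": [-60, -72]},
        {"pos": (4, 0), "rssi": [-71, -61]},
        {"pos": (0, 4), "rssi": [-65, -80]},
        {"pos": (4, 4), "rssi": [-75, -66]}]
estimate = locate([-63, -70], refs)
```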
Ordinary kriging as a tool to estimate historical daily streamflow records
Farmer, William H.
2016-01-01
Efficient and responsible management of water resources relies on accurate streamflow records. However, many watersheds are ungaged, limiting the ability to assess and understand local hydrology. Several tools have been developed to alleviate this data scarcity, but few provide continuous daily streamflow records at individual streamgages within an entire region. Building on the history of hydrologic mapping, ordinary kriging was extended to predict daily streamflow time series on a regional basis. Pooling parameters to estimate a single, time-invariant characterization of spatial semivariance structure is shown to produce accurate reproduction of streamflow. This approach is contrasted with a time-varying series of variograms, representing the temporal evolution and behavior of the spatial semivariance structure. Furthermore, the ordinary kriging approach is shown to produce more accurate time series than more common, single-index hydrologic transfers. A comparison between topological kriging and ordinary kriging is less definitive, showing the ordinary kriging approach to be significantly inferior in terms of Nash–Sutcliffe model efficiencies while maintaining significantly superior performance measured by root mean squared errors. Given the similarity of performance and the computational efficiency of ordinary kriging, it is concluded that ordinary kriging is useful for first-order approximation of daily streamflow time series in ungaged watersheds.
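A minimal sketch of the ordinary kriging estimate with a single pooled, time-invariant variogram follows; the exponential model and its sill and range are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

def ordinary_kriging(xy_gaged, z_gaged, xy_target, sill=1.0, rng=50.0):
    """Ordinary kriging sketch with a pooled, time-invariant exponential
    variogram (sill and range are placeholders): solve the kriging system
    for weights, then combine the observed streamflows."""
    def gamma(h):                       # exponential semivariogram
        return sill * (1.0 - np.exp(-3.0 * h / rng))
    n = len(xy_gaged)
    d = np.linalg.norm(xy_gaged[:, None, :] - xy_gaged[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0                       # Lagrange multiplier row/column
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy_gaged - xy_target, axis=1))
    w = np.linalg.solve(A, b)[:n]       # kriging weights (sum to 1)
    return w @ z_gaged                  # estimated streamflow at the ungaged site

# three hypothetical gages (coordinates in km) and an ungaged target site
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
q_est = ordinary_kriging(xy, np.array([5.2, 4.8, 6.1]), np.array([4.0, 4.0]))
```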
Exploiting Software Tool Towards Easier Use And Higher Efficiency
NASA Astrophysics Data System (ADS)
Lin, G. H.; Su, J. T.; Deng, Y. Y.
2006-08-01
In developing countries, making maximum use of data from domestically built instruments is very important. It relates not only to maximizing the science return on early-stage investment, with its deep accumulation of experience in every aspect, but also to overall science output. Based on this idea, we are developing a software tool (THDP: Tool of Huairou Data Processing). It addresses a series of issues commonly encountered in data processing. This paper discusses its design purpose, functions, methods, and special features. The primary vehicle for general data interpretation is various techniques of data visualization and interaction. In the software we employed an object-oriented approach, which is well suited to this purpose: the software must provide not only functionality, but do so in as convenient a fashion as possible. As a result, data processing becomes easier for beginners to learn and more convenient for experienced users to extend, and efficiency increases greatly in every phase, including analysis, parameter adjustment, and result display. Within the virtual observatory framework, developing countries should study more new related technologies that can advance the capability and efficiency of scientific research, like the software we are developing.
SkZpipe: A Python3 module to produce efficiently PSF-fitting photometry with DAOPHOT, and much more
NASA Astrophysics Data System (ADS)
Mauro, F.
2017-07-01
In an era characterized by big sky surveys and the availability of large amounts of photometric data, it is important for astronomers to have tools to process their data in an efficient, accurate and easy way, minimizing reduction time. We present SkZpipe, a Python3 module designed mainly to process generic data, performing point-spread function (PSF) fitting photometry with the DAOPHOT suite (Stetson 1987). The software has already demonstrated its accuracy and efficiency through its adaptation, the VVV-SkZ_pipeline (Mauro et al. 2013), for the "VISTA Variables in the Vía Láctea" ESO survey, showing how it can stand in for the user, avoiding repetitive interaction in all operations while retaining the power and accuracy of the DAOPHOT suite and relieving astronomers of the burden of data processing. The software provides not only a pipeline but also all the tools to easily run each atomic step of the photometric procedure, to match the results, and to retrieve information from FITS headers and the internal instrumental database. We plan to add support for other photometry software in the future.
Evaluation of a Dispatcher's Route Optimization Decision Aid to Avoid Aviation Weather Hazards
NASA Technical Reports Server (NTRS)
Dorneich, Michael C.; Olofinboba, Olu; Pratt, Steve; Osborne, Dannielle; Feyereisen, Thea; Latorella, Kara
2003-01-01
This document describes the results and analysis of the formal evaluation plan for the Honeywell software tool developed under the NASA AWIN (Aviation Weather Information) 'Weather Avoidance using Route Optimization as a Decision Aid' project. The software tool aims to provide airline dispatchers with a decision aid for selecting optimal routes that avoid weather and other hazards. This evaluation compares and contrasts route selection performance with the AWIN tool to that of subjects using a more traditional dispatcher environment. The evaluation assesses gains in safety, in fuel efficiency of planned routes, and in time efficiency in the pre-flight dispatch process through the use of the AWIN decision aid. In addition, we are interested in how this AWIN tool affects constructs that can be related to performance. The constructs of situation awareness (SA), workload, trust in an information system, and operator acceptance are assessed using established scales, where these exist, as well as through the evaluation of questionnaire responses and subject comments. The intention of the experiment is to set up a simulated operations area for the dispatchers to work in. They will be given scenarios in which they are presented with stored company routes for a particular city-pair and aircraft type. A diverse set of external weather information sources is represented by a stand-alone display (MOCK), containing the actual historical weather data typically used by dispatchers. There is also the possibility of presenting selected weather data on the route visualization tool. The company routes have not been modified to avoid the weather except in the case of one additional route generated by the Honeywell prototype flight planning system. The dispatcher will be required to choose the most appropriate and efficient flight plan route in the displayed weather conditions. The route may be modified manually or may be chosen from those automatically displayed.
NASA Astrophysics Data System (ADS)
Al Zayed, Islam Sabry; Elagib, Nadir Ahmed
2017-12-01
This study proposes a novel monitoring tool based on Satellite Remote Sensing (SRS) data to examine the status of water distribution and Water Use Efficiency (WUE) under changing water policies in large-scale and complex irrigation schemes. The aim is to improve our understanding of the water-food nexus in such schemes. With a special reference to the Gezira Irrigation Scheme (GeIS) in Sudan during the period 2000-2014, the tool devised herein is well suited for cases where validation data are absent. First, it introduces an index, referred to as the Crop Water Consumption Index (CWCI), to assess the efficiency of water policies. The index is defined as the ratio of actual evapotranspiration (ETa) over agricultural areas to total ETa for the whole scheme where ETa is estimated using the Simplified Surface Energy Balance model (SSEB). Second, the tool uses integrated Normalized Difference Vegetation Index (iNDVI), as a proxy for crop productivity, and ETa to assess the WUE. Third, the tool uses SSEB ETa and NDVI in an attempt to detect wastage of water. Four key results emerged from this research as follows: 1) the WUE has not improved despite the changing agricultural and water policies, 2) the seasonal ETa can be used to detect the drier areas of GeIS, i.e. areas with poor irrigation water supply, 3) the decreasing trends of CWCI, slope of iNDVI-ETa linear regression and iNDVI are indicative of inefficient utilization of irrigation water in the scheme, and 4) it is possible to use SSEB ETa and NDVI to identify channels with spillover problems and detect wastage of rainwater that is not used as a source for irrigation. In conclusion, the innovative tool developed herein has provided important information on the efficiency of a large-scale irrigation scheme to help rationalize laborious water management processes and increase productivity.
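Because CWCI is defined explicitly as a ratio of ETa totals, it reduces to a one-line computation; the numbers below are illustrative, not values from the Gezira analysis.

```python
def cwci(eta_agricultural, eta_total):
    """Crop Water Consumption Index as defined in the study: the ratio of
    actual evapotranspiration over agricultural areas to total ETa for the
    whole scheme. Values well below 1, or decreasing trends, point to water
    consumed outside crops and hence inefficient utilization."""
    return eta_agricultural / eta_total

# illustrative ETa totals (e.g. mm over a season), not Gezira values
index = cwci(620.0, 940.0)   # ~0.66 of scheme-wide ET consumed by crops
```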
Adaptive bill morphology for enhanced tool manipulation in New Caledonian crows
Matsui, Hiroshi; Hunt, Gavin R.; Oberhofer, Katja; Ogihara, Naomichi; McGowan, Kevin J.; Mithraratne, Kumar; Yamasaki, Takeshi; Gray, Russell D.; Izawa, Ei-Ichi
2016-01-01
Early increased sophistication of human tools is thought to be underpinned by adaptive morphology for efficient tool manipulation. Such adaptive specialisation is unknown in nonhuman primates but may have evolved in the New Caledonian crow, which has sophisticated tool manufacture. The straightness of its bill, for example, may be adaptive for enhanced visually-directed use of tools. Here, we examine in detail the shape and internal structure of the New Caledonian crow’s bill using Principal Components Analysis and Computed Tomography within a comparative framework. We found that the bill has a combination of interrelated shape and structural features unique within Corvus, and possibly birds generally. The upper mandible is relatively deep and short with a straight cutting edge, and the lower mandible is strengthened and upturned. These novel combined attributes would be functional for (i) counteracting the unique loading patterns acting on the bill when manipulating tools, (ii) a strong precision grip to hold tools securely, and (iii) enhanced visually-guided tool use. Our findings indicate that the New Caledonian crow’s innovative bill has been adapted for tool manipulation to at least some degree. Early increased sophistication of tools may require the co-evolution of morphology that provides improved manipulatory skills. PMID:26955788
Overview of the H.264/AVC video coding standard
NASA Astrophysics Data System (ADS)
Luthra, Ajay; Topiwala, Pankaj N.
2003-11-01
H.264/MPEG-4 AVC is the latest coding standard jointly developed by the Video Coding Experts Group (VCEG) of ITU-T and Moving Picture Experts Group (MPEG) of ISO/IEC. It uses state of the art coding tools and provides enhanced coding efficiency for a wide range of applications including video telephony, video conferencing, TV, storage (DVD and/or hard disk based), streaming video, digital video creation, digital cinema and others. In this paper an overview of this standard is provided. Some comparisons with the existing standards, MPEG-2 and MPEG-4 Part 2, are also provided.
NASA Technical Reports Server (NTRS)
Phillips, Shaun
1996-01-01
This report describes the Graphical Observation Scheduling System (GROSS) and its functionality and editing capabilities. The GROSS system was developed as a replacement for a suite of existing programs and associated processes, with the aim of providing a software tool that combines the functionality of several of the existing programs and provides a Graphical User Interface (GUI) that gives greater data visibility and editing capabilities. The improved editing capability provided by this approach enhanced the efficiency of mission planning for the second astronomical Spacelab mission (ASTRO-2).
Peregrine Sustainer Motor Development
NASA Technical Reports Server (NTRS)
Brodell, Chuck; Franklin, Philip
2015-01-01
The Peregrine sounding rocket is an in-house NASA design that provides approximately 15 percent better performance than the motor it replaces. The design utilizes common materials and well-characterized architecture to reduce flight issues encountered with the current motors. It engages NASA designers, analysts, test engineers and technicians, ballisticians, and systems engineers. The in-house work and collaboration within the government provide flexibility to efficiently accommodate design and program changes as the design matures, and enhance the ability to meet schedule milestones. The effort provides a valuable tool to compare industry costs and develop contracts, and it builds foundational knowledge for the next generation of NASA engineers.
Quality and Efficiency Improvement Tools for Every Radiologist.
Kudla, Alexei U; Brook, Olga R
2018-06-01
In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Suárez Álvarez, Óscar; Fernández-Feito, Ana; Vallina Crespo, Henar; Aldasoro Unamuno, Elena; Cofiño, Rafael
2018-05-11
It is essential to develop a comprehensive approach to institutionally promoted interventions to assess their impact on health from the perspective of the social determinants of health and equity. Simple, adapted tools must be developed to carry out these assessments. The aim of this paper is to present two tools to assess the impact of programmes and community-based interventions on the social determinants of health. The first tool is intended to assess health programmes through interviews and analysis of information provided by the assessment team. The second tool, by means of online assessments of community-based interventions, also enables a report on inequality issues that includes recommendations for improvement. In addition to reducing health-related social inequities, the implementation of these tools can also help to improve the efficiency of public health interventions. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia
2012-01-01
The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools. PMID:23201980
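To make the covariate-to-hazard step concrete, here is a minimal sketch assuming a simple RMS condition indicator in place of the paper's wavelet-packet features, and an arbitrary constant baseline covariate function; all names and scales are hypothetical, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_feature(window):
    # Stand-in condition indicator; the paper extracts wavelet-packet features
    # and selects sensitive ones with an improved distance evaluation technique.
    return np.sqrt(np.mean(window ** 2))

# Simulated vibration records over tool life: amplitude grows with wear.
epochs = np.arange(50)
z = np.array([rms_feature(rng.normal(0.0, 1.0 + 0.05 * k, 2048)) for k in epochs])

# Proportional covariate model: the covariate is proportional to the failure
# rate, z(t) = C(t) * h(t), so h(t) = z(t) / C(t) once the baseline covariate
# function C(t) is fitted from a small historical sample (constant here, with
# a hypothetical scale).
C = np.full_like(z, z[:5].mean() / 1e-3)
h = z / C
print(h[0], h[-1])   # failure-rate estimate rises as the tool wears
```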
FLaapLUC: A pipeline for the generation of prompt alerts on transient Fermi-LAT γ-ray sources
NASA Astrophysics Data System (ADS)
Lenain, J.-P.
2018-01-01
The large majority of high energy sources detected with Fermi-LAT are blazars, which are known to be very variable sources. Since high-cadence, long-term monitoring simultaneously at different wavelengths is prohibitive, the study of their transient activities can help shed light on our understanding of these objects. The early detection of such potentially fast transient events is the key to triggering follow-up observations at other wavelengths. A Python tool, FLaapLUC, built on top of the Science Tools provided by the Fermi Science Support Center and the Fermi-LAT collaboration, has been developed using a simple aperture photometry approach. This tool can effectively detect relative flux variations in a set of predefined sources and alert potential users. Such alerts can then be used to trigger target of opportunity observations with other facilities. It is shown that FLaapLUC is an efficient tool to reveal transient events in Fermi-LAT data, providing quick results which can be used to promptly organise follow-up observations. Results from this simple aperture photometry method are also compared to full likelihood analyses. The FLaapLUC package is made available on GitHub and is open to contributions by the community.
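FLaapLUC's actual implementation lives on GitHub; the sketch below only illustrates the underlying detection idea, flagging light-curve bins that deviate from a robust long-term baseline by an assumed significance threshold.

```python
import numpy as np

def flag_flares(flux, flux_err, n_sigma=3.0):
    """Flag bins that exceed a robust baseline by n_sigma, combining the
    light curve's intrinsic scatter with each bin's own uncertainty."""
    baseline = np.median(flux)
    scatter = 1.4826 * np.median(np.abs(flux - baseline))  # robust sigma (MAD)
    return flux > baseline + n_sigma * np.sqrt(scatter**2 + flux_err**2)

# Toy weekly-binned light curve with one flaring bin.
flux = np.array([1.0, 1.1, 0.9, 1.0, 4.2, 1.0])
err = np.full_like(flux, 0.2)
print(flag_flares(flux, err))   # -> only the 4.2 bin is flagged
```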
Muellner, Ulrich J; Vial, Flavie; Wohlfender, Franziska; Hadorn, Daniela; Reist, Martin; Muellner, Petra
2015-01-01
The reporting of outputs from health surveillance systems should be done in a near real-time and interactive manner in order to provide decision makers with powerful means to identify, assess, and manage health hazards as early and efficiently as possible. While this is currently rarely the case in veterinary public health surveillance, reporting tools do exist for the visual exploration and interactive interrogation of health data. In this work, we used tools freely available from the Google Maps and Charts libraries to develop a web application reporting health-related data derived from slaughterhouse surveillance and from a newly established web-based equine surveillance system in Switzerland. Both sets of tools allowed entry-level usage with minimal or no programming skills, while being flexible enough to cater for more complex scenarios for users with greater programming skills. In particular, interfaces linking statistical software and Google tools provide additional analytical functionality (such as algorithms for the detection of unusually high case occurrences) for inclusion in the reporting process. We show that such powerful approaches could improve the timely dissemination and communication of technical information to decision makers and other stakeholders, and could foster the early-warning capacity of animal health surveillance systems.
A web-based rapid assessment tool for production publishing solutions
NASA Astrophysics Data System (ADS)
Sun, Tong
2010-02-01
Solution assessment is a critical first step in understanding and measuring the business process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually a very expensive and time-consuming task which involves extensive domain knowledge, collecting and understanding the specific customer operational context, defining validation scenarios, and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report for various estimated performance metrics (e.g. throughput, turnaround time, resource utilization) and operational cost. By integrating the digital publishing workflow ontology and an activity-based costing model with a Petri-net based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side-by-side within their desired operational contexts, and provides a low-cost and rapid assessment for organizations before committing to any purchase. This tool also benefits solution providers by shortening sales cycles, establishing trustworthy customer relationships, and supplementing professional assessment services with a proven quantitative simulation and estimation technology.
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
Entry one: striving for best practice in professional assessment.
English, Justin M; Mykyta, Lu
2002-01-01
The aim of this study was to develop a best practice model of professional assessment to ensure efficient and effective delivery of home-based services to frail and disabled elders. In 2000, an innovative model of professional assessment was introduced by one of Australia's largest providers of home-based care in order to reduce multiple assessments and to reduce the utilisation of assessment as a gatekeeping tool for limiting access to services. Data were analysed from a random sample of 1500 clients drawn from a population of 5000, as well as from a survey tool administered to the organisation's assessment staff and other key stakeholders. Results revealed that, contrary to popular belief, carer advocacy plays a significant role in the professional assessment process, to the point that clients with carers received significantly more services and service time than clients without such support. However, if not monitored, assessment can also be used as a gatekeeping tool as opposed to one that can provide significant benefits to consumers through comprehensive need articulation. We argue that the "professional" approach does not preclude empowerment and that assessment should not be used as a gatekeeping tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khaleel, Mohammad A.; Lin, Zijing; Singh, Prabhakar
2004-05-03
A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed robust and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking as input the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rate and the total stack output voltage or current. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and EC is for MARC® to supply the temperature field to EC and for EC to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristics of the simulation tool.
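A minimal sketch of such a loosely coupled, iterated-to-self-consistency scheme follows, with toy closures standing in for MARC® and the EC module; both functions and all numbers are hypothetical, since the real codes solve full 3D thermal and electrochemical fields.

```python
import numpy as np

def thermal_solver(q):            # "MARC" stand-in: T rises with heat generation
    return 800.0 + 0.05 * q

def ec_module(T):                 # "EC" stand-in: reaction heat grows mildly with T
    return 2000.0 * (1.0 + 1e-4 * (T - 800.0))

q = np.full(10, 2000.0)           # initial heat-generation guess (toy units)
for it in range(100):
    T = thermal_solver(q)         # thermal step: flow/thermal analysis
    q_new = ec_module(T)          # EC step: current density, species, heat
    if np.max(np.abs(q_new - q)) < 1e-6:
        break                     # self-consistent steady state reached
    q = 0.5 * q + 0.5 * q_new     # under-relaxation aids numerical stability
print(it, T[0], q[0])
```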
Aligator: A computational tool for optimizing total chemical synthesis of large proteins.
Jacobsen, Michael T; Erickson, Patrick W; Kay, Michael S
2017-09-15
The scope of chemical protein synthesis (CPS) continues to expand, driven primarily by advances in chemical ligation tools (e.g., reversible solubilizing groups and novel ligation chemistries). However, the design of an optimal synthesis route can be an arduous and fickle task due to the large number of theoretically possible, and in many cases problematic, synthetic strategies. In this perspective, we highlight recent CPS tool advances and then introduce a new and easy-to-use program, Aligator (Automated Ligator), for analyzing and designing the most efficient strategies for constructing large targets using CPS. As a model set, we selected the E. coli ribosomal proteins and associated factors for computational analysis. Aligator systematically scores and ranks all feasible synthetic strategies for a particular CPS target. The Aligator script methodically evaluates potential peptide segments for a target using a scoring function that includes solubility, ligation site quality, segment lengths, and number of ligations to provide a ranked list of potential synthetic strategies. We demonstrate the utility of Aligator by analyzing three recent CPS projects from our lab: TNFα (157 aa), GroES (97 aa), and DapA (312 aa). As the limits of CPS are extended, we expect that computational tools will play an increasingly important role in the efficient execution of ambitious CPS projects such as production of a mirror-image ribosome. Copyright © 2017 Elsevier Ltd. All rights reserved.
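As an illustration of this kind of scoring-and-ranking search (not Aligator's actual algorithm), the sketch below enumerates cut points in a toy sequence and ranks strategies; the weights and junction-quality table are assumptions, whereas Aligator's real scoring function covers solubility, ligation-site quality, segment lengths, and ligation count in far more detail.

```python
from itertools import combinations

# Hypothetical junction-quality table: residue immediately before the cut.
SITE_QUALITY = {'A': 1.0, 'G': 0.8, 'S': 0.6}

def score(seq, cuts):
    segs = [seq[i:j] for i, j in zip((0, *cuts), (*cuts, len(seq)))]
    length_penalty = sum(abs(len(s) - 40) for s in segs)  # prefer ~40-aa segments
    site_bonus = sum(10.0 * SITE_QUALITY.get(seq[c], 0.1) for c in cuts)
    return site_bonus - length_penalty - 5.0 * len(cuts)  # fewer ligations better

seq = ('MKTA' * 30)[:115]                       # toy 115-aa target sequence
sites = [i for i, aa in enumerate(seq) if aa in SITE_QUALITY and 20 < i < 95]
strategies = [c for n in (1, 2) for c in combinations(sites, n)]
best = max(strategies, key=lambda cuts: score(seq, cuts))
print(best, round(score(seq, best), 1))         # highest-ranked cut strategy
```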
AZTEC. Parallel Iterative method Software for Solving Linear Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, S.; Shadid, J.; Tuminaro, R.
1995-07-01
AZTEC is an iterative-solver library that greatly simplifies the parallelization process when solving the linear system of equations Ax=b, where A is a user-supplied n x n sparse matrix, b is a user-supplied vector of length n, and x is a vector of length n to be computed. AZTEC is intended as a software tool for users who want to avoid cumbersome parallel programming details but who have large sparse linear systems that require an efficiently utilized parallel processing system. A collection of data transformation tools is provided that allows for easy creation of distributed sparse unstructured matrices for parallel solution.
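For readers unfamiliar with the problem class, the following sketch solves a sparse Ax=b with a Krylov method in SciPy; it illustrates the kind of system AZTEC targets at much larger, distributed scale, but uses SciPy's API purely for illustration, not AZTEC's own interface.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# A diagonally dominant tridiagonal test matrix stands in for a user-supplied
# unstructured sparse matrix.
n = 10_000
A = diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format='csr')
b = np.ones(n)

x, info = cg(A, b)                    # conjugate gradient; info == 0 means converged
print(info, np.linalg.norm(A @ x - b))
```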
Citizens unite for computational immunology!
Belden, Orrin S; Baker, Sarah Catherine; Baker, Brian M
2015-07-01
Recruiting volunteers who can provide computational time, programming expertise, or puzzle-solving talent has emerged as a powerful tool for biomedical research. Recent projects demonstrate the potential for such 'crowdsourcing' efforts in immunology. Tools for developing applications, new funding opportunities, and an eager public make crowdsourcing a serious option for creative solutions for computationally-challenging problems. Expanded uses of crowdsourcing in immunology will allow for more efficient large-scale data collection and analysis. It will also involve, inspire, educate, and engage the public in a variety of meaningful ways. The benefits are real - it is time to jump in! Copyright © 2015 Elsevier Ltd. All rights reserved.
Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)
NASA Technical Reports Server (NTRS)
Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.
2003-01-01
A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.
S3D: An interactive surface grid generation tool
NASA Technical Reports Server (NTRS)
Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David
1992-01-01
S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.
A high order approach to flight software development and testing
NASA Technical Reports Server (NTRS)
Steinbacher, J.
1981-01-01
The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.
High throughput SNP discovery and genotyping in hexaploid wheat.
Rimbert, Hélène; Darrier, Benoît; Navarro, Julien; Kitt, Jonathan; Choulet, Frédéric; Leveugle, Magalie; Duarte, Jorge; Rivière, Nathalie; Eversole, Kellye; Le Gouis, Jacques; Davassi, Alessandro; Balfourier, François; Le Paslier, Marie-Christine; Berard, Aurélie; Brunel, Dominique; Feuillet, Catherine; Poncet, Charles; Sourdille, Pierre; Paux, Etienne
2018-01-01
Because of their abundance and their amenability to high-throughput genotyping techniques, Single Nucleotide Polymorphisms (SNPs) are powerful tools for efficient genetics and genomics studies, including characterization of genetic resources, genome-wide association studies and genomic selection. In wheat, most previous SNP discovery initiatives targeted the coding fraction, leaving almost 98% of the wheat genome largely unexploited. Here we report on the use of whole-genome resequencing data from eight wheat lines to mine for SNPs in the genic, repetitive and non-repetitive intergenic fractions of the wheat genome. In total, we identified 3.3 million SNPs, 49% located on the B-genome, 41% on the A-genome and 10% on the D-genome. We also describe the development of the TaBW280K high-throughput genotyping array containing 280,226 SNPs. Performance of this chip was examined by genotyping a set of 96 wheat accessions representing worldwide diversity. Sixty-nine percent of the SNPs can be efficiently scored, half of them showing a diploid-like clustering. The TaBW280K was proven to be a very efficient tool for diversity analyses, as well as for breeding, as it can discriminate between closely related elite varieties. Finally, the TaBW280K array was used to genotype a population derived from a cross between Chinese Spring and Renan, leading to the construction of a dense genetic map comprising 83,721 markers. The results described here will provide the wheat community with powerful tools for both basic and applied research.
Shallow aquifer storage and recovery (SASR): Initial findings from the Willamette Basin, Oregon
NASA Astrophysics Data System (ADS)
Neumann, P.; Haggerty, R.
2012-12-01
A novel mode of shallow aquifer management could increase the volumetric potential and distribution of groundwater storage. We refer to this mode as shallow aquifer storage and recovery (SASR) and gauge its potential as a freshwater storage tool. By this mode, water is stored in hydraulically connected aquifers with minimal impact to surface water resources. Basin-scale numerical modeling provides a linkage between storage efficiency and hydrogeological parameters, which in turn guides rulemaking for how and where water can be stored. Increased understanding of regional groundwater-surface water interactions is vital to effective SASR implementation. In this study we (1) use a calibrated model of the central Willamette Basin (CWB), Oregon to quantify SASR storage efficiency at 30 locations; (2) estimate SASR volumetric storage potential throughout the CWB based on these results and pertinent hydrogeological parameters; and (3) introduce a methodology for management of SASR by such parameters. Of 3 shallow, sedimentary aquifers in the CWB, we find the moderately conductive, semi-confined, middle sedimentary unit (MSU) to be most efficient for SASR. We estimate that users overlying 80% of the area in this aquifer could store injected water with greater than 80% efficiency, and find efficiencies of up to 95%. As a function of local production well yields, we estimate a maximum annual volumetric storage potential of 30 million m3 using SASR in the MSU. This volume constitutes roughly 9% of the current estimated summer pumpage in the Willamette basin at large. The dimensionless quantity lag #—calculated using modeled specific capacity, distance to nearest in-layer stream boundary, and injection duration—exhibits relatively high correlation to SASR storage efficiency at potential locations in the CWB. This correlation suggests that basic field measurements could guide SASR as an efficient shallow aquifer storage tool.
A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology
NASA Technical Reports Server (NTRS)
Hoy, Scott D.; Figueiredo, Marco A.
2006-01-01
Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools as proposed by the BEE VHDL 2004, which allows for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL Fixed Point Reconfigurable Radio Component library does not require the use of FPGA vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in Mathworks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.
Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding
NASA Astrophysics Data System (ADS)
Güpner, Michael; Patschger, Andreas; Bliedtner, Jens
Conventionally manufactured tools are often completely constructed of a high-alloyed, expensive tool steel. An alternative way to manufacture tools is the combination of a cost-efficient, mild steel and a functional coating in the interaction zone of the tool. Thermal processing methods, like laser metal deposition, are always characterized by thermal distortion. The resistance against thermal distortion decreases with the reduction of the material thickness. As a consequence, special process management is necessary for the laser-based coating of thin parts or tools. The experimental approach in the present paper is to keep the energy and the mass per unit length constant by varying the laser power, the feed rate and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits and evaluate the process efficiency. Ways to optimize dilution, angular distortion and clad height are presented.
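The constant-per-unit-length constraint can be written E' = P/v and m' = mdot/v, so laser power P and powder mass flow mdot must scale linearly with feed rate v. A worked sketch with illustrative values (assumed, not the paper's parameters):

```python
# Keeping energy per unit length E' = P/v and powder mass per unit length
# m' = mdot/v constant while the feed rate v varies.
E_line = 60.0      # J/mm, assumed energy per unit length
m_line = 0.005     # g/mm, assumed powder mass per unit length

for v in (5.0, 10.0, 20.0):             # feed rate, mm/s
    P = E_line * v                       # required laser power, W
    mdot = m_line * v * 60.0             # required powder mass flow, g/min
    print(f"v={v:5.1f} mm/s  ->  P={P:6.0f} W, mdot={mdot:4.1f} g/min")
```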
Agents for Plan Monitoring and Repair
2003-04-01
events requires time and effort. In this paper, we describe how Heracles and Theseus, two information gathering and monitoring tools that we built ... on an information agent platform, called Theseus, that provides the technology for efficiently executing agents for information gathering and ... we can easily define a system for interactively planning a trip. The second is the Theseus information agent platform [Barish et al., 2000], which ...
ALMA Array Operations Group process overview
NASA Astrophysics Data System (ADS)
Barrios, Emilio; Alarcon, Hector
2016-07-01
ALMA science operations activities in Chile are the responsibility of the Department of Science Operations, which consists of three groups: the Array Operations Group (AOG), the Program Management Group (PMG) and the Data Management Group (DMG). The AOG includes the Array Operators and has the mission to provide support for science observations, operating the array safely and efficiently. The poster describes the AOG process, management and operational tools.
1988-09-01
The current prototyping tool also provides a multiversion data object control mechanism. In a real-time database system, synchronization protocols ... data in distributed real-time systems. The semantic information of read-only transactions is exploited for improved efficiency, and a multiversion ... are discussed. Index Terms: distributed system, replication, read-only transaction, consistency, multiversion.
Scaling Support Vector Machines On Modern HPC Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
You, Yang; Fu, Haohuan; Song, Shuaiwen
2015-02-01
We designed and implemented MIC-SVM, a highly efficient parallel SVM for x86 based multicore and many-core architectures, such as the Intel Ivy Bridge CPUs and Intel Xeon Phi co-processor (MIC). We propose various novel analysis methods and optimization techniques to fully utilize the multilevel parallelism provided by these architectures and serve as general optimization methods for other machine learning tools.
Natural language processing and the Now-or-Never bottleneck.
Gómez-Rodríguez, Carlos
2016-01-01
Researchers, motivated by the need to improve the efficiency of natural language processing tools to handle web-scale data, have recently arrived at models that remarkably match the expected features of human language processing under the Now-or-Never bottleneck framework. This provides additional support for said framework and highlights the research potential in the interaction between applied computational linguistics and cognitive science.
Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis
2015-01-01
Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276
ERIC Educational Resources Information Center
Buchholz, Jesse
2017-01-01
Strengthening a correctional offender's mindset, resilience, and self-efficacy can be accomplished through the efficient use of technology within correctional education. Correctional facilities that employ the use of technology have the capacity to provide offenders with a tool that will serve them while they are incarcerated and again when they are…
2010-12-01
[Figure-list fragment: "Simulation of Free-Field Blast"; "(a) Peak Incident Pressure and (b) ..."] ... several types of problems involving blast propagation. Mastin et al. (1995) compared CTH simulations to free-field incident pressure as predicted by ... a measure of accuracy and efficiency. To provide this direct comparison, a series of 2D-axisymmetric free-field air blast simulations were ...
Improving the accuracy of total quality management instruments.
Bechtel, G A; Wood, D
1996-03-01
Total quality management (TQM) instruments are essential tools in defining concepts identified in an Ishikawa or "cause-and-effect" diagram. Collecting meaningful and accurate data using TQM instruments is imperative if productivity and quality of care are to be enhanced. This article provides managers with techniques and guidelines that will enhance the reliability and validity of TQM instruments, thereby promoting organizational efficiency and customer satisfaction.
STSE: Spatio-Temporal Simulation Environment Dedicated to Biology.
Stoma, Szymon; Fröhlich, Martina; Gerber, Susanne; Klipp, Edda
2011-04-28
Recently, the availability of high-resolution microscopy together with advancements in the development of biomarkers as reporters of biomolecular interactions has increased the importance of imaging methods in molecular cell biology. These techniques enable the investigation of cellular characteristics like volume, size and geometry as well as volume and geometry of intracellular compartments, and the amount of existing proteins in a spatially resolved manner. Such detailed investigations have opened up many new areas of research in the study of spatial, complex and dynamic cellular systems. One of the crucial challenges for the study of such systems is the design of a well-structured and optimized workflow to provide systematic and efficient hypothesis verification. Computer science can efficiently address this task by providing software that facilitates handling, analysis, and evaluation of biological data to the benefit of experimenters and modelers. The Spatio-Temporal Simulation Environment (STSE) is a set of open-source tools provided to conduct spatio-temporal simulations in discrete structures based on microscopy images. The framework contains modules to digitize, represent, analyze, and mathematically model spatial distributions of biochemical species. Graphical user interface (GUI) tools provided with the software enable meshing of the simulation space based on the Voronoi concept. In addition, it supports automatic acquisition of spatial information for the mesh from the images, based on pixel luminosity (e.g. corresponding to molecular levels from microscopy images). STSE is freely available either as a stand-alone version or included in the Linux live distribution Systems Biology Operational Software (SB.OS), and can be downloaded from http://www.stse-software.org/. The Python source code as well as a comprehensive user manual and video tutorials are also offered to the research community. We discuss the main concepts of the STSE design and workflow. We demonstrate its usefulness using the example of a signaling cascade leading to the formation of a morphological gradient of Fus3 within the cytoplasm of the mating yeast cell Saccharomyces cerevisiae. STSE is an efficient and powerful novel platform, designed for computational handling and evaluation of microscopic images. It allows for an uninterrupted workflow including digitization, representation, analysis, and mathematical modeling. By providing the means to relate the simulation to the image data, it allows for systematic, image-driven model validation or rejection. STSE can be scripted and extended using the Python language. STSE should be considered as an API together with workflow guidelines and a collection of GUI tools rather than a stand-alone application. The priority of the project is to provide an easy and intuitive way of extending and customizing software using the Python language.
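A minimal sketch of the Voronoi-meshing-plus-luminosity step, assuming random seed points and a synthetic image; STSE's own modules wrap this workflow with GUI tools, so this is only the underlying idea in SciPy terms.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)
image = rng.random((64, 64))                  # stand-in microscopy image
seeds = rng.uniform(0, 64, size=(50, 2))      # compartment centers (x, y)

vor = Voronoi(seeds)                          # mesh of the simulation space
# Initialize each compartment with the pixel luminosity at its seed,
# standing in for molecular levels read from the image.
levels = np.array([image[int(y), int(x)] for x, y in seeds])

print(len(vor.regions), levels[:5])           # mesh size and initial conditions
```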
Ultrasonic/Sonic Rotary-Hammer Drills
NASA Technical Reports Server (NTRS)
Badescu, Mircea; Sherrit, Stewart; Bar-Cohen, Yoseph; Bao, Xiaoqi; Kassab, Steve
2010-01-01
Ultrasonic/sonic rotary-hammer drill (USRoHD) is a recent addition to the collection of apparatuses based on the ultrasonic/sonic drill corer (USDC). As described below, the USRoHD has several features, not present in a basic USDC, that increase efficiency and provide some redundancy against partial failure. USDCs and related apparatuses were conceived for boring into, and/or acquiring samples of, rock or other hard, brittle materials of geological interest. They have been described in numerous previous NASA Tech Briefs articles. To recapitulate: a USDC can be characterized as a lightweight, low-power, piezoelectrically driven jackhammer in which ultrasonic and sonic vibrations are generated and coupled to a tool bit. A basic USDC includes a piezoelectric stack, an ultrasonic transducer horn connected to the stack, a free mass ("free" in the sense that it can bounce axially a short distance between hard stops on the horn and the bit), and a tool bit. The piezoelectric stack creates ultrasonic vibrations that are mechanically amplified by the horn. The bouncing of the free mass between the hard stops generates the sonic vibrations. The combination of ultrasonic and sonic vibrations gives rise to a hammering action (and a resulting chiseling action at the tip of the tool bit) that is more effective for drilling than is the microhammering action of ultrasonic vibrations alone. The hammering and chiseling actions are so effective that, unlike in conventional twist drilling, little applied axial force is needed to make the apparatus advance into the material of interest. There are numerous potential applications for USDCs and related apparatuses in geological exploration on Earth and on remote planets. In early USDC experiments, it was observed that accumulation of cuttings in a drilled hole causes the rate of penetration of the USDC to decrease steeply with depth, and that the rate of penetration can be increased by removing the cuttings. The USRoHD concept provides for removal of cuttings in the same manner as that of a twist drill: a USRoHD includes a USDC and a motor with gearhead (see figure). The USDC provides the bit hammering and the motor provides the bit rotation. Like a twist drill bit, the shank of the tool bit of the USRoHD is fluted. As in the operation of a twist drill, the rotation of the fluted drill bit removes cuttings from the drilled hole. The USRoHD tool bit is tipped with a replaceable crown having cutting teeth on its front surface. The teeth are shaped to promote fracturing of the rock face through a combination of hammering and rotation of the tool bit. Helical channels on the outer cylindrical surface of the crown serve as a continuation of the fluted surface of the shank, helping to remove cuttings. In the event of a failure of the USDC, the USRoHD can continue to operate with reduced efficiency as a twist drill. Similarly, in the event of a failure of the gearmotor, the USRoHD can continue to operate with reduced efficiency as a USDC.
A Qualitative Analysis Evaluating The Purposes And Practices Of Clinical Documentation
Ho, Y.-X.; Gadd, C. S.; Kohorst, K.L.; Rosenbloom, S.T.
2014-01-01
Objectives: An important challenge for biomedical informatics researchers is determining the best approach for healthcare providers to use when generating clinical notes in settings where electronic health record (EHR) systems are used. The goal of this qualitative study was to explore healthcare providers' and administrators' perceptions about the purpose of clinical documentation and their own documentation practices. Methods: We conducted seven focus groups with a total of 46 subjects, composed of healthcare providers and administrators, to collect knowledge, perceptions and beliefs about documentation from those who generate and review notes, respectively. Data were analyzed using inductive analysis to probe and classify impressions collected from focus group subjects. Results: We observed that both healthcare providers and administrators believe that documentation serves five primary domains: clinical, administrative, legal, research, and education. These purposes are tied closely to the nature of the clinical note as a document shared by multiple stakeholders, which can be a source of tension for all parties who must use the note. Most providers reported using a combination of methods to complete their notes in a timely fashion without compromising patient care. While all administrators reported relying on computer-based documentation tools to review notes, they expressed a desire for a more efficient method of extracting relevant data. Conclusions: Although clinical documentation has utility, and is valued highly by its users, the development and successful adoption of a clinical documentation tool largely depends on its ability to be smoothly integrated into the provider's busy workflow, while allowing the provider to generate a note that communicates effectively and efficiently with multiple stakeholders. PMID:24734130
Java-based Graphical User Interface for MAVERIC-II
NASA Technical Reports Server (NTRS)
Seo, Suk Jai
2005-01-01
A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It was written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models, such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models, such as atmosphere and gravity; and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store it in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot.
Bringing the Virtual Astronomical Observatory to the Education Community
NASA Astrophysics Data System (ADS)
Lawton, B.; Eisenhamer, B.; Mattson, B. J.; Raddick, M. J.
2012-08-01
The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. The Education and Public Outreach (EPO) program for the VAO will be led by the Space Telescope Science Institute in collaboration with the High Energy Astrophysics Science Archive Research Center (HEASARC) EPO program and Johns Hopkins University. VAO EPO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public and education community. Our EPO efforts will be structured to provide uniform access to VAO information, enabling educational and research opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that the VO has already built many tools for EPO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. However, it is not enough to simply provide tools. Tools must meet the needs of the education community and address national education standards in order to be broadly utilized. To determine which tools the VAO will incorporate into the EPO program, needs assessments will be conducted with educators across the U.S.
Overview of Nuclear Physics Data: Databases, Web Applications and Teaching Tools
NASA Astrophysics Data System (ADS)
McCutchan, Elizabeth
2017-01-01
The mission of the United States Nuclear Data Program (USNDP) is to provide current, accurate, and authoritative data for use in pure and applied areas of nuclear science and engineering. This is accomplished by compiling, evaluating, and disseminating extensive datasets. Our main products include the Evaluated Nuclear Structure File (ENSDF) containing information on nuclear structure and decay properties and the Evaluated Nuclear Data File (ENDF) containing information on neutron-induced reactions. The National Nuclear Data Center (NNDC), through the website www.nndc.bnl.gov, provides web-based retrieval systems for these and many other databases. In addition, the NNDC hosts several on-line physics tools, useful for calculating various quantities relating to basic nuclear physics. In this talk, I will first introduce the quantities which are evaluated and recommended in our databases. I will then outline the searching capabilities which allow one to quickly and efficiently retrieve data. Finally, I will demonstrate how the database searches and web applications can provide effective teaching tools concerning the structure of nuclei and how they interact. Work supported by the Office of Nuclear Physics, Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-98CH10886.
Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan
2015-10-29
This report provides a broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency; and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.
VISAGE Visualization for Integrated Satellite, Airborne and Ground-Based Data Exploration
NASA Technical Reports Server (NTRS)
Conover, Helen; Berendes, Todd; Naeger, Aaron; Maskey, Manil; Gatlin, Patrick; Wingo, Stephanie; Kulkarni, Ajinkya; Gupta, Shivangi; Nagaraj, Sriraksha; Wolff, David;
2017-01-01
The primary goal of the VISAGE project is to facilitate more efficient Earth Science investigations via a tool that can provide visualization and analytic capabilities for diverse coincident datasets. This proof-of-concept project will be centered around the GPM Ground Validation program, which provides a valuable source of intensive, coincident observations of atmospheric phenomena. The data are from a wide variety of ground-based, airborne and satellite instruments, with a wide diversity in spatial and temporal scales, variables, and formats, which makes these data difficult to use together. VISAGE will focus on "golden cases" where most ground instruments were in operation and multiple research aircraft sampled a significant weather event, ideally while the GPM Core Observatory passed overhead. The resulting tools will support physical process studies as well as satellite and model validation.
Automating the application of smart materials for protein crystallization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khurshid, Sahir; Govada, Lata; EL-Sharif, Hazim F.
2015-03-01
The first semi-liquid, non-protein nucleating agent for automated protein crystallization trials is described. This 'smart material' is demonstrated to induce crystal growth and will provide a simple, cost-effective tool for scientists in academia and industry. The fabrication and validation of the first semi-liquid non-protein nucleating agent to be administered automatically to crystallization trials is reported. This research builds upon the prior demonstration of the suitability of molecularly imprinted polymers (MIPs; known as 'smart materials') for inducing protein crystal growth. Modified MIPs of altered texture suitable for high-throughput trials are demonstrated to improve crystal quality and to increase the probability of success when screening for suitable crystallization conditions. The application of these materials is simple, time-efficient and will provide a potent tool for structural biologists embarking on crystallization trials.
Improving the sampling efficiency of Monte Carlo molecular simulations: an evolutionary approach
NASA Astrophysics Data System (ADS)
Leblanc, Benoit; Braunschweig, Bertrand; Toulhoat, Hervé; Lutton, Evelyne
We present a new approach to improving the convergence of Monte Carlo (MC) simulations of molecular systems belonging to complex energetic landscapes: the problem is redefined in terms of the dynamic allocation of MC move frequencies depending on their past efficiency, measured with respect to a relevant sampling criterion. We introduce various empirical criteria with the aim of accounting for proper convergence in phase space sampling. The dynamic allocation is performed over parallel simulations by means of a new evolutionary algorithm involving 'immortal' individuals. The method is benchmarked against conventional procedures on a model for melt linear polyethylene. We record significant improvements in sampling efficiency, and thus in computational load, while the optimal sets of move frequencies are liable to allow interesting physical insights into the particular systems simulated. This last aspect should provide a new tool for designing more efficient new MC moves.
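The core reallocation idea can be sketched as follows, assuming a hypothetical per-move efficiency measure and omitting the parallel evolutionary machinery the paper describes; move names, gains, and the smoothing constant are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

moves = ['translate', 'rotate', 'reptation']
eff = np.ones(3)                  # running efficiency estimate per move type
freq = np.full(3, 1 / 3)          # current move frequencies

for sweep in range(100):
    k = rng.choice(3, p=freq)                     # draw a move type to attempt
    gain = rng.gamma(2.0, (0.5, 1.0, 2.0)[k])     # toy sampling gain of that move
    eff[k] = 0.9 * eff[k] + 0.1 * gain            # exponential moving average
    freq = np.clip(eff / eff.sum(), 0.05, None)   # reallocate, keep a floor
    freq /= freq.sum()                            # renormalize to probabilities

print(dict(zip(moves, np.round(freq, 2))))        # frequencies drift to useful moves
```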
ROOT.NET: Using ROOT from .NET languages like C# and F#
NASA Astrophysics Data System (ADS)
Watts, G.
2012-12-01
ROOT.NET provides an interface between Microsoft's Common Language Runtime (CLR) and .NET technology and the ubiquitous particle physics analysis tool, ROOT. ROOT.NET automatically generates a series of efficient wrappers around the ROOT API. Unlike pyROOT, these wrappers are statically typed and so are highly efficient as compared to the Python wrappers. The connection to .NET means that one gains access to the full series of languages developed for the CLR including functional languages like F# (based on OCaml). Many features that make ROOT objects work well in the .NET world are added (properties, IEnumerable interface, LINQ compatibility, etc.). Dynamic languages based on the CLR can be used as well, of course (Python, for example). Additionally it is now possible to access ROOT objects that are unknown to the translation tool. This poster will describe the techniques used to effect this translation, along with performance comparisons, and examples. All described source code is posted on the open source site CodePlex.
Lehotsky, Ákos; Morvai, Júlia; Szilágyi, László; Bánsághi, Száva; Benkó, Alíz; Haidegger, Tamás
2017-07-01
Hand hygiene is probably the most effective tool of nosocomial infection prevention; however, proper feedback and control are needed to develop individual hand hygiene practice. The aim was to assess the efficiency of modern education tools, and of digital demonstration and verification equipment, during their wide-range deployment. 1269 healthcare workers took part in a training organized by our team. The training included the assessment of the participants' hand hygiene technique to identify the most often missed areas. The hand hygiene technique was examined by a digital device. 33% of the participants disinfected their hands incorrectly. The most often missed sites are the fingertips (33% on the left hand, 37% on the right hand) and the thumbs (42% on the left hand, 32% on the right hand). Feedback has a fundamental role in the development of hand hygiene technique. With the use of electronic devices, feedback can be provided efficiently and simply. Orv Hetil. 2017; 158(29): 1143-1148.
NASA Astrophysics Data System (ADS)
Wilson, Eric Lee
Due to increased competition in a world economy, steel companies are currently interested in developing techniques that will improve the steelmaking process, either by increasing output efficiency or by improving the quality of their product, or both. Slag foaming is one practice that has been shown to contribute to both these goals. However, slag foaming is highly dynamic and difficult to model or control. This dissertation describes an effort to use artificial intelligence-based tools (genetic algorithms, fuzzy logic, and neural networks) to both model and control the slag foaming process. Specifically, a neural network is trained and tested on slag foaming data provided by a steel plant. This neural network model is then controlled by a fuzzy logic controller, which in turn is optimized by a genetic algorithm. The tuned controller was then installed at a steel plant and shown to be a more efficient slag foaming controller than the one previously used by the plant.
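As a rough sketch of this kind of AI-based tuning loop, the code below uses a toy foam-height model and a simple proportional controller standing in for the trained neural network and the fuzzy rule base, with a stripped-down genetic algorithm searching the controller parameters; everything here is a hypothetical stand-in for the dissertation's actual models.

```python
import numpy as np

rng = np.random.default_rng(3)

def plant(h, u):                        # toy foam dynamics, not the real NN model
    return 0.9 * h + 0.2 * u + rng.normal(0.0, 0.02)

def episode_cost(gains, target=1.0, steps=50):
    kp, kd = gains
    h, prev_err, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = target - h
        u = kp * err + kd * (err - prev_err)    # controller (fuzzy rules in the work)
        h, prev_err = plant(h, u), err
        cost += err ** 2                        # tracking error to minimize
    return cost

pop = rng.uniform(0.0, 2.0, size=(20, 2))       # population of controller "genes"
for gen in range(30):
    costs = np.array([episode_cost(g) for g in pop])
    elite = pop[np.argsort(costs)[:5]]          # selection of the fittest
    pop = elite[rng.integers(0, 5, size=20)] + rng.normal(0.0, 0.1, (20, 2))
print(elite[0])                                 # best controller gains found
```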
A Computational Framework for Efficient Low Temperature Plasma Simulations
NASA Astrophysics Data System (ADS)
Verma, Abhishek Kumar; Venkattraman, Ayyaswamy
2016-10-01
Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems involving LTP. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested based on numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.
[Advances in CRISPR-Cas-mediated genome editing system in plants].
Wang, Chun; Wang, Kejian
2017-10-25
Targeted genome editing technology is an important tool to study the function of genes and to modify organisms at the genetic level. Recently, CRISPR-Cas (clustered regularly interspaced short palindromic repeats and CRISPR-associated proteins) system has emerged as an efficient tool for specific genome editing in animals and plants. CRISPR-Cas system uses CRISPR-associated endonuclease and a guide RNA to generate double-strand breaks at the target DNA site, subsequently leading to genetic modifications. CRISPR-Cas system has received widespread attention for manipulating the genomes with simple, easy and high specificity. This review summarizes recent advances of diverse applications of the CRISPR-Cas toolkit in plant research and crop breeding, including expanding the range of genome editing, precise editing of a target base, and efficient DNA-free genome editing technology. This review also discusses the potential challenges and application prospect in the future, and provides a useful reference for researchers who are interested in this field.
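As a small illustration of guide-RNA target selection (not taken from the review itself), the sketch below scans a sequence for 20-nt protospacers followed by the Cas9 NGG PAM; the example sequence is arbitrary.

```python
import re

def find_cas9_targets(dna):
    """Return (position, protospacer) for forward-strand sites of the form
    20-nt protospacer + NGG PAM; the lookahead also catches overlapping sites."""
    return [(m.start(), m.group(1))
            for m in re.finditer(r'(?=([ACGT]{20})[ACGT]GG)', dna.upper())]

seq = "ATGCTGACCTTGGCATCGATCGGTACCGGTTAGCTAGCTAGGCTAGCTAAGG"
for pos, protospacer in find_cas9_targets(seq):
    print(pos, protospacer)
```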
A collaborative framework for Distributed Privacy-Preserving Support Vector Machine learning.
Que, Jialan; Jiang, Xiaoqian; Ohno-Machado, Lucila
2012-01-01
A Support Vector Machine (SVM) is a popular tool for decision support. The traditional way to build an SVM model is to estimate parameters based on a centralized repository of data. However, in the field of biomedicine, patient data are sometimes stored in local repositories or institutions where they were collected, and may not be easily shared due to privacy concerns. This creates a substantial barrier for researchers to effectively learn from the distributed data using machine learning tools like SVMs. To overcome this difficulty and promote efficient information exchange without sharing sensitive raw data, we developed a Distributed Privacy Preserving Support Vector Machine (DPP-SVM). The DPP-SVM enables privacy-preserving collaborative learning, in which a trusted server integrates "privacy-insensitive" intermediary results. The globally learned model is guaranteed to be exactly the same as learned from combined data. We also provide a free web-service (http://privacy.ucsd.edu:8080/ppsvm/) for multiple participants to collaborate and complete the SVM-learning task in an efficient and privacy-preserving manner.
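The flavor of such privacy-preserving aggregation can be sketched with a linear SVM trained by full-batch subgradient descent, where each site ships only its summed local subgradient and the server aggregates; summing local subgradients reproduces the pooled-data update exactly, though this is a conceptual stand-in, not DPP-SVM's actual protocol or privacy analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

# Three "institutions", each holding private (X, y) records it never shares.
sites = [(rng.normal(size=(30, 5)), rng.choice([-1.0, 1.0], 30)) for _ in range(3)]
w, lam, lr = np.zeros(5), 0.01, 0.1

for step in range(200):
    grads = []
    for X, y in sites:                       # computed locally at each site
        margin = y * (X @ w)
        mask = margin < 1.0                  # points inside the hinge-loss margin
        grads.append(-(y[mask, None] * X[mask]).sum(axis=0))
    total = sum(grads)                       # server sums aggregate results only
    n = sum(len(y) for _, y in sites)
    w -= lr * (lam * w + total / n)          # identical to pooled-data update
print(w)
```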
Wang, Meng; Keeley, Ryan; Zalivina, Nadezhda; Halfhide, Trina; Scott, Kathleen; Zhang, Qiong; van der Steen, Peter; Ergas, Sarina J
2018-07-01
The synergistic activity of algae and prokaryotic microorganisms can be used to improve the efficiency of biological wastewater treatment, particularly with regards to nitrogen removal. For example, algae can provide oxygen through photosynthesis needed for aerobic degradation of organic carbon and nitrification and harvested algal-prokaryotic biomass can be used to produce high value chemicals or biogas. Algal-prokaryotic consortia have been used to treat wastewater in different types of reactors, including waste stabilization ponds, high rate algal ponds and closed photobioreactors. This review addresses the current literature and identifies research gaps related to the following topics: 1) the complex interactions between algae and prokaryotes in wastewater treatment; 2) advances in bioreactor technologies that can achieve high nitrogen removal efficiencies in small reactor volumes, such as algal-prokaryotic biofilm reactors and enhanced algal-prokaryotic treatment systems (EAPS); 3) molecular tools that have expanded our understanding of the activities of algal and prokaryotic communities in wastewater treatment processes. Copyright © 2018 Elsevier Ltd. All rights reserved.
Hoefling, Martin; Lima, Nicola; Haenni, Dominik; Seidel, Claus A. M.; Schuler, Benjamin; Grubmüller, Helmut
2011-01-01
Förster Resonance Energy Transfer (FRET) experiments probe molecular distances via distance dependent energy transfer from an excited donor dye to an acceptor dye. Single molecule experiments not only probe average distances, but also distance distributions or even fluctuations, and thus provide a powerful tool to study biomolecular structure and dynamics. However, the measured energy transfer efficiency depends not only on the distance between the dyes, but also on their mutual orientation, which is typically inaccessible to experiments. Thus, assumptions on the orientation distributions and averages are usually made, limiting the accuracy of the distance distributions extracted from FRET experiments. Here, we demonstrate that by combining single molecule FRET experiments with the mutual dye orientation statistics obtained from Molecular Dynamics (MD) simulations, improved estimates of distances and distributions are obtained. From the simulated time-dependent mutual orientations, FRET efficiencies are calculated, and the full statistics of individual photon absorption, energy transfer, and photon emission events are obtained from subsequent Monte Carlo (MC) simulations of the FRET kinetics. All recorded emission events are collected into bursts from which efficiency distributions are calculated in close resemblance to the actual FRET experiment, taking shot noise fully into account. Using polyproline chains with attached Alexa 488 and Alexa 594 dyes as a test system, we demonstrate the feasibility of this approach by direct comparison to experimental data. We identified cis-isomers and different static local environments as sources of the experimentally observed heterogeneity. Reconstructions of distance distributions from experimental data at different levels of theory demonstrate how the respective underlying assumptions and approximations affect the obtained accuracy. Our results show that dye fluctuations obtained from MD simulations, combined with MC single photon kinetics, provide a versatile tool to improve the accuracy of distance distributions that can be extracted from measured single molecule FRET efficiencies. PMID:21629703
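The core of such a burst simulation can be stated compactly: the ideal transfer efficiency is E = 1/(1 + (r/R0)^6), and shot noise enters because each detected photon is independently an acceptor photon with probability E. A minimal sketch (the Förster radius and burst size below are assumed placeholder values, not the study's parameters):

```python
import numpy as np

R0 = 5.4  # assumed Foerster radius (nm) for an Alexa 488/594-like pair

def fret_efficiency(r):
    # Ideal transfer efficiency at donor-acceptor distance r.
    return 1.0 / (1.0 + (r / R0) ** 6)

def simulate_bursts(r_samples, photons_per_burst=50, seed=1):
    # Shot noise: each detected photon is an acceptor photon with
    # probability E, so a burst's efficiency estimate is binomial/N.
    rng = np.random.default_rng(seed)
    e = fret_efficiency(np.asarray(r_samples))
    return rng.binomial(photons_per_burst, e) / photons_per_burst

# Distances fluctuating around 5 nm mimic a distribution of conformations.
rng = np.random.default_rng(2)
r = rng.normal(5.0, 0.5, size=10_000)
e_burst = simulate_bursts(r)
print(e_burst.mean(), e_burst.std())  # broadened by both r-spread and shot noise
```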
CRISPR-Cpf1: A New Tool for Plant Genome Editing.
Zaidi, Syed Shan-E-Ali; Mahfouz, Magdy M; Mansoor, Shahid
2017-07-01
Clustered regularly interspaced short palindromic repeats (CRISPR)-CRISPR-associated proteins (CRISPR-Cas), a groundbreaking genome-engineering tool, has facilitated targeted trait improvement in plants. Recently, CRISPR-Cpf1 (CRISPR from Prevotella and Francisella 1) has emerged as a new tool for efficient genome editing, including DNA-free editing in plants, with higher efficiency, specificity, and potentially wider applications than CRISPR-Cas9. Copyright © 2017 Elsevier Ltd. All rights reserved.
Usadel, Björn; Nagel, Axel; Steinhauser, Dirk; Gibon, Yves; Bläsing, Oliver E; Redestig, Henning; Sreenivasulu, Nese; Krall, Leonard; Hannah, Matthew A; Poree, Fabien; Fernie, Alisdair R; Stitt, Mark
2006-12-18
Microarray technology has become a widely accepted and standardized tool in biology. The first microarray data analysis programs were developed to support pair-wise comparison. However, as microarray experiments have become more routine, large scale experiments have become more common, which investigate multiple time points or sets of mutants or transgenics. To extract biological information from such high-throughput expression data, it is necessary to develop efficient analytical platforms, which combine manually curated gene ontologies with efficient visualization and navigation tools. Currently, most tools focus on a few limited biological aspects, rather than offering a holistic, integrated analysis. Here we introduce PageMan, a multiplatform, user-friendly, and stand-alone software tool that annotates, investigates, and condenses high-throughput microarray data in the context of functional ontologies. It includes a GUI tool to transform different ontologies into a suitable format, enabling the user to compare and choose between different ontologies. It is equipped with several statistical modules for data analysis, including over-representation analysis and Wilcoxon statistical testing. Results are exported in a graphical format for direct use, or for further editing in graphics programs. PageMan provides a fast overview of single treatments and allows genome-level responses to be compared across several microarray experiments covering, for example, stress responses at multiple time points. This aids in searching for trait-specific changes in pathways using mutants or transgenics, analyzing development time-courses, and comparison between species. In a case study, we analyze the results of publicly available microarrays of multiple cold stress experiments using PageMan, and compare the results to a previously published meta-analysis. PageMan offers a complete user's guide, a web-based over-representation analysis as well as a tutorial, and is freely available at http://mapman.mpimp-golm.mpg.de/pageman/. PageMan allows multiple microarray experiments to be efficiently condensed into a single-page graphical display. The flexible interface allows data to be quickly and easily visualized, facilitating comparisons within experiments and to published experiments, thus enabling researchers to gain a rapid overview of the biological responses in the experiments.
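Over-representation analysis of the kind PageMan performs can be sketched as a 2x2 contingency test; the sketch below uses Fisher's exact test as a stand-in with hypothetical gene counts, and is not PageMan's actual implementation:

```python
from scipy.stats import fisher_exact

def over_representation(hits_in_class, class_size, hits_total, genome_size):
    # 2x2 table: responding vs. non-responding genes, inside vs. outside
    # the functional category; one-sided test for enrichment.
    in_cls = [hits_in_class, class_size - hits_in_class]
    out_cls = [hits_total - hits_in_class,
               genome_size - class_size - (hits_total - hits_in_class)]
    return fisher_exact([in_cls, out_cls], alternative="greater")

# E.g., 30 of 200 cold-responsive genes fall in a 500-gene category
# of a 25,000-gene genome.
odds, p = over_representation(30, 500, 200, 25_000)
print(odds, p)
```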
Tools & Resources | Efficient Windows Collaborative
Window Selection Tool Mobile App: a mobile app for selecting new windows. LBNL's RESFEN: used for calculating the heating and cooling energy performance of windows in homes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdelaziz, Omar; Fricke, Brian A; Vineyard, Edward Allan
Commercial refrigeration systems are known to be prone to high leak rates and to consume large amounts of electricity. As such, direct emissions related to refrigerant leakage and indirect emissions resulting from primary energy consumption contribute greatly to their Life Cycle Climate Performance (LCCP). In this paper, an LCCP design tool is used to evaluate the performance of a typical commercial refrigeration system with alternative refrigerants and minor system modifications to provide lower Global Warming Potential (GWP) refrigerant solutions with improved LCCP compared to baseline systems. The LCCP design tool accounts for system performance, ambient temperature, and system load; system performance is evaluated using a validated vapor compression system simulation tool while ambient temperature and system load are derived from a widely used building energy modeling tool (EnergyPlus). The LCCP design tool also accounts for the change in hourly electricity emission rate to yield an accurate prediction of indirect emissions. The analysis shows that conventional commercial refrigeration system life cycle emissions are largely due to direct emissions associated with refrigerant leaks and that system efficiency plays a smaller role in the LCCP. However, as a transition occurs to low GWP refrigerants, the indirect emissions become more relevant. Low GWP refrigerants may not be suitable for drop-in replacements in conventional commercial refrigeration systems; however, some mixtures may be introduced as transitional drop-in replacements. These transitional refrigerants have a significantly lower GWP than baseline refrigerants and, as such, improved LCCP. The paper concludes with a brief discussion on the tradeoffs between refrigerant GWP, efficiency and capacity.
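The LCCP bookkeeping described here reduces to summing direct and indirect emissions over the system lifetime. A minimal sketch with illustrative numbers (the charge, leak rate, and grid emission factor are assumptions, not values from the paper):

```python
def lccp_kg_co2e(charge_kg, annual_leak_rate, gwp, annual_kwh,
                 grid_kg_co2_per_kwh, lifetime_yr, eol_loss=0.15):
    # Direct: annual refrigerant leakage plus end-of-life loss, times GWP.
    direct = (charge_kg * annual_leak_rate * lifetime_yr
              + charge_kg * eol_loss) * gwp
    # Indirect: lifetime electricity use times the grid emission factor.
    indirect = annual_kwh * grid_kg_co2_per_kwh * lifetime_yr
    return direct + indirect

# Illustrative comparison: a high-GWP baseline vs. a low-GWP alternative
# that is assumed slightly less efficient (all inputs hypothetical).
print(lccp_kg_co2e(900, 0.15, 3922, 300_000, 0.5, 15))  # baseline, R-404A-like
print(lccp_kg_co2e(900, 0.15, 300, 310_000, 0.5, 15))   # low-GWP candidate
```

Running the two cases shows the pattern the paper describes: once the GWP term shrinks, the electricity (indirect) term dominates the total.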
Singh, Brajesh K; Srivastava, Vineet K
2015-04-01
The main goal of this paper is to present a new approximate series solution of the multi-dimensional (heat-like) diffusion equation with time-fractional derivative in Caputo form using a semi-analytical approach: fractional-order reduced differential transform method (FRDTM). The efficiency of FRDTM is confirmed by considering four test problems of the multi-dimensional time fractional-order diffusion equation. FRDTM is a very efficient, effective and powerful mathematical tool which provides exact or very close approximate solutions for a wide range of real-world problems arising in engineering and natural sciences, modelled in terms of differential equations.
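For the one-dimensional special case, the heart of FRDTM can be written as a two-line recurrence; the notation below follows the standard formulation of the method rather than the paper's own symbols:

```latex
% One-dimensional time-fractional diffusion (Caputo sense), 0 < alpha <= 1.
\[
{}^{C}\!D_{t}^{\alpha}u=\frac{\partial^{2}u}{\partial x^{2}},
\qquad
u(x,t)=\sum_{k=0}^{\infty}U_{k}(x)\,t^{k\alpha},
\]
% FRDTM turns the PDE into an algebraic recurrence on the coefficients:
\[
U_{k+1}(x)=\frac{\Gamma(k\alpha+1)}{\Gamma\bigl((k+1)\alpha+1\bigr)}
\,\frac{\partial^{2}U_{k}(x)}{\partial x^{2}},
\qquad
U_{0}(x)=u(x,0).
\]
```

Each coefficient is obtained from the previous one by differentiation and a ratio of Gamma functions, which is why the method avoids discretization entirely and yields closed-form series terms.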
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
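The "one extra solve for all parameters" property is easy to verify on a linear model problem: with residual R(u, d) = Au - b(d) = 0 and objective J = g·u, a single adjoint solve with the transpose of A reproduces the gradient that finite differences need m solves to build. A self-contained sketch (toy matrices, not a CFD code):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 8                                     # state size, design variables
A = np.eye(n) + 0.1 * rng.normal(size=(n, n))    # residual Jacobian dR/du
B = rng.normal(size=(n, m))                      # db/dd
g = rng.normal(size=n)                           # dJ/du for J(u) = g.u

def objective(d):
    u = np.linalg.solve(A, B @ d)   # one "flow solve": A u = b(d)
    return g @ u

# Adjoint: ONE extra linear solve yields dJ/dd for ALL m design variables.
lam = np.linalg.solve(A.T, g)
grad_adjoint = B.T @ lam

# Finite differences need one extra solve PER design variable.
d0, eps = np.zeros(m), 1e-6
grad_fd = np.array([(objective(d0 + eps * e) - objective(d0)) / eps
                    for e in np.eye(m)])
print(np.max(np.abs(grad_adjoint - grad_fd)))   # agreement to FD truncation error
```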
Miyagi, Michiko; Yokoyama, Hirokazu; Hibi, Toshifumi
2007-07-01
An HPLC protocol for sugar microanalysis based on the formation of ultraviolet-absorbing benzoyl chloride derivatives was improved. Here, samples were prepared with a C-8 cartridge and analyzed with a high efficiency ODS column, in which porous spherical silica particles 3 microm in diameter were packed. These devices allowed us to simultaneously quantify multiple sugars and sugar alcohols up to 10 ng/ml and to provide satisfactory separations of some sugars, such as fructose and myo-inositol and sorbitol and mannitol. This protocol, which does not require special apparatuses, should become a powerful tool in sugar research.
A potent effect of observational learning on chimpanzee tool construction
Price, Elizabeth E.; Lambeth, Susan P.; Schapiro, Steve J.; Whiten, Andrew
2009-01-01
Although tool use occurs in diverse species, its complexity may mark an important distinction between humans and other animals. Chimpanzee tool use has many similarities to that seen in humans, yet evidence of the cumulatively complex and constructive technologies common in human populations remains absent in free-ranging chimpanzees. Here we provide the first evidence that chimpanzees have a latent capacity to socially learn to construct a composite tool. Fifty chimpanzees were assigned to one of five demonstration conditions that varied in the amount and type of information available in video footage of a conspecific. Chimpanzees exposed to complete footage of a chimpanzee combining the two components to retrieve a reward learned to combine the tools significantly more than those exposed to more restricted information. In a follow-up test, chimpanzees that constructed tools after watching the complete demonstration tended to do so even when the reward was within reach of the unmodified components, whereas those that spontaneously solved the task (without seeing the modification process) combined only when necessary. Social learning, therefore, had a powerful effect in instilling a marked persistence in the use of a complex technique at the cost of efficiency, inhibiting insightful tool use. PMID:19570785
Development of a Comprehensive Community Nitrogen Oxide Emissions Reduction Toolkit (CCNERT)
NASA Astrophysics Data System (ADS)
Sung, Yong Hoon
The main objective of this study is to research and develop a simplified tool to estimate energy use in a community and its associated effects on air pollution. This tool is intended to predict the impacts of selected energy conservation options and efficiency programs on emission reduction. It is intended to help local governments and their residents understand and manage information collection and the procedures to be used. This study presents a broad overview of the community-wide energy use and NOx emissions inventory process. It also presents various simplified procedures to estimate each sector's energy use. In an effort to better understand community-wide energy use and its associated NOx emissions, the City of College Station, Texas, was selected as a case study community for this research. While one community might successfully reduce the production of NOx emissions by adopting electricity efficiency programs in its buildings, another community might be equally successful by changing the mix of fuel sources used to generate the electricity consumed by the community. In yet a third community, low-NOx automobiles may be mandated. Unfortunately, the impact and cost of one strategy over another change over time as major sources of pollution are reduced. Therefore, this research proposes to help community planners answer such questions and to assist local communities with their NOx emission reduction plans by developing a Comprehensive Community NOx Emissions Reduction Toolkit (CCNERT). The proposed simplified tool could have a substantial impact on reducing NOx emissions by providing decision-makers with a preliminary understanding of the impacts of various energy efficiency programs on emissions reductions. To help decision makers, this study addresses these issues by providing a general framework for examining how a community's non-renewable energy use leads to NOx emissions, by quantifying each end-user's energy usage and its associated NOx emissions, and by evaluating the environmental benefits of various types of energy saving options.
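The inventory arithmetic such a toolkit automates is essentially energy use times emission factor, summed over fuels and sectors. A minimal sketch with hypothetical factors (real values would come from utility fuel mixes and EPA data):

```python
# Hypothetical emission factors (lb NOx per unit of use); a real toolkit
# would draw these from the local utility's generation mix and EPA data.
EMISSION_FACTORS = {
    "electricity_MWh": 1.2,
    "natural_gas_MMBtu": 0.10,
    "gasoline_gal": 0.015,
}

community_use = {                   # annual use by fuel; units match the keys
    "electricity_MWh": 450_000,
    "natural_gas_MMBtu": 900_000,
    "gasoline_gal": 30_000_000,
}

def nox_inventory(use, factors):
    # Community NOx (lb/yr) = sum over fuels of use x emission factor.
    return {fuel: qty * factors[fuel] for fuel, qty in use.items()}

by_fuel = nox_inventory(community_use, EMISSION_FACTORS)
print(by_fuel, sum(by_fuel.values()))
```

Swapping the electricity emission factor then models a cleaner generation mix, while scaling a sector's use models a conservation program, which is exactly the what-if comparison the toolkit is meant to support.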
Efficiency and credit ratings: a permutation-information-theory analysis
NASA Astrophysics Data System (ADS)
Fernandez Bariviera, Aurelio; Zunino, Luciano; Belén Guercio, M.; Martinez, Lisana B.; Rosso, Osvaldo A.
2013-08-01
The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty-nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity-entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody's. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.
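The entropy coordinate of the complexity-entropy causality plane is the normalized Bandt-Pompe permutation entropy; the sketch below computes it (the complexity coordinate, based on a Jensen-Shannon divergence, is omitted for brevity):

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, d=4, tau=1):
    # Bandt-Pompe: map each length-d window to its ordinal pattern and
    # take the normalized Shannon entropy of the pattern frequencies.
    x = np.asarray(x)
    counts = {}
    for i in range(len(x) - (d - 1) * tau):
        pattern = tuple(np.argsort(x[i:i + d * tau:tau]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    h = -(p * np.log(p)).sum()
    return h / log(factorial(d))    # 1.0 = maximally random (efficient) series

rng = np.random.default_rng(0)
print(permutation_entropy(rng.normal(size=5000)))          # near 1: efficient
print(permutation_entropy(np.sin(0.1 * np.arange(5000))))  # low: predictable
```

Informationally efficient bond price series should look like the first case (patterns equally likely), while predictable, inefficient series concentrate probability on a few ordinal patterns.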
Hernández-Sancho, F; Molinos-Senante, M; Sala-Garrido, R
2011-12-01
Efficiency and productivity are important measures for identifying best practice in businesses and optimising resource use. This study analyses how these two measures change across the period 2003-2008 for 196 wastewater treatment plants (WWTPs) in Spain, by using the benchmarking methods of Data Envelopment Analysis and the Malmquist Productivity Index. To identify which variables contribute to the sustainability of the WWTPs, differences in efficiency scores and productivity indices for external factors are also investigated. Our results indicate that both efficiency and productivity decreased over the five years. We verify that the productivity drop is primarily explained by technical change. Furthermore, certain external variables affected WWTP efficiency, including plant size, treatment technology and energy consumption. However, plants with low energy consumption are the only ones which improve their productivity. Finally, the benchmarking analyses proved to be useful as management tools in the wastewater sector, by providing vital information for improving the sustainability of plants.
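An input-oriented CCR efficiency score of the kind DEA produces is a small linear program per plant; a minimal sketch with toy data (not the study's variables):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    # Input-oriented CCR score of unit j0:
    #   min theta  s.t.  X lam <= theta * x0,  Y lam >= y0,  lam >= 0
    m, n = X.shape                      # m inputs, n units
    s = Y.shape[0]                      # s outputs
    c = np.r_[1.0, np.zeros(n)]         # decision variables: [theta, lam]
    A_ub = np.block([[-X[:, [j0]], X],          # X lam - theta x0 <= 0
                     [np.zeros((s, 1)), -Y]])   # -Y lam <= -y0
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]                     # efficiency score in (0, 1]

# Toy data: 2 inputs (energy, staff), 1 output (treated volume), 4 plants.
X = np.array([[100.0, 80.0, 120.0, 90.0],
              [5.0, 4.0, 7.0, 5.0]])
Y = np.array([[50.0, 45.0, 55.0, 30.0]])
print([round(dea_ccr_input(X, Y, j), 3) for j in range(4)])
```

A score of 1 marks a plant on the best-practice frontier; scores below 1 give the proportional input reduction the frontier peers achieve, which is the benchmarking information the study exploits.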
NASA Astrophysics Data System (ADS)
Shokrollahpour, Elsa; Hosseinzadeh Lotfi, Farhad; Zandieh, Mostafa
2016-06-01
Efficiency and quality of services are crucial to today's banking industry. Competition in this sector has become increasingly intense as a result of rapid improvements in technology, so performance analysis of banking sectors attracts more attention these days. Although data envelopment analysis (DEA) is a pioneering approach in the literature as an efficiency-measurement and benchmarking tool, it is unable to suggest possible future benchmarks: the benchmarks it provides may still be less efficient than more advanced future ones. To cover this weakness, an artificial neural network is integrated with DEA in this paper to calculate the relative efficiency and more reliable benchmarks of the branches of an Iranian commercial bank. Each branch can thereby have a strategy to improve its efficiency and eliminate the causes of inefficiency based on a 5-year forecast.
Open Energy Information System version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
OpenEIS was created to provide standard methods for authoring, sharing, testing, using, and improving algorithms for operational building energy efficiency with building managers and building owners. OpenEIS is designed as a no-cost/low-cost solution that will propagate fault detection and diagnostic (FDD) solutions into the marketplace by providing state-of-the-art analytical and diagnostic algorithms. As OpenEIS penetrates the market, demand by control system manufacturers and integrators serving small and medium commercial customers will help push these types of commercial software tool offerings into the broader marketplace.
Topology-Preserving Rigid Transformation of 2D Digital Images.
Ngo, Phuc; Passat, Nicolas; Kenmochi, Yukiko; Talbot, Hugues
2014-02-01
We provide conditions under which 2D digital images preserve their topological properties under rigid transformations. We consider the two most common digital topology models, namely dual adjacency and well-composedness. This paper leads to the proposal of optimal preprocessing strategies that ensure the topological invariance of images under arbitrary rigid transformations. These results and methods are proved to be valid for various kinds of images (binary, gray-level, label), thus providing generic and efficient tools, which can be used in particular in the context of image registration and warping.
Accessing and distributing EMBL data using CORBA (common object request broker architecture).
Wang, L; Rodriguez-Tomé, P; Redaschi, N; McNeil, P; Robinson, A; Lijnzaad, P
2000-01-01
The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by Persistence™, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems.
A high-performance spatial database based approach for pathology imaging algorithm evaluation
Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A.D.; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J.; Saltz, Joel H.
2013-01-01
Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and loaded into a spatial database. To support efficient data loading, we have implemented a parallel data loading tool that takes advantage of multi-core CPUs to accelerate data injection. The spatial database manages both geometric shapes and image features or classifications, and enables spatial sampling, result comparison, and result aggregation through expressive structured query language (SQL) queries with spatial extensions. To provide scalable and efficient query support, we have employed a shared nothing parallel database architecture, which distributes data homogenously across multiple database partitions to take advantage of parallel computation power and implements spatial indexing to achieve high I/O throughput. Results: Our work proposes a high performance, parallel spatial database platform for algorithm validation and comparison. This platform was evaluated by storing, managing, and comparing analysis results from a set of brain tumor whole slide images. The tools we develop are open source and available to download. Conclusions: Pathology image algorithm validation and comparison are essential to iterative algorithm development and refinement. One critical component is the support for queries involving spatial predicates and comparisons. In our work, we develop an efficient data model and parallel database approach to model, normalize, manage and query large volumes of analytical image result data. 
Our experiments demonstrate that the data partitioning strategy and the grid-based indexing result in good data distribution across database nodes and reduce I/O overhead in spatial join queries through parallel retrieval of relevant data and quick subsetting of datasets. The set of tools in the framework provide a full pipeline to normalize, load, manage and query analytical results for algorithm evaluation. PMID:23599905
The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.
Pang, Haotian; Liu, Han; Vanderbei, Robert
2014-02-01
We develop an R package fastclime for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1-Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
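For reference, CLIME itself is the following convex program, solved column-by-column as an LP, which is why a parametric simplex path-follower fits it naturally (standard notation: S is the sample covariance, lambda the tuning parameter):

```latex
% CLIME: each column of Omega solves an independent L1-minimization LP,
% and the parametric simplex traces the entire solution path in lambda.
\[
\hat{\Omega}
=\arg\min_{\Omega}\;\lVert\Omega\rVert_{1}
\quad\text{subject to}\quad
\lVert S\Omega-I\rVert_{\infty}\le\lambda .
\]
```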
Effective Energy Simulation and Optimal Design of Side-lit Buildings with Venetian Blinds
NASA Astrophysics Data System (ADS)
Cheng, Tian
Venetian blinds are popularly used in buildings to control the amount of incoming daylight for improving visual comfort and reducing heat gains in air-conditioning systems. Studies have shown that the proper design and operation of window systems could result in significant energy savings in both lighting and cooling. However, there is currently no convenient computer tool that allows effective and efficient optimization of the envelope of side-lit buildings with blinds. Three computer tools widely used for the above-mentioned purpose, Adeline, DOE2 and EnergyPlus, have been experimentally examined in this study. Results indicate that the first two tools give unacceptable accuracy due to the unrealistic assumptions they adopt, while the last may generate large errors in certain conditions. Moreover, current computer tools have to conduct hourly energy simulations, which are not necessary for life-cycle energy analysis and optimal design, to provide annual cooling loads. This is not computationally efficient and is particularly unsuitable for the optimal design of a building at the initial stage, because the impacts of many design variations and optional features have to be evaluated. A methodology is therefore developed for efficient and effective thermal and daylighting simulations and optimal design of buildings with blinds. Based on geometric optics and the radiosity method, a mathematical model is developed to reasonably simulate the daylighting behaviors of venetian blinds. Indoor illuminance at any reference point can be directly and efficiently computed. The models have been validated against both experiments and simulations with Radiance. Validation results show that indoor illuminances computed by the new models agree well with the measured data, and the accuracy provided by them is equivalent to that of Radiance. The computational efficiency of the new models is much higher than that of Radiance as well as EnergyPlus. Two new methods are developed for the thermal simulation of buildings. A fast Fourier transform (FFT) method is presented to avoid the root-searching process in the inverse Laplace transform of multilayered walls. Generalized explicit FFT formulae for calculating the discrete Fourier transform (DFT) are developed for the first time. These formulae greatly facilitate the implementation of FFT. The new method also provides a basis for generating the symbolic response factors. Validation simulations show that it can generate response factors as accurately as the analytical solutions. The second method is for direct estimation of annual or seasonal cooling loads without the need for tedious hourly energy simulations. It is validated against hourly simulation results from DOE2. A symbolic long-term cooling load can then be created by combining the two methods with thermal network analysis. The symbolic long-term cooling load can keep the design parameters of interest as symbols, which is particularly useful for optimal design and sensitivity analysis. The methodology is applied to an office building in Hong Kong for the optimal design of the building envelope. Design variables such as window-to-wall ratio, building orientation, and glazing optical and thermal properties are included in the study. Results show that the selected design values could significantly impact the energy performance of windows, and the optimal design of side-lit buildings could greatly enhance energy savings. 
The application example also demonstrates that the developed methodology significantly facilitates the optimal building design and sensitivity analysis, and leads to high computational efficiency.
Neufuss, Johanna; Humle, Tatyana; Cremaschi, Andrea; Kivell, Tracy L
2017-02-01
There has been an enduring interest in primate tool-use and manipulative abilities, most often with the goal of providing insight into the evolution of human manual dexterity, right-hand preference, and what behaviours make humans unique. Chimpanzees (Pan troglodytes) are arguably the most well-studied tool-users amongst non-human primates, and are particularly well-known for their complex nut-cracking behaviour, which has been documented in several West African populations. However, their sister-taxon, the bonobos (Pan paniscus), rarely engage in even simple tool-use and are not known to nut-crack in the wild. Only a few studies have reported tool-use in captive bonobos, including their ability to crack nuts, but details of this complex tool-use behaviour have not been documented before. Here, we fill this gap with the first comprehensive analysis of bonobo nut-cracking in a natural environment at the Lola ya Bonobo sanctuary, Democratic Republic of the Congo. Eighteen bonobos were studied as they cracked oil palm nuts using stone hammers. Individual bonobos showed exclusive laterality for using the hammerstone and there was a significant group-level right-hand bias. The study revealed 15 hand grips for holding differently sized and weighted hammerstones, 10 of which had not been previously described in the literature. Our findings also demonstrated that bonobos select the most effective hammerstones when nut-cracking. Bonobos are efficient nut-crackers and not that different from the renowned nut-cracking chimpanzees of Bossou, Guinea, which also crack oil palm nuts using stones. © 2016 Wiley Periodicals, Inc.
Water Development, Allocation, and Institutions: A Role for Integrated Tools
NASA Astrophysics Data System (ADS)
Ward, F. A.
2008-12-01
Many parts of the world suffer from inadequate water infrastructure, inefficient water allocation, and weak water institutions. Each of these three challenges compounds the burdens imposed by inadequacies associated with the other two. Weak water infrastructure makes it hard to allocate water efficiently and undermines tracking of water rights and use, which blocks effective functioning of water institutions. Inefficient water allocation makes it harder to secure resources to develop new water infrastructure. Poorly developed water institutions undermine the security of water rights, which damages incentives to develop water infrastructure or use water efficiently. This paper reports on the development of a prototype basin-scale economic optimization model, in which existing water supplies are allocated more efficiently in the short run to provide resources for more efficient long-run water infrastructure development. Preliminary results provide the basis for designing water administrative proposals, building effective water infrastructure, increasing farm income, and meeting transboundary delivery commitments. The application is to the Kabul River Basin in Afghanistan, where food security has been compromised by a history of drought, war, damaged irrigation infrastructure, lack of reservoir storage, inefficient water allocation, and weak water institutions. Results illustrate increases in economic efficiency achievable when development programs simultaneously address interdependencies in water allocation, development, and institutions.
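Stripped to its core, such a model is an economic allocation program: maximize basin benefits subject to supply, capacity, and delivery constraints. A deliberately tiny linear sketch (real basin models use nonlinear benefit functions and many nodes; all numbers here are hypothetical):

```python
from scipy.optimize import linprog

# Maximize 120*x_irrigation + 80*x_urban (dollars per unit delivered);
# linprog minimizes, so the benefit coefficients are negated.
c = [-120.0, -80.0]
supply, treaty_delivery = 1000.0, 300.0
A_ub = [[1.0, 1.0]]                     # total diversions
b_ub = [supply - treaty_delivery]       # reserve the downstream commitment
bounds = [(0.0, 600.0), (0.0, 250.0)]   # canal / pipeline capacities

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, -res.fun)   # optimal allocation and basin-wide benefit
```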
Assessing global resource utilization efficiency in the industrial sector.
Rosen, Marc A
2013-09-01
Designing efficient energy systems, which also meet economic, environmental and other objectives and constraints, is a significant challenge. In a world with finite natural resources and large energy demands, it is important to understand not just actual efficiencies, but also limits to efficiency, as the latter identify margins for efficiency improvement. Energy analysis alone is inadequate, e.g., it yields energy efficiencies that do not provide limits to efficiency. To obtain meaningful and useful efficiencies for energy systems, and to clarify losses, exergy analysis is a beneficial and useful tool. Here, the global industrial sector and industries within it are assessed by using energy and exergy methods. The objective is to improve the understanding of the efficiency of global resource use in the industrial sector and, with this information, to facilitate the development, prioritization and ultimate implementation of rational improvement options. Global energy and exergy flow diagrams for the industrial sector are developed and overall efficiencies for the global industrial sector evaluated as 51% based on energy and 30% based on exergy. Consequently, exergy analysis indicates a less efficient picture of energy use in the global industrial sector than does energy analysis. A larger margin for improvement exists from an exergy perspective, compared to the overly optimistic margin indicated by energy. Copyright © 2012 Elsevier B.V. All rights reserved.
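The gap between the 51% energy figure and the 30% exergy figure comes from weighting each flow by its work potential; in standard exergy-analysis notation (not the paper's own symbols):

```latex
% Energy vs. exergy efficiency; T_0 is the environment (dead-state)
% temperature. Heat at temperature T carries exergy (1 - T_0/T)Q, so
% low-temperature heat contributes much less work potential than it
% contributes energy.
\[
\eta=\frac{E_{\mathrm{out,useful}}}{E_{\mathrm{in}}},
\qquad
\psi=\frac{Ex_{\mathrm{out,useful}}}{Ex_{\mathrm{in}}},
\qquad
Ex_{Q}=\Bigl(1-\frac{T_{0}}{T}\Bigr)Q .
\]
```

For example, heat delivered at 400 K against a 298 K environment carries only about a quarter of its energy as exergy, which is why energy accounting alone overstates how well such heat is being used.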
Evaluation of the efficiency and reliability of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1994-01-01
There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.
Subnanometer and nanometer catalysts, method for preparing size-selected catalysts
Vajda, Stefan; Pellin, Michael J.; Elam, Jeffrey W. [Elmhurst, IL]; Marshall, Christopher L. [Naperville, IL]; Winans, Randall A. [Downers Grove, IL]; Meiwes-Broer, Karl-Heinz [Roggentin, GR]
2012-04-03
Highly uniform cluster-based nanocatalysts supported on technologically relevant supports were synthesized for reactions of top industrial relevance. The Pt-cluster based catalysts outperformed the very best reported ODHP catalyst in both activity (by up to two orders of magnitude higher turn-over frequencies) and in selectivity. The results clearly demonstrate that highly dispersed ultra-small Pt clusters precisely localized on high-surface area supports can lead to affordable new catalysts for highly efficient and economic propene production, including considerably simplified separation of the final product. The combined GISAXS-mass spectrometry provides an excellent tool to monitor the evolution of size and shape of nanocatalysts in action under realistic conditions. Also provided are sub-nanometer gold and sub-nanometer to few-nanometer size-selected silver catalysts which possess size-dependent tunable catalytic properties in the epoxidation of alkenes. The invented size-selected cluster deposition provides a unique tool to tune material properties in an atom-by-atom fashion; the clusters can be stabilized by protective overcoats.
The Case for Case-Mix: A New Construct for Hospital Management
Plomann, Marilyn Peacock; Garzino, Fred R.
1981-01-01
Case-mix is a useful methodology for health care management, planning and control. It provides managers with a powerful tool by providing a framework for relating resource consumption profiles with specific treatment patterns. In the long run, it will assist hospital planners in analyzing the demands which different classes of patients bring to the hospital. Decisions concerning capital financing, facilities planning, new services, and the medical and financial implications of physician activities are more efficiently analyzed within a case-mix framework. In the near term, inventory management, staffing policies and the on-going need for the astute management of cash flow will be positively and decisively affected by the use of case-mix measures. The benefits derived from a case-mix system are not limited to hospitals possessing sophisticated management information systems. The case-mix methodology also provides a useful tool for hospitals with less advanced data processing systems and management practices in applying a variety of management science techniques to their planning and control activities.
NASA Technical Reports Server (NTRS)
Shih, Hsin-Yi; Tien, James S.; Ferkul, Paul (Technical Monitor)
2001-01-01
The recently developed numerical model of concurrent-flow flame spread over thin solids has been used as a simulation tool to support the design of a space experiment. The two-dimensional and three-dimensional, steady forms of the compressible Navier-Stokes equations with chemical reactions are solved. With the coupled multi-dimensional solver of radiative heat transfer, the model is capable of answering a number of questions regarding the experiment concept and the hardware design. In this paper, the capabilities of the numerical model are demonstrated by providing guidance on several experiment design issues. The test matrix and operating conditions of the experiment are estimated through the modeling results. Three-dimensional calculations are made to simulate the flame-spreading experiment with a realistic hardware configuration. The computed detailed flame structures provide insight for data collection. In addition, the heating load and the requirements for product exhaust cleanup in the flow tunnel are estimated with the model. We anticipate that using this simulation tool will enable a more efficient and successful space experiment to be conducted.
NASA Astrophysics Data System (ADS)
Mueller, David S.
2013-04-01
Selection of the appropriate extrapolation methods for computing the discharge in the unmeasured top and bottom parts of a moving-boat acoustic Doppler current profiler (ADCP) streamflow measurement is critical to the total discharge computation. The software tool, extrap, combines normalized velocity profiles from the entire cross section and multiple transects to determine a mean profile for the measurement. The use of an exponent derived from normalized data from the entire cross section is shown to be valid for application of the power velocity distribution law in the computation of the unmeasured discharge in a cross section. Selected statistics are combined with empirically derived criteria to automatically select the appropriate extrapolation methods. A graphical user interface (GUI) provides the user tools to visually evaluate the automatically selected extrapolation methods and manually change them, as necessary. The sensitivity of the total discharge to available extrapolation methods is presented in the GUI. Use of extrap by field hydrographers has demonstrated that extrap is a more accurate and efficient method of determining the appropriate extrapolation methods compared with tools currently (2012) provided in the ADCP manufacturers' software.
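The power-law extrapolation at the heart of this computation can be sketched directly: fit u = a z^b to the normalized measured profile, then integrate the fitted curve over the unmeasured top and bottom layers. A toy illustration (extrap's actual selection criteria and statistics are more elaborate):

```python
import numpy as np

def fit_power_law(z, u):
    # Fit u = a * z**b in log space; z is normalized height above the bed,
    # u the normalized velocity from the measured part of the profile.
    b, ln_a = np.polyfit(np.log(z), np.log(u), 1)
    return np.exp(ln_a), b

def unmeasured_discharge(a, b, z_lo, z_hi, width):
    # Integral of a*z**b across the unmeasured layer, times channel width.
    return width * a * (z_hi ** (b + 1) - z_lo ** (b + 1)) / (b + 1)

# Synthetic profile following the 1/6-power law, with measurement noise.
rng = np.random.default_rng(0)
z = np.linspace(0.1, 0.9, 25)
u = 1.1 * z ** (1 / 6) * (1 + 0.02 * rng.normal(size=z.size))
a, b = fit_power_law(z, u)
print(round(b, 3))                                   # ~0.167
print(unmeasured_discharge(a, b, 0.0, 0.1, 50.0))    # bottom unmeasured layer
```

Deriving the exponent from the whole cross section rather than a single vertical, as described above, stabilizes the fit against local scatter in any one transect.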
IMG ER: a system for microbial genome annotation expert review and curation.
Markowitz, Victor M; Mavromatis, Konstantinos; Ivanova, Natalia N; Chen, I-Min A; Chu, Ken; Kyrpides, Nikos C
2009-09-01
A rapidly increasing number of microbial genomes are sequenced by organizations worldwide and are eventually included into various public genome data resources. The quality of the annotations depends largely on the original dataset providers, with erroneous or incomplete annotations often carried over into the public resources and difficult to correct. We have developed an Expert Review (ER) version of the Integrated Microbial Genomes (IMG) system, with the goal of supporting systematic and efficient revision of microbial genome annotations. IMG ER provides tools for the review and curation of annotations of both new and publicly available microbial genomes within IMG's rich integrated genome framework. New genome datasets are included into IMG ER prior to their public release either with their native annotations or with annotations generated by IMG ER's annotation pipeline. IMG ER tools allow addressing annotation problems detected with IMG's comparative analysis tools, such as genes missed by gene prediction pipelines or genes without an associated function. Over the past year, IMG ER was used for improving the annotations of about 150 microbial genomes.
Identifying and managing inappropriate hospital utilization: a policy synthesis.
Payne, S M
1987-01-01
Utilization review, the assessment of the appropriateness and efficiency of hospital care through review of the medical record, and utilization management, deliberate action by payers or hospital administrators to influence providers of hospital services to increase the efficiency and effectiveness with which services are provided, are valuable but relatively unfamiliar strategies for containing hospital costs. The purpose of this synthesis is to increase awareness of the scope of and potential for these approaches among health services managers and administrators, third-party payers, policy analysts, and health services researchers. The synthesis will assist the reader to trace the conceptual context and the historical development of utilization review from unstructured methods using individual physicians' professional judgment to structured methods using explicit criteria; to establish the context of utilization review and clarify its uses; to understand the concepts and tools used in assessing the efficiency of hospital use; and to select, design, and evaluate utilization review and utilization management programs. The extent of inappropriate (medically unnecessary) hospital utilization and the factors associated with it are described. Implications for managers, providers, and third-party payers in targeting utilization review and in designing and evaluating utilization management programs are discussed. PMID:3121538
SU-E-E-02: An Excel-Based Study Tool for ABR-Style Exams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cline, K; Stanley, D; Defoor, D
2015-06-15
Purpose: As the landscape of learning and testing shifts toward a computer-based environment, a replacement for paper-based methods of studying is desirable. Using Microsoft Excel, a study tool was developed that allows the user to populate multiple-choice questions and then generate an interactive quiz session to answer them. Methods: The code for the tool was written using Microsoft Excel Visual Basic for Applications with the intent that this tool could be implemented by any institution with Excel. The base tool is a template with a setup macro, which builds out the structure based on the user's input. Once the framework is built, the user can input sets of multiple-choice questions, answer choices, and even add figures. The tool can be run in random-question or sequential-question mode for single or multiple courses of study. The interactive session allows the user to select answer choices and immediate feedback is provided. Once the user is finished studying, the tool records the day's progress by reporting progress statistics useful for trending. Results: Six doctoral students at UTHSCSA have used this tool for the past two months to study for their qualifying exam, which is similar in format and content to the American Board of Radiology (ABR) Therapeutic Part II exam. The students collaborated to create a repository of questions, met weekly to go over these questions, and then used the tool to prepare for their exam. Conclusion: The study tool has provided an effective and efficient way for students to collaborate and be held accountable for exam preparation. The ease of use and familiarity of Excel are important factors for the tool's use. There are software packages to create similar question banks, but this study tool has no additional cost for those that already have Excel. The study tool will be made openly available.
Feedback in Plastic and Reconstructive Surgery Education: Past, Present, and Future.
Connolly, Katharine A; Azouz, Solomon M; Smith, Anthony A
2015-11-01
In the United States, the Accreditation Council for Graduate Medical Education requires, through its core competencies, that education be provided efficiently and effectively; in Canada, the Royal College does so through the CanMEDS framework. This article defines formative feedback, reviews the currently available validated feedback tools, and describes the future use of technology to enhance feedback in plastic surgery education.
NASA Technical Reports Server (NTRS)
Badger, Julia M.; Claunch, Charles; Mathis, Frank
2017-01-01
The Modular Autonomous Systems Technology (MAST) framework is a tool for building distributed, hierarchical autonomous systems. Originally intended for the autonomous monitoring and control of spacecraft, this framework concept provides support for variable autonomy, assume-guarantee contracts, and efficient communication between subsystems and a centralized systems manager. MAST was developed at NASA's Johnson Space Center (JSC) and has been applied to an integrated spacecraft example scenario.
Giorgi, Rodorico; Ambrosi, Moira; Toccafondi, Nicola; Baglioni, Piero
2010-08-16
Nanotechnology provides new concepts and materials for the consolidation and protection of wall paintings. In particular, humble calcium and barium hydroxide nanoparticles offer a versatile and highly efficient tool to combat the main degradation processes altering wall paintings. A clear example of the efficacy and potential of nanotechnology is the in situ conservation of Maya wall paintings in the archaeological area of Calakmul (Mexico).
Decision Support Tool Prototype for the Enlistment Incentive Review Board: Phase 2
2014-07-01
was conducted by the U.S. Military Academy (USMA) to assess which preferences of youth could be influenced by incentives (Joles, Charbonneau , & Barr...to designing more effective and more efficient incentive strategies. In an attempt to provide this information, Joles, Charbonneau , and Barr (1998...Military Academy. Joles, J., Charbonneau , S., & Barr, D. (1998, February). An enlistment bonus distribution model. West Point, NY: United States
Commercial Building Energy Saver, API
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon
2015-08-27
The CBES API provides an application programming interface to a suite of functions to improve the energy efficiency of buildings, including building energy benchmarking, preliminary retrofit analysis using the pre-simulation database DEEP, and detailed retrofit analysis using energy modeling with the EnergyPlus simulation engine. The CBES API powers the LBNL CBES Web App and can be adopted by third-party developers and vendors into their software tools and platforms.
Quantifying Energy and Water Savings in the U.S. Residential Sector.
Chini, Christopher M; Schreiber, Kelsey L; Barker, Zachary A; Stillwell, Ashlynn S
2016-09-06
Stress on water and energy utilities, including natural resource depletion, infrastructure deterioration, and growing populations, threatens the ability to provide reliable and sustainable service. This study presents a demand-side management decision-making tool to evaluate energy and water efficiency opportunities at the residential level, including both direct and indirect consumption. The energy-water nexus accounts for indirect resource consumption, including water-for-energy and energy-for-water. We examine the relationship between water and energy in common household appliances and fixtures, comparing baseline appliances to ENERGY STAR or WaterSense appliances, using a cost abatement analysis for the average U.S. household, yielding a potential annual per-household savings of 7,600 kWh and 39,600 gallons, with most upgrades having negative abatement cost. We refine the national average cost abatement curves to understand regional relationships, specifically for the urban environments of Los Angeles, Chicago, and New York. Cost abatement curves display per-unit cost savings related to overall direct and indirect energy and water efficiency, allowing utilities, policy makers, and homeowners to consider the relationship between energy and water when making decisions. Our research fills an important gap of the energy-water nexus in a residential unit and provides a decision-making tool for policy initiatives.
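The abatement-cost arithmetic behind such curves annualizes the upgrade cost and nets out operating savings per unit of resource saved; a minimal sketch with hypothetical appliance numbers:

```python
def abatement_cost(capital_cost, annual_savings_usd, annual_units_saved,
                   lifetime_yr, discount_rate=0.05):
    # Annualize the upgrade with a capital recovery factor, net out the
    # dollar savings, and divide by units (kWh or gallons) saved per year.
    # Negative values mean the upgrade pays for itself.
    crf = discount_rate / (1 - (1 + discount_rate) ** -lifetime_yr)
    return (capital_cost * crf - annual_savings_usd) / annual_units_saved

# Hypothetical efficient-appliance upgrade vs. a baseline model.
print(abatement_cost(capital_cost=150, annual_savings_usd=40,
                     annual_units_saved=300, lifetime_yr=11))
```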
Development of web tools to disseminate space geodesy data-related products
NASA Astrophysics Data System (ADS)
Soudarin, Laurent; Ferrage, Pascale; Mezerette, Adrien
2015-04-01
In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented on the web site of the International DORIS Service (IDS) a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and to access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow the time evolution of coordinates to be compared for collocated DORIS and GNSS stations, thanks to collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). A database was created to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). The possibility of visualizing and comparing position time series from the four main space geodetic techniques, DORIS, GNSS, SLR and VLBI, is already under way at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we describe the functionalities of these tools and address some aspects of the time series (content, format).
Comparative analytics of infusion pump data across multiple hospital systems.
Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith
2015-02-15
A Web-based analytics system for conducting inhouse evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns—both internally and in relation to patterns at other hospitals—in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
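The override-to-alert ratios such dashboards chart are simple grouped aggregations over the alert log; a minimal pandas sketch with hypothetical fields and records (not the IPI schema):

```python
import pandas as pd

# Hypothetical alert log with the fields the IPI charts aggregate over.
alerts = pd.DataFrame({
    "drug":      ["heparin", "heparin", "insulin", "insulin", "insulin"],
    "care_area": ["ICU", "ICU", "ICU", "MedSurg", "MedSurg"],
    "action":    ["override", "reprogram", "override", "override", "reprogram"],
})

summary = (alerts.groupby("drug")["action"]
           .agg(total="count",
                overrides=lambda s: (s == "override").sum()))
summary["override_to_alert"] = summary["overrides"] / summary["total"]
print(summary.sort_values("override_to_alert", ascending=False))
```

Drugs whose alerts are almost always overridden are candidates for drug-library limit revision, which is the kind of cross-facility insight the IPI comparisons are meant to surface.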